Skill details (site mirror, no comments)
License: MIT-0
MIT-0 · Free to use, modify, and redistribute. No attribution required.
Version: v0.1.0
Stats: ⭐ 1 · 809 · 4 current installs · 4 all-time installs
Package: 94w666/clawfeed-2
Security scan (ClawHub)
- VirusTotal: suspicious
- OpenClaw: suspicious
OpenClaw assessment
The skill is mostly coherent for a self-hosted Node.js news-digest app, but a few gaps (how Twitter ingestion is implemented, and the instruction to run npm install for third-party native addons), together with the lack of packaged code, warrant caution before installing or running it.
Purpose
The name/description (Twitter + RSS digest) align with the files and endpoints described in SKILL.md (digest endpoints, templates, curation rules). However, SKILL.md does not document how Twitter feeds are ingested (no Twitter API keys and no scraping approach described), leaving the advertised capability unexplained.
Scope of instructions
Runtime instructions are limited to standard Node project steps (npm install, copy .env, npm start) and references to local config/templates and an SQLite DB path. The instructions do not ask the agent to access unrelated system files or credentials beyond the project, but they do require editing .env and config files inside the project.
Installation mechanism
This is an instruction-only skill that tells you to run `npm install` — which will fetch and run arbitrary packages from the npm registry. The README specifically lists a native addon dependency (better-sqlite3), which may compile native code during install. Because there is no packaged code in the skill bundle and no explicit package.json to inspect here, running install pulls unreviewed third-party code and native build steps — a moderate risk.
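One way to reduce that exposure (a sketch, not a vetted procedure) is to audit install-time lifecycle scripts before letting `npm install` execute them. The manifest below is a stand-in for illustration, not ClawFeed's actual package.json:

```shell
# npm runs preinstall/install/postinstall hooks during `npm install`;
# `npm install --ignore-scripts` defers them so they can be reviewed first.
# Stand-in manifest for illustration:
cat > /tmp/audit-demo-package.json <<'EOF'
{
  "scripts": {
    "preinstall": "node hook.js",
    "start": "node server.js"
  }
}
EOF
# Flag install-time hooks that would execute arbitrary code:
if grep -qE '"(pre|post)?install"' /tmp/audit-demo-package.json; then
  echo "install-time scripts found: review before running npm install"
fi
```

For the real project, `npm install --ignore-scripts` followed by a review and an explicit `npm rebuild better-sqlite3` is one possible workflow; note that `--ignore-scripts` leaves `better-sqlite3` unusable until its native build step is allowed to run.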
Credentials
The registry lists no required env vars or credentials, while SKILL.md documents optional env vars (GOOGLE_CLIENT_ID/SECRET, SESSION_SECRET, API_KEY) for auth/write features. Those variables are plausible and proportional. The only minor inconsistency: the description claims 'zero credentials' for read-only mode but the doc then lists several optional secrets for extended features — this is reasonable but should be noted.
Persistence
The skill does not request always:true or system-wide privileges. It runs a local API server and writes a local SQLite DB (`data/digest.db`) — expected for this app. It does not declare modifications to other skills or global agent settings.
Install (copy to 龙虾 AI)
Copy the entire block below into a 龙虾中文库 (龙虾 AI) conversation; 龙虾 will complete the installation according to SKILL.md.
Hand this passage to 龙虾中文库 (龙虾 AI) for execution: install the OpenClaw skill "Clawfeed" on this machine. Description: AI-powered tool generating structured news summaries from Twitter and RSS feeds….
Fetch the following URL, read SKILL.md, and complete the installation as documented: https://raw.githubusercontent.com/openclaw/skills/refs/heads/main/skills/94w666/clawfeed-2/SKILL.md
(Source: yingzhi8.cn skill library)
SKILL.md
# ClawFeed
AI-powered news digest tool. Automatically generates structured summaries (4H/daily/weekly/monthly) from Twitter and RSS feeds.
## Credentials & Dependencies
ClawFeed runs in **read-only mode** with zero credentials — browse digests, view feeds, switch languages. Authentication features (bookmarks, sources, packs) require additional credentials.
| Credential | Purpose | Required |
|-----------|---------|----------|
| `GOOGLE_CLIENT_ID` | Google OAuth login | For auth features |
| `GOOGLE_CLIENT_SECRET` | Google OAuth login | For auth features |
| `SESSION_SECRET` | Session cookie encryption | For auth features |
| `API_KEY` | Digest creation endpoint protection | For write API |
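`SESSION_SECRET` and `API_KEY` are opaque secrets, so any strong random value works. A minimal sketch for generating them (assumes `openssl` is available):

```shell
# 32 random bytes, hex-encoded (64 characters each)
SESSION_SECRET="$(openssl rand -hex 32)"
API_KEY="$(openssl rand -hex 32)"
echo "SESSION_SECRET=${SESSION_SECRET}"
echo "API_KEY=${API_KEY}"
```

Append the printed lines to `.env` rather than committing them to version control.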
**Runtime dependency:** SQLite via `better-sqlite3` (native addon, bundled). No external database server required.
## Setup
```bash
# Install dependencies
npm install
# Copy environment config
cp .env.example .env
# Edit .env with your settings
# Start API server
npm start
```
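A minimal `.env` for read-only mode could look like this (values are illustrative; only the variables documented in the Environment Variables section are meaningful):

```shell
# .env — read-only mode needs no credentials
DIGEST_PORT=8767
AI_DIGEST_DB=data/digest.db
ALLOWED_ORIGINS=http://localhost:3000
```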
## Environment Variables
Configure in `.env` file:
| Variable | Description | Required | Default |
|----------|-------------|----------|---------|
| `DIGEST_PORT` | Server port | No | 8767 |
| `GOOGLE_CLIENT_ID` | Google OAuth client ID | For auth | - |
| `GOOGLE_CLIENT_SECRET` | Google OAuth client secret | For auth | - |
| `SESSION_SECRET` | Session cookie encryption key | For auth | - |
| `API_KEY` | Digest creation API key | For write API | - |
| `AI_DIGEST_DB` | SQLite database path | No | `data/digest.db` |
| `ALLOWED_ORIGINS` | CORS allowed origins | No | localhost |
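The optional variables fall back to the defaults in the table; as a sketch, the defaulting behaves like standard shell parameter expansion (the echo line is illustrative, not part of the app):

```shell
# Fall back to the documented defaults when the variables are unset
PORT="${DIGEST_PORT:-8767}"
DB_PATH="${AI_DIGEST_DB:-data/digest.db}"
echo "listening on ${PORT}, db at ${DB_PATH}"
```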
## API Server
Runs on port `8767` by default. Set `DIGEST_PORT` env to change.
### Endpoints
| Method | Path | Description | Auth |
|--------|------|-------------|------|
| GET | /api/digests | List digests (?type=4h\|daily\|weekly&limit=20&offset=0) | - |
| GET | /api/digests/:id | Get single digest | - |
| POST | /api/digests | Create digest (internal) | - |
| GET | /api/auth/google | Start Google OAuth flow | - |
| GET | /api/auth/callback | OAuth callback endpoint | - |
| GET | /api/auth/me | Get current user info | Yes |
| POST | /api/auth/logout | Logout user | Yes |
| GET | /api/marks | List user bookmarks | Yes |
| POST | /api/marks | Add bookmark | Yes |
| DELETE | /api/marks/:id | Remove bookmark | Yes |
| GET | /api/config | Get configuration | - |
| PUT | /api/config | Update configuration | - |
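Example read-only calls against the endpoints above, composed as printed `curl` commands so the sketch runs even without a live server (the digest id `1` is hypothetical):

```shell
BASE="http://localhost:${DIGEST_PORT:-8767}"
# List the five most recent daily digests
echo "curl -s '${BASE}/api/digests?type=daily&limit=5&offset=0'"
# Fetch a single digest by id
echo "curl -s '${BASE}/api/digests/1'"
```

Pipe the output to `sh` to execute the requests against a running instance.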
## Web Dashboard
Serve `web/index.html` via your reverse proxy or any static file server.
## Templates
- `templates/curation-rules.md` — Customize feed curation rules
- `templates/digest-prompt.md` — Customize the AI summarization prompt
## Configuration
Copy `config.example.json` to `config.json` and edit. See README for details.
## Reverse Proxy (Caddy example)
```
handle /digest/api/* {
    uri strip_prefix /digest/api
    reverse_proxy localhost:8767
}
handle_path /digest/* {
    root * /path/to/clawfeed/web
    file_server
}
```