Skill Details (on-site mirror, no comments)
Author: Anonymous @adminlove520
License: MIT-0
MIT-0 · Free to use, modify, and redistribute. No attribution required.
Version: v1.0.3
Stats: ⭐ 1 · 293 · 1 current install · 1 all-time install
Package: adminlove520/clawfeed-digest
Security Scan (ClawHub)
- VirusTotal: benign
- OpenClaw: benign
OpenClaw Assessment
The skill's code, docs, and runtime instructions are consistent with its stated purpose (fetching ClawFeed digests and writing them into an Obsidian directory); it makes a simple HTTP request and writes markdown files with no unexpected credentials or privileged installs.
Purpose
The name and description claim the skill fetches ClawFeed digests and saves them to Obsidian; the included script fetches from https://clawfeed.kevinhe.io/api/digests and writes markdown files into an Obsidian directory. No unrelated credentials, binaries, or config paths are requested.
Instruction Scope
SKILL.md tells the agent to pip install requests and run scripts/fetch_clawfeed.py (matching the provided script). The script performs only an unauthenticated GET to the stated API and writes files to a user vault path (default: ~/OneDrive/文档/Obsidian Vault/AI新闻). Note: writing into a OneDrive-synced vault means the results will be uploaded to the user's cloud storage — this is expected, but worth being aware of.
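A minimal sketch of what a script like that plausibly does is shown below. The function names, the digest JSON shape, and the `--output` flag are assumptions for illustration, not the actual contents of fetch_clawfeed.py; the stdlib `urllib` is used here to keep the sketch self-contained, whereas SKILL.md recommends the `requests` package.

```python
import argparse
import json
import re
import urllib.request
from pathlib import Path

API_URL = "https://clawfeed.kevinhe.io/api/digests"  # endpoint stated in the assessment
# Default vault path stated in the assessment (OneDrive-synced).
DEFAULT_VAULT = Path.home() / "OneDrive" / "文档" / "Obsidian Vault" / "AI新闻"

def slugify(title: str) -> str:
    """Turn a digest title into a safe filename stem (assumed naming scheme)."""
    return re.sub(r"[^\w\-]+", "-", title).strip("-")

def digest_to_markdown(digest: dict) -> tuple[str, str]:
    """Map one digest record (assumed keys: date, title, content) to (filename, body)."""
    name = f"{digest['date']}-{slugify(digest['title'])}.md"
    body = f"# {digest['title']}\n\n{digest['content']}\n"
    return name, body

def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("--output", type=Path, default=DEFAULT_VAULT)
    args = parser.parse_args()
    args.output.mkdir(parents=True, exist_ok=True)
    with urllib.request.urlopen(API_URL) as resp:  # unauthenticated GET
        digests = json.load(resp)
    for digest in digests:
        name, body = digest_to_markdown(digest)
        # Files with the same generated name are overwritten, as the conclusion warns.
        (args.output / name).write_text(body, encoding="utf-8")

if __name__ == "__main__":
    main()
```

Running it with `--output` pointed at a local-only folder keeps the results out of OneDrive sync.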
Install Mechanism
No install spec is provided; the skill is instruction-only and only recommends installing the widely used 'requests' Python package. No downloads from arbitrary URLs or archive extraction are present in the skill itself.
Credentials
The skill requests no environment variables, no credentials, and does not access other config paths. Its behavior (network GET and filesystem writes) is proportionate to the declared purpose.
Persistence
The skill is not always-enabled and is user-invocable; it does not request elevated privileges or modify other skills' configurations. The provided cron example runs the script on a schedule — scheduling is appropriate for the stated functionality.
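A scheduled run along those lines might look like the following crontab fragment; the interpreter and script paths are placeholders, not values taken from the skill.

```
# Hypothetical crontab entry: fetch digests daily at 08:00
0 8 * * * /usr/bin/python3 /path/to/scripts/fetch_clawfeed.py
```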
Overall Conclusion
This skill appears to do exactly what it says: fetch public ClawFeed digests and write them as markdown files to an Obsidian vault. Before installing or scheduling it, consider these practical points:
- Verify the output location: by default it writes to ~/OneDrive/文档/Obsidian Vault/AI新闻. If you don't want results synced to OneDrive/cloud, supply --output pointing to a local-only folder.
- Run manually first: execute python scripts/fetch_clawfeed.py locally to confirm the content and filenames meet your expectations before adding a cron job.
- Review and trust the source: the package metadata has no homepage; the docs point to a GitHub repo (adminlove520/clawfeed-digest). If provenance matters, inspect that upstream repo for updates or tampering.
- Data handling: the script will overwrite files with the same generated filename; back up any important notes you might overwrite.
- Environment hygiene: install 'requests' in a virtualenv rather than system-wide if you prefer isolation.
Overall there are no code-level signs of credential harvesting, unexpected network endpoints, or privileged operations, so the skill is internally consistent with its stated purpose.
Install (copy to 龙虾 AI)
Copy the entire block below into a 龙虾中文库 conversation; 龙虾 will complete the installation per SKILL.md.
Please hand this block to 龙虾中文库 (龙虾 AI) for execution: install the OpenClaw skill "Clawfeed Digest" on this machine. Summary: Fetch ClawFeed AI news digests (4h/daily/weekly) and save them automatically to….
Please fetch the following URL to read SKILL.md, then complete the installation per the documentation: https://raw.githubusercontent.com/openclaw/skills/refs/heads/main/skills/adminlove520/clawfeed-digest/SKILL.md
(Source: yingzhi8.cn skill library)
SKILL.md
No locally cached content yet; a detail sync can be run in the background.