Kb Collector

Knowledge Base Collector - save YouTube, URLs, text to Obsidian with AI summarization. Auto-transcribes videos, fetches pages, supports weekly/monthly digest...

Media & Content

License: MIT-0

MIT-0 · Free to use, modify, and redistribute. No attribution required.

Version: v1.2.1

Stats: ⭐ 0 · 269 · 1 current install · 2 all-time installs

🛡 VirusTotal: Suspicious · OpenClaw: Suspicious

Package: arbiger/kb-collector


OpenClaw Assessment

The skill mostly does what its README claims (download/transcribe/fetch/save notes), but the runtime scripts reference undeclared environment variables and hard-coded paths/recipients, and include external network/email calls that are not reflected in the declared requirements. This mismatch warrants attention before installing.

Purpose

The name and description align with the included scripts: downloading YouTube audio, transcribing, fetching pages, saving to an Obsidian vault, and generating digests/nightly research. There is one minor mismatch: SKILL.md claims it 'searches multiple sources (Hacker News, Reddit, Twitter)', but nightly-research.sh performs searches via the Tavily API only (it does not independently query those sites). Overall, capabilities match the stated purpose.
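The Tavily-only search path noted above can be illustrated with a minimal sketch. The endpoint and payload shape below follow Tavily's public JSON API and are assumptions about what the script's curl call is equivalent to, not code taken from the skill itself:

```python
import json
import urllib.request

# Public Tavily search endpoint (assumed to be what nightly-research.sh targets).
TAVILY_ENDPOINT = "https://api.tavily.com/search"

def build_tavily_request(query: str, api_key: str, max_results: int = 5):
    """Build (but do not send) a Tavily search request.

    The payload fields mirror Tavily's documented API; the skill's script
    presumably issues an equivalent POST via curl.
    """
    payload = {"api_key": api_key, "query": query, "max_results": max_results}
    return urllib.request.Request(
        TAVILY_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Note that a single POST to one aggregator endpoint like this is categorically different from querying Hacker News, Reddit, and Twitter directly, which is why the description mismatch matters.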

Instruction Scope

SKILL.md and the scripts instruct the agent to fetch remote web pages and call external services (the Tavily API via curl, and email via the 'gog gmail send' tool). The scripts also write files into an Obsidian vault path and remove temporary audio files. The SKILL metadata declares no required env vars, but the scripts read/expect environment variables (TAVILY_API_KEY, OBSIDIAN_VAULT, RECIPIENT) and use a hard-coded email recipient and hard-code…

Install Mechanism

There is no formal install spec in the registry (instruction-only), which is lower risk from an automatic installer perspective. The SKILL.md tells the user to pip install packages (yt-dlp, faster-whisper, requests, beautifulsoup4, optional openai/anthropic). That is consistent with the code. No downloads from arbitrary URLs or archive extraction are present. Because the code relies on external binaries (yt-dlp, whisper) and a third-party CLI …
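Because the code shells out to external binaries, a pre-install check can confirm they are on PATH before the skill is allowed to run. The binary names below are assumptions drawn from this page ('gog' is the email CLI named in the assessment; 'yt-dlp' comes from the install notes):

```python
import shutil

# External tools the scripts are described as invoking; this list is a
# guess from the assessment text, not parsed from the scripts themselves.
BINARIES = ["yt-dlp", "gog"]

def missing_binaries():
    """Return the subset of BINARIES not found on PATH."""
    return [b for b in BINARIES if shutil.which(b) is None]
```

Anything this reports as missing would make the corresponding script fail at runtime rather than at install time, since there is no formal install spec to surface the requirement.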

Credentials

The registry lists no required environment variables or credentials, yet the scripts use environment variables: TAVILY_API_KEY (sent to api.tavily.com), OBSIDIAN_VAULT (overrides VAULT), and RECIPIENT. digest.sh and nightly-research.sh also use or assume the presence of an email-sending tool ('gog') whose credentials are not declared here. The scripts also include hard-coded local paths and a hard-coded recipient email (george@precaster.com.tw). A…
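One way to close the gap between the undeclared variables and the scripts' expectations is a pre-flight check before anything runs. This is a sketch, not code from the skill; the variable names come from the assessment above:

```python
import os

# Env vars the scripts read but the registry does not declare
# (names taken from the security notes on this page).
REQUIRED = ["TAVILY_API_KEY", "OBSIDIAN_VAULT", "RECIPIENT"]

def missing_env(environ=os.environ):
    """Return the REQUIRED vars that are absent or empty."""
    return [name for name in REQUIRED if not environ.get(name)]
```

Running a check like this up front would fail loudly instead of letting, say, a Tavily call go out with an empty key or a digest go to the hard-coded fallback recipient.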

Persistence

The skill does not request permanent platform-level privileges (always: false) and does not modify other skills' configuration. It writes notes to a user-visible Obsidian vault and temporary files in /tmp, which is expected given its purpose. Autonomous invocation is allowed (default) — combined with the environment/credential concerns above this increases potential impact, but the skill alone does not request 'always' or system-wide config ch…

Installation (copy to the 龙虾 AI)

Copy the entire block below into a 龙虾中文库 chat; the 龙虾 AI will complete the installation per SKILL.md.

Hand this block to 龙虾中文库 (the 龙虾 AI) to execute: install the OpenClaw skill "Kb Collector" on this machine. Summary: Knowledge Base Collector - save YouTube, URLs, text to Obsidian with AI summari….
Fetch the following URL, read SKILL.md, and complete the installation per the document: https://raw.githubusercontent.com/openclaw/skills/refs/heads/main/skills/arbiger/kb-collector/SKILL.md
(Source: yingzhi8.cn skill library)

SKILL.md

Open the original SKILL.md (GitHub raw)

---
name: kb-collector
description: Knowledge Base Collector - save YouTube, URLs, text to Obsidian with AI summarization. Auto-transcribes videos, fetches pages, supports weekly/monthly digest emails and nightly research.
---

# KB Collector

Knowledge Base Collector - Save YouTube, URLs, and text to Obsidian with automatic transcription and summarization.

## Features

- **YouTube Collection** - Download audio, transcribe with Whisper, auto-summarize
- **URL Collection** - Fetch and summarize web pages
- **Plain Text** - Direct save with tags
- **Digest** - Weekly/Monthly/Yearly review emails
- **Nightly Research** - Automated AI/LLM/tech trend tracking

## Installation

```bash
# Install dependencies
pip install yt-dlp faster-whisper requests beautifulsoup4

# For AI summarization (optional)
pip install openai anthropic
```
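After installing, you can verify the dependencies are importable. This sketch is not part of the skill; note that some pip package names differ from their import names (yt-dlp installs as yt_dlp, beautifulsoup4 as bs4):

```python
import importlib.util

# pip package name -> Python import name
DEPS = {
    "yt-dlp": "yt_dlp",
    "faster-whisper": "faster_whisper",
    "requests": "requests",
    "beautifulsoup4": "bs4",
}

def check_deps():
    """Map each declared dependency to whether it is importable."""
    return {pkg: importlib.util.find_spec(mod) is not None
            for pkg, mod in DEPS.items()}
```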

## Usage (Python Version - Recommended)

```bash
# Collect YouTube video
python3 scripts/collect.py youtube "https://youtu.be/xxxxx" "stock,investing"

# Collect URL
python3 scripts/collect.py url "https://example.com/article" "python,api"

# Collect plain text
python3 scripts/collect.py text "My note content" "tag1,tag2"
```

## Usage (Bash Version - Legacy)

```bash
# Collect YouTube
./scripts/collect.sh "https://youtu.be/xxxxx" "stock,investing" youtube

# Collect URL
./scripts/collect.sh "https://example.com/article" "python,api" url

# Collect plain text
./scripts/collect.sh "My note" "tag1,tag2" text
```

## Nightly Research (New!)

Automated AI/LLM/tech trend tracking - runs daily and saves to Obsidian.

```bash
# Save to Obsidian only
./scripts/nightly-research.sh --save

# Save to Obsidian AND send email
./scripts/nightly-research.sh --save --send

# Send email only
./scripts/nightly-research.sh --send
```

### Features
- Searches multiple sources (Hacker News, Reddit, Twitter)
- LLM summarization (optional)
- Saves to Obsidian with tags
- Optional email digest

### Cron Setup (optional)
```bash
# Run every night at 10 PM
0 22 * * * /path/to/nightly-research.sh --save --send
```

## Configuration

Edit the script to customize:

```python
VAULT_PATH = os.path.expanduser("~/Documents/YourVault")
NOTE_AUTHOR = "YourName"
```
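The security notes on this page also mention an OBSIDIAN_VAULT environment variable that overrides the in-script vault path. A sketch of the likely precedence (env var over the edited default; the default path below is illustrative, not the skill's actual value):

```python
import os

def resolve_vault_path(environ=os.environ):
    """Resolve the vault path: OBSIDIAN_VAULT, if set, wins over
    the in-script default. The default here is a placeholder."""
    default = os.path.expanduser("~/Documents/YourVault")
    return environ.get("OBSIDIAN_VAULT") or default
```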

## Output Format

Notes saved to: `{VAULT_PATH}/yyyy-mm-dd-title.md`

```markdown
---
created: 2026-03-03T12:00:00
source: https://...
tags: [stock, investing]
author: George
---

# Title

> **TLDR:** Summary here...

---

Content...

---
*Saved: 2026-03-03*
```
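The `yyyy-mm-dd-title.md` naming above can be sketched as follows. The slug rules are an assumption; the actual scripts may normalize titles differently:

```python
import datetime
import re

def note_filename(title: str, when: datetime.date) -> str:
    """Build yyyy-mm-dd-title.md; slugging is a guess at the scripts' behavior."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"{when.isoformat()}-{slug}.md"

note_filename("My Note: Stocks!", datetime.date(2026, 3, 3))
# → "2026-03-03-my-note-stocks.md"
```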

## Dependencies

- yt-dlp
- faster-whisper (for transcription)
- requests + beautifulsoup4 (for URL fetching)
- Optional: openai/anthropic (for AI summarization)

## Credits

Automated note-taking workflow for Obsidian.