OpenClaw Skills Weekly — tracks trending ClawHub skills, generates GitHubAwesome-style YouTube video scripts with two-track ranking (Movers + Rockets).

Media & Content

License: MIT-0

MIT-0 · Free to use, modify, and redistribute. No attribution required.

Version: v1.1.0

Stats: ⭐ 0 · 249 · 0 current installs · 0 all-time installs


Package: ademczuk/skills-weekly

Security Scan (ClawHub)

  • VirusTotal: suspicious
  • OpenClaw: benign

OpenClaw Evaluation

The skill's code, declared requirements, and runtime instructions are coherent with its stated purpose (fetch ClawHub data, build snapshots, rank skills, and call Anthropic for script generation); nothing requests unrelated credentials or installs arbitrary remote code.

Purpose

Name/description match the implementation: the code fetches ClawHub API pages, stores snapshots, computes velocity, harvests READMEs, and uses an LLM (Anthropic) for script generation. Required binary (python3) and required env var (ANTHROPIC_API_KEY) align with the described functionality. Optional env vars (GITHUB_TOKEN, XAI_API_KEY, CLAWHUB_BASE_URL) are also justified by their stated uses.

Instruction Scope

SKILL.md instructs installing requirements and running the pipeline (discovery → snapshot → rank → harvest → script). It also documents a 'community' mode that uses web searches and an X/Twitter capture step — this expands network access but is consistent with the stated community-signal feature. The pipeline writes local files (SQLite DB, markdown, script text). No instructions request unrelated system secrets or broad file-system scraping. N…

Install Mechanism

There is no formal install spec (instruction-only skill) and files are pure Python. The SKILL.md asks to run pip install -r requirements.txt; installing third-party Python packages is expected but always worth reviewing. No downloads from suspicious URLs or archive extraction are present in the manifest.

Credentials

Only one required environment variable is declared (ANTHROPIC_API_KEY), which is appropriate for the LLM script-generation step. Other environment variables are optional and have clear justifications (GITHUB_TOKEN for fetching READMEs, XAI_API_KEY for optional X capture, CLAWHUB_BASE_URL for testing). No unrelated cloud or system credentials are required.

Persistence

always:false (normal). The skill creates and writes to a local SQLite DB and a workspace data directory; it can also write to a host mount path (/mnt/host/skills-weekly) if available or when HOST_OUTPUT_DIR is set. This is consistent with a reporting pipeline, but be aware that it persists data to disk and may expose outputs to host mounts if the container is mounted.

Overall Conclusion

This skill appears to do exactly what it says: fetch public ClawHub data, record snapshots to a local SQLite DB, optionally fetch READMEs, compute rankings, and call Claude (Anthropic) to generate scripts. Before installing/running: 1) inspect requirements.txt to ensure no unexpected packages are installed; 2) keep your ANTHROPIC_API_KEY secret and provide it only if you want LLM script generation (without it the pipeline skips the LLM step); 3) s…

Install (copy to the 龙虾 AI)

Copy the entire block below into a 龙虾中文库 conversation; the 龙虾 AI will complete the installation per SKILL.md.

Hand this block to 龙虾中文库 (the 龙虾 AI) to execute: install the OpenClaw skill "OpenClaw Skills Weekly" on this machine. Summary: OpenClaw Skills Weekly — tracks trending ClawHub skills, generates GitHubAwesom…
Please fetch the following URL, read SKILL.md, and complete the installation per the document: https://raw.githubusercontent.com/openclaw/skills/refs/heads/main/skills/ademczuk/skills-weekly/SKILL.md
(Source: yingzhi8.cn skill library)

SKILL.md

Open the original SKILL.md (GitHub raw)

---
name: skills-weekly
description: "OpenClaw Skills Weekly — tracks trending ClawHub skills, generates GitHubAwesome-style YouTube video scripts with two-track ranking (Movers + Rockets)."
emoji: "📊"
user-invocable: true
metadata:
  openclaw:
    version: "1.0.0"
    requires:
      bins: ["python3"]
      env: ["ANTHROPIC_API_KEY"]
    triggers:
      - "skills weekly"
      - "trending skills"
      - "clawhub trending"
      - "skill metrics"
      - "weekly report"
      - "skill snapshot"
      - "openclaw skills"
      - "generate report"
      - "video script"
---

# OpenClaw Skills Weekly

Automated pipeline for tracking trending ClawHub skills and generating YouTube-ready video scripts in the GitHubAwesome format.

## What This Skill Does

1. **ClawHub API Discovery** — Fetches all ~13K skills from `GET https://clawhub.ai/api/v1/skills` with cursor pagination. No auth required.
2. **SQLite Time-Series Snapshots** — Records daily metrics (installs, downloads, stars) to build 7-day velocity history.
3. **Two-Track Ranking** — MOVERS (established skills, 30+ days, ranked by install velocity) and ROCKETS (new skills <30 days, with recency bonus). Author diversity cap prevents one author from dominating.
4. **Content Harvesting** — Fetches documentation and author info from ClawHub detail API for top-ranked skills.
5. **YouTube Script Generation** — Generates GitHubAwesome-style video segments via Claude Haiku: hook-first, technical specs, no popularity metrics, dry newscast tone.
6. **Dual Output** — Markdown report (`.md`) + voice-ready video script (`.txt`).
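Cursor pagination (step 1) can be sketched as a loop that follows `next_cursor` until the API stops returning one. The response shape below (`{"skills": [...], "next_cursor": ...}`) is an assumption for illustration, not the documented ClawHub schema; the demo uses a stubbed fetcher so it runs offline.

```python
# Hypothetical sketch of cursor pagination against GET /api/v1/skills.
# The {"skills": [...], "next_cursor": ...} shape is an assumption.
def fetch_all_skills(fetch_page):
    """fetch_page(cursor) -> dict with "skills" (list) and optional "next_cursor"."""
    skills, cursor = [], None
    while True:
        page = fetch_page(cursor)
        skills.extend(page.get("skills", []))
        cursor = page.get("next_cursor")
        if not cursor:          # no cursor means we consumed the last page
            return skills

# Offline demo with a stubbed two-page API:
pages = {
    None: {"skills": [{"slug": "a"}, {"slug": "b"}], "next_cursor": "c1"},
    "c1": {"skills": [{"slug": "c"}], "next_cursor": None},
}
all_skills = fetch_all_skills(lambda cur: pages[cur])
```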

## Commands

Parse the user's request and route to the correct mode:

| User says | Mode | What happens |
|---|---|---|
| `weekly report` or `full report` or `generate report` | Full Pipeline | Discovery → snapshot → rank → harvest → scripts → output |
| `snapshot` or `daily snapshot` | Snapshot Only | Record ClawHub metrics to DB (no scripts) |
| `trending` or `what's trending` | Quick Trending | Show top 10 from existing DB data |
| `status` or `db status` | Status | Show DB health and snapshot history |
| `video script` or `generate script` | Script Only | Re-generate scripts from last snapshot (no re-fetch) |
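The routing table above amounts to keyword matching. A minimal sketch (the phrase list mirrors the "User says" column; function and mode names are hypothetical, not the skill's actual code) checks longer phrases first so "daily snapshot" wins over "snapshot":

```python
# Hypothetical router for the command table; longest/most-specific phrase first.
ROUTES = [
    ("weekly report", "full"), ("full report", "full"), ("generate report", "full"),
    ("daily snapshot", "snapshot"), ("snapshot", "snapshot"),
    ("what's trending", "trending"), ("trending", "trending"),
    ("db status", "status"), ("status", "status"),
    ("video script", "script"), ("generate script", "script"),
]

def route(user_text):
    text = user_text.lower()
    for phrase, mode in ROUTES:
        if phrase in text:
            return mode
    return None   # no match: fall back to asking the user
```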

## Full Pipeline (Weekly Report)

First, install dependencies if not already present:

```bash
cd "${SKILL_ROOT}" && pip install -r requirements.txt --quiet 2>/dev/null || pip3 install -r requirements.txt --quiet
```

Then run the full pipeline:

```bash
cd "${SKILL_ROOT}" && python3 run_weekly.py --top 10 --episode ${EPISODE_NUM:-1}
```

Replace `${EPISODE_NUM}` with the episode number the user specifies, or default to 1.

If the user says `--skip-x` or doesn't want X/Twitter capture, add `--skip-x`:

```bash
cd "${SKILL_ROOT}" && python3 run_weekly.py --top 10 --skip-x --episode ${EPISODE_NUM:-1}
```

**What this produces:**
- `openclaw_weekly_YYYYMMDD.md` — Data-rich markdown report with metrics, rankings, and scripts
- `openclaw_weekly_YYYYMMDD_script.txt` — Voice-ready video script in GitHubAwesome format

Present both file paths to the user when done.

**Expected output:**
```
============================================================
  OpenClaw Skills Weekly — Full Pipeline (v4)
  Week of Mar 01, 2026
============================================================
  PHASE 1: X/Twitter Signal Capture
  PHASE 2: ClawHub Data Pipeline
  [1/5] Discovering ClawHub skills...
  [2/5] Saving snapshot...
  [3/5] Ranking by 7-day velocity...
  [4/5] Harvesting content...
  [5/5] Generating YouTube scripts...
  DONE:
    Report: openclaw_weekly_20260301.md
    Script: openclaw_weekly_20260301_script.txt
```

## Snapshot Only (Daily Cron)

For daily snapshot accumulation without script generation:

```bash
cd "${SKILL_ROOT}" && python3 run_weekly.py --snapshot-only --skip-x
```

Tell the user how many skills were captured and how many snapshot dates exist in the DB.
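The architecture notes describe storage.py as "slug-scoped dedup", which suggests one row per (slug, date) so that re-running the daily snapshot updates rather than duplicates. A plausible sketch under that assumption (table and column names are illustrative, not the skill's actual schema):

```python
import sqlite3

# Plausible "slug-scoped dedup" snapshot table: PRIMARY KEY (slug, snapshot_date)
# means a second snapshot on the same day overwrites in place.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE snapshots (
        slug TEXT NOT NULL,
        snapshot_date TEXT NOT NULL,
        installs INTEGER,
        PRIMARY KEY (slug, snapshot_date)
    )
""")

def record(slug, day, installs):
    con.execute(
        "INSERT INTO snapshots VALUES (?, ?, ?) "
        "ON CONFLICT(slug, snapshot_date) DO UPDATE SET installs = excluded.installs",
        (slug, day, installs),
    )

record("skills-weekly", "2026-03-01", 10)
record("skills-weekly", "2026-03-01", 12)   # same slug + day: update, not a new row
rows = con.execute("SELECT COUNT(*), MAX(installs) FROM snapshots").fetchone()
```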

## Quick Trending

Show what's trending from existing DB data without re-fetching:

```bash
cd "${SKILL_ROOT}" && python3 main.py --list-db
```

## Status

```bash
cd "${SKILL_ROOT}" && python3 main.py --list-db
```

Shows: DB path, total snapshot rows, distinct dates, top skills by current installs.

## CLI Options Reference

| Flag | Default | Description |
|---|---|---|
| `--top N` | 10 | Number of top movers to include |
| `--days N` | 7 | Trailing days for velocity calculation |
| `--episode N` | 1 | Episode number for video script cold open |
| `--skip-x` | false | Skip X/Twitter signal capture |
| `--snapshot-only` | false | Just record snapshot, no scripts |
| `--max-pages N` | 0 (all) | Limit API pages (for testing) |
| `--model MODEL` | claude-haiku-4-5-20251001 | Anthropic model for script gen |
| `--output FILE` | auto-dated | Custom output file path |
| `--mock` | false | Use synthetic data (offline dev) |
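The `--days N` flag presumably controls the trailing window for the "CTE velocity" that storage.py advertises: velocity = (latest installs − installs N days earlier) / N. A simplified sketch under that assumption, using only two snapshot dates so "earlier" is just the oldest row:

```python
import sqlite3

# Hedged sketch of trailing-window velocity; the skill's real query is unknown.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE snapshots (slug TEXT, snapshot_date TEXT, installs INTEGER)")
con.executemany("INSERT INTO snapshots VALUES (?, ?, ?)", [
    ("alpha", "2026-02-22", 100), ("alpha", "2026-03-01", 170),
    ("beta",  "2026-02-22", 50),  ("beta",  "2026-03-01", 57),
])

def velocity(days):
    # CTEs pick the newest and oldest snapshot per slug, then diff and scale.
    return con.execute("""
        WITH latest AS (
            SELECT slug, installs FROM snapshots
            WHERE snapshot_date = (SELECT MAX(snapshot_date) FROM snapshots)
        ),
        earlier AS (
            SELECT slug, installs FROM snapshots
            WHERE snapshot_date = (SELECT MIN(snapshot_date) FROM snapshots)
        )
        SELECT l.slug, (l.installs - e.installs) * 1.0 / ?
        FROM latest l JOIN earlier e USING (slug)
        ORDER BY 2 DESC
    """, (days,)).fetchall()

movers = velocity(7)   # installs gained per day over the trailing week
```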

## Environment Variables

| Variable | Required | Description |
|---|---|---|
| `ANTHROPIC_API_KEY` | Yes | For YouTube script generation via Claude |
| `GITHUB_TOKEN` | No | For fetching source READMEs from GitHub |
| `XAI_API_KEY` | No | For X/Twitter signal capture via xAI |
| `CLAWHUB_BASE_URL` | No | Override ClawHub URL (default: https://clawhub.ai) |
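The evaluation above notes that without ANTHROPIC_API_KEY the pipeline skips the LLM step, i.e. each optional capability is gated on its variable being set. A minimal sketch of that gating (the function and feature names are illustrative, not the skill's actual code):

```python
import os

# Illustrative capability gating: a feature is enabled only if its env var is set.
def enabled_features(env=os.environ):
    return {
        "llm_scripts": bool(env.get("ANTHROPIC_API_KEY")),
        "github_readmes": bool(env.get("GITHUB_TOKEN")),
        "x_capture": bool(env.get("XAI_API_KEY")),
    }

features = enabled_features({"ANTHROPIC_API_KEY": "sk-test"})
```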

## Video Script Format

Scripts follow the GitHubAwesome "GitHub Trending Weekly" format:
- **Cold open**: "It is time for OpenClaw Skills Weekly, episode number N..."
- **Per-skill segments** (~20 sec each): Hook first → technical specs → sharp closer
- **No popularity metrics** in narration (no download/install/star counts)
- **Dry, confident newscast tone** — no hype, no superlatives
- **No outro** — last item ends the episode

## Architecture

```
run_weekly.py          # Full pipeline orchestrator
main.py                # Alternative CLI with --list-db
discovery.py           # ClawHub API cursor pagination (~13K skills)
storage.py             # SQLite time-series (slug-scoped dedup, CTE velocity)
ranker.py              # Two-track: Movers + Rockets, author diversity cap
harvester.py           # ClawHub detail API content + author extraction
script_generator.py    # LLM script gen + markdown + video script rendering
community_signals.py   # X/Twitter signal loading and rendering
x_capture.py           # xAI x_search API integration
data/metrics.db        # SQLite database (auto-created)
```
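ranker.py's "author diversity cap" could plausibly be a greedy pass over the velocity-sorted list that takes at most N skills per author; this is a hypothetical sketch of that idea, not the actual implementation:

```python
# Hypothetical greedy diversity cap: walk the list (already sorted by velocity,
# descending) and keep at most `cap` skills from any single author.
def apply_diversity_cap(ranked, cap=2):
    per_author, picked = {}, []
    for skill in ranked:
        author = skill["author"]
        if per_author.get(author, 0) < cap:
            picked.append(skill)
            per_author[author] = per_author.get(author, 0) + 1
    return picked

ranked = [
    {"slug": "s1", "author": "a"}, {"slug": "s2", "author": "a"},
    {"slug": "s3", "author": "a"}, {"slug": "s4", "author": "b"},
]
top = apply_diversity_cap(ranked, cap=2)   # s3 is skipped: author "a" hit the cap
```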