OpenClaw

Skill details (site mirror, no comments)


OpenClaw Skills Weekly — tracks trending ClawHub skills, generates GitHubAwesome-style YouTube video scripts with two-track ranking (Movers + Rockets).

Media & Content

License: MIT-0

MIT-0 · Free to use, modify, and redistribute. No attribution required.

Version: v1.3.0

Stats: ⭐ 0 · 253 · 0 current installs · 0 all-time installs


🛡 VirusTotal: suspicious · OpenClaw: suspicious

Package: ademczuk/skillmetricscraper

Security scan (ClawHub)

  • VirusTotal: suspicious
  • OpenClaw: suspicious

OpenClaw Evaluation

The skill mostly does what it says (fetch ClawHub, rank, generate scripts), but its runtime requirements and instructions are inconsistent, and it can access extra credentials (via the gh CLI) and system resources that are not declared; review before installing.

Purpose

The skill's stated purpose (ClawHub discovery, ranking, script generation) matches the code: discovery, ranking, harvesting, and Anthropic-based script generation are all implemented. However, the declared registry metadata is incomplete: the package invokes the GitHub CLI ('gh') via subprocess in project_tracker.py and expects pip packages (httpx, dotenv, etc.), but the registry's required binaries list only python3 and there is no install spec. That mismatch leaves real dependencies undeclared.

Stated Scope

Runtime instructions and code perform network operations across multiple external services: the ClawHub API, GitHub (API/raw), Anthropic (Claude via ANTHROPIC_API_KEY), and community sources (X/Twitter, Reddit). project_tracker.py calls the 'gh' CLI without declaring it or passing an explicit token, which means it will use any gh-authenticated user credentials on the host. The skill also writes a local SQLite DB and output files under the user's home directory or a container workspace.

Installation Mechanism

The registry lists no install spec (instruction-only), but SKILL.md instructs `pip install -r requirements.txt`, and the repository contains requirements.txt plus multiple Python modules that import third-party libraries (httpx, python-dotenv, and likely the anthropic client). There is no automated vetting or pinned-install mechanism declared; this manual-install pattern increases risk if users install packages without review. No downloads from untrusted sources were observed.

Credentials

The declared required env var (ANTHROPIC_API_KEY) is proportionate for LLM-based script generation. Optional env vars (GITHUB_TOKEN, XAI_API_KEY, CLAWHUB_BASE_URL) are reasonable. However, project_tracker uses the 'gh' CLI, which will use any gh auth the host user has configured (without needing the GITHUB_TOKEN env var), effectively giving the skill access to host GitHub credentials not declared in requires.env. That implicit credential usage is disproportionate to what the metadata declares.

Persistence

The skill does create and write a local SQLite DB and output files under per-user data directories (or a container workspace). It does not request 'always: true', does not modify other skills' configs, and does not request elevated OS privileges. File writes are expected given the purpose.

Installation (copy to 龙虾 AI)

Copy the entire block below into a 龙虾中文库 conversation; 龙虾 will complete the installation per SKILL.md.

Hand this block to 龙虾中文库 (龙虾 AI) to execute: install the OpenClaw skill "SkillMetricScraper" on this machine. Summary: OpenClaw Skills Weekly — tracks trending ClawHub skills, generates GitHubAwesom….
Please fetch the following URL to read SKILL.md and complete the installation as documented: https://raw.githubusercontent.com/openclaw/skills/refs/heads/main/skills/ademczuk/skillmetricscraper/SKILL.md
(Source: yingzhi8.cn skill library)

SKILL.md


---
name: skills-weekly
description: "OpenClaw Skills Weekly — tracks trending ClawHub skills, generates GitHubAwesome-style YouTube video scripts with two-track ranking (Movers + Rockets)."
emoji: "📊"
user-invocable: true
metadata:
  openclaw:
    version: "1.2.0"
    requires:
      bins: ["python3"]
      env: ["ANTHROPIC_API_KEY"]
    triggers:
      - "skills weekly"
      - "trending skills"
      - "clawhub trending"
      - "skill metrics"
      - "weekly report"
      - "skill snapshot"
      - "openclaw skills"
      - "generate report"
      - "video script"
---

# OpenClaw Skills Weekly

Automated pipeline for tracking trending ClawHub skills and generating YouTube-ready video scripts in the GitHubAwesome format.

## What This Skill Does

1. **ClawHub API Discovery** — Fetches all ~13K skills from `GET https://clawhub.ai/api/v1/skills` with cursor pagination. No auth required.
2. **SQLite Time-Series Snapshots** — Records daily metrics (installs, downloads, stars) to build 7-day velocity history.
3. **Two-Track Ranking** — MOVERS (established skills, 30+ days, ranked by install velocity) and ROCKETS (new skills <30 days, with recency bonus). Author diversity cap prevents one author from dominating.
4. **Content Harvesting** — Fetches documentation and author info from ClawHub detail API for top-ranked skills.
5. **YouTube Script Generation** — Generates GitHubAwesome-style video segments via Claude Haiku: hook-first, technical specs, no popularity metrics, dry newscast tone.
6. **Dual Output** — Markdown report (`.md`) + voice-ready video script (`.txt`).
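The cursor-pagination loop in step 1 can be sketched as follows. The `items` and `cursor` response field names are assumptions about the ClawHub payload, not confirmed API details, and the injectable `fetch` hook exists only so the loop can be exercised offline:

```python
import json
from urllib.request import urlopen


def discover_skills(base="https://clawhub.ai", fetch=None):
    """Yield every skill by following the response cursor until exhausted."""
    if fetch is None:
        # Default network path: GET each page and parse the JSON body.
        fetch = lambda url: json.load(urlopen(url))
    cursor = None
    while True:
        url = f"{base}/api/v1/skills" + (f"?cursor={cursor}" if cursor else "")
        page = fetch(url)
        yield from page.get("items", [])
        cursor = page.get("cursor")
        if not cursor:  # no cursor in the last page ends the walk
            break
```

With ~13K skills this walk is a few dozen requests, so a small delay between pages would be a polite addition in practice.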

## Commands

Parse the user's request and route to the correct mode:

| User says | Mode | What happens |
|---|---|---|
| `weekly report` or `full report` or `generate report` | Full Pipeline | Discovery → snapshot → rank → harvest → scripts → output |
| `snapshot` or `daily snapshot` | Snapshot Only | Record ClawHub metrics to DB (no scripts) |
| `trending` or `what's trending` | Quick Trending | Show top 10 from existing DB data |
| `status` or `db status` | Status | Show DB health and snapshot history |
| `video script` or `generate script` | Script Only | Re-generate scripts from last snapshot (no re-fetch) |
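The routing table above is normally interpreted by the agent itself; as a sketch, a hypothetical substring dispatcher (names and defaulting behavior are assumptions, not the skill's code) could look like:

```python
# Phrase lists mirror the command table; first matching mode wins.
MODES = {
    "full":     ("weekly report", "full report", "generate report"),
    "snapshot": ("snapshot", "daily snapshot"),
    "trending": ("trending", "what's trending"),
    "status":   ("status", "db status"),
    "script":   ("video script", "generate script"),
}


def route(request: str) -> str:
    """Map a free-form user request onto one of the five pipeline modes."""
    text = request.lower()
    for mode, phrases in MODES.items():
        if any(p in text for p in phrases):
            return mode
    return "full"  # assumed default when nothing matches
```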

## Full Pipeline (Weekly Report)

First, install dependencies if not already present:

```bash
cd "${SKILL_ROOT}" && pip install -r requirements.txt --quiet 2>/dev/null || pip3 install -r requirements.txt --quiet
```

Then run the full pipeline:

```bash
cd "${SKILL_ROOT}" && python3 run_weekly.py --top 10 --episode ${EPISODE_NUM:-1}
```

Replace `${EPISODE_NUM}` with the episode number the user specifies, or default to 1.

If the user says `--skip-x` or doesn't want X/Twitter capture, add `--skip-x`:

```bash
cd "${SKILL_ROOT}" && python3 run_weekly.py --top 10 --skip-x --episode ${EPISODE_NUM:-1}
```

**What this produces:**
- `openclaw_weekly_YYYYMMDD.md` — Data-rich markdown report with metrics, rankings, and scripts
- `openclaw_weekly_YYYYMMDD_script.txt` — Voice-ready video script in GitHubAwesome format

Present both file paths to the user when done.

**Expected output:**
```
============================================================
  OpenClaw Skills Weekly — Full Pipeline (v4)
  Week of Mar 01, 2026
============================================================
  PHASE 1: Multi-Source Signal Capture (X + Reddit + HN)
  PHASE 2: ClawHub Data Pipeline
  [1/5] Discovering ClawHub skills...
  [2/5] Saving snapshot...
  [3/5] Ranking by 7-day velocity...
  [4/5] Harvesting content...
  [5/5] Generating YouTube scripts...
  DONE:
    Report: openclaw_weekly_20260301.md
    Script: openclaw_weekly_20260301_script.txt
```

## Snapshot Only (Daily Cron)

For daily snapshot accumulation without script generation:

```bash
cd "${SKILL_ROOT}" && python3 run_weekly.py --snapshot-only --skip-x
```

Tell the user how many skills were captured and how many snapshot dates exist in the DB.
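For unattended daily accumulation, the command above can be driven by a crontab entry like the following; the 06:00 schedule and the install path are placeholders, not values taken from the skill:

```
# Placeholder path -- point this at the skill's actual checkout.
SKILL_ROOT=/path/to/skills/skillmetricscraper
# Record a ClawHub snapshot every day at 06:00, appending output to a log.
0 6 * * * cd "$SKILL_ROOT" && python3 run_weekly.py --snapshot-only --skip-x >> "$SKILL_ROOT/snapshot.log" 2>&1
```

Note that cron runs with a minimal environment, so `ANTHROPIC_API_KEY` is not needed for `--snapshot-only`, but any optional variables must be set in the crontab itself.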

## Quick Trending

Show what's trending from existing DB data without re-fetching:

```bash
cd "${SKILL_ROOT}" && python3 main.py --list-db
```

## Status

```bash
cd "${SKILL_ROOT}" && python3 main.py --list-db
```

Shows: DB path, total snapshot rows, distinct dates, top skills by current installs.

## CLI Options Reference

| Flag | Default | Description |
|---|---|---|
| `--top N` | 10 | Number of top movers to include |
| `--days N` | 7 | Trailing days for velocity calculation |
| `--episode N` | 1 | Episode number for video script cold open |
| `--skip-x` | false | Skip X/Twitter signal capture |
| `--snapshot-only` | false | Just record snapshot, no scripts |
| `--max-pages N` | 0 (all) | Limit API pages (for testing) |
| `--model MODEL` | claude-haiku-4-5-20251001 | Anthropic model for script gen |
| `--output FILE` | auto-dated | Custom output file path |
| `--mock` | false | Use synthetic data (offline dev) |

## Environment Variables

| Variable | Required | Description |
|---|---|---|
| `ANTHROPIC_API_KEY` | Yes | For YouTube script generation via Claude |
| `GITHUB_TOKEN` | No | For fetching source READMEs from GitHub |
| `XAI_API_KEY` | No | For X/Twitter signal capture via xAI |
| `CLAWHUB_BASE_URL` | No | Override ClawHub URL (default: https://clawhub.ai) |
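A preflight check along these lines can surface missing configuration before a run. The function is a sketch, not part of the skill, and the claim that optional features are simply skipped is inferred from the table above:

```python
import os

OPTIONAL = ("GITHUB_TOKEN", "XAI_API_KEY", "CLAWHUB_BASE_URL")


def preflight(env=os.environ) -> list[str]:
    """Return warnings for unset variables; only ANTHROPIC_API_KEY is fatal."""
    warnings = []
    if not env.get("ANTHROPIC_API_KEY"):
        warnings.append("ANTHROPIC_API_KEY missing: script generation will fail")
    for name in OPTIONAL:
        if not env.get(name):
            warnings.append(f"{name} unset: the related feature is unavailable")
    return warnings
```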

## Video Script Format

Scripts follow the GitHubAwesome "GitHub Trending Weekly" format:
- **Cold open**: "It is time for OpenClaw Skills Weekly, episode number N..."
- **Per-skill segments** (~20 sec each): Hook first → technical specs → sharp closer
- **No popularity metrics** in narration (no download/install/star counts)
- **Dry, confident newscast tone** — no hype, no superlatives
- **No outro** — last item ends the episode

## Architecture

```
run_weekly.py          # Full pipeline orchestrator
main.py                # Alternative CLI with --list-db
discovery.py           # ClawHub API cursor pagination (~13K skills)
storage.py             # SQLite time-series (slug-scoped dedup, CTE velocity)
ranker.py              # Two-track: Movers + Rockets, author diversity cap
harvester.py           # ClawHub detail API content + author extraction
script_generator.py    # LLM script gen + markdown + video script rendering
community_signals.py   # Multi-source signal loading and rendering
x_capture.py           # xAI x_search API integration
reddit_capture.py      # Reddit + Hacker News signal capture
hourly_heartbeat.py    # Top-500 hourly snapshot + project metadata
project_tracker.py     # OpenClaw GitHub repo metadata tracking
data/metrics.db        # SQLite database (auto-created)
```
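The "7-day velocity" that storage.py computes can be illustrated with a self-contained SQLite query. The table name, columns, and dates below are assumptions made for the sketch, not the skill's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Toy time-series: one row per (skill, snapshot date).
conn.executescript("""
CREATE TABLE snapshots (slug TEXT, date TEXT, installs INTEGER);
INSERT INTO snapshots VALUES
  ('skill-a', '2026-02-22', 100), ('skill-a', '2026-03-01', 160),
  ('skill-b', '2026-02-22', 500), ('skill-b', '2026-03-01', 510);
""")

# Velocity = install delta between the snapshot 7 days ago and today.
rows = conn.execute("""
    WITH prev AS (SELECT slug, installs FROM snapshots WHERE date = '2026-02-22'),
         curr AS (SELECT slug, installs FROM snapshots WHERE date = '2026-03-01')
    SELECT curr.slug, curr.installs - prev.installs AS velocity
    FROM curr JOIN prev ON prev.slug = curr.slug
    ORDER BY velocity DESC
""").fetchall()
print(rows)  # [('skill-a', 60), ('skill-b', 10)]
```

Under this framing a small skill gaining 60 installs outranks a large skill gaining 10, which is what lets the MOVERS track surface momentum rather than absolute size.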