OpenClaw



Agent memory with ALMA meta-learning, LLM fact extraction, and full-text search. Observer calls remote LLM APIs (OpenAI/Anthropic/Gemini). ALMA and Indexer w...

Development & DevOps

Author: Artale @arosstale

License: MIT-0

MIT-0 · Free to use, modify, and redistribute. No attribution required.

Version: v2.1.0

Stats: ⭐ 0 · 284 · 0 current installs · 0 all-time installs


🛡 VirusTotal: benign · OpenClaw: suspicious

Package: arosstale/openclaw-memories

Security scan (ClawHub)

  • VirusTotal: benign
  • OpenClaw: suspicious

OpenClaw Assessment

The skill appears to implement the described memory components, but the registry metadata does not declare the Observer's required LLM API keys, and the skill will send conversation text to third-party LLM endpoints. Understand this inconsistency before installing.

Purpose

Name/description (ALMA, Observer, Indexer) match the included code: ALMA evolves designs offline, Indexer reads workspace Markdown, and Observer calls remote LLM APIs. The code and README are consistent with the stated purpose.

Stated scope

SKILL.md and the code limit scope to: (a) an offline ALMA and Indexer that use an in-memory DB and workspace Markdown files, and (b) an Observer that sends sanitized conversation content to OpenAI/Anthropic/Gemini endpoints. The Observer transmits conversation content to external LLMs (expected for extraction); this is functionally appropriate but has privacy/exfiltration implications that users should consider.

Install mechanism

No install spec in registry (instruction-only skill). package.json and source are present, but there are no runtime dependencies and no external download/install steps that would write arbitrary code to disk beyond a normal npm install. This is low risk from an installer perspective.

Credentials

SKILL.md and observer.ts require an LLM API key (OPENAI_API_KEY, ANTHROPIC_API_KEY, or apiKey in config), but the registry metadata lists no required environment variables or primary credential. The Observer legitimately needs an API key, so the metadata omission is an incoherence and a risk: users may not be warned that conversational data can be sent to third‑party LLMs and that keys are necessary.

Persistence

`always` is false; the skill does not request permanent platform presence or modify other skills/config. The Indexer reads workspace files but uses an in-memory DB and does not write persistent system-wide config. No elevated privileges are requested.

Install (copy to the 龙虾 AI)

Copy the entire block below into a 龙虾中文库 conversation; 龙虾 will complete the installation per SKILL.md.

Please hand this block to 龙虾中文库 (龙虾 AI) for execution: install the OpenClaw skill "Openclaw Memory" on this machine. Summary: Agent memory with ALMA meta-learning, LLM fact extraction, and full-text search….
Please fetch the following URL, read SKILL.md, and complete the installation as documented: https://raw.githubusercontent.com/openclaw/skills/refs/heads/main/skills/arosstale/openclaw-memories/SKILL.md
(Source: yingzhi8.cn skill library)

SKILL.md

Open the original SKILL.md (GitHub raw)

---
name: openclaw-memories
description: Agent memory with ALMA meta-learning, LLM fact extraction, and full-text search. Observer calls remote LLM APIs (OpenAI/Anthropic/Gemini). ALMA and Indexer work offline.
---

# OpenClaw Memory System

Three components for agent memory:

1. **ALMA** — Evolves memory designs through mutation + evaluation (offline)
2. **Observer** — Extracts structured facts from conversations via LLM API (requires API key)
3. **Indexer** — Full-text search over workspace Markdown files (offline)

## Environment Variables

Observer requires one of:
- `OPENAI_API_KEY`
- `ANTHROPIC_API_KEY`
- Or pass `apiKey` in config

ALMA and Indexer require no keys or network access.
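One plausible lookup order for these key sources can be sketched as follows. The function name and the precedence (explicit config key over environment variables) are illustrative assumptions, not the package's actual code:

```typescript
// Sketch of Observer API-key resolution. The precedence (config apiKey first,
// then either env var) is an assumption for illustration only.
type Env = Record<string, string | undefined>;

function resolveApiKey(
  env: Env,
  config: { apiKey?: string } = {}
): string | undefined {
  // An explicit config apiKey wins; otherwise fall back to either env var.
  return config.apiKey ?? env.OPENAI_API_KEY ?? env.ANTHROPIC_API_KEY;
}
```

If no key resolves, the Observer cannot run; ALMA and the Indexer are unaffected.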

## How It Works

### ALMA (Algorithm Learning via Meta-learning Agents)
Proposes memory system designs, evaluates them, and keeps the best. Uses Gaussian mutation and simulated annealing to explore the design space.

```
alma.propose() → design
alma.evaluate(design.id, metrics) → score  
alma.best() → top design
alma.top(5) → leaderboard
```
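The propose/evaluate loop above can be sketched as Gaussian-mutation search with simulated-annealing acceptance. This is a self-contained illustration of the technique named in the description, not the package's implementation; all names and the toy objective are assumptions:

```typescript
// Sketch of ALMA-style search: Gaussian mutation + simulated-annealing
// acceptance over a parameter vector. Illustrative only.
type Design = { params: number[]; score: number };

// Box–Muller transform: one standard-normal sample.
function gaussian(): number {
  const u = Math.random() || 1e-12;
  const v = Math.random();
  return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

function mutate(d: Design, sigma: number): number[] {
  return d.params.map((p) => p + sigma * gaussian());
}

// Toy objective standing in for real evaluation metrics (maximized at 0).
function evaluate(params: number[]): number {
  return -params.reduce((s, p) => s + p * p, 0);
}

function anneal(start: number[], steps: number): Design {
  let current: Design = { params: start, score: evaluate(start) };
  let best = current;
  for (let i = 0; i < steps; i++) {
    const temp = 1 - i / steps; // linear cooling schedule
    const candidate = mutate(current, 0.1);
    const score = evaluate(candidate);
    const delta = score - current.score;
    // Always accept improvements; accept worse designs with annealing probability.
    if (delta > 0 || Math.random() < Math.exp(delta / Math.max(temp, 1e-9))) {
      current = { params: candidate, score };
    }
    if (current.score > best.score) best = current;
  }
  return best;
}
```

In the real skill, `alma.evaluate(design.id, metrics)` presumably supplies real metrics where the toy `evaluate` stands in here.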

### Observer
Sends conversation history to an LLM, gets back structured facts:
- Kind: world fact / biographical / opinion / observation
- Priority: high / medium / low
- Entities: mentioned people/places
- Confidence: 0.0–1.0 for opinions

Fails gracefully — returns empty array if LLM is unavailable.
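That contract (any LLM failure yields an empty list, never an exception) can be sketched as a thin wrapper. The `Fact` shape mirrors the fields listed above, but the function name and prompt are illustrative assumptions:

```typescript
// Sketch of the Observer's graceful-failure contract: any LLM error or
// malformed response yields [] instead of throwing. Illustrative only.
type Fact = {
  kind: "world" | "biographical" | "opinion" | "observation";
  priority: "high" | "medium" | "low";
  entities: string[];
  confidence?: number; // 0.0–1.0, opinions only
  text: string;
};

async function extractFacts(
  conversation: string,
  callLLM: (prompt: string) => Promise<string>
): Promise<Fact[]> {
  try {
    const raw = await callLLM(`Extract structured facts as JSON:\n${conversation}`);
    const parsed = JSON.parse(raw);
    return Array.isArray(parsed) ? (parsed as Fact[]) : [];
  } catch {
    return []; // LLM unavailable or bad JSON: empty array, never a throw
  }
}
```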

### Indexer
Chunks workspace Markdown files and indexes them for search:
- `MEMORY.md` — core facts
- `memory/YYYY-MM-DD.md` — daily logs
- `bank/entities/*.md` — entity summaries
- `bank/opinions.md` — beliefs with confidence

```
indexer.index() → count of chunks indexed
indexer.search('query') → ranked results
indexer.rebuild() → re-index from scratch
```
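A minimal sketch of the chunk-then-rank idea behind this API: split Markdown into paragraph chunks and rank by term overlap. The ranking here is deliberately crude (the skill itself notes its ranking is simplified, not FTS5), and all names are illustrative:

```typescript
// Sketch of Indexer-style chunking and search: paragraph chunks ranked
// by matched query terms. Illustrative, not the package's implementation.
type Chunk = { file: string; text: string };

function chunkMarkdown(file: string, content: string): Chunk[] {
  return content
    .split(/\n\s*\n/) // paragraph boundaries
    .map((text) => ({ file, text: text.trim() }))
    .filter((c) => c.text.length > 0);
}

function search(chunks: Chunk[], query: string): Chunk[] {
  const terms = query.toLowerCase().split(/\s+/);
  return chunks
    .map((c) => ({
      chunk: c,
      score: terms.filter((t) => c.text.toLowerCase().includes(t)).length,
    }))
    .filter((r) => r.score > 0)
    .sort((a, b) => b.score - a.score)
    .map((r) => r.chunk);
}
```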

## Install

```bash
npm install @artale/openclaw-memory
```

## Limitations

- Indexer uses an in-memory mock database, not real SQLite FTS5. Search works but ranking is simplified.
- Observer calls remote APIs — not offline. Only ALMA and Indexer work without network.
- No dashboard — removed in v2 for simplicity.

## Source

5 files, 578 lines, 0 runtime dependencies.

https://github.com/arosstale/openclaw-memory