chromadb-memory

Long-term memory via ChromaDB with local Ollama embeddings. Auto-recall injects relevant context every turn. No cloud APIs required — fully self-hosted.

Safety Notice

This listing is from the official public ClawHub registry. Review SKILL.md and referenced scripts before running.

Install via the skills CLI:

    npx skills add msensintaffar/chromadb-memory

ChromaDB Memory

Long-term semantic memory backed by ChromaDB and local Ollama embeddings. Zero cloud dependencies.

What It Does

  • Auto-recall: Before every agent turn, queries ChromaDB with the user's message and injects relevant context automatically
  • chromadb_search tool: Manual semantic search over your ChromaDB collection
  • 100% local: Ollama (nomic-embed-text) for embeddings, ChromaDB for vector storage

Prerequisites

  1. ChromaDB running (Docker recommended):

    docker run -d --name chromadb -p 8100:8000 chromadb/chroma:latest
    
  2. Ollama with an embedding model:

    ollama pull nomic-embed-text
    
  3. Indexed documents in ChromaDB. Use any ChromaDB-compatible indexer to populate your collection.
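If you have no indexer handy, a minimal population script might look like the sketch below. It assumes ChromaDB's v1 REST API (`/api/v1/collections/{uuid}/add`) and Ollama's `/api/embeddings` endpoint; verify both against your installed versions. `embed`, `buildAddPayload`, and `indexDocs` are illustrative names, not part of this plugin.

```typescript
// Sketch: index documents into ChromaDB using Ollama embeddings.
// Endpoint paths are assumptions; check your ChromaDB/Ollama versions.

const CHROMA_URL = "http://localhost:8100";
const OLLAMA_URL = "http://localhost:11434";

async function embed(text: string): Promise<number[]> {
  const res = await fetch(`${OLLAMA_URL}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  const { embedding } = await res.json();
  return embedding; // 768-dimensional vector for nomic-embed-text
}

// Build the payload shape ChromaDB's collection "add" endpoint expects.
function buildAddPayload(
  docs: { id: string; text: string }[],
  embeddings: number[][],
) {
  return {
    ids: docs.map((d) => d.id),
    documents: docs.map((d) => d.text),
    embeddings,
  };
}

async function indexDocs(collectionUuid: string, docs: { id: string; text: string }[]) {
  const embeddings = await Promise.all(docs.map((d) => embed(d.text)));
  await fetch(`${CHROMA_URL}/api/v1/collections/${collectionUuid}/add`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildAddPayload(docs, embeddings)),
  });
}
```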

Install

# 1. Copy the plugin extension
mkdir -p ~/.openclaw/extensions/chromadb-memory
cp {baseDir}/scripts/index.ts ~/.openclaw/extensions/chromadb-memory/
cp {baseDir}/scripts/openclaw.plugin.json ~/.openclaw/extensions/chromadb-memory/

# 2. Add to your OpenClaw config (~/.openclaw/openclaw.json):
{
  "plugins": {
    "entries": {
      "chromadb-memory": {
        "enabled": true,
        "config": {
          "chromaUrl": "http://localhost:8100",
          "collectionName": "longterm_memory",
          "ollamaUrl": "http://localhost:11434",
          "embeddingModel": "nomic-embed-text",
          "autoRecall": true,
          "autoRecallResults": 3,
          "minScore": 0.5
        }
      }
    }
  }
}
# 3. Restart the gateway
openclaw gateway restart

Config Options

| Option | Default | Description |
|---|---|---|
| chromaUrl | http://localhost:8100 | ChromaDB server URL |
| collectionName | longterm_memory | Collection name (auto-resolves UUID, survives reindexing) |
| collectionId | (none) | Collection UUID (optional fallback) |
| ollamaUrl | http://localhost:11434 | Ollama API URL |
| embeddingModel | nomic-embed-text | Ollama embedding model |
| autoRecall | true | Auto-inject relevant memories each turn |
| autoRecallResults | 3 | Max auto-recall results per turn |
| minScore | 0.5 | Minimum similarity score (0-1) |
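Because the plugin resolves `collectionName` to a UUID at startup, a reindex that drops and recreates the collection does not break recall. A sketch of that resolution, assuming ChromaDB's v1 `/api/v1/collections` listing endpoint (`findCollection` and `resolveCollectionId` are illustrative helpers, not the plugin's actual code):

```typescript
// Sketch: resolve a collection name to its current UUID, so recall
// keeps working after a reindex recreates the collection.
interface ChromaCollection {
  id: string;   // UUID; changes whenever the collection is recreated
  name: string; // stable, human-chosen name
}

function findCollection(
  collections: ChromaCollection[],
  name: string,
): ChromaCollection | undefined {
  return collections.find((c) => c.name === name);
}

async function resolveCollectionId(chromaUrl: string, name: string): Promise<string> {
  const res = await fetch(`${chromaUrl}/api/v1/collections`);
  const match = findCollection(await res.json(), name);
  if (!match) throw new Error(`ChromaDB collection "${name}" not found`);
  return match.id;
}
```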

How It Works

  1. You send a message
  2. Plugin embeds your message via Ollama (nomic-embed-text, 768 dimensions)
  3. Queries ChromaDB for nearest neighbors
  4. Results above minScore are injected into the agent's context as <chromadb-memories>
  5. Agent responds with relevant long-term context available
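Steps 3–4 reduce to two small helpers, sketched below. The `1 - distance` conversion assumes ChromaDB returns cosine distances; the plugin's real scoring and separator formatting may differ.

```typescript
// Sketch of auto-recall post-processing. Assumes cosine distances
// from ChromaDB (0 = identical); the actual plugin may score differently.

interface RecallHit {
  document: string;
  distance: number;
}

// Convert distance to a 0-1 similarity score and keep the best hits.
function filterHits(hits: RecallHit[], minScore: number, maxResults: number): string[] {
  return hits
    .map((h) => ({ doc: h.document, score: 1 - h.distance }))
    .filter((h) => h.score >= minScore)
    .sort((a, b) => b.score - a.score)
    .slice(0, maxResults)
    .map((h) => h.doc);
}

// Wrap surviving memories in the tag the agent sees in its context.
function injectContext(memories: string[]): string {
  if (memories.length === 0) return "";
  return `<chromadb-memories>\n${memories.join("\n---\n")}\n</chromadb-memories>`;
}
```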

Token Cost

Auto-recall adds ~275 tokens per turn worst case (3 results × ~300 chars + wrapper). Against a 200K+ context window, this is negligible.
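That figure follows from the common ~4 characters-per-token heuristic (an approximation; real tokenizer counts vary, and the ~50-token wrapper overhead is an assumed value):

```typescript
// Worst-case auto-recall overhead, using a ~4 chars/token heuristic.
const results = 3;            // autoRecallResults default
const charsPerResult = 300;   // typical memory snippet length
const wrapperTokens = 50;     // assumed <chromadb-memories> tag overhead

const resultTokens = Math.ceil((results * charsPerResult) / 4); // 225
const totalTokens = resultTokens + wrapperTokens;
console.log(totalTokens); // 275
```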

Tuning

  • Too noisy? Raise minScore to 0.6 or 0.7
  • Missing context? Lower minScore to 0.4, increase autoRecallResults to 5
  • Want manual only? Set autoRecall: false, use chromadb_search tool
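For example, combining a stricter threshold with manual-only search changes just two keys in the Install section's config (values here are illustrative):

```json
{
  "plugins": {
    "entries": {
      "chromadb-memory": {
        "enabled": true,
        "config": {
          "autoRecall": false,
          "minScore": 0.6
        }
      }
    }
  }
}
```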

Architecture

User Message → Ollama (embed) → ChromaDB (query) → Context Injection
                                                  ↓
                                          Agent Response

No OpenAI. No cloud. Your memories stay on your hardware.

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.

Related Skills

Related by shared tags or category signals.


OpenClaw Advanced Memory

Provides persistent, searchable AI agent memory with real-time capture, vector search, and nightly LLM curation for long-term recall on local hardware.


Windows Local Embedding

Use when setting up local embedding / local memory retrieval for OpenClaw on Windows. Covers: downloading and wiring up `nomic-embed-text-v1.5.Q8_0.gguf`, switching `memorySearch.provider` to `local`, and checking `openclaw memory status...


River Memory

Store and semantically search text memories locally using Ollama with automatic management and optimization.
