deep-memory

One-click clone of a production-grade semantic memory system: HOT/WARM/COLD tiered storage + Qdrant vector DB + Neo4j graph DB + qwen3-embedding. Enables cross-session semantic retrieval and entity relationship memory for AI agents.

Safety Notice

This listing is from the official public ClawHub registry. Review SKILL.md and referenced scripts before running.

Copy the command below and send it to your AI assistant to install the skill:

Install skill "deep-memory" with this command: npx skills add halfmoon82/deep-memory

Deep Memory Skill 🧠

A production-grade semantic memory system for AI agents. Combines tiered file storage with vector search and graph relationships.

Architecture

┌─────────────────────────────────────┐
│        File Layer (always-on)       │
│  HOT / WARM / COLD Markdown files   │
│  semantic_memory.json               │
└──────────────────┬──────────────────┘
                   ↓
┌─────────────────────────────────────┐
│       Vector Layer (Docker)         │
│  Qdrant: semantic similarity search │
│  Collection: semantic_memories      │
│  Dimensions: 4096 (qwen3-embedding) │
└──────────────────┬──────────────────┘
                   ↓
┌─────────────────────────────────────┐
│        Graph Layer (Docker)         │
│  Neo4j: entity relationship memory  │
│  Constraints: Memory.key, Entity.id │
└──────────────────┬──────────────────┘
                   ↓
┌─────────────────────────────────────┐
│       Embedding Model (Ollama)      │
│  qwen3-embedding:8b (4096 dims)     │
│  Local, free, no API calls          │
└─────────────────────────────────────┘
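The listing does not show the record schema the layers share; a minimal sketch of what one memory record might carry across the tiers (field names here are illustrative assumptions, not the skill's actual schema):

```python
from dataclasses import dataclass, field

# Hypothetical record shape for illustration only; the real schema lives in
# semantic_memory.json and the skill's Python toolkit.
@dataclass
class MemoryRecord:
    key: str                                      # unique key (Neo4j enforces Memory.key uniqueness)
    content: str                                  # memory text mirrored into the Markdown tier files
    tier: str = "HOT"                             # HOT / WARM / COLD file tier
    tags: list[str] = field(default_factory=list)
    embedding_dim: int = 4096                     # Qdrant vector size (qwen3-embedding:8b)

rec = MemoryRecord(key="user_sir",
                   content="Sir prefers direct communication",
                   tags=["preference"])
```

Each layer then indexes a different facet of the same record: files hold the text, Qdrant the 4096-dim vector, Neo4j the entity links.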

Prerequisites

  • Docker Desktop (running)
  • Ollama installed (brew install ollama on macOS)

Usage

Setup (first time)

python3 ~/.openclaw/workspace/skills/deep-memory/scripts/setup.py

Write a memory

from deep_memory import MemorySystem
mem = MemorySystem()
mem.store("user_sir", "Sir prefers direct communication, no pleasantries", tags=["preference", "communication"])

Search memories

results = mem.search("how does Sir like to communicate?", top_k=5)
for r in results:
    print(r['content'], r['score'])
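The scores returned here are cosine similarities, since the Qdrant collection is created with the Cosine distance metric. A minimal sketch of the underlying computation:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # identical direction → 1.0
```

In practice Qdrant computes this over the 4096-dim qwen3 embeddings of the query and each stored memory.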

Joint query (vector + graph)

results = mem.joint_query("investment strategy", entity="Sir", top_k=3)
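The listing does not document how `joint_query` merges the vector and graph layers. One plausible sketch, using hypothetical result shapes, is to re-rank vector hits by boosting those the graph links to the requested entity:

```python
def joint_query_sketch(vector_hits, entity, top_k=3, boost=0.2):
    # Hypothetical merge strategy (not the skill's documented behavior):
    # add a fixed bonus to hits whose graph entities include `entity`.
    # vector_hits: list of {"content": str, "score": float, "entities": [str]}
    def rank(hit):
        bonus = boost if entity in hit.get("entities", []) else 0.0
        return hit["score"] + bonus
    return sorted(vector_hits, key=rank, reverse=True)[:top_k]

hits = [
    {"content": "ETF rebalancing notes", "score": 0.80, "entities": ["Sir"]},
    {"content": "Generic market summary", "score": 0.85, "entities": []},
]
print(joint_query_sketch(hits, entity="Sir"))
```

With the boost, the entity-linked hit (0.80 + 0.2 = 1.0) outranks the higher raw-similarity hit (0.85).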

Setup Flow

When triggered, the setup script will:

  1. Check Docker is running
  2. Check Ollama is installed and pull qwen3-embedding:8b if needed
  3. Start Qdrant container (port 6333/6334)
  4. Start Neo4j container (port 7474/7687)
  5. Create Qdrant collection (semantic_memories, 4096 dims, Cosine)
  6. Create Neo4j constraints (Memory.key, Entity.id)
  7. Create HOT/WARM/COLD directory structure
  8. Copy Python toolkit to workspace
  9. Run end-to-end verification test
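Steps 5 and 6 above amount to one Qdrant collection definition and two Cypher constraints. A sketch of the payloads (the collection body follows the public Qdrant REST schema; the Cypher uses Neo4j 5 syntax, and the constraint names are assumptions):

```python
import json

# Qdrant: PUT /collections/semantic_memories with this body (public REST schema)
collection_body = {"vectors": {"size": 4096, "distance": "Cosine"}}

# Neo4j: uniqueness constraints from step 6 (constraint names are hypothetical)
cypher_constraints = [
    "CREATE CONSTRAINT memory_key IF NOT EXISTS "
    "FOR (m:Memory) REQUIRE m.key IS UNIQUE",
    "CREATE CONSTRAINT entity_id IF NOT EXISTS "
    "FOR (e:Entity) REQUIRE e.id IS UNIQUE",
]

print(json.dumps(collection_body))
```

The setup script presumably issues these against the containers started in steps 3 and 4 (Qdrant on 6333, Neo4j's Bolt port on 7687).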

Agent Integration

In your SOUL.md or AGENTS.md, add:

## Memory Retrieval
Before answering questions about prior work, decisions, or preferences:
1. Run: python3 ~/.openclaw/workspace/.lib/qdrant_memory.py search "<query>"
2. Combine with memory_search tool results
3. Use top results as context
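The `qdrant_memory.py` CLI itself is not shown in this listing; a minimal sketch of the command-line surface the snippet above assumes (the subcommand layout and `--top-k` flag are guesses, not the script's documented interface):

```python
import argparse

def build_parser():
    # Hypothetical CLI matching `qdrant_memory.py search "<query>"`.
    parser = argparse.ArgumentParser(prog="qdrant_memory.py")
    sub = parser.add_subparsers(dest="command", required=True)
    search = sub.add_parser("search", help="semantic search over stored memories")
    search.add_argument("query")
    search.add_argument("--top-k", type=int, default=5)  # assumed flag
    return parser

args = build_parser().parse_args(["search", "how does Sir like to communicate?"])
print(args.command, args.top_k)
```

A real implementation would embed the query via Ollama and hit the Qdrant search endpoint with the resulting vector.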

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.

Related Skills

Related by shared tags or category signals.

General

memory-m3e - Semantic Memory Plugin

Semantic memory plugin using m3e-large embeddings with SQLite storage, supporting storage, retrieval, and deletion via cosine similarity search in pure JS.

General

Sophie Mem0

Enterprise-grade intelligent memory system supporting cross-session semantic memory storage and retrieval, with multi-level long-term context management and self-reflective learning.

Coding

Awareness Cloud Memory

Persistent cloud memory across sessions. Automatically recalls past decisions, code, and tasks before each request, and saves summaries after each session. A...

Automation

Plugin

Install + set up TotalReclaw encrypted memory for OpenClaw, then use totalreclaw_remember / totalreclaw_recall. Trigger on 'install TotalReclaw', 'set up Tot...
