mem0

Mem0 Platform SDK for adding persistent memory to AI applications. TRIGGER when: user mentions "mem0", "MemoryClient", "memory layer", "remember user preferences", "persistent context", "personalization", or needs to add long-term memory to chatbots, agents, or AI apps. Covers Python SDK (mem0ai), TypeScript SDK (mem0ai), and framework integrations (LangChain, CrewAI, OpenAI Agents SDK, Pipecat, LlamaIndex, AutoGen, LangGraph). Also covers the open-source self-hosted Memory class. This is the DEFAULT mem0 skill for ambiguous queries. DO NOT TRIGGER when: user asks about CLI commands, terminal usage, or shell scripts (use mem0-cli), or Vercel AI SDK / @mem0/vercel-ai-provider / createMem0 (use mem0-vercel-ai-sdk).

Safety Notice

This listing is imported from skills.sh public index metadata. Review upstream SKILL.md and repository scripts before running.

Copy the command below and send it to your AI assistant to install this skill:

Install skill "mem0" with this command: npx skills add mem0ai/mem0/mem0ai-mem0-mem0

Mem0 Platform Integration

Skill Graph: This skill is part of the Mem0 skill graph; see Related Mem0 Skills below.

Mem0 is a managed memory layer for AI applications. It stores, retrieves, and manages user memories via API — no infrastructure to deploy. For self-hosted usage, see the OSS section in the client references below.

Step 1: Install and authenticate

Python:

pip install mem0ai
export MEM0_API_KEY="m0-your-api-key"

TypeScript/JavaScript:

npm install mem0ai
export MEM0_API_KEY="m0-your-api-key"

Get an API key at: https://app.mem0.ai/dashboard/api-keys?utm_source=oss&utm_medium=skill-mem0

Step 2: Initialize the client

Python:

from mem0 import MemoryClient
client = MemoryClient(api_key="m0-xxx")

TypeScript:

import MemoryClient from 'mem0ai';
const client = new MemoryClient({ apiKey: 'm0-xxx' });

For async Python, use AsyncMemoryClient.
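A minimal async sketch of the same initialization flow. This assumes mem0ai is installed and MEM0_API_KEY is set; the `remember` helper is illustrative (not part of the SDK) and only requires a client with an awaitable Mem0-style `add` method:

```python
import asyncio

async def remember(client, messages, user_id):
    """Store a conversation turn via an async Mem0-style client.

    `client` is expected to expose an awaitable add(messages, user_id=...)
    method, as AsyncMemoryClient does.
    """
    return await client.add(messages, user_id=user_id)

# With the real SDK this would be driven as:
#   from mem0 import AsyncMemoryClient
#   client = AsyncMemoryClient()
#   asyncio.run(remember(client, messages, "alice"))
```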

Step 3: Core operations

Every Mem0 integration follows the same pattern: retrieve → generate → store.

Add memories

messages = [
    {"role": "user", "content": "I'm a vegetarian and allergic to nuts."},
    {"role": "assistant", "content": "Got it! I'll remember that."}
]
client.add(messages, user_id="alice")

Search memories

results = client.search("dietary preferences", filters={"user_id": "alice"})
for mem in results.get("results", []):
    print(mem["memory"])

Get all memories

all_memories = client.get_all(filters={"user_id": "alice"})

Update a memory

client.update("memory-uuid", text="Updated: vegetarian, nut allergy, prefers organic")

Delete a memory

client.delete("memory-uuid")
client.delete_all(user_id="alice")  # delete all for a user
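The operations above compose naturally. As a hedged sketch, selective cleanup can combine get_all and delete; this helper is hypothetical (not an SDK method) and assumes get_all returns a dict with a "results" list as shown earlier:

```python
def delete_matching(client, user_id, predicate):
    """Delete only the memories whose text satisfies `predicate`."""
    memories = client.get_all(filters={"user_id": user_id}).get("results", [])
    deleted = []
    for mem in memories:
        if predicate(mem.get("memory", "")):
            client.delete(mem["id"])  # delete by memory ID, as in the examples above
            deleted.append(mem["id"])
    return deleted

# Example: delete_matching(client, "alice", lambda text: "nuts" in text)
```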

Common integration pattern

from mem0 import MemoryClient
from openai import OpenAI

mem0 = MemoryClient()
openai = OpenAI()

def chat(user_input: str, user_id: str) -> str:
    # 1. Retrieve relevant memories
    memories = mem0.search(user_input, filters={"user_id": user_id})
    context = "\n".join([m["memory"] for m in memories.get("results", [])])

    # 2. Generate response with memory context
    response = openai.chat.completions.create(
        model="gpt-5-mini",
        messages=[
            {"role": "system", "content": f"User context:\n{context}"},
            {"role": "user", "content": user_input},
        ]
    )
    reply = response.choices[0].message.content

    # 3. Store interaction for future context
    mem0.add(
        [{"role": "user", "content": user_input}, {"role": "assistant", "content": reply}],
        user_id=user_id
    )
    return reply

Common edge cases

  • Search returns empty: Memories process asynchronously. Wait 2-3s after add() before searching. Also verify user_id matches exactly (case-sensitive) and use filters={"user_id": "..."} syntax.
  • AND filter with user_id + agent_id returns empty: Entities are stored separately. Use OR instead, or query separately.
  • Duplicate memories: Don't mix infer=True (default) and infer=False for the same data. Stick to one mode.
  • Wrong import: Always use from mem0 import MemoryClient (or AsyncMemoryClient for async). Do not use from mem0 import Memory.
  • v3 defaults: top_k=20, threshold=0.1, rerank=False. Adjust as needed for your use case.
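Because memories are processed asynchronously, a short poll-and-retry wrapper around search avoids the empty-result race right after add(). This is a sketch, not part of the SDK; the client only needs a Mem0-style search(query, filters=...) method:

```python
import time

def search_with_retry(client, query, user_id, attempts=3, delay=2.0):
    """Retry search until indexing catches up with a recent add()."""
    for attempt in range(attempts):
        response = client.search(query, filters={"user_id": user_id})
        results = response.get("results", [])
        if results:
            return results
        if attempt < attempts - 1:
            time.sleep(delay)  # give asynchronous processing time to finish
    return []
```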

v2 Compatibility

If you're using SDK v2.x, note these differences:

  • Entity IDs: Pass user_id as top-level kwarg to search() instead of inside filters
  • Defaults: top_k=100, no threshold, rerank=True
  • Graph memory: Available via enable_graph=True

See the migration guide for details.
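A small compatibility shim can hide the v2/v3 calling difference for plain user searches. This is a sketch based only on the user_id difference noted above; other v2 behaviors (defaults, enable_graph) are not covered:

```python
def search_memories(client, query, user_id, sdk_v2=False):
    """Dispatch to the v2 or v3 search calling convention."""
    if sdk_v2:
        # v2.x: entity ID passed as a top-level keyword argument
        return client.search(query, user_id=user_id)
    # v3: entity ID passed inside the filters dict
    return client.search(query, filters={"user_id": user_id})
```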

Live documentation search

For the latest docs beyond what's in the references, use the doc search tool:

python ${CLAUDE_SKILL_DIR}/scripts/mem0_doc_search.py --query "topic"
python ${CLAUDE_SKILL_DIR}/scripts/mem0_doc_search.py --page "/platform/features/graph-memory"
python ${CLAUDE_SKILL_DIR}/scripts/mem0_doc_search.py --index

No API key needed — searches docs.mem0.ai directly.

Client SDK References

Language-specific deep references (Platform + OSS):

  • Python (MemoryClient + AsyncMemoryClient + Memory OSS): client/python.md
  • TypeScript/Node.js (MemoryClient + Memory OSS): client/node.md
  • Python vs TypeScript differences: client/differences.md

Platform References

Load these on demand for deeper detail:

  • Quickstart (Python, TS, cURL): references/quickstart.md
  • SDK guide (all methods, both languages): references/sdk-guide.md
  • API reference (endpoints, filters, object schema): references/api-reference.md
  • Architecture (pipeline, lifecycle, scoping, performance): references/architecture.md
  • Platform features (retrieval, graph, categories, MCP, etc.): references/features.md
  • Framework integrations (LangChain, CrewAI, OpenAI Agents, etc.): references/integration-patterns.md
  • Use cases & examples (real-world patterns with code): references/use-cases.md

Related Mem0 Skills

  • mem0-cli: terminal commands, scripting, CI/CD, agent tool loops (local / GitHub)
  • mem0-vercel-ai-sdk: Vercel AI SDK provider with automatic memory (local / GitHub)

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.

Related Skills

Related by shared tags or category signals.

  • Unified Memory V5 (Automation): Unified memory system for AI agents, supporting Context Tree, smart summarization, knowledge graphs, and a workflow engine. Zero dependencies; feature parity with QMD/MetaGPT.
  • Super Brain (General): AI self-enhancement system that lets an AI remember users across sessions and keep evolving. Use when you need long-term memory of user preferences, conversation-history tracking, learned service techniques, or proactive personalized service.
  • Synapse Layer (Security): Provides persistent, encrypted AI agent memory with a 4-layer security pipeline for storing, retrieving, sharing, and analyzing agent memories.
  • M2Wise (Coding): Memory-to-Wisdom Engine for AI agents. Use this skill to give yourself long-term memory, extract user preferences/facts from conversations, and track wisdom...