# Open Notebook Integration

Safety Notice

This listing is from the official public ClawHub registry. Review SKILL.md and referenced scripts before running.

Install with:

    npx skills add nantes/open-notebook-integration

A skill for integrating OpenClaw agents with open-notebook, a local AI research assistant and self-hosted NotebookLM alternative.

What It Does

  • Connects your agent to open-notebook running locally
  • Creates thematic notebooks for research, agent discovery, and personal knowledge
  • Enables saving and querying knowledge across sessions (second brain for agents)
  • Supports local Ollama models (free, no API costs)

Prerequisites

  1. Install Docker Desktop (required for open-notebook)

  2. Install Ollama with a model (e.g., qwen3-4b-thinking-32k)

  3. Run open-notebook:

    docker compose -f docker-compose-host-ollama.yml up -d
    

    Or use the default compose:

    docker compose up -d
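
Once the containers are up, you can sanity-check the stack before configuring the skill. A minimal sketch, assuming the default ports (8502 for the web UI, 11434 for Ollama):

```powershell
# List running containers (you should see the open-notebook app and SurrealDB)
docker ps

# Confirm Ollama is serving at least one model
Invoke-RestMethod http://localhost:11434/api/tags

# Open the web UI in the default browser
Start-Process "http://localhost:8502"
```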
    

Setup

The skill expects open-notebook's API at http://localhost:5055, with the web UI at http://localhost:8502 (the defaults used by the compose files above).

Included Functions

This skill provides these PowerShell functions directly:

Add-ToNotebook

function Add-ToNotebook {
    # Save a piece of text as a source in the given notebook
    param(
        [string]$Content,
        [string]$NotebookId = "YOUR_NOTEBOOK_ID"
    )
    $body = @{
        content = $Content
        notebook_id = $NotebookId
        type = "text"
    } | ConvertTo-Json
    Invoke-RestMethod -Uri "http://localhost:5055/api/sources/json" -Method Post -ContentType "application/json" -Body $body
}

Search-Notebook

function Search-Notebook {
    # Ask a question against one or more notebooks; replace YOUR_MODEL_ID
    # with the ID of a model configured in open-notebook
    param(
        [string]$Query,
        [string]$NotebookId = "YOUR_NOTEBOOK_ID"
    )
    $body = @{
        question = $Query
        notebook_ids = @($NotebookId)
        strategy_model = "model:YOUR_MODEL_ID"
        answer_model = "model:YOUR_MODEL_ID"
        final_answer_model = "model:YOUR_MODEL_ID"
    } | ConvertTo-Json
    Invoke-RestMethod -Uri "http://localhost:5055/api/search/ask" -Method Post -ContentType "application/json" -Body $body
}

New-Notebook

function New-Notebook {
    # Create a notebook; the API response includes its ID
    param(
        [string]$Name,
        [string]$Description = ""
    )
    $body = @{
        name = $Name
        description = $Description
    } | ConvertTo-Json
    Invoke-RestMethod -Uri "http://localhost:5055/api/notebooks" -Method Post -ContentType "application/json" -Body $body
}
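
To use these functions, load them into your PowerShell session first. A sketch assuming they are saved to a file named open-notebook.ps1 (the filename is illustrative):

```powershell
# Dot-source the function definitions into the current session
. .\open-notebook.ps1

# The functions are now available
Get-Command Add-ToNotebook, Search-Notebook, New-Notebook
```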

Notebook IDs

After creating notebooks, update these variables in your scripts:

$SIMULATION = "notebook:YOUR_SIMULATION_ID"
$CONSCIOUSNESS = "notebook:YOUR_CONSCIOUSNESS_ID"
$ENJAMBRE = "notebook:YOUR_ENJAMBRE_ID"
$OSIRIS = "notebook:YOUR_OSIRIS_ID"
$RESEARCH = "notebook:YOUR_RESEARCH_ID"

Example Usage

# Create a new notebook
New-Notebook -Name "My Research" -Description "Research notes"

# Save content
Add-ToNotebook -Content "This is my insight" -NotebookId "notebook:xxx"

# Query knowledge
$result = Search-Notebook -Query "What did I learn about X?" -NotebookId "notebook:xxx"
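
Invoke-RestMethod converts the API's JSON response into a PowerShell object, so you can inspect a search result's full structure before relying on specific fields (field names vary by open-notebook version):

```powershell
# Dump the whole response to see where the answer text lives
$result | ConvertTo-Json -Depth 5
```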

Configuration Required

Before using, you MUST:

  1. Run open-notebook with Docker
  2. Create notebooks via the UI (http://localhost:8502) or API
  3. Get your notebook IDs from the API response
  4. Update the $NotebookId parameters in the functions
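
For step 3, notebook IDs can be read back from the API rather than copied from the UI. This sketch assumes the API also accepts a GET on the /api/notebooks endpoint used by New-Notebook, and that the response carries id and name fields (unverified; check your open-notebook version):

```powershell
# List notebooks with their IDs (field names id/name are assumptions)
Invoke-RestMethod -Uri "http://localhost:5055/api/notebooks" -Method Get |
    Select-Object id, name
```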

Requirements

  • Docker Desktop running
  • Ollama with at least one model installed
  • open-notebook containers running (SurrealDB + app)

Troubleshooting

  • If API calls fail, check that the containers are running: docker ps
  • Inspect the open-notebook logs: docker compose logs
  • Verify Ollama is reachable: curl http://localhost:11434/api/tags

Version

1.0.1 - Improved documentation, included function examples
