summarize

Compress without losing truth. Backlink to sources.

Safety Notice

This listing is imported from skills.sh public index metadata. Review upstream SKILL.md and repository scripts before running.

To install, copy this command and send it to your AI assistant:

Install skill "summarize" with this command: npx skills add simhacker/moollm/simhacker-moollm-summarize

Summarize

Compress without losing truth. Backlink to sources.

Context compression for memory management.

> [!IMPORTANT]
> Always backlink. Every summary points to its source. Never orphan knowledge.

The Goal

When files are too large for context:

  1. Summarize — Extract key information
  2. Backlink — Point to original source
  3. Prioritize — Hot/cold hints for what matters
  4. Preserve — Never delete, just compress
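The four steps above can be sketched in code. This is a minimal, hypothetical helper (not part of the skill itself) that renders one entry in the SUMMARIES.yml format shown below: it attaches the key points, a backlink to the source, and the hints about what still needs full context, and it never touches the original file.

```python
from datetime import date

def summarize_entry(source: str, key_points: list[str],
                    needs_full_context: list[str]) -> str:
    """Render one SUMMARIES.yml entry (hypothetical helper).

    Follows the protocol: every summary carries a backlink to its
    source, and the original file is preserved, never deleted.
    """
    lines = [
        "summary:",
        f'  source: "{source}"',
        f'  created: "{date.today().isoformat()}"',
        "  key_points:",
    ]
    lines += [f'    - "{p}"' for p in key_points]
    # Backlink is relative to the directory holding SUMMARIES.yml.
    lines.append(f'  backlink: "../{source}"')
    lines.append("  full_context_needed_for:")
    lines += [f'    - "{c}"' for c in needs_full_context]
    return "\n".join(lines) + "\n"

entry = summarize_entry(
    "designs/original-design.md",
    ["Files are state, no hidden memory"],
    ["Implementation details"],
)
print(entry)
```

The helper emits YAML by hand rather than via a YAML library, so it has no dependencies; a real implementation might use a proper serializer.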

Contents

| File | Purpose |
|---|---|
| SKILL.md | Full protocol documentation |
| SUMMARIES.yml.tmpl | Summary template |

Example

```yaml
summary:
  source: "designs/original-design.md"
  created: "2025-12-30"

  key_points:
    - "Files are state, no hidden memory"
    - "YAML comments carry meaning"

  backlink: "../designs/original-design.md"
  full_context_needed_for:
    - "Implementation details"
    - "Edge cases"
```
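"Never orphan knowledge" implies that every backlink must resolve to a real file. A minimal sketch of that check, assuming backlinks are relative to the directory containing SUMMARIES.yml (the layout and file names here are illustrative, not prescribed by the skill):

```python
import tempfile
from pathlib import Path

def backlink_resolves(summaries_file: Path, backlink: str) -> bool:
    """True if the backlink, resolved relative to the summaries
    file's directory, points at an existing source file."""
    return (summaries_file.parent / backlink).resolve().is_file()

# Demo layout: root/summaries/SUMMARIES.yml backlinks to root/designs/.
root = Path(tempfile.mkdtemp())
(root / "designs").mkdir()
(root / "designs" / "original-design.md").write_text("# design\n")
(root / "summaries").mkdir()
yml = root / "summaries" / "SUMMARIES.yml"
yml.write_text('backlink: "../designs/original-design.md"\n')

ok = backlink_resolves(yml, "../designs/original-design.md")
orphan = backlink_resolves(yml, "../designs/missing.md")
```

A summary whose backlink fails this check has orphaned its knowledge and should be repaired before the original is ever compressed away.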

The Intertwingularity

Summarize enables the LIFT stage — compress wisdom for sharing.

```mermaid
graph LR
    SUM[📝 summarize] -->|compresses| SL[📜 session-log]
    SUM -->|compresses| RN[📓 research-notebook]
    SUM -->|enables| HF[🌫️ honest-forget]
    SUM -->|part of| PLL[🎮📚🚀 play-learn-lift LIFT]

    SR[🔧 self-repair] -->|triggers| SUM
```

Dovetails With

Sister Skills

| Skill | Relationship |
|---|---|
| play-learn-lift/ | Summarize IS LIFT — share wisdom |
| honest-forget/ | Summarize before forgetting |
| session-log/ | Source material to compress |
| research-notebook/ | Findings to distill |
| self-repair/ | Triggers when context exceeds budget |
| memory-palace/ | Place summaries in palace rooms |

Protocol Symbols

| Symbol | Link |
|---|---|
| SUMMARIZE | PROTOCOLS.yml |
| HONEST-FORGET | PROTOCOLS.yml |
| HOT-COLD | PROTOCOLS.yml |

Kernel

Navigation

| Direction | Destination |
|---|---|
| ⬆️ Up | skills/ |
| ⬆️⬆️ Root | Project Root |
| 🌫️ Sister | honest-forget/ |
| 🎮 Sister | play-learn-lift/ |

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.

Related Skills

Related by shared tags or category signals.

General

context

No summary provided by upstream source.

Repository Source · Needs Review
Research

ChronoSync

Cross-session chat log sync tool. Automatically backs up the chat logs of each OpenClaw session, sharing memory between sessions so context is not lost when switching sessions.

Registry Source · Recently Updated
Research

Agent Memory Architecture

Complete zero-dependency memory system for AI agents — file-based architecture, daily notes, long-term curation, context management, heartbeat integration, a...

Registry Source · Recently Updated