weave-integration

Add W&B Weave observability and LLM tracing to any project. Use when instrumenting LLM calls for token visibility, latency tracking, cost analysis, or debugging. Supports TypeScript/Node.js and Python projects. Weave provides automatic tracing for OpenAI, Anthropic, and 20+ LLM providers with minimal code changes.

Safety Notice

This listing is imported from skills.sh public index metadata. Review upstream SKILL.md and repository scripts before running.

Copy the command below and send it to your AI assistant to install this skill

Install skill "weave-integration" with this command: npx skills add altryne/weavify-skill/altryne-weavify-skill-weave-integration

Weave Integration

Add W&B Weave observability to any LLM project. Weave traces all LLM calls, captures tokens/latency/costs, and provides a UI for debugging and evaluation.

Quick Start

1. Install

# TypeScript/Node.js
npm install weave

# Python
pip install weave

2. Get API Key

Set WANDB_API_KEY environment variable. Get key from wandb.ai/settings.

export WANDB_API_KEY="your-key-here"

3. Initialize

// TypeScript
import * as weave from 'weave';
await weave.init('your-team/project-name');
# Python
import weave
weave.init('your-team/project-name')

4. Trace LLM Calls

Auto-patching (supported providers traced automatically):

// TypeScript - CommonJS builds are patched out of the box;
// ESM builds need the --import=weave/instrument flag (see TypeScript Setup Details)
import OpenAI from 'openai';
import * as weave from 'weave';

await weave.init('my-project');
const client = new OpenAI();
// All OpenAI calls made through this client are now traced automatically

Manual wrapping (for custom functions or unsupported libs):

// TypeScript
const myFunction = weave.op(async (input: string) => {
  const result = input.trim(); // your logic here
  return result;
});
# Python
@weave.op()
def my_function(input: str):
    result = input.strip()  # your logic here
    return result

TypeScript Setup Details

See references/typescript.md for:

  • ESM configuration (--import=weave/instrument)
  • Bundler compatibility (Next.js, Vite)
  • Manual patching fallback

Supported Providers (Auto-traced)

OpenAI, Anthropic, Cohere, Mistral, Google, Groq, Together AI, LiteLLM, Azure, Bedrock, Cerebras, HuggingFace, OpenRouter, NVIDIA NIM, and more.

Full list: https://docs.wandb.ai/weave/guides/integrations

Integration Workflow

When adding Weave to a project:

  1. Find LLM call sites — search for OpenAI/Anthropic client usage
  2. Add weave.init() — early in app startup, before any LLM calls
  3. Verify auto-patching — check traces appear in W&B UI
  4. Wrap custom functions — use weave.op() for additional visibility
  5. Add cost tracking — Weave tracks tokens automatically for supported providers
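Putting the steps above together, here is a minimal Python sketch. The project name, model, and function are placeholders, and it assumes the weave and openai packages are installed and WANDB_API_KEY is set.

```python
# Sketch only: project name, model, and summarize() are hypothetical.
import weave
from openai import OpenAI

# Step 2: initialize Weave early, before any LLM calls.
weave.init("my-team/my-project")

# Step 3: the client is auto-patched, so completions are traced.
client = OpenAI()

# Step 4: wrap custom logic with weave.op() for extra visibility.
@weave.op()
def summarize(text: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Summarize: {text}"}],
    )
    return response.choices[0].message.content

print(summarize("Weave traces LLM calls."))
```

After this runs, both the `summarize` call and the nested OpenAI completion should appear as a trace tree in the W&B UI (step 5's token counts come for free on supported providers).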

Viewing Traces

After running your app:

  • Open wandb.ai → Your project → Weave tab
  • See all traces with inputs, outputs, latency, token usage, costs
  • Filter, search, and export call data

Environment Variables

  • WANDB_API_KEY — authentication (required)
  • WEAVE_IMPLICITLY_PATCH_INTEGRATIONS — set to false to disable auto-patching
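As a sketch, both variables can be exported in the shell before launching the app:

```shell
# Required: authenticates the Weave client with W&B.
export WANDB_API_KEY="your-key-here"

# Optional: disable automatic patching of supported LLM libraries.
# Calls can still be traced manually via wrapOpenAI() or weave.op().
export WEAVE_IMPLICITLY_PATCH_INTEGRATIONS=false
```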

Common Patterns

Wrap Existing Client

import { wrapOpenAI } from 'weave';
import OpenAI from 'openai';

const client = wrapOpenAI(new OpenAI());

Trace Class Methods

import * as weave from 'weave';

class MyAgent {
  @weave.op
  async predict(prompt: string) {
    return "response";
  }
}

Add Display Names

const myOp = weave.op(myFunction, {
  callDisplayName: (input) => `Custom Name: ${input}`
});

Clawdbot-Specific Integration

For Clawdbot/similar Node.js agents:

  1. Locate the LLM client initialization (usually Anthropic/OpenAI SDK)
  2. Add weave.init() in the main entry point
  3. For ESM, add --import=weave/instrument to node invocation
  4. All provider calls will be traced to W&B
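For step 3, the Node invocation might look like this (the entry-point path is hypothetical):

```shell
# Load Weave's instrumentation hooks before the app's own imports,
# so ESM-loaded SDKs (OpenAI, Anthropic) are patched at startup.
node --import=weave/instrument ./dist/index.js
```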

Troubleshooting

  • No traces appearing: Check WANDB_API_KEY is set
  • ESM not patching: Use --import=weave/instrument flag
  • Bundler issues: Mark LLM libs as external in config
  • Manual fallback: Use wrapOpenAI() or explicit weave.op() wrappers
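A quick preflight check for the two most common misconfigurations above can be written with only the standard library (the function name is ours, not part of Weave):

```python
import os

def check_weave_env() -> list[str]:
    """Return a list of problems likely to prevent traces from appearing."""
    problems = []
    if not os.environ.get("WANDB_API_KEY"):
        problems.append("WANDB_API_KEY is not set; traces cannot be uploaded.")
    if os.environ.get("WEAVE_IMPLICITLY_PATCH_INTEGRATIONS", "").lower() == "false":
        problems.append(
            "Auto-patching is disabled; wrap clients manually with weave.op()."
        )
    return problems

for problem in check_weave_env():
    print("warning:", problem)
```

Run it before weave.init() to fail fast instead of silently producing no traces.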

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.

Related Skills

Related by shared tags or category signals.

  • Raspberry Pi Manager (Coding) — Manage Raspberry Pi devices: GPIO control, system monitoring (CPU/temp/memory), service management, sensor data reading, and remote deployment. Use when you...
  • LinkdAPI (Coding) — Complete LinkdAPI integration OpenClaw skill. Includes all 50+ endpoints, Python/Node.js/Go SDKs, authentication, rate limits, and real-world examples. Use t...
  • Tesla Commander (Coding) — Command and monitor Tesla vehicles via the Fleet API. Check status, control climate/charging/locks, track location, and analyze trip history. Use when you ne...