SciPy

Solve optimization, statistics, signal processing, and linear algebra problems with SciPy recipes and ready-to-run code.

Safety Notice

This listing is from the official public ClawHub registry. Review SKILL.md and referenced scripts before running.

Install skill "SciPy" with this command: npx skills add ivangdavila/scipy

Setup

On first use, read setup.md for guidance on how to help the user effectively.

When to Use

User needs scientific computing in Python: optimization, curve fitting, statistical tests, signal processing, interpolation, integration, or linear algebra. Agent provides working code, not theory.

Architecture

This skill is stateless: no persistent storage is needed. All code runs in the user's Python environment.

See memory-template.md for optional preference tracking.

Quick Reference

| Topic | File |
|---|---|
| Usage guidance | setup.md |
| Optional preferences | memory-template.md |

Core Rules

1. Working Code First

Every response includes runnable code. No pseudocode, no "implement this yourself".

# Always include imports
from scipy import optimize
import numpy as np

# Complete, working example
result = optimize.minimize(lambda x: x**2, x0=1.0)
print(f"Minimum at x={result.x[0]:.4f}")

2. Module Selection Guide

| Problem | Module | Key Function |
|---|---|---|
| Find minimum/maximum | scipy.optimize | minimize, minimize_scalar |
| Curve fitting | scipy.optimize | curve_fit |
| Root finding | scipy.optimize | root, brentq, fsolve |
| Statistical tests | scipy.stats | ttest_ind, chi2_contingency |
| Distributions | scipy.stats | norm, poisson, expon |
| Filter signals | scipy.signal | butter, filtfilt, savgol_filter |
| FFT | scipy.fft | fft, ifft, fftfreq |
| Interpolation | scipy.interpolate | interp1d, UnivariateSpline |
| Integration | scipy.integrate | quad, solve_ivp |
| Linear algebra | scipy.linalg | solve, eig, svd |
| Sparse matrices | scipy.sparse | csr_matrix, linalg.spsolve |
| Spatial data | scipy.spatial | KDTree, distance |
| Image processing | scipy.ndimage | gaussian_filter, label |

3. Explain Key Parameters

When code uses non-obvious parameters, explain why:

# method='L-BFGS-B' for bounded optimization
# bounds prevent physically impossible values
result = optimize.minimize(
    objective, x0, 
    method='L-BFGS-B',
    bounds=[(0, None), (0, 100)]  # x1 >= 0, 0 <= x2 <= 100
)

4. Validate Results

Always include sanity checks:

result = optimize.minimize(func, x0)
if not result.success:
    print(f"⚠️ Optimization failed: {result.message}")
else:
    print(f"✓ Converged in {result.nit} iterations")

5. NumPy Integration

SciPy builds on NumPy. Use vectorized operations:

# ✓ Vectorized (fast)
x = np.linspace(0, 10, 1000)
y = np.sin(x)

# ✗ Loop (slow)
y = [np.sin(xi) for xi in x]

Optimization Patterns

Minimize a Function

from scipy.optimize import minimize
import numpy as np

# Rosenbrock function (classic test)
def rosenbrock(x):
    return sum(100*(x[1:]-x[:-1]**2)**2 + (1-x[:-1])**2)

x0 = np.array([0, 0])
result = minimize(rosenbrock, x0, method='BFGS')

print(f"Minimum at: {result.x}")
print(f"Function value: {result.fun}")
print(f"Converged: {result.success}")

Constrained Optimization

from scipy.optimize import minimize

# Minimize f(x,y) = x² + y² subject to x + y = 1
def objective(x):
    return x[0]**2 + x[1]**2

def constraint(x):
    return x[0] + x[1] - 1  # Must equal 0

result = minimize(
    objective,
    x0=[0.5, 0.5],
    constraints={'type': 'eq', 'fun': constraint}
)
print(f"Solution: {result.x}")  # [0.5, 0.5] by symmetry
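Inequality constraints and bounds combine the same way. A minimal sketch (a hypothetical problem chosen for illustration): minimize x² + y² subject to x + y ≥ 1 with both variables bounded to [0, 2].

```python
from scipy.optimize import minimize

def objective(x):
    return x[0]**2 + x[1]**2

# 'ineq' constraints must return a value >= 0 at feasible points
cons = {'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 1}
bnds = [(0, 2), (0, 2)]

result = minimize(objective, x0=[1.0, 1.0],
                  method='SLSQP', bounds=bnds, constraints=cons)
print(result.x)  # approximately [0.5, 0.5]
```

SLSQP is named explicitly here because it supports both bounds and general constraints.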

Curve Fitting

from scipy.optimize import curve_fit
import numpy as np

# Fit exponential decay
def model(t, a, tau):
    return a * np.exp(-t / tau)

t_data = np.array([0, 1, 2, 3, 4, 5])
y_data = np.array([10, 6.1, 3.7, 2.2, 1.4, 0.8])

params, covariance = curve_fit(model, t_data, y_data)
a_fit, tau_fit = params
errors = np.sqrt(np.diag(covariance))

print(f"a = {a_fit:.2f} ± {errors[0]:.2f}")
print(f"τ = {tau_fit:.2f} ± {errors[1]:.2f}")
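As the Common Traps section notes, curve_fit can converge to a wrong solution from a bad start. A sketch of the same exponential fit with an explicit initial guess p0 (the seed values here are rough eyeball estimates, not required values):

```python
from scipy.optimize import curve_fit
import numpy as np

def model(t, a, tau):
    return a * np.exp(-t / tau)

t_data = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y_data = np.array([10, 6.1, 3.7, 2.2, 1.4, 0.8])

# Seed the solver near plausible values: amplitude ~ first data point,
# decay time ~ a fraction of the data span
p0 = [y_data[0], 2.0]
params, cov = curve_fit(model, t_data, y_data, p0=p0)
print(params)  # roughly [10, 2]
```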

Statistics Patterns

Hypothesis Testing

from scipy import stats

# Compare two groups (independent t-test)
group_a = [23, 25, 28, 24, 26]
group_b = [30, 32, 29, 31, 33]

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")

if p_value < 0.05:
    print("✓ Significant difference (p < 0.05)")
else:
    print("✗ No significant difference")
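With samples this small, the t-test's normality assumption is hard to verify. A nonparametric alternative on the same data, as a sketch:

```python
from scipy import stats

group_a = [23, 25, 28, 24, 26]
group_b = [30, 32, 29, 31, 33]

# Mann-Whitney U makes no normality assumption; appropriate for
# small samples or ordinal data
u_stat, p_value = stats.mannwhitneyu(group_a, group_b,
                                     alternative='two-sided')
print(f"U = {u_stat}, p = {p_value:.4f}")
```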

Distribution Fitting

from scipy import stats
import numpy as np

data = np.random.exponential(scale=2.0, size=1000)

# Fit exponential distribution (pin loc at 0 so scale equals the mean)
loc, scale = stats.expon.fit(data, floc=0)
print(f"Fitted scale (λ⁻¹): {scale:.3f}")

# Test goodness of fit
ks_stat, ks_p = stats.kstest(data, 'expon', args=(loc, scale))
print(f"KS test: p = {ks_p:.4f}")

Confidence Intervals

from scipy import stats
import numpy as np

data = [2.3, 2.5, 2.1, 2.8, 2.4, 2.6, 2.2]
confidence = 0.95

mean = np.mean(data)
sem = stats.sem(data)
ci = stats.t.interval(confidence, len(data)-1, loc=mean, scale=sem)

print(f"Mean: {mean:.2f}")
print(f"95% CI: [{ci[0]:.2f}, {ci[1]:.2f}]")

Signal Processing Patterns

Low-Pass Filter

from scipy import signal
import numpy as np

# Create noisy signal
fs = 1000  # Sample rate
t = np.linspace(0, 1, fs)
clean = np.sin(2 * np.pi * 10 * t)  # 10 Hz
noisy = clean + 0.5 * np.random.randn(len(t))

# Design and apply Butterworth filter
cutoff = 20  # Hz
order = 4
b, a = signal.butter(order, cutoff / (fs/2), btype='low')
filtered = signal.filtfilt(b, a, noisy)  # Zero-phase filtering
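For higher filter orders, the (b, a) transfer-function form can become numerically ill-conditioned. A sketch of the same pattern in second-order-sections form (the seeded noise generator is just for reproducibility):

```python
from scipy import signal
import numpy as np

rng = np.random.default_rng(0)
fs = 1000
t = np.linspace(0, 1, fs)
noisy = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))

# output='sos' (second-order sections) avoids the numerical issues
# that high-order (b, a) polynomial coefficients can have
sos = signal.butter(8, 20, btype='low', fs=fs, output='sos')
filtered = signal.sosfiltfilt(sos, noisy)  # still zero-phase
```

Note the fs keyword lets you pass the cutoff in Hz directly instead of normalizing by the Nyquist frequency.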

FFT Analysis

from scipy.fft import fft, fftfreq
import numpy as np

# Sample signal
fs = 1000
t = np.linspace(0, 1, fs)
signal_data = np.sin(2*np.pi*50*t) + 0.5*np.sin(2*np.pi*120*t)

# Compute FFT
yf = fft(signal_data)
xf = fftfreq(len(t), 1/fs)

# Get magnitude spectrum (positive frequencies only)
n = len(t) // 2
freqs = xf[:n]
magnitudes = 2 / len(t) * np.abs(yf[:n])  # 2/N scaling recovers amplitudes

# Find dominant frequency
peak_idx = np.argmax(magnitudes)
print(f"Dominant frequency: {freqs[peak_idx]:.1f} Hz")

Interpolation Patterns

1D Interpolation

from scipy.interpolate import interp1d, UnivariateSpline
import numpy as np

x = np.array([0, 1, 2, 3, 4, 5])
y = np.array([0, 0.8, 0.9, 0.1, -0.8, -1])

# Linear interpolation
f_linear = interp1d(x, y, kind='linear')

# Cubic interpolation (smoother)
f_cubic = interp1d(x, y, kind='cubic')

# Smoothing spline (handles noise)
spline = UnivariateSpline(x, y, s=0.5)

x_new = np.linspace(0, 5, 100)
y_cubic = f_cubic(x_new)
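The Common Traps section mentions querying interp1d outside the data range. A sketch of the two ways to handle it, using the same data:

```python
from scipy.interpolate import interp1d
import numpy as np

x = np.array([0, 1, 2, 3, 4, 5])
y = np.array([0, 0.8, 0.9, 0.1, -0.8, -1])

# Default behavior: querying outside [0, 5] raises ValueError
f_strict = interp1d(x, y)

# Option 1: extrapolate past the data using the edge segments
f_extrap = interp1d(x, y, fill_value='extrapolate')

# Option 2: return NaN outside the range instead of raising
f_nan = interp1d(x, y, bounds_error=False, fill_value=np.nan)
```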

Integration Patterns

Numerical Integration

from scipy.integrate import quad
import numpy as np

# Integrate sin(x) from 0 to π
result, error = quad(np.sin, 0, np.pi)
print(f"∫sin(x)dx from 0 to π = {result:.6f} ± {error:.2e}")
# Expected: 2.0
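Integrands often have parameters; quad passes extras through args, and np.inf is a valid limit. A sketch (the integrand here is an arbitrary illustrative choice):

```python
from scipy.integrate import quad
import numpy as np

# Parameters reach the integrand through args=(a, b)
def integrand(x, a, b):
    return a * np.exp(-b * x)

# ∫ 2·exp(-3x) dx from 0 to ∞ = 2/3
result, error = quad(integrand, 0, np.inf, args=(2, 3))
print(f"{result:.6f}")
```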

Solve ODE

from scipy.integrate import solve_ivp
import numpy as np

# dy/dt = -2y, y(0) = 1 (exponential decay)
def dydt(t, y):
    return -2 * y

sol = solve_ivp(dydt, [0, 5], [1], t_eval=np.linspace(0, 5, 100))

# sol.t contains time points
# sol.y[0] contains y values
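solve_ivp can also stop at a condition rather than a fixed end time, via events. A sketch extending the decay example, stopping when y falls to 0.1 (analytically t = ln(10)/2 ≈ 1.151):

```python
from scipy.integrate import solve_ivp
import numpy as np

def dydt(t, y):
    return -2 * y

# Event function: zero when y reaches 0.1; terminal=True stops there
def hits_threshold(t, y):
    return y[0] - 0.1
hits_threshold.terminal = True

sol = solve_ivp(dydt, [0, 5], [1], events=hits_threshold)
t_cross = sol.t_events[0][0]
print(f"y reaches 0.1 at t = {t_cross:.4f}")
```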

Linear Algebra Patterns

Solve Linear System

from scipy import linalg
import numpy as np

# Solve Ax = b
A = np.array([[3, 1], [1, 2]])
b = np.array([9, 8])

x = linalg.solve(A, b)
print(f"Solution: x = {x}")

# Verify
print(f"Check A @ x = {A @ x}")
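For large structured systems, the sparse route from the module table avoids building a dense matrix at all. A sketch with a 1-D Laplacian (a standard tridiagonal test matrix, chosen here for illustration):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

# Tridiagonal 1000x1000 system; a dense array would waste memory
n = 1000
A = diags([-1, 2, -1], offsets=[-1, 0, 1], shape=(n, n), format='csr')
b = np.ones(n)

x = spsolve(A, b)
```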

Eigendecomposition

from scipy import linalg
import numpy as np

A = np.array([[1, 2], [2, 1]])
eigenvalues, eigenvectors = linalg.eig(A)

print(f"Eigenvalues: {eigenvalues}")
print(f"Eigenvectors:\n{eigenvectors}")
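Since the matrix above is symmetric, eigh is the better fit; a sketch:

```python
from scipy import linalg
import numpy as np

A = np.array([[1, 2], [2, 1]])

# For symmetric (or Hermitian) matrices, eigh is faster than eig and
# returns real eigenvalues sorted in ascending order
eigenvalues, eigenvectors = linalg.eigh(A)
print(eigenvalues)  # [-1.  3.]
```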

Common Traps

  • Wrong bounds format in minimize → bounds must be list of (min, max) tuples, one per variable
  • Forgetting to check result.success → optimization can fail silently, always check
  • Using interp1d outside data range → raises error by default, use fill_value='extrapolate' or bounds_error=False
  • filtfilt vs lfilter → use filtfilt for zero-phase filtering, lfilter introduces phase shift
  • curve_fit with bad initial guess → can converge to wrong solution, always provide reasonable p0
  • Accidental floor division → x // 2 truncates toward negative infinity; use x / 2 for float division in formulas
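Root finding appears in the module table but has no pattern above; a minimal bracketed sketch with brentq:

```python
from scipy.optimize import brentq
import numpy as np

# Solve cos(x) = x; the root is bracketed: f(0) = 1 > 0, f(1) < 0
f = lambda x: np.cos(x) - x
root = brentq(f, 0, 1)
print(f"{root:.6f}")  # ~0.739085
```

brentq requires a sign change over the bracket, which is why it is more robust than an unbracketed fsolve when a bracket is known.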

Security & Privacy

Data that stays local:

  • All computations run in user's Python environment
  • No data leaves the machine

This skill does NOT:

  • Send data externally
  • Create persistent files
  • Access network resources

Related Skills

Install with clawhub install <slug> if the user confirms:

  • math — mathematical concepts
  • data-analysis — data exploration
  • data — data handling patterns

Feedback

  • If useful: clawhub star scipy
  • Stay updated: clawhub sync

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.
