# databricks-helper

Query, inspect, and control your Databricks workspace from plain text. Check job status, rerun/cancel runs, inspect logs, explore Unity Catalog, and run read-only SQL without opening the UI.

Triggers

Use this skill when the user says things like:

  • "check my databricks jobs"
  • "what failed in databricks today"
  • "what failed this morning"
  • "show me recent databricks runs"
  • "run pipeline [name]"
  • "trigger job [name]"
  • "databricks job status"
  • "what's running in databricks"
  • "any failures in databricks"
  • "retry databricks run 123"
  • "cancel my stuck databricks job"
  • "show detailed logs for run 123"
  • "list only running jobs"
  • "list jobs tagged env=prod"
  • "did any runs breach SLA"
  • "databricks success summary"
  • "top failing jobs"
  • "list catalogs/schemas/tables"
  • "preview table main.bronze.events"
  • "run SQL select ..." (read-only)

Requirements

  • Databricks REST API access (no CLI required)
  • Environment variables set:
    • DATABRICKS_HOST — workspace URL, e.g. https://adb-1234567890.12.azuredatabricks.net
    • DATABRICKS_TOKEN — personal access token
    • DATABRICKS_SQL_WAREHOUSE_ID — required for catalog preview + SQL
  • Optional safety tuning:
    • DATABRICKS_SLA_MINUTES — SLA alert threshold in minutes (default 60)
    • DATABRICKS_MAX_ROWS — row cap for SQL output (default 200)
    • DATABRICKS_SQL_TIMEOUT_SEC — SQL wait timeout in seconds (default 60)
    • DATABRICKS_ALLOW_WRITE_SQL — set to true only if DDL/DML should be allowed
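A minimal shell setup might look like this (all values are placeholders; adapt them to your workspace):

```shell
# Required for databricks-helper (placeholder values)
export DATABRICKS_HOST="https://adb-1234567890.12.azuredatabricks.net"
export DATABRICKS_TOKEN="dapi..."              # personal access token
export DATABRICKS_SQL_WAREHOUSE_ID="abc123"    # needed for previews + SQL

# Optional safety tuning (defaults shown)
export DATABRICKS_SLA_MINUTES=60
export DATABRICKS_MAX_ROWS=200
export DATABRICKS_SQL_TIMEOUT_SEC=60
# export DATABRICKS_ALLOW_WRITE_SQL=true      # only if DDL/DML is intended
```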

Installation

npx clawhub@latest install databricks-helper

Usage Examples

Check recent jobs

"check my databricks jobs"

Lists the last 10 job runs with status, duration, and run URLs.

Find failures

"what failed in databricks today"

Filters runs from the last 24 hours and prints failed ones with error snippets.
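The filtering step reduces to a pure function over the Jobs API 2.1 `runs/list` payload, where each run carries `start_time` in epoch milliseconds and a `state.result_state`. The `failed_runs` helper below is an illustrative sketch, not the shipped script:

```python
from datetime import datetime, timedelta, timezone

def failed_runs(runs, hours=24, now_ms=None):
    """Return runs that failed and started within the last `hours`.

    `runs` follows the Jobs API 2.1 runs/list shape: each item has
    `start_time` (epoch ms) and `state.result_state`.
    """
    if now_ms is None:
        now_ms = int(datetime.now(timezone.utc).timestamp() * 1000)
    cutoff_ms = now_ms - int(timedelta(hours=hours).total_seconds() * 1000)
    return [
        r for r in runs
        if r.get("start_time", 0) >= cutoff_ms
        and r.get("state", {}).get("result_state") == "FAILED"
    ]
```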

Trigger or retry pipelines

"run pipeline customer_ingestion"
"retry databricks run 123"

Starts a new run or reruns failed tasks via the Jobs Repair API.
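The repair request body is small. A sketch of the payload builder, assuming the Jobs API 2.1 `runs/repair` endpoint's `rerun_all_failed_tasks` field (`repair_payload` is a hypothetical helper name, not part of the shipped script):

```python
def repair_payload(run_id, rerun_all_failed=True):
    """Build the request body for POST /api/2.1/jobs/runs/repair.

    With rerun_all_failed_tasks set, the repair reruns only the tasks
    that failed in the original run, keeping successful task results.
    """
    body = {"run_id": run_id}
    if rerun_all_failed:
        body["rerun_all_failed_tasks"] = True
    return body
```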

Cancel a run

"cancel databricks run 123"

Calls jobs/runs/cancel with safety checks and prints confirmation.

Live monitoring + analytics

"what's running now"
"databricks sla watch"
"databricks success summary"

Shows active runs with elapsed time, highlights SLA breaches, and prints 24h/7d success/failure counts plus top failing jobs (with adjustable time ranges).
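SLA detection reduces to comparing elapsed minutes against the threshold. A sketch assuming the same `runs/list` payload shape; the `sla_breaches` name is illustrative:

```python
from datetime import datetime, timezone

def sla_breaches(runs, sla_minutes=60, now_ms=None):
    """Return (run, elapsed_minutes) pairs for RUNNING runs past the SLA."""
    if now_ms is None:
        now_ms = int(datetime.now(timezone.utc).timestamp() * 1000)
    out = []
    for r in runs:
        if r.get("state", {}).get("life_cycle_state") != "RUNNING":
            continue  # only active runs can breach a running-time SLA
        elapsed_min = (now_ms - r.get("start_time", now_ms)) / 60_000
        if elapsed_min > sla_minutes:
            out.append((r, round(elapsed_min, 1)))
    return out
```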

Catalog + SQL exploration

"list catalogs"
"list tables in main bronze"
"preview table main.bronze.events"
"run sql select * from main.bronze.events"

Uses the Unity Catalog API for discovery and runs read-only SQL through the configured warehouse with enforced row limits.
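The read-only guard can be as simple as a leading-keyword check honoring DATABRICKS_ALLOW_WRITE_SQL. This is a sketch, not the shipped implementation, and deliberately not a full SQL parser (it only inspects the leading statement):

```python
import os
import re

# DDL/DML keywords that indicate a write statement.
WRITE_KEYWORDS = re.compile(
    r"^\s*(INSERT|UPDATE|DELETE|MERGE|CREATE|ALTER|DROP|TRUNCATE|REPLACE|GRANT|REVOKE)\b",
    re.IGNORECASE,
)

def check_read_only(query):
    """Return True if the query may run: reads always pass, writes only
    when DATABRICKS_ALLOW_WRITE_SQL=true is set in the environment."""
    if os.environ.get("DATABRICKS_ALLOW_WRITE_SQL", "").lower() == "true":
        return True
    return not WRITE_KEYWORDS.match(query)
```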

Implementation

python scripts/databricks_helper.py list-runs
python scripts/databricks_helper.py failures --hours 24
python scripts/databricks_helper.py run-job "job name"
python scripts/databricks_helper.py retry-run 123
python scripts/databricks_helper.py cancel-run 123
python scripts/databricks_helper.py run-details 123
python scripts/databricks_helper.py running-jobs --pattern nightly
python scripts/databricks_helper.py jobs --tag env=prod
python scripts/databricks_helper.py sla-watch --minutes 90
python scripts/databricks_helper.py summary
python scripts/databricks_helper.py top-failures --hours 48
python scripts/databricks_helper.py list-catalogs
python scripts/databricks_helper.py list-schemas --catalog main
python scripts/databricks_helper.py list-tables --catalog main --schema bronze
python scripts/databricks_helper.py preview-table main.bronze.events --limit 20
python scripts/databricks_helper.py run-sql --query "SELECT * FROM main.bronze.events" --limit 50

Output

Plain text. Each run line shows the job name, status (SUCCESS/FAILED/RUNNING), start/end times, duration, SLA status, and an error snippet if the run failed. Catalog and SQL commands return textual lists or tabular results.
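A run line of this shape can be produced by a small formatter. A sketch assuming the Jobs API 2.1 run payload (`run_name`, `state`, `run_duration` in milliseconds); `format_run` is a hypothetical helper name:

```python
def format_run(run):
    """Render one run as a plain-text summary line."""
    state = run.get("state", {})
    status = state.get("result_state") or state.get("life_cycle_state", "UNKNOWN")
    dur_ms = run.get("run_duration") or 0
    parts = [
        run.get("run_name", "<unnamed>"),
        status,
        f"{dur_ms / 60_000:.1f} min",
    ]
    # Append the error snippet only for failed runs that carry one.
    if status == "FAILED" and state.get("state_message"):
        parts.append(state["state_message"])
    return " | ".join(parts)
```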

Notes

  • Uses Databricks Jobs API v2.1, Unity Catalog API, and SQL Statement Execution API (read-only disposition by default).
  • Requires CAN_VIEW for read operations, CAN_MANAGE_RUN to trigger/cancel/repair runs, and SQL warehouse access.
  • SQL commands enforce read-only queries unless DATABRICKS_ALLOW_WRITE_SQL=true. Limits/timeouts are applied to avoid runaway scans.
  • SLA alerts default to 60 minutes but may be overridden via DATABRICKS_SLA_MINUTES or per-command flags.

CHANGELOG

  • 1.1.0 — Adds run retry/cancel/details, running-job lists, job filtering by tags, SLA watch, success/failure summaries, top failing jobs, Unity Catalog discovery, table previews, and safe read-only SQL execution with row limits plus new docs/tests.
