utility-pro

Master of the Modern Utility Toolbelt, specialized in AI-enhanced CLI, structured data transformation, and advanced Unix forensics.

Safety Notice

This listing is imported from skills.sh public index metadata. Review upstream SKILL.md and repository scripts before running.


Install skill "utility-pro" with this command: npx skills add yuniorglez/gemini-elite-core/yuniorglez-gemini-elite-core-utility-pro

Skill: Utility Pro (Standard 2026)

Role: The Utility Pro is the "Swiss Army Knife" of the Squaads AI Core. This role masters the command-line environment, turning raw text and unstructured data into actionable insights and clean code. In 2026, the Utility Pro moves beyond simple grep and sed to embrace structured shells (Nushell), AI-augmented terminals, and Rust-powered performance utilities.

🎯 Primary Objectives

  1. Structured Data Mastery: Treat the terminal as a database using Nushell and jq.
  2. High-Performance Search: Use ripgrep (rg) and fzf for near-instant codebase navigation.
  3. Advanced Transformation: Master RegEx, awk, and sed for complex multi-file refactoring.
  4. Modern Web I/O: Use httpie and xh for high-fidelity API interaction and debugging.

🏗️ The 2026 Utility Stack

1. The Core Moderns (Rust-Powered)

  • ripgrep (rg): The gold standard for text search.
  • bat: Syntax-highlighted cat replacement.
  • eza: Metadata-rich ls replacement with tree views.
  • zoxide: Smarter directory jumping ranked by visit frequency and recency (z).
  • fd: Simple, fast alternative to find.
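These tools are drop-in upgrades, so scripts can prefer them and fall back gracefully when they are missing. A minimal sketch (the `search` helper name is ours, not part of any tool):

```shell
# Prefer rg when installed, fall back to POSIX grep otherwise,
# so the same script runs on minimal CI images and dev laptops alike.
search() {
  pattern=$1; dir=$2
  if command -v rg >/dev/null 2>&1; then
    rg --line-number "$pattern" "$dir"
  else
    grep -rn "$pattern" "$dir"
  fi
}
```

Usage: `search "auth-error" src/` behaves the same either way, just slower without rg.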

2. Data Transformation & Shells

  • Nushell: A modern shell that understands JSON, CSV, and YAML as tables.
  • jq / yq: The industry standard for JSON and YAML query and manipulation.
  • httpie / xh: User-friendly, colorized HTTP clients.

🛠️ Implementation Patterns

1. The "Code Forensic" Search

When diagnosing a bug across a massive monorepo, use ripgrep with advanced filtering.

# Search for 'auth-error' but only in TSX files, excluding tests
rg "auth-error" -g "*.tsx" -g "!*.test.*" --stats

# Find all 'TODO' comments and export them to a JSON table (Nushell)
rg "TODO" --json | from json --objects | where type == "match" | select data.path.text data.lines.text

2. Complex Multi-File Refactoring

Using sed and fd to rename an exported symbol across the entire project.

# Rename 'OldComponent' to 'NewComponent' in all .tsx files
# (GNU sed shown; BSD/macOS sed requires a backup suffix: sed -i '' 's/OldComponent/NewComponent/g')
fd -e tsx -x sed -i 's/OldComponent/NewComponent/g' {}
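Before any in-place rewrite, preview the blast radius. A sketch using POSIX find so it also runs where fd is absent (the `rename_symbol` helper and the symbol names are illustrative):

```shell
# Preview, then rewrite, a symbol across all .tsx files under a directory.
rename_symbol() {
  old=$1; new=$2; dir=$3
  # 1) Dry run: list the files that would change.
  find "$dir" -name '*.tsx' -exec grep -l "$old" {} +
  # 2) Rewrite in place, keeping .bak backups (GNU and BSD sed both accept -i.bak).
  find "$dir" -name '*.tsx' -exec sed -i.bak "s/$old/$new/g" {} +
}
```

Usage: `rename_symbol OldComponent NewComponent src/`; delete the `.bak` files once the diff is verified.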

3. API Debugging with xh

# POST a JSON payload with headers and follow redirects
xh POST api.squaads.com/v1/sync \
  Authorization:"Bearer $TOKEN" \
  name="Project X" \
  active:=true

🔍 Advanced RegEx & Data Logic (2026)

RegEx Best Practices

  • Prefer Non-Capturing Groups (?:...): Improves performance in large-scale scans.
  • Atomic Grouping: Prevent catastrophic backtracking in complex patterns.
  • Named Captures: Make your RegEx readable for other agents. (?P<year>\d{4})-(?P<month>\d{2})-(?P<day>\d{2})

The jq Power User

# Extract IDs from a nested JSON array where status is 'active'
jq '.projects[] | select(.status == "active") | .id' data.json

🚫 The "Do Not List" (Anti-Patterns)

  1. NEVER use grep when rg is available; on large repositories rg is typically orders of magnitude faster because it parallelizes and skips ignored files by default.
  2. NEVER parse ls output with grep to find files. Use fd, which matches file names directly and respects .gitignore.
  3. NEVER write a complex awk script if a 3-line Nushell command can do it with structured data.
  4. NEVER use rm -rf in a script without a dry-run or verification step (Safety First).
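Anti-pattern 4 in practice: a sketch of a deletion wrapper that defaults to a dry run (the `safe_rm` name and its `--force` flag are ours, not a standard tool):

```shell
# Refuse to delete unless explicitly forced; the default invocation only reports.
safe_rm() {
  target=$1; mode=${2:-dry-run}
  if [ "$mode" = "--force" ]; then
    rm -rf -- "$target"
  else
    echo "dry-run: would remove $target (pass --force to delete)"
  fi
}
```

Scripts call `safe_rm "$dir"` first, inspect the output, and only then rerun with `--force`.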

🛠️ Troubleshooting Guide

| Issue | Likely Cause | 2026 Corrective Action |
| --- | --- | --- |
| Search is too slow | Searching node_modules or .git | Use rg, which respects .gitignore by default |
| JSON parse error | Trailing commas or invalid JSON | Use jq -c to minify, or yq for more lenient parsing |
| RegEx not matching | Escaping differences (PCRE vs JS) | Use rg -P for Perl-Compatible Regular Expressions |
| Terminal output garbled | Binary file cat or encoding mismatch | Use bat -A to show non-printable characters |

📚 Reference Library


📜 Standard Operating Procedure (SOP)

  1. Identify Data Source: Is it a file, a stream, or an API?
  2. Select Filter: Use rg for text, jq for JSON, xh for HTTP.
  3. Pipe & Transform: Build a pipeline (e.g., xh | jq | rg).
  4. Verify: Check the output against a small sample.
  5. Automate: Save the pipeline as a Bun script or a Nushell function.
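The five steps above, as one pipeline. The API response is simulated with printf so the sketch is self-contained; in real use the first stage would be xh and the filter jq:

```shell
# Identify source (simulated), select, transform, and verify on a small sample.
printf 'id=1 status=active\nid=2 status=paused\nid=3 status=active\n' |
  grep 'status=active' |       # Select Filter
  sed 's/ status=active//' |   # Pipe & Transform
  head -n 2                    # Verify: inspect only a small sample first
# prints: id=1 then id=3
```

Once the output matches expectations, the pipeline is worth saving as a reusable function (step 5).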

🔄 Evolution from v0.x to v1.1.0

  • v1.0.0: Legacy planning-with-files clone (Inaccurate).
  • v1.1.0: Complete ground-up rebuild focusing on 2026 High-Performance Utilities and Structured Data.

End of Utility Pro Standard (v1.1.0)

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.
