# Opportunity Scout
Hunt for real demand signals — not news, not trends, but people expressing pain, frustration, and unmet needs that represent building opportunities.
## Skill Directory
All paths below are relative to this skill's directory.
- `scripts/configure.py` — manage niches, keywords, sources, schedule
- `scripts/scan_sources.py` — generate search queries and process results
- `scripts/score_signals.py` — score and rank findings
- `scripts/digest.py` — generate prioritized markdown digest
- `scripts/history.py` — track signals over time, detect trends
- `references/signal-types.md` — what counts as a demand signal (read when scoring)
- `references/source-guide.md` — how to configure sources effectively
- `assets/config.example.json` — example niche configurations
## Data Files
All state lives in the skill directory:
- `config.json` — active configuration (created by `configure.py`)
- `history.json` — signal history log (created by `history.py`)
- `findings/` — raw and scored finding files per scan
## Workflow

### First-Time Setup
- Run `configure.py --init` to create `config.json` from the example, or build the config step by step (a runnable sketch follows this list):
  - `configure.py --add-niche "AI tools for small business" --keywords "wish,need,looking for,alternative to,frustrated"`
  - `configure.py --add-source reddit:r/SaaS,reddit:r/smallbusiness,hackernews`
  - `configure.py --set-schedule daily`
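A minimal runnable sketch of the same setup, assuming the scripts are invoked with `python3` from the skill directory (both are assumptions; adapt to how the scripts are actually run):

```bash
# First-time setup sketch (python3 and the working directory are assumptions).
python3 scripts/configure.py --init   # creates config.json from assets/config.example.json
python3 scripts/configure.py --add-niche "AI tools for small business" \
  --keywords "wish,need,looking for,alternative to,frustrated"
python3 scripts/configure.py --add-source reddit:r/SaaS,reddit:r/smallbusiness,hackernews
python3 scripts/configure.py --set-schedule daily
```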
### Running a Scan
Execute these steps in order (a combined end-to-end sketch follows the list):
1. Generate queries: Run `scan_sources.py --generate-queries` to get optimized search queries. It prints JSON with query strings.
2. Execute searches: For each query, call the `web_search` tool. Collect all results into a JSON array and save to a temp file.
3. Ingest results: Run `scan_sources.py --ingest <results.json>` to parse raw search results into standardized findings. Outputs findings JSON.
4. Score findings: Run `score_signals.py <findings.json>` to score each finding on signal strength, engagement, freshness, competition, and recurrence. Outputs scored JSON.
5. Update history: Run `history.py --update <scored.json>` to log findings and detect trend patterns (persistent, emerging, fading).
6. Generate digest: Run `digest.py <scored.json>` to produce the markdown report. Use `--output <path>` to save to a specific location (e.g., Obsidian vault). Use `--max-results 20` to limit output.
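For reference, a minimal end-to-end sketch of steps 1 through 6. It assumes the scripts are invoked with `python3` from the skill directory and write their JSON to stdout; the intermediate file names (`queries.json`, `results.json`, `findings.json`, `scored.json`) and the digest path are illustrative, not fixed by the skill:

```bash
# Full scan sketch; interpreter, stdout output, and file names are assumptions.
python3 scripts/scan_sources.py --generate-queries > queries.json

# Step 2 runs in the agent: call web_search for each query in queries.json
# and save the combined results as a JSON array in results.json.

python3 scripts/scan_sources.py --ingest results.json > findings.json    # standardized findings
python3 scripts/score_signals.py findings.json > scored.json             # scored and ranked
python3 scripts/history.py --update scored.json                          # log findings, detect trends
python3 scripts/digest.py scored.json --output findings/digest.md --max-results 20
```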
### Quick Scan (Single Command Summary)
For a rapid scan of a single niche without full config:
- Run `scan_sources.py --quick "developer tools for AI agents"` to get queries
- Execute `web_search` for each query
- Pipe the results through score and digest (see the sketch below)
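A compact sketch under the same assumptions as the full scan above (`python3` invocation, stdout JSON, illustrative file names); it skips the history update:

```bash
# Quick scan sketch: no config.json needed, single ad-hoc niche.
python3 scripts/scan_sources.py --quick "developer tools for AI agents" > queries.json
# ... run web_search for each query; save combined results to results.json ...
python3 scripts/scan_sources.py --ingest results.json > findings.json
python3 scripts/score_signals.py findings.json > scored.json
python3 scripts/digest.py scored.json
```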
## Reading References
- Before scoring or evaluating signals manually, read `references/signal-types.md` for the taxonomy of demand signals and how to distinguish real demand from noise.
- When helping users configure sources, read `references/source-guide.md`.
## Cron Integration
Set the schedule in `config.json` via `configure.py --set-schedule daily|weekly`.

When triggered by cron, run the full scan workflow above and save the digest to
the user's preferred output location (default: the skill's `findings/` directory).
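Purely as an illustration of the cadence, a hypothetical crontab entry; `run_scan.sh` is a placeholder for whatever mechanism actually triggers the agent to run the workflow, and the paths and time are made up:

```bash
# Hypothetical: run the scan every day at 07:00 and keep a log.
# run_scan.sh stands in for your actual trigger (agent invocation, wrapper script, etc.).
0 7 * * * cd /path/to/opportunity-scout && ./run_scan.sh >> findings/cron.log 2>&1
```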
## Key Design Principles
- Demand, not news: Every finding should express unmet need, frustration, or a gap. Filter aggressively — 10 strong signals beat 100 weak ones.
- Batch queries: Combine niche + keywords into fewer, broader queries rather than one query per keyword (see the example after this list). Respect rate limits.
- Track over time: Signals that persist across scans are more valuable than one-offs. Use history.py to surface persistent demand and fading trends.
- Score honestly: High engagement + low competition + recurring = strong opportunity. Don't inflate scores — the user needs signal, not noise.
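To illustrate the batching principle, one hypothetical combined query for the example niche; the exact operator syntax depends on the search tool:

```bash
# One broad query instead of one search per keyword (syntax is illustrative).
QUERY='"AI tools for small business" (wish OR need OR "looking for" OR "alternative to" OR frustrated)'
# Pass $QUERY to web_search once, rather than issuing five narrower searches.
```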