dataforseo

Complete DataForSEO API integration for SEO data and analysis. Use when the user asks for keyword research, search volume, SERP analysis, backlink audits, competitor analysis, rank tracking, domain authority, technical SEO audits, content monitoring, Google Trends, or any SEO-related data queries. Covers all DataForSEO APIs including SERP, Keywords Data, DataForSEO Labs, Backlinks, OnPage, Domain Analytics, Content Analysis, Business Data, Merchant, App Data, and AI Optimization APIs. Outputs CSV files.

Safety Notice

This listing is imported from skills.sh public index metadata. Review upstream SKILL.md and repository scripts before running.

Copy the command below and send it to your AI assistant to install and learn this skill.

Install skill "dataforseo" with this command: npx skills add nikhilbhansali/dataforseo-skill-claude/nikhilbhansali-dataforseo-skill-claude-dataforseo

DataForSEO API Skill

Universal interface to all DataForSEO APIs for comprehensive SEO data retrieval and analysis.

Credential Setup

Before first use, set up credentials:

import sys, os
sys.path.insert(0, os.path.expanduser('~/.agents/skills/dataforseo/scripts'))
from dataforseo_client import save_credentials, verify_credentials

# Get credentials from https://app.dataforseo.com/
login = "your_email@example.com"  # API login (email)
password = "your_api_password"    # API password (from dashboard)

# Verify and save
if verify_credentials(login, password):
    save_credentials(login, password)
    print("Credentials saved!")

Credentials stored at ~/.dataforseo_config.json. To update, run setup again.

Quick Start

import sys, os
sys.path.insert(0, os.path.expanduser('~/.agents/skills/dataforseo/scripts'))
from dataforseo_client import *

# Example: Get search volume
response = keywords_search_volume(
    keywords=["seo tools", "keyword research"],
    location_name="United States"
)
results = extract_results(response)
csv_path = to_csv(results, "keyword_volumes")
print(f"Results saved to: {csv_path}")

API Selection Guide

User Request | Function to Use
Search volume, CPC, competition | keywords_search_volume()
Keyword ideas/suggestions | labs_keyword_ideas() or labs_related_keywords()
Keywords a site ranks for | labs_ranked_keywords()
SERP results for keyword | serp_google_organic()
Local/Maps rankings | serp_google_maps()
YouTube rankings | serp_youtube()
Backlink profile | backlinks_summary()
List of backlinks | backlinks_list()
Referring domains | backlinks_referring_domains()
Domain authority/rank | backlinks_bulk_ranks()
Competing domains | labs_competitors_domain()
Keyword gap analysis | labs_domain_intersection()
Link gap analysis | backlinks_domain_intersection()
Technical page audit | onpage_instant_pages()
Lighthouse scores | lighthouse_live()
Technology stack | domain_technologies()
Brand mentions | content_search()
Google Trends | google_trends()
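The selection guide above can be sketched as a small lookup helper. This is purely illustrative: the function names come from the table in this document, but the phrase-matching logic and INTENT_MAP name are hypothetical, not part of the skill.

```python
from typing import Optional

# Hypothetical mapping from request phrases to the function names listed
# in the API Selection Guide above (subset shown for brevity).
INTENT_MAP = {
    "search volume": "keywords_search_volume",
    "keyword ideas": "labs_keyword_ideas",
    "ranked keywords": "labs_ranked_keywords",
    "serp": "serp_google_organic",
    "backlink profile": "backlinks_summary",
    "referring domains": "backlinks_referring_domains",
    "google trends": "google_trends",
}

def pick_function(request: str) -> Optional[str]:
    """Return the first matching function name, or None if nothing matches."""
    text = request.lower()
    for phrase, func in INTENT_MAP.items():
        if phrase in text:
            return func
    return None
```

For example, pick_function("Get search volume for these terms") resolves to keywords_search_volume, which you would then import from dataforseo_client as shown in Quick Start.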

Core Workflow

  1. Import client: Add skill path and import functions
  2. Call API function: Pass required parameters
  3. Extract results: Use extract_results(response)
  4. Export to CSV: Use to_csv(results, "filename")
import sys, os
sys.path.insert(0, os.path.expanduser('~/.agents/skills/dataforseo/scripts'))
from dataforseo_client import labs_ranked_keywords, extract_results, to_csv

response = labs_ranked_keywords(
    target="competitor.com",
    location_name="United States",
    language_name="English",
    limit=500
)
results = extract_results(response)
csv_path = to_csv(results, "ranked_keywords")

Default Parameters

Most functions use these defaults:

  • location_name: "United States" (override with "India", "United Kingdom", etc.)
  • language_name: "English"
  • limit: 100 (increase up to 1000 for more results)
  • device: "desktop" (or "mobile" for SERP)
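The defaults above compose naturally into per-call parameters. A minimal sketch, assuming the client functions accept these keyword arguments; the build_params helper and DEFAULTS dict are illustrative, not part of the skill:

```python
# Documented defaults from this skill's parameter list.
DEFAULTS = {
    "location_name": "United States",
    "language_name": "English",
    "limit": 100,
}

def build_params(**overrides):
    """Merge caller overrides onto the documented defaults."""
    params = dict(DEFAULTS)
    params.update(overrides)
    return params
```

For example, build_params(location_name="India", limit=500) overrides location and limit while keeping language_name at its default.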

Common Location Names

  • United States, United Kingdom, India, Germany, Australia, Canada
  • For city-level: "New York,New York,United States", "London,England,United Kingdom"
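City-level location names follow a strict "City,Region,Country" pattern with no spaces after the commas. A small formatter, assuming that pattern holds for the locations you target (the helper itself is illustrative):

```python
def city_location(city: str, region: str, country: str) -> str:
    """Format a city-level location_name as 'City,Region,Country' (no spaces after commas)."""
    return ",".join([city, region, country])
```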

Output

All results export to CSV at ~/dataforseo_outputs/. Files auto-named with timestamp if not specified.

Reference Files

  • API Reference: references/api_reference.md - Complete endpoint documentation
  • Use Cases: references/use_cases.md - Ready-to-use code recipes

Error Handling

response = some_api_function(...)
if response.get("status_code") == 20000:
    results = extract_results(response)
    # Process results
else:
    print(f"Error: {response.get('status_message')}")

Rate Limits & Costs

  • 2000 requests/minute max
  • Live methods cost more than Standard
  • Check usage with get_user_data()
  • Response includes cost field
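Since each response carries a cost field, spend can be tallied client-side across a session. A sketch assuming cost is a top-level numeric field in each response dict (the helper is illustrative; use get_user_data() for authoritative balance figures):

```python
def total_cost(responses) -> float:
    """Sum the 'cost' field across response dicts; missing or null cost counts as 0."""
    return sum(r.get("cost", 0) or 0 for r in responses)
```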

Important Notes

  1. Async endpoints: Some APIs (merchant, app_data, business reviews) create tasks. Check task status separately.
  2. Limits: Increase limit parameter for comprehensive data (default 100, max usually 1000)
  3. Multiple keywords: Pass as list: keywords=["kw1", "kw2", "kw3"]
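When a keyword list exceeds what one request accepts, split it into batches. A sketch assuming a per-request keyword cap of 1000 (the actual cap varies by endpoint; check references/api_reference.md); the chunk_keywords helper is illustrative:

```python
def chunk_keywords(keywords, size=1000):
    """Split a keyword list into consecutive chunks of at most `size` items."""
    return [keywords[i:i + size] for i in range(0, len(keywords), size)]
```

Each chunk can then be passed as the keywords list to a function such as keywords_search_volume().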

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.

Related Skills

Skills related by shared tags or category signals.

General

enterprise-proposal

No summary provided by upstream source.

Repository Source · Needs Review
Security

skillguard-hardened

Security guard for OpenClaw skills, developed and maintained by rose北港(小红帽 / 猫猫帽帽). Audits installed or incoming skills with local rules plus Zenmux AI intent review, then recommends pass, warn, block, or quarantine.

Archived Source · Recently Updated
Security

api-contract-auditor

Checks API docs, examples, and field definitions for consistency and reports breaking-change risk. Use for api, contract, and audit workflows; do not use for directly modifying live interfaces or as a replacement for contract-testing platforms.

Archived Source · Recently Updated
Security

ai-workflow-red-team-lite

Runs lightweight red-team exercises against AI automation workflows, focusing on misuse paths, boundary failures, and data-leak risks. Use for red-team, ai, and workflow work; do not use for producing directly abusable attack scripts or for helping to break systems.

Archived Source · Recently Updated