hugging-face-cli

Manage the Hugging Face Hub via the hf CLI. Use this skill when working with Hugging Face models, datasets, Spaces, or repos.

Install this skill with:

npx skills add yevhendiachenko0/hugging-face-cli

Hugging Face CLI

Hugging Face (https://huggingface.co) is the leading platform for sharing and collaborating on AI models, datasets, and spaces. This skill enables interaction with the Hub through the official hf CLI.

Installation

Check if hf is available by running hf version. If not installed:

pip install -U "huggingface_hub[cli]"
# or
brew install hf

If the options above do not work, follow the official installation guide.

After installation, run hf version to verify. If the command is not found, run source ~/.bashrc (or source ~/.zshrc for zsh) to reload the PATH, then try again.
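The check-then-install flow above can be sketched as a small shell helper. ensure_tool is a hypothetical name for this sketch, not part of the hf CLI or pip:

```shell
# Hypothetical helper: run an install command only when a CLI tool is missing.
ensure_tool() {
  tool="$1"
  shift
  # command -v exits non-zero when the tool is not on PATH
  if ! command -v "$tool" >/dev/null 2>&1; then
    "$@"
  fi
}

# Usage (assumes pip is available):
# ensure_tool hf pip install -U "huggingface_hub[cli]"
```

This keeps the install idempotent: rerunning it on a machine that already has hf is a no-op.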

Authentication

A Hugging Face User Access Token is required. The token is provided via the HF_TOKEN environment variable.

If authentication fails or the token is missing, instruct the user to:

  1. Go to https://huggingface.co/settings/tokens
  2. Create a new token — there are two permission levels:
    • Read (safer): sufficient for searching, downloading models/datasets, listing repos, browsing papers, and most read-only operations. Choose this if you only need to explore and download.
    • Write (less safe, broader access): required for creating/deleting repos, uploading files, managing discussions, deploying endpoints, and running jobs. Example 3 (create a repo and upload weights) requires a write token.
  3. Set it as an environment variable: export HF_TOKEN="hf_..." (add to shell profile for persistence)

Important: Do NOT run hf auth login interactively — it requires terminal input. Instead, use the environment variable directly. The hf CLI automatically picks up HF_TOKEN from the environment for all commands. To verify authentication, run:

hf auth whoami
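A minimal sketch of that flow: fail fast when HF_TOKEN is missing before calling any hf command. require_hf_token is a hypothetical helper name, not an hf subcommand:

```shell
# Hypothetical guard: refuse to proceed when HF_TOKEN is not set.
require_hf_token() {
  if [ -z "${HF_TOKEN:-}" ]; then
    echo "HF_TOKEN is not set; create one at https://huggingface.co/settings/tokens" >&2
    return 1
  fi
}

# Usage: require_hf_token && hf auth whoami
```

Note the guard prints its message to stderr and never echoes the token value itself, in line with the safety rules below.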

Key Commands

Task | Command
---- | -------
Check current user | hf auth whoami
Download files | hf download <repo_id> [files...] [--local-dir <path>]
Download specific revision | hf download <repo_id> --revision <branch|tag|commit>
Download with filters | hf download <repo_id> --include "*.safetensors" --exclude "*.bin"
Upload files | hf upload <repo_id> <local_path> [path_in_repo]
Upload as PR | hf upload <repo_id> <local_path> [path_in_repo] --create-pr
Upload (private repo) | hf upload <repo_id> <local_path> [path_in_repo] --private
Upload large folder | hf upload-large-folder <repo_id> <local_path>
Create a repo | hf repos create <name> [--repo-type model|dataset|space] [--private]
Delete a repo | hf repos delete <repo_id>
Delete files from repo | hf repos delete-files <repo_id> <path>...
Duplicate a repo | hf repos duplicate <repo_id> [--type model|dataset|space]
Repo settings | hf repos settings <repo_id> [--private|--public]
Manage branches | hf repos branch create|delete <repo_id> <branch>
Manage tags | hf repos tag create|delete <repo_id> <tag>
List models | hf models ls [--search <query>] [--sort downloads] [--limit N]
Model info | hf models info <repo_id>
List datasets | hf datasets ls [--search <query>]
Dataset info | hf datasets info <repo_id>
Run SQL on data | hf datasets sql "<SQL>"
List spaces | hf spaces ls [--search <query>]
Space info | hf spaces info <repo_id>
Space dev mode | hf spaces dev-mode <repo_id>
List papers | hf papers ls [--limit N]
List collections | hf collections ls [--owner <user>] [--sort trending]
Create collection | hf collections create "<title>"
Collection info | hf collections info <collection_slug>
Add to collection | hf collections add-item <collection_slug> <repo_id> <type>
Delete collection | hf collections delete <collection_slug>
Run a cloud job | hf jobs run <docker_image> <command>
List jobs | hf jobs ps
Job logs | hf jobs logs <job_id>
Cancel a job | hf jobs cancel <job_id>
Job hardware | hf jobs hardware
Deploy endpoint | hf endpoints deploy <name> --repo <repo_id> --framework <fw> --accelerator <hw> ...
List endpoints | hf endpoints ls
Endpoint info | hf endpoints describe <name>
Pause/resume endpoint | hf endpoints pause|resume <name>
Delete endpoint | hf endpoints delete <name>
List discussions | hf discussions ls <repo_id>
Create discussion | hf discussions create <repo_id> --title "<title>"
Comment on discussion | hf discussions comment <repo_id> <num> --body "<text>"
Close discussion | hf discussions close <repo_id> <num>
Merge PR | hf discussions merge <repo_id> <num>
Manage cache | hf cache ls, hf cache rm <id>, hf cache prune
Delete bucket / files | hf buckets delete <user>/<bucket>, hf buckets rm <user>/<bucket>/<path>
Sync to bucket | hf sync <local_path> hf://buckets/<user>/<bucket>
Print environment | hf env

End-to-End Examples

Example 1: Explore trending models, pick one, and preview a download

hf models ls --sort trending_score --limit 5
hf models info openai-community/gpt2
hf download --dry-run openai-community/gpt2 config.json tokenizer.json
hf download openai-community/gpt2 config.json tokenizer.json --local-dir ./gpt2

Example 2: Browse today's papers and find related datasets

hf papers ls --limit 5
hf datasets ls --search "code" --sort downloads --limit 5
hf datasets info bigcode/the-stack

Example 3: Create a private model repo and upload weights

hf repos create my-fine-tuned-model --private
# create returns <your-username>/my-fine-tuned-model — use that full ID below
hf upload <username>/my-fine-tuned-model ./output --commit-message "Add fine-tuned weights"
hf repos tag create <username>/my-fine-tuned-model v1.0 -m "Initial release"
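Because create returns a fully qualified ID, a quick sanity check before uploading can catch a bare repo name. is_full_repo_id is a hypothetical helper for this sketch, not an hf command:

```shell
# Hypothetical check: a fully qualified repo ID has the form <namespace>/<name>.
is_full_repo_id() {
  case "$1" in
    */*) return 0 ;;
    *)   return 1 ;;
  esac
}

# Usage:
# is_full_repo_id "alice/my-fine-tuned-model" \
#   && hf upload "alice/my-fine-tuned-model" ./output
```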

Further Reference

Reference version: hf CLI v1.x

For the full list of commands and options, use built-in help:

hf --help
hf <command> --help

Safety Rules

  • Destructive commands require explicit user confirmation. Before running any of the following, describe what will happen and ask the user to confirm:
    • hf repos delete — permanently deletes a repository
    • hf repos delete-files — deletes files from a repository
    • hf buckets delete / hf buckets rm — deletes buckets or bucket files
    • hf discussions close / hf discussions merge — closes or merges PRs/discussions
    • hf collections delete — permanently deletes a collection
    • hf endpoints delete — permanently deletes an Inference Endpoint
    • hf jobs cancel — cancels a running compute job
    • Any command with --delete flag (e.g., sync with deletion)
    • hf cache rm / hf cache prune — removes cached data from disk (re-downloadable, but may waste bandwidth)
  • Never expose or log the HF_TOKEN value. Do not include it in command output or commit it to files.
  • When uploading, warn the user if the target repo is public and the upload may contain sensitive data.
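One way to enforce the confirmation rule in scripts is a wrapper that refuses destructive commands unless the user has explicitly opted in. run_destructive and the CONFIRM variable are assumptions of this sketch, not hf features:

```shell
# Hypothetical wrapper: only run a destructive command when CONFIRM=yes.
run_destructive() {
  if [ "${CONFIRM:-}" != "yes" ]; then
    echo "refusing to run without CONFIRM=yes: $*" >&2
    return 1
  fi
  "$@"
}

# Usage:
# CONFIRM=yes run_destructive hf repos delete <repo_id>
```

Requiring an explicit environment variable makes the confirmation visible in the command line itself, so a destructive call can never be pasted and run by accident.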
