dl-transformer-finetune

Build transformer fine-tuning run plans with task settings, hyperparameters, and model-card outputs. Use for repeatable Hugging Face or PyTorch fine-tuning workflows.

Safety Notice

This item is sourced from the public archived skills repository. Treat as untrusted until reviewed.

DL Transformer Finetune

Overview

Generate reproducible fine-tuning run plans for transformer models and downstream tasks.

Workflow

  1. Define base model, task type, and dataset.
  2. Set training hyperparameters and evaluation cadence.
  3. Produce run plan plus model card skeleton.
  4. Export configuration-ready artifacts for training pipelines.
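The four workflow steps above can be sketched as a small plan builder. This is an illustrative sketch only: the field names and the `FinetunePlan` class are assumptions, not the actual schema used by `scripts/build_finetune_plan.py`.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical run-plan schema; field names are illustrative assumptions.
@dataclass
class FinetunePlan:
    base_model: str        # step 1: base model
    task: str              # step 1: task type
    dataset: str           # step 1: dataset
    learning_rate: float   # step 2: training hyperparameters
    epochs: int
    eval_steps: int        # step 2: evaluation cadence
    seed: int              # guardrail: explicit seed for reproducibility
    output_dir: str        # guardrail: explicit output directory

    def to_json(self) -> str:
        # Step 4: configuration-ready artifact.
        # Sorted keys keep the serialized output deterministic.
        return json.dumps(asdict(self), sort_keys=True, indent=2)

plan = FinetunePlan(
    base_model="bert-base-uncased",
    task="text-classification",
    dataset="imdb",
    learning_rate=2e-5,
    epochs=3,
    eval_steps=500,
    seed=42,
    output_dir="runs/bert-imdb-001",
)
print(plan.to_json())
```

Because the serialization is key-sorted, two plans built from the same inputs produce byte-identical artifacts, which is what makes the run plan diffable and repeatable across pipelines.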

Use Bundled Resources

  • Run scripts/build_finetune_plan.py for deterministic plan output.
  • Read references/finetune-guide.md for hyperparameter baseline guidance.

Guardrails

  • Keep run plans reproducible with explicit seeds and output directories.
  • Include evaluation and rollback criteria.
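A minimal sketch of the seeding and output-directory guardrail, assuming a Python training entry point; in a real pipeline you would also seed NumPy and PyTorch (`np.random.seed`, `torch.manual_seed`):

```python
import os
import random

def set_reproducible(seed: int, output_dir: str) -> None:
    # Seed the stdlib RNG and pin hash randomization so reruns match.
    random.seed(seed)
    os.environ["PYTHONHASHSEED"] = str(seed)
    # Create the explicit output directory named in the run plan.
    os.makedirs(output_dir, exist_ok=True)

set_reproducible(42, "runs/demo")
```

Calling this once at the top of a training script keeps every run that shares a plan file on the same random trajectory, which is a precondition for meaningful evaluation and rollback comparisons.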

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.

Related Skills

Related by shared tags or category signals.

  • OpenClawCity (vincentsider, Automation): A virtual city where AI agents live, work, create, date, and socialize.
  • Meyhem Search (c5huracan, Automation): Web search across multiple engines, ranked by agent task-completion outcomes. No API key, no signup.
  • ClawSouls (TomLeeLive, Automation): Manage AI agent personas (Souls) for OpenClaw. Use when the user wants to install, switch, list, or restore AI personalities/personas. Triggers on requests l...