# OpenClaw Cloud Backup
Backs up OpenClaw state to a local archive and uploads it to cloud storage.
Follow the steps below. After execution, inform the user of any implicit defaults that were applied.
## Backup procedure (follow every step)
### Step 1: Run the backup

```shell
bash "{baseDir}/scripts/cloud-backup.sh" backup full
```

The default mode is `full`. Use `workspace`, `skills`, or `settings` only when the user explicitly asks for a narrower backup scope.
### Step 2: Check backup output for encryption warning

Look at the script output from Step 1. If it contains:

`WARN: Encryption is disabled — backup archive will be stored in plaintext.`
ask the user:
"Your backups are not encrypted. Archives contain config, credentials, and API keys in cleartext. Want to set a passphrase? (AES-256, just the passphrase needed to restore — no key files.)"
- If the user provides a passphrase → write both values via `gateway config.patch` and tell the user what was saved:
  - `skills.entries.cloud-backup.config.encrypt = true`
  - `skills.entries.cloud-backup.env.GPG_PASSPHRASE = "<passphrase>"`

  Then re-run the backup so the archive is encrypted.
- If user says no / skip → continue to Step 3.
- If the warning is not present (encryption already enabled) → continue to Step 3.
Always execute this step and report the result. Backups may contain secrets, so prompt every time encryption is off.
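The Step 1 run plus the Step 2 check can be sketched in shell. The sample output below is simulated (only the `WARN:` line mirrors the real script message); the detection logic is the point:

```shell
#!/usr/bin/env sh
# Simulated script output; only the WARN line mirrors the real message.
out='Archive written to backups/openclaw-full.tar.gz
WARN: Encryption is disabled — backup archive will be stored in plaintext.'

# Match on a stable prefix of the warning rather than the full sentence.
if printf '%s\n' "$out" | grep -q '^WARN: Encryption is disabled'; then
  echo "encryption-off: prompt the user for a passphrase"
else
  echo "encryption-on: continue to Step 3"
fi
```

Matching a prefix keeps the check robust if the wording after the dash ever changes.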
### Step 3: Check script output for warnings

Look at the script output. If it contains `WARN: Cloud storage is not configured`, go to Step 4. Otherwise go to Step 5.
### Step 4: Cloud not configured — ask user to set it up
Tell the user the backup was saved locally, then ask:
"Cloud storage isn't configured yet — backups are local only. Want to set up cloud upload? I support AWS S3, Cloudflare R2, Backblaze B2, MinIO, DigitalOcean Spaces, or any other S3-compatible service."
- If user says yes → go to Cloud setup section below, then re-run the backup.
- If user says no / local-only → set `config.upload = false` via `gateway config.patch` and confirm the change to the user. Done.
Always execute this step — offer cloud setup when it's missing and report what was configured.
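For the local-only branch, the stored result under `skills.entries.cloud-backup` would look like the fragment below. This is a sketch of the shape, inferred from the key paths used elsewhere in this document; the write itself goes through `gateway config.patch`:

```json
{
  "skills": {
    "entries": {
      "cloud-backup": {
        "config": { "upload": false }
      }
    }
  }
}
```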
### Step 5: Report result and ensure scheduling
Report the backup paths from the script output to the user.
Then check whether a daily cron job already exists for cloud-backup (`cron action=list`). If no cron job exists, create one with the default schedule and tell the user:

```json
{
  "schedule": { "kind": "cron", "expr": "0 2 * * *" },
  "payload": { "kind": "agentTurn", "message": "Run cloud-backup: backup full" },
  "sessionTarget": "isolated"
}
```
Default: daily at 2 AM. Use a different schedule only if the user explicitly asked for one.
Tell the user: "Scheduled daily backup at 2 AM." (or whatever time was set).
Execute this step by default unless the user asked to skip it. Report the result: whether the cron job was created or creation failed.
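The exists-check in Step 5 reduces to scanning the cron list for this skill's job. A shell sketch against a simulated listing (the real `cron action=list` output format may differ):

```shell
#!/usr/bin/env sh
# Simulated listing; the real output format may differ.
cron_list='id=a1 name=daily-digest expr="0 9 * * 1"
id=b2 name=log-rotate expr="0 0 * * *"'

if printf '%s\n' "$cron_list" | grep -q 'cloud-backup'; then
  echo "cron-exists: nothing to do"
else
  echo "no-cron: create job with expr 0 2 * * *"
fi
```

Matching on the job name keeps the check independent of field order in the listing.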
## Cloud setup
When the user agrees to configure cloud storage:
- Ask which provider: AWS S3, Cloudflare R2, Backblaze B2, MinIO, DigitalOcean Spaces, or other.
- Read the matching provider guide from `references/providers/` — it has exact config keys, endpoint format, and credential steps.
- Collect and write config via `gateway config.patch`: bucket, credentials, and endpoint (if non-AWS).
- Run `status` to verify connectivity, then re-run the backup.
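One pitfall in these steps: every non-AWS provider needs an explicit `endpoint`, while AWS S3 derives one from the region. A tiny sketch of that rule (provider names and values are placeholders for illustration):

```shell
#!/usr/bin/env sh
# Placeholder values for illustration.
provider="cloudflare-r2"
endpoint=""

# AWS derives its endpoint from the region; everyone else must set one.
if [ "$provider" != "aws-s3" ] && [ -z "$endpoint" ]; then
  echo "missing-endpoint"
else
  echo "config-ok"
fi
```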
## Commands

```shell
bash "{baseDir}/scripts/cloud-backup.sh" <command>
```
| Command | What it does |
|---|---|
| `backup [full\|workspace\|skills\|settings]` | Create archive + upload if configured. Default: `full` |
| `list` | Show local + remote backups |
| `restore <name> [--dry-run] [--yes]` | Restore from local or cloud. Always `--dry-run` first |
| `cleanup` | Prune old archives (local: capped at 7; cloud: count + age) |
| `status` | Show current config and dependency check |
## Config reference

All keys live under `skills.entries.cloud-backup` in the OpenClaw config. Don't write defaults; the script handles them.
### config.*
| Key | Default | Description |
|---|---|---|
| `bucket` | — | Storage bucket name (required for cloud) |
| `region` | `us-east-1` | Region hint |
| `endpoint` | (none) | S3-compatible endpoint (required for non-AWS) |
| `profile` | (none) | Named AWS CLI profile (alternative to keys) |
| `upload` | `true` | Upload to cloud after backup |
| `encrypt` | `false` | GPG-encrypt archives |
| `retentionCount` | `10` | Cloud: keep N backups. Local: capped at 7 |
| `retentionDays` | `30` | Cloud only: delete archives older than N days |
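The local cap of 7 described above amounts to keep-newest-seven pruning. A self-contained sketch, with temp files standing in for real archives (the real script's pruning logic may differ):

```shell
#!/usr/bin/env sh
# Create 9 fake archives in a scratch dir.
dir=$(mktemp -d)
for i in 1 2 3 4 5 6 7 8 9; do : > "$dir/backup-$i.tar.gz"; done

# Keep the 7 newest by mtime, remove the rest.
ls -1t "$dir" | tail -n +8 | while IFS= read -r f; do rm "$dir/$f"; done

count=$(ls -1 "$dir" | wc -l | tr -d ' ')
echo "$count"   # prints 7
rm -rf "$dir"
```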
### env.*
| Key | Description |
|---|---|
| `ACCESS_KEY_ID` | S3-compatible access key |
| `SECRET_ACCESS_KEY` | S3-compatible secret key |
| `SESSION_TOKEN` | Optional temporary token |
| `GPG_PASSPHRASE` | For automated encryption/decryption |
## Provider guides

Read the relevant one only during setup:

- `references/providers/aws-s3.md`
- `references/providers/cloudflare-r2.md`
- `references/providers/backblaze-b2.md`
- `references/providers/minio.md`
- `references/providers/digitalocean-spaces.md`
- `references/providers/other.md` — any S3-compatible service
## Security

See `references/security.md` for credential handling and troubleshooting.