# Synology Backup
Back up OpenClaw data to a Synology NAS over SMB or SSH/rsync. Designed for secure, automated daily snapshots with configurable retention, integrity verification, and failure alerting.
## Setup

### 1. Network Connectivity
For VPS-to-NAS backups, use Tailscale for secure connectivity without exposing SMB to the internet:
- Install Tailscale on the Synology (Package Center → search "Tailscale")
- Install Tailscale on the VPS — see Tailscale's official install guide for your platform
- Join both to the same tailnet
- Use the Synology's Tailscale IP in config
For local network setups, use the NAS local IP directly.
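Tailscale assigns addresses from the CGNAT range 100.64.0.0/10, so a script can cheaply flag a `host` value that is clearly not a tailnet IP. A minimal sketch (the helper name and check are illustrative, not part of the skill's scripts):

```bash
# is_tailscale_ip: succeed if $1 falls in Tailscale's CGNAT range (100.64.0.0/10).
is_tailscale_ip() {
  case "$1" in
    100.*)
      second=${1#100.}; second=${second%%.*}   # extract the second octet
      [ "$second" -ge 64 ] && [ "$second" -le 127 ]
      ;;
    *) return 1 ;;
  esac
}

is_tailscale_ip 100.101.1.2 && echo "tailnet address"
is_tailscale_ip 192.168.1.5 || echo "not a tailnet address (local LAN?)"
```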
### 2. Synology Preparation

- Create a dedicated user on the Synology (e.g., `openclaw-backup`) with minimal permissions
- Create or choose a shared folder (e.g., `backups`)
- Grant the user read/write access to only that folder — not admin access
### 3. Credentials File (SMB transport)

Create an SMB credentials file with restricted permissions — never store credentials in config or scripts:

```bash
touch ~/.openclaw/.smb-credentials
chmod 600 ~/.openclaw/.smb-credentials
# Add two lines:
# username=<your-synology-user>
# password=<your-synology-password>
```
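The same setup can be scripted non-interactively; a sketch with placeholder values, where `umask` ensures the file is never readable by others even for a moment:

```bash
CRED_FILE="$HOME/.openclaw/.smb-credentials"
mkdir -p "$(dirname "$CRED_FILE")"

# Create with restrictive permissions from the start, then lock to 600.
umask 177
printf 'username=%s\npassword=%s\n' 'openclaw-backup' 'changeme' > "$CRED_FILE"
chmod 600 "$CRED_FILE"
```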
### 4. Configuration

Create `~/.openclaw/synology-backup.json`:

```json
{
  "host": "100.x.x.x",
  "share": "backups/openclaw",
  "mountPoint": "/mnt/synology",
  "credentialsFile": "~/.openclaw/.smb-credentials",
  "smbVersion": "3.0",
  "transport": "smb",
  "notifyOnSuccess": false,
  "backupPaths": [
    "~/.openclaw/workspace",
    "~/.openclaw/openclaw.json",
    "~/.openclaw/cron",
    "~/.openclaw/agents"
  ],
  "backupExclude": [],
  "includeSubAgentWorkspaces": true,
  "retention": 7,
  "preRestoreRetention": 3,
  "schedule": "0 3 * * *"
}
```
**SSH transport (recommended):** Set `"transport": "ssh"` and add `"sshUser": "your-user"`. No credentials file needed — uses SSH key auth. Requires rsync + SSH access to the Synology.
⚠️ **SSH host key warning:** Scripts use `StrictHostKeyChecking=yes`. Add your NAS host key to `~/.ssh/known_hosts` first by connecting manually once (`ssh user@nas-ip`) before enabling automation.
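If you'd rather pre-provision the host key than connect manually, `ssh-keyscan` can append it — but verify the fingerprint out of band afterwards (e.g. against DSM's own display), since a first scan trusts whatever answers. A sketch with a placeholder address:

```bash
NAS_IP="192.0.2.10"                 # placeholder — use your NAS's Tailscale or LAN IP
mkdir -p ~/.ssh && chmod 700 ~/.ssh

# Append the NAS host key; cross-check its fingerprint before relying on it.
ssh-keyscan -T 5 "$NAS_IP" >> ~/.ssh/known_hosts 2>/dev/null || true
chmod 600 ~/.ssh/known_hosts
```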
**Sensitive files:** The `.env` file (containing API keys) is excluded by default. Only add it to `backupPaths` if your NAS share is restricted to a dedicated low-privilege user and encrypted at rest. When in doubt, leave it out — you can always re-enter API keys from scratch.
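All scripts read this file with `jq`; the access pattern presumably looks something like the following sketch (field names match the example config above; the file path here is a temp stand-in):

```bash
# Stand-in config file for illustration.
CONFIG=$(mktemp)
cat > "$CONFIG" <<'EOF'
{ "host": "100.101.1.2", "transport": "smb", "retention": 7 }
EOF

# jq's // operator supplies a default when the field is absent or null.
HOST=$(jq -r '.host' "$CONFIG")
TRANSPORT=$(jq -r '.transport // "smb"' "$CONFIG")
RETENTION=$(jq -r '.retention // 7' "$CONFIG")
echo "backing up to $HOST via $TRANSPORT, keeping $RETENTION days"
```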
| Field | Description | Default |
|---|---|---|
| `host` | Synology IP (Tailscale or local) | required |
| `share` | SMB share path | required |
| `mountPoint` | Local mount point | `/mnt/synology` |
| `credentialsFile` | Path to SMB credentials file | required (SMB) |
| `smbVersion` | SMB protocol version | `3.0` |
| `transport` | `smb` or `ssh` | `smb` |
| `backupPaths` | Paths to back up | workspace + config |
| `backupExclude` | rsync exclude patterns (`.git/`, `node_modules/` always excluded) | `[]` |
| `includeSubAgentWorkspaces` | Auto-include `workspace-*` dirs | `true` |
| `retention` | Days of daily snapshots to keep | `7` |
| `preRestoreRetention` | Days to keep pre-restore safety snapshots | `3` |
| `notifyOnSuccess` | Write success state and allow higher-level success reporting if desired | `false` |
| `schedule` | Cron expression (host timezone) | `0 3 * * *` |
| `sshUser` | SSH username (required for `ssh` transport) | — |
| `sshHost` | SSH hostname | defaults to `host` |
| `sshPort` | SSH port | `22` |
| `sshDest` | Remote backup directory path (required for `ssh` transport) | — |
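For reference, a `retention` window like the one above can be enforced with a simple `find`-based prune over the dated snapshot directories. A self-contained sketch (the directory names and layout are illustrative; the skill's own scripts define the real behavior):

```bash
BACKUP_ROOT=$(mktemp -d)          # stand-in for the mounted backup share
RETENTION=7

# Simulate one fresh and one stale snapshot directory.
mkdir -p "$BACKUP_ROOT/2026-03-25" "$BACKUP_ROOT/2026-01-01"
touch -d '1 day ago'   "$BACKUP_ROOT/2026-03-25"
touch -d '30 days ago' "$BACKUP_ROOT/2026-01-01"

# Remove top-level snapshot dirs whose mtime is older than the retention window;
# -maxdepth 1 keeps find from descending into snapshot contents.
find "$BACKUP_ROOT" -maxdepth 1 -mindepth 1 -type d \
     -mtime +"$RETENTION" -exec rm -rf {} +
```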
### 5. Install Dependencies

```bash
apt-get install -y cifs-utils rsync jq
```

`jq` is required — all scripts use it to parse the config JSON.
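A script can guard against a missing tool up front rather than failing mid-backup; a minimal preflight sketch (not part of the skill's scripts):

```bash
# Collect any required tools that are not on PATH.
missing=""
for tool in rsync jq; do
  command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done

if [ -n "$missing" ]; then
  echo "missing required tools:$missing" >&2
else
  echo "all dependencies present"
fi
```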
### 6. Register the Backup Cron

```bash
openclaw cron add \
  --name "Synology Backup" \
  --schedule "0 3 * * *" \
  --tz "America/Los_Angeles" \
  --message "Run the daily Synology backup: bash ~/.openclaw/workspace/skills/synology-backup/scripts/backup.sh && bash ~/.openclaw/workspace/skills/synology-backup/scripts/verify.sh. If either step fails, use the OpenClaw message tool explicitly to alert the target channel with the failing step and key error text. Then reply NO_REPLY."
```
## Usage

### Backup Now

```bash
scripts/backup.sh
```

Runs an incremental backup. Add `--dry-run` to preview what would be backed up without touching anything.
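Since the transfers are rsync-based, `--dry-run` presumably maps to rsync's own `-n` flag, which reports what would transfer without copying anything. A toy demonstration:

```bash
SRC=$(mktemp -d); DEST=$(mktemp -d)
echo "data" > "$SRC/file.txt"

# -a archive mode, -n dry run, -v list the files that would be copied.
rsync -anv "$SRC/" "$DEST/"

ls -A "$DEST"    # still empty — the dry run copied nothing
```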
### Check Status

```bash
scripts/status.sh
```

Shows mount health, last backup time, snapshot count, total size, and pre-restore safety snapshots.
### Verify Integrity

```bash
scripts/verify.sh              # verify latest snapshot
scripts/verify.sh 2026-03-25   # verify a specific date
```

Checksums key files and compares directory entry counts against the snapshot to confirm data integrity.
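The checksum side of this can be reproduced with `md5sum` directly (the tool the skill lists under its used commands); a toy sketch comparing a source file against its snapshot copy:

```bash
SRC=$(mktemp -d); SNAP=$(mktemp -d)   # stand-ins for live data and a snapshot
echo "config" > "$SRC/openclaw.json"
cp "$SRC/openclaw.json" "$SNAP/openclaw.json"

# Record checksums on the source side, then verify them against the snapshot.
( cd "$SRC"  && md5sum openclaw.json ) > "$SRC/checksums.md5"
( cd "$SNAP" && md5sum -c "$SRC/checksums.md5" )   # prints "openclaw.json: OK"
```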
### Restore a Snapshot

```bash
scripts/restore.sh              # list available snapshots
scripts/restore.sh 2026-03-25   # restore from a specific date
```

Before restoring, the script automatically saves a pre-restore safety snapshot of your current state. If the restore goes wrong, restore the safety snapshot to undo it.
## What Gets Backed Up

- `~/.openclaw/workspace/` — memory, SOUL, AGENTS, skills, all workspace files
- `~/.openclaw/workspace-*/` — all sub-agent workspaces (if enabled)
- `~/.openclaw/openclaw.json` — main config
- `~/.openclaw/cron/` — cron job definitions
- `~/.openclaw/agents/` — agent configurations
- `~/.openclaw/.env` — opt-in only (contains API keys)
## Snapshot Structure

```
backups/
├── 2026-03-25/
│   ├── manifest.json    # timestamp, host, path counts
│   ├── workspace/
│   ├── workspace-news/
│   ├── agents/
│   ├── cron/
│   └── openclaw.json
├── pre-restore-2026-03-25-143022/   # safety snapshot before restore
├── 2026-03-24/
└── ...
```
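A `manifest.json` might look like the following — the field names are illustrative guesses based on the tree comment above; the scripts define the actual schema:

```json
{
  "timestamp": "2026-03-25T03:00:12Z",
  "host": "my-vps",
  "pathCounts": {
    "workspace": 1482,
    "cron": 6,
    "agents": 3
  }
}
```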
## Failure Alerting

If a backup fails, the recommended pattern is for the OpenClaw cron/session layer to send the alert explicitly via the message tool.

Recommended alert contents:

- which step failed (`backup.sh` or `verify.sh`)
- key error text
- hostname and snapshot date, when available
## Before You Enable Automated Cron

Run with `--dry-run` first and review the output:

```bash
scripts/backup.sh --dry-run
```

Then make sure your cron prompt owns notification behavior explicitly instead of relying on shell-side delivery.
## Security Notes

- **Credentials:** Always use a dedicated SMB credentials file with `chmod 600`. Never inline secrets in config, scripts, or fstab.
- **jq required:** Install with `apt-get install -y jq`. All scripts depend on it for config parsing.
- **Notifications:** Keep notification ownership in the OpenClaw cron/session layer. Avoid hidden shell-side delivery paths.
- **SSH host keys:** Pre-provision `~/.ssh/known_hosts` and keep `StrictHostKeyChecking=yes` in `lib.sh` for hardened setups.
- **Network:** Use Tailscale or a VPN for remote backups. Never expose SMB (port 445) to the public internet.
- **Sensitive data:** `.env` is excluded by default — only include it if NAS access is tightly restricted.
- **NAS user:** Dedicated user with access to only the backup share — not an admin account.
- **Input validation:** All config values are validated before use — no shell injection via host, share, mount, or path fields.
- **Path allowlist:** Restore uses an explicit allowlist (`workspace`, `cron`, `agents`, `openclaw.json`, `.env`) — no arbitrary path writes.
## System Access

- **Files read:** `~/.openclaw/synology-backup.json`, `~/.openclaw/.smb-credentials`
- **Files written:** Synology NAS share (via SMB mount or SSH/rsync), `manifest.json` in each snapshot
- **Network:** SMB (port 445) or SSH (port 22) to the Synology NAS IP only
- **Commands used:** `mount`, `rsync`, `cp`, `find`, `du`, `df`, `md5sum`, `jq` (required)
- **Secrets stored:** None — SMB credentials live in a separate `chmod 600` file