ollama-stack

Deploy a local LLM stack for offline and privacy-first workflows.

Safety Notice

This listing is imported from the skills.sh public index metadata. Review the upstream SKILL.md and repository scripts before running anything.

Copy the command below and send it to your AI assistant to install this skill:

Install skill "ollama-stack" with this command: npx skills add bagelhole/devops-security-agent-skills/bagelhole-devops-security-agent-skills-ollama-stack

Ollama Stack


Minimal Setup

curl -fsSL https://ollama.com/install.sh | sh
ollama serve
ollama pull llama3.1:8b
ollama run llama3.1:8b
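Once `ollama serve` is running, the models can also be queried over Ollama's local REST API, which listens on port 11434 by default. A minimal sketch; the prompt text is just an example:

```shell
# Build a request payload for Ollama's /api/generate endpoint.
# "stream": false asks for a single JSON response instead of a token stream.
PAYLOAD='{"model": "llama3.1:8b", "prompt": "Why is the sky blue?", "stream": false}'
echo "$PAYLOAD"

# With `ollama serve` running, send it like this:
# curl -s http://localhost:11434/api/generate -d "$PAYLOAD"
```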

Docker Compose Pattern

  • Ollama container with persistent model volume

  • Open WebUI for chat interface

  • Optional LiteLLM proxy for unified API routing
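The pattern above might look like the following compose file. This is a hedged sketch, not the upstream repository's file: the service names, volume name, and port mappings are assumptions, though `ollama/ollama` and `ghcr.io/open-webui/open-webui:main` are the projects' published images.

```yaml
# docker-compose.yml - sketch only; adjust images, ports, and volumes to taste.
services:
  ollama:
    image: ollama/ollama              # official Ollama image
    volumes:
      - ollama-models:/root/.ollama   # persistent model storage
    ports:
      - "127.0.0.1:11434:11434"       # bind to loopback, not 0.0.0.0

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434  # point the UI at the Ollama service
    ports:
      - "127.0.0.1:3000:8080"
    depends_on:
      - ollama

volumes:
  ollama-models:
```

Binding the published ports to 127.0.0.1 keeps both services off the LAN, which lines up with the network-exposure practice below.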

Best Practices

  • Pin model versions for reproducibility.

  • Monitor VRAM, RAM, and swap utilization.

  • Restrict network exposure to trusted subnets.
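On the network-exposure point: Ollama reads its listen address from the `OLLAMA_HOST` environment variable and binds to loopback on port 11434 by default. A minimal sketch:

```shell
# Keep the API on loopback only (this is also the default).
# Change the address deliberately if you need LAN access, and
# firewall it to trusted subnets.
export OLLAMA_HOST=127.0.0.1:11434
echo "$OLLAMA_HOST"

# Restart the server so the setting takes effect:
# ollama serve
```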

Related Skills

  • mac-mini-llm-lab - Apple Silicon optimization

  • docker-compose - Service orchestration

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.

Related Skills

Related by shared tags or category signals. No summaries were provided by the upstream source; each entry below is flagged "Needs Review".

  • sops-encryption (Security)

  • linux-administration (Security)

  • linux-hardening (Security)