MLX Apple Silicon MLX

MLX-powered local AI — run LLMs, Stable Diffusion, speech-to-text, and embeddings natively on Apple Silicon via MLX. mlx-lm handles LLM inference, mflux...
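
A rough sketch of what that LLM path looks like in practice, using the mlx-lm package mentioned above; the model name is only an example from the mlx-community Hugging Face org, not something this listing prescribes:

    from mlx_lm import load, generate

    # Example model choice, not part of this listing; any MLX-converted
    # checkpoint from the mlx-community org loads the same way.
    model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

    prompt = "Explain in one sentence what MLX is."
    text = generate(model, tokenizer, prompt=prompt, max_tokens=100)
    print(text)

Everything runs on-device: the first call downloads the weights from the Hugging Face hub, and later runs reuse the local cache.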

Safety Notice

This listing is from the official public ClawHub registry. Review SKILL.md and referenced scripts before running.

Copy this and send it to your AI assistant to learn the skill:

Install skill "MLX Apple Silicon MLX" with this command: npx skills add mlx-apple-silicon-mlx

No markdown body

This source entry does not include full markdown content beyond metadata.

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.

Related Skills

Related by shared tags or category signals.

Coding

Mflux Image Router

Local mflux image generation on Apple Silicon — mflux routes Z-Image-Turbo, Flux Dev, Flux Schnell across your Mac fleet. mflux is MLX-native for Mac Studio,...

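For the mflux skill above, a minimal sketch of kicking off a local Flux Schnell render from Python. It assumes mflux is installed and that its mflux-generate command accepts the flags shown; the flag names are taken from mflux's README and should be checked against your installed version.

    import subprocess

    # Invoke mflux's CLI for a quick Flux Schnell render. Flag names are
    # assumptions based on the mflux README; confirm them with
    # mflux-generate --help before relying on this.
    subprocess.run(
        [
            "mflux-generate",
            "--model", "schnell",   # fast Flux variant, needs only a few steps
            "--prompt", "a Mac Studio on a desk, product photo",
            "--steps", "2",
            "--seed", "2",
            "-q", "8",              # 8-bit quantization to reduce memory use
        ],
        check=True,
    )
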
General

Stable Diffusion SD3

Stable Diffusion 3 and SD3.5 Large on Apple Silicon — generate Stable Diffusion images locally with DiffusionKit's MLX-native backend. SD3 Medium for fast St...

Coding

Local LLM Router

Local LLM model router for Llama, Qwen, DeepSeek, Phi, Mistral, and Gemma across multiple devices. Self-hosted local LLM inference routing on macOS, Linux, a...

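The routing idea in the Local LLM Router entry can be illustrated with a small client that tries a list of self-hosted, OpenAI-compatible endpoints in order. The host names and port here are hypothetical; on a Mac such an endpoint could come from something like python -m mlx_lm.server, and on other boxes from a llama.cpp server.

    import requests

    # Hypothetical host list; each machine is assumed to expose an
    # OpenAI-compatible /v1/chat/completions endpoint.
    HOSTS = ["http://mac-studio.local:8080", "http://linux-box.local:8080"]

    def chat(prompt: str) -> str:
        payload = {
            "model": "default",  # placeholder; set to whatever your server expects
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 128,
        }
        for host in HOSTS:
            try:
                r = requests.post(f"{host}/v1/chat/completions", json=payload, timeout=60)
                r.raise_for_status()
                return r.json()["choices"][0]["message"]["content"]
            except requests.RequestException:
                continue  # that host is down or unreachable; try the next one
        raise RuntimeError("no inference host reachable")

    print(chat("Say hello from the homelab."))

This only sketches the failover idea; a real router would also weigh per-host load and which models each machine has available.
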
Coding

Homelab AI

Home lab AI — turn your spare machines into a local AI home lab cluster. LLM inference, image generation, speech-to-text, and embeddings across macOS, Linux,...
