Ollama is now powered by MLX on Apple Silicon in preview
Very good news! I'm running local LLMs for some tasks on my Mac Mini to reduce my Claude usage. Love that! Let me dive in!