Hi everyone!
I'd like to share M-Courtyard, an open-source macOS desktop app for fine-tuning
LLMs on Apple Silicon: no code, no cloud, everything local.
It wraps the full pipeline in a native GUI:
Import docs → Generate training data (Ollama) → LoRA fine-tune (mlx-lm) → Chat-test → One-click export to Ollama (Q4/Q8/F16)
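For readers curious what the GUI is automating, the pipeline above maps roughly onto the underlying mlx-lm and Ollama command-line tools. This is a hedged sketch, not the app's actual internals: the model name, data layout, paths, and Modelfile contents are illustrative assumptions.

```shell
# 1. LoRA fine-tune with mlx-lm
#    (assumes ./data contains train.jsonl / valid.jsonl)
mlx_lm.lora --model mlx-community/Qwen2.5-7B-Instruct-4bit \
    --train --data ./data --adapter-path ./adapters

# 2. Fuse the LoRA adapters back into the base weights
mlx_lm.fuse --model mlx-community/Qwen2.5-7B-Instruct-4bit \
    --adapter-path ./adapters --save-path ./fused-model

# 3. Register the fused model with Ollama
#    (Modelfile contents are illustrative)
echo 'FROM ./fused-model' > Modelfile
ollama create my-finetune -f Modelfile

# 4. Chat-test the result
ollama run my-finetune
```

The app wraps these steps (plus data generation and quantized export) behind the GUI, so none of this is required to use it.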
Supports models from mlx-community: Qwen 3, DeepSeek R1, GLM, Llama 3, and more.
- GitHub: github.com/tuwenbo0120/m-courtyard (zero-code, zero-cloud, privacy-first fine-tuning assistant for Apple Silicon)
- Download: M-Courtyard v0.3.0 release on GitHub
- License: AGPL-3.0
- Tech: Tauri 2.x (Rust) + React + mlx-lm
Requirements: macOS 14+, Apple Silicon (M1/M2/M3/M4), 16GB+ RAM recommended
Feedback and suggestions welcome!
