Project Kaizen Dashboard

Mac Studio M3 Ultra (512GB) - AI Services Health Report


Summary

Metric             Value
Services Running   13/13
Primary Model      Qwen3.5-397B-A17B (MoE)
Ollama Version     v0.17.5
Unified Memory     512 GB

Local LLM Models (Ollama)

Model              Base               Size    Active Params  Speed / Role
max:voice          Qwen3.5-35B-A3B    ~23GB   3B             42.9 tok/s
max:deep           Qwen3.5-397B-A17B  ~189GB  17B            17.6 tok/s
max:think          Qwen3.5-397B-A17B  ~189GB  17B            17.5 tok/s
max:mem            qwen3:8b           ~5GB    8B             Fact extraction
mxbai-embed-large  -                  ~0.7GB  -              Embeddings (1024-dim)
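
The model aliases above can be called over Ollama's standard HTTP API on port 11434. A minimal sketch, assuming the stock /api/generate endpoint (the prompt text and helper name are illustrative):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build a non-streaming payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_generate_request("max:voice", "Say hello in one sentence.")

# Uncomment on the Mac Studio itself; requires Ollama listening on 11434.
# req = urllib.request.Request(
#     OLLAMA_URL,
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

With "stream": False, Ollama returns one JSON object whose "response" field holds the full completion, which is simpler to script against than the default streaming mode.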

Core AI Services

Service          Port   Status   Description
Ollama           11434  Running  LLM inference engine (v0.17.5)
WebSearch Proxy  11435  Running  Search, memory, context injection (v1.3.0)
Orchestrator     11440  Running  Dashboard API, memory proxy, service control
OpenWebUI        8080   Running  Web chat interface (v0.8.1)
Memory Service   8100   Running  Mem0 + ChromaDB personal memory

Voice Services

Service      Port  Status   Details
Whisper STT  8002  Running  Whisper Large v3 Turbo (MLX)
Hybrid TTS   8003  Running  Kokoro + Piper (10 voices, 24kHz)

Cloud AI Proxies

Service       Port  Status   Models
Z.AI Proxy    5001  Running  GLM-5, GLM-4.7, GLM-4.6
Claude Proxy  5002  Running  Opus 4.6, Sonnet 4.6, Haiku 4.5
Codex Proxy   5003  Running  GPT-5.2 Codex, GPT-5.1 Codex Max
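
If the three proxies expose an OpenAI-compatible chat endpoint, requests can be built the same way against each port. This is a sketch under that assumption: the /v1/chat/completions path and the lowercase model ID are illustrative and not confirmed by this report.

```python
# Ports from the Cloud AI Proxies table above. The endpoint path and the
# model ID string are assumptions; check each proxy's own documentation.
PROXY_BASES = {
    "zai": "http://localhost:5001",
    "claude": "http://localhost:5002",
    "codex": "http://localhost:5003",
}

def build_chat_request(model: str, user_text: str) -> dict:
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
    }

url = PROXY_BASES["claude"] + "/v1/chat/completions"
payload = build_chat_request("claude-haiku-4.5", "ping")
```

Keeping the payload builder separate from the per-proxy base URLs means the same client code can target any of the three backends by swapping one dictionary key.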

Network Services

Service            Port   Status   Notes
Caddy HTTPS        8443   Running  Reverse proxy
Cloudflare Tunnel  -      Running  External tunnel access
Glances            61208  Running  System monitoring

Quick Commands

Service Management

  • ~/mlxcore/kaizen.sh start - Start all services
  • ~/mlxcore/kaizen.sh stop - Stop all services
  • ~/mlxcore/kaizen.sh status - Check service status
  • ~/mlxcore/kaizen.sh dashboard - Detailed dashboard

Testing

  • OLLAMA_MODELS=$HOME/mlxcore/services/ollama_models ollama run max:voice - Chat with Max (CLI)
  • curl http://localhost:11435/health - Check WebSearch health
  • curl http://localhost:8003/audio/voices - List TTS voices
  • curl http://localhost:8100/health - Check Memory service
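
The curl checks above can be rolled into one script that polls each /health route and prints a pass/fail line per service. A minimal sketch, limited to the two endpoints this report confirms expose /health (WebSearch Proxy and Memory Service):

```python
import urllib.error
import urllib.request

# Ports from the service tables; only these two /health routes appear in
# the Testing commands above, so only they are polled here.
HEALTH_ENDPOINTS = {
    "WebSearch Proxy": "http://localhost:11435/health",
    "Memory Service": "http://localhost:8100/health",
}

def is_healthy(url: str, timeout: float = 2.0) -> bool:
    """Return True if the endpoint answers with HTTP 2xx, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    for name, url in HEALTH_ENDPOINTS.items():
        print(f"{name}: {'OK' if is_healthy(url) else 'DOWN'}")
```

Treating any connection error as "DOWN" (rather than raising) keeps the loop going so one dead service does not hide the status of the rest.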

File Locations

Path                               Description
~/mlxcore/                         Main project directory
~/mlxcore/services/                All services
~/mlxcore/config/                  Configuration files
~/mlxcore/services/ollama_models/  LLM model storage
~/mlxcore/logs/                    Service logs
~/mlxcore/kaizen.sh                Master control script