
Ask HN: Anyone Using a Mac Studio for Local AI/LLM?

UmYeahNo Thursday, February 05, 2026

Curious to hear your experience running local LLMs on a well-specced M3 Ultra or M4 Pro Mac Studio. I don't see much discussion of the Mac Studio for local LLMs, but it seems like you could fit big models in memory thanks to the unified memory shared between the CPU and GPU. I assume token generation would be slow, but you might get higher-quality results because you can load larger models.
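The "fit big models in memory" intuition comes down to simple arithmetic: weight memory is roughly parameters × bits-per-weight ÷ 8, before KV-cache and runtime overhead. A minimal sketch (the model sizes and quantization levels here are illustrative assumptions, not benchmarks of any specific machine):

```python
def model_weights_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB for a quantized model.

    Ignores KV-cache, activations, and runtime overhead, which add more
    on top (often several GB, growing with context length).
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# Illustrative examples: a 70B model at 4-bit quantization needs ~35 GB
# for weights alone, and at 8-bit ~70 GB -- comfortably inside a
# high-memory Mac Studio configuration, but far beyond a typical
# consumer GPU's dedicated VRAM.
print(model_weights_gb(70, 4))  # 35.0
print(model_weights_gb(70, 8))  # 70.0
```

This is why unified memory is attractive for local inference: capacity is the binding constraint for loading a large model at all, while memory bandwidth is what then limits tokens per second.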
