Show HN: macOS GUI for running LLMs locally
cztomsik, Monday, September 18, 2023

Hello HN,
I've been working on this project for a while, and it has been in an "open" beta for some time. I finally believe it's ready for its first release.
I hope you like it.
Here are some potential questions that may arise:
1. How does it compare to LM Studio? If you're already using LM Studio, you'll probably keep using it. This app is designed to be simpler and more user-friendly.
2. Is it open-source? No, it is not.
3. Does it use any open-source libraries? Yes, it uses llama.cpp and a few others, as indicated in the license information included with the application. (A rough sketch of what driving llama.cpp can look like follows after this list.)
4. Why isn't it using Electron? Two reasons: first, I wanted total control over the whole tech stack, and second, I wanted to be able to send the app to my friends over iMessage.
5. Does it support Intel Macs? It should, but I couldn't test it.
6. Does it support older versions of macOS? macOS 12.6 is the lowest supported version at the moment.
7. Is XXX a bug? Probably :)
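
For anyone wondering what "uses llama.cpp" can mean in practice: the sketch below is not necessarily how this app is wired internally, just one common pattern, where a native front end runs the llama.cpp server example locally and talks to it over HTTP. The /completion endpoint and the prompt / n_predict / content fields come from the upstream llama.cpp server example; the port and the Swift wrapper itself are placeholder assumptions of mine.

    import Foundation

    // Request/response shapes for the llama.cpp server's /completion endpoint.
    // Field names mirror the JSON the server example expects and returns.
    struct CompletionRequest: Codable {
        let prompt: String
        let n_predict: Int
    }

    struct CompletionResponse: Codable {
        let content: String
    }

    // Sends a prompt to a locally running llama.cpp server (placeholder port 8080)
    // and returns the generated text.
    func complete(prompt: String) async throws -> String {
        var request = URLRequest(url: URL(string: "http://127.0.0.1:8080/completion")!)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(
            CompletionRequest(prompt: prompt, n_predict: 128))

        let (data, _) = try await URLSession.shared.data(for: request)
        return try JSONDecoder().decode(CompletionResponse.self, from: data).content
    }

    // Usage from any async context, e.g. a SwiftUI button action:
    // let reply = try await complete(prompt: "Write a haiku about local LLMs.")

A GUI built this way keeps the inference engine in a separate process, so llama.cpp can be updated independently of the UI; embedding its C API directly is the other common route.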