brainchild-adam | 10 months ago
This is the best I could find:
"Some NeuroTools features use Google's Gemini AI service. To make these work, we send the relevant input from the app to the Google Gemini API."
I would love to play around with it, but local-only is a must for me.
On that note, would you agree that in principle it should be possible to run this with a local LLM?
martin-buur | 10 months ago
And yes, this can definitely work with a local LLM, and there's a good chance I'll do that next. I made an open-source local-LLM Mac app last year (paid, but open source, so you could just compile it yourself); you can find it on my profile, so it's definitely something I'm very interested in. I wanted to reach a broader audience this time, so I didn't ship it with a local-LLM feature. But there's a very good chance it'll come next!
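To illustrate why the swap is plausible in principle (this is not the app's actual code): many local runners, such as Ollama, expose an OpenAI-compatible chat endpoint, so the same request shape can target a hosted API or `http://localhost`. The base URL, model names, and prompt below are all hypothetical placeholders.

```python
import json


def build_chat_request(base_url: str, model: str, prompt: str):
    """Build an OpenAI-compatible chat-completion request.

    The same payload shape works against a hosted API or a local
    server (e.g. Ollama's OpenAI-compatible endpoint, typically at
    http://localhost:11434/v1) -- only base_url and model change.
    All values here are illustrative, not the app's real config.
    """
    url = f"{base_url.rstrip('/')}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(payload)


# Hosted vs. local: identical request shape, different endpoint/model.
cloud = build_chat_request("https://api.example.com/v1", "some-hosted-model", "Summarize my notes")
local = build_chat_request("http://localhost:11434/v1", "llama3", "Summarize my notes")
```

In other words, if the app routes all AI calls through one request builder like this, pointing it at a local server is mostly a configuration change, which is why a local-LLM mode seems feasible.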