top | item 44002040

w8nC | 9 months ago

Now it’s just a wrapper around hosted APIs.

Went with my own wrapper around llama.cpp and stable-diffusion.cpp, with the option of re-prompting a hosted model if I don't like the local result; the local output makes a good starting point for the hosted model to improve on.

It also obfuscates any requests sent to hosted providers, because why feed them insight into my use case when I just want to double-check the local AI's algorithmic choices? The ground-truth relationship that function names and variable names imply is my little secret.
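The commenter doesn't say how the obfuscation works, but one plausible reading is renaming the revealing identifiers before a snippet leaves the machine, keeping a map so the hosted model's reply can be translated back. A minimal sketch of that idea for Python code, using the stdlib `ast` module (all names here are hypothetical, not from the comment):

```python
import ast

class Obfuscator(ast.NodeTransformer):
    """Rename user-defined identifiers to opaque names so a snippet
    sent to a hosted API leaks no domain vocabulary. The mapping is
    kept locally to de-obfuscate the response."""

    def __init__(self):
        self.mapping = {}  # original name -> opaque name

    def _alias(self, name):
        if name not in self.mapping:
            self.mapping[name] = f"sym{len(self.mapping)}"
        return self.mapping[name]

    def visit_FunctionDef(self, node):
        node.name = self._alias(node.name)
        self.generic_visit(node)  # visits args before the body
        return node

    def visit_arg(self, node):
        node.arg = self._alias(node.arg)
        return node

    def visit_Name(self, node):
        # Rename only names we have seen defined or assigned;
        # builtins and library names pass through untouched.
        if node.id in self.mapping or isinstance(node.ctx, ast.Store):
            node.id = self._alias(node.id)
        return node

def obfuscate(source: str):
    """Return (obfuscated_source, mapping) for a Python snippet."""
    tree = ast.parse(source)
    ob = Obfuscator()
    return ast.unparse(ob.visit(tree)), ob.mapping
```

The hosted model still sees the algorithm's structure (which is the part being double-checked), but `monthly_interest(principal, rate)` goes out as something like `sym0(sym1, sym2)`.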

Patrick_Devine | 9 months ago

Wait, what hosted APIs is Ollama wrapping?