top | item 41576980

lhousa|1 year ago

Rookie question: the OpenAI API endpoint costs extra, right? It's not something that comes with ChatGPT or ChatGPT Plus.

zlwaterfield|1 year ago

Correct, but I'm going to look into a locally running LLM so it would be free.

Tepix|1 year ago

Please do. When you add support for a custom API URL, please make sure it supports HTTP Basic authentication.

That's super useful for people who run, say, ollama behind an nginx reverse proxy that adds authentication.
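A minimal sketch of what that client-side support would involve: HTTP Basic auth is just a base64-encoded `user:password` pair in the `Authorization` header (RFC 7617). The endpoint URL and credentials below are made up for illustration; nothing is actually sent.

```python
import base64
import urllib.request

# Hypothetical self-hosted endpoint; the path mirrors an
# OpenAI-compatible API, but the host and credentials are invented.
API_URL = "https://llm.example.com/v1/chat/completions"
USER, PASSWORD = "alice", "s3cret"

def basic_auth_header(user: str, password: str) -> str:
    """Build the HTTP Basic Authorization header value (RFC 7617)."""
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

# Request object with the auth header attached; it is constructed
# but never sent here.
req = urllib.request.Request(
    API_URL,
    headers={
        "Authorization": basic_auth_header(USER, PASSWORD),
        "Content-Type": "application/json",
    },
)
```

The nginx side would pair this with `auth_basic` and an htpasswd file in front of the proxied upstream.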

nickthegreek|1 year ago

Look into allowing it to connect to either an LM Studio endpoint or ollama, please.
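Both of these expose an OpenAI-compatible HTTP API locally, so supporting them is mostly a matter of making the base URL configurable. The sketch below builds a chat-completion request body; the URLs are the documented defaults (your install may differ) and the model name is only an example.

```python
import json

# Documented default local endpoints (assumptions; ports are configurable).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"    # ollama's OpenAI-compatible API
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"  # LM Studio's local server

def build_chat_request(model: str, prompt: str) -> bytes:
    """Serialize an OpenAI-style chat completion request body."""
    body = {
        "model": model,  # e.g. "llama3" under ollama; name is an example
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body).encode("utf-8")

payload = build_chat_request("llama3", "Summarize this note.")
# POSTing `payload` to either URL (Content-Type: application/json)
# would return an OpenAI-shaped response; no request is sent here.
```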

Szpadel|1 year ago

Yes, but gpt-4o-mini costs very little, so you'll probably spend well under $1/month.
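A back-of-the-envelope check of that claim, using gpt-4o-mini's list prices at the time of writing ($0.15 per 1M input tokens, $0.60 per 1M output tokens); the monthly token counts are made up but generous for a tool like this.

```python
# gpt-4o-mini list prices (USD per 1M tokens) at the time of writing.
INPUT_PRICE_PER_M = 0.15
OUTPUT_PRICE_PER_M = 0.60

def monthly_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated monthly spend in USD for a given token volume."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# 500k input + 100k output tokens in a month -> about $0.14
cost = monthly_cost(500_000, 100_000)
```

Even at ten times that volume you'd stay under $1.50/month, which matches the estimate above.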

miguelaeh|1 year ago

I don't think the point here should be the cost, but the fact that you are sending everything you write to OpenAI to train their models on your information. The option of a local model allows you to preserve the privacy of what you write. I like that.