splintercell | 1 year ago
Take self-hosting your website, for instance: it may involve all these considerations, but with LLMs you're relying on them for information. It would be helpful to know that the LLM is under your control.
Oras|1 year ago
Same with LLMs: you can use providers who don’t log requests and are SOC 2 compliant.
Small models that run locally are a waste of time, as they don’t offer adequate value compared to larger models.
troyvit|1 year ago
It depends on how important the web site is and what the purpose is. Personal blog (Ghost, WordPress)? File sharing (Nextcloud)? Document collaboration (Etherpad)? Media (Jellyfin)? Why _not_ run it from your room with a reverse proxy? You're paying for internet anyway.
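For the reverse-proxy piece, a minimal sketch assuming Caddy (the domain is a placeholder; 2368 is Ghost's default port):

```
# Caddyfile — Caddy provisions HTTPS certificates automatically
blog.example.com {
    reverse_proxy localhost:2368
}
```

Point the domain's DNS at your home IP, forward ports 80/443 to the box, and TLS is handled for you.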
> Same with LLMs: you can use providers who don’t log requests and are SOC 2 compliant.
Sure. Until they change their minds and decide they want to, or until they go belly-up because they didn't monetize you.
> Small models that run locally are a waste of time, as they don’t offer adequate value compared to larger models.
A small model won't get you GPT-4o value, but for coding, simple image generation, story prompts, "how long should I boil an egg" questions, etc. it'll do just fine, and it's yours. As a bonus, while a lot of energy went into creating the models you'd use, you're saving a lot of energy using them compared to asking the giant models simple questions.
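To make the "it's yours" point concrete, here is a sketch of querying a locally running model through Ollama's HTTP API, which listens on localhost:11434 by default. The model tag and helper names are illustrative, and nothing leaves your machine:

```python
import json
import urllib.request

# Ollama's local generate endpoint (default port 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="llama3.2:3b"):
    # "stream": False asks for one complete answer rather than
    # token-by-token chunks.
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

def ask_local(prompt, model="llama3.2:3b"):
    # POST the prompt to the local server and return the answer text.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(prompt, model).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# With Ollama running and the model pulled:
# ask_local("How long should I boil an egg?")
```

The same endpoint works for any model Ollama can pull, so swapping the 3B model for a larger one when you have the hardware is a one-argument change.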