item 40977714

dws | 1 year ago

> Sure, I can run some slow Llama3 models on my home network, but why bother when it is so cheap or free to run it on a cloud service?

Running locally, you can change the system prompt. I have Gemma set up on a spare NUC, and changed the system prompt from "helpful" to "snarky" and from "kind, honest" to "brutally honest". Having an LLM that will roll its eyes at you and say "whatever" is refreshing.
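As a minimal sketch of what this looks like in practice: assuming the model is served by a local Ollama instance on its default port (the model name, prompt wording, and host are illustrative, not from the comment), you can prepend a custom system message to every chat request:

```python
import json
import urllib.request

# Replaces the usual "helpful, kind, honest assistant" persona.
SYSTEM_PROMPT = "You are a snarky, brutally honest assistant. Roll your eyes freely."

def build_chat_payload(user_message: str) -> dict:
    """Build an Ollama /api/chat request body with a custom system prompt."""
    return {
        "model": "gemma",  # assumes Gemma has been pulled locally
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
        "stream": False,
    }

def ask(user_message: str, host: str = "http://localhost:11434") -> str:
    """POST the chat request to the local server and return the reply text."""
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=json.dumps(build_chat_payload(user_message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

The point is that the system message is just another field in the request you control end to end, something hosted services often lock down or layer their own instructions on top of.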
