top | item 39453967

milansuk | 2 years ago

You can run Gemma and hundreds of other models (many fine-tuned) in llama.cpp. It's easy to swap to a different model.
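A minimal sketch of what that looks like in practice, assuming llama.cpp is already built locally and GGUF model files have been downloaded (the file names below are illustrative, not from the comment):

```shell
# Run Gemma with llama.cpp's CLI:
#   -m  path to the GGUF model file
#   -p  prompt text
#   -n  number of tokens to generate
./llama-cli -m models/gemma-2b-it.Q4_K_M.gguf -p "Hello" -n 64

# Swapping to a different model is just pointing -m at another file:
./llama-cli -m models/mistral-7b-instruct.Q4_K_M.gguf -p "Hello" -n 64
```

Since all of these models share the GGUF format, nothing else about the invocation changes when you switch.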

It's important that there are companies publishing models that can run locally. If some stop and others emerge, that's fine. The worst thing that could happen is having AI available only in the cloud.
