top | item 38332157


abiraja | 2 years ago

Have you tried Mistral?


dannyw | 2 years ago

Mistral is genuinely groundbreaking for a fast, locally hosted model without content filtering at the base layer. You can try it online here: https://labs.perplexity.ai/ (switch to Mistral)

eropple | 2 years ago

It's very fast, but it doesn't seem very good. It doesn't take instruction well (it acknowledges the correction and then spits back the same wrong answer), and it either doesn't have much of a corpus or is dropping most of it on the floor, because it answers zero of my three basic smoke-test questions.

js4ever | 2 years ago

Wow, I was not expecting this. It's really something else in terms of speed, and the results are not bad! Will test it more.

anonzzzies | 2 years ago

Are any companies/teams other than the original creators working to bring this up to Copilot/ChatGPT standards?

audessuscest | 2 years ago

Thanks for the link. Do you know of any other similar services that support fine-tuning?