Sorry if you guys are getting overwhelmed with Deepseek submissions these days; this will be my only one for a while. It's nice to have a counterweight to all these paid models.
Personally I don't get sick of it. There's a lot of hype around Deepseek specifically rn, but being able to run SOTA or near-SOTA models locally is a huge deal, even if it's slow.
karmakaze|1 year ago
> Why Ollama? Because it makes running large language models actually easy.
> If it doesn’t work, fix your system. That’s not my problem.
ghostie_plz|1 year ago
what an off-putting start
assimpleaspossi|1 year ago
Saw this in the article:
>I would not recommend running this on your main system. Unless you like unnecessary risks.
croes|1 year ago
The risk is either using hosted versions where the host collects your data, or running the model with unknown software.