item 42897785

Running DeepSeek R1 on Your Own (cheap) Hardware – The fast and easy way

20 points | BimJeam | 1 year ago | linux-howto.org

19 comments


cwizou|1 year ago

Maybe you should add "distills" to the title? This is about installing Ollama to grab the 7B or 14B R1-Qwen distills, not "R1".
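For reference, the Ollama route the article takes boils down to a couple of commands. This is a rough sketch, not the article's exact steps; the `deepseek-r1:7b` and `deepseek-r1:14b` tags come from Ollama's public model library and may change over time:

```shell
# Install Ollama on Linux using the official install script from ollama.com
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a distilled model. Despite the "deepseek-r1" name,
# the 7b/14b tags are Qwen-based distills, not the full 671B R1 model.
ollama run deepseek-r1:7b
```

`ollama run` downloads the model on first use, so the initial start can take a while depending on your connection.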

karmakaze|1 year ago

"The fast and easy way" is also being oversold.

> Why Ollama? Because it makes running large language models actually easy.

> If it doesn’t work, fix your system. That’s not my problem.

nkozyra|1 year ago

Right, and fundamentally no different from running any other Ollama model that can run reasonably on your local machine.

BimJeam|1 year ago

OK, I understand now and will fix the title. Sorry for the inconvenience. My bad. :-/

ghostie_plz|1 year ago

> Unless you like unnecessary risks. In that case, go ahead, genius.

what an off-putting start

Euphorbium|1 year ago

I have R1:1.5B running on my 8 GB RAM M4 Mac mini. Don't know where I would use it, as it is too weak to solve actual problems, but it does run.

BimJeam|1 year ago

Set up a local AI with DeepSeek R1 on a dedicated Linux machine using Ollama—no cloud, no subscriptions, just raw AI power at your fingertips.

croes|1 year ago

Ollama doesn't run the full DeepSeek R1, just the distilled versions.

BimJeam|1 year ago

Sorry if you guys are getting overwhelmed with DeepSeek submissions these days. This will be my one and only for a while. It is cool to have a counterweight to all these paid models.

ai-christianson|1 year ago

Personally I don't get sick of it. There's a lot of hype around DeepSeek specifically right now, but being able to run SOTA or near-SOTA models locally is a huge deal, even if it's slow.

assimpleaspossi|1 year ago

Are there any security concerns over DeepSeek as there are over TikTok?

Saw this in the article:

>I would not recommend running this on your main system. Unless you like unnecessary risks.

croes|1 year ago

The worst the model itself can do is give false answers or refuse to answer.

The real risk is using hosted versions where the host collects your data, or running the model with unknown software.

donclark|1 year ago

I like this. However, I did not find any minimum specs or speed figures. Maybe I missed them? Can someone point me in the right direction, please?
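The article doesn't seem to list specs, but a common rule of thumb (an assumption on my part, not from the article) is that a Q4-quantized GGUF model needs roughly half a byte of RAM per parameter, plus a gigabyte or two of overhead for runtime buffers and KV cache. A quick back-of-the-envelope check for the distill sizes discussed here:

```shell
# Rough RAM estimate for Q4-quantized models (assumed rule of thumb):
# ~0.5 bytes per parameter plus ~1.5 GB fixed overhead.
for size in 1.5 7 14; do
  awk -v s="$size" 'BEGIN { printf "%sB distill: ~%.1f GB RAM\n", s, s * 0.5 + 1.5 }'
done
```

By that estimate the 7B distill fits comfortably in 8 GB of RAM and the 14B is borderline, which lines up with the 1.5B model running fine on an 8 GB Mac mini as mentioned above. Treat these as ballpark figures, not official requirements.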