item 33848948

lijogdfljk|3 years ago

When will we be able to run something like ChatGPT locally, I wonder? I.e., like Stable Diffusion.

I'm kinda dying for that, honestly. I can't even imagine all the neat applications I'd build with ChatGPT if it were purely local... but it would take all of my free time to play with it. It's so damn impressive.


lossolo|3 years ago

Probably not realistic to run it locally for now. GPT-3 has about 175 billion parameters, and even in the optimistic scenario you need to count around 2 bytes per parameter, so that's roughly 350 GB of GPU memory; you'd probably need at least around 15 GPUs with a minimum of 32 GB each.
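The back-of-envelope math above can be sketched as follows. This is a rough estimate assuming fp16/bf16 weights (2 bytes per parameter) and counting weight memory only; activations, the KV cache, and framework overhead push the real GPU count higher, which is presumably where the "around 15" figure comes from.

```python
import math

def min_gpus_needed(n_params: float, bytes_per_param: int = 2,
                    gpu_mem_gb: int = 32) -> tuple[float, int]:
    """Return (total weight memory in GB, minimum GPU count).

    Counts only model weights at `bytes_per_param` each; real serving
    needs extra headroom for activations and the KV cache.
    """
    total_gb = n_params * bytes_per_param / 1e9
    return total_gb, math.ceil(total_gb / gpu_mem_gb)

gb, gpus = min_gpus_needed(175e9)  # GPT-3: ~175B parameters
print(f"{gb:.0f} GB of weights -> at least {gpus} x 32 GB GPUs")
# -> 350 GB of weights -> at least 11 x 32 GB GPUs (weights alone)
```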

drdaeman|3 years ago

Isn’t there an abundance of GPUs from crypto farmers? ;)

braingenious|3 years ago

I asked it what hardware would be necessary to do that and it said an NVIDIA V100 GPU lol

SxC97|3 years ago

I think you can run BLOOM locally, but it's not quite as powerful as this iteration of ChatGPT. Also the VRAM requirements are pretty high if you want to run the biggest model.

https://huggingface.co/bigscience/bloom

dqpb|3 years ago

Agreed. The things I want to do with this don't make sense as a web service.