item 42310675

dimensi0nal | 1 year ago

The only consumer demand for local AI models is for generating pornography


treprinum|1 year ago

How about running your smart home with a voice assistant on your own computer? In privacy-oriented countries like Germany, that would be massive.

magicalhippo|1 year ago

This is what I'm fiddling with. My 2080 Ti is not quite enough to make it viable. I find the small models fail too often, so I need larger Whisper and LLM models.

Like the 4060 Ti would have been a nice fit if it hadn't been for the narrow memory bus, which makes it slower than my 2080 Ti for LLM inference.

A more expensive card has the downside of not being cheap enough to justify idling in my server, and my gaming card is at times busy gaming.

serf|1 year ago

Absolutely wrong -- if you're not clever enough to think of any other reason to run an LLM locally, then don't condemn the rest of the world with "well, they're just using it for porn!"

knowitnone|1 year ago

So you're saying that's a huge market?!