
kettro | 1 year ago

I would argue that it is society’s job to care for its most vulnerable.

llm_trw | 1 year ago

Yes, not OpenAI's.

dns_snek | 1 year ago

It depends. I don't think OpenAI (or anyone else selling products to a general audience) should be forced to make their products so safe that they can't possibly harm anyone under any circumstance. That would just make the product useless (as many LLMs currently are, depending on the topic). However, that's a very different standard from the one in the original comment, which stated:

> I suspect Altman/Brockman/Murati intended for this thing to be dangerous for mentally unwell users, using the exact same logic as tobacco companies.

Tobacco companies knew about the dangers of their products, yet they purposefully downplayed them, manipulated research, and exploited the addictive properties of their products for profit, causing great harm to society.

Disclosing all known (or potential) dangers of your products and not purposefully exploiting society (psychologically, physiologically, financially, or otherwise) is a standard that every company should be forced to meet.

woopsn | 1 year ago

"Our primary fiduciary duty is to humanity."

swat535 | 1 year ago

Corporations should benefit society and avoid harming it in any shape or form; this is why we have regulations around them.

bongodongobob | 1 year ago

That doesn't mean we pad all the rooms or ban peanuts. Yes, we should care for them, but not to the detriment of the other 99%.

earthling8118 | 1 year ago

Well, conveniently, this is benefiting the 1% much more than the 99%.