top | item 37780704

debrice | 2 years ago

I always felt that if you want fairness and want to reap the full benefits of your creation (here AI), you should also assume full responsibility for it. This is where most organizations fall short, as they are always trying to shift all responsibility onto their users (self-driving, anyone?).

aldousd666 | 2 years ago

When you have millions of users and your product has inherent danger, you can't assume the liability for all of them (hammers and nails, anyone?). The only reason Microsoft has agreed to be liable for their users' copyright stuff is that they know this case is a winner for OpenAI and that it genuinely has merit as fair use. They wouldn't do that to 'be nice,' because not even Microsoft can foot the bill for millions of users being sued. Their only alternative would be to not produce the product.

ang_cire | 2 years ago

You are missing that the AI is the one creating the output.

If I sell you a hammer and nails, I'm not liable if you create a dangerous building.

If you ask me to build you a dangerous building and I do it, I am liable if people get hurt.

OpenAI wants to pretend that its users are creating the output because they write the prompt, but this is just plainly false, and the limits OpenAI itself puts on output show they know it. Otherwise they'd let the models output information about how to write exploits, how to kill people, etc., which they don't.

debrice | 2 years ago

If your product has inherent danger, you should be responsible for it; I don't think that's unreasonable. If you're asking for the same rights as human beings, then you should assume the same exact responsibilities.