top | item 46503838


ethical_source|1 month ago

If you follow the "tool-maker is responsible for tool-use" thread of thought to its logical conclusion, you have to hold creators of open-weights models responsible for whatever people do with these models. Do you want to live in a world that follows this rule?


UncleMeat|1 month ago

But we don't have to take things to furthest conclusions. We can very easily draw both a moral and legal line between "somebody downloaded an open weight model, created a prompt from scratch to generate revenge porn of somebody, and then personally distributed that image" and "twitter has a revenge porn button right next to every woman on the platform that generates and distributes revenge porn off of a simple sentence."

ethical_source|1 month ago

No, we can't draw such a line. Where would you draw it? What is the minimum friction? How would you quantify it?

If you try, you quickly end up codifying absurdities like the 80%-finished-receiver rule in firearm regulation. See https://daytonatactical.com/how-to-finish-an-80-ar-15-lower-...

People who say "society should permit X, but only if it's difficult" have a view of the world incompatible with technological progress and usually not coherent at all.

thaw13579|1 month ago

The core issue is that X is now a tool for creating and virally distributing these images anonymously to a large audience, often targeting the specific individuals featured in the images. For example, to any post with a picture, any user can simply reply "@grok take off their clothes and make them do something degrading", and the response is then generated by X and posted in the same thread. That is an entirely different kind of tool from an open-weight model.

The LLM itself is more akin to a gun available in a store in the "gun is a tool" argument (reasonable arguments on both sides, in my opinion); however, this situation is more like a gun manufacturer creating a program to mass distribute free pistols to a masked crowd, with predictable consequences. I'd say the person running that program was either negligent or intentionally promoting havoc, to the point where it should be investigated and regulated.

jnovek|1 month ago

The phrase “its logical conclusion” is doing a lot of heavy lifting here. Why on earth would that absurdity be the logical conclusion? To me it looks like a very illogical conclusion.

praptak|1 month ago

> "tool-maker is responsible for tool-use"

You left out "who controls the output of the tool", which makes it a strawman.

lynndotpy|1 month ago

Importantly, X also provides the hardware to run the model, a friendly user interface around it, and the social platform to publicly share and discuss outputs from the model. It's not just access to the model.