top | item 46594909


sebasv_ | 1 month ago

I see at least two axes here:

1. Should access to a tool be restricted if it is used for malice?

2. Is a company complicit if its automated service is being used for malice?

For 1, crowbars are generally available, but knives and guns are heavily regulated in the vast majority of the world, even though all of these are used for murder as well as for legitimate applications.

For 2, things get even more complicated. E.g., if my router is hacked and participates in a botnet, I am generally not liable, but if I rent out my house and the tenant turns it into a weed farm, I am liable.

Liability is placed where it minimizes perceived societal cost. Emphasis on perceived.

What is worse for society: limiting information access for millions of people, or allowing CSAM, harassment, and shaming?


buellerbueller | 1 month ago

It is not clear that limiting Grok limits information access to millions of people, so I think your premise is flawed.

There are plenty of other resources that could serve the same people Grok serves. Further, the fact that we aren't having these discussions about ChatGPT or Claude as CSAM generators suggests that Grok could be limited in ways it currently is not.