nvrmnd|2 years ago
This and other proposed legislation attempt to hit the ball out of the park on the first pitch. It would be a lot more sensible and effective to legislate against clear and present harms - for example, holding the firms that develop deep-fake technology liable when it is used for identity theft for the purpose of fraud.
ApolloFortyNine|2 years ago
A user's misuse of a technology shouldn't be the responsibility of the developer. Otherwise you could apply this to almost every product in the world.
mike_d|2 years ago
We have a long history of legally targeting companies that produce products targeted at criminal activity or implied criminal activity.
bluGill|2 years ago
Note that courts weigh advertising over warning labels and the manual. This is why many car ads carry the on-screen text "professional driver on closed track" - it makes clear that they think the car can do it, but that most customers shouldn't try. Likewise, cutting-tool ads labeled "guards removed for clarity" show tools that are clearly not operating (or a cartoon image rather than the real tool) - if they advertise someone running the tool without the guard, they are liable.
There is also the concept of foreseeable misuse in the courts. If you can imagine someone misusing your product in a particular way, you have to show the court that it isn't the intended purpose and that you tried to prevent it. If someone does something you didn't think of, you need to show the court that you put reasonable effort into figuring out all the possible misuses; otherwise it becomes a lack of creativity on your part. Thinking of a misuse doesn't mean you have to make it impossible, just that you have to make a reasonable effort to ensure it doesn't happen. Guards, warning labels, training, and declining to sell to some customers are all common tactics for selling something that can be misused without being liable - but even then, you can't just put a warning label on something if you could have placed a guard on the danger.
The above just brushes the surface of what the courts deal with (and different countries have different laws). If you need details talk to a lawyer.
screye|2 years ago
The car allows you to break the law by going 2x faster than the highest speed limit in the nation. A faster car with higher ground clearance does make it easier to fatally run into someone. The Tesla Cybertruck is a killing machine in car form.
Cars are the leading cause of death in the US. Maybe we need to have a similar 'pre-emptive manufacturer-side intervention' bill for cars too.
mullingitover|2 years ago
s/deep-fake/photoshop
Deepfakes are simply more convenient photo/video/audio editing that has been around for decades[1], and we don't really need new legislation to deal with them. Fraud/defamation/etc, the actual harmful aspects of what can be accomplished with deepfakes, don't need any new updates to handle the technology. If we're going to hobble new technologies, we may as well go back and hold Adobe responsible for all the shady things people have done with Photoshop, and video/audio editing suites for all the deceptive clips people have spliced together.
[1] https://www.youtube.com/watch?v=La5jrfobfTM&t=1s
gary_0|2 years ago
I vaguely recall seeing some fairly convincing B&W Soviet-era photos (I think they had Stalin in them) where people were removed and other people moved around to fill the gap. And document forgery for the purposes of fraud and espionage has of course been around for centuries.
But I think the issue is less the capability itself, and more that companies will make it too easy (trivial, actually) for anyone to commit mischief. The ability to mass-manipulate images on command is no longer restricted to the General Secretary of the USSR.
That doesn't necessarily mean regulation is required, though--plenty of modern technologies make it very easy to commit crimes, but only some of them require special rules.