top | item 35666843

jerry80 | 2 years ago

A lot of crimes require intent. Perhaps using AI could actually circumvent the intent requirement of a crime, thus making a lot of otherwise illegal things totally legal.

That is to say, it not only makes it hard to pin responsibility, but actually makes it no longer a crime at all.


AbrahamParangi | 2 years ago

I think this is actually a good thing in the long run because “requiring intent” creates a clearly perverse incentive where the organization may be able to do illegal things so long as it can delude itself about them, for instance by keeping inaccurate books or allowing broken processes to remain broken because fixing them would shed light on something illegal.

Instead I think it would be better for organizations to be approximately as liable for their mistakes as for their crimes. In that case it doesn't matter whether an employee does something illegal or an AI does something illegal on behalf of the company; the company remains liable either way.

eropple | 2 years ago

> I think this is actually a good thing in the long run because “requiring intent” creates a clearly perverse incentive where the organization may be able to do illegal things so long as it can delude itself about them

This doesn't really match how intent is handled in (at least American) law. There are reasonable-person tests. The system is subpar, and so I agree with your second paragraph, but it isn't as cut and dried as the first paragraph suggests.

Kranar | 2 years ago

Intent/mens rea doesn't work like that unless the law explicitly specifies so. By default, intent simply means intent to perform an act, as opposed to intent to cause harm. In some specific cases, like murder, intent to cause harm is an element of the offense, but that's the exception, not the rule.

For example, if you intentionally take someone's property, you have the mens rea for theft, even if you took their phone away because you genuinely thought the phone was causing them harm and wanted to help them.

However, if you unintentionally took someone's phone, say, you mistook it for your own, then you don't have the mens rea for theft.

simonh | 2 years ago

We’ve had software making business decisions for decades now, so this really isn’t a new problem. When it comes to finance law, intent doesn’t work like that: the onus is on the company and its employees to prevent the criminal activity. Lack of intent, or even of knowledge, is not sufficient. You have a responsibility to implement processes that give you the knowledge you need.

I wrote the above before reading the article and wondered how much the author knows about corporate criminal and civil responsibility. I’ve worked for financial institutions, so I’ve been through the training on this. The author is a graphic designer. Right. I completely understand and appreciate the problems with generative AI; those are points well taken.

I mean, I’ve got nothing against graphic designers, and I’m not saying there are no risks with AI. There are many. But the risk assessment, particularly in finance and likely in other business areas, is based on a fundamental misunderstanding of the way regulations on this already work today.