The answer from the federation is unconscionable. They blamed the boy and then said they could not be held responsible. Fuck off, a kid made a reasonable kid movement. If the robot wasn't ready to be around children, it shouldn't have been deployed around children. And there should have been a big red button that immediately stops everything, within reach of every player.
godelski|3 years ago
OOD data is really hard to deal with, FWIW. But personally I'm not confident that adding more matrix multiplies will generalize well enough that OOD stops being a major concern.
ALittleLight|3 years ago
imtringued|3 years ago
This means we are going to have the equivalent of GC in robots that interact with other humans.
throwaway894345|3 years ago
salawat|3 years ago
dmix|3 years ago
Are ML-driven robots in factories killing people or something? Because I haven't heard of anything else.
The only other modern AI ethics stuff I hear about is making image generators more politically correct and maybe some criminal sentencing algorithms that are being misused (which isn’t really an AI ethics problem but a judicial procedural one).
jrumbut|3 years ago
But the kid is some kind of local chess champion, I can't fully fault the decision to have him play with the experimental chess robot. Is it more dangerous than a lawn mower or a blender or any other machine that 9 year olds might begin to operate?
NamTaf|3 years ago
Relying on a human is the last option, not the default, when it comes to safety. Human adaptability is not a licence to hand-wave away design responsibility. The most glaring example is Tesla, which is unforgivably guilty of this.
This is bog-standard competent engineering in almost all domains of engineering. It is the table stakes-level expectation of a reasonable approach to safety. I'd literally end up in jail if something went wrong and I had been found to not consider these factors.
Software- and computer-related domains of engineering are a conspicuous outlier when it comes to this philosophy.
zuminator|3 years ago
carbadtraingood|3 years ago
imtringued|3 years ago
Aeolun|3 years ago
rowanG077|3 years ago
[deleted]
lupire|3 years ago
carbadtraingood|3 years ago
yessirwhatever|3 years ago
[deleted]
worstestes|3 years ago
twayt|3 years ago
[deleted]
kortilla|3 years ago
This isn’t adequate in civilized countries. You can’t run an amusement park that severs the limbs of 1% of the participants under the protection of a disclaimer.
ClumsyPilot|3 years ago
makeitdouble|3 years ago
Imagine a world where guardians will never let a kid go somewhere or do something that they don’t have 100% knowledge of, or aren’t 100% sure it’s perfectly safe.
In this specific case:
- you wouldn’t expect that issue at first sight
- it looks fun enough to give it a try
- the kid disn’t die. It truely hurts and can have long lasting damages if not treated properly, but a finger broken is not the end of the world for the kid.
beached_whale|3 years ago
kstenerud|3 years ago
You wouldn't be allowed to build a mangler anymore in a civilized society, waiver or no.
gaudat|3 years ago
Quoting from another reply: Video of the incident: https://twitter.com/xakpc/status/1550224137041371144
There is not even an e-stop.
I can imagine what it sounded like at the scene, though: "Ay Blin," and then people scurrying around looking for the robot's power plug or hopelessly trying to overpower a heavily geared joint motor.
roenxi|3 years ago
[deleted]
nerdawson|3 years ago
The robot behaved in an unexpected way which caused injury to a child. An apology is the absolute bare minimum they could do.
As a parent, I think it’s perfectly reasonable to expect event organisers to have put adequate safeguards in place.
We don’t need to put kids in front of a bear to teach them about the wonders of nature. Likewise, they don’t need to be in the path of a dangerous robot to discover machines.
carbadtraingood|3 years ago
[deleted]
userbinator|3 years ago
Although I do agree with you about having an e-stop.
dymk|3 years ago
"OMG did you just use the 'think of the children' fallacy??? Invalid!"
bee_rider|3 years ago
carbadtraingood|3 years ago
I know this is controversial to some, such as userbinator, who believe that industrial robotics without safeguards are just fine to mix with children.
Wowfunhappy|3 years ago
Vespasian|3 years ago
Usually there needs to be either a physical barrier like a cage, or a virtual one like a laser waterfall that detects foreign objects in the robot's perimeter and emergency-stops it.
These rules were disregarded here.
I used to work in a company where such machines were developed, and even a very experienced engineer working on a prototype was once hit by one (no serious injuries, and safety was improved afterwards), because they can move very fast and in unexpected ways.
These days there are better solutions available (so-called cobots) which are designed to work in very close proximity with humans without physical separation. They feature very sensitive force sensors and are severely restricted in the ways they are allowed to move.
So yes, "think of the humans/children" does apply here. This is a solved problem, and the operators decided to disregard established procedures and went instead for "flashy and cheap" (cobots are more expensive and slow as molasses).
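For what it's worth, the interlock logic itself is trivial; the hard part is reliable sensing and actuation. Here's a minimal sketch (hypothetical names, not any real robot or cobot API) of the latching behavior a tripped light curtain or exceeded force limit is supposed to produce:

```python
# Illustrative sketch of a latching safety interlock: motion is denied the
# instant the "laser waterfall" is broken or a contact-force limit is
# exceeded, and stays denied until a deliberate reset. The 50 N limit is an
# arbitrary placeholder, not a quoted standard value.

FORCE_LIMIT_N = 50.0


class SafetyMonitor:
    def __init__(self, force_limit_n: float = FORCE_LIMIT_N):
        self.force_limit_n = force_limit_n
        self.estopped = False  # latched: stays tripped until reset()

    def motion_allowed(self, force_n: float, curtain_clear: bool) -> bool:
        """Check sensors for this control cycle; trip and latch on any fault."""
        if force_n > self.force_limit_n or not curtain_clear:
            self.estopped = True
        return not self.estopped

    def reset(self) -> None:
        # A real system requires a separate, deliberate operator action here.
        self.estopped = False


monitor = SafetyMonitor()
assert monitor.motion_allowed(force_n=2.0, curtain_clear=True)       # normal play
assert not monitor.motion_allowed(force_n=2.0, curtain_clear=False)  # hand reaches in: trip
assert not monitor.motion_allowed(force_n=2.0, curtain_clear=True)   # still latched after hand withdrawn
```

The point of the latch is that the robot cannot resume on its own just because the sensor reading returned to normal; a human has to decide it is safe again.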
charcircuit|3 years ago
What do you mean? There's no reasonable time where you and your opponent are touching the pieces at the same time. Nor is there a reasonable time where you reach for the same piece.
ordu|3 years ago
krallja|3 years ago
carbadtraingood|3 years ago
Children are literally wired for physical experimentation and echopraxia.
Kids also want to be seen as helping. If this robot put a piece down poorly, the kid might be trying to straighten up after it.
All of which are eminently reasonable for a kid who has no concept of operating around dangerous machinery.
stevage|3 years ago
rmbyrro|3 years ago
lupire|3 years ago