All 10 questions were variants of the trolley problem. The problems were also poorly formed, which is what made answering them difficult, not the actual moral dilemma. For example, in the real trolley problem it is clear that you have only two options: flip the switch or not; it is hard to imagine any other alternative. But in many of these problems, it is hard to believe there is not a better option available, even though the problem states "but the only way..."
Yes, they were quite poorly formed, which renders the study entirely pointless. The idea behind these problems is to collect people's intuitive reactions to simple, clear situations. If the problems are so poorly formed that intuition simply rejects them, the exercise is useless, except as a study in human frustration. (Perhaps that's the point? It would explain the poor UI design.)
Several situations present you with the option of causing one immediate and fairly certain death to prevent several more distant, much more speculative deaths. You can't predict the path of a boulder down a mountain. A boulder that would be stopped by one person would be unlikely to kill five. The man driving the injured people to the hospital actually has little idea how quickly he needs to get them there to save their lives. The certainty presented in the problems is so unreal that they might as well have been posed in an entirely abstract way in the first place.
Yeah, I stopped when it asked me to choose the number of people intentionally infected with HIV that would obligate the doctor to poison the patient/murderer. There were just so many holes: it would break doctor/patient confidentiality, and how would the doctor even know how many people would be infected?
I would actually have to disagree. They are different situations cast around the trolley problem. For me, the setting of each situation played a large part in the weighting. Trying not to reveal much: in my opinion, the lunch-line guy wasn't wrong at all while the box lady was, even though the situations are virtually identical (sacrifice one to save many).
This test uses a variant of the good old "trolley dilemma" (http://en.wikipedia.org/wiki/Trolley_problem) where a trolley/train is headed for five workmen on a track and it asks if you'd pull a lever to divert the trolley onto a track where only one man is working. (This test then goes on to vary that situation somewhat.)
Studies have been run on this in the past and to my dismay most people would, in the initial scenario, flick the switch to save five but kill one - immediately becoming murderers rather than bystanders. I suspect this test is trying to weed out what could make people flip-flop from one point of view to the other, as when replaced with "pushing a fat man off a bridge to block the trolley" the stats have tended to swing the other way.
Involvement in the world isn't voluntary; actions have consequences, some people prefer some consequences to other consequences.
In your case you value maintaining an unsullied self-image over assisting 5 people in mortal peril, and would prefer if other people saw your choice the same way.
But you're right (I've met people who were familiar with the people involved here): the intuition is that the stronger the perceived interpersonal relationship between you and the unlucky bloke the less likely you are to flick the switch, but teasing out exactly how much relationship is needed to make the moral intuition flip is tricky business.
to my dismay: You are (in some woolly sense) five times more likely to end up as one of the five in a situation like this, than as the one. It is therefore better for you on the whole -- and, symmetrically, better for everyone -- for people in general to choose to flip the switch and kill one person rather than five. It's nice that you value their moral purity ("murderers rather than bystanders"), but I would prefer to live in a world where people tend to do what produces most net benefit rather than one where people tend to safeguard their moral purity, so I remain undismayed. (I'd be dismayed if I thought that "most people" aren't at all troubled by the prospect of being put in such a situation, but that's a separate issue from what they'd do once in it.)
Then again, I think much more harm results from people not living up to either sort of moral standard than from people having suboptimal moral principles (whatever that actually means).
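The "five times more likely" argument a few lines up can be turned into a toy calculation (my own illustration with made-up symmetry assumptions, not anything from the study): if the six people are assigned to positions at random, a universal "always switch" policy lowers both total expected deaths and your own chance of dying.

```python
# Toy model of the veil-of-ignorance argument above: six people are
# assigned at random, five to the main track and one to the siding.
# The numbers and function names here are illustrative assumptions.

def expected_deaths(p_switch):
    """Total expected deaths when bystanders switch with probability p_switch."""
    return 5 * (1 - p_switch) + 1 * p_switch

def chance_you_die(p_switch):
    """Your own chance of dying, as a random one of the six people."""
    return (5 / 6) * (1 - p_switch) + (1 / 6) * p_switch

print(expected_deaths(0.0), chance_you_die(0.0))  # never switch:  5.0 deaths, ~0.83 personal risk
print(expected_deaths(1.0), chance_you_die(1.0))  # always switch: 1.0 death,  ~0.17 personal risk
```

Under these assumptions, both measures point the same way, which is the sense in which switching is "symmetrically, better for everyone."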
The "child in the bunker" question looked quite different from all the others, not like a variant of the trolley problem. Actually, it was the only problem where I thought it was not just permissible but even required to kill the child in order to save the others.
I learned about the trolley dilemma a few days ago from the "Justice with Michael Sandel" lectures, part of Harvard's program (equivalent to OCW) for publishing their course materials online.
These tests just annoy the crap out of me. I put up with the first few questions because I wanted to see if they would do something novel. But no, they had all the standard problems of morality tests:
1) False dichotomy. (There are exactly two actions you can take, no more.)
2) Unrealistic foreknowledge. (These are exactly what the results of these actions will be.)
3) Unrealistic scenarios. (How many of us are ever going to be standing with our foot stuck in the tracks of the sideline right near the switch when five other people are...blah blah blah.)
This kind of test is exactly what gives philosophers a bad reputation. They are studying important issues; could they /please/ take the time to build a test that respects the intelligence of the testee?
These are simplifying assumptions; it's rather like a scientific experiment that aims to control for other factors, in order to distill the experiment down to its essential core.
Andrew has been kayaking and is six miles from the nearest town. He hears on the WBZ4 radio station that the damn has broken upstream and that the river is about to flood.
Spelling and grammar mistakes reduce your credibility.
Interesting, my radio station was not WBZ4, it was WBZ60. This corresponds well with the observation in other comments that there was some numerical priming going on that might have been the real point of the study and not the moral part of it.
I have no patience for these contrived dilemmas that few people will ever encounter. When humanity is done making war with one another then maybe we could ask these questions. But until then it is like treating someone's skin rash and ignoring that they are in cardiac arrest.
Why do you think there are wars? It has to do with the moral standing and beliefs of the leaders who wage those wars. These types of tests are actually very appropriate for "treating" the problem of war. One life for one more important one, or one life for many, or a few lives for many are the foundations of war. Are the study and its results a fix for these things? No. But they can give a much greater understanding of the decisions people are willing to make, allowing for better judgement of reactions to situations.
http://justiceharvard.org/
Link was posted here recently, if anyone missed it.
Pushing fat guys in front of trains seems a bit cartoony and far less certain to save anyone in reality.
I think that the confusion around whether it would even work is enough to muddy the results in that case.
http://moral.wjh.harvard.edu/eric1/test/testP.html