item 10407951

VW

120 points | boriselec | 10 years ago | blog.cleancoder.com

44 comments


weinzierl|10 years ago

   I suppose you could make the argument that these programmers did not 
   know what they were doing. That they were simply given some specs, and 
   they implemented those specs, and didn't know that they were 
   accomplices in a case of massive fraud.


   I think that argument is even more asinine than Michael Horn's. They 
   knew. And if they didn't know, they should have known. They had a 
   responsibility to know.

I agree with all the points in the article except for the point that the programmers should have known.

For me it is a plausible scenario that the programmers were told that this feature was needed for some good reason (probably testing).

When I was a young engineer I had a mentor. He was a war baby and a strict pacifist. He was also very good and his advice was much sought after so he could afford to refuse all offers from the defense industry.

He once told me that for his whole life he had managed to never design anything that could be used to harm people - except for one thing. When he was young he was hired to design a gear rim for a crane. He told me he was given the load specifications but never saw a drawing of the actual crane. That was a bit unusual, but nothing he worried about.

It turned out that the gear rim was actually for a Howitzer. He never worked for that client again.

There are all kinds of reasons why a car has to behave differently while on a dynamometer and there are all kinds of special code branches that are executed only during test. For the programmers it probably was just another special case among many.

Don't be evil and don't be a fool, but you can't be expected to do a full ethics check for every feature you are supposed to implement.

EDIT: Spelling, style and removal of some superfluous chatter.

tzs|10 years ago

From what I've read, the trigger for the mode switch was very detailed and narrowly tailored to the EPA certification testing, and included barometric pressure as a factor.

That makes it quite a bit harder to believe that whoever implemented it thought it was for some legitimate testing. For testing you want a trigger that is hard for anyone to hit accidentally, but easy for people who know about it to hit. You would not include barometric pressure, because that narrows the ability to get into the test mode way too much.

An ideal sequence would be some nonsensical sequence of inputs, like a specific sequence of left and right steering inputs, with a specific sequence of turn signals (often opposite of the direction turned) if the ECU has turn signal data available, interleaved with a specific pattern of taps on the brakes.
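tzs's contrast can be sketched in a few lines. The sketch below is purely illustrative: the signal names, thresholds, and the "secret sequence" are all invented, and real ECU code would look nothing like this. It just shows why a deliberate test hook would key off an exact, nonsensical input sequence, while a trigger built from continuous driving-condition signals (steering angle, wheel speed, barometric pressure) looks like dynamometer detection rather than a test aid.

```python
# Hypothetical sketch contrasting the two trigger styles described above.
# All signal names, thresholds, and sequences are invented for illustration.

# Style 1: a deliberate, documented test hook. It fires only on an exact,
# nonsensical input sequence - hard to hit by accident, easy if you know it.
SECRET_SEQUENCE = ["left", "right", "right", "brake", "left", "brake"]

def sequence_trigger(event, history):
    """Append the latest driver input and check for the magic sequence."""
    history.append(event)
    return history[-len(SECRET_SEQUENCE):] == SECRET_SEQUENCE

# Style 2: a defeat-device-like trigger. It fires when continuous sensor
# readings match lab test conditions: wheels turning but steering centered,
# and barometric pressure in a plausible sea-level test-cell range.
def profile_trigger(steering_angle_deg, wheel_speed_kmh, baro_hpa):
    on_dyno = wheel_speed_kmh > 0 and abs(steering_angle_deg) < 1.0
    lab_pressure = 990 <= baro_hpa <= 1030  # invented range for illustration
    return on_dyno and lab_pressure
```

The point of the contrast: the first trigger is useless for cheating (a regulator would never stumble into it) but great for testing; the second is useless for testing (you can only reach it on a dyno at the right altitude) but exactly matches certification conditions.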

radiorental|10 years ago

"There are all kinds of reasons why a car has to behave differently while on a dynamometer and there are all kinds of special code branches that are executed only during test. For the programmers it probably was just another special case among many."

I think the point is that someone deliberately did this and they had their hands in the code. Yes, there are variants of the tunable parameters for various regions and tests. As part of design and validation these can be used interchangeably on the test beds. However, someone, somewhere wrote the emissions defeat device.

dsfyu404ed|10 years ago

I find his assertion that the programmers "should have known" to be beyond naive.

There's a reason that teams designing these sorts of systems consult with lawyers who are experts on the relevant law. The programmer's job is to program. Expecting them to also deal with details of legality and morality (beyond grossly obvious things like hard coding dosage limits into medical equipment) is just wishful thinking.

That's the kind of talk that people want to hear. "Oh, the developers were given shitty instructions, they shouldn't have listened" - but talk is cheap. Stop to consider the implications of that sort of second-guessing. Obviously things get wacky at both extremes, but when you give someone a spec to meet you need to have an expectation that it will be met. Our industry is built upon millions of black boxes that meet an I/O spec. Having the developers turn around and say "we changed your spec because it was killing polar bears" opens a much larger can of worms than just implementing what you're told to implement, accepting that it might not be morally agreeable, and getting on to the next thing.

There's a reason people aren't all generic worker bees. It's more efficient to have the lawyers worry about laws, the coders worry about code, and the managers act as the interface between them (and accept the blame if what the lawyers say isn't properly translated into the programmers' instructions) than it is to have all three groups worry about all three subjects.

I think law is interesting and has a lot in common with software development, but I don't want to have to go looking up case law as required research before coding a windshield wiper controller.

resu_nimda|10 years ago

> The programmer's job is to program. Expecting them to also deal with details of legality and morality (beyond grossly obvious things like hard coding dosage limits into medical equipment) is just wishful thinking.

We're humans, not robots. People can be expected to think about things and participate in society. It's generally held that we should expect pretty much everyone to concern themselves with details of legality and morality as part of being a good citizen... "I'm just a simple automaton doing what I'm told" is generally not a valid excuse.

Do you actually know programmers who literally just take specs and implement them and have no thoughts or opinions about the larger context of what's going on? In my experience, programmers have a lot to say about non-programming aspects of work.

The other issue here is: what constitutes "grossly obvious"? You just drew a totally arbitrary line based on your own opinion of what can be expected and what can't. Your argument is a bit of a strawman; nobody is expecting coders to go read up on case law.

Ultimately, we don't know anything about what happened at VW. We don't know who was responsible, or who knew what, and we're all just crafting up scenarios and speculation ("you see, the specs were such that the engineers couldn't possibly have known what was going on") based on our own experiences and biases.

Maarten88|10 years ago

> The programmer's job is to program...

I think an engineer has more responsibility than following orders. Especially a German engineer should be aware of this. "Ich habe es nicht gewusst" ("I didn't know") is only an excuse as long as it is true.

Mithaldu|10 years ago

Writing the software is one act. Being the one to greenlight taking this software and putting it into machines that will be sold to end customers is another.

I can see many reasons why software might be written, or maybe even configured, in a way that could be lethal when deployed to an actual customer, but still have completely valid and sane reasons for existing (all manner of testing comes to mind).

Unless it can be proven that the developers had intent and did follow through, there is no particular reason why the blame should fall entirely on them.

Additionally, if he is so intent on having a "profession" that punishes wrongdoers, he should first call for one that protects its good members.

dmd|10 years ago

It is absolutely plausible that the programmers had no idea.

From the excellent Metafilter thread:

> i mean, how do the product managers rationalize this feature to their colleagues? what do they write in the spec that isn't all-out incriminating?

Modularity

Department 1:

Req 1: Software should enable emissions controls upon receipt of control signal A.

Req 2: Software should disable emissions controls upon receipt of control signal B.

Department 2:

Req 1: if epa testing device is detected send signal A.

Req 2: if epa testing device is not detected send signal B.

http://www.metafilter.com/153117/EPA-Accuses-VW-of-Emissions...
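The two-department split quoted above can be made concrete with a toy sketch. Everything here (function names, the "A"/"B" signals) comes straight from the quoted requirements, but the code itself is an invented illustration, not anything resembling real ECU software. The point it makes: each module, reviewed in isolation, implements an innocuous-sounding spec; only the integration step that wires one to the other constitutes the defeat device.

```python
# Toy illustration of the two-department requirements quoted above.

# Department 2's module: detect the test device and emit a control signal.
# Viewed alone, this is "just detection and telemetry".
def detection_module(test_device_detected):
    return "A" if test_device_detected else "B"

# Department 1's module: switch emissions controls on an opaque signal.
# Viewed alone, this is "just a mode switch".
def emissions_module(signal):
    if signal == "A":
        return "emissions controls ON"
    if signal == "B":
        return "emissions controls OFF"
    raise ValueError("unknown control signal: " + repr(signal))

# Only this integration step - routing the detector's output into the
# emissions switch - turns two defensible features into a defeat device.
def ecu(test_device_detected):
    return emissions_module(detection_module(test_device_detected))
```

Whether real requirement documents could stay this cleanly compartmentalized is exactly what the rest of the thread is arguing about.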

scintill76|10 years ago

Still seems kind of fishy, at least without some stated reason why it needs to know it's being tested.

teddyh|10 years ago

Just because it’s possible doesn’t mean that it’s plausible or even likely.

yason|10 years ago

A wise future programmer might want to ensure that the software he writes acts mostly like science - it can be used for good and it can be used for bad, but isn't inherently either - and thus force decisions about the final product to be outsourced to someone else.

Maybe it's ok in the Volkswagen software to have a knob that controls the amount of NOx in the exhaust, for testing purposes and for adapting the car to various markets. Maybe it's ok for the software to provide heuristics for the driving conditions (highway, city, dynamometer) for some future telemetry application. But the wise future programmer realizes it needs to be someone other than himself who makes the decision to configure the system to couple those two things together and make the car reduce pollution only when dynamometer mode is active.
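That decoupling argument can be sketched as code. The sketch below is hypothetical through and through - the classifier thresholds, the calibration table, and the NOx values are all invented - but it shows the structure: two individually defensible features (a tunable emissions knob and a driving-condition heuristic), with the incriminating part isolated in a single, auditable configuration table that couples them.

```python
# Hypothetical sketch of the decoupling argument; all names and numbers
# are invented for illustration.

def classify_conditions(steering_angle_deg, wheel_speed_kmh):
    """Driving-condition heuristic - defensible on its own (telemetry)."""
    if wheel_speed_kmh > 0 and abs(steering_angle_deg) < 1.0:
        return "dynamometer"
    return "road"

def nox_limit(calibration):
    """Tunable NOx knob - defensible on its own (per-market tuning)."""
    table = {"strict": 0.04, "relaxed": 0.25}  # g/km, illustrative only
    return table[calibration]

# The coupling lives in one small configuration table. Whoever signs off
# on this mapping - not on the two features above - made the cheat.
COUPLING = {"dynamometer": "strict", "road": "relaxed"}

def effective_limit(steering_angle_deg, wheel_speed_kmh):
    mode = classify_conditions(steering_angle_deg, wheel_speed_kmh)
    return nox_limit(COUPLING[mode])
```

In this framing, responsibility concentrates on whoever writes and approves `COUPLING`, which is the commenter's point about forcing the decision onto someone else.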

Good old shifting of blame works for the bad guys as well as the good guys. It may not be pretty but it works well enough if only you're willing to draw the line of responsibility somewhere for yourself.

sauere|10 years ago

> It doesn't matter that their bosses told them to do it. They did it.

It's not that easy. Sure, what they did can be considered "evil"... but what if they had refused to do it? They would most likely have lost their jobs, and they would have had no chance in a court trial. Volkswagen has an army of lawyers and is tightly connected with every relevant government agency in Germany.

moomin|10 years ago

Exactly, Uncle Bob is in the category of people who can walk away if it suits them (he's got fuck you reputation if not fuck you money). So are (some of) the bosses. The developers weren't.

The problem here is Uncle Bob thinks he's in the trenches when in fact he's armchair quarterbacking.

jtheory|10 years ago

This is all very theoretical, since we don't know the situation.

But the other possibility would be to "blow the whistle" anonymously.

The chances of getting away with that still aren't great (if VW put some effort into flushing out the snitch, I think only a practised liar could get through it...), but it's another way forward.

And actually: it's possible this actually happened, and the official story of how this was discovered is just a cover for an anonymous engineer who managed to get a warning to the right person.

mehrdada|10 years ago

Even if everything in the article were right, I disagree with the conclusion. I'm glad we have no "profession" to act as another gatekeeper for people to do things. The negative consequences of such a "licensing" system (which would probably transform into a political pact soon enough after introduction) are more profound than the cost of its nonexistence.

TazeTSchnitzel|10 years ago

The vast majority of people working on the Manhattan Project had no idea they were building a bomb.

What says the programmers knew?

nemo44x|10 years ago

It's conjecture at this point to say anyone who programmed the ECU to do this knew what they were doing.

That said, it is likely they did know, but this comes from above. There are a few psychology experiments showing that many humans will do things they know are wrong or immoral when an authority figure tells them to, even though they don't want to. The Milgram experiment, for instance, comes to this conclusion, among others.

Peer pressure and obedience to authority are real phenomena, and that starts with the leadership that needs to be held accountable. Hearing an authority figure pass the blame to someone at the bottom is disgusting and, I believe, barking up the wrong tree.

seivan|10 years ago

Is that the experiment that showed that "assholes" are the ones who wouldn't do what they were told if they found it immoral?

While the "nice" and "obedient" always did as they were told? Then we're told never to hire "assholes".

rycfan|10 years ago

"The public has been made aware that programmers can be culprits. This will make it more likely that the next time something goes wrong -- a plane crash, a fire, a flood -- that the public will jump to the conclusion that some programmer caused it. Yes, this is a stretch; but it wasn't so long ago that the concept of programmer implication in disasters was non-existent."

"...it wasn't so long ago..." What?

https://en.wikipedia.org/wiki/Therac-25 -- This has been a thing since at least 1985 and probably far longer.

iofj|10 years ago

Exactly. There is an existing legal and moral framework for precisely these sorts of situations. It seems to work very well for the emissions scandal too, so why mess with it?

Even if the programmer was fully aware of what they were doing, VW would still be the only party that's legally and morally responsible for this.

(And may God help whoever made the Therac-25 mistake, just imagine making a bug like that)

mledu|10 years ago

What is stopping a company from hiring a contractor to code the illegal parts, thereby insulating itself from responsibility? This happens with oil and gas disasters, as discussed on John Oliver: https://www.youtube.com/watch?v=jYusNNldesc

JoeAltmaier|10 years ago

Because many contractors are too smart to do that kind of dirty work. Contractors are liable for harm they cause. I've refused contracting jobs because of liability.

Walkman|10 years ago

Sometimes I would not mind if developing software required a license, even if I'm speaking against myself ATM.

Silhouette|10 years ago

The trouble with licensing a profession like software development is that no-one really knows how to do it very well yet. It's far too young and diverse an industry to have that level of experience and consensus.

Lacking more objective standards, the most likely result of attempting to regulate at this stage seems to be regulators who talk a good talk -- such as the author of this article. Those people will not necessarily be the ones with either the best ideas currently available for building good software or the most useful experience and/or data to advance the state of the art in the future.

I sometimes work on software that really does have to behave properly because significant failures in production really could be very damaging. The idea that some of the careful, successful processes used on some of those projects might be required by regulation/legislation to give way to the kind of junk that a lot of consultants peddle is quite scary.

cls59|10 years ago

No kidding. If a team of Doctors, Lawyers or Professional Engineers (Civil, Mechanical, etc.) had been involved in an ethics disaster of similar scale, those people would be in danger of losing the ability to continue practicing at a professional level.

callesgg|10 years ago

While the ethics behind the implementation might have been obvious, the actual legality of it is not. The law could very well have been written in a way that makes this completely legal.

A software engineer would most likely not know the intricate details of the law required to know whether it was legal (in all nations) or not.

revelation|10 years ago

Professions are tools workers use to increase their bargaining power over employers and put up barriers to entry into said profession. See doctors.