
Software Designer Reports Error in Anthony Trial

72 points | carterac | 14 years ago | nytimes.com | reply

55 comments

[+] foob|14 years ago|reply
It's terrifying that an analysis from software like CacheBack can be used as an important piece of evidence in a murder trial. An error of this magnitude could easily contribute to somebody wrongly losing his or her life and that is not alright in any way. I would feel a lot more comfortable if a piece of FOSS, which could be independently vetted, was used instead of some half-baked proprietary garbage with a $500 price tag. I'm all for finding a niche market and exploiting it, but to me there is something deeply wrong about hiding the logic behind a piece of software producing courtroom evidence.
[+] bugsy|14 years ago|reply
Absolutely. This is not the only situation where expert testimony comes from professional witnesses who make a living supporting various hypotheses using confidently proclaimed but deeply flawed tools and analysis methods that the "expert" himself doesn't even understand. The response is that it's the job of the defense to bring up any problems with experts, but they don't always do that. It's a scam. That this is done on capital murder cases is an abomination, amoral, and should be criminal.
[+] georgemcbay|14 years ago|reply
"He found both reports were inaccurate (although NetAnalysis came up with the correct result), in part because it appears both types of software had failed to fully decode the entire file, due to its complexity. His more thorough analysis showed that the Web site sci-spot.com was visited only once — not 84 times."

How does that work? I mean, how do you examine what must basically be a log file (though perhaps in some binary format), come up with 84 hits but then realize it was only 1 hit and blame the problem on file complexity? Seems like such an issue would only result in underreporting, not overreporting. Where did the 84 number even come from?

[+] colonelxc|14 years ago|reply
Here is an explanation from the maker of a competing tool[1]. It actually delves into the Mork file format with the data from the trial. There are a couple of 84s in the format and in the data, but I think what happened is that, because there is no "visitedcount" entry when you have only visited a site once, the parser took the value from a previous row (in this case, a MySpace page) and repeated it.

If that is truly what happened, the fix is to simply re-initialize the visitedcount to 1 between rows in case there isn't a visitedcount listed.

[1] http://wordpress.bladeforensics.com/?p=357
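If that's really the bug, it's easy to sketch. Here is a minimal Python illustration with hypothetical, simplified rows (the real Mork format is far messier than these dicts; field and function names are made up for the example):

```python
def parse_rows_buggy(rows):
    """Carries the count across rows: the suspected bug."""
    visit_count = None
    results = []
    for row in rows:
        if "VisitCount" in row:
            visit_count = row["VisitCount"]
        # A site visited only once may omit VisitCount entirely,
        # so the stale value from the previous row leaks through.
        results.append((row["URL"], visit_count))
    return results

def parse_rows_fixed(rows):
    """Re-initializes the count to 1 for every row, as suggested above."""
    results = []
    for row in rows:
        visit_count = row.get("VisitCount", 1)  # default: one visit
        results.append((row["URL"], visit_count))
    return results

rows = [
    {"URL": "http://myspace.com/...", "VisitCount": 84},
    {"URL": "http://sci-spot.com/..."},  # visited once; count omitted
]

print(parse_rows_buggy(rows))  # sci-spot.com wrongly inherits 84
print(parse_rows_fixed(rows))  # sci-spot.com correctly reports 1
```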

[+] biot|14 years ago|reply

  $ grep 12.34.56.78 logfile | wc -l

  84
Maybe the complexity comes from there being 1 CSS file, 3 JavaScript includes, 58 images, and a number of AJAX calls on that HTML page?
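That conflation is easy to reproduce with a hypothetical access log (made-up entries, not the actual trial data): one page view pulls in many sub-resource requests, so counting matching lines overstates visits.

```shell
# Fabricated log: a single page view generates four request lines.
cat > access.log <<'EOF'
12.34.56.78 GET /page.html
12.34.56.78 GET /style.css
12.34.56.78 GET /app.js
12.34.56.78 GET /logo.png
EOF

# Naive count: every request line, not every visit
grep -c '12.34.56.78' access.log                  # 4

# Counting only the HTML page loads gets closer to "visits"
grep '12.34.56.78' access.log | grep -c '\.html'  # 1
```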
[+] dragmorp_|14 years ago|reply
This was a major mistake by the witness in this case, and everyone who has been watching the case already knew about it. Do you know why?

Because it was presented to the jury during the trial.

The jury was told that the number of visits to that site was transposed with the number of visits to myspace. A prosecution witness cleared the record in open trial.

In fact, defense attorney Jose Baez even brought up the fact during closing arguments and used it as a reason to have reasonable doubt of the entire case.

[+] lylejohnson|14 years ago|reply
No, it's not (or shouldn't be) news to those of us who watched that part of the trial. But I bet there are a lot of people whose main information sources were cable TV talking heads who may have failed to point out this correction.
[+] orenmazor|14 years ago|reply
That is one terribly written article. It's almost like the writer decided to collect 20 tweetable paragraphs and tie them together into one "article".
[+] enjo|14 years ago|reply
Here is his bio:

http://www.siquest.ca/jbradley.asp

He seems heavy on law enforcement credentials, but rather light on Computer Science. Not sure that is the right combo here.

[+] redthrowaway|14 years ago|reply
To be fair, the "heavy on the law enforcement" bit was doing exactly the kind of thing he's designing the software for. You don't need a CS degree to write a program to dig through a cache, and designing the in-house software for the RCMP is probably experience enough. We all make mistakes, but he went out of his way to make his known as soon as he learned about it. I'm happy to have people with that moral fiber heading forensics departments and designing software.
[+] roel_v|14 years ago|reply
Isn't it strange that when somebody looks for something 84 times, a prosecutor sees that as more important than someone looking for it only once? So a stupid person who needs to read something 84 times, or whose dog eats his printed version 83 times, is more likely to 'have done it' than the person who understands it on the first try or doesn't have a dog?
[+] Steko|14 years ago|reply
Is it really that strange? I think the idea is to show a fixation or continuing interest. I've googled some terms dozens of times because I know it will return the wiki or IMDb page.
[+] georgieporgie|14 years ago|reply
I'm surprised that nobody with access to the data stopped to ponder that those who know how to search would find what they need in < 84 searches, while those who don't know how to search would give up earlier. The fact that everyone blindly trusted suspicious data from a 'magical' program is, to me, more disturbing than the flaw itself.
[+] colonelxc|14 years ago|reply
It's not just searches, but hits to a specific site with information about chloroform, which is even more crazy.
[+] sosuke|14 years ago|reply
She was found not guilty, wasn't she? Why does it matter now that his initial findings against her were faulty?
[+] SoftwareMaven|14 years ago|reply
It matters because the prosecution should have said something. The prosecutor's job is not to just put somebody in jail, but rather, to put the right somebody in jail. Unfortunately, we seem to have forgotten that in the US as part of the adversarial position between law enforcement and citizens.
[+] masterzora|14 years ago|reply
Because said findings were found faulty while the case was still proceeding. It would be irresponsible to let the prosecution get away with illegal behaviour just because the defendant won.
[+] sesh00|14 years ago|reply
Surely it matters that the prosecution had a responsibility to pass on the information and chose not to?

That, and the fact that you've got to seriously worry when a report from a piece of software that can confuse the numbers 1 and 84 is being used as evidence in court.

[+] Nick_C|14 years ago|reply
I took away the message that I would need to be quite careful about being called as an expert witness, viz. what exactly my brief was. He thought it was about one thing; the prosecutor asked him about something else.

Good on him for having the moral fortitude to correct his error.

[+] bugsy|14 years ago|reply
Many people still think she is guilty. It was a shared computer and her mother testified that she had done one search on chloroform. The prosecution countered that there were 84 searches, so the rest had to have been done by the daughter. But it turns out that they ran two different programs on the recovered cache data and one program said there was 1 hit total, the other program said there were 84 searches over several weeks. The company with the 1 hit wrote an analysis showing their competitor's results were wrong. The competitor, with the 84 hits, agreed, and contacted the prosecutors to let them know. The prosecutors decided not to mention that the testimony they were giving from expert witnesses was false, even though they were legally required to do so.
[+] evan_|14 years ago|reply
Maybe one of the jurors found it unlikely that she visited the page about chloroform 84 times and that subtly affected his or her perception of the prosecution's case.
[+] gojomo|14 years ago|reply
Huh? The outcome of any one trial matters very little. Accuracy and honesty in evidence collection matters in every trial.