"While they did find a backdoor in a popular FPGA chip, there is no evidence the Chinese put it there, or even that it was intentionally malicious."
Nor did the original article specifically allege that it was "the Chinese", or that the backdoor was malicious. It did allege that it was inserted by the manufacturer (although technically anything on the chip is inserted by the manufacturer), presumably because it differed from a public spec, but the veracity of that statement is still unknown (at least to us). But I don't think that's enough to call it "bogus".
Agreed; I feel like I've misjudged the story here, because apparently the idea that silicon is as riddled with backdoors as software isn't the big deal; the big deal is just "is China cyberwaring us".
The technology behind this story is serious and interesting, and all anyone wants to talk about is politics.
They said that this chip was manufactured in China and that the manufacturer had inserted this backdoor, which "could be turned into an advanced Stuxnet weapon to attack potentially millions of systems."
By implication, and by selectively leaving out information about how this attack required the use of the JTAG port, the article intentionally implied that it was a Chinese attack, even if it didn't lay out that claim in so many words.
As always, logical reasoning arrives just a tad too late to the party. The original story has already circulated the web and stirred anger and mistrust in a sufficient number of people who will never read this common-sense follow-up. Chalk up one more win for sensationalism and fear-mongering.
The reason corrections to sensational stories never make it as far as the original is that the people who propagated the first story have to admit they were fooled, or at least misinformed. I'm about to share this correction on my own link feeds -- but I'll be honest, I'm not enthusiastic about looking dumb.
Most of the replies on HN to the original story lined up with this follow up article. Wait and see, it's a debug feature, not necessarily a military chip, etc. On many other sites it was a different story, and I'm sure this follow up won't make their headlines either.
This is a military chip. It's used in military applications. I'm not saying that the original article didn't use this to sensationalise 'We JTAG-fuzzed a chip and found an AES key', but to deny that the chip is used in the military is inaccurate.
Good call on the Chinese front, we don't know who generated the key material to block the JTAG.
Incidentally on z/OS systems things that open the system up to external access are sometimes referred to as backdoors, which is what this is. It's a way of accessing the chip, nothing more.
The ProAsic3 is a commercial FPGA. So no, it's not military. The military may use it, but that doesn't change that it is a commercial part built to commercial specifications.
The original article was censored, so it's hard to tell whether it's bogus or not. I believe it's still very interesting. Here (http://bit.ly/JKatpV) is an older paper from the same author detailing a technique for reading thermal signatures from individual transistors on the chip with a microscope and an astronomy camera. A little more advanced than JTAG fuzzing.
However, I agree that it's very hard to insert a hardware backdoor into a mask, even an FPGA mask. Given the hundreds of variations and revisions that FPGAs normally have, it's nearly impossible to cover even a single family.
A more plausible explanation: some big client got very angry in the past because they lost the password and the FPGA vendor didn't have a "master key" to unlock it. Still, a very stupid move.
The bigger issue here is whether such a back door/debug mode was accidentally or intentionally left in any of the Actel FPGAs that use their anti-fuse technology. At present, anti-fuse seems to be the most robust technology for preventing read-out of the configuration bit-stream, as there is no serial data being pushed around each time the logic resets. ProASIC3 is a great platform when you are on a very constrained power budget; however, I would not consider it one of their leading security-hardened chips. There is a lot of design reuse in complex semiconductor products, so it is possible that this portion of the design leaked into other devices.
This is a sensible response to the "evil Chinese" hype.
As Feynman put it, when a researcher overblows his findings, he's doing "Cargo Cult Science" (see the closing chapter of "Surely You're Joking, Mr. Feynman!").
I thought cargo cult science was (mis)copying techniques without understanding how they work. The original example was building a fake airport / ATC at an abandoned military base and hoping cargo planes would land there.
Whoever wrote that article is a little sketchy with his facts:
Quote: """One of the most common building-blocks is the debugger, known as JTAG. This is a standard way of soldering some wires to the chip and connecting to the USB port,"""
JTAG is just the low-level interface to a debugger. "Soldering some wires" is not the building block, and USB has nothing to do with it (for example, my work-horse JTAG interface connects over Ethernet).
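To illustrate that transport-agnosticism: the protocol is just four wires (TCK/TMS/TDI/TDO) clocked by the host, and whether those wires sit behind USB, Ethernet, or plain GPIOs is the adapter's business. A toy host-side sketch (all names hypothetical, with a simulated 1-bit bypass register standing in for a real chip):

```c
#include <stdint.h>

/* JTAG is four wires clocked by the host; the transport behind them
 * (USB, Ethernet, plain GPIOs) is the adapter's business, not JTAG's. */
typedef struct {
    void (*set_pins)(int tck, int tms, int tdi); /* drive TCK/TMS/TDI */
    int  (*get_tdo)(void);                       /* sample TDO        */
} jtag_port;

/* One TCK cycle with the given TMS/TDI; returns TDO sampled before
 * the rising edge. */
static int jtag_clock(jtag_port *p, int tms, int tdi)
{
    p->set_pins(0, tms, tdi);
    int tdo = p->get_tdo();
    p->set_pins(1, tms, tdi);
    return tdo;
}

/* Shift n bits LSB-first through the currently selected register;
 * TMS=1 on the last bit leaves the Shift state. */
static uint32_t jtag_shift(jtag_port *p, uint32_t out, int n)
{
    uint32_t in = 0;
    for (int i = 0; i < n; i++) {
        int tdo = jtag_clock(p, i == n - 1, (out >> i) & 1);
        in |= (uint32_t)tdo << i;
    }
    return in;
}

/* Stand-in for a real chip: a single 1-bit bypass register, so data
 * comes back on TDO delayed by exactly one clock. */
static int sim_reg;
static void sim_set(int tck, int tms, int tdi)
{
    (void)tms;
    if (tck) sim_reg = tdi;   /* capture TDI on the rising TCK edge */
}
static int sim_get(void) { return sim_reg; }
static jtag_port sim_port = { sim_set, sim_get };
```

Swap in a `set_pins`/`get_tdo` pair that talks to whatever adapter you have and the rest is unchanged; that's the whole point.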
Quote: """Whereas companies (should) disable the debug feature in the version they send to customers, that's not so easy with chips. It requires millions of dollars for every change to chip design. Therefore, chips always have the JTAG interface enabled."""
At least parts of JTAG need to stay enabled (most notably the boundary scan that lets you read/set individual pins) for proper testing of complex circuit boards, but that isn't the problem here: it seems they left some instructions active that read back supposedly write-only values (e.g. the AES key in question). Designing one of these internal, protected bits to be the "disable JTAG debugging" switch would not be that hard. CPUs with integrated flash have been doing that for years: a certain signature in the internal non-volatile memory disables flash readout and CPU debugging, but boundary scan stays active.
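A toy model of that scheme (all names and opcodes hypothetical): a signature byte in non-volatile memory gates the debug/readback instructions, while boundary scan stays usable for board test.

```c
#include <stdint.h>

/* Hypothetical instruction codes for a flash microcontroller's TAP. */
enum {
    IR_BYPASS   = 0x0F,
    IR_EXTEST   = 0x00,  /* boundary scan: needed for board test     */
    IR_SAMPLE   = 0x01,  /* boundary scan                            */
    IR_READBACK = 0x08,  /* reads internal flash -- must be lockable */
    IR_DEBUG    = 0x09,  /* halts the CPU        -- must be lockable */
};

/* Simulated non-volatile security byte; only the magic signature
 * leaves the part unlocked, any other value locks it. */
#define UNLOCK_SIGNATURE 0xA5u
static uint8_t nv_security_byte = 0x00;

/* The gate: boundary scan always decodes, the sensitive instructions
 * decode only while the signature is present. */
static int instruction_allowed(uint8_t ir)
{
    switch (ir) {
    case IR_BYPASS:
    case IR_EXTEST:
    case IR_SAMPLE:
        return 1;
    case IR_READBACK:
    case IR_DEBUG:
        return nv_security_byte == UNLOCK_SIGNATURE;
    default:
        return 0;   /* unknown opcodes act as BYPASS on real parts */
    }
}
```

The point is that the gate is a handful of gates in the instruction decoder, not a new mask set.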
Quote: """ As real silicon chips are becoming more expensive to manufacturer, FPGAs are becoming a more popular alternative. (...) Every change to a chip design requires millions of dollars in changes to the masks that print gates onto a chip."""
Actually, at a fixed complexity, ASICs are getting cheaper to manufacture over time, just like everything else in chip-making, and just like FPGAs. And again: high-end special-technology ASICs might cost "millions of dollars", but no one in their right mind would re-design a complete ASIC for such a simple change as disabling JTAG debugging:
Chips are built in layers, and it is quite common to produce a whole batch of wafers with only the "lower layers" that form the actual transistors. The metal layers on top of them (those that form the wires interconnecting the transistors) may then be added to, say, one third of the wafers.
Then, when errors are found during testing, one can take another wafer from the lot, apply a corrected metal mask, and check whether the error can be remedied by re-wiring (often a few spare gates are spread over the die "just in case" one has to splice an inverter into a signal, or such things).
Such a relatively cheap change (say, 10% of the cost of the complete ASIC production run) would be the right way to build a chip with JTAG completely disabled. It would be impossible to re-enable the feature from the outside; of course, by opening the chip and re-wiring the metal (possible using focused ion beams on a bare die), one could still do it. But this was not the message of the quoted article.
His article might be a bit off, but it is actually true that the leaked University of Cambridge document is off-base, and even their official paper continues to claim there is a backdoor, in a way that negates itself:
"Ultimately, an attacker can extract the intellectual property (IP) from the device as well as make a number of changes to the firmware such as inserting new Trojans into its configuration."
Using a flaw in the system to "insert" a new trojan is not the same as finding an existing one. This, and many other things one sees when looking at both papers, the vendor response, and then their response to the vendor, make it pretty obvious that they stick to the backdoor claim to save face (perhaps for the original grant or clients).
But the best gem of the new paper is claiming that a crypto flaw that requires physical access to exploit amounts to a denial of service, considering you have already taken out the chip, or at least have physical access.
He also keeps presenting as fact the view that the drone was "shot down" over Iran. While many people have different views on how the drone was taken intact, it was not "shot down".
Of course it is reputable. It's on Blogspot, a free web host known for exclusivity and high standards.
Blogspot is well known for some of the highest quality software(1) and adult paradigm-shifting(2) link sites on the planet.
Blogspot, along with Errata Security(3), provides only the highest of high quality security(4) information. After all, the tagline is "Errata Security is a high-end cyber security consulting company."
There are non-malicious reasons for designing in backdoors (debugging) but the manufacturer is based in the PRC, is subject to PRC laws and pressure, and the PRC government undoubtedly knows that the company supplies hardware to the US military. However benign the original motivation for creating the backdoor may have been, it's potentially bad news.
There is no evidence this chip is being used by the military for sensitive tasks, it isn't even certified for those uses. It's just an off-the-shelf FPGA chip that was found to have a backdoor (requiring physical access!), likely for debug purposes. While that should certainly be a concern for many of the company's customers, it is not necessarily a national security crisis.
Actel devices are often used because they use flash memory for the configuration, which makes them usable in situations where RAM may be untrustworthy for something so important (e.g. where you can get bit-flipping due to ionising radiation).
They also make properly rad-hard FPGAs, although these are ridiculously expensive (I've heard that if you are looking at building more than ~10-15 units, an ASIC may actually be cheaper).
Edit: I should add that you could still use Xilinx and Altera devices in these situations, but you really need some way of mitigating the problem -- such as TMR memory.
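For the curious: TMR (triple modular redundancy) just means storing three copies and majority-voting every read, so a single radiation-induced bit flip is out-voted; a periodic "scrub" rewrites the voted value before a second flip can accumulate. A toy sketch with hypothetical names:

```c
#include <stdint.h>

/* Three independent copies of each stored word. */
typedef struct { uint32_t a, b, c; } tmr_word;

static void tmr_write(tmr_word *w, uint32_t v) { w->a = w->b = w->c = v; }

/* Bitwise majority vote: a result bit is set iff at least two copies
 * agree on it, so a single flipped copy is always out-voted. */
static uint32_t tmr_read(const tmr_word *w)
{
    return (w->a & w->b) | (w->b & w->c) | (w->a & w->c);
}

/* Scrubbing rewrites the voted value, clearing a latent flip before
 * a second one can accumulate in the same word. */
static void tmr_scrub(tmr_word *w) { tmr_write(w, tmr_read(w)); }
```

In an FPGA you'd vote the configuration or state registers in fabric rather than in software, but the voting logic is the same.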
Even I, who promised myself I would take all future stories about "cyber threats" with a huge grain of salt (because I know the US government is very keen on expanding its offensive cyber agencies and would stop at nothing to spread propaganda about them), still almost believed this story. Damn it.
While I find much of the author's argument relevant, "it would be expensive" is very unconvincing.
Military operations and espionage by state actors are rarely limited by commercial constraints. In other words, access to confidential data on a vast array of devices would be a bargain at several million dollars when it comes to a national intelligence agency (or Apple or Google or Microsoft for that matter).
The cause of these backdoors isn't malice but a byproduct of software complexity. Systems need to be debugged before being shipped to customers; therefore, the software contains debuggers.
I should think that embedded development environments would have a foolproof way of excluding debug code built into their environments. If they don't, then perhaps this is an opportunity?
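On the software side they mostly do: the standard C toolchain idiom compiles debug hooks out of release builds entirely, e.g. via the NDEBUG convention. A minimal sketch (the macro name DBG_LOG is made up):

```c
#include <stdio.h>

/* In release builds (compiled with -DNDEBUG) the hook expands to a
 * no-op that the compiler deletes, so neither the code paths nor the
 * format strings survive into the shipped binary. */
#ifdef NDEBUG
#define DBG_LOG(...) do { } while (0)
#else
#define DBG_LOG(...) fprintf(stderr, __VA_ARGS__)
#endif

static int add_checked(int a, int b)
{
    DBG_LOG("add_checked(%d, %d)\n", a, b);  /* gone when NDEBUG is set */
    return a + b;
}
```

The hard part with silicon is that there is no equivalent of flipping a compiler flag after tape-out, which is exactly the constraint the thread is arguing about.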
It's not a new problem.
To call it a military chip is inaccurate.
http://www.cl.cam.ac.uk/~sps32/Silicon_scan_draft.pdf
(1) NSFW: http://iphonevolt.blogspot.com
(2) NSFW: http://fascormet.blogspot.com/
(3) http://erratasec.blogspot.com/
(4) NSFW: http://tophackdownloads.blogspot.com/
It's neither a Xilinx, nor an Altera.
http://www.securityweek.com/china-wrongfully-accused-over-ba...