In discussions like this the phrase "security by obscurity" gets used as an accusation. We all agree "security by obscurity" does not work. But that's not what is happening here.
Wikipedia's definition: "the reliance on the secrecy of the design or implementation as the main method of providing security for a system or component of a system."
Yubico isn't saying that the security of the device is increased by keeping the source code secret.
They say they are increasing the security by things like this: disabling user-loading of new firmware (which could be a bad actor loading bad firmware), using hardware with built-in side-channel countermeasures, and disabling JTAG ports (which could be used for key extraction).
This isn't obscurity. These are some good engineering arguments. Engineering is always full of trade-offs.
None of which precludes the implementation from being open source. If anything, it means that even if the software were open source, publishing it would be near-meaningless: I can't verify the code actually running on the device, and I can't reflash it myself.
"Yubico isn't saying that the security of the device is increased by keeping the source code secret."
Yeah, they're not really saying anything other than trying to provide an excuse for why they won't release it. "You can't use it anyway" isn't much of a response (I actually find it rather patronizing and dismissive).
Not to pile on, but regarding "Engineering is always full of trade-offs"... what exactly is the supposed trade-off here? (Maybe they're using licensed code that they can't redistribute?)
>They say they are increasing the security by things like this: disabling user-loading of new firmware (which could be a bad actor loading bad firmware), using hardware with built-in side-channel countermeasures, and disabling JTAG ports (which could be used for key extraction).
Are all of those listed features only possible with secret code? And if yes, once someone unobscures the code or methods, they'll be able to defeat the security. Isn't that the exact definition of 'security through obscurity'?
This isn't about security. It's that the device used to be open source and user-modifiable, and it no longer is. You can force a wipe on flash, for example.
They clearly changed stance to ensure users cannot play with the hardware and competitors cannot copy the code. Which is fine. But it's always weird when the argument of security is used instead of being genuine.
You can copy the freaking key by removing the plastic of the YubiKey 4; you don't need a JTAG port, you just connect to the pins. And guess what: it's no big deal. You can't do that remotely, and it's not a device for 007 spies.
> In discussions like this the phrase "security by obscurity" gets used as an accusation. We all agree "security by obscurity" does not work. But that's not what is happening here.
Well, sort of.
In the linked article Jakob Ehrensvärd (Yubico CTO) wrote:
>> (…) One could say it actually works the other way. In fact, the attacker’s job becomes much easier as the code to attack is fully known and the attacker owns the hardware freely. (…)
While the rest of the article makes good points, this particular sentence hints at "security through obscurity".
Am I understanding correctly that these devices can never have their firmware updated? That there is no update mechanism seems insane. They could prevent bad firmware updates by wiping keys on upgrade. The risk now is that some firmware version is discovered to have flaws, and that device is vulnerable forever.
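The wipe-on-upgrade policy suggested above is easy to model: the device accepts new firmware, but only after erasing its key store, so a malicious image can never read previously stored secrets. A toy Python sketch; the class and method names are illustrative, not any real Yubico API:

```python
class WipingDevice:
    """Toy model of a device that allows firmware updates but
    erases all stored secrets before accepting any new image,
    so malicious firmware never sees the old keys."""

    def __init__(self):
        self.firmware = b"v1"
        self.keys = {}

    def store_key(self, slot: str, key: bytes) -> None:
        self.keys[slot] = key

    def update_firmware(self, image: bytes) -> None:
        self.keys.clear()      # wipe secrets first...
        self.firmware = image  # ...then accept the new image
```

Under this policy an update costs the user re-enrollment of every credential, but a flawed firmware version no longer means the device is vulnerable forever.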
The most important thing any security company needs to realize is that their primary product is their reputation, not the physical or digital goods that they produce. "We, as a product company" is totally the wrong attitude. There's really no question about it: every ounce of closed source software/hardware in a security offering is something the customer should be concerned about.
From a product perspective it totally makes sense to be worried about open sourcing the entire design. "Our competition will make clones!" And that may be true of every other kind of product. But would you buy a cheap knockoff Yubikey? I certainly wouldn't. Again, reputation is the key here. That's what a security company sells to their customers. Confidence that when they buy from company X they know that company X has put the best engineers to the task and crafted a device that will protect their valuable digital information.
A company can build up a reputation in the security industry, produce world class hardware and software, and charge a sharp premium on it, because security is _so_ important and protects some of our most valuable assets. That premium is completely derived from the trust that they've garnered. It's insane for Yubico to squander theirs under some false sense of IP security.
EDIT: And all that said, I totally understand where they're coming from on some of their points. They have to depend on chip manufacturers, and chip manufacturers are just the absolute worst when it comes to open source and security. Sometimes there are hard constraints and compromises have to be made. Most of cryptography is a trade-off. So don't take my comment to mean that designs absolutely have to be 100% open source. That's infeasible most of the time for hardware. But Yubico should be striving for it and pressuring the market.
> A company can build up a reputation in the security industry, produce world class hardware and software, and charge a sharp premium on it, because security is _so_ important and protects some of our most valuable assets.
Hmm. I think there are considerable limits on how true this is. I would argue Yubikey's current security is more than good enough for almost everyone.
As mentioned in your edit, there's not a lot Yubico can do about the hardware restrictions. Given these restrictions, a common way companies in this industry assure users of the security of their device is FIPS 140-2 certifications, which range from levels 1 to 4.
Level 4-certified devices are extremely expensive, and the market for them is tiny, which seems to indicate that there's a definite limit on the amount people and organisations are prepared to pay to ensure security.
"The most important thing any security company needs to realize is that their primary product is their reputation, not the physical or digital goods that they produce."
That's semi-true. They're both important. The belief that the product is worth buying, and the effort put into selling it, are of primary importance. Getting hacked or sued in public diminishes sales. So, the most important aspect of security for these kinds of companies is, perversely, minimizing the potential for their image to be hit by hackers, even if the products have no security. Not an accusation at Yubico, but a common strategy in this market. So, they just have to present a good impression to the target market.
"Every ounce of closed source software/hardware in a security offering is something the customer should be concerned about."
Not really. It might surprise you but many companies have run for decades on proprietary platforms. They generated ridiculous sums of money in the process. All kinds of people got jobs, made money, and retired in this time. Nothing to worry about apparently most of the time. The reasons to worry are there but smaller than you think. One must balance many needs in a business. For most, this kind of thing is a checklist item about reducing liability. They're fine if it looks good on paper.
" But would you buy a cheap knockoff Yubikey? I certainly wouldn't. "
Most would. They want something as an obstacle to hackers while minimizing cost. They don't know if Yubikey has any real quality underneath, given how businesses often do things. So, it's a real Yubikey vs. a cheaper one. Many, not all, will choose the cheaper one. See Cisco and mobile manufacturers vs. Huawei to see how big of a market share that can lead to.
"Confidence that when they buy from company X they know that company X has put the best engineers to the task and crafted a device that will protect their valuable digital information."
There's a market for that. I used to try to serve it. It's tiny and fickle. Yet, I question what confidence people have in those engineers to begin with, as they've never assessed their capabilities in INFOSEC, and strong attacks are rarely publicized. It's not like Googling the rate of car crashes.
" produce world class hardware and software, and charge a sharp premium on it, because security is _so_ important and protects some of our most valuable assets. "
Many tried. The market rejected almost all of it. Still does. They want security-defeating feature X, protocol Y, and fall-back Z. They want it to run as fast as the competition despite security or safety checks, on insecure, potentially-backdoored hardware, to get COTS HW benefits. They're also hardly willing to pay anything extra for it, despite whole teams of extra people being put into every other component for rigor, plus the price of external evaluations. The market for high-assurance guards is so small that they have to charge over $100,000 per unit to make the money back. Hell, Signal is free and Threema charges $1-2, but they're barely a fraction of 1% of WhatsApp or Facebook in market share. The demand side is the problem.
So, Yubico is doing what's good for business. All of them are, and should be, until the market shows it's willing to make the compromises necessary for strong security. They won't. So, wasting money on it is foolish outside the defense sector, academia, and a few niches (e.g. smartcards) where one can keep a job doing it.
It's a shame to see that they used the goodwill of security-conscious cryptonerds to gain a foothold on the market only to, effectively, say "We're now targeting enterprise and government who can afford to pay for third party contracting security auditors. You can't, so just take our word that it's secure."
Other companies have managed secret distribution for secure devices just fine: randomise the card manager key and bundle a tamper-proof packet containing the key along with the product. Provide instructions on how to verify the integrity of the packet, and confirm a digitally signed affirmation of the key against Yubico's public key online.
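The verification step described above can be sketched as follows. Python's standard library has no public-key signature support, so this stdlib-only sketch substitutes a fingerprint comparison for the Ed25519/RSA signature check a real scheme would use; the function names are hypothetical.

```python
import hashlib
import hmac


def key_fingerprint(card_manager_key: bytes) -> str:
    """Hex SHA-256 fingerprint of the key found in the
    tamper-proof packet shipped with the device."""
    return hashlib.sha256(card_manager_key).hexdigest()


def affirmation_valid(card_manager_key: bytes, published_fingerprint: str) -> bool:
    """Compare the packet's key against the value the vendor affirms
    online, in constant time. (A real scheme would instead verify a
    digital signature over the key with the vendor's public key.)"""
    return hmac.compare_digest(key_fingerprint(card_manager_key),
                               published_fingerprint)
```

`hmac.compare_digest` is used rather than `==` so the comparison doesn't leak timing information, which is good hygiene even in a sketch.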
That's more than RSA offers for SecurID seed verification and more than my business bank offers for two factor device PIN integrity checking.
I'm not sure who they use for their Secure Element (NXP?) but it also sounds like Yubico has gone along with their request (and NDA) to keep implementation details secret. We've seen a similar situation in SE implementations in mobile phones (for contactless payment, primarily).
Again, enterprise customers don't care (mid-sized ones have insurance that will cover the loss if their Common Criteria EAL 5+ vendor's hardware is compromised; big enterprises can pay for auditing). Governments don't care (they'll pay for auditing or negotiate it into any significantly high-volume contract).
End users and the tech community are the only groups who'll really lose out here.
I've studied high-assurance security and hardware for a long time. This looks to be motivated by a few things:
1. Hardware costs money to develop, has to make it back, and is easy to clone. They'll keep hardware secret by default for this reason, like everyone does. It also lowers the odds of patent suits. All kinds of people demand open, secure hardware, but almost nobody will buy it. Just like software. Number 1 problem in the INFOSEC industry.
2. There are, IIRC, three companies building the kinds of secure IC's they need. They NDA the stuff critical to understanding them. Plus, the implementations are secret, with tamper-resistance mechanisms. It's pointless to rely on the open-source model to understand or evaluate such a thing. There would be some marginal benefits, but the major risks would still be there. Whereas open-sourcing the stuff adds risk in terms of issues with the suppliers. So, no OSS is an acceptable choice here.
3. Restricting some of the firmware/software is a tradeoff of the protection methods they're using. Again, reduces value in open-sourcing it as you'd have to dump it off the chip to verify it anyway. The kind of people that can do that don't need Yubico's help.
4. Yubico might not know how to build secure HW/SW combos. It's a rare skill whose techniques are a mix of published methods and trade secrets. Plus, attackers are always coming up with new stuff. So, obfuscation... not security by obscurity... but obfuscation of aspects of the design to increase the attacker's work between product releases is both justified and a proven method. If no other measures existed, then it would be the garbage known as security by obscurity. This seems to be the better practice of proven mechanisms plus obfuscation, which can hamper even nation-state hackers. Who knows how good their mechanisms are going to be, but there's potential.
So, it seems like a combination of sustaining their business by stopping clones and lawsuits, with improved branding from the effects of obfuscation and hardened IC's on the low-skilled attacks that dominate the press. Two very good reasons to make a decision in this market. It's just economics in action. :)
1. The hardware design per se isn't that valuable. It's quite easy to reverse engineer and is probably more like a reference design than anything. More likely NXP (?) don't want open designs and open software because it makes it easier to reverse engineer and clone the chips themselves. For Yubico themselves it's mainly the firmware that is valuable (well, the design and access to chips too, of course), which is why part of their firmware isn't open source.
After thinking through the initial "this is terrible" reaction, I actually don't mind what they're doing. Though if there were an equivalent open-source solution, I'd definitely choose it over the YK 4.
I also don't see anything that would really prevent them from just releasing the source they're using, even if we can't realistically do anything useful with it. The whole point of those systems is that it's secure via algorithms and hardware silos - releasing their sources shouldn't change anything.
But in practice it doesn't really matter that much - as long as they use standard interfaces and replace your key for free if someone finds a vulnerability, I'm (cautiously) fine with their new position. I think a big part of the issue is that they did something better before, but if they started with the current design, people wouldn't really complain about it that much.
Couldn't a hardware vendor theoretically provide read-only access to the firmware and then have an open-source reproducible build process so that anyone can build their own copy of the firmware and verify that the firmware on the device is bit-for-bit identical? Wouldn't that satisfy people who want to be sure of what code is running on their device while still preventing an attacker from loading custom firmware?
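The comparison step of that scheme is simple once you have a reproducible build and a way to dump the device's flash (those are the hard parts). A sketch, assuming both images are available as files; paths and function names are illustrative:

```python
import hashlib


def sha256_file(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks
    so large firmware images don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def firmware_matches(built_image_path: str, device_dump_path: str) -> bool:
    """True iff the locally built image is bit-for-bit identical
    to the image read back from the device."""
    return sha256_file(built_image_path) == sha256_file(device_dump_path)
```

Note this only helps if the read-out path itself is trustworthy: as other comments point out, a backdoored device could serve a clean image over its own readback interface.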
The argument for disabling loading new firmware on your own device is valid. It prevents an outside actor loading malicious firmware. But it's a tradeoff: it means that if a vulnerability is found, the device has to be replaced, and users can't customize their firmware. That's a good tradeoff; I'd rather risk paying for a new Yubikey than risk a security compromise, and most users are unqualified to verify the security of firmware being loaded onto the device.
The problem is, it's not a tradeoff Yubico have to make. They can allow users to achieve the same goals by distributing the device un-flashed, with the source code to the firmware. Upon flashing, the firmware would disable further flashing. If the user doesn't like this tradeoff, the user can choose to change the code. As a courtesy to more trusting users they could provide the service of optionally flashing devices for you. And qualified users can verify the security of the firmware before loading it.
But by flashing the devices themselves, Yubico has chosen the worst of both worlds. Now an outside actor can once again add malicious firmware: Yubico is an outside actor. AND nobody can verify the security of the firmware. This isn't even a tradeoff, it's just a loss.
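The ship-unflashed scheme proposed above reduces to a one-shot lock bit: the first flash succeeds and permanently seals the device. A toy model, not real device behavior:

```python
class OneShotDevice:
    """Toy model of a device shipped unflashed, whose first
    firmware flash permanently disables further flashing."""

    def __init__(self):
        self.firmware = None
        self.locked = False

    def flash(self, image: bytes) -> bool:
        if self.locked:
            return False        # lock bit set: reject every later flash
        self.firmware = image   # first flash is accepted...
        self.locked = True      # ...and seals the device
        return True
```

The point of the parent's argument is who runs that first `flash()` call: if the user does, they can audit the image first; if the vendor does, the user can neither audit nor replace it.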
> They can allow users to achieve the same goals by distributing the device un-flashed
There is the possibility of the device being intercepted before it reaches you. Or before you have gotten around to locking it down. Or when you plug it into your (compromised) system to lock it down.
Since all communication is done over the USB port, the problem is that the device can be flashed with a backdoored firmware that appears to be normal/unflashed. One that can still be flashed (by basically running a virtual machine/emulator that runs the flashed image), and that appears to get locked down when you go through any lockdown process (since you just end up locking down the VM), but still has the backdoor in place.
Firmware aside, people can modify the hardware too, unless you crack open the device and inspect the internals (which many devices are designed to prevent). And even then, a really sophisticated attack could replace the chips with identical-looking ones; if the device uses off-the-shelf parts, that wouldn't be that hard. They can also add an extra chip before the real one that intercepts the communication. Or maybe compromise the 'insecure' USB chip (if it's programmable).
With locked down hardware the manufacturer can bake private keys onto the chips and ensure that the official stuff checks the hardware by asking it to digitally sign something with a private key. But if the attacker has added their own chip between the USB and the legit chip, they can pass through the requests to the official chip.
A TPM will do something like keep a running hash of all the instructions that are sent to the hardware and use the resulting hash as part of the digital signature verification, but if you mirror the requests that doesn't help.
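The running hash mentioned here works like a TPM PCR extend: each command is folded into an accumulator, so the final value commits to the exact command sequence. A minimal sketch; the hash choice and encoding are illustrative, not the TPM specification's exact format:

```python
import hashlib


def extend(state: bytes, command: bytes) -> bytes:
    """Fold one command into the running hash, PCR-extend style:
    new_state = H(old_state || H(command))."""
    return hashlib.sha256(state + hashlib.sha256(command).digest()).digest()


def transcript_hash(commands: list) -> bytes:
    """Accumulate a whole command sequence. Any reordering,
    insertion, or omission yields a different final value."""
    state = bytes(32)  # all-zero initial state, as in a reset PCR
    for cmd in commands:
        state = extend(state, cmd)
    return state
```

As the comment says, a pure pass-through relay replays the identical command stream and so reproduces the same transcript, which is why this mechanism alone doesn't defeat an interposed chip.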
The next stage is to use the keys on the chip to encrypt all communication between the host and the 'secure' chip, so any 'pirate' chip won't get anything useful.
Users could be allowed to 'bake' their own keys in, but that leaves us with the intercepted-hardware problem: the attacker gets the hardware and installs fake firmware that appears to accept your custom key and performs the encryption.
Personally I think worrying about security to that level is overkill, even if you're dealing with quite a bit of money. It would have to be quite an organised attack: they would have to gain physical access to the device, compromise it, return it unnoticed, and then gain physical access again later. That requires both physical and digital security skills.
That's much more work than just stealing it or applying rubber-hose cryptanalysis. Attackers can also simply compromise the system being used to access whatever the key protects.
I am pleased they took the time to respond at length. It makes a bit more sense now (NDAs, hardware manufacturers, etc.) vs. the 'security by obscurity' mantra prevalent in the replies.
I have had my own business, and the one thing I would say to the critics of Yubico: if you have a way, given existing hardware and software tools and suppliers, to do a better job, step up and do it. AFAIK, Apple didn't open-source their hardware related to crypto, or their software.
I think you will find it takes more than wishful thinking; more like, put your money (or your time) where your mouth is. Engineers, and I don't just mean CI engineers here, know it is a long way from a math equation or set of equations to a real-world working object. I would love to see, and would contribute money to, an open-source solution. I just don't think it is as cookie-cutter simple as the majority of comments on this forum seem to intimate.
BTW, the two major manufacturers they're talking about are NXP (http://www.nxp.com/) and Infineon (http://www.infineon.com/). STMicroelectronics (http://www.st.com/) is also a player here, and Feitian has also started doing it (http://www.ftsafe.com/product/epass/eJavaToken). NXP and Infineon are notoriously hard to get started with for small companies and independent developers, but they have some very clever proprietary stuff in their chips.
Very long post. Apparently, a very simple explanation: they want to use NXP hardware, and NXP requires NDAs, preventing them from meaningfully open-sourcing code for the platform.
That's a very disingenuous summary. It seems impossible to make the device open due to the NDAs. Can you explain how they would get around these?
With regards to the applet manager, that seems to be more an issue of customer friction than of technical difficulty. While "crypto nerds" would be fine with it, business applications could be affected.
My opinion on this is that physical security is paramount. Your threat model can't possibly eliminate all threats from an adversary that has physical access.
No hardware is 100% secure and for Yubico to say this issue is about "Secure Hardware vs. Open Source" seems like a red herring. Perhaps they are just trying to protect their business model? After all, there isn't anything particularly unique about the hardware.
Physical security is a moving target and a spectrum. Basic mechanisms can protect my computer if I leave it unattended in front of common hackers for a few minutes to take a leak at a restaurant. Another level of security is necessary for people with more access or tooling. At some point, basically nothing I do will help given enough resources by pro's.
So, it's not so simple. Otherwise, all buildings containing valuables protected by locks and such would be compromised because enemies have the potential of physical access. They aren't. That's telling you something.
Well, it's a shame that poor arguments get recycled like this, but it does make for easy dismissal. Cryptography is based on the idea that the methods used are totally transparent; the power to decrypt comes from possession of the appropriate keys. Closing a design, hiding it from scrutiny by the majority of hackers like ourselves, helps no one other than the individuals who wish to gain unauthorised/unwanted access.
This is a fundamental concept in FOSS, and for anyone to try to rationalise their way out of it, be it out of some corrupted sense of trying to do the right thing, is absurd.
Fortunately I feel that the very people who would be interested in this device will be aware of this; I hope the folks at Yubico reverse this decision.
This story made me think a bit about devices like the Yubikey. I'd really like one to store my keys to sign mail, or for two-factor authentication. But the main selling point, the tamper-resistant secure-enclave-like chip, is something I don't need. I'd rather have a tiny microcontroller in USB format that I can program myself and understand nearly 100%, with no secret code going on.
My reasoning: I don't need physical tamper-resistance for my threat scenario, i.e. the device being stolen by a random thief, a coworker, a "friend", etc.
But if I was attacked by a nation-state-like actor, I cannot trust any security measure of the device. How do I know the NSA does not have a copy of every "random" card-manager key? How do I know that generated keys are not subtly biased so that they can be guessed easily? Or that there is not a secret function to extract them? Even if Yubico is 100% honest and their device is clean, I must assume that if e.g. the NSA were after me, they have the technology to extract the keys from the device, no matter what protection it has.
I understand where they're coming from. Though it would be even braver for them to get into the IC design game, and make a chip with the properties they desire. They can then publish whatever they would like about that chip.
tl;dr: code is closed and I can't change it anyway so it shouldn't matter to me.
I hope the response from consumers will be: we understand your position. Unfortunately that is unacceptable and we'll look for another vendor. It is mine. I own a Neo, not getting any of their future products.
Also, as a strategic guideline: maybe if you're in the business of security, don't use hardware that requires NDAs. Yes, it'll make some things impossible and other things more expensive, but I'd say there's really no room to compromise.
While it is good that they are implementing all these hardware security features, I think that we are in general overthinking the whole thing.
Their current industrial design very clearly says "hey, I am an important security key", which is exactly the wrong thing to do.
It should instead look like a cheap flash drive. And when the thief plugs it in, he sees exactly that, a low capacity USB flash drive, unencrypted, with some random documents on it.
Is the thief at this point going to perform some sophisticated hardware hacking? No, it will just get thrown away.
The industry of smartcards and similar devices has annoyed me for a long time, mainly due to its failure to provide a secure general purpose computing environment and get out of the way. I wrote about it some time ago: https://www.devever.net/~hl/smartcards
I have been a long-term user/promoter of Yubikeys. But today I ordered a Nitrokey Pro. They seem to be the better choice now: definitely more open, with pen tests of the hardware and firmware on their website, and all schematics and firmware on GitHub.
Alupis | 9 years ago:
Could this be a sly attempt to close up the source (and hardware) before they have a Tangibot[1] situation?
That scenario played out poorly for MakerBot, and perhaps Yubico learned the wrong lessons from the entire ordeal.
[1] http://www.cnet.com/news/pulling-back-from-open-source-hardw...
[+] [-] discreditable|9 years ago|reply
Am I understanding correctly that these devices can never have their firmware updated? That there is no update mechanism seems insane. They could prevent bad firmware updates by wiping keys on upgrade. The risk now is that some firmware version is discovered to have flaws, and that device is vulnerable forever.
[+] [-] unknown|9 years ago|reply
[deleted]
[+] [-] fpgaminer|9 years ago|reply
The most important thing any security company needs to realize is that their primary product is their reputation, not the physical or digital goods that they produce. "We, as a product company" is totally the wrong attitude. There's really no question about it, every ounce of closed source software/hardware in a security offering is something the customer should be concerned about it.
From a product perspective it totally makes sense to be worried about open sourcing the entire design. "Our competition will make clones!" And that may be true of every other kind of product. But would you buy a cheap knockoff Yubikey? I certainly wouldn't. Again, reputation is the key here. That's what a security company sells to their customers. Confidence that when they buy from company X they know that company X has put the best engineers to the task and crafted a device that will protect their valuable digital information.
A company can build up a reputation in the security industry, produce world class hardware and software, and charge a sharp premium on it, because security is _so_ important and protects some of our most valuable assets. That premium is completely derived from the trust that they've garnered. It's insane for Yubico to squander theirs under some false sense of IP security.
EDIT: And all that said, I totally understand where they're coming from on some of their points. They have to depend on chip manufacturers, and chip manufacturers are just the absolute worst when it comes to open source and security. Sometimes there are hard constraints and compromises have to be made. Most of cryptography is a trade-off. So don't take my comment to mean that designs absolutely have to be 100% open source. That's infeasible most of the time for hardware. But Yubico should be striving for it and pressuring the market.
[+] [-] jfindley|9 years ago|reply
Hmm. I think there's considerable limits on how true this is. I would argue Yubikey's current security is more than good enough for almost everyone.
As mentioned in your edit, there's not a lot Yubico can do about the hardware restrictions. Given these restrictions, a common way companies in this industry assure users of the security of their device is FIPS 140-2 certifications, which range from levels 1 to 4.
Level 4-certified devices are extremely expensive, and the market for them is tiny, which seems to indicate that there's a definite limit on the amount people and organisations are prepared to pay to ensure security.
[+] [-] nickpsecurity|9 years ago|reply
That's semi-true. They're both important. The belief that the product is worth buying and effort into selling it are primary importance. Getting hacked or sued in public diminishes sales. So, the most important aspect of security for these kinds of companies is perversely minimizing potential for their image to be hit by hackers even if the products have no security. Not an accusation at Yubico but a common strategy in this market. So, they just have to present a good impression to target market.
" every ounce of closed source software/hardware in a security offering is something the customer should be concerned about it."
Not really. It might surprise you but many companies have run for decades on proprietary platforms. They generated ridiculous sums of money in the process. All kinds of people got jobs, made money, and retired in this time. Nothing to worry about apparently most of the time. The reasons to worry are there but smaller than you think. One must balance many needs in a business. For most, this kind of thing is a checklist item about reducing liability. They're fine if it looks good on paper.
" But would you buy a cheap knockoff Yubikey? I certainly wouldn't. "
Most would. They want something as an obstacle to hackers while minimizing cost. They don't know if Yubikey has any real quality underneath, given how businesses often do things. So, it's a real Yubikey vs a cheaper one. Many, not all, will choose the cheaper one. See Cisco and the mobile manufacturers vs Huawei to see how big a market share that can lead to.
"Confidence that when they buy from company X they know that company X has put the best engineers to the task and crafted a device that will protect their valuable digital information."
There's a market for that. I used to try to serve it. It's tiny and fickle. Yet, I question what confidence people have in those engineers to begin with as they've never assessed their capabilities in INFOSEC and strong attacks rarely are publicized. It's not like Googling rate of car crashes.
" produce world class hardware and software, and charge a sharp premium on it, because security is _so_ important and protects some of our most valuable assets. "
Many tried. The market rejected almost all of it. Still does. They want security-defeating feature X, protocol Y, and fall-back Z. They want it to run as fast as the competition despite security or safety checks, on insecure, potentially-backdoored hardware to get COTS HW benefits. They also hardly want to pay anything extra for it, despite whole teams of extra people being put into every other component for rigor, plus the price of external evaluations. The market for high-assurance guards is so small that they have to charge over $100,000 per unit to make the money back. Hell, Signal is free and Threema charges $1-2, but they're barely a fraction of 1% of WhatsApp or Facebook in market share. The demand side is the problem.
So, Yubico is doing what's good for business. All of them are and should until market shows it's willing to make the compromises necessary for strong security. They won't. So, wasting money on it is foolish outside defense sector, academia, and a few niches (eg smartcards) where one can keep a job doing it.
[+] [-] davb|9 years ago|reply
Other companies have managed secret distribution for secure devices just fine - randomise the card manager key and bundle a tamper-proof packet containing the key along with the product. Provide instructions on how to verify the integrity of the packet, and confirm a digitally signed affirmation of the key against Yubico's public key online.
That's more than RSA offers for SecurID seed verification and more than my business bank offers for two factor device PIN integrity checking.
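The verification step davb describes could be as simple as checking a fingerprint of the bundled key against one the vendor publishes (and, in his scheme, signs). A minimal sketch, assuming a hypothetical SHA-256 fingerprint - the function name and flow here are illustrative, not any actual Yubico or RSA process:

```python
import hashlib
import hmac

def verify_card_manager_key(key_bytes: bytes, published_fingerprint_hex: str) -> bool:
    # Fingerprint the card-manager key found in the tamper-proof packet
    # and compare it against the fingerprint the vendor publishes online
    # (which, in davb's scheme, would itself be signed with the vendor's
    # well-known public key).
    fingerprint = hashlib.sha256(key_bytes).hexdigest()
    # Constant-time comparison, out of habit more than necessity here.
    return hmac.compare_digest(fingerprint, published_fingerprint_hex)
```

The point isn't the crypto, which is trivial; it's that the whole ceremony fits in a leaflet inside the packet, so "secret distribution is hard" isn't much of an excuse.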
I'm not sure who they use for their Secure Element (NXP?) but it also sounds like Yubico has gone along with their request (and NDA) to keep implementation details secret. We've seen a similar situation in SE implementations in mobile phones (for contactless payment, primarily).
Again, enterprise customers don't care (mid-sized ones have insurance that will cover the loss if their Common Criteria EAL 5+ vendor's hardware is compromised; big enterprises can pay for auditing). Governments don't care (they'll pay for auditing or negotiate it in any significantly high-volume contract).
End users and the tech community are the only groups who'll really lose out here.
[+] [-] nickpsecurity|9 years ago|reply
1. Hardware costs money to develop, has to make it back, and is easy to clone. They'll keep hardware secret by default for this reason, like everyone does. It also lowers the odds of patent suits. All kinds of people demand open, secure hardware, but almost nobody will buy it. Just like software. It's the number 1 problem in the INFOSEC industry.
2. There are, IIRC, three companies building the kinds of secure ICs they need. They NDA the stuff critical to understanding it. Plus, the implementations are secret, with tamper-resistance mechanisms. It's pointless relying on the open-source model to understand or evaluate such a thing. Some marginal benefits, but the major risks would still be there. Whereas open-sourcing the stuff adds risk in terms of issues with the suppliers. So, no OSS is an acceptable choice here.
3. Restricting some of the firmware/software is a tradeoff of the protection methods they're using. Again, reduces value in open-sourcing it as you'd have to dump it off the chip to verify it anyway. The kind of people that can do that don't need Yubico's help.
4. Yubico might not know how to build secure HW/SW combos. It's a rare skill whose techniques are a mix of published work and trade secrets. Plus, attackers are always coming up with new stuff. So, obfuscation... not security by obscurity... but obfuscation of aspects of the design to increase attackers' work between product releases is both justified and a proven method. If no other measures existed, then it would be the garbage known as security by obscurity. This seems to be the better practice of proven mechanisms plus obfuscation, which can hamper even nation-state hackers. Who knows how good their mechanisms are going to be, but there's potential.
So, it seems like a combination of sustaining their business by stopping clones and lawsuits, with improved branding from the effects of obfuscation & hardened ICs on the low-skilled attacks that dominate the press. Two very good reasons to make a decision in this market. It's just economics in action. :)
[+] [-] uola|9 years ago|reply
[+] [-] viraptor|9 years ago|reply
I also don't see anything that would really prevent them from just releasing the source they're using, even if we can't realistically do anything useful with it. The whole point of those systems is that it's secure via algorithms and hardware silos - releasing their sources shouldn't change anything.
But in practice it doesn't really matter that much - as long as they use standard interfaces and replace your key for free if someone finds a vulnerability, I'm (cautiously) fine with their new position. I think a big part of the issue is that they did something better before, but if they started with the current design, people wouldn't really complain about it that much.
[+] [-] rcthompson|9 years ago|reply
[+] [-] kerkeslager|9 years ago|reply
The problem is, it's not a tradeoff Yubico have to make. They can allow users to achieve the same goals by distributing the device un-flashed, with the source code to the firmware. Upon flashing, the firmware would disable further flashing. If the user doesn't like this tradeoff, the user can choose to change the code. As a courtesy to more trusting users they could provide the service of optionally flashing devices for you. And qualified users can verify the security of the firmware before loading it.
But by flashing the devices themselves, Yubico has chosen the worst of both worlds. Now an outside actor can once again add malicious firmware: Yubico is an outside actor. AND nobody can verify the security of the firmware. This isn't even a tradeoff, it's just a loss.
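kerkeslager's ship-unflashed idea is easy to model: the first image loaded sets a permanent lock bit, after which the device refuses any reflash. A toy Python sketch of that one-way transition, purely illustrative - a real device would implement the lock with a fuse or flash-protect register, not a boolean:

```python
class OneTimeFlashDevice:
    """Toy model of shipping un-flashed: the first firmware image
    loaded sets a permanent lock bit, refusing any later reflash."""

    def __init__(self):
        self.locked = False
        self.firmware = None

    def flash(self, image: bytes) -> bool:
        if self.locked:
            return False  # lock bit already set; reflash refused
        self.firmware = image
        self.locked = True  # one-way transition, like blowing a fuse
        return True
```

Under this scheme the user (or an auditor they trust) loads the published firmware themselves and gets the same no-reflash guarantee Yubico currently provides, without having to take the vendor's word for what's on the chip.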
[+] [-] H3g3m0n|9 years ago|reply
There is the possibility of the device being intercepted before it reaches you. Or before you have gotten around to locking it down. Or when you plug it into your (compromised) system to lock it down.
Since all communication is done over the USB port, the problem is that the device can be flashed with a backdoored firmware that appears to be normal/unflashed: one that still accepts flashing (by basically running a virtual machine/emulator that executes the flashed image), appears to get locked down when you go through any lockdown process (since you just end up locking down the VM), but still has the backdoor in place.
Firmware aside, people can modify the hardware too. Unless you crack open the device and inspect the internals (which many devices are designed to prevent). And even then a really sophisticated attack could replace the chips with identical looking ones. If you are using off the shelf ones then it wouldn't be that hard. They can also add an extra chip before the real one that intercepts the communication. Or maybe compromise the 'insecure' USB chip (if it's programmable).
With locked down hardware the manufacturer can bake private keys onto the chips and ensure that the official stuff checks the hardware by asking it to digitally sign something with a private key. But if the attacker has added their own chip between the USB and the legit chip, they can pass through the requests to the official chip.
A TPM will do something like keep a running hash of all the instructions that are sent to the hardware and use the resulting hash as part of the digital signature verification, but if you mirror the requests, that doesn't help.
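That running hash is essentially a TPM PCR extend: the register can only be accumulated into, never written directly, so the final value commits to the whole sequence of measurements in order. An illustrative sketch of the operation:

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM-style extend: new_value = H(old_value || H(measurement)).
    # There is no operation to set the register to a chosen value, so
    # the final value commits to every measurement and its order.
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

pcr = b"\x00" * 32  # registers start zeroed at reset
for step in [b"boot rom", b"firmware image", b"applet"]:
    pcr = pcr_extend(pcr, step)
```

Which is exactly why mirroring defeats it: an interposer that relays every request verbatim feeds the real chip the same sequence and ends up with the same hash.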
The next stage is to use the keys on the chip to encrypt all communication between the host and the 'secure' chip. So any 'pirate' chip won't get anything useful.
Users could be allowed to 'bake' their own keys in, but that leaves us with the intercepted-hardware problem. The attacker gets the hardware and installs fake firmware that appears to accept your custom key and performs the encryption.
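The pass-through attack and the encrypted-channel fix can be sketched in a few lines. Everything here is a toy stand-in (HMAC as the attestation signature, a hash-derived XOR keystream as the cipher - a real design would use an authenticated cipher), chosen just to show why relayed attestation still verifies while relayed ciphertext is useless to the interposer:

```python
import hashlib
import hmac
import os

class SecureChip:
    # Holds a secret baked in at manufacture; the vendor knows it too,
    # so the vendor's software can check responses.
    def __init__(self, secret: bytes):
        self._secret = secret

    def sign(self, challenge: bytes) -> bytes:
        return hmac.new(self._secret, challenge, hashlib.sha256).digest()

    def open_envelope(self, nonce: bytes, ciphertext: bytes) -> bytes:
        # Toy stream cipher: XOR against a keystream only the real chip
        # (and the vendor) can derive from the shared secret.
        stream = hashlib.sha256(self._secret + nonce).digest()
        return bytes(c ^ s for c, s in zip(ciphertext, stream))

class Interposer:
    # A 'pirate' chip spliced between the USB side and the real chip.
    # It records everything and relays it verbatim.
    def __init__(self, real_chip: SecureChip):
        self._chip = real_chip
        self.observed = []

    def sign(self, challenge: bytes) -> bytes:
        self.observed.append(challenge)
        return self._chip.sign(challenge)  # relayed answer still verifies

secret = os.urandom(32)
mitm = Interposer(SecureChip(secret))

# Challenge-response attestation passes straight through the interposer.
challenge = os.urandom(16)
expected = hmac.new(secret, challenge, hashlib.sha256).digest()
assert mitm.sign(challenge) == expected
```

The attestation check shows why signing alone can't detect a relay; the envelope is what the "next stage" buys - the interposer still carries the traffic, and still sees the challenges, but an encrypted command is just noise to it.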
Personally I think worrying about security to that level is overkill, even if you're dealing with quite a bit of money. It would have to be quite an organised attack. They would have to gain physical access to the device, compromise it, return it undetected, and then gain physical access again later. That requires both physical and digital security skills.
That's much more work than just stealing it or applying rubber-hose cryptanalysis. Attackers can also just compromise the system being used to access whatever the key protects.
[+] [-] eggy|9 years ago|reply
I have had my own business, and the one thing I would say to the critics of Yubico: if you have a way, given existing hardware and software tools and suppliers, to do a better job, step up and do it. AFAIK, Apple didn't open-source their hardware related to crypto, or their software.
I think you will find it takes more than wishful thinking; more like putting your money (or your time) where your mouth is. Engineers, and I don't just mean CI engineers here, know it is a long way from a math equation or set of equations to a real-world working object. I would love to see, and would contribute money to, an open-source solution. I just don't think it is as cookie-cutter simple as the majority of comments on this forum seem to intimate.
[+] [-] drazvan|9 years ago|reply
[+] [-] tptacek|9 years ago|reply
[+] [-] infinite8s|9 years ago|reply
[+] [-] dmitrygr|9 years ago|reply
All that he says is summarized in "it was too hard to think of a solution, so we didn't do it."
[+] [-] rrego|9 years ago|reply
With regard to the applet manager, that seems to be an issue of customer friction more than of being too hard. While "crypto nerds" would be fine with it, business applications could be affected.
[+] [-] skybrian|9 years ago|reply
[+] [-] sigmar|9 years ago|reply
No hardware is 100% secure and for Yubico to say this issue is about "Secure Hardware vs. Open Source" seems like a red herring. Perhaps they are just trying to protect their business model? After all, there isn't anything particularly unique about the hardware.
[+] [-] nickpsecurity|9 years ago|reply
So, it's not so simple. Otherwise, all buildings containing valuables protected by locks and such would be compromised because enemies had the potential of physical access. They aren't. That's telling you something.
[+] [-] foxhill|9 years ago|reply
this is a fundamental concept in FOSS and for anyone to try and rationalise their way out of it - be it out of some corrupted sense of trying to do the right thing - is absurd.
fortunately i feel that the very people that would be interested in this device will be aware of this; i hope the folks at yubico reverse this decision.
[+] [-] hendzen|9 years ago|reply
https://en.wikipedia.org/wiki/NSA_Suite_A_Cryptography
[+] [-] captainmuon|9 years ago|reply
My reasoning: I don't need physical tamper-resistance for my threat scenario - if it is stolen by a random thief, a coworker, a "friend", etc..
But if I was attacked by a nation-state-like actor, I cannot trust any security measure of the device. How do I know the NSA does not have a copy of every "random" card-manager key? How do I know that generated keys are not subtly biased so that they can be guessed easily? Or that there is not a secret function to extract them? Even if Yubico is 100% honest and their device is clean, I must assume that if e.g. the NSA were after me, they have the technology to extract the keys from the device, no matter what protection it has.
[+] [-] microcolonel|9 years ago|reply
[+] [-] kriro|9 years ago|reply
I hope the response from consumers will be: we understand your position. Unfortunately that is unacceptable and we'll look for another vendor. It is mine. I own a Neo, not getting any of their future products.
Also, as a strategic guideline... maybe if you're in the business of security, don't use hardware that requires NDAs. Yes, it'll make some things impossible and other things more expensive, but I'd say there's really no room for compromise.
[+] [-] ansible|9 years ago|reply
Their current industrial design very clearly says "hey, I am an important security key", which is exactly the wrong thing to do.
It should instead look like a cheap flash drive. And when the thief plugs it in, he sees exactly that, a low capacity USB flash drive, unencrypted, with some random documents on it.
Is the thief at this point going to perform some sophisticated hardware hacking? No, it will just get thrown away.
[+] [-] hlandau|9 years ago|reply
[+] [-] jwildeboer|9 years ago|reply
[+] [-] xaduha|9 years ago|reply
Personally I only tried IsoApplet, but openpgp applet should work too.
[+] [-] floatboth|9 years ago|reply