This doesn't say anything. They invested in a tokenization company. That's not a new or interesting technology. What am I missing?
There are interesting data security companies emerging right now. For instance, Matthew Green is doing Zeutro, an ABE company. Think of ABE as Shamir's Secret Sharing on steroids: you can encrypt data and delegate access to different people based on boolean expressions. That at least addresses a fundamental problem in data center encryption (the fact that server-side data encryption is "all or none" with respect to applications).
This, though? I assume the announcement means VGS is doing great in the market. Congratulations, I guess?
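For readers unfamiliar with the baseline being compared against: Shamir's Secret Sharing splits a secret into n shares so that any k of them reconstruct it. A toy pure-Python sketch (illustrative only; ABE layers policy expressions on top of ideas like this, and real systems use vetted libraries):

```python
import secrets

P = 2**127 - 1  # Mersenne prime used as the field modulus

def split(secret: int, n: int, k: int):
    """Split `secret` into n shares; any k reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]

    def f(x):
        # Evaluate the degree-(k-1) polynomial via Horner's rule.
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % P
        return acc

    return [(x, f(x)) for x in range(1, n + 1)]

def combine(shares):
    """Lagrange-interpolate the polynomial at x = 0 to recover the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any k of the n shares work; fewer than k reveal nothing about the secret.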
Tokenization is just one part of the solution, and you're correct that tokenization providers are plentiful.
VGS also handles compliance and audits, assumes liability, takes custodianship of the data, and provides a convention-over-configuration approach (most tokenization products are configuration-heavy) that makes integration simple.
If you're looking for someone to offload that work and get you compliant quickly, without getting mired in the world of compliance yourself, it's a solid offering.
Exactly. I don't see anything new or differentiating here except that A16Z has a louder microphone and is using HN as a stage to announce it. Besides also apparently reveling in the fallacy that if experts are involved, it can't be hacked.
Your comment reminds me of the commentary on Drew Houston's announcement of a product called Dropbox (you might have heard of it...) here on HN 10 years ago. (1)
"This is nothing new", "I could build this myself", "you have to install something, nobody does that", etc. etc.
The key to a successful company is not being first, not (only) having a great technical solution, and not having tech no one else does.
The key is an umbrella of technology, business sense, marketing ability, salesmanship and much more. Andreessen Horowitz probably see a whole umbrella, and not only the tech.
From a quick skim through their FAQ it wasn't clear that they are even encrypting the data on their servers, and I didn't notice any claim about users controlling access to their private information.
It's actually way cooler than that. Besides all the compliance help they provide, they offer tokenization plus software that makes the tokenization work with zero code changes on your side (besides env changes), via the use of proxies. It's awesome: your system won't know the difference between the tokenized SSNs and the real ones.
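A toy sketch of the idea (not VGS's actual implementation; the names are made up): an aliasing layer swaps real values for random tokens on the way into your system and restores them on the way out, so downstream code never sees the difference:

```python
import secrets

class AliasVault:
    """Toy stand-in for an aliasing proxy: swaps sensitive values for
    random tokens on ingress, restores them on egress."""

    def __init__(self):
        self._forward = {}   # real value -> alias
        self._reverse = {}   # alias -> real value

    def redact(self, value: str) -> str:
        # Issue a stable random alias for each distinct value.
        if value not in self._forward:
            alias = "tok_" + secrets.token_hex(8)
            self._forward[value] = alias
            self._reverse[alias] = value
        return self._forward[value]

    def reveal(self, alias: str) -> str:
        return self._reverse[alias]
```

In the real product this substitution happens in a proxy sitting in front of your app, which is why no application code changes are needed.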
EnvKey[1] takes a somewhat similar approach to securing credentials/config in that we effectively replace your config with a short token that can be set as an environment variable. This then 'expands' into your full configuration when it's needed.
But the crucial difference is that instead of storing sensitive data in plaintext ourselves and then sending out access tokens, we manage an OpenPGP PKI/web-of-trust for you behind the scenes so that we're only storing encrypted data, and only the token (which we never see in its entirety) can decrypt it.
End-to-end encryption is much harder to implement for these kinds of use cases than simple tokenization, but there's also the huge benefit of not needing to trust your storage layer.
With credit cards, for example, an approach like this could hypothetically remove PCI-compliance as an issue entirely because no one is actually storing the cc # in the clear. To me this is a lot more interesting than simply shifting the burden of trust. That said, anything is better than our current status quo of spraying secrets all over the place.
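The shape of that design can be sketched with a toy (this is not EnvKey's actual protocol, which uses OpenPGP; the XOR keystream here is an insecure stand-in purely to show the structure): the server indexes ciphertext by the token's id half and never holds the half that decrypts it.

```python
import hashlib
import secrets

def _keystream(key: bytes, n: int) -> bytes:
    # Toy stream cipher: SHA-256 in counter mode. Illustrative only.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def make_token() -> str:
    # Token = an id the server sees + a secret the server never sees.
    return secrets.token_hex(8) + "-" + secrets.token_hex(16)

def encrypt_config(token: str, config: bytes):
    key_id, secret = token.split("-")
    ks = _keystream(bytes.fromhex(secret), len(config))
    ct = bytes(a ^ b for a, b in zip(config, ks))
    return key_id, ct   # the server stores only (key_id, ciphertext)

def decrypt_config(token: str, ct: bytes) -> bytes:
    _, secret = token.split("-")
    ks = _keystream(bytes.fromhex(secret), len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))
```

The point of the split token: a breach of the storage layer yields ciphertext plus key ids, never the decrypting half.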
Huge fan of EnvKey. A perfect example of security + usability done right -- they make it more convenient to do the right thing in terms of managing sensitive environment variables.
Different niche than VGS, which, again, is taking a novel approach to securing sensitive information. You can tell that their founders have had real-world experience; using a proxy to mask and reveal sensitive information is a novel solution.
Looks cool! I can very much appreciate progress in this space. I haven’t been able to find this info by skimming the website: while I understand that the user ultimately holds the keys that can decrypt the secret, how do you prevent this key from becoming the weakest link? Assuming the worst, users could just store their key where they used to store their secrets before EnvKey (like app environment)?
Something I appreciate very much about running in the cloud is being able to use the control plane’s APIs to authenticate requesters (e.g. Kubernetes API + Service Accounts or AWS IAM + Instance Roles). Does EnvKey have anything in the way of that?
Regarding PCI compliance: if card data is encrypted, the scope of compliance simply moves over to the keys :-)
I interviewed and was offered a job at this company. I turned it down because they had some of the most morally bankrupt leadership I have ever seen in a startup. Frankly, it made me less likely to interview with YC companies at all.
Just a quick list of giant red flags-
1. They are violating visa laws by having their employees in Ukraine lie on their applications and say they are coming into the US for tourism instead of business.
2. The Ukrainian developers they get out here are kept on their Ukrainian salary, with a small stipend for housing. So they get to live in the Bay Area on an Eastern European salary.
3. Their CEO actually bragged to me about how little they were paying the only female developer they had in the office. He thought it was hilarious.
4. When they made an offer they refused to tell me how many shares had been issued for the company or what percentage the offer included, making their offer completely impossible to decipher. It was also about 15% lower than the numbers they had discussed with me beforehand.
If I was an investor in this company I would demand the removal of the CEO and put their CTO in charge.
Whether they actually violate visa laws really depends on the particular circumstances. Getting paid Ukrainian salaries actually makes it more likely that what they're doing is completely fine from USCIS's point of view. Employees of multinationals such as Google, Microsoft, Facebook, Intel etc. who live and work outside of the US routinely travel to the US on B2 visas for various work purposes. It really depends on how long you are planning to stay, what exactly you are planning to do, etc.
What kind of software engineer in their right mind would want to stay in the US illegally (I don't think one can get any long-term tourist visa?) _and_ get paid peanuts? Even if they really want to live in the US, being poor sounds like a very strange sacrifice.
Unless one's a junior developer (where I hear it's hard to compete these days), as far as I know there are a lot of realistic options to find a legal immigration avenue to a first-world country (probably not the US, though) - so why do stupid things like that?
I can see why this is an attractive idea to fund, but in my opinion it's the wrong way to resolve the problems highlighted in the article.
This is not a technical problem, it's a usability problem. We have had the cryptography necessary to technically fix this for a long time. Replace the single human-memorable token (SSN) with a unique public/private key pair. Then you provide safe authentication by signing verification messages with your private key without placing that private key into the hands of a centralized vendor (like Very Good Security).
The obstacle to this solution is 1) buy-in, to either get the government to do this or to bypass it with this solution in private industry, and 2) usability, to abstract as much of the technical signing process away from the user as possible. But this is a better solution. From what I can understand of Very Good Security's website, it's just more of the same. It wants to become the secure gatekeeper of sensitive data instead of developing a novel means of obviating that problem entirely.
The real company to fund is one which takes inspiration from an existing cryptographic protocol - like ApplePay's or AndroidPay's - and expands it to handle identity verification and one-time payment authorization without requiring an SSN or canonical credit card.
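The sign-and-verify flow described above can be sketched with a toy Schnorr signature (deliberately tiny, insecure parameters chosen only so the math is visible; a real system would use Ed25519 or similar):

```python
import hashlib
import secrets

# Toy group: p = 2q + 1 with q prime; g = 4 generates the order-q subgroup.
P, Q, G = 2039, 1019, 4

def keygen():
    x = secrets.randbelow(Q - 1) + 1      # private key
    return x, pow(G, x, P)                # (private, public)

def _challenge(r: int, msg: bytes) -> int:
    h = hashlib.sha256(r.to_bytes(4, "big") + msg).digest()
    return int.from_bytes(h, "big") % Q

def sign(x: int, msg: bytes):
    k = secrets.randbelow(Q - 1) + 1      # per-signature nonce
    r = pow(G, k, P)
    e = _challenge(r, msg)
    s = (k + x * e) % Q
    return e, s

def verify(y: int, msg: bytes, sig) -> bool:
    e, s = sig
    # Recompute r = g^s * y^(-e); matches g^k iff the signature is valid.
    r = pow(G, s, P) * pow(y, (Q - e) % Q, P) % P
    return _challenge(r, msg) == e
```

The user proves control of the private key by signing a challenge; the verifier only ever sees the public key, so there is no central vault of reusable secrets to breach.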
I think it is an example of great preconditions for starting a company, even though it will be very challenging to make it work well. Basically, our technology is advanced enough to do this, but it is so complicated that the percentage of people who can use it rounds to 0%. I see it as similar to the situation with Dropbox when it was started, where it was possible to accomplish the same thing yourself -- if you had expert-level ability in that specific area.
Observing how people get along with cryptocurrency wallet software, key management is a hurdle that many will fail to clear.
> Replace the single human-memorable token (SSN) with a unique public/private key pair
There are governments working on solutions to give each citizen a certificate. What I would love to see is the possibility to issue your own sub-identities that only expose as much information as you want/need to share for that specific use case. E.g. if you need to make $20k/yr for a new mobile phone plan, you can issue an identity that makes $20k/yr as long as you make at least that.
Not to knock your answer, I agree with you that this is a usability problem, but also to be fair, there’s a lot more to the technical side of this than the use of public/private key crypto as you described.
We offer a variety of format-preserving aliasing algorithms. Legacy systems tend to choose SSN-format aliases only when they have fixed-width columns in their RDBMS that are difficult to change (imagine petabytes of data).
The idea behind format-preserving aliases is based on the NIST SP 800-38G standard[1]. We use FF1 and are actively engaging with the world's leading cryptographers (see https://cryptoonline.com/publications/).
Happy to share more in detail if there's interest. Please email me: mahmoud @ ${COMPANY_NAME}.com
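FF1 is an AES-based Feistel construction with cycle walking; the same shape can be sketched in miniature (HMAC as the round function, a 30-bit domain walked down to 9 digits; a toy, not the NIST algorithm):

```python
import hashlib
import hmac

HALF = 15                      # two 15-bit Feistel halves: 2**30 > 10**9
MASK = (1 << HALF) - 1
DOMAIN = 10 ** 9               # 9-digit values, e.g. SSN-shaped aliases

def _round(key: bytes, i: int, half: int) -> int:
    digest = hmac.new(key, bytes([i]) + half.to_bytes(2, "big"),
                      hashlib.sha256).digest()
    return int.from_bytes(digest[:2], "big") & MASK

def _feistel(key: bytes, x: int, decrypt: bool = False, rounds: int = 10) -> int:
    l, r = x >> HALF, x & MASK
    for i in (range(rounds - 1, -1, -1) if decrypt else range(rounds)):
        if decrypt:
            l, r = r ^ _round(key, i, l), l
        else:
            l, r = r, l ^ _round(key, i, r)
    return (l << HALF) | r

def fpe_encrypt(key: bytes, n: int) -> int:
    # Cycle-walk: re-apply the permutation until the result is 9 digits.
    n = _feistel(key, n)
    while n >= DOMAIN:
        n = _feistel(key, n)
    return n

def fpe_decrypt(key: bytes, n: int) -> int:
    n = _feistel(key, n, decrypt=True)
    while n >= DOMAIN:
        n = _feistel(key, n, decrypt=True)
    return n
```

Because it's a keyed permutation of the 9-digit space, the alias round-trips, which is what lets it drop into a fixed-width SSN column unchanged.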
The article says the token "maps" to the SSN, and since they want to give different tokens to different vendors using VGS, I'd assume they're either wholly random tokens associated in a database somewhere, or that some other factor of randomness is added in.
But the issue I see is that there still has to be a point where the user hands, say, their SSN to a website so that it can request the token associated with it, and that hand-off is a big risk point. Because they need to identify themselves in a way that identifies the correct VGS account to talk to?
I mean, I think really you'd be better off doing a private/public key thing, where you have some sort of device that gives a sub-key of your master identity key to the vendor?
Exactly, it seems way too complex. I don't know why my insurance company can't give me a 9-digit number that is HASH(SSN + member_id) and tell me to use that instead of my SSN.
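One wrinkle with a bare HASH(SSN + member_id): the SSN space is small enough to brute-force, so a keyed hash (HMAC, with the key held by the issuer) is needed. A minimal sketch, with hypothetical field names:

```python
import hashlib
import hmac

def member_alias(key: bytes, ssn: str, member_id: str) -> str:
    """Derive a stable 9-digit alias from SSN + member id.
    Keyed so an attacker can't brute-force the small SSN space."""
    msg = f"{ssn}|{member_id}".encode()
    digest = hmac.new(key, msg, hashlib.sha256).digest()
    return str(int.from_bytes(digest, "big") % 10**9).zfill(9)
```

Unlike format-preserving encryption this is one-way, so the issuer would keep a lookup table if it ever needs the real SSN back.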
I find it interesting that Stefan Brands [1] solved the zero-knowledge authentication problem a couple of decades ago and his tools are still not widely applied. Given my bias against imaginary property, I believe that's because his patents on them are still valid -- and apparently owned by Microsoft at the moment [2].
> Stefan Brands [1] solved the zero-knowledge authentication problem a couple of decades ago
Reference? If you want people to take you seriously when you claim that someone's solution to a problem has been overlooked you have to provide a link to the (alleged) solution, not just to the author's biography.
Your question has two specific parts that I want to address:
1) Single Point of Failure
2) Larger target for malicious actors
Regarding point #1:
- We have invested a significant amount of resources in making our product as stateless as possible, and our core product can live on different cloud providers' edge networks.
- We conduct failover tests every 2 weeks to ensure we have the capability to respond to any blips in downtime. Our SOC 2 Type 2 report is available to discuss the availability and disaster recovery items in detail.
- As a side note: we also address the "vendor is down" problem -- for example, we have customers who seamlessly switch between providers (say, for credit score checks) when one of them is down, without the liability of storing that data themselves.
Regarding point #2:
- This is our core focus. We take on the liability. The idea is that because this is our core focus, we can do it better than a lot of folks out there.
- We also broker access to different Fortune 500 institutions, which visit our offices and constantly pen-test us, audit us, etc.
I think it's important to acknowledge that for developers security is always important, but never prioritized until it's urgent. We are trying to change that @ VGS.
Please email me directly; I'm happy to chat further: mahmoud @ ${COMPANY_NAME}.com
Agreed. Imagine being able to look at all of the data they have, and then finding a dump of it.
Idea time: cryptographically store this data on physical cards that can fit into a wallet, be managed by the user, and be 'revoked' if they lose the card. Obviously things like backup and storage will still need to be handled, but that does not necessarily need to be reachable via an API, or on the internet at all, after it has initially been created.
I spent two minutes on this idea, be nice :-)
This type of thing is better off delivered as an SDK rather than a 3rd-party API. Sending sensitive data to VGS for encryption would be a non-starter for many companies; the probability of data getting stolen is the same for VGS as for anyone else.
Why don't we already have apps on our smartphones for this?
- $PROVIDER wants the following data: $LIST_OF_OPTIONAL_AND_REQUIRED_ITEMS
- You select which you can provide
- If the data to be provided includes "billing identifier" or "credit file identifier" (and especially if the identifier is, say, SSN), then first your app obtains a new identifier from the reporting agency or your insurance carrier, and *that* number is given to $PROVIDER
Gives more control back to the customer/patient and eliminates (yet another) treasure trove of data for attackers to go after.
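The flow above can be sketched as a toy model (all names hypothetical):

```python
import secrets

class IdentityApp:
    """Toy model of a consent-based data-release app: the provider asks
    for fields, the user approves a subset, and any billing identifier
    is freshly minted per provider instead of handing over the SSN."""

    def __init__(self, profile: dict):
        self.profile = profile
        self.issued = {}            # provider -> issued identifier

    def respond(self, provider: str, requested: list, approved: list) -> dict:
        release = {}
        for field in requested:
            if field not in approved:
                continue            # user declined this field
            if field == "billing_identifier":
                # Mint a provider-specific identifier; never release the SSN.
                self.issued.setdefault(provider, "id_" + secrets.token_hex(8))
                release[field] = self.issued[provider]
            else:
                release[field] = self.profile[field]
        return release
```

Each provider ends up holding a distinct identifier, so a breach at one of them exposes nothing reusable anywhere else.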
It's mind-boggling to me that this didn't already exist. I wonder if that's because there's a lot of low hanging fruit in security/privacy, low hanging fruit in healthcare, or a combination of the two.
Yeah, this tokenization stuff is already used all over the place (for example, Stripe does it for you with credit cards, giving you automatic PCI compliance).
I think the innovation here is that instead of being part of carrying out the rest of their business, tokenizing and keeping the real info safe is the whole product here. That seems smart to me.
The dumb part, of course, is that we have these bearer tokens (SSN and CC numbers) in the first place, without constantly rotating them. There's some amount of rotation with CC numbers when the company detects fraud and sends you a new card. But for SSN, it's unconscionable that they're both the username and password.
The biggest problem with this solution is trusting VGS's capability to secure our sensitive data, which is the hardest thing to do in the first place. All it takes is a disgruntled employee (given the company's practices cited in the posts below) siphoning out the data. They are far too tiny to pay for the damage, rendering their liability claim useless. However, given the trivial nature of this problem and the interest in it, I decided to open-source a solution which avoids the liability concern.
If an organization is deciding between interacting with VGS hashes/tokens (and having to proxy requests) or deploying a secret store like HashiCorp Vault, what are the pros/cons?
> When it’s time to bill your insurance company, their “reimbursement” code goes through VGS which “reveals” the token and sends the real version to the insurance company.
Forgive me if I am wrong, but that means all 3rd-party integrations that require the sensitive values must be implemented by VGS, correct?
Of course the 3rd party doesn't need VGS: you send tokenized data through the forward proxy, and at the other end they receive the real data. That is the usability win.
The biggest pros of VGS:
1) Tokenization/detokenization through their proxies does not require any code changes; you don't need to change your architecture (which you would if you decided to add a secret store like HashiCorp Vault).
2) Compliance: VGS provides PCI, EI3PA, SOC 2, HIPAA, and GDPR coverage.
It's a nice reference, but it seems likely to invite jokes after their first security hole. "Pretty Good" is modest; "Very Good" can be seen as arrogant, unless you read it as ironic. But no worse than "Best Buy", I suppose.
I'm not quite sure what this is exactly, but it sounds like they are providing a security "service", so all the "real identifiers" will be stored on their servers?
Why should an entire country trust them? I'm not saying they wouldn't be an improvement over Equifax, but it still sounds far from ideal. I think a hardware token would be preferable.
(1) https://news.ycombinator.com/item?id=8863
1 - https://www.envkey.com
SahAssar|7 years ago
I get that you are not storing it in the clear, but what if I actually have to use it?
JohnJamesRambo|7 years ago
What do you think of Vinny Lingham's company that is aiming to do something similar?
Edmond|7 years ago
https://www.cipheredtrust.com/doc/
CiPHPerCoder|7 years ago
If you need to revoke it, you can do so since it's not cryptographically tied to anything.
Failing that, a base32-encoded random string (without = padding) with an optional checksum would do the trick.
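That suggestion is easy to implement with the standard library; a sketch with a one-character checksum appended (the format is my own invention, purely illustrative):

```python
import base64
import secrets

B32 = "ABCDEFGHIJKLMNOPQRSTUVWXYZ234567"   # RFC 4648 base32 alphabet

def make_token(nbytes: int = 10) -> str:
    raw = secrets.token_bytes(nbytes)
    body = base64.b32encode(raw).decode("ascii").rstrip("=")
    check = B32[sum(raw) % 32]              # cheap typo check, not a MAC
    return body + check

def check_token(token: str) -> bool:
    body, check = token[:-1], token[-1]
    # Pad back to a multiple of 8 characters for decoding.
    raw = base64.b32decode(body + "=" * (-len(body) % 8))
    return B32[sum(raw) % 32] == check
```

Because the token is random rather than derived from the SSN, revoking it leaks nothing and a replacement can be minted instantly.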
[1] https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.S...
[1] https://en.wikipedia.org/wiki/Stefan_Brands [2] http://financialcryptography.com/mt/archives/001011.html
mahmoudimus|7 years ago
We are actually evaluating more recent advances in zero-knowledge systems. Stay tuned for more news on that front soon :)
segmondy|7 years ago
https://en.wikipedia.org/wiki/Tokenization_(data_security)