Users broadly have no way of analyzing security and privacy. Even most software developers don't have the time or expertise to reverse engineer and analyze even one app, let alone every release of every app they use. They just have to take it on trust. For things like cars we have mandatory standards for vehicle design, crash testing requirements, fuel efficiency and emissions standards, etc., to try to ensure that people can expect a certain level of performance and safety. For software there's nothing like that.
I think users have no idea what's involved in software development, but they expect any company to take care of the important things in order to bring a product to market, and that includes security and privacy. The "users don't care" argument, I believe, is a cop-out to make ourselves feel good.
I believe users won't care until it's common for there to be repercussions from not caring. If not caring meant the odds of having your savings account stolen were 20% a year, people would start caring. As it is, there are few directly attributable repercussions, just vague worries about future problems.
As it is, what are the odds of any one user running into issues because of a lack of security and privacy? They seem fairly low.
It seems like an opportunity for Apple (assuming they're actually better) to run some scare ads: "99% of people scammed via their computer were running Windows/Android." If that's true, a good ad campaign could get people to care.
For internet stuff, how about "98% of the people who got their bank accounts hacked were hacked via leaks of data on Facebook"? (Probably not true and not provable.)
Maybe we need security insurers who audit software and only insure customers that use certified software. They'd have an incentive for their audits to be good because they pay out if it turns out the software is not secure. And if their market were big enough, software creators would want to be certified.
It's even possible some standard sandboxes could help make certification easy. Add this sandbox to your app and you're certified? Maybe apps on OSes that already have sandboxes would automatically get certified, but server-side you'd need audits.
> The "users don't care" argument, I believe, is a cop-out to make ourselves feel good.
Strongly disagree. It's just the simple reality that most users don't care about security. The vast majority of potential consumers in the world don't choose digital products based on security. I always see this security angle touted on Hacker News, but I'm quite frankly shocked that people here don't have the self-awareness to realize that we live in an uber-tech geek's echo chamber.
Have you ever met an "average" Facebook user? They really, truly, do not understand or care about security. I'm very confident that even if you sat one down and walked them through all of the implications of what poor security even means, they would walk away and not change their behavior whatsoever.
> but that they expect any company to take care of the important things in order to bring a product to market.
They show that expectation by withholding their money or not using the product when something bad happens. I don't see them doing much of that when companies have data breaches.
Agreed. There's just no way consumers can be expected to usefully verify things like privacy. In the real world, "it should just work" expectations get baked into things like building codes, the Uniform Commercial Code, commoditization rules, and food safety standards.
If we don't come up with something like that as an industry, eventually somebody else is going to do it for us. And we won't like that one bit.
There already are companies that advertise privacy as an edge, but either a product is sexy enough for consumers or it isn't, and privacy doesn't seem to tip the scales. DuckDuckGo isn't sexy. Firefox isn't sexy. Linux-based desktops aren't sexy.
If these things caught the sparkle in customers' eyes, we'd all know it by now.
I would agree with you, except the number of people still using Facebook after Cambridge Analytica kind of proves that people don't actually care, even the ones who say they do.
Users do care, but they tend to guess the quality of the entire product, including security, from the slick look of the product or from marketing pages with fancy words. In reality, all of these are hard to verify even for tech-savvy users.
Amazon, MS, and Google should do a better job of making it hard to leave things unprotected. They are no longer new services and no longer have the excuse of needing to avoid friction.
I think users care, given the outrage when leaks or “creepy” ads occur. But the problem is that users have an impossible time reasonably evaluating apps for security and privacy, so there’s very little market incentive for app makers to make secure and private apps.
taneq|5 years ago
kds3|5 years ago
[deleted]
ssss11|5 years ago
asiachick|5 years ago
Just throwing out ideas.
vecter|5 years ago
MattGaiser|5 years ago
wpietri|5 years ago
threatofrain|5 years ago
bisby|5 years ago
harikb|5 years ago
ashtonkem|5 years ago