I often meet people who keep a secret identity because a former boyfriend or girlfriend threatens them. They never know about apps harvesting their contacts and connecting them to mutual friends until their ex suddenly shows up again.
This mindset of "If you haven't done anything bad you don't have to worry" is so sad and ignorant of other people's situations.
This is the problem with an app assuming it understands your web of trust. There's no way to signal a toxic, unwanted or compartmentalized connection.
It's also why I use multiple profiles on Android to force that compartmentalization: since the snoopy apps have no context, it's better to just deny them the data entirely.
One thing I hate is when apps do this to get information about your contacts: not only do they get who you personally know (plus the phone number, email, and other info in each contact), but also the contacts for businesses you might regularly interact with, meaning they can potentially build a pretty detailed web of your life.
For example, it's a dark pattern when you use Messenger, at least on iOS, for the first time: it says "Upload your contacts to find friends" with an "OK" button and a "Learn More" button, and only after you tap "Learn More" can you dismiss it. WhatsApp does something worse: you can't initiate a message with anyone without allowing contacts access, and the only way to add a contact is through the phone's contacts app (the web app definitely can't, and I'm pretty certain the desktop app doesn't have the functionality to add a contact either). I have to go through great pains to prevent WhatsApp from getting my contact info just to be able to add a new friend (turn off iCloud contacts, allow WhatsApp to see my contacts, then add the contact). Maybe I am paranoid, but I really, really want Facebook to have only the information I give it, and I don't trust WhatsApp at all. At least WhatsApp encrypts the messages.
I really hate this pattern in apps, and I wish there were some way to give them a kind of "blank canvas" without any real contacts in it. Thankfully apps don't lock you out completely if you don't provide contact info, yet, but I fear that may happen soon.
The first time I looked at the framework, I expected to be able to read existing contacts and create new ones. I was a little surprised that it also allows editing existing contacts.
Nothing really stops an app with this permission from switching all of your contacts' numbers so they point to someone else, whether by accident or on purpose. It seems like something waiting to go wrong...
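A toy model of that risk: with read/write access there is nothing structural separating "fix a duplicate" from "repoint every number". Everything here (the dict layout, the function name) is invented purely for illustration:

```python
def repoint_all(contacts, attacker_number):
    """What an app with write access could do in one pass: silently
    replace every stored phone number while leaving the names untouched,
    so calls and messages to 'Mom' actually reach someone else."""
    return [{"name": c["name"], "phone": attacker_number} for c in contacts]

book = [{"name": "Mom", "phone": "+15550100"},
        {"name": "Bank", "phone": "+15550199"}]
hijacked = repoint_all(book, "+15559999")
```

Nothing in the permission model distinguishes this bulk rewrite from a legitimate cleanup pass; both look like "the app edited your contacts".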
I'm working on an app that makes use of this in a good way (to add new contacts and remove duplicates with explicit action by the user).
While doing some research on similar apps, I didn't find a single one that gives the user the option not to upload their contacts to the app's servers. Basically, if the user doesn't grant this permission, the app is useless.
I'm hoping the app I'm building will earn sustainable revenue without needing to upload all of a user's contacts.
But in the long run, for contact-management apps, there's far more value in handling the contact data in the cloud, so that users get the maximum benefit. IMO, though, any app that isn't for contact management (e.g., Facebook) should have only limited, read-only access to contact data.
Doesn't work. We've had research on this for a long time. A small percentage of the population can maybe make judgements under a fine-grained permission model, but for the majority of people it becomes confusing and is often worse than nothing.
Yes! It's high time. We need read-only access, access to specific groups only, and read access to just a public profile. Of course, that would be the end of these "find your friends" features.
Signal uses the same process. When you first install it, it uploads all your contacts to 'see' which of them are Signal users, then defaults to Signal messages for those contacts. Any new Signal user in your contact list triggers an alert that they are now using Signal, and it helpfully defaults to Signal messaging. When you uninstall, you have to remove your number from Signal's central DB using their web app, to prevent your friends from sending you Signal messages you won't be able to open.
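Contact-discovery flows like this are usually described as the client sending (hashes of) the phone numbers in its address book and the server intersecting them with its registered users. A minimal sketch of that idea; the normalization and plain SHA-256 here are illustrative assumptions, not Signal's actual protocol (which adds truncation and private-set-intersection machinery):

```python
import hashlib

def discover(address_book, registered_users):
    """Return which of the user's contacts are registered with the service.

    Both sides compare hashes of normalized phone numbers, so raw numbers
    need not be sent; note the server still learns the hashes, which is
    why real deployments layer more protection on top.
    """
    def h(number):
        # Keep digits and the leading '+' so "+1 555-0100" and
        # "+15550100" hash identically.
        digits = "".join(c for c in number if c.isdigit() or c == "+")
        return hashlib.sha256(digits.encode()).hexdigest()

    server_set = {h(n) for n in registered_users}
    return [n for n in address_book if h(n) in server_set]

matches = discover(["+1 555-0100", "+1 555-0199"], ["+15550100"])
```

Phone numbers have so little entropy that hashing alone is closer to obfuscation than protection, which is the crux of the privacy complaint.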
An implied "find my friend" feature that I assume Sarahah uses.
Access and hijack are two different terms: one implies using the data, the other implies selling customer data. What the article says is that the app accesses contacts for a future 'find your friends' feature.
What really bothers me is when the author says "it's possible Sarahah has harvested hundreds of millions of names, phone numbers, and email addresses". I believe I remember Snapchat and other social media apps having done this before.
What Sarahah should have done is communicate with its users about what happens to their data and how it plans its security (it being an anonymous messaging platform). But let's not forget how Snapchat dealt with its security and data during its rise.
It's really sad that the iOS/Android model COULD be more secure than the traditional desktop software model, but at the same time it has normalized all sorts of creepy snooping behavior on the part of apps.
Facebook Messenger does this as well, yet most people aren't aware of it, nor do they consent to it. There should definitely be much finer-grained controls over this sort of thing. Imagine how violated people would feel if someone had to photograph their address book (back when address books were physical) just to enter the mall; yet this happens every day and most people are unaware.
Kind of an r/ShowerThoughts idea: maybe we should make an app, voluntarily fill our Contacts app with specific human-like junk values, and then sync all the major apps with it, peppering their gold pots with waste contacts; by the time they catch on, the damage is already done. Rinse and repeat with slightly different values.
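A sketch of what generating that junk could look like, assuming we just want plausible-looking entries that can be regenerated differently each pass (the name lists, number format, and domain are all made up for illustration):

```python
import random
import string

def junk_contacts(n, seed=0):
    """Generate n plausible-looking fake contacts to feed to harvesters."""
    rng = random.Random(seed)  # deterministic per batch; change the seed to rotate
    first = ["Alex", "Sam", "Jordan", "Taylor", "Casey", "Riley"]
    last = ["Smith", "Jones", "Lee", "Patel", "Garcia", "Kim"]
    contacts = []
    for _ in range(n):
        name = f"{rng.choice(first)} {rng.choice(last)}"
        phone = "+1555" + "".join(rng.choice(string.digits) for _ in range(7))
        email = name.lower().replace(" ", ".") + "@example.com"
        contacts.append({"name": name, "phone": phone, "email": email})
    return contacts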
The problem is that the big companies are not interested in phone numbers and names. The real gold is the connections: who you talk to most frequently, who your real friends are vs. random people you added, etc.
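To see why the connections are the gold: once many users upload their address books, the service can intersect them and infer ties even between people who never installed the app. A toy illustration (the data shapes and the co-occurrence scoring are assumptions, not any company's known pipeline):

```python
from collections import Counter
from itertools import combinations

def mutual_ties(uploads):
    """uploads: {uploader: set of phone numbers in their address book}.

    Two numbers that appear together in many third-party address books
    are probably connected to each other, even if neither person ever
    uploaded anything themselves.
    """
    co_occurrence = Counter()
    for book in uploads.values():
        for a, b in combinations(sorted(book), 2):
            co_occurrence[(a, b)] += 1
    return co_occurrence

uploads = {
    "uploader1": {"+111", "+222", "+333"},
    "uploader2": {"+111", "+222"},
}
ties = mutual_ties(uploads)
```

Here "+111" and "+222" co-occur in two separate address books, a stronger signal than either pair involving "+333"; scale that to hundreds of millions of uploads and the graph reconstructs itself.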
From the FAQ: "Is Sarahah a hacker?! Sarahah doesn't steal data, but websites and apps impersonating Sarahah could do that." Just weird. Why would a 'personal suggestion box' app need your contacts anyway, apart from the developer's own dastardly plans? It seems they are sent in the clear as well. I think this upsets me the most. How did they expect to get away with it?
One of the iOS apps I built needed the contacts permission, but the contacts are used only within the app and never sent to my server. When I submitted it for review, Apple didn't approve the app right away; they asked why I was accessing the contacts and whether I was sending them to a server. Only if they're happy with the answer will they approve your app.
So in a way, iOS apps are much more secure than Android apps.
I simply assume they are going to upload my contact data whenever they prompt me for the permission. I'm not sure how iOS does permissions these days; do they also have runtime permissions like Android? I know that a lot of companies here in India skip the runtime-permission system entirely and ask for everything at install time, to handle fewer edge cases in the code and make the user less likely to get suspicious.
iOS has had runtime permissions for far longer than Android :)
Some apps will even explain what they (claim to) need the permission for before triggering the permission authorisation pop up.
This is why you should always read the terms and conditions of an app. I'm always shocked at how many people do not take this seriously. In this age of "data sharing" it is super important to take extra caution.
That's not a solution. T&Cs are written as vaguely as possible to permit as much flexibility as possible, and nothing stops a malicious actor from violating their public privacy claims.
The further into the future we get, the more I start to believe the inverse to be true in general, not only in the security world. For some reason, in the security world a neat extra feature that breaks security is treated as a possible intentional backdoor.
Without treating mistakes, incompetence, and outright stupidity as malicious, we grow a new generation of technologists (including me, sadly) who don't bother to become adept in the problem domain, consult experts, or perform strict analysis, and who cause tons of technical debt for the sake of moving fast. "Good enough" is the norm. We still see leaked password databases that are plain text, hashed without a salt, or hashed with the same salt for every user. Maybe it was incompetence. Maybe it was stupidity. Maybe it was a FIXME. Maybe the loaded gun was left on the dinner table near a toddler because "I'm only going to the bathroom". It's a rhetorical question: can this stupidity be seen as malicious?
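For the password-database point, the fix has been textbook for decades: a unique random salt per user fed into a slow key-derivation function. A minimal sketch using only the Python standard library (the iteration count is an illustrative choice; pick per current guidance):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Salted, iterated hash: a unique salt per user defeats rainbow
    tables and makes identical passwords produce different digests."""
    salt = os.urandom(16) if salt is None else salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    # Constant-time comparison avoids leaking how many bytes matched.
    return hmac.compare_digest(hash_password(password, salt)[1], digest)
```

Plain text, unsalted hashes, and a single shared salt all fail precisely because two users with the same password end up with the same stored value; the per-user salt is what breaks that.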
This is a really good example of cruel, unjustified victim blaming, and of being "sad and ignorant of others' situations". We are all in awe of your high horse of hindsight.
Here's what an app on iOS can read and modify (!) when you allow it to access your contacts: https://developer.apple.com/documentation/contacts/contacts_...
It's kinda like bringing a bazooka to a knife fight, but this will get the job done nonetheless.
And then Facebook acquired it and was able to complete its database of world's relationships.