> "Due to a technical error, we recommend you review the audience of your recent posts. Learn more."
What does that even mean? What possible action can a user take to "undo" any damage? You can't really make the people who saw your posts "unsee" them.
Also, AFAIK, Facebook doesn't show users stats on which public posts were accessed by third parties like advertisers or random drive-by Facebook users, so how is this statement in any way useful to the user?
“Audience setting” is internal speak for what you would call the visibility of a post. “review the audience” is unambiguous internal speak for: change the visibility setting.
I’m fairly certain that particular bug went through an emergency process and that the handful of copywriters didn’t get to review the notice. They are rather ruthless in picking those. I don’t remember the exact grammar rule for talking about the visibility of a post, but I remember it was very specific, precisely because of the distinction between who can see your posts and who has seen them.
It doesn't undo the damage already done, you are correct. It merely lets you control how many more people see it. That way, when you apply for a job 6 months from now the folks won't see the stuff you wanted to share with a limited number of people on your friends list.
Of note here, which has always been true, but worth remembering: Facebook can change the visibility of your posts however it likes, whenever it likes. The only thing stopping them is their own ethics. (And possibly a big fine in Europe, but that will be cold comfort to anyone whose personal info is exposed thus.)
Not that it _did_, but it certainly could, and will do so if it thinks it has good enough reason.
I wouldn't count on the EU here. Public, state-owned EU broadcasters still naively and indefensibly maintain Facebook presences as the only way to get in touch. I mean, these are supposed to be press professionals, yet they still don't seem to understand the name of the game.
No: Facebook can change the algo that picks the default setting of the visibility of your future posts.
Software changes are rather easy at Facebook (although that particular tool would surely have a ton of people monitoring it, including quantitative social scientists who don’t monitor a lot of pages). Changing the graph (stored information) is pretty much impossible without a lot of checks. That distinction is key to understanding control at Facebook.
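A hypothetical sketch of that distinction (none of these names are Facebook's actual code): the default-audience picker is ordinary software, so changing it instantly affects every *future* post, while the audience already stored on existing posts (the graph) stays untouched.

```python
def pick_default_audience(user_settings):
    """Return the audience preselected in the composer for a new post."""
    # A bug or change in this fallback flips the default for all new posts.
    return user_settings.get("last_used_audience", "friends")

def create_post(user_settings, text, audience=None):
    """Store a post; the audience is frozen into the record at write time."""
    chosen = audience if audience is not None else pick_default_audience(user_settings)
    return {"text": text, "audience": chosen}
```

Fixing the picker stops the bleeding immediately, but every already-written record keeps whatever audience was frozen in at post time, which is why undoing the damage requires touching stored data.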
Facebook would change people's settings on purpose during updates back in 2009. It's the reason I quit the site.
"A Facebook spokesperson said the notification is the start of new proactive and transparent way for the company to handle issues going forward"
Facebook deserves everything they have coming. IMO, they also need to rethink their PR strategy. I would have given them a second chance and tried the site, but every excuse is insulting. Wow.
> Facebook would change people's settings on purpose during updates back in 2009.
Man, has it been that long? Without reading the article first, I expected the first comment to be "needs [2010] in the title", because I, too, dropped FB the day I found out that all pictures were now public (and, IIRC, "by design"). I subsequently figured it was an old story about that.
The notification is so misleading: "X, we recently discovered a technical error between May 18 and 27 that automatically suggested a public audience when you were creating posts"
My problem is with "suggested a public audience": it makes the bug sound so minor, and really your fault for going along with the "suggestion".
This is not the first time this has happened, and it won't be the last. Facebook is too big and way off the path for anyone's good, IMO.
As someone who uses Facebook in a limited manner for a specific topic, I set my audience to public a long time ago. If it's something I can't say in public (on Facebook), I don't trust or allow that information to be on Facebook. Period. There are other platforms for exchanging information that Facebook just cannot be trusted to handle correctly.
As the saying goes, "Fool me once, shame on you. Fool me twice, shame on me."
So, a history of bugs that increase permissiveness. Were there ever any bugs that set things more restrictively? Or are these "bugs" a reflection of their priorities, i.e. willful neglect?
They had plenty of annoying popups saying "Oh you're posting publicly; did you know you can restrict the audience? Maybe you want to post to your friends only?". Over the years, I believe they were also tightening up the default visibility settings for new accounts.
This is not the first time, obviously. I remember vividly how, sometime in 2008 or 2009, they made all pictures in the Profile Pictures folder public. That's when I realized they can't be trusted at all with protecting information.
While "move fast and break things" sounds great and works well in a lot of cases, this is the huge problem with it. There are real consequences. And with GDPR and other privacy laws, this will start equating to real money lost. I expect that motto to be retired much the same way Google got rid of "Don't be evil".
Who knew that startup advice for a three-person shop does not apply to big corporations handling a big chunk of the world's population? That's why it's ludicrous to make fun of established companies for moving slowly: they might be slow, but they try to minimize all the risk while getting the same gains.
Facebook's faux pas seem to be coming thick and fast, yet they make no dent in its apparent popularity. What sort of beast is this that it can't be slain by scandal after scandal?! I mean, much of the media hate it, and they jump on any chance to berate it. I personally hate it, as do a lot of other "geeky people" I know. Yet still everyone wants to use it, all the time.
It's like no one actually cares about privacy. Or what Facebook provides in exchange for privacy is somehow worth it? To me, the value proposition of Facebook simply doesn't add up... I guess much of the world strongly disagrees.
A big problem seems to be that the mainstream audience can't really tell these scandals apart. People often don't understand the details, or don't have a complete picture of any single scandal, so when another scandal is reported a few weeks later, they assume it's still the same issue and just rate the first scandal as somewhat worse, instead of registering a second scandal that makes the whole thing roughly twice as bad. As a result, mainstream media also doesn't report on it as much.
A particularly strong example of this is Meltdown and Spectre. The news around those came on public TV where I live.
Now we've discovered more and worse vulnerabilities, and even as someone who frequently reads tech news, I pretty much had to go out of my way to get information about them.
I've asked people that question and it seems the trade off is that feeling of connection. The validation that comes from sharing moments and the happy feelings it brings you when you can brag about something. In exchange for this, people are relatively lax about their privacy. I'm sure there's also a bunch of people that want to get famous so they will absorb all the followers, likes, shares and comments they can get.
I once overheard two girls talking about guys they liked at school and I shit you not, they were rating them based on how many Instagram followers they had...
This is terrifying for people who shared things thinking that they were private (especially those suffering harassment). However, as we know, nothing you share on Facebook is private. Your posts are public regardless of what you set them to.
So set your default to public. If you feel you need to change it for something, don't post it at all.
I remember a very long time ago, I inadvertently changed a photo album I never deleted from private to public and it contained photos of an ex. I got a call from my then-current girlfriend with a boatload of questions. I started realizing shortly after how damaging this new (at the time) social media stuff could be. In my case, it was something relatively small, but I can't imagine what kind of stuff other people might have that could expose them on so many fronts. For this and other reasons, I slowly weaned myself off social media and never really looked back. I just use various accounts for logins and a few check-ins here and there.
Exactly - I did this a few years ago, and never even blink when I read this stuff. About the only thing I change the privacy setting on is if I'm traveling or similar and want to check in or post pics on the road.
"Facebook changed every post by those users during the affected time period to private, including posts that people may have meant to share publicly. The company told CNN it took five days to make those changes."
The headline and the article seem to be contradictory. Anyone know if the article is wrong or the headline is wrong? It looks like it just might be that paragraph that got it backwards since there is a direct quote later that talks about posts being automatically suggested as public.
I guess at least this is seen as a bug now? I don't know if it was just a bug affecting me or some weird test going on, but this used to happen to me all the time: every time they changed anything, they'd reset all my defaults to public. It did stop at some point, though I haven't checked in a while and don't trust them at all. They used to reset my email settings all the time too, but I stopped paying attention to that and just told Gmail that everything Facebook sends is spam, which worked much better.
I highly doubt it. The new feature they were testing was probably rolled out to a % of their user-base (as any sane company rolling out new features does.)
Facebook has tools that let them target experiments at specific subsets of their users. Chances are this experiment targeted 14M users in a very specific geography (generally English-speaking users, probably in the US/CA/NZ/AU).
Most likely this is a UI bug where users in the experiment had the default "audience" for their posts set to Public instead of its usual value. The UI probably showed that the post was going public, but if you weren't looking at that, you'd expect it to be the same as your previous default.
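A minimal sketch of percentage-based experiment bucketing, the standard way a gradual rollout like the one described above works (an assumption about the general technique, not Facebook's actual system): hash the user id together with the experiment name so each experiment gets an independent, stable sample of users.

```python
import hashlib

def in_experiment(user_id, experiment, percent):
    """Deterministically place roughly `percent`% of users in an experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    return bucket < percent / 100.0

# Roughly 10% of 100,000 simulated users land in the experiment.
sample = sum(in_experiment(uid, "new_composer", 10) for uid in range(100_000))
```

Because the bucket is derived from a hash rather than stored state, the same user stays in (or out of) the experiment on every request, which is also why a buggy experiment keeps hitting the same fixed slice of users for its entire run.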
Maybe someone from FB can chime in and confirm whether this is true, but wouldn't FB take a copy of ANY data that is made public for any portion of time? So it may have been public for only a day, but that data is now stored and available for third parties to leverage.
This is exactly what something like Archive.org would do.
Facebook has a fully automated testing pipeline, supposedly with few human gatekeepers and no dedicated testers; Facebook employees are the only real testers. That gives me no confidence at all.
That is, if it truly was unintentional. In the past year, most of the "data leaks" from Facebook were entirely intentional. They might market them as accidental, but you don't accidentally write an API that exposes data; that takes intent.
In spite of all the controversy, Facebook is at all-time highs. It will be interesting to see whether Facebook is still the dominant social networking platform in 5, 10, or 25 years.
That is sloppy ops for a tech giant. No automated tests to check that post permissions work correctly? QA missed it? Were user reports ignored for too long? Five days is a long time for such a big hole to be open.
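For what it's worth, the kind of regression test being asked for is cheap to write. A toy sketch (the `Composer` class and its field names are stand-ins, not Facebook's real code): assert that the composer never preselects "public" for a user whose chosen default is narrower.

```python
class Composer:
    """Toy stand-in for the post composer UI state."""
    def __init__(self, user_default="friends"):
        self.suggested_audience = user_default

def test_default_audience_not_widened():
    # Whatever the user last chose, "public" must never appear by surprise.
    for default in ("friends", "friends_of_friends", "only_me"):
        composer = Composer(user_default=default)
        assert composer.suggested_audience != "public", default

test_default_audience_not_widened()
```

A check like this in the release pipeline would catch a widened default before rollout rather than five days in, which is presumably the point the comment above is making.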
Whatever user groups they create are only ever going to be an artificial construct; it's all just lists of stuff.
You made me curious. For what exactly could Facebook be fined, and how much would that fine be?
But it happened again recently, huh?
There are no repercussions anymore: Uber, telcos, <cough>the White House</cough>, Facebook, airlines (beaten anybody up recently?), FIFA, Equifax. And it sucks.
1. Facebook defaulted a bunch of posts to public, when they should have been something else.
2. Users posted a bunch of things with this unexpected default.
3. As one part of fixing the bug, Facebook changed all posts made with the unexpected default to private. They were attempting to undo the damage.
Some of the posts they made private probably were intended to be public, so they made things worse for certain users.
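Step 3 above can be sketched as a simple remediation pass (data shapes are assumed, and the year 2018 is inferred from context): flip every post created with the buggy public default back to private. The collateral damage is visible in the code, since posts the user genuinely meant to make public get flipped too.

```python
from datetime import datetime

# Bug window per the notification quoted earlier; 2018 assumed from context.
BUG_START = datetime(2018, 5, 18)
BUG_END = datetime(2018, 5, 27, 23, 59, 59)

def remediate(posts):
    """Return the posts whose audience was reset from public to private."""
    changed = []
    for post in posts:
        in_window = BUG_START <= post["created_at"] <= BUG_END
        if in_window and post["audience"] == "public":
            post["audience"] = "private"
            changed.append(post)
    return changed
```

Posts outside the window or posted with a narrower audience are left alone; the script has no way to tell which public posts inside the window were intentional, which is exactly the over-correction described above.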