Having worked with journalists, I'd say these sound like typical entitlement complaints. Frankly, a lot of writers have an attitude that they're "artists" who shouldn't be rushed and shouldn't have performance requirements.
Frankly, it does not sound very hard at all. They have to write 20 posts a day, but each post is only a headline and a brief summary. A focused writer can finish one in 15 minutes.
> We had to write in the most passive tense possible. That’s why you’d see headlines that appear in an alien-esque, passive language.
Oh no! How dare Facebook strive to be neutral and passive.
> After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm
Of course they were. Facebook is quite explicitly trying to apply ML throughout their site. Why should they permanently be in the business of employing writers to do something which computers could do reliably and effectively?
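Purely as an illustration of that dynamic (field names invented for this sketch, not Facebook's actual pipeline): every curator-written headline is, in effect, a labeled training example for a future model.

```python
# Hypothetical sketch: curator output doubling as supervised training data.
# Field names are invented for illustration; this is not Facebook's pipeline.

def to_training_examples(curated_posts):
    """Turn curator work into (input, target) pairs for a headline model."""
    return [
        {"input": post["source_article"], "target": post["headline"]}
        for post in curated_posts
    ]

posts = [
    {"source_article": "Full text of a wire story...",
     "headline": "Event Is Reported to Have Occurred"},
]
examples = to_training_examples(posts)
print(examples[0]["target"])  # Event Is Reported to Have Occurred
```

Once enough of these pairs accumulate, the writers have, in effect, trained their replacement.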
> We had to write in the most passive tense possible. That’s why you’d see headlines that appear in an alien-esque, passive language.
I was just thinking that the Facebook summary headlines are surprisingly journalistic compared to Buzzfeed, Upworthy, etc. Seeing somebody complain about that explains why their title is "news curator" and not "journalist."
The article also attempts to paint them as overworked, abused, and treated differently because they were contractors. Having worked in the big tech industry as a contractor, I can say their description of contractor treatment and workload is pretty normal (no employee perks, quotas, etc.). This is due more to Vizcaino v. Microsoft [1] than any sort of abusive contractor agenda.

[1] http://scholarship.law.berkeley.edu/cgi/viewcontent.cgi?arti...
I shall just leave a link to Douglas Rushkoff here (https://www.youtube.com/watch?v=87TSoqnZass) to explain the deep flaws in the "default" techie thinking of which your comment is a classic example.
"After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm." Ha! Aren't we all.
So... they finally know what it feels like to be a blue-collar worker whose job gets outsourced, and who is sometimes offered benefits contingent on training their offshore replacements.
Everyone was happy to have $30 jeans which could last one year over their $70 jeans made stateside which could last five years or more... But now that it affects them, oh my, this is bad now...
> Mark Zuckerberg has been transparent about his goal to monopolize digital news distribution. “When news is as fast as everything else on Facebook, people will naturally read a lot more news,”
Oh lord. That's not news. Those are sensationalist, click-bait headlines making us all collectively dumber. It's anecdotal, but I'm quite sure the 'news' my girlfriend has gotten from FB has made her less informed.
I actually wrote an FF plugin to hide that right bar, since I would now and then get sucked in by that crap and waste 15 minutes reading about Lindsay Lohan or whatever.
When the employees of Facebook collectively ask "What responsibility do we have to stop the election of Donald Trump" at an internal meeting, it gives me Orwellian chills. Today, it's not governments we need to fear the most, it's data hungry, fascist internet corporations.
Sometimes people have a hard time decoupling politics from their jobs. They should consider how ridiculous it would sound to say "What can we do to stop Bernie Sanders?" That should give them pause. As far as I know, all the major candidates are operating within the law, and thus we ought not try to "stop them" just because our politics differ.
Or what if, instead of national figures, FB or anyone else tried to railroad an SF supervisor they didn't particularly like because they shot down a proposed development. Or one in Menlo Park.
Very much agree, but do you really think it's all that different at the NY Times? I'm sure at FB no one thought they were going to beat Trump by being deceptive or manipulative. No, they would just link to stories they thought were more accurate, had less spin, covered more sides etc. Likewise, I'm sure many NY Times reporters are driven by a feeling of duty to concentrate on issues where they feel they can bring truth/illumination to the campaign and get Trump defeated.
I guess it's plausible to me that a new company like FB lacks some sort of important company culture for reporting that the NY Times has by virtue of its long history, but this seems minor. The real issue is (1) the consolidation in the news industry and (2) its political homogeneity. FB and the NY Times are each examples of both.
There was a recent study looking at this possible issue. They called it Search Engine Manipulation Effect [1]:
"The fifth experiment is especially notable in that it was conducted with eligible voters throughout India in the midst of India’s 2014 Lok Sabha elections just before the final votes were cast. The results of these experiments demonstrate that (i) biased search rankings can shift the voting preferences of undecided voters by 20% or more, (ii) the shift can be much higher in some demographic groups, and (iii) search ranking bias can be masked so that people show no awareness of the manipulation."

[1]: http://www.pnas.org/content/112/33/E4512.abstract
What if the employees at Facebook instead asked "What ethical responsibility do we have to ensure that we, as the primary distributor of news in the world, do some minimal amount of fact checking."
The goal shouldn't be to "stop Trump" — that's not what we want a news distributor to do. I don't want bias against a particular person in my news distributors. What I want is for my news distributors to actually have ethical standards that enable them to serve the public good (and, as a by-product, hopefully prevent lying demagogues from gaining power).
Is it "bias" to use a standardized method or algorithm (even one that involves humans reading natural language) to filter content by accuracy and provide some interface indication not just for phishing and porn, but also for falsehoods?
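A minimal sketch of that idea, with a made-up score, threshold, and flag label (no real classifier behind it): the same interstitial pattern browsers use for phishing warnings could flag low-accuracy content.

```python
# Illustrative only: accuracy_score would come from some classifier or
# human fact-checking process; the threshold and flag label are made up.

def flag_content(item, accuracy_score, threshold=0.4):
    """Attach a warning flag, analogous to phishing or malware warnings."""
    if accuracy_score < threshold:
        return {"item": item, "flag": "disputed-accuracy"}
    return {"item": item, "flag": None}

print(flag_content("Miracle cure found!", 0.1)["flag"])         # disputed-accuracy
print(flag_content("Election results certified", 0.9)["flag"])  # None
```

The point is that this is a content-neutral rule applied uniformly, not a judgment about a particular candidate.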
While this may be a more direct or open admission of a "moral" attempt to shape political outcomes, I think it's worth remembering that every other major corporation has its hand in the pot, albeit in a less direct fashion, when its board pours money into lobbyists, special interest groups, and media outlets. Sometimes for "moral" reasons, sometimes for economic reasons - but every major corporation is a political player.
I am no fan of anyone that is currently running for President. Assuming that Clinton, Sanders, or Trump ultimately win, Congress and the Senate will have to be our last line of defense in keeping the "winner" from doing serious damage to the country.
That said, I absolutely think that Facebook and other major media outlets should be held to a form of the Equal Time rule: https://en.wikipedia.org/wiki/Equal-time_rule
The fact that we don't have an equivalent rule for major websites is a serious problem.
I'm just amazed at how happy and ignorant people seem to be about giving Facebook more and more signals about their likes and dislikes.
They started the profile badges thing and folks thought that was so cool.
I had to explain to them that this is one more data point for Facebook - if I change my profile badge to a Manchester United logo, it probably means that I really, really care about Manchester United.
I can then be targeted by an advertiser that wants to sell merchandise to Manchester United fans.
I agree with you 100%. At the same time, in this specific situation I can't help but still be even more disturbed at the thought of a Trump presidency than I am by a data hungry, Orwellian, fascist Internet corporation. Not defending the Facebook employee(s) who asked that question by any stretch, just marveling at how screwed up our political system is. :(
There's nothing wrong with asking the question. In a free society, it should be perfectly acceptable to do so without consequence. It would be Orwellian if the question were never even asked.
Just think how unreasonable it sounds to say "What responsibility do we have to stop Hitler?"
Poe's law or whatever, but I can't even imagine a worse possible candidate than Trump; even Eric Cartman would be a better president than this Daily-Mail-esque cartoon of a human being.
Thoughts about the utility of schools to serve this function? We are already teaching humans very simple to very advanced tasks and we do so millions of times a day. Is the fact that tasks/knowledge are being structured for paced human learning irrelevant for this purpose?
So, I got interested in this article's description of horrid working conditions and decided to read it carefully. But I noticed that it gives emotional descriptions long before the actual facts. I wouldn't go so far as to call this manipulation, but it's certainly a disturbing writing style.
3rd paragraph:
> grueling work conditions, humiliating treatment, and a secretive, imperious culture in which they were treated as disposable outsiders.
6th paragraph:
> “It was degrading as a human being,” said another. “We weren’t treated as individuals. We were treated in this robot way.”
And then, finally, in the 10th paragraph, we get a glimpse of the facts:
> they received benefits including limited medical insurance, paid time off after 6 months and transit reimbursement
(BTW, is it usual for contractors to receive such perks?)
> A company happy hour would happen at 8 p.m., and we’d be working
Horrible, inhumane treatment indeed.
> Over time, the work became increasingly demanding, and Facebook’s trending news team started to look more and more like the worst stereotypes of a digital media content farm. Managers gave curators aggressive quotas for how many summaries and headlines to write, and timed how long it took curators to write a post. The general standard was 20 posts a day.
20 posts during an 8-hour work day is almost half an hour per post. Is that considered too little? Seriously?
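The arithmetic, for what it's worth:

```python
# 20 posts spread over an 8-hour day
minutes_per_post = 8 * 60 / 20
print(minutes_per_post)  # 24.0 -- just under half an hour per post
```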
So — apart from all the pretty words, I didn't really see any especially bad treatment. Hell, I'm pretty sure that your average newspaper employees have more nightmare stories.
“It was degrading as a human being,” said another. “We weren’t treated as individuals.” Hmm. Trying hard to feel sympathy/empathy for these victims, but my algorithm is throwing an exception.
I don't think the assessment that they were part of a future algorithm is too far off. Given enough expert input, picking a fitting image/video or headline should be doable. Worst case, you reduce the number of humans needed to one quality-assurance person. The algorithm says "this headline, this image"... yes/no/fix.
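That yes/no/fix loop is easy to sketch (hypothetical names throughout, nothing from any real system):

```python
# Hypothetical human-in-the-loop QA step: the algorithm proposes a
# headline/image pair; one reviewer approves, rejects, or patches it.

def review(proposal, decision, fix=None):
    """Apply a QA decision to an algorithmic proposal."""
    if decision == "yes":
        return proposal
    if decision == "fix" and fix is not None:
        return {**proposal, **fix}
    return None  # "no": discard the proposal

proposal = {"headline": "Event Occurs", "image": "img_001.jpg"}
print(review(proposal, "yes") == proposal)  # True
print(review(proposal, "fix", {"headline": "Event Occurs in City"}))
```

Every fix is itself another labeled correction the model can learn from, which is exactly why the curators' suspicion seems plausible.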
I wonder that myself. I think it has more to do with standard corporate overthinking about what skills are actually required for a job. It certainly makes it easier to say, "We want someone to write titles and snippets of news stories. Who does that? Journalists. Cool, let's get some of those."
Strange that people at Facebook would feel apathetic toward individuals that can't code.
It's pretty common elsewhere in SV as well. And is a cardinal error.
If Facebook is doing its job and utilizing its biggest talent acquisition, Yann LeCun, then this is exactly what it should be doing.
A critical part of the path towards AGI is using humans to teach it.
Disclaimer: I worked on that FB product.
This is the most worthless thing I've ever read in my life.