>Thirty-three states including California and New York are suing Meta Platforms Inc. for harming young people's mental health and contributing to the youth mental health crisis by knowingly designing features on Instagram and Facebook that cause children to be addicted to its platforms.
In case anyone wants to look at an actual document, which I didn't see linked in the article: https://www.washingtonpost.com/documents/b68f2951-2a4b-4822-...
The CBC article doesn't mention what's arguably the most interesting thing about this lawsuit, which is that it goes after Meta on the grounds of product liability. This is, of course, a pretty established area of law and a central feature of the decades of legal battles that Big Tobacco was embroiled in, but applying it to online publishers is very new.
I think the states have an interesting point. Should you be able to knowingly create a product which harms consumers and provide it to them while failing to disclose that fact? Doing so is illegal, and I think your average HN'er would agree that this is bad when applied to, say, Big Tobacco or some manufacturer selling a product that contains toxic chemicals or whatever, but what about Big Tech?
Of course there's a big can of worms here. We've known that watching TV "rots your brain" on some level for years, and there's a fair bit of research which claims that porn is bad for you too. So where do you draw the line, and when is litigation the correct recourse for society in dealing with these issues vs. approaching them another way?
Hard to have sympathy for a company like Meta at this stage in the game though...
Curious if this complaint makes a clear distinction between chemical and behavioral addiction. I've heard some pushback comparing this to smoking. In my mind, comparing this to gambling would have been more apt. Social media seems to employ many of the behavioral reward schedules which make gambling (or loot drops in video games) addictive (e.g., literally using a variable-ratio reward schedule). Social media isn't merely addictive in this way (it also leans on crucially important things such as social standing, or perceived social standing), but this is a major distinction which I often see people fail to make.
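A variable-ratio schedule is simple enough to sketch in a few lines of Python (a toy illustration of the conditioning concept only; the function name and numbers here are mine, not anything from the complaint or from Meta's systems):

```python
import random

def variable_ratio_feed(pulls, mean_ratio=4, seed=0):
    """Toy simulation of a variable-ratio reward schedule: each
    'refresh' pays off with probability 1/mean_ratio, so rewards
    arrive at unpredictable intervals, averaging one per mean_ratio
    pulls -- the pattern slot machines (and, arguably, feeds) use."""
    rng = random.Random(seed)
    return [rng.random() < 1 / mean_ratio for _ in range(pulls)]

# Over 1000 refreshes, roughly a quarter are "rewarding", but the
# gaps between rewards vary unpredictably.
hits = variable_ratio_feed(1000)
```

The contrast with a fixed-ratio schedule (every Nth pull pays off) is the point: with a fixed ratio the behavior extinguishes quickly once rewards stop, while the unpredictability above is exactly what keeps the next refresh always feeling worth a try.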
Don't many companies do similar things to hook kids? Would candy companies be liable for making their products appealing to kids, knowing sugar is an addictive substance? The same argument applies to breakfast cereal companies as well.
Yes, and they should go after Roblox/Fortnite next. Fleecing parents so their kids don't get made fun of for "having the free skin" shouldn't be a business model.
No disagreement here, but Instagram shouldn't be let off the hook just because cereal companies get people addicted to sugar and do awful things as well. Remember, two wrongs don't make a right.
Last I checked, candy companies aren't finding ways to stalk children and keep records of their habits and behaviors. Physically, sugar might be worse, but the business practices around physical products have mostly been the same for decades. Social media companies are engaged in many practices yet to be litigated.
Edit for clarity: certainly candy companies have general demographic data and market research, but advertising companies have profiles down to the individual. Lawyers seem to have settled on COPPA allowing them to do this for 13-year-olds, but I would imagine most parents would prefer minors be excluded from any tracking, profiling, inclusion in multivariate testing, or targeted advertising.
Which is why there’s been a long history of formal regulation and explicit industry self-policing for both of those examples. Engagement-optimized social media is now big enough and seemingly permanent enough to be invited to the club, and seems to be very reluctant to do the self-policing thing so far.
Well? It's not enough to just say something, assume it's absurd, and stop thinking about it. Have those companies intentionally made their products addictive to the extent that you think they are deserving of censure or punishment? Because if that's the case, maybe your point shouldn't be "this new bad thing is fine".
imagine if kids had a magic rectangle in their pockets at all times that generated, when tapped, an infinite amount of sugary breakfast cereal for them to consume, anytime, anywhere, and it had become commonplace for kids to spend plural hours each day doing nothing at all but mindlessly tapping said rectangle and consuming the magically-produced sugary breakfast cereal. imagine if this started from early childhood, with parents giving kids a slightly larger, rubber-bumpered sugary-cereal-producing magic rectangle to occupy themselves while out to eat or shopping for groceries or even just around the house. imagine if this was all completely socially accepted and anyone speaking out against it was shouted down as being opponents of technological progress or whatever.
Candy companies don't run invisible psychological experiments on their customers. An app like Instagram is also self-replenishing, which can't be said for physical products with addictive qualities.
Candy is the least severe offender. Everyone knows you have to limit kids' candy intake because it has too much sugar. But why do products like yogurt or Pedialyte all of a sudden have loads of sugar?
Does switching to high-fructose corn syrup qualify as tweaking the formulas specifically to increase/enhance the addictive properties? To me, social media should be treated like Big Tobacco.
Sugar is not an addictive substance. You cannot become physiologically dependent on sugar with negative consequences. Let's not bring in even more disinformation. This nebulous application of the word "addiction" to non-addictive concepts by the public can lead to damaging outcomes, like state DAs exploiting the public's ignorance and misuse of the concept to bring themselves fame so they can run for public office later. See the original post topic for an example.
> Thirty-three states including California and New York are suing Meta Platforms Inc. for harming young people's mental health and contributing to the youth mental health crisis by knowingly designing features on Instagram and Facebook that cause children to be addicted to its platforms.
The lawsuit, filed in federal court in California, also claims that Meta routinely collects data on children under 13 without their parents' consent, in violation of federal law.
The federal privacy law part is crystal clear. But there is no description, or even mention, of what laws are being violated related to addictive products. Is this federal law? State law? What does the law say?
There's also a lawsuit that Seattle Public Schools filed earlier this year. Not sure where that's at, but it looks like there was activity as recently as last month: https://www.courtlistener.com/docket/66933258/seattle-school...
When the business is entirely about marketing everything their users do against them, regardless of age, can you blame them for wanting to grow their business?
This is why I've never had a Facebook/Instagram/Twitter, etc. The poor bastards that do get what they signed up for and accepted in the TOS.
I'm hoping this is them ramping up on that type of stuff. From what I've read, Meta has made themselves a pretty strong target because, like the tobacco companies, they have documentation showing they explicitly know what they're doing and are ignoring the effects because of the money.
Should they also sue all the garbage TV shows for kids and the food companies putting sugar everywhere? Did they also sue TikTok? To my knowledge, California and New York did not sue TikTok, but it is way more dangerous than Instagram. Did they not want to upset Xi during his visit?
> Did they also sue TikTok? To my knowledge, California and New York did not sue TikTok, but it is way more dangerous than Instagram. Did they not want to upset Xi during his visit?
Presumably, California (in particular) has an easier time investigating Facebook than TikTok, given it lies within their jurisdiction.
Several states started investigating TikTok in 2022. They investigated Instagram first due in part to the whistleblower. So it’s not that other suits aren’t coming, it’s just that IG is the farthest along.
That's a fair analogy, and numerous countries and states did introduce measures to combat obesity caused by sugary drinks (mostly through an added tax, which seems to have worked to reduce consumption).
I think there's a difference between what Coca Cola has done and what Instagram has done.
Someone who isn't me (a practicing doctor) has had to visit mental wards, and almost all of the wards have a strict no-Facebook, no-Instagram, no-TikTok rule after leaving. Almost all returning patients end up pointing fingers at those companies.
I haven't heard of anything similar with Coca-Cola products. Extremely scary, IMO.
Before Reagan, advertising was subject to FCC regulations around producing and publishing advertisements aimed at children. Once it was deregulated, the big toy companies went full bore on commercial ads, and we have the familiar experience today of children begging for, needing, toys because of the manipulative tactics used in advertising. The experts call it "pester power".
This should be an easy case to try. Just bring in an expert witness about addiction. There is only one behavioral addiction in the DSM (the Diagnostic and Statistical Manual of Mental Disorders): gambling addiction. And that one is basically just grandfathered in from before addiction was understood and had a clear medical/biological meaning. Nothing that happens while looking at screens is inherently bad or addictive. To use that word here is to devalue it and marginalize its meaning in cases of actual addiction, all just so they can sue to make noise and bring the DAs involved fame (so they can run for public office later) while exploiting the ignorance of the public re: medical science.
There's a song about this; https://en.wikipedia.org/wiki/Don%27t_Threaten_Me_with_a_Goo...