As much as I don't like Facebook as a company, I think the jury reached the wrong decision here. If you read the complaint[1], "eavesdropped on and/or recorded their conversations by using an electronic device" basically amounted to "Flo using Facebook's SDK and sending custom events to it" (page 12, point 49). I agree that Flo should be raked over the coals for sending this information to Facebook in the first place, but ruling that Facebook "intentionally eavesdropped" (exact wording from the jury verdict) makes zero sense. So far as I can tell, Flo sent Facebook menstrual data without Facebook soliciting it, and Facebook specifically has a policy against sending medical/sensitive information through its SDK[2]. Suing Facebook makes as much sense as suing Google because it turned out a doctor was using Google Drive to store patient records.
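For context on what "sending custom events" through an analytics SDK looks like in practice, here is a rough sketch in Python. The event name and fields are illustrative, modeled on the free-form app events such SDKs accept; this is not Flo's actual code or payload:

```python
# Hypothetical sketch of a mobile-analytics "custom app event", similar in
# shape to what app-event SDKs accept. The app id, event name, and fields
# here are invented for illustration, not taken from Flo's real payloads.
import json


def log_custom_event(app_id: str, event_name: str, params: dict) -> str:
    """Serialize a custom event the way an analytics SDK might batch it
    before POSTing it to the vendor's servers."""
    event = {
        "app_id": app_id,
        "event_name": event_name,   # chosen freely by the app developer
        "custom_params": params,    # arbitrary key/value pairs
    }
    return json.dumps(event)


# The crux of the case: nothing in the mechanism itself stops an app
# from putting sensitive health details into these free-form fields.
payload = log_custom_event(
    app_id="123456789",                    # hypothetical
    event_name="R_PREGNANCY_WEEK_CHOSEN",  # illustrative event name
    params={"is_pregnant": True},          # sensitive, developer-supplied data
)
print(payload)
```

The SDK and its servers see only opaque names and values; the dispute is over what the recipient did with them once received.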
At the time of [1] (your footnote), the only defendant listed in the matter was Flo, not Facebook, per the cover page of [1], so it is unsurprising that that complaint does not include allegations against Facebook.
The amended complaint, [3], includes the allegations against Facebook as at that time Facebook was added as a defendant to the case.
Amongst other things, the amended complaint points out that Facebook's behavior continued for years (into 2021) after it was publicly disclosed (in 2019) that this was happening, and that even after the FTC forced Flo to cease the practice and congressional investigations were launched (2021), Facebook refused to review and destroy the data that had previously been improperly collected.
I'd also be surprised if discovery didn't provide further proof that Facebook was aware of the sort of data they were gathering here...
Facebook isn't guilty because Flo sent medical data through their SDK. If they were just storing it or operating on it for Flo, then the case probably would have ended differently.
Facebook is guilty because they turned around and used the medical data themselves to advertise without checking if it was legal to do so. They knew, or should have known, that they needed to check if it was legal to use it, but they didn't, so they were found guilty.
I would say you have a responsibility to ensure you are receiving legal data; you don't buy stolen goods. That is, Meta has a responsibility to ensure that it is not partnering with crooks. Flo gets the largest share of the blame, but Meta needs to show it did its part to ensure this didn't happen. (I would not call terms of use enough unless they can show they make you understand them.)
That's why in these cases you'd prefer a judgment without a jury. Technical cases like this will always confuse jurors, who can't be expected to understand details about SDKs, data sharing, APIs, etc.
On the other hand, in a number of high-profile tech cases, you can see judges learning and discussing engineering at a deeper level.
Suing Facebook instead of Flo makes perfect sense, because Facebook has much more money. Plus juries are more likely to hate FB than a random menstruation company.
Whenever you think of a court versus Facebook, imagine one of these mini mice trying to stick it to a polar bear. Or a goblin versus a dragon, or a fly versus an elephant.
These companies are for the most part effectively outside of the law. The only time they feel pressure is when they can lose market share, and there's risk of their platform being blocked in a jurisdiction. That's it.
>These companies are for the most part effectively outside of the law
You have it wrong in the worst way. They are wholly inside the law because they have enough power to influence the people and systems that get to use discretion to determine what is and isn't inside the law. No amount of screeching about how laws ought to be enforced will affect them because they are tautologically legal, so long as they can afford to be.
The worst part for me personally is that almost everyone I know cares about this stuff and yet they keep all of their Meta accounts. I really don't get it and frankly, find it kind of disturbing.
I know people that don't see anything wrong with Meta so they keep using it. And that's fine! Your actions seem to align with your stated values.
I get human fallibility. I've been human for a while now, and wow, have I made some mistakes and miscalculations.
What really puts a bee in my bonnet though is how dogmatic some of these people are about their own beliefs and their judgement of other people.
I love people, I really do. But what weird, inconsistent creatures we are.
Everybody blames Facebook; no one blames the legislators and the courts.
Stuff like this could easily bring multi-billion-dollar fines, and for violations affecting more users, maybe even fines in the trillion range. If government workers came to collect servers, chairs, and projectors from company buildings to sell at auction, because there was not enough liquid value in the company to pay the fines, they (well, the others) would reconsider quite fast and stop the illegal activities.
I don't think many of you read the article... the Flo app is the one in the wrong here, not meta. The app people were sending user data to meta with no restrictions on its use, regardless of how the court ruled.
> The app people were sending user data to meta with no restrictions on its use
And then meta accessed it. So unless you put restrictions on the data, meta is going to access it. Don't you think it should be the other way around, with meta having to ask for permission? Then we wouldn't have this sort of thing.
5 years ago I was researching the iOS app ecosystem. As part of that exercise I was looking at the potential revenue figures for some free apps.
One developer had a free app to track some child health data. It was a long time ago, so I don't remember the exact data being collected. But when asked about the economics of his free app, the developer felt confident about a big payday.
According to him, the app's worth was in the data being collected. I don't know what happened to the app, but it seems app developers know what they are doing when they invade the privacy of their users under the guise of a "free" app. After that I became very conscious about disabling as many permissions as possible, and especially about not using apps to store any personal data, particularly health data.
I don't understand why anyone would let these psychopathic corporations have any of their personal or health data. Why would you use an app that tracks health data, or a wearable device from any of these companies that does the same? You have to assume, based on their past behavior, that they are logging every detail and that it will be sold and saved in perpetuity.
True. Unfortunately, users are all human, with miserably predictable response patterns to "Look at this Free New Shiny Thing you could have!" pitches, and to the ruthless business models behind them.
My wife uses Flo, though every time I see her open the app and input information, the tech side of my brain is quite alarmed. An app like that holds very, very personal information, and it really highlights for me the need to educate non-technical folks about information security.
> [...] users, regularly answered highly intimate questions. These ranged from the timing and comfort level of menstrual cycles, through to mood swings and preferred birth control methods, and their level of satisfaction with their sex life and romantic relationships. The app even asked when users had engaged in sexual activity and whether they were trying to get pregnant.
> [...] 150 million people were using the app, according to court documents. Flo had promised them that they could trust it.
> Flo Health shared that intimate data with companies including Facebook and Google, along with mobile marketing firm AppsFlyer, and Yahoo!-owned mobile analytics platform Flurry. Whenever someone opened the app, it would be logged. Every interaction inside the app was also logged, and this data was shared.
> "[...] the terms of service governing Flo Health’s agreement with these third parties allowed them to use the data for their own purposes, completely unrelated to services provided in connection with the App,”
Bashing Facebook/Meta might give a quick dopamine hit, but they really aren't special here. The victims' data was routinely sold, en masse, per de facto industry practice. Victims should assume that hundreds of orgs, all over the world, now have copies of it. Ditto any government or criminal group that thought it could be useful. :(
Why would an app that tracks menstrual cycles need to connect to the Internet at all? TFA mentions asking about quite a few other personal things as well. Is the app trying to do more than just tracking? If they're involved in any kind of diagnosis then I imagine there are further legal liability issues....
And this is why I have a general no-apps policy on my phone... Or at least, I have a minimal number of apps on my phone. While this doesn't prevent a given website/webapp from sharing similar information, I just feel slightly better not giving hard device access.
Along a similar vein, I cannot believe after the stunts LinkedIn pulled, that they're even allowed on app stores at all.
No ifs, no buts. Stuff like this deserves ruinous fines for the executives responsible.
Cycle data in the hands of many countries' authorities is outright dangerous. If you're storing healthcare data, it should require an explicit opt-in, IN BIG RED LETTERS, every single time that data leaves your device.
This is really disappointing. I used to have a fertility-tracking app on the iOS App Store: zero data sharing, all local, and thus private. But people don't want to pay $1 for an app, and I can't afford the marketing drive that an investor-backed company such as this one has… and so we end up with situations like this. Pity :(
Another aspect of this is why Apple/Google let this happen in the first place. GrapheneOS is the only mobile OS I can think of that lets you disable networking on a per-app level. Why does a period tracking app need to send data to meta (why does it even need networking access at all)? Why is there no affordance for user-level choice/control that allows users to explicitly see the exact packets of data being sent off the device? It would be trivial to require apps to present a list of allowed IPs/hostnames to which users consent or not; otherwise the app is not allowed on the Play Store.
Simply put, it should not be possible to send arbitrary data without some sort of user consent/control, and to me this is where the GDPR has utterly failed. I hope one day users are given a legal right to control what data is sent off their device to a remote server, with serious consequences for non-compliance.
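A minimal sketch of the egress control being proposed, assuming a hypothetical OS-level hook that can see each outbound connection's destination. The app id, hostnames, and function names here are all invented for illustration:

```python
# Hypothetical sketch: an OS-level egress gate that permits outbound
# connections only to hostnames the app declared AND the user approved.
# All identifiers and hostnames below are invented for illustration.

declared_hosts = {
    # Hostnames the app lists in its store submission.
    "com.example.periodtracker": {"sync.example-app.com"},
}
user_approved = {
    # Hostnames the user has explicitly consented to, per app.
    "com.example.periodtracker": {"sync.example-app.com"},
}


def egress_allowed(app_id: str, host: str) -> bool:
    """Allow the connection only if the host is both declared by the
    app and approved by the user; everything else is blocked."""
    return (host in declared_hosts.get(app_id, set())
            and host in user_approved.get(app_id, set()))


# The app's own declared sync endpoint goes through:
assert egress_allowed("com.example.periodtracker", "sync.example-app.com")
# An undeclared analytics endpoint is blocked outright:
assert not egress_allowed("com.example.periodtracker", "graph.facebook.com")
```

Under a model like this, a bundled SDK trying to phone home to an unlisted host would simply fail to connect, rather than relying on the developer's restraint.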
"GrapheneOS is the only mobile OS I can think of that lets you disable networking on a per-app level."
You don't need to "root" the phone and install GrapheneOS. The Netguard app blocks connections on a per-app basis. It generally works.
But having to take these measures, i.e., installing GrapheneOS or Netguard (plus Nebulo, etc.), is why "mobile OSes" all suck. People call them "corporate OSes" because the OS is not under the control of the computer's owner; it is controlled by a corporation. Even GrapheneOS depends on Google's Android OS, relies on Google hardware, makes default remote connections to a mothership without any user input (just like any corporate OS), and uses a Chromium-based default browser. If one is concerned about being tracked, perhaps it is best to avoid these corporate, mobile OSes.
It is easy to control remote connections on a non-corporate, non-mobile OS where the user can compile the OS from source on a modestly resourced computer. The computer user can edit the source and make whatever changes they want. For example, I use one where, after compilation from source, everything is disabled by default (this is not Linux). The user must choose whether to create and enable network interfaces for remote connectivity.
> Why does a period tracking app need to send data to meta (why does it even need networking access at all)?
In case you want to sync between multiple devices, networking is the least hassle way.
> Why is there no affordance for user-level choice/control that allows users to explicitly see the exact packets of data being sent off the device? It would be trivial to require apps to present a list of allowed IPs/hostnames to which users consent or not; otherwise the app is not allowed on the Play Store.
I don't know that it ends up being useful, because wherever the data is sent can also send it further on.
I mean... there are simply no repercussions for these companies, and only rivers of money on the other side. The law is laughably inept at keeping them in check. The titans of Surveillance Capitalism don't need to obey laws; their CFOs line-item provisional legal settlement fees as (minor) COGS. And we digital serfs simply have no rights. Dumb f*cks, indeed.
[1] https://www.courtlistener.com/docket/55370837/1/frasco-v-flo...
[2] https://storage.courtlistener.com/recap/gov.uscourts.cand.37... page 6, line 1
[3] https://storage.courtlistener.com/recap/gov.uscourts.cand.37...
prasadjoglekar | 6 months ago
But FB, having received this info, proceeded to use it and mix it with the other signals it gets, which is what the complaint against FB alleged.
benreesman | 6 months ago
Innocent until proven guilty is the right default, but at some point when you've been accused of misconduct enough times? No jury is impartial.
inetknght | 6 months ago
Flo is wrong for using an online database for personal data.
Meta is wrong for facilitating an online database for personal data.
They're both morally and ethically wrong.
amarcheschi | 6 months ago
https://www.mozillafoundation.org/en/privacynotincluded/cate...