Meta accessed women's health data from Flo app without consent, says court (malwarebytes.com)
245 points by amarcheschi 6 hours ago | 149 comments
gruez 3 hours ago [-]
As much as I don't like facebook as a company, I think the jury reached the wrong decision here. If you read the complaint[1], "eavesdropped on and/or recorded their conversations by using an electronic device" basically amounted to "flo using facebook's sdk and sending custom events to it" (page 12, point 49). I agree that flo should be raked over the coals for sending this information to facebook in the first place, but ruling that facebook "intentionally eavesdropped" (exact wording from the jury verdict) makes zero sense. So far as I can tell, flo sent facebook menstrual data without facebook soliciting it, and facebook specifically has a policy against sending medical/sensitive information using its SDK[2]. Suing facebook makes as much sense as suing google because it turned out a doctor was using google drive to store patient records.

[1] https://www.courtlistener.com/docket/55370837/1/frasco-v-flo...

[2] https://storage.courtlistener.com/recap/gov.uscourts.cand.37... page 6, line 1
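
To make concrete what "custom events" means here, a minimal sketch (the endpoint, field names, and wrapper function are illustrative, not Facebook's actual SDK): the SDK call is essentially an app-chosen event name plus freeform key-value parameters tied to a device identifier, so an event name like the reportedly used "R_PREGNANCY_WEEK_CHOSEN" is itself the sensitive datum.

    # Schematic sketch only -- not Facebook's real SDK, endpoint, or field names.
    # It illustrates why "custom events" can carry sensitive data even though the
    # receiving side never asked for it: the app chooses the event name and params.
    import requests

    ANALYTICS_ENDPOINT = "https://analytics.example.com/app_events"  # placeholder URL

    def log_custom_event(app_id: str, event_name: str, params: dict) -> None:
        """Send one app-defined analytics event (payload shape is illustrative)."""
        payload = {
            "app_id": app_id,
            "event": event_name,              # the name alone can reveal a health fact
            "params": params,
            "advertiser_id": "device-ad-id",  # ties the event to a device/person
        }
        requests.post(ANALYTICS_ENDPOINT, json=payload, timeout=5)

    # An event of the kind reportedly at issue: to the receiving pipeline it is
    # just another name/value pair unless something downstream inspects it.
    log_custom_event("123456", "R_PREGNANCY_WEEK_CHOSEN", {"week": 8})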

gpm 13 minutes ago [-]
At the time of [1 (your footnote)] the only defendant listed in the matter was Flo, not Facebook, per the cover page of [1], so it is unsurprising that that complaint does not include allegations against Facebook.

The amended complaint, [3], includes the allegations against Facebook, as Facebook had by then been added as a defendant in the case.

Among other things, the amended complaint points out that Facebook's behavior continued for years (into 2021) after it was publicly disclosed that this was happening (2019), and that even after the FTC forced Flo to cease the practice and congressional investigations were launched (2021), Facebook refused to review and destroy the data that had previously been improperly collected.

I'd also be surprised if discovery didn't provide further proof that Facebook was aware of the sort of data they were gathering here...

[3] https://storage.courtlistener.com/recap/gov.uscourts.cand.37...

jlarocco 1 hours ago [-]
That's only the first part of the story, though.

Facebook isn't guilty because Flo sent medical data through their SDK. If they were just storing it or operating on it for Flo, then the case probably would have ended differently.

Facebook is guilty because they turned around and used the medical data themselves to advertise without checking if it was legal to do so. They knew, or should have known, that they needed to check if it was legal to use it, but they didn't, so they were found guilty.

gruez 58 minutes ago [-]
>Facebook is guilty because they turned around and used the medical data themselves to advertise without checking if it was legal to do so.

What exactly did this entail? I haven't read all the court documents, but at least in the initial/amended complaint the plaintiffs didn't make this argument, probably because it's totally irrelevant to the charge of whether they "intentionally eavesdropped" or not. Either they were eavesdropping or they weren't. Whether they were using it for advertising purposes might be relevant in armchair discussions about whether Meta is evil or not, but it shouldn't be relevant when it comes to the eavesdropping charge.

>They knew, or should have known, that they needed to check if it was legal to use it

What do you think this should look like?

jjulius 38 minutes ago [-]
>What do you think this should look like?

My honest answer that I know is impossible:

Targeted advertising needs to die entirely.

SilasX 36 minutes ago [-]
Yeah, I'm not sure if I'm missing something, and I don't like to defend FB, but ...

AIUI, they have a system for using data they receive to target ads. They tell people not to put sensitive data in it. Someone does anyway, and it gets automatically picked up to target ads. What are they supposed to do on their end? Even if they apply heuristics for "probably sensitive data we shouldn't use"[1], some stuff is still going to get through. The fault should still lie with the entity that passed on the sensitive data.

An analogy might be that you want to share photos of an event you hosted, and you tell people to send in their pics, while enforcing the norm, "oh make sure to ask before taking someone's photo", and someone insists that what they sent in was compliant with that rule, when it wasn't. And then you share them.

[1] Edit: per your other comment, they indeed had such heuristics: https://news.ycombinator.com/item?id=44901198
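
For concreteness, a rough sketch of the kind of heuristic I mean (the term list and matching are purely illustrative, not Meta's actual filter); it also shows why some stuff still gets through: an event named, say, "day14_logged" trips nothing.

    # Illustrative heuristic only; a real filter would be far more involved
    # (multiple languages, obfuscated event names, ML classifiers, review queues).
    HEALTH_TERMS = ("pregnan", "menstrua", "ovulat", "fertil", "contracept", "hiv")

    def looks_health_related(event_name: str, params: dict) -> bool:
        """Flag events whose name or parameter keys contain likely health terms."""
        haystack = " ".join([event_name, *params.keys()]).lower()
        return any(term in haystack for term in HEALTH_TERMS)

    def ingest(event_name: str, params: dict) -> bool:
        """Return True if the event would be allowed into ad-targeting systems."""
        return not looks_health_related(event_name, params)

    print(ingest("R_PREGNANCY_WEEK_CHOSEN", {"week": 8}))  # False: filtered out
    print(ingest("day14_logged", {"mood": 3}))             # True: slips through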

AtlasBarfed 25 minutes ago [-]
Facebook does not deserve an iota of presumption of innocence in the court of public opinion.

At this point, if Facebook is doing something wrong, it should absolutely be assumed that Facebook is doing it on purpose, even if the mechanism involves automated APIs and stored data that, on its face, can simply look like part of their automated systems.

Facebook is the creepy Jeffrey Epstein of tech companies, and has been from the very beginning.

prasadjoglekar 2 hours ago [-]
Flo shouldn't have sent that data to FB. That's true. Which is why they settled.

But FB, having received this info, proceeded to use it and mix it with other signals it gets. Which is what the complaint against FB alleged.

changoplatanero 2 hours ago [-]
I wish there was information about who at Facebook received this information and “used” it. I suspect it was mixed in with 9 million other sources of information and no human at Facebook was even aware it was there.
xnorswap 1 hours ago [-]
Is your argument that it's fine to just collect so much information that you can't possibly responsibly handle it all?

In my opinion, that isn't something that should be allowed or encouraged.

Espressosaurus 1 hours ago [-]
So they shouldn’t be punished because they were negligent? Is that your argument?
pc86 1 hours ago [-]
I think their argument is that FB has a pipeline that processes whatever data you give it and the idea that a human being made the conscious decision to use this data is almost certainly not what happened.

"This data processing pipeline processed the data we put in the pipeline" is not necessarily negligence unless you just hate Facebook and couldn't possibly imagine any scenario where they're not all mustache-twirling villains.

qwertylicious 1 hours ago [-]
Yeah, sorry, no, I have to disagree.

We're seeing this broad trend in tech where we just want to shrug and say "gee whiz, the machine did it all on its own, who could've guessed that would happen, it's not really our fault, right?"

LLMs sharing dangerous false information, ATS systems disqualifying women at higher rates than men, black people getting falsely flagged by facial recognition systems. The list goes on and on.

Humans built these systems. Humans are responsible for governing those systems and building adequate safeguards to ensure they're neither misused nor misbehave. Companies should not be allowed to tech-wash their irresponsible or illegal behaviour.

If Facebook did indeed build a data pipeline and ad targeting system that could blindly accept and monetize illegally acquired data without any human oversight, then Facebook should absolutely be held accountable for that negligence.

pc86 1 hours ago [-]
What does the system look like where a human being individually verifies every piece of data fed into an advertising system? Even taking the human out of the loop, how do you verify the "legality" of one piece of data vs. another coming from the same publisher?

None of your examples have anything to do with the thing we're talking about, and they are just meant to inflame emotional opinions rather than engender rational discussion about this issue.

qwertylicious 1 hours ago [-]
That's not my problem to solve?

If Facebook chooses to build a system that can ingest massive amounts of third party data, and cannot simultaneously develop a system to vet that data to determine if it's been illegally acquired, then they shouldn't build that system.

You're running under the assumption that the technology must exist, and therefore we must live with the consequences. I don't accept that premise.

Edit: By the way, I'm presenting this as an all-or-nothing proposition, which is certainly unreasonable, and I recognize that. KYC rules in finance aren't a panacea. Financial crimes still happen even with them in place. But they represent a best effort, if imperfect, attempt to acknowledge and mitigate those risks, and based on what we've seen from tech companies over the last thirty years, I think it's reasonable to assume Facebook didn't attempt similar diligence, particularly given a jury trial found them guilty of misbehaviour.

> None of your example have anything to do with the thing we're talking about, and are just meant to inflame emotional opinions rather than engender rational discussion about this issue.

Not at all. I'm placing this specific example in the broader context of the tech industry a) failing to consider the consequences of its actions, and b) escaping accountability.

That context matters.

myaccountonhn 27 minutes ago [-]
I often think about what having accountability in tech would entail. These big tech companies only work because they can neglect support and any kind of oversight.

In my ideal world, platforms and their moderation would be more localized, so that individuals would have more power to influence it and also hold it accountable.

decisionsmatter 56 minutes ago [-]
It's difficult for me to parse what exactly your argument is. Facebook built a system to ingest third party data. Whether you feel that such technology should exist to ingest data and serve ads is, respectfully, completely irrelevant. Facebook requires any entity (e.g. the Flo app) to gather consent from their users to send user data into the ingestion pipeline, per the terms of their SDK. The Flo app, in a phenomenally incompetent and negligent manner, not only sent unconsented data to Facebook, but sent -sensitive health data-. Facebook then did what Facebook does best, which is ingest this data _that Flo attested was not sensitive and was collected with consent_ into their ads systems.
qwertylicious 50 minutes ago [-]
So let's consider the possibilities:

#1. Facebook did everything they could to evaluate Flo as a company and the data they were receiving, but they simply had no way to tell that the data was illegally acquired and privacy-invading.

#2. Facebook had inadequate mechanisms for evaluating their partners, and that while they could have caught this problem they failed to do so, and therefore Facebook was negligent.

#3. Facebook turned a blind eye to clear red flags that should've caused them to investigate further, and Facebook was malicious.

Personally, given Facebook's past extremely egregious behaviour, I think it's most likely a combination of #2 and #3: inadequate mechanisms to evaluate data partners, plus conveniently ignoring signals that the data was ill-gotten, making Facebook negligent if not malicious. In either case Facebook should be held liable.

pc86 is taking the position that the issue is #1: that Facebook did everything they could, and still, the bad data made it through because it's impossible to build a system to catch this sort of thing.

If that's true, then my argument is that the system Facebook built is too easily abused and should be torn down or significantly modified/curtailed as it cannot be operated safely, and that Facebook should still be held liable for building and operating a harmful technology that they could not adequately govern.

Does that clarify my position?

decisionsmatter 31 minutes ago [-]
No one is arguing that FB has not engaged in egregious and illegal behavior in the past. What pc86 and I are trying to explain is that in this instance, based on the details of the court docs, Facebook did not make a conscious decision to process this data. It just did. Because this data, combined with the billion+ data points that Facebook receives every single second, was sent to Facebook with the label that it was "consented and non-sensitive data," when it most certainly was unconsented and very sensitive health data. But this is the fault of Flo. Not Facebook.

You could argue that Facebook should be more explicit in asking developers to self-certify and label their data correctly, or not send it at all. You could argue that Facebook should bolster their signal detection when it receives data from a new app for the first time. But to argue that a human at Facebook blindly built a system to ingest data illegally without any attempt to prevent it is a flawed argument, as there are many controls, many disclosures, and (I'm sure) many internal teams and systems designed exactly for the purpose of determining whether the data they receive has the appropriate consents (which, according to what Flo sent them, it did). This case is very squarely #1 in your example and maybe a bit of #2.

shkkmo 3 minutes ago [-]
>Facebook did not make a conscious decision to process this data.

Yes, it did. When Facebook built the system and allowed external entities to feed it unvetted information without human oversight, that was a choice to process this data.

> without any attempt to prevent it is a flawed argument, as there are many controls, many disclosures, and (I'm sure) many internal teams and systems designed exactly for the purpose of determining whether the data they receive has the appropriate consents

This seems like a giant assumption to make without evidence. Given the past bad behavior from Meta, they do not deserve this benefit of the doubt.

If those systems exist, they clearly failed to actually work. However, the court documents indicate that Facebook didn't build out systems to check if stuff is health data until afterwards.

ryandrake 16 minutes ago [-]
If FB is going to use the data, then it should have the responsibility to check whether they can legally use it. Having their supplier say "It's not sensitive health data, bro, and if it is, it's consented. Trust us" should not be enough.

To use an extreme example, if someone posts CSAM through Facebook and says "It's not CSAM, trust me bro" and Facebook publishes it, then both the poster and Facebook have done wrong and should be in trouble.

Capricorn2481 8 minutes ago [-]
> Facebook did not make a conscious decision to process this data. It just did.

What everyone else is saying is that what they did is illegal, and that they did it automatically, which is worse. The system you're describing was, in fact, built to do that. They are advertising to people based on the honor system of whoever submits the data pinky-promising it was consensual. That's absurd.

changoplatanero 31 minutes ago [-]
"doing everything they could" is quite the high standard. Personally, I would only hold them to the standard of making a reasonable effort.
qwertylicious 3 minutes ago [-]
Yup, fair. I tried to acknowledge that in my paragraph about KYC in a follow-up edit to one of my earlier comments, but I agree, the language I've been using has been intentionally quite strong, and sometimes misleadingly so (I tend to communicate using strong contrasts between opposites as a way to ensure clarity in my arguments, but reality inevitably lands somewhere in the middle).
ramonga 29 minutes ago [-]
I would expect an app with 150 million active users to trigger some kind of compliance review in Meta
Capricorn2481 28 minutes ago [-]
This is the argument companies use for having shitty customer support. "Our business is too big for our small support team."

Why are you scaling up a business that can't refrain from fucking over customers?

bluGill 2 hours ago [-]
I would say you have a responsibility to ensure you are getting legal data. You don't buy stolen things. That is, meta has a responsibility to ensure that they are not partnering with crooks. Flo gets the largest blame but meta needs to show they did their part to ensure this didn't happen. (I would not call terms of use enough unless they can show they make you understand it)
gruez 2 hours ago [-]
>Flo gets the largest blame but meta needs to show they did their part to ensure this didn't happen. (I would not call terms of use enough unless they can show they make you understand it)

Court documents say that they blocked access as soon as they were aware of it. They also "built out its systems to detect and filter out 'potentially health-related terms.'" Are you expecting more, like some sort of KYC/audit regime before you could get any API key? Isn't that the exact sort of stuff people were railing against, because indie/OSS developers were being hassled by the Play Store to undergo expensive audits to get access to sensitive permissions?

hedgehog 11 minutes ago [-]
Facebook chose to pool the data they received from customers and allow its use by others, so they are also responsible for the outcomes. If it's too hard to provide strong assurance that errors like Flo's won't result in adverse outcomes for the public, perhaps they should have designed a system that didn't work that way.
deadbabe 2 hours ago [-]
If Flo accepted the terms of use, then it means they understand it.

Really the only blame here should be on Flo.

benreesman 2 hours ago [-]
I tend to agree in this instance. But this is why you don't build a public brand of doing shit very much like this constantly.

Innocent until proven guilty is the right default, but at some point when you've been accused of misconduct enough times? No jury is impartial.

HeavyStorm 3 hours ago [-]
That's why in these cases you'd prefer a judgment without a jury. Technical cases like this will always confuse jurors, who can't be expected to understand details about SDKs, data sharing, APIs, etc.

On the other hand, in a number of high-profile tech cases, you can see judges learning and discussing the engineering at a deeper level.

zahlman 2 hours ago [-]
> Technical cases like this will always confuse jurors... On the other hand, in a number of high-profile tech cases, you can see judges learning and discussing the engineering at a deeper level.

Not to be ageist, but I find this highly counterintuitive.

pc86 1 hours ago [-]
Judges aren't necessarily brilliant, but they do spend their entire careers reading, listening to, and dissecting arguments. A large part of this requires learning new information at least well enough to make sense of arguments on both sides of the issue. So you do end up probably self-selecting for older folks able to do this better than the mean for their age, and likely for the population at large.

Let's just say with a full jury you're almost guaranteed to get someone on the other side of the spectrum, regardless of age.

BobaFloutist 60 minutes ago [-]
The judge is at their job. The jury is made up of conscripts who are often paying a financial penalty to be present.
willsmith72 1 hours ago [-]
how exactly? you expect the average joe to have a better technical understanding, and more importantly ability to learn, than a judge? that is bizarre to me
zahlman 1 hours ago [-]
I expect the average joe to use technology much more than a judge.
echoangle 3 hours ago [-]
Is it easier for the prosecution to make the jury think Facebook is guilty or for Facebook to make the jury think they are not? I don’t see why one would be easier, except if the jury would be prejudiced against Facebook already. Or is it just luck who the jury sides with?
dylan604 2 hours ago [-]
I'd imagine Facebook would want any potential juror in tech dismissed as quickly as possible, while the prosecution would be looking to seat as many tech jurors as they could luck their way into seating.
azemetre 2 hours ago [-]
I mean, it totally depends what your views on democracy are. Juries are one of the few practices, likely the only one, taken from Ancient Athenian democracy that was truly led by the people. The fact that juries still work this way is a testament to the practice.

With this in mind, I personally believe groups will always come to better conclusions than individuals.

Being tried by 12 instead of 1 means more diversity of thought and opinion.

mrkstu 2 hours ago [-]
My understanding is defendants always get to choose, no? So that was an available option they chose not to avail themselves of.
at-fates-hands 34 minutes ago [-]
>> Technical cases like this will always confuse jurors.

This has been an issue since the internet was invented. It's always been the duty of the lawyers on both sides to present the information in cases like this in a manner that is understandable to the jurors.

I distinctly remember that during the OJ case, the media said many issues were most likely presented in such a detailed manner that many of the jurors seemed to check out. At the time, the prosecution spent days just on the DNA evidence. In contrast, the defense spent days just on how the LAPD collected evidence at the crime scene, with the same effect: many on the jury seemed to check out the deeper the defense dug into it.

So it's not just technical cases; any kind of court case that requires a detailed understanding of anything complex comes down to how the lawyers present it to the jury.

nikanj 2 hours ago [-]
Suing Facebook instead of Flo makes perfect sense, because Facebook has much more money. Plus juries are more likely to hate FB than a random menstruation company.
mattmcknight 1 hours ago [-]
They sued both.
1oooqooq 2 hours ago [-]
[flagged]
zahlman 2 hours ago [-]
> Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.

> When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."

> Please don't fulminate. Please don't sneer, including at the rest of the community.

kubb 5 hours ago [-]
Whenever you think of a court versus Facebook, imagine one of these mini mice trying to stick it to a polar bear. Or a goblin versus a dragon, or a fly versus an elephant.

These companies are for the most part effectively outside of the law. The only time they feel pressure is when they can lose market share, and there's risk of their platform being blocked in a jurisdiction. That's it.

potato3732842 4 hours ago [-]
>These companies are for the most part effectively outside of the law

You have it wrong in the worst way. They are wholly inside the law because they have enough power to influence the people and systems that get to use discretion to determine what is and isn't inside the law. No amount of screeching about how laws ought to be enforced will affect them because they are tautologically legal, so long as they can afford to be.

HPsquared 4 hours ago [-]
It's one of those "I'm not trapped here with you; you're trapped here with me" type things.
entropi 4 hours ago [-]
I think this situation is described best as being "above" the law.
kubb 4 hours ago [-]
Pedantic, but fair. You're right.
Dylan16807 4 hours ago [-]
All they need to do is impose a three digit fine per affected user and Facebook will immediately feel intense pressure.
akudha 4 hours ago [-]
$1 for the first user, $2 for the second, $4 for the third... By the 30th user, it would be painful even for mega corps. By the 40th, it would be an absurd number.
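
Working that doubling out (a quick sketch, just to show the scale):

    # Fine doubles per affected user: $1, $2, $4, ..., i.e. 2**(n-1) for user n.
    def cumulative_fine(users: int) -> int:
        # Sum of the geometric series 1 + 2 + ... + 2**(users-1)
        return 2**users - 1

    for n in (30, 40):
        print(n, f"per-user ${2**(n - 1):,}", f"cumulative ${cumulative_fine(n):,}")
    # 30: per-user $536,870,912       cumulative $1,073,741,823      (~$1 billion)
    # 40: per-user $549,755,813,888   cumulative $1,099,511,627,775  (~$1.1 trillion)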

Might also be worth trying to force them to display a banner on every page of the site "you're on facebook, you have no privacy here", like those warnings on cigarette boxes. These might not work though, people would just see and ignore them, just like smokers ignore warnings about cigarettes.

dylan604 2 hours ago [-]
But these users were NOT on Facebook. It was an app using the FB SDK. So the apps that use these SDKs should put up large banners clearly identifying who they are sharing data with. Some of these apps/sites are sharing with >100 third parties. It is outrageous.
codegladiator 4 hours ago [-]
Three digits? The only thing these folks understand is exponential growth per affected user.
Dylan16807 14 minutes ago [-]
Yes, three digits. That would be 15 to 150 billion dollars, and Facebook would understand that amount.
bell-cot 3 hours ago [-]
Who's this "they" you speak of, and why would they bother doing that?
Dylan16807 16 minutes ago [-]
The court. Because it's their job.

I'm not using "fine" very literally. Damages paid to the victims.

lemonberry 4 hours ago [-]
The worst part for me personally is that almost everyone I know cares about this stuff and yet they keep all of their Meta accounts. I really don't get it and frankly, find it kind of disturbing.

I know people that don't see anything wrong with Meta so they keep using it. And that's fine! Your actions seem to align with your stated values.

I get human fallibility. I've been human for awhile now, and wow, have I made some mistakes and miscalculations.

What really puts a bee in my bonnet though is how dogmatic some of these people are about their own beliefs and their judgement of other people.

I love people, I really do. But what weird, inconsistent creatures we are.

kubb 4 hours ago [-]
Voting with your feet doesn't work if you don't have a place to go. People are afraid of losing their connections, which are some of the most precious things we have. Doesn't matter if it's an illusion, that's enough. Zuck is holding us hostage on our most basic human instincts. I think that's fucked up.
barbazoo 36 minutes ago [-]
I keep sharing stories like this with them. Privacy violations, genocide, mental health, … Whenever I think it might be something someone cares about, I share it with them. I also make an effort to explain to my non-tech folks that Meta is Facebook, Instagram, and WhatsApp, to make sure they recognize the name. Many people do not know what Meta is. Sometimes I suspect the rename was a way to capture the bad publicity and protect their brands.
A4ET8a8uTh0_v2 4 hours ago [-]
Eh, I care and I don't do it, but my wife does. I do not agree with her choices in that area and voice my concerns in a way that I hoped would speak to her, but it does not work, as it is now a deeply ingrained habit.

I, too, have vices she tolerates so I don't push as hard as I otherwise would have, but I would argue it is not inconsistency. It is a question of what level of compromise is acceptable.

bossyTeacher 4 hours ago [-]
> The worst part for me personally is that almost everyone I know cares about this stuff and yet they keep all of their Meta accounts.

They care as much as people who claim to care about animals but still eat them, or people who claim to love their wives and still beat or cheat on them. Your actions are the sole embodiment of your beliefs.

fHr 4 hours ago [-]
Roblox lul
ajsnigrutin 5 hours ago [-]
Everybody blames facebook, no one blames the legislators and the courts.

Stuff like this could easily lead to multi-billion dollar fines, and stuff that affects more users maybe even fines in the trillion range. If government workers came to pick up servers, chairs, and projectors from company buildings to sell at auction, because there was not enough liquid value in the company to pay the fines, they (well, the others) would reconsider quite fast and stop the illegal activities.

favflam 4 hours ago [-]
Sarah Wynn-Williams testified before the US Congress as to Facebook's strategies for handling governments. Based on her book, it seems Brazil has been the most effective of the major democratic governments in confronting Facebook. Of course, you have China completely banning Facebook.

I think Mark Zuckerberg is acutely aware of the political power he holds and has been using this immense power for at least the last decade. But since Facebook is a US company and the US government is not interested in touching Facebook, I doubt anyone will see what Zuckerberg and Facebook are up to. The US would have to put Lina Khan back in at the FTC, or put her high up in the Department of Justice, to split Facebook into pieces. I guess the other hope is that states' attorneys general win an anti-monopoly lawsuit.

kubb 4 hours ago [-]
Don't get me wrong, I don't "blame Facebook". I lament the environment that empowers Facebook to exist and do harm. These companies should be gutted by the state, but they won't because they pump the S&P.
FirmwareBurner 5 hours ago [-]
[flagged]
kubb 5 hours ago [-]
Funny, but this kinda implies that some person designed it this way. It's a resultant sum of small vectors, with corporate lobbying playing a significant role. Corporate lobbying systemically can't do anything else than try to increase profits, which usually means less regulation. Clean slate design would require a system collapse.
graemep 4 hours ago [-]
> Corporate lobbying systemically can't do anything else than try to increase profits, which usually means less regulation.

Corporate lobbying can be for more regulation. It can disadvantage competitors. Zuckerberg has spoken in favour of greater regulation of social media in the past. The UK's Online Safety Act creates barriers to entry and provides an excuse for more tracking. I can think of examples, some acknowledged by the CEOs of the companies involved, ranging from British pubs to American investment banks.

moolcool 4 hours ago [-]
When Facebook releases an AI Model for free: "Based Facebook. Zuckerberg is a genius visionary"

When Facebook does something unforgivable: "It's a systemic problem. Zuck is just a smol bean"

kubb 4 hours ago [-]
Zuck can take his model onto his private island and talk to it instead of trying to be a normal human being.

Don't conflate me with the personality worshippers on HN, I'm not one of them, even though it seems like it to you because I also post here. You won't find a single instance of me glazing tech leaders.

FirmwareBurner 4 hours ago [-]
What's with this reductionist logic? Nothing is ever 100% good or 100% evil, everything is on a spectrum.

So just because Zuck does some good stuff for the tech world doesn't mean his work isn't a net negative to society.

moolcool 4 hours ago [-]
> doesn't mean his work isn't a net negative to society

Oh he absolutely is.

I'm just saying that it's common in this community to attribute the achievements of big companies to leadership (e.g. the mythology of Steve Jobs), but to dismiss all the evil stuff as "systemic issues".

exe34 4 hours ago [-]
> Funny, but this kinda implies that some person designed it this way

How do you get to that implication? I'm missing a step or two I think...

kubb 4 hours ago [-]
From "do you want X? this is how you get X". This invokes an image of talking to a person who decided the how, because they can be questioned on whether they want the X.
lazide 5 hours ago [-]
I once ran across Zuckerberg in a Palo Alto cafe. I only noticed him (I was in the process of ordering a sandwich, and don’t really care about shit like that) because he was being ‘eeeeeeee’d’ by a couple of random women that he didn’t seem to know. He seemed pretty uncomfortable about the whole thing. One of them had a stroller which she was profoundly ignoring during the whole thing, which I found a bit disturbing.

The next time I saw him in Palo Alto (a couple months later on the street), he had 2 totally-not-security-dudes flanking him, and I saw at least one random passerby ‘redirected’ away from him. This wasn’t at the cafe though, it wouldn’t surprise me if he didn’t go there again.

This was a decade before Luigi. Luigi was well after meta was in the news for spending massive amounts of money on security and Zuck had a lot of controversy for his ‘compound’ in PA.

I can assure you, Meta is well aware of the situation, and a Luigi isn’t going to have a chance in this situation.

The reality in my experience is that any random person given the amount of wealth these folks end up with would end up making similar (or worse) decisions, and while contra-pressure from Luigis is important in the overall system, folks like Zuckerberg are more a result of the system and rules than the cause of them (but they then influence the next system/rules, in a giant Ouroboros type situation).

Kind of a "we either die young a hero, or live to be the villain" kind of thing. But the only reason anyone dies a young hero is because they lost the fight against the prior old villains. If they'd won (even in a heroic fashion), life would turn them into the old villains shortly.

The wheel turns.

FirmwareBurner 4 hours ago [-]
> he was being ‘eeeeeeee’d’ by a couple of random women

Maybe I'm too old, but what in the world does being eeee'd mean?

>I can assure you, Meta is well aware of the situation, and a Luigi isn’t going to have a chance in this situation.

With all due respect, Luigi was just a CS student with a six pack, a self-made gun, and an aching back, on a mission.

The Donald himself nearly got got by his ear while he had the secret service of the US of A to protect him, not some private goons for hire, and that was just a random redditor with a rifle, not a professional assassin.

So what would happen if, let's say, Meta's algorithms push a teenage girl to kill herself by exploiting her self esteem issues to sell her more beauty products, and her ex-Navy SEAL dad, with nothing more to lose, grabs his McMillan TAC-338 boom stick and makes it his life's mission to avenge his lost daughter at the expense of his own? Zuck would need to be lucky every time, but that bad actor would only need to be lucky once.

I'm not advocating for violence btw, my comment was purely hypothetical.

potato3732842 4 hours ago [-]
Pretty much anyone without presidential quality security clearing the place ahead of them stands to get clapped Franz Ferdinand style by anyone dedicated enough to camp out waiting.
lazide 4 hours ago [-]
And yet, Mr. Trump is up there trolling the world like he loves to do, and Zuck is out there doing whatever he wants.

The reality is, all those ex-Navy SEAL dads are (generally) wishing they could make the cut to get on those dudes' payroll, not gunning for them. Or sucking up to the cult, in general.

The actual religious idea of Karma is not ‘bad things happen to bad people right now’, the way we would like.

Rather ‘don’t hate on king/priest/rich dude, they did something amazing in a prior life which is why they deserve all this wealth right now, and if they do bad things, they’ll go down a notch - maybe middle class - in the next life’.

It’s to justify why people end up suffering for no apparent reason in this life (because they had to have done something really terrible in a prior life), while encouraging them to do good things still for a hopefully better next life (if you think unclogging Indian sewers in this life is bad, you could get reincarnated as a roach in that sewer in the next life!). So they don’t go out murdering everyone they see, even if they get shit on constantly.

There is no magic bullet. Hoping someone else is going to solve all your problems is exactly how manipulative folks use you for their own purposes. And being a martyr to go after some asshole is being used that way too.

This is also why eventually an entire generation of hippies turned into accountants in the 80’s.

shrug

s5300 4 hours ago [-]
[dead]
lightedman 4 hours ago [-]
"I can assure you, Meta is well aware of the situation, and a Luigi isn’t going to have a chance in this situation."

Luigi was a dude with a 3D printed gun.

I have LASERs with enough power to self-focus, have zero ballistic drop, and can dump as much power as a .50cal BMG in a millisecond burst of light which can hit you from the horizon's edge. All Zuck needs to do is stand by a window, and eyeballs would vaporize.

landl0rd 4 hours ago [-]
Mangione is going to either die rotting in prison or, preferably, get sent to the electric chair. His life will be wasted. Meanwhile, UNH is continuing to do business as usual. One way or the other, Mangione will die knowing his life was wasted, and that his legacy is not reform but cold-blooded murder.

Call it a “day of rage” or just babyrage but we build systems so our bus factor can increase above 1. Just killing people no longer breaks them. It makes someone nothing more than a juvenile murderer.

I don’t really care what lasers you have, I’d suggest you choose a different legacy for yourself.

FirmwareBurner 4 hours ago [-]
>His life will be wasted.

His life was already wasted due to his medical condition. Don't ever bet against people with nothing to lose.

s5300 4 hours ago [-]
[dead]
fHr 4 hours ago [-]
FBI open up
hobs 4 hours ago [-]
It's not the only way. The oppressed do not need to become the oppressor; it's just the simplest rut for the wheel to turn in.
lazide 4 hours ago [-]
Sure, they can stay the oppressed?

Using the entropic model you seem to indicate (which I also favor), us vs them seems to be the lowest energy state.

It’s certainly possible to not be there at any given time, but seems to require a specific and somewhat unique set of circumstances, which are not the most energetically stable.

mschuster91 4 hours ago [-]
> I only noticed him (I was in the process of ordering a sandwich) because he was being ‘eeeeeeee’d’ by a couple of random women that he didn’t seem to know. He seemed pretty uncomfortable about the whole thing.

Pretty funny considering that Facebook's origin story was a women comparison site, or this memorable quote:

> People just submitted it. I don't know why. They 'trust me'. Dumb fucks.

lazide 4 hours ago [-]
Have you ever ordered a really good steak, like amazing. And really huge, and inexpensive too.

And it really is amazing! And super tasty.

But it’s so big, and juicy, that by the end of it you feel sick? But you can’t stop yourself?

And then at the end of it, you’re like - damn. Okay. No more steak for awhile?

If not steak, then substitute cake. Or Whiskey.

Just because you got what you wanted doesn’t mean you’re happy with all the consequences, or can stomach an infinitely increasing quantity of it.

Of course, he can pay to mitigate most of them, and he gets all the largest steaks he could want now, so whatever. I’m not going to cry about it. I thought it was interesting to see develop however.

mschuster91 3 hours ago [-]
Personally, I see it as poetic justice. He started off objectifying women with FaceMash; he doesn't get to cry about being objectified and drooled over himself.
j33zusjuice 5 hours ago [-]
[flagged]
thisisit 2 hours ago [-]
5 years ago I was researching the iOS app ecosystem. As part of that exercise I was looking at the potential revenue figures for some free apps.

One developer had a free app to track some child health data. It was a long time ago, so I don't remember the exact data being collected. But when asked about the economics of his free app, the developer felt confident about a big pay day.

As per him, the app's worth was in the data being collected. I don't know what happened to the app, but it seemed that app developers know what they are doing when they invade the privacy of their users under the guise of a "free" app. After that I became very conscious about disabling as many permissions as possible and especially about not using apps to store any personal data, particularly health data.

SoftTalker 42 minutes ago [-]
I don't understand why anyone would let these psychopathic corporations have any of their personal or health data. Why would you use an app that tracked health data, or a wearable device from any of these companies that did the same? You have to assume, based on their past behavior, that they are logging every detail and that it's going to be sold and saved in perpetuity.
dm319 22 minutes ago [-]
I guess because people want to track some things about their health, and people provide good pieces of software with a simple UI to do it, and this is more useful than, say, writing it down in a notebook, or in a text file or notes app.

I guess also people feel that corporations _shouldn't_ be allowed to do bad things with it.

Sadly, we already know from experience over the last 20 years that many people don't care about what information they give to large corporations.

However, I do see more and more people increasingly concerned about their data. They are still mainly people involved in tech or related disciplines, but this is how things start.

comrade1234 5 hours ago [-]
I don't think many of you read the article... the Flo app is the one in the wrong here, not meta. The app people were sending user data to meta with no restrictions on its use, whatever the court ruled.
PunchTornado 4 hours ago [-]
> The app people were sending user data to meta with no restrictions on its use

And then meta accessed it. So unless you put restrictions on data, meta is going to access it. Don't you think it should be the other way around, with Meta asking for permission? Then we wouldn't have this sort of thing.

gruez 4 hours ago [-]
Do you think AWS should ask for permission before processing some random B2C app user's data?
SoftTalker 41 minutes ago [-]
If they are going to add it to a person's profile and/or sell ads based on it, yes.
paintbox 4 hours ago [-]
From the article: "The jury ruled that Meta intentionally “eavesdropped on and/or recorded their conversations by using an electronic device,” and that it did so without consent."

If AWS wanted to eavesdrop and/or record conversations of some random B2C app user, for sure they would need to ask for permission.

gruez 4 hours ago [-]
If you read the court documents, "eavesdropped on and/or recorded" basically meant "flo used facebook's SDK to send analytics events to facebook". It's not like they were MITMing connections to flo's servers.

https://www.courtlistener.com/docket/55370837/1/frasco-v-flo...

Spivak 37 minutes ago [-]
I think it's a distinction without a difference. To make it more obvious, imagine it was one of those AI assistant devices that records your conversations so you can recall them later. It's plainly obvious that accessing this data for any purpose other than servicing user requests is morally equivalent to eavesdropping on a person's conversations in the most traditional sense.

If the company sends your conversation data to Facebook, that's bad and certainly a privacy violation, but at that point nothing has actually been done with the data yet. Then Facebook accesses the data and folds it into their advertising signals; they have now actually looked at the data and acted on the information within. And that, to me, is eavesdropping.

raverbashing 3 hours ago [-]
Here's the restriction: don't send it to fb in the first place!
PunchTornado 1 hours ago [-]
here's another one: fb shouldn't use every piece of data they can collect.
pllbnk 4 hours ago [-]
Everybody misses the key information here - it's a Belarusian app. The CEO and CTO are Belarusian (probably there are more C-level people who are Belarusian or Russian). Not only are users giving up their private information, but they are doing so to malevolent (by definition) regimes.

When the Western app says they don't sell or give out private information, you can be suspicious but still somewhat trustful. When a dictator-ruled country's app says so, you can be certain every character you type in there is logged and processed by the government.

achempion 2 hours ago [-]
I would encourage you to read about the Edward Snowden guy and the PRISM program on Wikipedia, and about the EU's most recent attempts to ban encryption.

Also, here is what Pavel Durov mentioned recently in an interview with Tucker Carlson:

> In the US you have a process that allows the government to actually force any engineer in any tech company to implement a backdoor and not tell anyone about it with using this process called the gag order.

It doesn't matter what anyone claims on the landing page. Assume that if it's stored somewhere, it'll get leaked eventually, and the transiting/hosting government already has access and the decryption keys.

pllbnk 2 hours ago [-]
You are right. I still think it's better if only our guys have this information than both our guys and their guys. At least Western companies have the possibility of getting regulated if political winds change.
ramanh 4 hours ago [-]
The company cut all ties with Belarus more than three years ago, and all employees relocated to Europe.
graemep 3 hours ago [-]
Where in Europe? Belarus is in Europe, and so is much of Russia (the largest European country). Plenty of variation in the rest of Europe.

What do you mean by cut all ties? The owners and management have no assets in Belarus or ties to the country?

ramanh 3 hours ago [-]
You can open the "contact us" page on their website.
graemep 2 hours ago [-]
Not sure how that helps answer my question.

A list of contact addresses is not a list of all locations, or all employees, or all contractors, or all shareholders, or all financial interests.

The one thing the site tells me is that it is operated by two separate companies - Flo Inc and Flo Health UK. The directors of Flo Health Limited live in the UK and Cyprus; two are Belarusian nationals and one is Russian.

pllbnk 3 hours ago [-]
I can only cite myself to emphasize the point that they didn’t:

> CEO and CTO are Belarusian (probably there are more C-level people who are Belarusian or Russian).

Actually, a quick Google search shows Slavic (either Russian or Belarusian) names for the CFO and CMO. Changing physical location means very little these days.

Chris2048 2 hours ago [-]
It looks like many of them now live outside Belarus; should they have changed their names, and/or fired any Slavic nationals?

* Dmitry Gurski; CEO

* Tamara Orlova; CFO

* Anna Klepchukova; Chief Medical Officer

* Kate Romanovskaia; Chief Brand & Communications Officer

* Joëlle Barthel; Director of Brand Marketing

* Nick Lisher (British); Chief Marketing Officer

Culonavirus 42 minutes ago [-]
> When the Western app says they don’t sell or give out private information, you can be suspicious but still somewhat trustful.

Hey guys, that ycombinator "hacker" forum thing full of Champagne socialists employed by the Zucks/Altmans/Musks of the world told me everything is fine and I shouldn't worry. I remain trustful.

Surely some, ahem, spilled tea can't possibly occur again, right? I remain trustful.

Speaking of tea, surely all the random "id verification" 3rd parties used since the UK had a digital aneurysm have everything in order, right? I remain trustful.

---

Nah, I'll just give my data to my bank and that's about it. Everyone else can fuck right off. I trust Facebook about as much as I trust Putin.

everdrive 5 hours ago [-]
Don't use apps. It's as simple as that. 95% of the time they are not worth the incredible privacy invasion they impose on users.
amarcheschi 4 hours ago [-]
Mozilla did a comparison of period tracking apps, and there are some that should respect users' privacy.

https://www.mozillafoundation.org/en/privacynotincluded/cate...

zahlman 2 hours ago [-]
Even beyond that, I expect software developers to prove to me that an Internet connection is necessary for whatever it is they're trying to do.
setsewerd 4 hours ago [-]
Pardon my ignorance, but can't you just solve this by disabling location permissions, etc for a given app?
everdrive 4 hours ago [-]
You can -- the real problem here is that each app could violate your privacy in different ways. Unless you break TLS and inspect all the traffic coming from an app (and do this over time, since what data is sent will change over time), you don't really know what your apps are stealing from you. For sure, many apps are quite egregious in this regard while some are legitimately benign. But do you, as a user, have a real way to know this authoritatively, and to keep up with changes in the ecosystem? My argument would be that even security researchers don't have time to really do a thorough job here, and users are forced to err on the side of caution.
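
If you do want to go the TLS-inspection route, a minimal sketch with mitmproxy (assuming the device is routed through the proxy and trusts mitmproxy's CA; apps that pin certificates will defeat this, which is part of the point):

    # health_sniff.py -- run with: mitmproxy -s health_sniff.py
    # Logs any outgoing request whose URL or body contains likely health-related terms.
    from mitmproxy import http

    SUSPECT_TERMS = ("period", "cycle", "pregnan", "ovulat", "menstrua")

    def request(flow: http.HTTPFlow) -> None:
        text = flow.request.pretty_url + " " + (flow.request.get_text(strict=False) or "")
        if any(term in text.lower() for term in SUSPECT_TERMS):
            print(f"[suspect] {flow.request.method} {flow.request.pretty_url}")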
throwaway290 4 hours ago [-]
What they do then is create an app where location access is necessary, make that app spin up a localhost server, then add JS to facebook.com and every site with a Like button to phone that localhost and basically deanonymize everyone.
cnity 4 hours ago [-]
How could this possibly work without port forwarding?
mzajc 4 hours ago [-]
2 months ago: https://news.ycombinator.com/item?id=44169115.

Of course Facebook's JS won't add itself to websites, so half of the blame goes to webmasters willingly sending malware to browsers.

throwaway290 2 hours ago [-]
It happens on the same device. No forwarding necessary. And it was documented to happen; the story was on HN.
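
Schematically, and as I understand the reporting (the port, path, and payload below are illustrative; the reported implementations used WebRTC tricks rather than plain HTTP to get around browser restrictions, and the native side lived inside mobile SDK code, not Python): the installed app listens on a loopback port, and tracker JavaScript served to ordinary web pages posts the browser-side identifier to that port, so the SDK owner can join the web session to the logged-in app identity.

    # Schematic only: the "native app" half of a localhost bridge, sketched in Python.
    # Tracker JS embedded in a web page could then do something like:
    #   fetch("http://127.0.0.1:12387/bridge", {method: "POST", body: browserCookieId})
    # handing the browser identifier to whatever app is listening on this port.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class BridgeHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            web_id = self.rfile.read(length).decode(errors="replace")
            # The app already knows its logged-in user; pairing that identity with
            # the browser identifier deanonymizes the "anonymous" web session.
            print("browser identifier received:", web_id)
            self.send_response(204)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 12387), BridgeHandler).serve_forever()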
fHr 4 hours ago [-]
The sad truth
bell-cot 5 hours ago [-]
True. Unfortunately, users are all humans - with miserably predictable response patterns to "Look at this Free New Shiny Thing you could have!" pitches, and the ruthless business models behind them.
dr-detroit 2 hours ago [-]
[dead]
arkwin 2 hours ago [-]
To any other women in here, check out Drip. https://dripapp.org They seem to be the most secure.
footy 2 hours ago [-]
Honestly, this is something I would just self host. This isn't data I'd trust anyone with, and I don't even have sex with men.
arkwin 44 minutes ago [-]
I think that is the best approach for people who can do that. :)
_fat_santa 2 hours ago [-]
My wife uses Flo, though every time I see her open the app and input information, the tech side of my brain is quite alarmed. An app like that keeps very, very personal information, and it really highlights for me the need to educate non-technical folks on information security.
oxqbldpxo 40 minutes ago [-]
Oh boy, what's Mark up to these days.
12_throw_away 12 minutes ago [-]
Thanks for asking! Also on the front page today: https://news.ycombinator.com/item?id=44898934

From that article:

   “It is acceptable to engage a child in conversations that are romantic or sensual,” according to Meta’s “GenAI: Content Risk Standards. [...] The document seen by Reuters, which exceeds 200 pages, provides examples of “acceptable” chatbot dialogue during romantic role play with a minor. They include: “I take your hand, guiding you to the bed” and “our bodies entwined, I cherish every moment, every touch, every kiss.”
princevegeta89 5 hours ago [-]
It's very rare to see any privacy related news without Meta being involved in the story.
thrance 3 hours ago [-]
Why would an app that tracks menstrual cycles need to integrate with the Facebook SDK?? Pure insanity.
zahlman 2 hours ago [-]
Why would an app that tracks menstrual cycles need to connect to the Internet at all? TFA mentions asking about quite a few other personal things as well. Is the app trying to do more than just tracking? If they're involved in any kind of diagnosis then I imagine there are further legal liability issues....
chubs 4 hours ago [-]
This is really disappointing. I used to have a fertility tracking app on the iOS App Store, zero data sharing, all local thus private. But, people don’t want to pay $1 for an app, and I can’t afford the marketing drive that an investor-backed company such as this has… and so we end up with situations like this. Pity :(
zahlman 2 hours ago [-]
Stories like this one can be the basis for effective marketing. We need to normalize paying $1 (or more, where warranted) for apps that provide value in the form of not doing the things that allow the $0 ones to be $0.
maxehmookau 2 hours ago [-]
No ifs, no buts. Stuff like this deserves ruinous fines for its executives.

Cycle data in the hands of many countries' authorities is outright dangerous. If you're storing healthcare data, it should require an explicit opt-in IN BIG RED LETTERS, every single time, when that data leaves your device.

bell-cot 4 hours ago [-]
For those disinclined to read the article...

> [...] users, regularly answered highly intimate questions. These ranged from the timing and comfort level of menstrual cycles, through to mood swings and preferred birth control methods, and their level of satisfaction with their sex life and romantic relationships. The app even asked when users had engaged in sexual activity and whether they were trying to get pregnant.

> [...] 150 million people were using the app, according to court documents. Flo had promised them that they could trust it.

> Flo Health shared that intimate data with companies including Facebook and Google, along with mobile marketing firm AppsFlyer, and Yahoo!-owned mobile analytics platform Flurry. Whenever someone opened the app, it would be logged. Every interaction inside the app was also logged, and this data was shared.

> "[...] the terms of service governing Flo Health’s agreement with these third parties allowed them to use the data for their own purposes, completely unrelated to services provided in connection with the App,”

Bashing on Facebook/Meta might give a quick dopamine hit, but they really aren't special here. The victims' data was routinely sold, en masse, per de facto industry practices. Victims should assume that hundreds of orgs, all over the world, now have copies of it. Ditto any government or criminal groups which thought it could be useful. :(

cindyllm 4 hours ago [-]
[dead]
josefritzishere 3 hours ago [-]
Zuckerberg does not seem to respect the law. There really should be criminal charges by now.
ChrisArchitect 2 hours ago [-]
Previously: https://news.ycombinator.com/item?id=44763949
itsalotoffun 5 hours ago [-]
I mean... there are simply no repercussions for these companies, and only rivers of money on the other side. The law is laughably inept at keeping them in check. The titans of Surveillance Capitalism don't need to obey laws. CFOs line-item-ing provisional legal settlement fees as (minor) COGS. And us digital serfs, we simply have no rights. Dumb f*cks, indeed.
potato3732842 4 hours ago [-]
The line between big business and the state is blurry and the state wants to advance big business as a means to advance itself. Once you understand this everything makes sense, or as much "sense" as it can.
dkiebd 5 hours ago [-]
Users gave their data to Flo, and Flo then gave it to Meta. What repercussions do you want for Meta?
Etheryte 5 hours ago [-]
Buying stolen goods does not mean they're yours because the seller never had any ownership to begin with. The same applies here, just because there's an extra step in the middle doesn't mean that you have any rights to the data.
Ekaros 5 hours ago [-]
Some percent of their revenue as fine per case. Only way to scare these companies at this point.
j33zusjuice 4 hours ago [-]
A significant portion, too, not fractions of a percent. Frankly, I want the fines to bankrupt them. That’s the point. I want their behavior to be punished appropriately. Killing the company is an appropriate response, imo: FB/Meta is a scourge on society.
pbiggar 5 hours ago [-]
Meta should never have used them. Deeply unethical behaviour
NickC25 2 hours ago [-]
Your mistake was expecting ethical behavior from Mark Zuckerberg.
aboringusername 3 hours ago [-]
Another aspect of this is why Apple/Google let this happen in the first place. GrapheneOS is the only mobile OS I can think of that lets you disable networking on a per-app level. Why does a period tracking app need to send data to meta (why does it even need networking access at all)? Why is there no affordance of user-level choice/control that allows users to explicitly see the exact packets of data being sent off device? It would be trivial for apps to have to present a list of allowed IPs/hostnames for users to consent to (or not); otherwise the app is not allowed on the Play Store.

Simply put, it should not be possible to send arbitrary data off the device without some sort of user consent/control, and to me, this is where the GDPR has utterly failed. I hope one day users are given a legal right to control what data is sent off their device to a remote server, with serious consequences for non-compliance.
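
A sketch of the kind of declaration I mean (nothing like this exists in today's app stores; every name below is hypothetical): the app ships a manifest of destinations plus a stated purpose, the user ticks what they accept, and the OS or store blocks everything else.

    # Hypothetical per-app network manifest plus the check an OS/store could enforce.
    # Purely illustrative of the proposal above; none of these names are real.
    ALLOWED_HOSTS = {
        "api.flo.example": "account sync and backups",
        "crash.flo.example": "crash reports",
        # graph.facebook.com deliberately absent: the user never consented to it.
    }

    def connection_permitted(host: str, user_consented: set) -> bool:
        """Allow a connection only if the host was declared AND the user opted in."""
        return host in ALLOWED_HOSTS and host in user_consented

    consented = {"api.flo.example"}  # what the user actually ticked at install time
    print(connection_permitted("api.flo.example", consented))     # True
    print(connection_permitted("graph.facebook.com", consented))  # False -> blocked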

toast0 1 hours ago [-]
> Why does a period tracking app need to send data to meta (why does it even need networking access at all)?

In case you want to sync between multiple devices, networking is the least hassle way.

> Why is there no affordance of user-level choice/control that allows users to explicitly see the exact packets of data being sent off device? It would be trivial for apps to have to present a list of allowed IPs/hostnames for users to consent to (or not); otherwise the app is not allowed on the Play Store.

I don't know that it ends up being useful, because wherever the data is sent to can also send the data further on.

pbiggar 5 hours ago [-]
Meta truly is the worst company. In almost everything Meta does, it makes the most user-hostile, awful decisions, every single time.

Cambridge Analytica. The Rohingya genocide. Suppressing Palestinian content during a genocide. Damage to teenage (and adult) mental health.

Anyway, I mention this because some friends are building a social media alternative to Instagram: https://upscrolled.com, aiming to be pro-user, pro-ethics, and designed for people, not just to make money.

ivanmontillam 4 hours ago [-]
Your comment started very useful, then it became spam. Great way to lose goodwill.
Chris2048 2 hours ago [-]
Is posting a self-made alternative to meta not consistent with the rest of the post, even actively promoting the vibe?
ivanmontillam 12 minutes ago [-]
A Show HN post would have been more appropriate; this seemed to me opportunistic at best.