top | item 37546068

paulusthe|2 years ago

The magazine I worked for at the time was about to publish an article claiming that DeepMind had failed to comply with data protection regulations when accessing records from some 1.6 million patients to set up those collaborations—a claim later backed up by a government investigation. Suleyman couldn’t see why we would publish a story that was hostile to his company’s efforts to improve health care. As long as he could remember, he told me at the time, he’d only wanted to do good in the world.

In the seven years since that call, Suleyman’s wide-eyed mission hasn’t shifted an inch. “The goal has never been anything but how to do good in the world,” he says via Zoom from his office in Palo Alto, where the British entrepreneur now spends most of his time.

Thanks, I hate him already.

A messianic SV hand waver who doesn't care about anything but his special mission, doesn't care about breaking rules, and reflexively gaslights people who complain. As if "Why don't you support the mission bro?" is a reasonable response to "you should protect people's information."


sdenton4|2 years ago

There is a real argument on the other side, though. We're dealing with technologies that thrive when given access to loads of data. Health data is heavily regulated, and rightly so, but that regulation greatly hinders innovation.

Hell, medical data access problems are bad enough even when we aren't talking about innovation: simple problems in sharing data between different systems/providers lead to bad outcomes all the time.

https://www.techrepublic.com/article/data-quality-in-healthc...

So it's a case where fragmentation and regulation are already leading to bad outcomes for patients, and where innovation is suppressed because of lack of access, especially to population-level data.

Even without AI, imagine being able to identify various kinds of outbreaks by correlating nearby diagnoses in real time, and flagging to the local nurses that there's a serious food poisoning outbreak happening, for their consideration when people call in with early symptoms. We should be able to do this easily.

We should protect people's information, but we also need to build a road to a better tomorrow. The current rules are, in fact, broken, and we need new rules which lead to better outcomes.
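The outbreak-correlation idea above can be sketched in a few lines. This is a toy illustration only: the class name, postcode field, window, and threshold are all invented for the example, not part of any real surveillance system. It flags when many identical diagnoses cluster in one area within a short look-back window.

```python
from collections import Counter, deque
from datetime import datetime, timedelta

WINDOW = timedelta(hours=6)   # look-back window (illustrative value)
THRESHOLD = 5                 # cases of one condition in one area

class OutbreakMonitor:
    """Toy real-time monitor: flag when many similar diagnoses
    cluster in the same postcode within a short time window."""

    def __init__(self):
        self.events = deque()  # (timestamp, postcode, diagnosis)

    def record(self, ts, postcode, diagnosis):
        self.events.append((ts, postcode, diagnosis))
        # Drop events older than the look-back window.
        while self.events and ts - self.events[0][0] > WINDOW:
            self.events.popleft()
        counts = Counter((p, d) for _, p, d in self.events)
        if counts[(postcode, diagnosis)] >= THRESHOLD:
            return f"ALERT: possible {diagnosis} outbreak in {postcode}"
        return None

mon = OutbreakMonitor()
now = datetime(2023, 9, 1, 12, 0)
alert = None
for i in range(5):
    alert = mon.record(now + timedelta(minutes=10 * i),
                       "SW1A", "food poisoning")
print(alert)  # ALERT: possible food poisoning outbreak in SW1A
```

The hard part isn't the correlation logic, of course; it's getting the diagnoses into one place at all, which is exactly where the fragmentation bites.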

paulusthe|2 years ago

"regulations slow innovation" is not a valid reason to ignore any regulation one finds annoying.

That said, my problem isn't that he broke the rules. My problem is that, when confronted about having broken the rules, he lied about it then retreated into "why don't you believe the mission bro?" As if his solution is the only possible solution to the problem.

He's full of himself, doesn't care about rules, and gaslights those that criticize him. His messianic do-gooder-ism is a bullshit marketing cover for him doing what he wants.

Veserv|2 years ago

Well if they are only going to use the data for good purposes and not for nefarious purposes or for sale, then there is no downside to just writing it into their contracts and privacy policy.

Just add an irrevocable guarantee that they will never sell, or transfer to someone who will sell, any data; and that if they do, the company will immediately dissolve and become encumbered with a debt of the highest seniority, equal to all lifetime company revenues, owed to the people whose medical data they have. The C-suite and Board of Directors must also provide a personal financial guarantee equal to their entire compensation package, and must provide sworn testimony yearly that they are engaging in no business deals which include the sale of private medical data.

Since they do not intend to ever use the data for bad purposes, they have nothing to lose by keeping their word. Literally no downside to them since they were not going to do it anyways and it provides peace of mind to the public, a win-win.

BoiledCabbage|2 years ago

I mean, do people no longer have any concept of ethics? And I don't mean this in the abstract sense, I mean literal practical everyday ethics. Understanding the concept of tradeoffs and consequences of actions and the rest.

I feel like we've built a church (or possibly a cult) that has mantras of "Innovation at all costs. Liquidity at all costs..." among a few others. With no view whatsoever as to what the implications are.

And I'm seriously starting to worry about HN and general SV culture. Specifically here on HN, the number of times I've seen a justification end in one of those thought-terminating clichés is legitimately concerning. So much of the reasoning boils down to "this is good because it improves innovation, and because it improves innovation it is good." And not only zero thought on the implications of taking the action suggested, but what seems like an unawareness that one should even consider the consequences of taking the action. It's as if "we've reached the 'innovation is good' stage of the thought state machine, so the state machine should terminate and return success".

It's absolutely mind-boggling to me that anyone could post a comment saying yes, we should give up medical privacy, and not even have a single sentence on the negative consequences of doing that. "Why would one need to think about the negative consequences? It has a positive consequence, so clearly we should do it."

Is it a gap in CS education? General education? Is it the personality type of us engineers? Is it nature? Nurture? Both? Is it social? Others don't step in to provide that feedback when it happens? How do we even approach it?

ethanbond|2 years ago

It’s weird you mention fragmentation and regulation as the culprits when it’s pretty obviously consolidation (to Epic and Cerner) that led to this and it’s regulation (like 21st Century Cures Act) that’s actually undoing it… by requiring that consolidated players can’t disrupt fragmentation efforts (via FHIR).

edgyquant|2 years ago

> There is a real argument on the other side, though

No there isn’t. The rest of your comment can be safely disregarded thanks to you opening with this.

“We need to build a better tomorrow!” We will: the people actually trying to do so within accepted norms. Not SV grifters who’ve destabilized our entire society and ended privacy all for ad revenue.

choppaface|2 years ago

Finance is highly regulated, and SBF also claimed to only want to do good. He even recently had 250 pages of thoughts and memoirs released that underscore his own self-confidence and belief in his own innocence. Should SBF have had more room to innovate?

The argument isn't so much about how to go about technical progress, but about whom to trust (a Suleyman, an SBF, etc.) and how. Some will do the hard work, meticulously build both pre- and post-regulation products, diligently deal with stakeholders, and succeed or fail to move the market. Being comfortable with saying divisive things on the record is a pretty key lapse in rigor.

soxicywn|2 years ago

I feel like part of the problem is that there’s a lot of difficulty in giving access to this kind of data for a specific purpose, and a specific purpose only (please correct me if I’m wrong). This is a problem that can be (and should be!) solved with time.

Advanced cryptographic techniques allow you (as the data owner) to restrict the function(s) you can compute on the data. In addition to that, they ensure that the only thing the parties on the other end would learn is the result of the function computed. But of course, we’re still a ways away from these techniques being practical, as the field of ML moves at a much higher pace.
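As a toy illustration of the flavor of those techniques (not functional encryption or MPC themselves, which need real cryptographic libraries and vetted protocols), additive secret sharing shows how several data holders can jointly compute one agreed-upon function, an aggregate sum, without any party ever seeing an individual value. The hospital framing and the field modulus choice are invented for the example.

```python
import random

PRIME = 2**61 - 1  # large prime modulus for the share arithmetic (arbitrary choice)

def share(value, n_parties=3):
    """Split a value into n additive shares mod PRIME.
    Any n-1 shares together reveal nothing about the value."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine shares; only meaningful when ALL shares are pooled."""
    return sum(shares) % PRIME

# Three hospitals each hold a private patient count.
counts = [120, 340, 95]
all_shares = [share(c) for c in counts]

# Each computing party receives one share from every hospital and
# sums them locally; only the aggregate total is ever reconstructed.
party_sums = [sum(col) % PRIME for col in zip(*all_shares)]
total = reconstruct(party_sums)
print(total)  # 555: the population-level figure, with no individual count revealed
```

The point of the sketch is the shape of the guarantee: the function being computed (here, a sum) is fixed by the protocol, so a party can't quietly repurpose the raw data for something else.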

candiddevmike|2 years ago

Great, feel free to share your own private health history online for folks to scrape and "innovate" on. I'd like to keep mine private though.

freeere|2 years ago

I would do the same.

Why?

Because I have a chronic issue and without mass data analysis no one will solve my issue.

You can dislike him, but he faces the consequences, and it's not necessarily his job to appeal to everyone.

sdwr|2 years ago

If you believe in the mission, it's a good thing!

I've never understood the privacy boner. Sure, people can abuse information - can exploit or punish based on it.

But there are also so many positive uses of information. Research, understanding, a fuller picture of the world, helping people.

The need for privacy feels antisocial and backwards to me. We're not living in a totalitarian state where ppl get killed for tweeting the wrong thing, so let's not act like it. Part of maturing is accepting others for the good + bad, and you can't do that with a wall up.

Dudester230602|2 years ago

How did he manage to persuade investors to sign papers stating that the companies' mission is to do good and not make profit / "exit"?

true_religion|2 years ago

If you are able to do evil and profit then that is a failure in law and not capitalism.

baq|2 years ago

VC funding selects for messiahs so that’s what you get…

nobodyandproud|2 years ago

Another case where this quote is upheld:

> It is difficult to get a man to understand something, when his salary depends on his not understanding it.