paulusthe|2 years ago
In the seven years since that call, Suleyman’s wide-eyed mission hasn’t shifted an inch. “The goal has never been anything but how to do good in the world,” he says via Zoom from his office in Palo Alto, where the British entrepreneur now spends most of his time.
Thanks, I hate him already.
A messianic SV hand waver who doesn't care about anything but his special mission, doesn't care about breaking rules, and reflexively gaslights people who complain. As if "Why don't you support the mission bro?" is a reasonable response to "you should protect people's information."
sdenton4|2 years ago
Hell, medical data access problems are bad enough even when we aren't talking about innovation: simple problems in sharing data between different systems/providers lead to bad outcomes all the time.
https://www.techrepublic.com/article/data-quality-in-healthc...
So it's a case where fragmentation and regulation are already leading to bad outcomes for patients, and where innovation is suppressed because of lack of access, especially to population-level data.
Even without AI, imagine being able to identify various kinds of outbreaks by correlating nearby diagnoses in real time, and flashing the local nurses that there's a serious food poisoning outbreak happening, for their consideration when people call in with early symptoms. We should be able to do this easily.
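The kind of correlation described above doesn't need anything fancy. Here's a minimal sketch, with entirely made-up region codes, diagnosis labels, and thresholds, that flags a region when reports of the same diagnosis cluster inside a short window:

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

# Hypothetical thresholds; real systems would tune these per diagnosis.
WINDOW = timedelta(hours=6)
THRESHOLD = 3

# (region, diagnosis_code) -> timestamps of recent reports
recent = defaultdict(deque)

def record_diagnosis(region, code, ts):
    """Record a report; return True if the region crosses the alert threshold."""
    q = recent[(region, code)]
    q.append(ts)
    # Drop reports that have aged out of the window.
    while q and ts - q[0] > WINDOW:
        q.popleft()
    return len(q) >= THRESHOLD

t0 = datetime(2023, 1, 1, 8, 0)
alerts = [record_diagnosis("94301", "food_poisoning", t0 + timedelta(hours=h))
          for h in range(4)]
print(alerts)  # [False, False, True, True]
```

The third report within the window is what trips the alert; everything before that stays quiet, which is the sort of low-false-positive behavior you'd want before paging a nurse.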
We should protect people's information, but we also need to build a road to a better tomorrow. The current rules are, in fact, broken, and we need new rules which lead to better outcomes.
paulusthe|2 years ago
That said, my problem isn't that he broke the rules. My problem is that, when confronted about having broken the rules, he lied about it then retreated into "why don't you believe the mission bro?" As if his solution is the only possible solution to the problem.
He's full of himself, doesn't care about rules, and gaslights those who criticize him. His messianic do-gooder-ism is a bullshit marketing cover for him doing whatever he wants.
Veserv|2 years ago
Just add an irrevocable guarantee that they will never sell, or transfer to someone who will sell, any data. If they do, the company immediately dissolves and becomes encumbered with a debt of the highest seniority, equal to all lifetime company revenues, owed to the people whose medical data they hold. The C-suite and Board of Directors must also provide a personal financial guarantee equal to their entire compensation packages, and must provide sworn testimony yearly that they are engaged in no business deals that include the sale of private medical data.
Since they do not intend to ever use the data for bad purposes, they have nothing to lose by keeping their word. Literally no downside to them since they were not going to do it anyways and it provides peace of mind to the public, a win-win.
BoiledCabbage|2 years ago
I feel like we've built a church (or possibly cult) with mantras of "Innovation at all costs. Liquidity at all costs..." among a few others, and no view whatsoever as to what the implications are.
And I'm seriously starting to think this describes HN and general SV culture. Specifically here on HN, the number of times I've seen a justification end in one of those thought-terminating cliches is legitimately concerning. So much reasoning boils down to "this is good because it improves innovation, and because it improves innovation it is good." Not only zero thought about the implications of taking the suggested action, but what seems like an unawareness that one should even consider the consequences. It's as if we've reached the 'innovation is good' stage of the thought state machine, so the state machine terminates and returns success.
It's absolutely mind-boggling to me that anyone could post a comment saying yes, we should give up medical privacy, without even a single sentence on the negative consequences of doing that. "Why would one need to think about the negative consequences? It has a positive consequence, so clearly we should do it."
Is it a gap in CS education? General education? Is it the personality type of us engineers? Is it nature? Nurture? Both? Is it social? Others don't step in to provide that feedback when it happens? How do we even approach it?
edgyquant|2 years ago
No there isn’t. The rest of your comment can be safely disregarded thanks to you opening with this.
"We need to build a better tomorrow!" We will: the people actually trying to, within accepted norms. Not SV grifters who've destabilized our entire society and ended privacy, all for ad revenue.
choppaface|2 years ago
The argument isn't so much about how to go about technical progress as about whether, and how, to trust a Suleyman, or an SBF, etc. Some will do the hard work, meticulously build both pre- and post-regulation products, diligently deal with stakeholders, and succeed or fail to move the market. Being comfortable saying divisive things on the record is a pretty key lapse in rigor.
soxicywn|2 years ago
Advanced cryptographic techniques allow you (as the data owner) to restrict which functions can be computed on your data. Beyond that, they ensure that the only thing the parties on the other end learn is the result of the computed function. But of course, we're still a ways away from these techniques being practical, as the field of ML moves at a much higher pace.
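To make the idea concrete, here's a toy illustration (not any particular library or protocol) of additive secret sharing, one of the building blocks behind these techniques. Three hypothetical hospitals learn a combined patient count without any one of them revealing its own number:

```python
import random

MOD = 2**61 - 1  # large prime modulus; illustrative choice

def share(value, n_parties):
    """Split a value into n additive shares that sum to it mod MOD.
    Any subset of fewer than n shares looks uniformly random."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

def reconstruct(shares):
    """Recombine shares; only the full set reveals the underlying value."""
    return sum(shares) % MOD

# Each hospital holds a private patient count.
counts = [120, 45, 300]
# Each splits its count into shares and distributes one share per party.
all_shares = [share(c, 3) for c in counts]
# Each party adds the shares it received, locally.
combined = [sum(col) % MOD for col in zip(*all_shares)]
# Reconstructing the combined shares yields only the total.
print(reconstruct(combined))  # 465
```

The point of the comment stands, though: this handles a fixed, agreed-upon function like a sum; arbitrary ML training under these constraints is far more expensive, which is why practice lags the research.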
candiddevmike|2 years ago
freeere|2 years ago
Why?
Because I have a chronic issue and without mass data analysis no one will solve my issue.
You can dislike him, but he faces the consequences, and it's not necessarily his job to appeal to everyone.
sdwr|2 years ago
I've never understood the privacy boner. Sure, people can abuse information, can exploit or punish based on it.
But there are also so many positive uses of information. Research, understanding, a fuller picture of the world, helping people.
The need for privacy feels antisocial and backwards to me. We're not living in a totalitarian state where people get killed for tweeting the wrong thing, so let's not act like it. Part of maturing is accepting others for the good and the bad, and you can't do that with a wall up.