I mentioned a potential OpenAI insider in https://x.com/peterjliu/status/2024901585806225723; that was from five minutes of investigation. There are probably more. And then there are a lot of other companies.
Manifold actually explicitly encourages insider trading, arguing that it leads to more accurate pricing. This was possibly defensible back when it was a cute funtime project run by a Bay Area polycule, but it’s probably going to get them in deep shit sooner or later, even though they don’t even use real-money betting.
A fun aside: this person obviously created a bunch of new Bitcoin accounts to hide their activity.
It makes you think that if you were able to surreptitiously add malicious side-channel software to a popular npm package, you wouldn't just need to hunt for crypto wallets with balances.
You could also probably find a market for crypto wallets with small or zero balances. The history and creation date would be the value to some buyers.
This OpenAI employee should have gone on the dark web to buy older addresses to cloak their activity.
It's sad to say that almost all crypto use cases point to fraud. I'm excited about crypto, and there is some fascinating research around anonymous transactions (like Zcash). But that real utility is always overshadowed by the actions of charlatans, or worse.
You can't "change the password" on a wallet, so a "used" wallet is highly unattractive: anything you put in it could be taken by the original keyholder who sold it to you.
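The point about used wallets can be made concrete with a toy model. This is a deliberate simplification (real Bitcoin derives an ECDSA public key and then hashes it; the `toy_address` function below is a hypothetical stand-in), but the consequence it illustrates is real: the address is a pure function of the key material, and there is no operation that keeps the address while revoking the original key.

```python
import hashlib
import secrets

def toy_address(private_key: bytes) -> str:
    """Toy model: the address is a pure function of the key.
    (Real Bitcoin derives a public key via ECDSA first, then
    hashes it, but the upshot is the same: keys can't be rotated.)"""
    return hashlib.sha256(private_key).hexdigest()[:40]

# The original keyholder generates a wallet...
original_key = secrets.token_bytes(32)
address = toy_address(original_key)

# ...and "sells" it. The buyer receives the same key, not a new one:
# whoever kept a copy of original_key can still spend from address.
buyers_view = toy_address(original_key)
assert buyers_view == address  # same address, same spending power
```

Hence the comment above: a second-hand wallet never stops trusting its first owner.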
This is just one way information goes from being private to being public. It is sensible that people who provide intelligence to the market be compensated, whether they're better at inferring/predicting or whether they just know something we don't.
Obviously, in a case like this, an individual would be violating the terms of their employment/non-disclosure agreement. I agree that is bad!
I don't think that damns the concept of "predicting known information".
77 suspicious positions across 60 wallets, with 13 brand-new accounts appearing 40 hours before the browser launch. This is the first confirmed case of a major tech company firing over prediction-market trades.
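The pattern in those numbers (fresh accounts appearing shortly before the event and immediately taking positions) suggests an obvious detection heuristic. The sketch below is hypothetical: the timestamps, wallet IDs, and thresholds are invented for illustration, not taken from the actual investigation.

```python
from datetime import datetime, timedelta

# Hypothetical data: (wallet_id, wallet_created_at, bet_placed_at)
launch = datetime(2025, 10, 21, 17, 0)
positions = [
    ("w1", launch - timedelta(hours=40), launch - timedelta(hours=39)),
    ("w2", launch - timedelta(days=400), launch - timedelta(hours=2)),
    ("w3", launch - timedelta(hours=12), launch - timedelta(hours=11)),
]

# Flag wallets created within a short window before the event that
# take a position almost immediately after creation.
WINDOW = timedelta(hours=48)
flagged = [
    w for w, created, placed in positions
    if launch - created <= WINDOW and placed - created <= timedelta(hours=6)
]
print(flagged)  # → ['w1', 'w3']; w2 is an old wallet and passes
```

Real chain analysis layers many more heuristics on top (funding sources, fee patterns, address clustering), but even this crude filter shows why brand-new accounts betting right before a launch stand out.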
It's interesting that both replies under this comment are saying the exact same thing, with the exact same term ("raison d'être"... how often do you hear two random people come up with that phrase at the same time?).
It might be nothing, but it'd be funny if karma farming bots are doing some 'reply frontrunning' over the internet.
If you see prediction markets as how they were originally pitched (price ~approximating likelihood), then insider trading is good. It provides discovery.
If you look at what prediction markets are today (gambling, especially on sports, especially in states that have banned it), then insider trading is bad. Particularly when the people trading can influence the outcome (e.g. a pitcher purposefully throwing into the dirt.)
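The "price ≈ probability" framing above can be made explicit with the expected value of a binary contract. A minimal sketch, assuming the standard Polymarket-style contract that pays $1 if the event resolves YES:

```python
def ev_per_share(price: float, p_yes: float) -> float:
    """Expected profit on one YES share of a binary contract
    paying $1 on YES, bought at `price`, given belief `p_yes`."""
    return p_yes * (1.0 - price) - (1.0 - p_yes) * price

# For a trader whose belief matches the market price, the trade is
# zero-EV: the price already *is* the consensus probability.
assert abs(ev_per_share(price=0.30, p_yes=0.30)) < 1e-9

# An insider who knows the event will happen (p_yes = 1) captures the
# full gap between price and certainty, and their buying pushes the
# price toward 1 -- which is the "discovery" argument in a nutshell.
print(ev_per_share(price=0.30, p_yes=1.0))  # 0.7 per share
```

The same arithmetic is why the gambling framing cuts the other way: every cent of that insider edge comes out of the counterparties' pockets.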
I do hope corporations in general take a harder stance on this. From a society perspective people with inside knowledge fleecing randoms is not a win. We've got that somewhat under control on the stock exchange, but have this absurd situation where on prediction markets it is a free for all and everyone pretends this is fine.
I also think corporations should distance themselves from individuals willing to fleece randoms. Trading in general is a very wild-west, survival-of-the-fittest affair, but active exploitation of insider knowledge speaks of very poor moral character.
Honestly it seems stupid but fine to me. Like if someone random comes up to me on the sidewalk and says, "Hey, if OpenAI announces a browser tomorrow, you give me $100. If not, I'll give you $1,000." Obviously I'm not going to take them up on it; they clearly have inside information.
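The sidewalk bet above carries an implied probability, which is exactly why it smells like inside information. A quick worked version, using the $100/$1,000 terms from the comment:

```python
# Terms of the sidewalk bet: I pay $100 if OpenAI announces a
# browser tomorrow; I receive $1,000 if it doesn't.
stake_if_announce = 100
payout_if_not = 1000

# The stranger offering the bet only profits when the announcement
# probability p satisfies
#     p * stake_if_announce > (1 - p) * payout_if_not,
# i.e. p > payout / (payout + stake):
breakeven = payout_if_not / (payout_if_not + stake_if_announce)
print(round(breakeven, 3))  # → 0.909
```

Offering those terms is only rational if they think the announcement is better than ~91% likely, which is precisely the information the bet leaks.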
If you're betting on a prediction market without insider information then you're just... The fool who is soon parted from his money one way or another.
I generally feel like people should be free to do whatever insane stuff they want with their own lives.
Who would think that? At every corporation where I've worked it's been explicit in both the contract and in HR training that this is explicitly not allowed.
> The employee, she said, “used confidential OpenAI information in connection with external prediction markets (e.g. Polymarket).”
Note that “insider trading” is not illegal on prediction markets. The particular issue here is that the employee “disclosed” confidential information on a public forum by influencing the prices assigned to certain outcomes by prediction markets.
I don't think this is true, though enforcement is another thing and the standard is different than in securities markets. Prediction markets are regulated by the CFTC and the insider trading standard is “misappropriation of confidential information in breach of a pre-existing duty of trust and confidence to the source of the information” (vs any “material non-public information” for securities) https://www.cftc.gov/PressRoom/SpeechesTestimony/phamstateme...
I find it absurd that someone can create an unregulated market like Kalshi and then all of us have to be beholden to it, even though the idea is stupid. How is it possible that someone can create a product none of us agreed to, and now everyone else has to conform to rules around it because of the problems it creates? I would rather Kalshi get shut down than set the precedent of letting it control employees or anyone else.
peterjliu|1 day ago
helsinkiandrew|1 day ago
https://news.kalshi.com/p/kalshi-trading-violation-enforceme...
https://x.com/polymarketmoney/status/2001056273500954784?s=4...
Analemma_|1 day ago
xrd|1 day ago
ruined|1 day ago
0x3f|1 day ago
MarceliusK|1 day ago
dontknowbtc|1 day ago
tell me you don’t understand crypto without telling me you don’t understand crypto.
ddp26|1 day ago
"Predicting" private, known information is the wrong use case.
hephaes7us|23 hours ago
7777777phil|1 day ago
I wrote about why prediction markets have a structural insider trading problem that nobody's solved yet: https://philippdubach.com/posts/the-absolute-insider-mess-of...
raincole|1 day ago
ohyoutravel|1 day ago
gtowey|1 day ago
throwaway5752|1 day ago
seydor|1 day ago
morkalork|1 day ago
bookofjoe|1 day ago
mrkramer|1 day ago
rapind|1 day ago
shevy-java|1 day ago
tabs_or_spaces|17 hours ago
I would understand a low-salaried person doing this, but not someone at a really high-paying org.
anonnon|17 hours ago
Given what OpenAI does, what kind of person, and with what moral character, do you think works there?
EDIT: AI in general seems to attract bad actors, for whatever reason. Remember Anthony Levandowski or Marvin Minsky?
blitzar|14 hours ago
cjonas|1 day ago
crazygringo|1 day ago
https://www.economist.com/leaders/2026/02/18/why-insider-tra...
> In prediction markets, informed trading is not a crime or an injustice—it is a valuable service.
A big exception, however, is using prediction markets to make predictions on events regarding publicly traded companies.
MarceliusK|1 day ago
tyre|1 day ago
idiotsecant|1 day ago
rohitpaulk|1 day ago
(a) How did they identify the employee, and (b) why weren't they sent to jail?
Havoc|1 day ago
strix_varius|1 day ago
MarceliusK|1 day ago
chazftw|1 day ago
0xTJ|1 day ago
dgellow|1 day ago
re-thc|1 day ago
It's called <open>AI.
senkora|1 day ago
agency|1 day ago
qoez|1 day ago
blindriver|1 day ago
MengerSponge|1 day ago
Like, a $100k wager from a finance dude carries some information, but a $10k wager from a staffer says a lot more!
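That intuition is a likelihood-ratio argument: a staffer's bet is rare unless they actually know something, so it moves the posterior more than a bigger bet from a noise trader. A minimal Bayesian sketch (the prior and the likelihood ratios below are invented numbers, chosen only to illustrate the direction of the effect):

```python
def posterior(prior: float, likelihood_ratio: float) -> float:
    """Update P(event) given evidence whose likelihood ratio is
    P(evidence | event) / P(evidence | no event)."""
    odds = prior / (1.0 - prior)
    odds *= likelihood_ratio
    return odds / (1.0 + odds)

prior = 0.5
# Hypothetical ratios: finance traders bet on noise all the time,
# while a staffer betting on their own employer is rare without
# inside knowledge.
print(round(posterior(prior, 2), 3))   # $100k from a finance trader
print(round(posterior(prior, 20), 3))  # $10k from a staffer
```

With these (made-up) ratios, the finance trader's wager moves a 50% prior to about 67%, while the staffer's much smaller wager moves it to about 95%: the size of the bet matters far less than who placed it.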
croes|1 day ago
djohnston|1 day ago