firebirdn99|2 years ago
The fallout showed that a non-profit mission can't coexist with for-profit incentives. The pressure that investors were exerting, along with employees (who also stood to benefit from the ~$70B round that was reportedly planned), was too much.
And any disclaimer investors received when investing in OpenAI was meaningless. It reportedly stated that they would be wise to view their investment as charity, and that they could potentially lose everything. There was also an AGI clause reserving the right to reconsider all financial arrangements, including the ones Microsoft and other investors had when investing in the company; in the end it was all worthless. Link to a Wired article with interesting details: https://www.wired.com/story/what-openai-really-wants/
marcus0x62|2 years ago
They need employees to advance their stated mission.
> One of the comments a member made was, if the company was destroyed, it would still be consistent with serving the mission. Which is right.
I mean, that's a nice sound bite and everything, but the only scenario where blowing up the company is consistent with their mission is one where OpenAI itself achieves a breakthrough in AGI and the board concludes that system cannot be made safe. Otherwise, to remain relevant in guiding research toward AGI, they need to stay a going concern, and that means not running off 90% of the employee base.
firebirdn99|2 years ago
That's presumably why they agreed to find a solution. But at the same time it shows that, in essence, entities with for-profit incentives find a way to get what they want. There certainly needs to be more thought and discussion about governance, and about how we govern AI, whether collectively as a species or within each company individually.
khazhoux|2 years ago
I know you're quoting the (now-gone) board member, but this is a ridiculous take. By this standard, Google should have dissolved in 2000 ("Congrats everyone, we didn't be evil!"). Doctors would go away too ("Primum non nocere -- you're all dismissed!").
sanderjd|2 years ago
The silver lining is that this should clear the path to proper regulation, as it's now clear that this self-regulation approach was given a go, and just didn't work.