slowtrek | 11 months ago
How is it that we can theorize that the model would get better with more data, but we can't theorize that the business model would need to get bigger (pay the content creators) to train the model? Shoot first and ask questions later (or rather, BEG later).
whatthedangit | 11 months ago
Allow OpenAI and other AI companies to use all data for training, but require that they pay it forward via royalties on profits beyond some threshold X, where X is set high enough to imply true AGI was reached.
The royalties could go into a fund that would be paid out like social security payments to every American starting when they turn 18. Companies could alternatively request a one-time deferred payment or something along those lines.
It's having your cake and eating it too. It would also help ease some of the tension around job loss.
Sadly, what we'll likely get instead is a bunch of tech leaders stumbling into wild riches, hoarding them, and then having them taken by force after they grow complacent and drunk on power, lacking the understanding of human nature or history to see why they brought it on themselves.
Mountain_Skies | 11 months ago
Another option would be that they couldn't sell access to customers directly, but instead must license it out to various entities at rates set by regulators. Those entities would then compete with each other for end customers. This, of course, might be prone to regulatory capture, as happens with utilities.
slowtrek | 11 months ago
re-thc | 11 months ago
Who is "we"? How do you know? "Never" is a strong word.
> If we hold our principles to OpenAI (pay who you took from), they will go bankrupt.
i.e. their business wasn't feasible to begin with? Sounds fine. What's wrong with them going bankrupt, if it comes to that?
addandsubtract | 11 months ago