Regulating open-source AI will only help giants like Google and OpenAI, stifling innovation. It creates barriers only they can afford, limiting competition and diversity. Open source fosters transparency and rapid progress. We don't need government regulation, or we'll end up like Europe while China takes the lead.
Better question: Is it possible to restrict open source AI?
If it is open then everyone has access. Any restrictions would be akin to demanding that Linux not use strong encryption or that Firefox implement content censorship.
It's possible to restrict DIY building of pretty much anything if your end goal is to stop people from doing something outside of their basement with it. I can't build my own open-source coal-fired power plant and expect to sell power without the EPA coming to kill me. The same would apply if I used an open-source AI that violated some new consumer protection / anti-fraud law and chose to use it over the public internet or build it into a product. Hell, you could probably go after the devs as accessories if you really wanted to.
The license really does nothing to protect your project from regulation; it's just that the government doesn't care about open source yet.
It hasn't proven possible yet, but who knows what will be developed in the future?
A lot of open source efforts depend on big companies training million-dollar models and giving them away. These companies will often apply some censoring adjustment to the weights, which the open source community then undoes through fine-tuning.
But perhaps in the future new methods of censorship will be developed, which are radically harder to undo?
And of course, there's always heavy-handed options available - if we can require hairdressers to hold professional licenses, we could require the same of anyone who wants to upload to huggingface or civitai.
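The undo-via-fine-tuning idea above can be sketched with a deliberately tiny stand-in model (pure Python, no ML libraries; every name and number here is made up for illustration, not anyone's actual method): a vendor ships weights with a "censoring" bias applied, and a few gradient steps on user-supplied examples largely restore the original behavior.

```python
import math

# Toy illustration, NOT a real LLM: a one-weight logistic "model".
# The hypothetical vendor nudges the bias so the model always "refuses";
# fine-tuning on a handful of desired examples undoes that adjustment.

def predict(w, b, x):
    """Probability that the model 'answers' rather than 'refuses'."""
    return 1 / (1 + math.exp(-(w * x + b)))

def fine_tune(w, b, data, lr=0.5, steps=200):
    """Plain gradient descent on log-loss over (input, label) pairs."""
    for _ in range(steps):
        for x, y in data:
            p = predict(w, b, x)
            # gradient of log-loss with respect to w and b
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

base_w, base_b = 2.0, 0.0              # hypothetical released weights
censored_b = base_b - 10.0             # "censoring adjustment": always refuse
data = [(1.0, 1), (2.0, 1), (0.5, 1)]  # examples of the desired behavior

w, b = fine_tune(base_w, censored_b, data)
print(predict(w, censored_b := -10.0, 1.0) < 0.5)  # censored model refuses: True
print(predict(w, b, 1.0) > 0.5)                    # fine-tuned model answers: True
```

The point of the toy is only that the adjustment lives in the same parameter space the community can keep optimizing in, which is why such adjustments have so far been easy to reverse.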
Why? All three of those companies have an obvious and outspoken commitment to releasing models for free, they aren't trying to manipulate fear to sell a product like OpenAI.
fascinating how successful the bullshit rebranding of "sometimes we upload a TB file of weights" as "open source" has been for them politically. I can't really imagine they thought it would go this well. It will be interesting to see how this unexpected loophole changes their strategy, particularly for """OpenAI""", who didn't think to give themselves enough cover.
Say the government actually wanted to regulate open source AI, how exactly would they do that?
If the govt wants to regulate commercial AI, it can go to the companies responsible for building that AI and say "do X, Y and Z or else we will punish you with A, B or C". But what do you say to a collective of developers scattered all around the globe? What stops that group from just saying "F* You" and continuing on?
Good. Effective Altruism is trying to destroy democratic institutions, and it's likely a bigger threat to society than fascism. The pro-progress people need to organize themselves to stop types like SBF from giving AI to China.
burkaman|1 year ago
Unfortunately it's past the deadline for you to add your own comments on this, but I'm sure there will be future RFCs if you have thoughts.