asb|1 year ago
We’re renewing our commitment to using the Apache 2.0 license for our general-purpose models, as we progressively move away from MRL-licensed models.
diggan|1 year ago
It's a bit like developing a binary application and slapping a FOSS license on the binary while keeping the code proprietary. Not saying that's wrong or anything, but people reading these announcements tend to misunderstand what actually got FOSS licensed when the companies write stuff like this.
crawshaw|1 year ago
To consider just the power of fine-tuning: all of the press DeepSeek has received is over their R1 model, a relatively tiny fine-tune of their open-source V3 model. The vast majority of the compute and data-pipeline work to build R1 was completed in V3, while that final fine-tuning step from V3 to R1 is within reach of even an enthusiastic, dedicated individual. (And there are many interesting ways of doing it.)
The insistence, every time open-sourced model weights come up, that they are not "truly" open source is tiring. There is enormous value in open-source weights compared to closed APIs. Let us call them open-source weights. What you want can be called "open-source data" or some such.
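To make the "fine-tuning is cheap relative to pretraining" point concrete, here is a minimal NumPy sketch of low-rank adaptation (LoRA-style fine-tuning, one of the common techniques for cheaply adapting open weights). The dimensions are toy values, not DeepSeek's actual architecture: the large pretrained weight `W` stays frozen, and only two small matrices `A` and `B` are trained, so the trainable fraction is tiny.

```python
import numpy as np

# Toy dimensions (assumptions for illustration, not any real model's shapes).
d_out, d_in, rank = 4096, 4096, 8

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))        # frozen pretrained weight
A = rng.standard_normal((rank, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, rank))                   # trainable up-projection (zero at init)

def adapted_forward(x):
    # Base model output plus the low-rank correction learned during fine-tuning.
    return W @ x + B @ (A @ x)

# Since B starts at zero, the adapter is a no-op before any training:
x = rng.standard_normal(d_in)
assert np.allclose(adapted_forward(x), W @ x)

trainable = A.size + B.size   # 2 * rank * 4096 parameters
frozen = W.size               # 4096 * 4096 parameters
print(f"trainable fraction: {trainable / frozen:.4%}")
```

At these toy sizes, the trainable parameters are 1/256 of the frozen ones; the same ratio argument is why open weights plus a modest fine-tuning budget gets you so far.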
eldenring|1 year ago
This kind of purity test mindset doesn't help anyone. They are shipping the most modifiable form of their model.
jacooper|1 year ago
Like every other open source / source available LLM?
dismalaf|1 year ago
No one's going to pay for an inferior closed model...