sudosays | 1 year ago
If Meta open-sources its models/tools and they gain wide adoption, ways will be found to run the models more efficiently, or infrastructure/research built on top of Meta's work will ultimately save them a lot in future costs. Release the model that cost $10bn to make now, and save yourself billions when others build the tooling to run it at 1/10th the cost.
blueboo | 1 year ago
Meanwhile Meta’s competitors commoditise and glean profits from actually-SOTA LLM offerings.
In any case, their hypothesis is testable: which open source innovations from Llama1/2 informed Llama3?
sudosays | 1 year ago
I am not sure, but I agree that it is definitely testable.
If I had to guess, I would argue that the open source contributions to PyTorch contributed downstream to the models' performance, and that preparing the models for public release demanded a degree of polish and QA that would otherwise not have been there.
skinner927 | 1 year ago