item 45990343

patrickk | 3 months ago

They're not doing it out of the goodness of their heart, they're deploying a classic strategy known as "Commoditize Your Complement"[1], to ward off threats from OpenAI and Anthropic. It's only a happy accident that the little guy benefits in this instance.

Facebook is a deeply scummy company[2] and their stranglehold on online advertising spend (along with Google) allows them to pour enormous funds into side bets like this.

[1] https://gwern.net/complement

[2] https://en.wikipedia.org/wiki/Careless_People


unsungNovelty | 3 months ago

Not even close to OK with Facebook. But none of the other companies do this. And Mark has been open about it; I remember him saying as much very openly in an interview. There's something oddly respectable about NOT sugar-coating it with good PR and marketing. Unlike OpenAI.

arcanemachiner | 3 months ago

Well, when your incentives happen to align with those of a faceless mega-corporation, you gotta take what you can get.

GCUMstlyHarmls | 3 months ago

You don't have to thank them for it, though.

jayd16 | 3 months ago

We can still like it. We're not nominating them for a Nobel Prize or something.

_giorgio_ | 3 months ago

Among the top 10 tech companies (and beyond), they have the most successful open-source program.

These projects come to mind:

SAM (Segment Anything)

PyTorch

Llama

...

Open source datacenters and server blueprints.

The following instead comes from grok.com:

Meta’s open-source hall of fame (Nov 2025)

---------------------

Llama family (2 → 3.3) – 2023–2025 · >500k total stars · powers ~80% of models on Hugging Face. Single-handedly killed the closed frontier model monopoly.

---------------------

PyTorch – 2017 · 85k+ stars · the #1 ML framework in research. TensorFlow is basically dead in academia now.

---------------------

React + React Native – 2013/2015 · 230k + 120k stars. Still the de-facto UI standard for web & mobile.

---------------------

FAISS – 2017 · 32k stars · used literally everywhere (even inside OpenAI). The vector similarity search library.

---------------------

Segment Anything (SAM 1 & 2) – 2023–2024 · 55k stars. Revolutionized image segmentation overnight.

---------------------

Open Compute Project – 2011 · entire open-source datacenter designs (servers, racks, networking, power). Google, Microsoft, Apple, and basically the whole hyperscaler industry build on OCP blueprints.

---------------------

Zstandard (zstd) – 2016 · faster than gzip · now in the Linux kernel, NVIDIA drivers, Cloudflare, etc. The new compression king.

---------------------

Buck2 – 2023 · Rust build system, 3-5× faster than Buck1. Handles Meta's insane monorepo without dying.

---------------------

Prophet – 2017 · 20k stars. Go-to time-series forecasting library for business.

---------------------

Hydra – 2020 · 9k stars. Config management that saved the sanity of ML researchers.

---------------------

Docusaurus – 2017 · 55k stars. Powers docs for React, Jest, Babel, etc.

---------------------

Velox – 2022 · C++ query engine · backbone of next-gen Presto/Trino.

---------------------

Sapling – 2023 · Git replacement that actually works at 10M+ file scale.

---------------------

Meta’s GitHub org is now >3 million stars total — more than Google + Microsoft + Amazon combined.

---------------------

Bottom line: if you’re using modern AI in 2025, there’s a ~90% chance you’re running on something Meta open-sourced for free.
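Side note on FAISS for anyone who hasn't used it: it's a nearest-neighbor index over dense vectors. What it accelerates is essentially the brute-force similarity scan below, a plain-Python sketch of the computation (not the actual faiss API, which does the same thing orders of magnitude faster over millions of vectors):

```python
import math

def cosine_sim(a, b):
    # cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(index, query, k=2):
    # brute-force scan: score every stored vector, return the top-k ids
    scored = sorted(
        ((cosine_sim(vec, query), doc_id) for doc_id, vec in index.items()),
        reverse=True,
    )
    return [doc_id for _, doc_id in scored[:k]]

# toy "embedding" index: doc id -> vector (made-up numbers for illustration)
index = {
    "cats": [0.9, 0.1, 0.0],
    "dogs": [0.8, 0.2, 0.1],
    "stocks": [0.0, 0.1, 0.9],
}

print(search(index, [1.0, 0.0, 0.0]))  # → ['cats', 'dogs']
```

FAISS replaces this linear scan with quantized, clustered, or graph-based indexes so the same query stays fast at billion-vector scale.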