top | item 45390302

samfriedman | 5 months ago

Maybe they've been working on it, but got scooped?

zamadatix | 5 months ago

I don't think that's the case. The numbers in the paper suggest ~92% of the training data comes from pre-existing AI models, including AlphaFold, and the authors state things like:

> We largely adopt the data pipeline implemented in Boltz-1 (https://github.com/jwohlwend/boltz; Wohlwend et al., 2024), which is an open-source replication of AlphaFold3

I believe the story here is largely that they simplified the architecture and scaled it to 3B parameters while maintaining leading results.