I like this paper; it seems to be one of the best in the AI-for-materials literature so far. Even DFT is not really scalable for this: computing the ground state of even a dozen unit cells takes many, many CPU-hours. In fact, the authors themselves relax the proposed structures by minimizing the energy under pseudopotentials, because even DFT is too expensive for that step. As I've said before, I think improving DFT itself is the most potentially impactful application of AI in this space. Of course approximations are always necessary, and I'm not at all against that, but DFT ignores or approximates electron correlations by design, so there is an inherent limitation there: if you train your models to predict DFT outputs, they will inherit that same limitation. It's like training an LLM principally on synthetic data. LLMs have obviously succeeded with limited amounts of synthetic data, but they are principally trained on "real" data.
11101010001100|1 year ago