milancurcic|2 years ago
Getting a weather or climate model from zero to production grade requires approximately 100 person-years, or $20M (personal experience). Because of the extremely high scientific expertise needed to correctly implement such models, it's more difficult to leverage open-source contributions from a broader community than it is with web or ML frameworks. So most of the development in these projects is done by full-time hires, and to a lesser extent by contributions from early adopters.
The key technical arguments that I hear/read for such transition projects are ease of portability to accelerators (e.g. GPUs), and higher availability of programmers in the workforce.
My intuition is that a $20M budget, if carefully allocated to compiler and hardware accelerator teams, could solve running Fortran on any accelerator in its entirety.
With Fortran's tooling and compiler support for offloading standard Fortran to accelerators slowly but surely improving over time, the rationale for porting to other languages becomes increasingly questionable and is being revisited.
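To make "offloading standard Fortran" concrete, here is a minimal sketch: a `do concurrent` loop, which is plain Fortran 2008+ with no vendor directives, and which compilers such as nvfortran (via `-stdpar=gpu`) can map to a GPU. The compiler flags mentioned in the comments are illustrative; whether a given loop actually runs on the accelerator depends on the compiler and build options.

```fortran
! Standard Fortran, no OpenACC/OpenMP directives needed.
! e.g. nvfortran -stdpar=gpu saxpy.f90 may offload this loop to a GPU;
! built with any conforming compiler it runs correctly on the CPU.
program saxpy_dc
  implicit none
  integer, parameter :: n = 1000000
  real :: x(n), y(n), a
  integer :: i

  a = 2.0
  x = 1.0
  y = 0.0

  ! The iterations are declared independent, so the compiler is free
  ! to parallelize or offload them.
  do concurrent (i = 1:n)
     y(i) = y(i) + a * x(i)
  end do

  print *, y(1)
end program saxpy_dc
```

The same source compiles unchanged for CPU or GPU targets, which is the crux of the "why port at all?" argument above.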
But regardless of technical arguments, it's a good thing for large models and frameworks to be written and explored in diverse programming languages.
wiz21c|2 years ago
Also, I understand the only Fortran compiler that supports GPUs is the one from NVIDIA, which is proprietary. I'd prefer to rely on open source for a code base that will last at least ten years...
But reading this HN thread, I gather that Fortran is more alive than I thought. How many new developments are done in Fortran? To me, Fortran is a bit like COBOL: it is so entrenched that it obviously still has a lot of activity, but the momentum is moving towards more modern languages... But, well, those are all guesses and impressions...
counters|2 years ago
[1]: https://arxiv.org/abs/2311.07222