Gluber | 3 years ago
I agree with your sentiment. But those things exist (not that that validates the author's argument), and I still shake in terror remembering when, during covid, I was asked to take a look at a virus-spread simulation (a cellular automaton) written by a software engineering professor and his postdoc team at a large university. It modeled every cell in a 100k x 100k grid as a class instance and used virtual methods for every computation between cells. I rewrote it in CUDA with plain buffers/arrays, and an epoch ran in milliseconds instead of hours.
phtrivier | 3 years ago
Then again, at some point we had "Lisp machines"; maybe some day there will be a computer architecture whose memory and computation patterns are adapted to massive simulation, rather than shoehorning simulations onto existing architectures.
And those will fail just as miserably as Lisp machines.