This is a valid point, but we are still in the early stages of AI/LLMs, so one would expect speed and efficiency (and perhaps accuracy too) to improve drastically over the coming years.
At least AI and LLMs have large-scale practical applications, as opposed to crypto (IMO).
It's also interesting to think that IBM released an 8-trillion-parameter model back in the 1980s [0]. Granted, it was an n-gram model, so it's not exactly an apples-to-apples comparison with today's models, but still, quite crazy to think about.
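For anyone unfamiliar with why n-gram "parameter" counts balloon like that: each distinct (context, next-word) pair observed in the corpus is effectively one parameter. A minimal trigram sketch in Python (toy corpus, no smoothing, nothing like IBM's actual setup):

    from collections import defaultdict, Counter

    # Minimal trigram language model: the "parameters" are just
    # conditional counts over observed word triples, which is why
    # n-gram models rack up enormous parameter counts on large
    # corpora without learning any representations.
    def train_trigram(tokens):
        counts = defaultdict(Counter)
        for a, b, c in zip(tokens, tokens[1:], tokens[2:]):
            counts[(a, b)][c] += 1
        return counts

    def prob(counts, a, b, c):
        ctx = counts.get((a, b))
        return ctx[c] / sum(ctx.values()) if ctx else 0.0

    corpus = "the cat sat on the mat the cat ate".split()
    model = train_trigram(corpus)
    # One "parameter" per distinct (context, word) pair:
    num_params = sum(len(ctx) for ctx in model.values())
    print(num_params, prob(model, "the", "cat", "sat"))  # 7 0.5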
I wouldn't call the early McCulloch & Pitts work quite "full-fledged". Also, backpropagation, essential for training multilayer perceptrons, wasn't a thing until the 1980s.
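To illustrate the point, here's a minimal sketch of backpropagation on a two-layer perceptron (hand-rolled NumPy on the XOR toy task; the architecture and hyperparameters are arbitrary, not any historical implementation). The key step is propagating the output error back through the second weight matrix so the hidden layer's weights get a gradient at all:

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
    W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(10000):
        h = sigmoid(X @ W1 + b1)             # hidden activations
        out = sigmoid(h @ W2 + b2)           # network output
        d_out = (out - y) * out * (1 - out)  # output-layer error signal
        d_h = (d_out @ W2.T) * h * (1 - h)   # error pushed back through W2
        W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(axis=0)
        W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)

    print(out.round(2))  # should approach [[0], [1], [1], [0]]

Without that `d_h` step there's no principled way to assign credit to the hidden weights, which is exactly what kept single-layer perceptrons stuck before the 1980s.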
AlchemistCamp|1 year ago
IshanMi|1 year ago
[0]: https://aclanthology.org/J92-4003.pdf
varjag|1 year ago