probably_wrong | 1 month ago
We've been doing Bayesian content (a.k.a. spam) filtering for over 20 years, based in no small part on Paul Graham's essay "A Plan for Spam". According to HP [1], a typical home computer at the time had a single 1.5 GHz core and 256 MB of RAM.
Using LLMs would achieve essentially the same result while requiring a couple of orders of magnitude more resources.
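For context, the core of a Graham-style filter is tiny. Here's a minimal sketch of his combined-probability formula (the token spamminess values below are hypothetical; a real filter learns them from spam and ham corpora):

```python
# Sketch of Graham-style naive Bayes spam scoring (not his exact code).
# Combines per-token spam probabilities p1..pn via:
#   P = (p1*...*pn) / (p1*...*pn + (1-p1)*...*(1-pn))

def combined_spam_probability(token_probs):
    prod = 1.0       # product of p_i
    inv_prod = 1.0   # product of (1 - p_i)
    for p in token_probs:
        prod *= p
        inv_prod *= (1.0 - p)
    return prod / (prod + inv_prod)

# Hypothetical per-token scores for three interesting tokens in a message.
score = combined_spam_probability([0.99, 0.90, 0.20])
```

This runs comfortably on 2002-era hardware; the expensive part is just tokenizing mail and keeping per-token counters.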
[1] https://www.hp.com/us-en/shop/tech-takes/specifications-pers...