The last time I did image processing in C, I quantized the colorspace using a technique out of a paper from 1982. Just because a source is old doesn't mean it's wrong.
It doesn't mean it's wrong; partial derivatives are centuries old, after all. But we are moving on. Just recently I saw a book on image recognition. While recent, it focused on the math, with not a word about neural networks. It looked so outdated. Someone spent a whole career on that approach. Another example would be the linguistic approach to text translation: a 'stupid' LLM does it far better today. I can only guess how professional linguists feel about it.
Yes, those methods are old, but they’re explainable and much easier to debug or improve compared to the black-box nature of neural networks. They’re still useful in many cases.
Only partially. The chapters on edge detection, for example, have only historical value at this point. A tiny NN can learn edges much better (which was basically AlexNet's claim to fame).
I see it the same way I see 'Applied Cryptography'. It’s old C code, but it helps you understand how things work under the hood far better than a modern black box ever could. And in the end, you become better at cryptography than you would by only reading modern, abstracted code.
jandrese|11 months ago
numba888|11 months ago
vincenthwt|11 months ago
earthnail|11 months ago
rahen|11 months ago
unknown|11 months ago