My process control theory textbook has a chapter on neural networks, and a lot of the language in control theory has an AI-like tinge to it. I think this language is native to control theory, so it might not be as overblown as it first sounds.
Huh, control theorists always try to rigorously prove the stability and performance of their algorithms. AI seems to be the opposite of that: just let the black box solve it and don't worry about any problems; we'll just add more training data if they happen!
I firmly believe control theory folks didn’t invent LLMs only because the idea of doing a big fit on everything sounds too much like a joke they were telling each other.
If you type ‘fuzzy logic’ into Google, the autocomplete suggestion is ‘fuzzy logic rice cooker’. Control theory has been stealing ML terminology for a long time.
Speaking of, what does it actually mean? That the cooker isn’t using a timer?
Do most of them run off weight + time + heat response logic?
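Roughly, yes: a fuzzy controller maps fuzzy categories of the measured inputs (e.g. how fast the pot is heating up) to an output through hand-written rules, instead of a fixed timer. Here's a toy sketch of the idea; the membership functions, categories, and duty values are all made up for illustration, not from any actual rice cooker firmware:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)


def heater_duty(temp_rise_rate):
    """Map temperature rise rate (deg C/min) to a heater duty cycle in [0, 1].

    Intuition: a slow rise suggests a heavy, water-rich load (more power);
    a fast rise suggests the water has boiled off (back off).
    All thresholds here are invented for the example.
    """
    # Fuzzify: degree of membership in each linguistic category.
    slow   = tri(temp_rise_rate, -1.0, 0.0, 2.0)
    normal = tri(temp_rise_rate,  1.0, 3.0, 5.0)
    fast   = tri(temp_rise_rate,  4.0, 6.0, 9.0)

    # Rules: slow -> high duty (0.9), normal -> 0.5, fast -> low duty (0.1).
    # Combine with a weighted average (centroid-style defuzzification).
    num = slow * 0.9 + normal * 0.5 + fast * 0.1
    den = slow + normal + fast
    return num / den if den else 0.5
```

The point is that the output varies smoothly as the input slides between categories, which is why "fuzzy" cookers can adapt somewhat to load size without an explicit weight sensor.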