There's an old saying: "Yesterday's AI is today's algorithm." Few would consider A* search for route planning or alpha-beta pruning for game playing to be "capital-A AI" today, but they absolutely were at their inception. Heck, the various modern elaborations on A* are still mostly published in AI venues like AAAI.
https://en.wikipedia.org/wiki/AI_effect We've already got a name for it; it just needs to propagate until there's no value left in calling things 'AI'.
This is a fair point, and maybe someone more well versed can correct me, but pretty much all state-of-the-art image recognition is done with trained neural networks nowadays, right? A* is still something a human can reasonably code, so it seems to me there's a legitimate distinction between these two kinds of things today.
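To illustrate the "a human can reasonably code it" point above, here is a minimal A* sketch in Python. The grid, cost function, and Manhattan-distance heuristic are illustrative assumptions, not from the thread; it's just the textbook best-first search with an admissible heuristic, compressed into a few lines.

```python
import heapq

def a_star(start, goal, neighbors, heuristic):
    """Minimal A*: returns a cheapest path from start to goal, or None.

    neighbors(node) -> iterable of (next_node, step_cost)
    heuristic(node) -> admissible estimate of remaining cost to goal
    """
    # Frontier entries: (f = g + h, g so far, node, path taken)
    frontier = [(heuristic(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for nxt, cost in neighbors(node):
            ng = g + cost
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(frontier, (ng + heuristic(nxt), ng, nxt, path + [nxt]))
    return None

# Toy example: 3x3 grid, unit step costs, Manhattan heuristic.
def grid_neighbors(pos):
    x, y = pos
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < 3 and 0 <= ny < 3:
            yield (nx, ny), 1

path = a_star((0, 0), (2, 2), grid_neighbors,
              lambda p: abs(p[0] - 2) + abs(p[1] - 2))
```

Roughly thirty lines, no training data, and fully inspectable, which is the distinction the comment is drawing against a trained image-recognition network.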
Maybe it's easier to define what isn't AI? Toshiba's handwritten postal code recognizers from the 1970s? Fuzzy logic in washing machines that adjusts the pre-programmed cycle based on laundry weight and dirtiness?
Historically, we tend to call something AI while we don't yet understand how it works. Once we do, it quietly gets subsumed into machine learning or another area and becomes "the X algorithm."
An example of something similar that a computer can do but that isn't AI would be arithmetic.