Referring to this type of optimization program just as "AI", in an age where nearly everyone will misinterpret that to mean "transformer-based language model", seems really sloppy.
Referring to this type of optimization as "AI", in an age where nearly everybody is looking to fund transformer-based language models and nobody is looking to fund this kind of optimization, is just common sense though.
I use "ML" when talking about more traditional/domain specific approaches, since for whatever reason LLMs haven't hijacked that term in the same way. Seems to work well enough to avoid ambiguity.
But I'm not paid by the click, so different incentives.
Thinking "nearly everyone" has that precise definition of AI seems way more sloppy. Most people still haven't even heard of OpenAI or ChatGPT, and among those who have, many have probably also heard stories about AI in science fiction. My definition of AI is any advanced computer processing, generative or otherwise, that's happened since we got enough computing power and RAM to do something with it, i.e. recently.
>Most people haven't even heard of OpenAI and ChatGPT still
What? I literally don't know a single person anymore who doesn't know what ChatGPT is. In this I include several elderly people, a number of older children, and a whole bunch of adults with exactly zero tech-related background. Far from it being known only to some, unless you're living in a place with essentially no internet access to begin with, chances are most people around you at least know about ChatGPT.
For OpenAI, different story, but it's hardly little-known. Let's not grossly understate the basic ability of most people to adapt to technology. This site seems to take that to nearly pathological levels.
This exact kind of sloppy equivocation does seem to be one of the major PR strategies used to justify the massive investment in, and sloppy rollout of, transformer-based language models, now that large swaths of the public have turned against them (probably even more than is actually warranted).
I know, but can we blame the masses for misunderstanding AI when they are deliberately misinformed that transformers are the universe of AI? I think not!
Web 3(.0) always makes me think of the time, around 14 years ago, when Mark Zuckerberg publicly and lightly roasted my roommate for asking for his predictions on Web 4.0 and 5.0.
benterix|7 months ago
For me, when someone says, "I'm working on AI", it's almost meaningless. What are you doing, actually?
benterix|7 months ago
https://github.com/artificial-scientist-lab/GWDetectorZoo/
Nothing remotely LLM-ish, but I'm glad they used the term AI here.
Lionga|7 months ago
Crypto must now be named "cryptography" and AI must now be named "ML" to avoid giving the scammers and hypers good press.
coldtea|7 months ago
The real problem is not people using the term incorrectly, it's papers and marketing material using the term incorrectly.
IanCal|7 months ago
You can have your own definition of words but it makes it harder to communicate.