"They're moving the goalposts" is increasingly the autistic shrieking of someone with no serious argument or connection to reality whatsoever.
D-Machine|15 days ago
No one cares what "AGI" or whatever-the-fuck term or internet-argument goalpost you cared about X months ago meant. Everyone cares about what current tech can do NOW, under what conditions, and when it fails catastrophically. That is all that matters.
So, refining the conditions of an LLM win (or loss) is all that matters (not who wins or loses under some particular historical refinement). Complaining that some people see a recent result as a loss (or win) is just completely failing to understand the actual game being played / what really matters here.
_giorgio_|3 days ago
I'm just saying that AI critics like to say they don't like AI, and to prove their point they keep raising their definition of "good enough"; when an AI reaches that objective, they change the definition again.