Likewise, same result for me, but I get it. It's a language model. It makes a best guess based on the information it was given, so I trust it on the things I'm familiar with and verify the things I'm not.
I'm really stunned that, it seems, so few people get this. It's not a robot in the sense of Commander Data from Star Trek. It's a recursive software method that just has unfathomable amounts of data on which to base its guess at a plausible sentence.
There is no logic, no reasoning, no nothing. Yet people complain it's 'giving me wrong answers'. Well, it doesn't know what it's giving you either way. It only knows the statistics, the odds that the sentences it creates are similar to what others have said before.
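The "statistics or odds" idea can be sketched with a toy bigram model. This is nothing like a real LLM in scale or architecture (those use neural networks over tokens, not word counts), but it shows the same underlying move: predict the next word purely from how often words followed each other in the training data. The corpus and names here are invented for illustration:

```python
from collections import Counter, defaultdict

# Tiny made-up "training data".
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which: a bigram table.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_odds(word):
    """Probability of each possible next word, given the previous one."""
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# After "the", the model has seen: cat (2x), mat (1x), fish (1x).
print(next_word_odds("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

Nothing in that table "knows" whether "the cat sat" is true; it only knows the frequencies. A real model is vastly more sophisticated, but the output is still a probability distribution over the next token, not a checked fact.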
earth-adventure|2 years ago
tropicalbeach|2 years ago