LLMs do not lie. That implies agency and intentionality that they do not have.
LLMs are approximately right. That means they're sometimes wrong, which sucks. But they can do things for which no 100% accurate tool exists, and maybe no such tool could exist. So take it or leave it.
th0ma5|3 months ago
Legend2440|3 months ago
fastball|3 months ago
blamestross|3 months ago
It kind of is that clear. It's IP laundering and oligarchic leveraging of communal resources.
satvikpendem|3 months ago
2. Open source models exist.