
akavi | 2 months ago

You are aware that insofar as AI chat apps are "hallucinatory text generator(s)", then so is Google Translate, right?

(while AFAICT Google hasn't explicitly said so, it's almost certainly also powered by an autoregressive transformer model, just like ChatGPT)


swiftcoder | 2 months ago

> it's almost certainly also powered by an autoregressive transformer model, just like ChatGPT

The objective of that model, however, is quite different to that of an LLM.
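The distinction can be sketched in a few lines. This is purely illustrative (neither Google's nor OpenAI's actual code): a seq2seq translation model is trained on aligned sentence pairs to model p(target | source), while a general LLM is trained on a single token stream to model p(next token | prefix).

```python
def translation_examples(source_tokens, target_tokens):
    """Seq2seq-style objective: predict each target token given the FULL
    source sentence plus the target prefix generated so far."""
    examples = []
    for i in range(len(target_tokens)):
        context = (tuple(source_tokens), tuple(target_tokens[:i]))
        examples.append((context, target_tokens[i]))
    return examples

def lm_examples(tokens):
    """Plain language-model objective: predict each token from the prefix
    of one undifferentiated stream, with no source/target distinction."""
    return [(tuple(tokens[:i]), tokens[i]) for i in range(1, len(tokens))]

src = ["le", "chat", "dort"]
tgt = ["the", "cat", "sleeps"]

# Every translation example conditions on the whole source sentence:
print(translation_examples(src, tgt)[0])
# The LM just sees one long stream and predicts what comes next:
print(lm_examples(src + tgt)[0])
```

Same architecture family, but the conditioning is different: the translation model is always anchored to a source sentence, which is a much tighter constraint than open-ended next-token prediction.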

parliament32 | 2 months ago

I have seen Google Translate hallucinate exactly zero times over thousands of queries over the years. Meanwhile, LLMs emit garbage roughly 1/3 of the time, in my experience. Can you provide an example of Translate hallucinating something?

lazide | 2 months ago

Agreed, and I use G translate daily to handle living in a country where 95% of the population doesn’t speak any language I do.

It occasionally messes up, but not by hallucinating: usually it's grammar salad because what I put into it was somewhat ambiguous. It's also terrible with genders in Romance languages, but then those are a nightmare for humans too.

Palmada palmada bot.

Teever | 2 months ago

Every single time it mistranslates something, that is a hallucination.

fouc | 2 months ago

Google Translate hasn't moved to LLM-style translation yet, unfortunately.