
nerevarthelame | 3 months ago

Generative text to speech models can hallucinate and produce words that are not in the original text. It's not always consequential, but a court setting is absolutely the sort of place where those subtle differences could be impactful.

Lawyers dealing with gen-AI TTS rulings should compare what was spoken against what was in the written order to make sure there aren't any meaningful discrepancies.
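That comparison could be partly automated: run the audio through a speech-to-text model, then diff the transcript against the written order. A minimal sketch, assuming you already have both texts as strings (the `discrepancies` helper and the sample order text are hypothetical):

```python
# Hypothetical sketch: flag word-level discrepancies between a written
# order and a transcript of the spoken audio. Assumes you already have
# both as plain-text strings (e.g. the transcript from an ASR model).
import difflib
import re

def normalize(text: str) -> list[str]:
    """Lowercase and strip punctuation so cosmetic differences don't flag."""
    return re.findall(r"[a-z0-9']+", text.lower())

def discrepancies(written: str, spoken: str) -> list[tuple[str, str, str]]:
    """Return (op, written_span, spoken_span) for each non-matching span."""
    a, b = normalize(written), normalize(spoken)
    diffs = []
    for op, i1, i2, j1, j2 in difflib.SequenceMatcher(None, a, b).get_opcodes():
        if op != "equal":
            diffs.append((op, " ".join(a[i1:i2]), " ".join(b[j1:j2])))
    return diffs

order = "The motion is granted in part and denied in part."
transcript = "The motion is granted in part and denied in court."
for op, w, s in discrepancies(order, transcript):
    print(f"{op}: written={w!r} spoken={s!r}")
# → replace: written='part' spoken='court'
```

This only catches textual mismatches; it won't flag hallucinated emphasis or tone, and the ASR step can introduce its own errors, so a flagged span still needs a human to listen to the audio.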


csallen | 3 months ago

People can also make mistakes while reading, and I suspect we do so at least as frequently as gen-AI text-to-speech algorithms, if not more.

It's the AI thinking that makes me wary, not AI text-to-speech.