papyrus9244 | 3 months ago
Or, in other words, a Markov Chain won't hallucinate. Having a system that only repeats sentences from its source material and doesn't create anything new on its own is quite useful in some scenarios.
Sohcahtoa82 | 3 months ago
It very much can. Remember, the context windows used for Markov Chains are typically very short, often a single-digit number of words. If you use a context length of 5, then when picking the next word, the chain has no idea what came before the current window of 5 words. This results in incoherence, which can certainly mean hallucinations.
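The short-context behavior described above can be sketched with a minimal word-level chain. This is an illustrative toy (corpus, function names, and order-2 context are all made up for the example): every individual transition comes from the source text, yet the chain can stitch them into sentences that never appeared in it.

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each tuple of `order` consecutive words to the words observed after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, start, length=10):
    """Walk the chain from `start`, only ever seeing the last `order` words."""
    out = list(start)
    for _ in range(length):
        key = tuple(out[-len(start):])
        if key not in chain:
            break
        out.append(random.choice(chain[key]))
    return " ".join(out)

corpus = ("the cat sat on the mat . "
          "the dog sat on the log . "
          "the cat chased the dog .")
chain = build_chain(corpus, order=2)

# After "on the", the chain may pick "mat" or "log" regardless of whether the
# sentence started with "cat" or "dog" -- it has already forgotten the subject.
print(generate(chain, ("the", "cat")))
```

With only two words of context, "the cat sat on the log" is a perfectly possible output even though that sentence is not in the corpus, which is exactly the kind of plausible-but-unsourced output the comment calls a hallucination.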