
Markov Chains Explained

64 points | signa11 | 11 years ago | techeffigytutorials.blogspot.com

15 comments

learnstats2 | 11 years ago
I know what Markov chains are (the probabilities for the next state of the chain depend only on the current state).
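(For concreteness, that property can be sketched in a few lines of Python; the states and probabilities below are made up for illustration:)

```python
import random

# Hypothetical two-state weather chain; all probabilities are illustrative.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Sample the next state using only the current state (the Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights)[0]

def simulate(start, n):
    """Run the chain for n steps and return the visited path."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```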

What I would prefer to know is why they are more useful than a more nuanced model with non-fixed probabilities, or with memory of previous states, given that such models are not really harder to simulate.

It seems to me that Markov Chains are very often used as an inappropriate simplification.

Is there some mathematical advantage to using them?

thomaspaine | 11 years ago
Markov chains have a lot of nice features you can derive from them, like convergence to a stationary distribution. It's hard to say a simplification is inappropriate if it happens to be pragmatic (like most simplifications).

For example, PageRank can be modeled using Markov chains. You can surely develop more complicated models, but Markov chains provide a good starting point.
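(A minimal sketch of the convergence mentioned above: repeatedly applying a row-stochastic transition matrix to a starting distribution approaches the stationary distribution pi satisfying pi = pi P, which is the property PageRank exploits. The matrix values here are made up:)

```python
# Illustrative 2x2 row-stochastic transition matrix (rows sum to 1).
P = [
    [0.9, 0.1],
    [0.5, 0.5],
]

def step_dist(pi, P):
    """One step of the chain on a distribution: pi_new[j] = sum_i pi[i] * P[i][j]."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

pi = [1.0, 0.0]  # start entirely in state 0
for _ in range(100):
    pi = step_dist(pi, P)

# pi now approximates the stationary distribution; for this matrix it is
# (5/6, 1/6), since at equilibrium pi0 * 0.1 = pi1 * 0.5.
print(pi)
```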

dangerlibrary | 11 years ago
Formally, a single Markov chain is defined with a fixed transition matrix.

In practice, many software implementations of Markov models update the transition matrix as new information becomes available. Formally, you're no longer working with the same Markov chain, but that doesn't mean the model isn't useful.
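(One common way to do this, sketched with made-up state names: keep transition counts and re-derive the probabilities whenever a new observation arrives, so the "matrix" is always the current empirical estimate:)

```python
from collections import defaultdict

# counts[prev][curr] = number of observed prev -> curr transitions.
counts = defaultdict(lambda: defaultdict(int))

def observe(prev, curr):
    """Record one observed transition."""
    counts[prev][curr] += 1

def transition_probs(state):
    """Re-derive the current transition probabilities out of `state` from counts."""
    total = sum(counts[state].values())
    return {s: c / total for s, c in counts[state].items()}

for prev, curr in [("a", "b"), ("a", "b"), ("a", "a")]:
    observe(prev, curr)
print(transition_probs("a"))  # b is roughly 2/3, a roughly 1/3
```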

coherentpony | 11 years ago
What you are referring to are non-Markov processes, that is, processes with memory. These can be useful too, but the reason Markov processes are preferred is their independence (lack of memory). Independence is a sufficient condition for many theorems that use Markov processes to prove things about average behaviour.
IndianAstronaut | 11 years ago
Markov chains easily allow you to attach cost functions to them. Not particularly data-mining oriented, but good for running simulations and cost-benefit analysis.
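(A sketch of attaching a per-state cost to a chain and estimating the average cost of a run by Monte Carlo simulation; the states, probabilities, and costs are all invented for illustration:)

```python
import random

# Hypothetical machine: usually "ok", occasionally "broken", with a
# per-step cost attached to each state (e.g. running cost vs repair cost).
P = {
    "ok":     [("ok", 0.95), ("broken", 0.05)],
    "broken": [("ok", 0.60), ("broken", 0.40)],
}
COST = {"ok": 1.0, "broken": 25.0}

def run_cost(start, steps, rng):
    """Total cost accumulated over one simulated run of the chain."""
    state, total = start, 0.0
    for _ in range(steps):
        total += COST[state]
        states, weights = zip(*P[state])
        state = rng.choices(states, weights=weights)[0]
    return total

rng = random.Random(0)
avg = sum(run_cost("ok", 100, rng) for _ in range(1000)) / 1000
print(avg)
```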
IndianAstronaut | 11 years ago
Another useful thing to study alongside Markov chains is stochastic matrices. They come up in many situations, such as NLP and combinatorics.
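(A quick sketch of what makes a matrix (row-)stochastic, nonnegative entries with each row summing to 1, plus the handy fact that products of stochastic matrices are again stochastic, so multi-step transition matrices stay valid probabilities. The example matrix is made up:)

```python
def is_stochastic(M, tol=1e-9):
    """True if M has nonnegative entries and every row sums to 1 (within tol)."""
    return all(x >= 0 for row in M for x in row) and \
           all(abs(sum(row) - 1) < tol for row in M)

def matmul(A, B):
    """Plain matrix product, enough to check P squared."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[0.7, 0.3],
     [0.2, 0.8]]
print(is_stochastic(P), is_stochastic(matmul(P, P)))  # True True
```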
ramblerman | 11 years ago
What's the difference between a 'first order' Markov chain and a simple conditional probability?

I mean, what's the big mathematical revelation that grants this its own name?