In this blog post, I introduce the concept of a Markov chain. First, we give the definition of a Markov chain, and we state that every Markov chain has a fixed point (a stationary distribution).
Then we introduce random walks, show how they connect to Markov chains, and prove some basic facts about expected values.
Finally: how to escape a maze without memorizing anything, using a random walk.
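The escape idea above can be sketched in a few lines: at each cell, move to a uniformly random neighbor and keep no record of where you have been. This is a minimal illustration, not the post's own code; the maze representation (an adjacency list) and the names `maze`, `start`, and `exit_cell` are my assumptions.

```python
import random

def random_walk_escape(maze, start, exit_cell, seed=0):
    """Walk the maze by moving to a uniformly random neighbor at each
    step, keeping no memory of visited cells. Returns the number of
    steps taken to first reach exit_cell."""
    rng = random.Random(seed)
    cell, steps = start, 0
    while cell != exit_cell:
        cell = rng.choice(maze[cell])  # choose a neighbor uniformly
        steps += 1
    return steps

# A tiny 4-cell corridor "maze" as an adjacency list (hypothetical example).
maze = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(random_walk_escape(maze, start=0, exit_cell=3))
```

On a connected finite graph this walk reaches the exit with probability 1, which is exactly why no memory is needed; the price is paid in expected time rather than in space.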
Here is Markov's inequality, as stated on Wikipedia:
If X is a nonnegative random variable and a > 0, then the probability that X is at least a is at most the expectation of X divided by a:

$$\Pr(X \ge a) \le \frac{\operatorname{E}[X]}{a}.$$
From the definition of expectation:

$$\operatorname{E}[X] = \int_{-\infty}^{\infty} x f(x)\,dx.$$

However, X is a nonnegative random variable, thus

$$\operatorname{E}[X] = \int_{0}^{\infty} x f(x)\,dx.$$

From this we can derive

$$\operatorname{E}[X] = \int_{0}^{a} x f(x)\,dx + \int_{a}^{\infty} x f(x)\,dx \ge \int_{a}^{\infty} x f(x)\,dx \ge \int_{a}^{\infty} a f(x)\,dx = a \int_{a}^{\infty} f(x)\,dx = a \Pr(X \ge a).$$

From here it is easy to see that

$$\Pr(X \ge a) \le \frac{\operatorname{E}[X]}{a}.$$
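The inequality is easy to check empirically with a Monte Carlo simulation. The sketch below, which is my own illustration rather than part of the original post, samples X from an Exponential(1) distribution (so X is nonnegative with E[X] = 1) and compares the empirical tail probability with the Markov bound E[X]/a.

```python
import random

def markov_bound_check(a, n=100_000, seed=0):
    """Empirically compare Pr(X >= a) with the Markov bound E[X]/a
    for a nonnegative random variable; here X ~ Exponential(1)."""
    rng = random.Random(seed)
    xs = [rng.expovariate(1.0) for _ in range(n)]
    mean = sum(xs) / n                      # empirical E[X]
    tail = sum(x >= a for x in xs) / n      # empirical Pr(X >= a)
    return tail, mean / a

tail, bound = markov_bound_check(a=3.0)
print(tail, bound)  # the tail probability stays below the Markov bound
```

For heavy thresholds the bound is loose (here the true tail is about e^{-3} while the bound is about 1/3), which is typical: Markov's inequality uses only the mean, nothing about the shape of the distribution.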
Chebyshev's inequality
This one is also important.
Chebyshev's inequality follows from Markov's inequality by considering the random variable

$$(X - \operatorname{E}[X])^2$$

and the constant $a^2$, for which Markov's inequality reads

$$\Pr\big((X - \operatorname{E}[X])^2 \ge a^2\big) \le \frac{\operatorname{E}\big[(X - \operatorname{E}[X])^2\big]}{a^2} = \frac{\operatorname{Var}(X)}{a^2}.$$

This argument can be summarized (where "MI" indicates use of Markov's inequality):

$$\Pr(|X - \operatorname{E}[X]| \ge a) = \Pr\big((X - \operatorname{E}[X])^2 \ge a^2\big) \overset{\text{MI}}{\le} \frac{\operatorname{E}\big[(X - \operatorname{E}[X])^2\big]}{a^2} = \frac{\operatorname{Var}(X)}{a^2}.$$
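Chebyshev's inequality can be verified the same way. The sketch below is my own illustration under stated assumptions: X is Uniform(0, 1), so E[X] = 1/2 and Var(X) = 1/12, and we compare the empirical probability of a deviation of at least k standard deviations with the Chebyshev bound 1/k².

```python
import random

def chebyshev_check(k, n=100_000, seed=0):
    """Compare Pr(|X - E[X]| >= k*sigma) with the Chebyshev bound
    1/k^2; here X ~ Uniform(0, 1), so mu = 1/2, sigma^2 = 1/12."""
    rng = random.Random(seed)
    mu, sigma = 0.5, (1 / 12) ** 0.5
    xs = [rng.random() for _ in range(n)]
    tail = sum(abs(x - mu) >= k * sigma for x in xs) / n
    return tail, 1 / k ** 2

tail, bound = chebyshev_check(k=1.5)
print(tail, bound)  # the two-sided tail stays below 1/k^2
```

Note that Chebyshev's bound is two-sided and needs the variance, whereas Markov's bound is one-sided and needs only the mean; this is exactly what the substitution of (X − E[X])² into Markov's inequality buys.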