
Probability and Statistics 2

Markov chains

example 1

Markov chains

example 2

a fly which may get caught in a web … absorbing states

Markov chains

df: stochastic process (náhodný proces)

Markov chains

df: Markov process/chain

Markov chains

df: transition diagram/graph

Markov chains

Markov chain is equivalent to a random walk on the transition diagram

from state $i$ we go to $j$ with probability $p_{ij}$, independently of the past
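
This random-walk view is easy to simulate; a minimal sketch with a hypothetical two-state chain (states and probabilities are made up for illustration, not from the lecture):

```python
import random

# Hypothetical two-state chain: from state i, pick the next state j
# with probability p_ij, regardless of earlier history.
P = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """One move of the random walk on the transition diagram."""
    states, probs = zip(*P[state])
    return random.choices(states, weights=probs)[0]

def walk(start, n):
    """Simulate n steps of the chain starting from `start`."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(walk("sunny", 5))
```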

Markov chains

in example 1, assume $X_0$, what is $P(X_2=w)$?

Markov chains

probability mass function (PMF, pravděpodobnostní funkce) for $X_0,X_1,\dots$

Markov chains

theorem

Markov chains

theorem

Markov chains

df: probability of $k$-step transition … $r_{ij}(k)$

Markov chains

theorem (Chapman-Kolmogorov)

Markov chains

df: $j$ is accessible from $i$ (where $i,j\in S$)

Markov chains

proof

Markov chains

df: a state $i\in S$ is recurrent if we return to it with probability 1

Markov chains

df: $T_i=\min\set{t\geq 1:X_t=i}$

Markov chains

$i$ is transient

Markov chains

example – random walk on a line

Markov chains

theorem

if $i\leftrightarrow j$, then either both $i$ and $j$ are recurrent or both are transient

Markov chains

for finite Markov chains

Markov chains

stationary distribution / steady state distribution

Markov chains

theorem: if a Markov chain is finite, aperiodic (→ not periodic) and irreducible, then

  1. $\exists$ unique stationary distribution $\pi$
  2. $\forall i:\lim_{n\to\infty}(P^n)_{ij}=\pi_j$
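
A quick numerical illustration of point 2 (the chain below is made up; any finite, aperiodic, irreducible example works): the rows of $P^n$ all converge to $\pi$:

```python
import numpy as np

# Illustrative finite, irreducible, aperiodic chain: by the theorem,
# P^n converges to a matrix whose rows all equal the stationary pi.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

Pn = np.linalg.matrix_power(P, 50)
print(Pn)  # both rows ~ [0.8, 0.2]
```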
Markov chains

df: $i\in S$ has period $d_i:=\text{gcd}\set{t:r_{ii}(t)\gt 0}$

$i\in S$ is aperiodic if $d_i=1$

Markov chains

df: $i$ is null recurrent if $i$ is recurrent and $\mathbb E(T_i\mid X_0=i)=\infty$

$i$ is positive recurrent if $i$ is recurrent and $\mathbb E(T_i\mid X_0=i)\lt\infty$

Markov chains

theorem

Markov chains

theorem

Markov chains

the proof is not easy, here is a cheat proof

Markov chains

to find $\pi$, solve the system of linear equations $\pi P=\pi$ and add $\sum_{i\in S}\pi_i=1$
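
A sketch of this computation in numpy (the transition matrix is illustrative): since $\pi P=\pi$ is singular on its own, replace one equation of $(P^T-I)\pi=0$ with the normalization and solve:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])  # illustrative transition matrix

# pi P = pi  <=>  (P^T - I) pi = 0; swap in sum(pi) = 1 for one equation
n = P.shape[0]
A = P.T - np.eye(n)
A[-1, :] = 1.0               # last equation becomes sum_i pi_i = 1
b = np.zeros(n)
b[-1] = 1.0
pi = np.linalg.solve(A, b)
print(pi)  # [0.8 0.2]
```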

Markov chains

detailed balance equation

Markov chains

to imagine this: ant colony moving independently according to a Markov chain

stationary distribution $\iff$ the same number of ants at each state at each time – ants don't "accumulate"

Markov chains

detailed balance equation implies $\pi P=\pi$

detailed balance equation is stronger than $\pi P=\pi$
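
To see the implication, sum the detailed balance equation $\pi_ip_{ij}=\pi_jp_{ji}$ over $i\in S$ and use that the rows of $P$ sum to 1:

$(\pi P)_j=\sum_{i\in S}\pi_ip_{ij}=\sum_{i\in S}\pi_jp_{ji}=\pi_j\sum_{i\in S}p_{ji}=\pi_j$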

Markov chains

MCMC algo. sketch
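
A minimal Metropolis-style sketch (target weights and proposal are made up for illustration): propose a neighbour and accept with probability $\min(1,w_y/w_x)$, which enforces detailed balance with respect to the normalized weights:

```python
import random

# Metropolis sketch on states {0,...,4}; target weights are illustrative.
# Accepting a proposed neighbour y with prob min(1, w[y]/w[x]) satisfies
# detailed balance, so w / sum(w) is the stationary distribution.
w = [1, 2, 3, 2, 1]

def metropolis_step(x):
    y = x + random.choice([-1, 1])
    if y < 0 or y >= len(w):          # proposal off the state space: stay
        return x
    if random.random() < min(1.0, w[y] / w[x]):
        return y
    return x

random.seed(0)
x, counts = 0, [0] * len(w)
for _ in range(200_000):
    x = metropolis_step(x)
    counts[x] += 1
freqs = [c / 200_000 for c in counts]
print(freqs)  # roughly [1/9, 2/9, 3/9, 2/9, 1/9]
```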

Markov chains

absorbing state $i$: $p_{ii}=1$

Markov chains

example: $0\in A$ (?)

$a_i=P(\exists t:X_t=0\mid X_0=i)$

Markov chains

$\mu_i=\mathbb E(T\mid X_0=i)$

$T=\min\set{t: X_t\in A}$

Markov chains

theorem: $(a_i)_{i\in S}$ are the unique solution to
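
The slide elides the system itself; in its standard form it reads $a_i=1$ for the target state, $a_i=0$ for the other absorbing states, and $a_i=\sum_j p_{ij}a_j$ for transient $i$. A sketch on a hypothetical gambler's-ruin chain:

```python
import numpy as np

# Hypothetical gambler's-ruin chain on {0,1,2,3}, absorbing at 0 and 3
# (the chain is illustrative, not from the lecture). With p = 1/2:
#   a_0 = 1, a_3 = 0,
#   a_1 = p*a_0 + p*a_2,  a_2 = p*a_1 + p*a_3
p = 0.5
A = np.array([[1.0, -p],
              [-p, 1.0]])     # unknowns a_1, a_2
b = np.array([p, 0.0])
a1, a2 = np.linalg.solve(A, b)
print(a1, a2)  # 2/3, 1/3
```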

Markov chains

theorem: $(\mu_i)_{i\in S}$ are the unique solution to
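
Stated here in its standard form, since the slide elides the system: the expected absorption times satisfy

$\mu_i=0$ for $i\in A$, and $\mu_i=1+\sum_{j\in S}p_{ij}\mu_j$ for $i\notin A$

– one step is always taken, plus the expected remaining time from the next state.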

Markov chains

proof

Markov chains

example: drunk person on their way home

Markov chains

2-SAT problem

Markov chains

Hidden Markov Model (HMM)

Probability

what is probability?

Probability

where to find/use it?

Probability

probabilistic method

Probability

statistics

Probability

$\hat\theta_\text{MAP}=\text{argmax}_\theta\,p_{\Theta\mid X}(\theta\mid x)$

$\hat\theta$ is a point estimate for $\Theta$

Probability

Beta function

$B(\alpha,\beta)=\int_0^1x^{\alpha-1}(1-x)^{\beta-1}\,dx=\frac{(\alpha-1)!(\beta-1)!}{(\alpha+\beta-1)!}$
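
For integer $\alpha,\beta$ the identity is easy to check numerically; a sketch comparing a midpoint Riemann sum of the integral with the factorial formula:

```python
from math import factorial

# Check B(a, b) = (a-1)!(b-1)!/(a+b-1)! for integer a, b by a
# midpoint Riemann sum of x^(a-1) * (1-x)^(b-1) over [0, 1].
def beta_riemann(a, b, n=100_000):
    h = 1.0 / n
    return h * sum(((k + 0.5) * h) ** (a - 1) * (1 - (k + 0.5) * h) ** (b - 1)
                   for k in range(n))

def beta_factorial(a, b):
    return factorial(a - 1) * factorial(b - 1) / factorial(a + b - 1)

print(beta_riemann(3, 4), beta_factorial(3, 4))  # both ~ 1/60
```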

Probability

Beta distribution

$f_\Theta(x)=\frac1{B(\alpha,\beta)}x^{\alpha-1}(1-x)^{\beta-1}$

Probability

$(\ln f)'=(c+(\alpha-1)\ln x+(\beta-1)\ln (1-x))'={\alpha-1\over x}-{\beta-1\over 1-x}$

Probability

LMS point estimate

Probability

from posterior $f_{\Theta\mid X}$ we can find

Probability

sampling

Probability

$\Theta\mid X=x$

Probability

conditional independence

Probability

conditional expectation

Probability

birthday paradox

$1-x\approx e^{-x}$
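
This approximation is what makes the birthday computation tractable: the exact product $\prod_k(1-\frac k{365})$ becomes $\exp(-\sum_k\frac k{365})=\exp(-\frac{n(n-1)}{2\cdot 365})$. A quick numerical comparison:

```python
from math import exp

# Probability that n people all have distinct birthdays:
# exact product vs. the 1 - x ~ e^(-x) approximation.
def exact_distinct(n, days=365):
    p = 1.0
    for k in range(n):
        p *= (days - k) / days
    return p

def approx_distinct(n, days=365):
    return exp(-n * (n - 1) / (2 * days))

print(exact_distinct(23), approx_distinct(23))  # ~0.493 vs ~0.500
```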

Probability

balls into bins

Probability

questions

Probability

$P(X_i=0)=(1-\frac1n)^m\approx e^{-\frac mn}$
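
A quick simulation sketch of this estimate (parameters are illustrative):

```python
import random

# Throw m balls into n bins, estimate P(bin 0 stays empty),
# and compare with (1 - 1/n)^m ~ e^(-m/n).
def empty_fraction(m, n, trials=20_000):
    empty = 0
    for _ in range(trials):
        if all(random.randrange(n) != 0 for _ in range(m)):
            empty += 1
    return empty / trials

random.seed(1)
m, n = 100, 100
print(empty_fraction(m, n), (1 - 1/n) ** m)  # both ~ e^-1 ~ 0.366
```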

Probability

distribution of $X_i$

Probability

applications

Probability

exact case vs. Poisson case

theorem: any event that happens with probability $\leq p$ in the Poisson case happens with probability $\leq p\cdot e\sqrt m$ in the exact case

Probability

Bernoulli process, Poisson process

Probability

statistics – descriptive vs. inferential

Probability

example: earbuds

Probability

example: alpacas

Probability

tests

Probability

non-parametric version … paired test & permutation test

for paired test, we only permute $X_i$ with $Y_i$

Probability

sign test

$Y_i=$ "sign of $X_i$"

Probability

example

Probability

test procedure (Wilcoxon signed rank test)

Probability

back to our example

Probability

power of test

Probability

paired test

Probability

Mann-Whitney U-test

Probability

types of data

Probability

bootstrapping

Moment generating functions

$X\sim\text{Ber}(p)$

Moment generating functions

theorem: $M_X(s)=\sum_{k=0}^\infty\mathbb E[X^k]\cdot \frac{s^k}{k!}$

$\mathbb E[X^k]$ … the $k$-th moment of $X$

Moment generating functions

proof

$\mathbb E[e^{sX}]=\mathbb E[\sum\frac{(sX)^k}{k!}]=\mathbb E[\sum X^k\frac{s^k}{k!}]=\sum\mathbb E[X^k\frac{s^k}{k!}]=\sum_{k=0}^\infty\mathbb E[X^k]\cdot \frac{s^k}{k!}$
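
The expansion can be sanity-checked for $X\sim\text{Ber}(p)$, where $\mathbb E[X^k]=p$ for every $k\geq 1$ (since $X^k=X$ for a 0/1 variable):

```python
from math import exp, factorial

# For X ~ Ber(p): M_X(s) = 1 - p + p*e^s, and E[X^k] = p for k >= 1.
# Compare the closed form with the truncated moment series.
p, s = 0.3, 0.7
mgf = 1 - p + p * exp(s)
series = 1 + sum(p * s**k / factorial(k) for k in range(1, 30))  # E[X^0] = 1
print(mgf, series)
```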

Moment generating functions

back to Ber(p)

Moment generating functions

$X\sim N(0,1)$

Moment generating functions

example usage

Moment generating functions

theorem

Moment generating functions

example

Moment generating functions

theorem

$X,Y$ independent $\implies M_{X+Y}=M_X\cdot M_Y$

Moment generating functions

proof

$M_{X+Y}(s)=\mathbb E[e^{s(X+Y)}]=\mathbb E[e^{sX}\cdot e^{sY}]=\mathbb E[e^{sX}]\cdot \mathbb E[e^{sY}]=M_X(s)\cdot M_Y(s)$
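
A numerical sanity check (illustrative parameters): for independent $X,Y\sim\text{Ber}(p)$ we have $X+Y\sim\text{Bin}(2,p)$, and its MGF computed from the distribution matches $M_X\cdot M_Y=(1-p+pe^s)^2$:

```python
from math import comb, exp

# X, Y ~ Ber(p) independent => X + Y ~ Bin(2, p); the MGF of the sum,
# computed from the binomial PMF, should equal M_X(s) * M_Y(s).
p, s = 0.4, 1.1
m_ber = 1 - p + p * exp(s)
m_sum = sum(comb(2, k) * p**k * (1 - p)**(2 - k) * exp(s * k)
            for k in range(3))
print(m_sum, m_ber**2)
```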

Moment generating functions

example

Moment generating functions

theorem

Moment generating functions

theorem (CLT)

Moment generating functions

proof

Moment generating functions

theorem (Chernoff inequality)

Moment generating functions

application

Hooray, you're done! 🎉
If my flashcards helped you, you can buy me a beer.