
Summary of Common Probability Distributions

Discrete

Bernoulli distribution

  • pmf
    • f_X(x) = P(X = x) = \left\{\begin{aligned}(1-p)^{1-x}p^x & \quad \text{for } x = 0 \text{ or } 1\\ 0 & \quad\text{otherwise}\end{aligned}\right.
  • expectation
    • E(X) = p
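
A quick numerical check of the pmf and expectation above, as a minimal sketch using scipy.stats (p = 0.3 is an arbitrary value chosen for illustration):

```python
# Bernoulli pmf and mean via scipy.stats; p = 0.3 is arbitrary.
from scipy.stats import bernoulli

p = 0.3
print(bernoulli.pmf(0, p))  # (1-p)^1 * p^0 = 0.7
print(bernoulli.pmf(1, p))  # (1-p)^0 * p^1 = 0.3
print(bernoulli.mean(p))    # E(X) = p = 0.3
```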

Binomial distribution

  • pmf
    • f_X(k) = P(X = k) = \left\{\begin{aligned}C_n^kp^k(1-p)^{n-k} & \quad \text{for } k=0,1,\dots,n\\ 0 & \quad\text{otherwise}\end{aligned}\right.
  • expectation
    • E(X) = np
  • variance
    • var(X) = np(1-p)
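
As a sanity check, the pmf, mean, and variance above can be reproduced with scipy.stats (a minimal sketch; n = 10, p = 0.4, k = 3 are arbitrary values):

```python
# Compare the binomial pmf formula with scipy.stats.binom; values are arbitrary.
from math import comb
from scipy.stats import binom

n, p, k = 10, 0.4, 3
print(comb(n, k) * p**k * (1 - p)**(n - k))  # C_n^k p^k (1-p)^(n-k)
print(binom.pmf(k, n, p))                    # same value via scipy
print(binom.mean(n, p), binom.var(n, p))     # np = 4.0, np(1-p) = 2.4
```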

Geometric distribution

  • pmf
    • f_X(k) = P(X = k) = \left\{\begin{aligned}p(1-p)^{k-1} & \quad \text{for } k=1,2,3,\dots\\ 0 & \quad\text{otherwise}\end{aligned}\right.
  • expectation
    • E(X) = \frac{1}{p}
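
A minimal sketch checking the pmf and expectation above with scipy.stats, whose geom uses the same "number of trials until the first success" convention (p = 0.25 is arbitrary):

```python
# Geometric pmf and mean via scipy.stats.geom; p = 0.25 is arbitrary.
from scipy.stats import geom

p = 0.25
print(p * (1 - p)**2)  # p(1-p)^(k-1) for k = 3
print(geom.pmf(3, p))  # same value via scipy
print(geom.mean(p))    # E(X) = 1/p = 4.0
```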

Negative binomial distribution

  • The negative binomial distribution arises as a generalization of the geometric distribution.

  • Suppose that a sequence of independent trials, each with probability of success p, is performed until there are r successes in all.

    • The last trial must be a success and the preceding k-1 trials must contain exactly r-1 successes, so the probability can be written as p \cdot C_{k-1}^{r-1} p^{r-1}(1-p)^{(k-1)-(r-1)}.
  • pmf

    • f_X(k) = P(X = k) = \left\{\begin{aligned}C_{k-1}^{r-1}p^r(1-p)^{k-r} & \quad \text{for } k=r,r+1,r+2,\dots\\ 0 & \quad\text{otherwise}\end{aligned}\right.
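
A minimal sketch comparing the pmf above with scipy.stats; note that scipy's nbinom counts the number of failures before the r-th success, so it is evaluated at k - r (r = 3, p = 0.4, k = 7 are arbitrary values):

```python
# Negative binomial pmf for the total number of trials k until the r-th success.
from math import comb
from scipy.stats import nbinom

r, p, k = 3, 0.4, 7
print(comb(k - 1, r - 1) * p**r * (1 - p)**(k - r))  # formula above
print(nbinom.pmf(k - r, r, p))                       # same value via scipy
```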

Hypergeometric distribution

  • Suppose that an urn contains n balls, of which r are black and n-r are white. Let X denote the number of black balls drawn when taking m balls without replacement.
  • pmf
    • f_X(k) = P(X = k) = \left\{\begin{aligned}\frac{C_r^kC_{n-r}^{m-k}}{C_n^m} & \quad 0\le k \le r\\ 0 & \quad\text{otherwise}\end{aligned}\right.
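
A minimal sketch checking the pmf above; scipy's hypergeom is parameterized as (population size, number of black balls, number of draws), and all values below are arbitrary:

```python
# Hypergeometric pmf: k black balls when drawing m from n balls, r of them black.
from math import comb
from scipy.stats import hypergeom

n, r, m, k = 20, 7, 5, 2
print(comb(r, k) * comb(n - r, m - k) / comb(n, m))  # formula above
print(hypergeom.pmf(k, n, r, m))                     # same value via scipy
```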

Poisson distribution

  • The Poisson distribution can be derived as the limit of a binomial distribution as the number of trials approaches infinity and the probability of success on each trial approaches zero in such a way that np = \lambda; \lambda can be interpreted as the expected number of successes.
  • pmf
    • P(X = k) = \frac{\lambda^k}{k!} e^{-\lambda}, \quad k = 0,1,2,\dots
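
The limiting relationship can be seen numerically: hold np = \lambda fixed and let n grow. A minimal sketch (lambda = 2 and k = 3 are arbitrary choices):

```python
# Binomial pmf with np = lambda fixed approaches the Poisson pmf as n grows.
from scipy.stats import binom, poisson

lam, k = 2.0, 3
for n in (10, 100, 10_000):
    print(n, binom.pmf(k, n, lam / n))  # converges to the Poisson value
print(poisson.pmf(k, lam))              # lambda^k e^{-lambda} / k!
```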

Continuous

Uniform distribution

  • A uniform r.v. on the interval [a,b] is a model for what we mean when we say "choose a number at random between a and b".
  • pdf
    • f_X(x) = \left\{\begin{aligned}\frac{1}{b-a} & \quad a\le x \le b\\ 0 & \quad\text{otherwise}\end{aligned}\right.
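
A minimal sketch of the density above with scipy.stats, whose uniform is parameterized by loc = a and scale = b - a (a = 2, b = 5 are arbitrary values):

```python
# Uniform density on [a, b] via scipy.stats.uniform.
from scipy.stats import uniform

a, b = 2.0, 5.0
print(uniform.pdf(3.0, loc=a, scale=b - a))  # 1/(b-a) = 1/3 inside [a, b]
print(uniform.pdf(6.0, loc=a, scale=b - a))  # 0 outside [a, b]
```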

Exponential distribution

  • The exponential distribution is often used to model lifetimes or waiting times, in which context it is conventional to replace x by t.
  • pdf
    • f_X(x) = \left\{\begin{aligned}\lambda e^{-\lambda x} & \quad x\ge 0\\ 0 & \quad\text{otherwise}\end{aligned}\right.
  • cdf (obtained by integrating the pdf)
    • F_X(x) = \left\{\begin{aligned}1-e^{-\lambda x} & \quad x\ge 0\\ 0 & \quad\text{otherwise}\end{aligned}\right.
  • expectation
    • E(X) = \frac{1}{\lambda}
  • variance
    • var(X) = \frac{1}{\lambda^2}
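
A minimal sketch checking the pdf, cdf, mean, and variance above; scipy's expon is parameterized by scale = 1/\lambda (lambda = 2 is an arbitrary value):

```python
# Exponential distribution with rate lambda via scipy.stats.expon.
from scipy.stats import expon

lam = 2.0
print(expon.pdf(1.0, scale=1 / lam))  # lambda * exp(-lambda * x)
print(expon.cdf(1.0, scale=1 / lam))  # 1 - exp(-lambda * x)
print(expon.mean(scale=1 / lam))      # E(X) = 1/lambda = 0.5
print(expon.var(scale=1 / lam))       # var(X) = 1/lambda^2 = 0.25
```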

Property of the Poisson distribution

  • Let X, Y be independent Poisson r.v.s with parameters \theta_1, \theta_2; then X+Y \sim Poisson(\theta_1+\theta_2).
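
A minimal empirical check of this property by simulation (theta_1 = 1.5, theta_2 = 2.5, and the sample size are arbitrary choices):

```python
# Sum of independent Poissons is Poisson with the summed parameter.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
x = rng.poisson(1.5, size=100_000)
y = rng.poisson(2.5, size=100_000)
print(np.mean(x + y == 3))  # empirical P(X+Y = 3)
print(poisson.pmf(3, 4.0))  # Poisson(theta_1 + theta_2) pmf at 3
```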

Gamma distribution

  • pdf
    • g(t) = \left\{\begin{aligned}\frac{\lambda^\alpha}{\Gamma(\alpha)}t^{\alpha-1}e^{-\lambda t} & \quad t\ge 0\\ 0 & \quad\text{otherwise}\end{aligned}\right.
  • where \Gamma is the gamma function: \Gamma(x) = \int_0^\infty u^{x-1}e^{-u}\,du, \quad x>0
  • expectation
    • E(X) = \frac{\alpha}{\lambda}
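
A minimal sketch checking the density and expectation above; scipy's gamma uses shape a = \alpha and scale = 1/\lambda (alpha = 3, lambda = 2, t = 1 are arbitrary values):

```python
# Gamma density with shape alpha and rate lambda, compared with scipy.stats.gamma.
from math import exp, gamma as gamma_fn
from scipy.stats import gamma

alpha, lam, t = 3.0, 2.0, 1.0
print(lam**alpha / gamma_fn(alpha) * t**(alpha - 1) * exp(-lam * t))  # formula above
print(gamma.pdf(t, a=alpha, scale=1 / lam))                           # same value via scipy
print(gamma.mean(a=alpha, scale=1 / lam))                             # E(X) = alpha/lambda = 1.5
```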