The Basics of Probability


  • Probability measures the amount of uncertainty of an event: a fact whose occurrence is uncertain.
  • Sample space refers to the set of all possible outcomes, denoted as Ω; an event is a subset of Ω.
  • Some properties:

    • Sum rule: p(A ∪ B) = p(A) + p(B) - p(A ∩ B)
    • Union bound: p(A1 ∪ A2 ∪ ... ∪ An) ≤ p(A1) + p(A2) + ... + p(An)

  • Conditional probability: p(A|B) = p(A, B)/p(B), defined when p(B) > 0. To emphasize that p(A) is unconditional, p(A) is called the "marginal probability", and p(A, B) is called the "joint probability"; the identity p(A, B) = p(B|A) p(A) is called the "multiplication rule" or "factorization rule".
  • Total probability theorem: p(B) = p(B|A)p(A) + p(B|~A)p(~A)
  • Bayes' Theorem: p(A|B) = p(B|A) p(A) / p(B). Bayes' Theorem can be regarded as a rule to update a prior probability p(A) into a posterior probability p(A|B), taking into account the occurrence of the evidence/event B.
  • Conditional independence: Two events A and B, with p(A)>0 and p(B)>0 are independent, given C, if p(A, B|C)=p(A|C) p(B|C).
  • Probability mass function (p.m.f) of a random variable X is a function p_X(x) = Pr[X = x]
  • Joint probability mass function of X and Y is a function p_{X,Y}(x, y) = Pr[X = x, Y = y]
  • Cumulative distribution function (c.d.f) of a random variable X is a function F_X(x) = Pr[X ≤ x]
  • The c.d.f describes the probability of landing at or below a value (and hence, by differencing, in any interval), whereas the p.m.f describes the probability of a specific value.
  • Expectation: the expectation of a random variable X is E[X] = Σ_x x · Pr[X = x]

    • linearity: E[aX + bY] = aE[X] + bE[Y]
    • if X and Y are independent: E[XY] = E[X]·E[Y]
    • Markov's inequality: let X be a nonnegative random variable with finite E[X]; then for all t > 0, Pr[X ≥ t] ≤ E[X]/t

  • Variance: the variance of a random variable X is Var[X] = E[(X - E[X])²] = E[X²] - (E[X])², where σ = √Var[X] is called the standard deviation of the random variable X.

    • Var[aX] = a²Var[X]
    • if X and Y are independent: Var[X + Y] = Var[X] + Var[Y]
    • Chebyshev's inequality: let X be a random variable with finite variance; then for all t > 0, Pr[|X - E[X]| ≥ t] ≤ Var[X]/t²
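The total probability theorem and Bayes' Theorem above can be checked numerically. A minimal sketch, in which all the numbers (prior p(A) = 0.01, p(B|A) = 0.95, p(B|~A) = 0.10) are hypothetical choices:

```python
# Bayes' theorem combined with the total probability theorem,
# using hypothetical numbers for the probabilities involved.
p_A = 0.01            # prior probability of event A
p_B_given_A = 0.95    # probability of evidence B when A holds
p_B_given_notA = 0.10 # probability of evidence B when A does not hold

# total probability theorem: p(B) = p(B|A)p(A) + p(B|~A)p(~A)
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Bayes' theorem: p(A|B) = p(B|A)p(A) / p(B)
p_A_given_B = p_B_given_A * p_A / p_B

print(round(p_B, 4))          # → 0.1085
print(round(p_A_given_B, 4))  # → 0.0876
```

Note how the posterior p(A|B) ≈ 0.09 is much larger than the prior p(A) = 0.01, but still far from 1: observing B updates, rather than replaces, the prior belief.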


Bernoulli Distribution


  • A (single) Bernoulli trial is an experiment whose outcome is random and can be either of two possible outcomes, "success" and "failure", or "yes" and "no". Examples of Bernoulli trials include: flipping a coin, a political opinion poll, etc.
  • The Bernoulli distribution is a discrete probability distribution of one discrete random variable X, which takes value 1 with success probability p: Pr(X=1)=p, and value 0 with failure probability Pr(X=0)=q=1-p. More formally, the Bernoulli distribution is summarized as follows:

    • notation: Bern(p), where 0<p<1 is the probability of success.
    • support: X={0, 1}
    • p.m.f: Pr[X=0]=q=1-p, Pr[X=1]=p
    • mean: E[X]=p
    • variance: Var[X]=p(1-p)
    • It is a special case of the Binomial distribution B(n, p): the Bernoulli distribution is B(1, p).
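The mean and variance formulas above can be verified by simulation. A minimal sketch; the success probability p = 0.3, the sample size, and the function name are arbitrary choices:

```python
import random

def simulate_bernoulli(p, n, seed=0):
    """Draw n samples from Bern(p); return (sample mean, sample variance)."""
    rng = random.Random(seed)
    xs = [1 if rng.random() < p else 0 for _ in range(n)]
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return mean, var

p = 0.3
mean, var = simulate_bernoulli(p, 100_000)
# The empirical moments should be close to E[X] = p and Var[X] = p(1-p).
print(abs(mean - p) < 0.01)            # → True
print(abs(var - p * (1 - p)) < 0.01)   # → True
```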


Binomial Distribution


  • The Binomial distribution is the discrete probability distribution of the number of successes in a sequence of n independent Bernoulli trials with success probability p, denoted as X ~ B(n, p).
  • The Binomial distribution is often used to model the number of successes in a sample of size n drawn with replacement from a population of size N. If the sampling is carried out without replacement, the draws are not independent and so the resulting distribution is a hypergeometric distribution, not a binomial one.
  • The Binomial distribution is summarized as follows:

    • notation: B(n, p), where n is the number of trials and p is the success probability in each trial
    • support: k = {0, 1, ..., n} the number of successes
    • p.m.f: Pr[X = k] = C(n, k) p^k (1-p)^(n-k), where C(n, k) = n!/(k!(n-k)!)
    • mean: np
    • variance: np(1-p)

  • If n is large enough, the skew of the distribution is not too great, and a reasonable approximation to B(n, p) is given by the normal distribution N(np, np(1-p)); this is useful because computing the p.m.f of the Binomial distribution becomes expensive for large n.

    • one rule to determine whether the approximation is reasonable, i.e., whether n is large enough, is that both np and n(1-p) must be greater than 5; if both are greater than 15, the approximation should be good.
    • A second rule is that for n > 5, the normal approximation is adequate if |√((1-p)/p) - √(p/(1-p))| / √n < 0.3
    • Another commonly used rule holds that the normal approximation is appropriate only if everything within 3 standard deviations of the mean is within the range of possible values, that is if: 0 ≤ np - 3√(np(1-p)) and np + 3√(np(1-p)) ≤ n
    • To improve the accuracy of the approximation, we usually apply a continuity correction to account for the fact that the binomial random variable is discrete while the normal random variable is continuous. In particular, the basic idea is to treat the discrete value k as the continuous interval from k-0.5 to k+0.5.

  • In addition, the Poisson distribution can be used to approximate the Binomial distribution when n is very large and p is small. A rule of thumb states that the Poisson distribution is a good approximation of the binomial distribution if n ≥ 20 and p ≤ 0.05, and an excellent approximation if n ≥ 100 and np ≤ 10: B(n, p) ≈ Pois(np).
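Both approximations can be compared against the exact p.m.f using only the standard library. A sketch; the parameters n = 100, p = 0.05, k = 5 are arbitrary choices that satisfy the rules of thumb above, and all function names are ours:

```python
import math

def binom_pmf(n, p, k):
    # exact Binomial p.m.f: C(n, k) p^k (1-p)^(n-k)
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def normal_cdf(x, mu, sigma):
    # c.d.f of N(mu, sigma^2), expressed via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def normal_approx_pmf(n, p, k):
    # continuity correction: treat the discrete value k as [k-0.5, k+0.5]
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    return normal_cdf(k + 0.5, mu, sigma) - normal_cdf(k - 0.5, mu, sigma)

def poisson_pmf(lam, k):
    # Poisson p.m.f with rate lam = np
    return lam**k * math.exp(-lam) / math.factorial(k)

n, p, k = 100, 0.05, 5   # n >= 100 and np = 5 <= 10: both rules satisfied
exact = binom_pmf(n, p, k)
print(exact, normal_approx_pmf(n, p, k), poisson_pmf(n * p, k))
```

All three values agree to roughly two decimal places here; shrinking p (with np fixed) improves the Poisson approximation, while growing n improves the normal one.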

Poisson Distribution


  • Poisson distribution: Let X be a discrete random variable taking values in the set of nonnegative integers {0, 1, 2, ...} with probability: Pr[X = k] = λ^k e^(-λ) / k!. Note that the probability is not uniform over the integers: it peaks near the average rate λ and decays for counts far from it.
  • The Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time and/or space if these events occur with a known average rate and independent of the time since the last event.
  • The Poisson distribution is summarized as follows.

    • notation: Pois(λ), where λ > 0 is a real number indicating the expected number of events occurring in the time interval under observation.
    • support: k = {0, 1, 2, 3, ...}
    • p.m.f: Pr[X = k] = λ^k e^(-λ) / k!
    • mean: E[X] = λ
    • variance: Var[X] = λ

  • Applications of Poisson distribution

    • Telecommunication: telephone calls arriving in a system
    • Management: customers arriving at a counter or call center
    • Civil engineering: cars arriving at a traffic light

  • Generating Poisson random variables (algorithm poisson_random_number):
    init: let L = e^(-λ), k = 0, and p = 1.
    do:
      k = k + 1
      generate a uniform random number u in [0, 1], and let p = p·u
    while p > L.
    return k - 1.
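The procedure above (Knuth's algorithm) translates directly to Python. A sketch, with λ = 4 as an arbitrary test value:

```python
import math
import random

def poisson_random_number(lam, rng=random):
    """Knuth's algorithm: return one sample from Pois(lam)."""
    L = math.exp(-lam)   # init: L = e^(-lambda)
    k, p = 0, 1.0
    while True:          # do ... while p > L
        k += 1
        p *= rng.random()  # multiply by a uniform draw in [0, 1]
        if p <= L:
            return k - 1

# sanity check: the sample mean should approach lambda
rng = random.Random(0)
lam = 4.0
samples = [poisson_random_number(lam, rng) for _ in range(100_000)]
print(abs(sum(samples) / len(samples) - lam) < 0.05)  # → True
```

Since the expected number of loop iterations per sample is λ + 1, this method is practical only for small λ.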
