Introductory Probability Course Note
Made by Mike_Zhang
All articles:
Introductory Probability Course Note
Python Basic Note
Limits and Continuity Note
Calculus for Engineers Course Note
Introduction to Data Analytics Course Note
Introduction to Computer Systems Course Note
Personal notes, FOR REFERENCE ONLY.
Course note of AMA1104 Introductory Probability, The Hong Kong Polytechnic University, 2021.
1. Probability
1.1 Permutations Rule
The number of ways to arrange $r$ items selected from $n$ distinct items, where order matters, is $_nP_r = \dfrac{n!}{(n-r)!}$.
1.2 Combinations Rule
The number of ways to select $r$ items from $n$ distinct items, where order does not matter, is $_nC_r = \dbinom{n}{r} = \dfrac{n!}{r!(n-r)!}$.
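A quick numerical check of the two counting rules, as a minimal sketch (the values of $n$ and $r$ are arbitrary examples, not from the course):

```python
from math import perm, comb, factorial

n, r = 10, 3

# Permutations: ordered selections of r items from n distinct items.
# nPr = n! / (n - r)!
print(perm(n, r))                          # 720
print(factorial(n) // factorial(n - r))    # 720, same value from the formula

# Combinations: unordered selections of r items from n distinct items.
# nCr = n! / (r! * (n - r)!)
print(comb(n, r))                                         # 120
print(factorial(n) // (factorial(r) * factorial(n - r)))  # 120
```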
1.3 Collectively Exhaustive
A set of events $\{A_1, A_2,\dots, A_n\}$ is said to be collectively exhaustive if at least one of the events must occur (the events list all possible outcomes); the sample space is then $S = A_1\cup A_2\cup \dots \cup A_n$.
1.4 Joint Probability
The probability of the intersection of two events $A$ and $B$ is called their joint probability, written $P(A\cap B)$.
1.5 Union Probability
The probability of the union of two events is called their union probability, given by the addition rule: $P(A\cup B) = P(A) + P(B) - P(A\cap B)$.
1.6 Mutually Exclusive
Two events are said to be mutually exclusive if, when one of the two events occurs in an experiment, the other cannot occur; that is, $P(A\cap B) = 0$.
If the events $A$ and $B$ are mutually exclusive, the probability that either event occurs is $P(A\cup B) = P(A) + P(B)$.
1.7 Conditional Probability
The probability of an event $A$ given that an event $B$ has occurred is called the conditional probability of $A$ given $B$, denoted by $P(A|B)$ and read as 'the probability of $A$ given that $B$ has already occurred'.
If $A$ and $B$ are two events with $P(A)\neq 0$ and $P(B)\neq 0$, then $P(A|B) = \dfrac{P(A\cap B)}{P(B)}$ and $P(B|A) = \dfrac{P(A\cap B)}{P(A)}$.
The probability that both of the two events $A$ and $B$ occur is $P(A\cap B) = P(B)\,P(A|B) = P(A)\,P(B|A)$.
1.8 Independent
Two events $A$ and $B$ are said to be independent if the occurrence of one does not affect the probability of the occurrence of the other.
$A$ and $B$ are independent events if $P(A|B) = P(A)$ or $P(B|A) = P(B)$.
If two events $A$ and $B$ are independent, then $P(A\cap B) = P(A)\,P(B)$.
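A small enumeration sketch of the independence condition, using two fair dice as an illustrative example of my own (not from the notes):

```python
from itertools import product
from fractions import Fraction

# Sample space: all 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

A = {o for o in outcomes if o[0] % 2 == 0}   # first die shows an even number
B = {o for o in outcomes if o[1] == 6}       # second die shows a 6

def prob(event):
    return Fraction(len(event), len(outcomes))

print(prob(A), prob(B), prob(A & B))       # 1/2, 1/6, 1/12
print(prob(A & B) == prob(A) * prob(B))    # True: A and B are independent
```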
1.9 Law of Total Probability
Assume that $B_1,B_2,…,B_n$ are collectively exhaustive events where $P(B_i)\gt 0$, for $i=1,2,…,n$ and $B_i$ and $B_j$ are mutually exclusive events for $i\neq j$.
Then for any event $A$: $P(A) = P(B_1)P(A|B_1) + P(B_2)P(A|B_2) + \dots + P(B_n)P(A|B_n) = \sum_{i=1}^{n} P(B_i)\,P(A|B_i)$
1.10 Bayes’ Theorem
Suppose that $B_1,B_2,\dots,B_n$ are $n$ mutually exclusive and collectively exhaustive events; then $P(B_k|A) = \dfrac{P(B_k\cap A)}{P(A)} = \dfrac{P(B_k)\,P(A|B_k)}{P(B_1)P(A|B_1)+P(B_2)P(A|B_2)+\dots+P(B_n)P(A|B_n)}$
$\because P(B_k\cap A) = P(B_k)\cdot P(A|B_k)$, based on the Conditional Probability rule, and
$P(A) = P(B_1)P(A|B_1) + P(B_2)P(A|B_2) + \dots + P(B_n)P(A|B_n)$, based on the Law of Total Probability.
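A worked sketch of the Law of Total Probability and Bayes' Theorem; the numbers below are hypothetical, chosen only to illustrate the two formulas:

```python
# Hypothetical numbers for illustration (not from the course): a condition
# affects 1% of a population; a test detects it with probability 0.95 and
# gives a false positive with probability 0.05.
p_B = [0.01, 0.99]            # P(B1) = has condition, P(B2) = does not
p_A_given_B = [0.95, 0.05]    # P(A|B1), P(A|B2), where A = "test is positive"

# Law of Total Probability: P(A) = sum_i P(Bi) * P(A|Bi)
p_A = sum(pb * pa for pb, pa in zip(p_B, p_A_given_B))

# Bayes' Theorem: P(B1|A) = P(B1) * P(A|B1) / P(A)
p_B1_given_A = p_B[0] * p_A_given_B[0] / p_A

print(f"P(A) = {p_A:.4f}")              # 0.0590
print(f"P(B1|A) = {p_B1_given_A:.4f}")  # 0.1610
```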
2. Probability Distribution
2.1 Discrete Random Variable
2.1.1 Probability Distribution
- $0\le P(x)\le 1$ for each value of $x$;
- $\sum_x P(x)=1$;
- $P(x)=P(X=x)$
Mean or Expected value: $\mu = E(X) = \sum_x x\,P(x)$
Variance: $\sigma^2 = Var(X) = \sum_x (x-\mu)^2\,P(x) = E(X^2) - \mu^2$
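A minimal sketch computing the mean and variance of a discrete random variable from a made-up probability table:

```python
# Hypothetical probability distribution of a discrete random variable X
# (values chosen only for illustration; the probabilities sum to 1).
x_values = [0, 1, 2, 3]
probs    = [0.1, 0.3, 0.4, 0.2]

assert abs(sum(probs) - 1.0) < 1e-9   # a valid distribution must sum to 1

# Mean: mu = E(X) = sum of x * P(x)
mu = sum(x * p for x, p in zip(x_values, probs))

# Variance: sigma^2 = sum of (x - mu)^2 * P(x)  (equivalently E(X^2) - mu^2)
var = sum((x - mu) ** 2 * p for x, p in zip(x_values, probs))

print(mu, var)   # ~1.7 and ~0.81
```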
2.1.2 Binomial Probability Distribution
$X\sim Bin(n,p)$ with probability distribution $P(X=x) = \dbinom{n}{x} p^x (1-p)^{n-x}$, $x = 0,1,\dots,n$, where
$n$ = total number of trials
$p$ = probability of success
$x$ = number of successes in $n$ trials
Mean or Expected value: $\mu = E(X) = np$
Variance: $\sigma^2 = Var(X) = np(1-p)$
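A short check of the binomial formulas, assuming SciPy is available; the values of $n$, $p$ and $x$ are arbitrary:

```python
from math import comb
from scipy.stats import binom

n, p = 10, 0.3   # arbitrary example parameters

# P(X = 4) from the binomial formula and from scipy.stats
x = 4
print(comb(n, x) * p**x * (1 - p)**(n - x))   # ~0.2001
print(binom.pmf(x, n, p))                     # same value

# P(X <= 4), mean, and variance
print(binom.cdf(4, n, p))     # cumulative probability
print(binom.mean(n, p))       # n * p = 3.0
print(binom.var(n, p))        # n * p * (1 - p) = 2.1
```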
2.1.3 Poisson Probability Distribution
$X\sim Poisson(\lambda)$ with probability distribution $P(X=x) = \dfrac{e^{-\lambda}\lambda^x}{x!}$, $x = 0,1,2,\dots$
where $\lambda$ is the mean number of occurrences in that interval
Mean or Expected value: $\mu = E(X) = \lambda$
Variance: $\sigma^2 = Var(X) = \lambda$
Poisson Approximation to the Binomial Distribution:
when the number of trials $n$ is large and at the same time the probability $p$ is small (generally such that $\mu=np\le 7$)
$X$ = number of successes in $n$ independent trials
$p$ = probability of success
$\lambda = \mu = np$
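A sketch of the Poisson distribution and of the Poisson approximation to the binomial, assuming SciPy is available; all parameter values are arbitrary illustrations:

```python
from scipy.stats import poisson, binom

# Poisson example: lambda (mean number of occurrences) chosen arbitrarily
lam = 3.0
print(poisson.pmf(2, lam))    # P(X = 2) = e^(-3) * 3^2 / 2!
print(poisson.mean(lam))      # 3.0
print(poisson.var(lam))       # 3.0

# Poisson approximation to the binomial: n large, p small, lambda = n*p
n, p = 1000, 0.004            # arbitrary values with n*p = 4 <= 7
x = 5
print(binom.pmf(x, n, p))     # exact binomial probability
print(poisson.pmf(x, n * p))  # Poisson approximation, close to the exact value
```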
2.1.4 Negative Binomial Probability Distribution
The probability that $k$ independent trials must be performed until a total of $r$ successes is accumulated.
$X \sim NegBin(r,p)$ with probability distribution $P(X=k) = \dbinom{k-1}{r-1} p^r (1-p)^{k-r}$, $k = r, r+1,\dots$, where
$p$ = probability of each trial being a success
$k$ = number of independent trials performed, NOT fixed.
$r$ = number of successes, fixed.
Mean or Expected value: $\mu = E(X) = \dfrac{r}{p}$
Variance: $\sigma^2 = Var(X) = \dfrac{r(1-p)}{p^2}$
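A sketch of the negative binomial probability in the total-trials form above; note that `scipy.stats.nbinom` counts failures rather than total trials, so its argument is shifted by $r$ (the parameter values are arbitrary):

```python
from math import comb
from scipy.stats import nbinom

r, p = 3, 0.4   # arbitrary example: keep trying until 3 successes, P(success) = 0.4
k = 7           # total number of trials performed

# Negative binomial PMF in the "total trials" form:
# P(X = k) = C(k-1, r-1) * p^r * (1-p)^(k-r)
prob = comb(k - 1, r - 1) * p**r * (1 - p)**(k - r)
print(prob)                        # ~0.1244

# Cross-check: scipy.stats.nbinom counts FAILURES before the r-th success,
# so its argument is k - r rather than k.
print(nbinom.pmf(k - r, r, p))     # same value

# Mean r/p and variance r(1-p)/p^2 of the total-trials form
print(r / p, r * (1 - p) / p**2)   # 7.5 and 11.25
```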
2.1.5 Geometric Probability Distribution
The probability that the first occurrence of success requires k independent trials.
(i.e. $X \sim NegBin (1,p)$)
$X \sim Geo(p)$ with probability distribution $P(X=k) = (1-p)^{k-1}p$, $k = 1,2,\dots$, where
$p$ = probability of each trial being a success
$k$ = number of independent trials performed, NOT fixed.
Mean or Expected value: $\mu = E(X) = \dfrac{1}{p}$
Variance: $\sigma^2 = Var(X) = \dfrac{1-p}{p^2}$
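A quick check of the geometric distribution with an arbitrary $p$ and $k$, assuming SciPy is available:

```python
from scipy.stats import geom

p = 0.2    # arbitrary probability of success on each trial
k = 4      # first success happens on the 4th trial

# P(X = k) = (1-p)^(k-1) * p ; scipy's geom uses the same "trials until
# first success" convention with support k = 1, 2, ...
print((1 - p) ** (k - 1) * p)    # 0.1024
print(geom.pmf(k, p))            # 0.1024

print(geom.mean(p))              # 1/p = 5.0
print(geom.var(p))               # (1-p)/p^2 = 20.0
```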
2.1.6 Hypergeometric Probability Distribution
When sampling is without replacement and either the number of elements $N$ in the population is small or the sample size $n$ is large relative to $N$, the number of "successes" in a random sample of $n$ items has a hypergeometric probability distribution.
$X\sim Hp(x)$: $P(X=x) = \dfrac{\binom{r}{x}\binom{N-r}{n-x}}{\binom{N}{n}}$, where
$N$ = number of elements in the population
$r$ = number of successes in the population
$n$ = sample size (draw from $N$)
$x$ = number of successes in the sample (successful draw from $N$)
Mean or Expected value: $\mu = E(X) = \dfrac{nr}{N}$
Variance: $\sigma^2 = Var(X) = n\cdot\dfrac{r}{N}\cdot\left(1-\dfrac{r}{N}\right)\cdot\left(\dfrac{N-n}{N-1}\right)$
where the $\bigg(\frac{N-n}{N-1}\bigg)$ is the finite population correction factor
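A sketch of the hypergeometric probability; `scipy.stats.hypergeom` uses its own argument order, mapped in the comments (the population and sample sizes are arbitrary examples):

```python
from math import comb
from scipy.stats import hypergeom

# Arbitrary example: N = 20 items in the population, r = 8 are "successes",
# draw n = 5 without replacement, ask for x = 2 successes in the sample.
N, r, n, x = 20, 8, 5, 2

# P(X = x) = C(r, x) * C(N - r, n - x) / C(N, n)
print(comb(r, x) * comb(N - r, n - x) / comb(N, n))   # ~0.3973

# scipy's hypergeom takes (M, n, N) = (population size, successes in the
# population, sample size), so map our (N, r, n) accordingly.
print(hypergeom.pmf(x, N, r, n))                       # same value

# Mean n*r/N and variance with the finite population correction factor
print(n * r / N)                                       # 2.0
print(n * (r / N) * (1 - r / N) * (N - n) / (N - 1))   # ~0.947
```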
2.2 Continuous Random Variable
2.2.1 Probability Distribution
$P(X=c) = 0$
$P(X\lt c)=P(X\le c)$
2.2.2 Normal Distribution
PDF: $f(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$, $-\infty \lt x \lt \infty$
$X$ follows a normal distribution with mean $\mu$, standard deviation $\sigma$, and variance $\sigma^2$:
$X\sim N(\mu,\sigma^2)$
Mean or Expected value: $E(X) = \mu$
Variance: $Var(X) = \sigma^2$
2.2.3 Standard Normal Distribution
Normal Distribution with $\mu = 0$ and $\sigma=1$
$Z\sim N(0,1)$
Standardizing a Normal Distribution: converting an $X$ value to a $Z$ value with $Z = \dfrac{X-\mu}{\sigma}$,
where $X\sim N(\mu,\sigma^2)$
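A sketch showing that standardizing and looking up the standard normal CDF gives the same probability as using the normal distribution directly (the values of $\mu$, $\sigma$ and $x$ are arbitrary):

```python
from scipy.stats import norm

mu, sigma = 50.0, 10.0    # arbitrary example parameters
x = 65.0

# Standardize: Z = (X - mu) / sigma, then look up the standard normal CDF
z = (x - mu) / sigma
print(z)                   # 1.5
print(norm.cdf(z))         # P(Z < 1.5) ~ 0.9332, same as a Z-table lookup

# Equivalently, use the normal distribution with mean mu and sd sigma directly
print(norm.cdf(x, loc=mu, scale=sigma))    # P(X < 65), same value
```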
Normal Distribution as an Approximation to Binomial Distribution:
when both $np$ and $n(1-p)$ are sufficiently large (a common rule of thumb is both $\ge 5$)
3 Steps:
- Get $\mu = np$ and $\sigma = \sqrt{np(1-p)}$ for the binomial distribution;
- Convert the discrete random variable to a continuous random variable by applying the continuity correction of $\pm 0.5$;
- Compute the required probability using the normal distribution (see the sketch below).
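A sketch of the three steps with the continuity correction, on an arbitrary binomial example:

```python
from math import sqrt
from scipy.stats import binom, norm

n, p = 100, 0.4            # arbitrary example; np = 40 and n(1-p) = 60 are both large
mu, sigma = n * p, sqrt(n * p * (1 - p))

# Exact binomial probability P(X <= 45)
exact = binom.cdf(45, n, p)

# Normal approximation with the continuity correction: P(X <= 45) ~ P(Y < 45.5)
approx = norm.cdf((45.5 - mu) / sigma)

print(exact, approx)       # both ~0.87, close agreement
```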
3. Sampling Distribution & Estimation
3.1 Sampling Distribution of the Sample Mean
Sampling Distribution of $\bar{X}$
Mean: $\mu_{\bar{X}} = \mu$
Standard Error: $\sigma_{\bar{X}} = \dfrac{\sigma}{\sqrt{n}}$
When $n/N \gt 0.05$ ($N$ for population size): $\sigma_{\bar{X}} = \dfrac{\sigma}{\sqrt{n}}\sqrt{\dfrac{N-n}{N-1}}$
Shape: normal if the population is normal; approximately normal for large $n$ by the Central Limit Theorem (commonly $n\ge 30$).
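A small simulation sketch (using NumPy, with an arbitrarily chosen non-normal population) illustrating that the sample mean has mean $\mu$ and standard error $\sigma/\sqrt{n}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the sampling distribution of the sample mean: draw many samples of
# size n from an Exponential(1) population (an arbitrary, non-normal choice),
# which has population mean mu = 1 and population standard deviation sigma = 1.
n, n_samples = 30, 20000
mu, sigma = 1.0, 1.0

sample_means = rng.exponential(scale=1.0, size=(n_samples, n)).mean(axis=1)

print(sample_means.mean(), mu)                        # mean of X-bar is ~mu
print(sample_means.std(ddof=1), sigma / np.sqrt(n))   # standard error ~ sigma/sqrt(n)
```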
3.2 Sampling Distribution of the Sample Proportion
Sampling Distribution of $\bar{p}$
Mean: $\mu_{\bar{p}} = p$
Standard Error: $\sigma_{\bar{p}} = \sqrt{\dfrac{p(1-p)}{n}}$
When $n/N \gt 0.05$ ($N$ for population size): $\sigma_{\bar{p}} = \sqrt{\dfrac{p(1-p)}{n}}\sqrt{\dfrac{N-n}{N-1}}$
Shape: approximately normal when $np$ and $n(1-p)$ are both sufficiently large (a common rule of thumb is both $\ge 5$).
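A similar simulation sketch for the sample proportion, with arbitrary $p$ and $n$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate the sampling distribution of the sample proportion: each sample of
# size n is a set of Bernoulli(p) trials, and p-bar is the fraction of successes.
p, n, n_samples = 0.3, 200, 20000
sample_props = rng.binomial(n, p, size=n_samples) / n

print(sample_props.mean())              # ~0.3, i.e. mean of p-bar is ~p
print(sample_props.std(ddof=1))         # ~0.0324
print(np.sqrt(p * (1 - p) / n))         # 0.0324 = sqrt(p(1-p)/n)
```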
3.3 Sampling Distribution of the Sample Variance
Sampling Distribution of $s^2$
For a random sample of size $n$ drawn from a normal population with variance $\sigma^2$, the sample variance is $s^2 = \dfrac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x})^2$.
So, the statistic
$\chi^2 = \dfrac{(n-1)s^2}{\sigma^2}$
has a chi-square($\chi^2$) distribution with $n-1$ degrees of freedom
Mean: $n-1$ (the mean of a chi-square distribution equals its degrees of freedom)
Variance: $2(n-1)$
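A simulation sketch checking that $(n-1)s^2/\sigma^2$ behaves like a chi-square distribution with $n-1$ degrees of freedom (the sample size and population parameters are arbitrary):

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)

# Simulate (n-1)*s^2 / sigma^2 for many normal samples and compare with chi2(n-1).
n, sigma, n_samples = 10, 2.0, 20000
samples = rng.normal(loc=5.0, scale=sigma, size=(n_samples, n))
s2 = samples.var(axis=1, ddof=1)                  # sample variances
stat = (n - 1) * s2 / sigma**2

print(stat.mean(), chi2.mean(n - 1))              # both ~9  (= n - 1)
print(stat.var(ddof=1), chi2.var(n - 1))          # both ~18 (= 2(n - 1))
```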
3.4 Confidence interval of population mean $\mu$ with known Variance
a $(1-\alpha)100\%$ C.I. for $\mu$ is $\bar{x} \pm Z_{\alpha/2}\dfrac{\sigma}{\sqrt{n}}$, where
$\bar{x}$: sample mean
$\sigma$: population standard deviation
$n$: sample size
$Z_{\alpha/2}$: the value from the standard normal distribution table with upper-tail probability $\alpha/2$.
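A sketch of the $Z$-based confidence interval with hypothetical summary statistics:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical data: sample mean 52.5 from n = 36 observations, with a known
# population standard deviation sigma = 6, at 95% confidence (alpha = 0.05).
x_bar, sigma, n, alpha = 52.5, 6.0, 36, 0.05

z = norm.ppf(1 - alpha / 2)          # Z_{alpha/2} ~ 1.96
margin = z * sigma / np.sqrt(n)

print(z)                                  # ~1.96
print((x_bar - margin, x_bar + margin))   # ~(50.54, 54.46)
```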
3.5 Confidence interval of population mean $\mu$ with unknown Variance
a $(1-\alpha)100\%$ C.I. for $\mu$ is $\bar{x} \pm t_{\alpha/2,\,n-1}\dfrac{s}{\sqrt{n}}$, where
$\bar{x}$: sample mean
$s$: sample standard deviation
$n$: sample size
$t_{\alpha/2,n-1}$: the value from the $t$ distribution table with upper-tail probability $\alpha/2$ and $n-1$ degrees of freedom.
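A sketch of the $t$-based confidence interval on a made-up sample:

```python
import numpy as np
from scipy.stats import t

# Hypothetical sample of n = 12 measurements (values made up for illustration).
data = np.array([9.8, 10.2, 10.4, 9.9, 10.1, 10.3,
                 9.7, 10.0, 10.5, 9.6, 10.2, 10.1])

n = len(data)
x_bar = data.mean()
s = data.std(ddof=1)                     # sample standard deviation
alpha = 0.05

t_crit = t.ppf(1 - alpha / 2, n - 1)     # t_{alpha/2, n-1}
margin = t_crit * s / np.sqrt(n)

print(x_bar, s)
print((x_bar - margin, x_bar + margin))  # 95% confidence interval for mu
```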
References
Slides of AMA1104 Introductory Probability, The Hong Kong Polytechnic University.
Personal notes, FOR REFERENCE ONLY. Please credit the source when reposting.
Made by Mike_Zhang