Combinatorics
For n distinguishable objects there are n! possible orderings, and the number of ways to choose a group of r objects from n is the binomial coefficient C(n, r) = n!/((n − r)! r!).
Multinomial Coefficients
Taking n objects and splitting them into r distinct groups of sizes n_1, …, n_r (with n_1 + ⋯ + n_r = n), the number of possible divisions is the multinomial coefficient n!/(n_1! n_2! ⋯ n_r!).
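A quick sketch of the formula in Python (the helper name `multinomial` is mine, not from the notes):

```python
from math import factorial

def multinomial(*group_sizes):
    """Number of ways to split sum(group_sizes) objects into
    distinct groups of the given sizes: n!/(n_1! n_2! ... n_r!)."""
    n = sum(group_sizes)
    result = factorial(n)
    for k in group_sizes:
        result //= factorial(k)
    return result

# e.g. 9 players into groups of 2, 3, and 4:
print(multinomial(2, 3, 4))  # prints 1260
```

With a single group the count is 1, and with two groups it reduces to the binomial coefficient.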
Axioms of Probability
For a given experiment the set of possible outcomes is the sample space, S; an event E is any subset of S.
Subsets of the sample space can be combined with union and intersection, operators that obey commutativity, associativity, and distributivity. Furthermore, the complement of a subset E, written E^c, consists of all outcomes in S that are not in E.
DeMorgan’s Laws
Taking the complement of a combined event turns each union inside the parentheses into an intersection and vice versa: (E ∪ F)^c = E^c ∩ F^c and (E ∩ F)^c = E^c ∪ F^c.
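A minimal Python check of the identities, taking complements within a small finite sample space (the particular sets are illustrative):

```python
S = set(range(10))       # sample space
E = {0, 1, 2, 3}
F = {2, 3, 4, 5}
comp = lambda A: S - A   # complement within S

# (E ∪ F)^c = E^c ∩ F^c  and  (E ∩ F)^c = E^c ∪ F^c
assert comp(E | F) == comp(E) & comp(F)
assert comp(E & F) == comp(E) | comp(F)
```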
Axioms of Probability
Taking an event E ⊆ S, the probability P(E) satisfies three axioms: (1) 0 ≤ P(E) ≤ 1; (2) P(S) = 1; (3) for any sequence of mutually exclusive events E_1, E_2, …, P(∪_i E_i) = Σ_i P(E_i).
Note that P(E^c) = 1 − P(E).
Inclusion Exclusion Principle
Note that P(E ∪ F) = P(E) + P(F) − P(EF). More generally, for events E_1, …, E_n the probability of the union is an alternating sum over intersections: P(∪_i E_i) = Σ_i P(E_i) − Σ_{i<j} P(E_i E_j) + ⋯ + (−1)^{n+1} P(E_1 ⋯ E_n).
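The two-event case can be verified exactly with a small Python enumeration (the 12-outcome equally likely sample space is an illustrative choice):

```python
from fractions import Fraction

S = set(range(1, 13))                    # roll of a fair 12-sided die
P = lambda A: Fraction(len(A), len(S))   # equally likely outcomes
E = {x for x in S if x % 2 == 0}         # even result
F = {x for x in S if x % 3 == 0}         # divisible by 3

lhs = P(E | F)
rhs = P(E) + P(F) - P(E & F)
assert lhs == rhs == Fraction(2, 3)
```

Exact fractions avoid any floating-point doubt about the equality.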
Conditional Probability
The probability that E occurs given that F has occurred is P(E | F) = P(EF)/P(F), defined whenever P(F) > 0.
For a series of events, the multiplication rule gives P(E_1 E_2 ⋯ E_n) = P(E_1) P(E_2 | E_1) ⋯ P(E_n | E_1 ⋯ E_{n−1}).
Bayes’ formula states that P(F_j | E) = P(E | F_j) P(F_j) / Σ_i P(E | F_i) P(F_i), where F_1, …, F_n partition the sample space.
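A sketch of the formula in Python for a diagnostic-test scenario; the sensitivity, false-positive rate, and prevalence figures are made-up numbers for illustration:

```python
# Hypothetical numbers: the test detects a condition with probability 0.99,
# gives a false positive with probability 0.05, and the condition has
# prevalence 0.01 in the population.
p_pos_given_sick = 0.99
p_pos_given_well = 0.05
p_sick = 0.01

# Denominator: P(pos) over the partition {sick, well}
p_pos = p_pos_given_sick * p_sick + p_pos_given_well * (1 - p_sick)
# Bayes: P(sick | pos) = P(pos | sick) P(sick) / P(pos)
p_sick_given_pos = p_pos_given_sick * p_sick / p_pos
print(round(p_sick_given_pos, 3))  # prints 0.167
```

Even with an accurate test, a positive result on a rare condition leaves the posterior probability low, because the false positives from the large healthy group dominate.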
Discrete Random Variables
Let X be a discrete random variable taking values x with probability mass function p(x) = P(X = x). The expected value of X is E[X] = Σ_x x p(x), and more generally E[g(X)] = Σ_x g(x) p(x).
The expectation is a linear function: E[aX + b] = a E[X] + b.
The variance of X is Var(X) = E[(X − E[X])^2] = E[X^2] − (E[X])^2, and Var(aX + b) = a^2 Var(X).
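These definitions are easy to check numerically; a short Python sketch for a fair six-sided die (an illustrative choice):

```python
# pmf of a fair six-sided die
pmf = {x: 1/6 for x in range(1, 7)}

E = sum(x * p for x, p in pmf.items())       # E[X] = Σ x p(x)
E2 = sum(x**2 * p for x, p in pmf.items())   # E[X^2]
var = E2 - E**2                              # Var(X) = E[X^2] − (E[X])^2

print(E, var)  # E[X] = 3.5, Var(X) = 35/12 ≈ 2.9167
```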
A Bernoulli random variable is a binary variable with value 1 with probability p and value 0 with probability 1 − p.
A binomial random variable with parameters (n, p) counts the successes in n independent Bernoulli trials: P(X = i) = C(n, i) p^i (1 − p)^(n − i), with E[X] = np and Var(X) = np(1 − p).
A Poisson random variable with parameter λ has P(X = i) = e^{−λ} λ^i / i!, with E[X] = Var(X) = λ; it approximates a binomial with large n and small p when λ = np.
A geometric random variable is the number of trials needed for a success, given the probability of success p on each trial: P(X = n) = (1 − p)^(n − 1) p, with E[X] = 1/p and Var(X) = (1 − p)/p^2.
A negative binomial variable with parameters (r, p) is the number of trials needed to accumulate r successes: P(X = n) = C(n − 1, r − 1) p^r (1 − p)^(n − r), with E[X] = r/p.
A hypergeometric random variable with parameters (n, N, m) counts the white balls drawn when n balls are chosen without replacement from an urn of N balls, m of which are white: P(X = i) = C(m, i) C(N − m, n − i) / C(N, n).
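A Python sketch comparing the binomial and Poisson mass functions when n is large and p is small; the particular n = 1000, p = 0.003 are illustrative:

```python
from math import comb, exp, factorial

n, p = 1000, 0.003   # large n, small p, so λ = np = 3
lam = n * p

def binom_pmf(i):
    # P(X = i) = C(n, i) p^i (1 − p)^(n − i)
    return comb(n, i) * p**i * (1 - p)**(n - i)

def poisson_pmf(i):
    # P(X = i) = e^{−λ} λ^i / i!
    return exp(-lam) * lam**i / factorial(i)

for i in range(6):
    print(i, round(binom_pmf(i), 4), round(poisson_pmf(i), 4))
```

The two columns agree to about three decimal places, which is the sense in which the Poisson approximates the binomial.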
Continuous Random Variables
For a continuous random variable X with probability density function f, P(X ∈ B) = ∫_B f(x) dx, with ∫_{−∞}^{∞} f(x) dx = 1.
Furthermore, the expectation is defined to be E[X] = ∫_{−∞}^{∞} x f(x) dx.
Note that the same rules for expectation and variance hold in the continuous case as in the discrete one.
A variable is uniformly distributed over (a, b) if f(x) = 1/(b − a) for a < x < b and 0 otherwise; then E[X] = (a + b)/2 and Var(X) = (b − a)^2/12.
A variable is normally distributed with parameters (μ, σ^2) if f(x) = (1/(σ√(2π))) e^{−(x − μ)^2/(2σ^2)}; then E[X] = μ and Var(X) = σ^2.
Furthermore, if X is normal with parameters (μ, σ^2), then Z = (X − μ)/σ is a standard normal with parameters (0, 1).
The cumulative distribution function of the standard normal will be denoted as Φ(x).
A binomial random variable with parameters (n, p) can be approximated with a normal of mean np and variance np(1 − p) when n is large.
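Since Φ can be written in terms of the error function available in Python's standard library, the approximation can be sketched directly; the cutoff 55 and the continuity correction of 0.5 are illustrative choices:

```python
from math import erf, sqrt, comb

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))

# exact P(X <= 55) for a Binomial(100, 0.5) variable
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(56))
# normal approximation with a continuity correction of 0.5
approx = phi((55.5 - mu) / sigma)
print(round(exact, 4), round(approx, 4))
```

Both values come out near Φ(1.1) ≈ 0.864, so the approximation is accurate to a few decimal places here.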
An exponential random variable with parameter λ has density f(x) = λ e^{−λx} for x ≥ 0, with E[X] = 1/λ and Var(X) = 1/λ^2.
This type of variable represents the amount of time before an event occurs, and is uniquely memoryless (the past does not affect the future): P(X > s + t | X > t) = P(X > s).
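Memorylessness can be checked by simulation; a Python sketch with illustrative rate and time values:

```python
import random

random.seed(0)
lam = 2.0
samples = [random.expovariate(lam) for _ in range(200_000)]

s, t = 0.3, 0.5
# P(X > s + t | X > t) should match P(X > s)
beyond_t = [x for x in samples if x > t]
cond = sum(x > s + t for x in beyond_t) / len(beyond_t)
uncond = sum(x > s for x in samples) / len(samples)
print(round(cond, 2), round(uncond, 2))
```

Both estimates land near e^{−λs} = e^{−0.6} ≈ 0.55, up to sampling noise.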
For a continuous RV X with distribution function F and density f, the hazard (failure) rate is λ(t) = f(t)/(1 − F(t)).
We can recover the distribution function if we integrate the hazard rate from 0 to t: F(t) = 1 − exp(−∫_0^t λ(s) ds).
The gamma function is a generalization of the factorial to the reals, defined Γ(α) = ∫_0^∞ e^{−x} x^{α−1} dx; it satisfies Γ(α) = (α − 1)Γ(α − 1), and Γ(n) = (n − 1)! for positive integers n.
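Python's standard library exposes the gamma function directly, so the factorial identity can be spot-checked:

```python
from math import gamma, factorial, pi, sqrt

# Γ(n) = (n − 1)! for positive integers
for n in range(1, 7):
    assert abs(gamma(n) - factorial(n - 1)) < 1e-9

# a non-integer value: Γ(1/2) = √π
assert abs(gamma(0.5) - sqrt(pi)) < 1e-12
```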
A gamma distribution with parameters (α, λ) has density f(x) = λ e^{−λx} (λx)^{α−1}/Γ(α) for x ≥ 0, with E[X] = α/λ and Var(X) = α/λ^2.
For integer α, this represents the amount of time needed for α events to occur in a Poisson process with rate λ.
Jointly Distributed Random Variables
The joint cumulative distribution function of X and Y is F(a, b) = P(X ≤ a, Y ≤ b).
Independence for joint distributions occurs when P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b) for all a and b, equivalently when the joint density (or mass) function factors as f(x, y) = f_X(x) f_Y(y).
The convolution of the distributions of independent X and Y gives the distribution of their sum: f_{X+Y}(a) = ∫_{−∞}^{∞} f_X(a − y) f_Y(y) dy.
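In the discrete case the convolution integral becomes a double sum; a Python sketch for the sum of two fair dice (an illustrative example):

```python
from fractions import Fraction
from collections import defaultdict

die = {i: Fraction(1, 6) for i in range(1, 7)}

# discrete convolution: P(X + Y = a) = Σ_y P(X = a − y) P(Y = y)
conv = defaultdict(Fraction)
for x, px in die.items():
    for y, py in die.items():
        conv[x + y] += px * py

assert conv[7] == Fraction(1, 6)                 # 7 is the most likely total
assert conv[2] == conv[12] == Fraction(1, 36)    # extremes are rarest
assert sum(conv.values()) == 1
```

The resulting triangular distribution over 2..12 is the familiar two-dice table.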
For independent gamma variables with parameters (s, λ) and (t, λ), the sum is gamma with parameters (s + t, λ).
A sum of independent normal variables is also normal, with mean Σ_i μ_i and variance Σ_i σ_i^2.
The sum of two independent Poisson random variables is also Poisson, with parameter λ_1 + λ_2.
If two variables are independent, then the joint mass function factors, p(x, y) = p_X(x) p_Y(y), and neither variable carries information about the other.
Conditional probabilities behave similarly to the event case, where p_{X|Y}(x | y) = P(X = x | Y = y) = p(x, y)/p_Y(y); under independence this reduces to the marginal p_X(x).
Joint Probability Distribution Functions
Take jointly continuous X_1, X_2 and new variables Y_1 = g_1(X_1, X_2), Y_2 = g_2(X_1, X_2), where the map (g_1, g_2) is invertible with Jacobian J(x_1, x_2) ≠ 0.
Then we can come up with a density function for this change of variables: f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2) |J(x_1, x_2)|^{−1}, where (x_1, x_2) solves y_1 = g_1(x_1, x_2) and y_2 = g_2(x_1, x_2).
Expectations
Note that E[g(X, Y)] = Σ_y Σ_x g(x, y) p(x, y) in the discrete case (an integral in the continuous case), and in particular E[X + Y] = E[X] + E[Y].
If X and Y are independent, then E[g(X) h(Y)] = E[g(X)] E[h(Y)].
Covariance
The covariance of X and Y is Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y]; it is 0 when X and Y are independent, and Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
Correlation is defined ρ(X, Y) = Cov(X, Y)/√(Var(X) Var(Y)), and satisfies −1 ≤ ρ(X, Y) ≤ 1.
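A simulation sketch in Python estimating covariance and correlation from samples; the linear-plus-noise model for Y is an illustrative assumption:

```python
import random

random.seed(1)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + 0.5 * random.gauss(0, 1) for x in xs]   # Y = X + noise

mean = lambda v: sum(v) / len(v)
mx, my = mean(xs), mean(ys)
# sample versions of Cov(X, Y), Var(X), Var(Y)
cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
varx = mean([(x - mx) ** 2 for x in xs])
vary = mean([(y - my) ** 2 for y in ys])
rho = cov / (varx * vary) ** 0.5
print(round(cov, 2), round(rho, 2))
```

Here the true values are Cov(X, Y) = 1 and ρ = 1/√1.25 ≈ 0.894, and the estimates land close to them.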
Limit Theorems
Markov’s Inequality: if X is a nonnegative random variable, then for any a > 0, P(X ≥ a) ≤ E[X]/a.
Chebyshev’s Inequality: if X has mean μ and variance σ^2, then for any k > 0, P(|X − μ| ≥ k) ≤ σ^2/k^2.
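Both bounds can be spot-checked by simulation; a Python sketch using an exponential variable with mean and variance 1 (an illustrative choice):

```python
import random

random.seed(2)
samples = [random.expovariate(1.0) for _ in range(100_000)]  # μ = σ² = 1
mean = sum(samples) / len(samples)

# Markov: P(X >= a) <= E[X]/a for nonnegative X
a = 3.0
p_tail = sum(x >= a for x in samples) / len(samples)
assert p_tail <= mean / a

# Chebyshev: P(|X − μ| >= k) <= σ²/k², here with μ = σ² = 1
k = 2.0
p_dev = sum(abs(x - 1.0) >= k for x in samples) / len(samples)
assert p_dev <= 1.0 / k**2
```

The bounds are loose here (the true tail e^{−3} ≈ 0.05 is well under both), which is typical: they hold for every distribution, at the cost of slack for any particular one.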