Combinatorics

For $n$ objects, there are $n!$ permutations, or ordered arrangements. If we want to pick $r$ objects from these, then there are $\binom{n}{r} = \frac{n!}{r!(n-r)!}$ combinations.

Multinomial Coefficients

Taking $n$ distinct items and dividing them into $r$ groups of sizes $n_1, n_2, \ldots, n_r$, where $n_1 + n_2 + \cdots + n_r = n$, the total number of possible divisions can be found with a multinomial coefficient

$$\binom{n}{n_1, n_2, \ldots, n_r} = \frac{n!}{n_1! \, n_2! \cdots n_r!}$$
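As a sanity check, the multinomial coefficient can be computed directly from factorials. A minimal Python sketch (the function name `multinomial` is my own):

```python
from math import factorial

def multinomial(groups):
    """Number of ways to divide sum(groups) distinct items into
    groups of the given sizes: n! / (n1! * n2! * ... * nr!)."""
    n = sum(groups)
    result = factorial(n)
    for size in groups:
        result //= factorial(size)  # each intermediate quotient is an integer
    return result

# Dividing 10 distinct items into groups of sizes 5, 3, and 2:
print(multinomial([5, 3, 2]))  # → 2520
```

With a single group the coefficient is $1$, and with two groups it reduces to the ordinary binomial coefficient.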

Axioms of Probability

For a given experiment, the set of possible outcomes is the sample space, $S$.

Subsets of the sample space are events, which can be combined with union and intersection, operators that obey commutativity, associativity, and distributivity. Furthermore, the complement of a subset $E$, denoted $E^c$, is the set of all outcomes in $S$ but not in $E$.

DeMorgan’s Laws

Taking the complement will invert the inside of a parenthesis from union to intersection or vice versa:

$$\left( \bigcup_i E_i \right)^c = \bigcap_i E_i^c, \qquad \left( \bigcap_i E_i \right)^c = \bigcup_i E_i^c$$

Axioms of Probability

Taking an event $E$, we define $P(E) = \lim_{n \to \infty} \frac{n(E)}{n}$, the ratio of occurrences of $E$ to total trials, in the limit as the number of trials $n \to \infty$. This is called the probability of $E$. Then we have the following axioms:

1. $0 \le P(E) \le 1$
2. $P(S) = 1$
3. For mutually exclusive events $E_1, E_2, \ldots$ (that is, $E_i \cap E_j = \emptyset$ when $i \ne j$), $P\left( \bigcup_{i=1}^{\infty} E_i \right) = \sum_{i=1}^{\infty} P(E_i)$

Note that $P(E^c) = 1 - P(E)$, and $P(\emptyset) = 0$.

Inclusion Exclusion Principle

Note that $P(E \cup F) = P(E) + P(F) - P(E \cap F)$. Generalizing this to $n$ different events, we have

$$P\left( \bigcup_{i=1}^{n} E_i \right) = \sum_{i=1}^{n} P(E_i) - \sum_{i_1 < i_2} P(E_{i_1} E_{i_2}) + \cdots + (-1)^{n+1} P(E_1 E_2 \cdots E_n)$$
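The principle can be verified mechanically on finite sets by summing intersection sizes with alternating signs. A small Python sketch (the function name is my own):

```python
from itertools import combinations

def union_size_by_inclusion_exclusion(sets):
    """|A1 ∪ ... ∪ An| computed via the inclusion-exclusion principle."""
    total = 0
    n = len(sets)
    for k in range(1, n + 1):
        sign = (-1) ** (k + 1)  # + for odd-sized intersections, - for even
        for group in combinations(sets, k):
            total += sign * len(set.intersection(*group))
    return total

A = {1, 2, 3, 4}
B = {3, 4, 5}
C = {4, 5, 6, 7}
# Matches the direct union size:
assert union_size_by_inclusion_exclusion([A, B, C]) == len(A | B | C)  # 7
```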

Conditional Probability

The probability that $E$ occurs given that $F$ occurs is denoted $P(E \mid F)$. If $P(F) > 0$, then

$$P(E \mid F) = \frac{P(E \cap F)}{P(F)}$$

For a series of events $E_1, E_2, \ldots, E_n$, we have

$$P(E_1 E_2 \cdots E_n) = P(E_1) \, P(E_2 \mid E_1) \, P(E_3 \mid E_1 E_2) \cdots P(E_n \mid E_1 \cdots E_{n-1})$$

Bayes’ formula states that, for mutually exclusive events $F_1, \ldots, F_n$ whose union is the sample space,

$$P(F_j \mid E) = \frac{P(E \mid F_j) \, P(F_j)}{\sum_{i=1}^{n} P(E \mid F_i) \, P(F_i)}$$
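The formula is a one-liner over a partition: weight each likelihood by its prior and normalize. A minimal Python sketch with hypothetical numbers (the scenario and the function name `bayes` are my own):

```python
def bayes(priors, likelihoods):
    """Posteriors P(F_j | E) given priors P(F_j) and likelihoods P(E | F_j),
    where F_1..F_n partition the sample space."""
    evidence = sum(p * l for p, l in zip(priors, likelihoods))
    return [p * l / evidence for p, l in zip(priors, likelihoods)]

# Hypothetical diagnostic test: 1% of a population has a condition; the test
# is positive with probability 0.95 if present and 0.05 if absent.
posterior = bayes(priors=[0.01, 0.99], likelihoods=[0.95, 0.05])
print(round(posterior[0], 3))  # P(condition | positive) ≈ 0.161
```

The small posterior despite an accurate test is the classic base-rate effect: the prior $P(F_j)$ dominates when the condition is rare.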

Discrete Random Variables

Let $X$ be some random variable; $F(x) = P(X \le x)$ is the cumulative distribution function of $X$. If $X$ is discrete, then $p(x) = P(X = x)$ is the probability mass function. The expectation $E[X] = \sum_x x \, p(x)$ is effectively the mean of the random variable.

The expectation is a linear function:

$$E[aX + b] = a E[X] + b$$

The variance of $X$ is defined from the expectation:

$$\text{Var}(X) = E\left[ (X - E[X])^2 \right] = E[X^2] - (E[X])^2$$
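Both the mean and the variance of a discrete variable reduce to weighted sums over the mass function. A minimal Python sketch (the helper `expectation` is my own naming), using a fair die:

```python
def expectation(pmf, g=lambda x: x):
    """E[g(X)] for a discrete pmf given as a {value: probability} dict."""
    return sum(g(x) * px for x, px in pmf.items())

die = {i: 1 / 6 for i in range(1, 7)}  # fair six-sided die
mean = expectation(die)                              # E[X] = 3.5
var = expectation(die, lambda x: x**2) - mean**2     # E[X^2] - (E[X])^2
print(mean, var)  # 3.5 and 35/12 ≈ 2.917
```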

A Bernoulli random variable is a binary variable with value $1$ on success, with probability $p$, and $0$ on failure, with probability $1 - p$.

A binomial random variable with parameters $(n, p)$ represents the number of successes in $n$ independent trials, each with success probability $p$:

$$p(i) = \binom{n}{i} p^i (1-p)^{n-i}, \quad i = 0, 1, \ldots, n$$

A Poisson random variable with parameter $\lambda$ has mass function

$$p(i) = e^{-\lambda} \frac{\lambda^i}{i!}, \quad i = 0, 1, \ldots$$

and is an approximation of a binomial with $\lambda = np$ for big $n$ and small $p$.

A geometric random variable is the number of trials needed for the first success, given the probability of success $p$:

$$p(n) = (1-p)^{n-1} p, \quad n = 1, 2, \ldots$$

A negative binomial variable with parameters $(r, p)$ is the number of trials needed for $r$ successes with chance of success $p$:

$$p(n) = \binom{n-1}{r-1} p^r (1-p)^{n-r}, \quad n = r, r+1, \ldots$$

A geometric variable is a negative binomial with $r = 1$. This is equivalent to the chance of $r - 1$ successes in the first $n - 1$ trials, followed by a success on the $n$th trial.

A hypergeometric random variable with parameters $(n, N, m)$ is the number of white balls when $n$ balls are drawn without replacement from a selection of $N$ balls with $m$ white ones:

$$p(i) = \frac{\binom{m}{i} \binom{N-m}{n-i}}{\binom{N}{n}}$$

Continuous Random Variables

For a continuous random variable $X$ defined on the real line, the probability density function $f$ and distribution function $F$ are the following:

$$P(X \in B) = \int_B f(x) \, dx, \qquad F(a) = P(X \le a) = \int_{-\infty}^{a} f(x) \, dx$$

Furthermore, the expectation $E[X]$, and $E[g(X)]$ for a function $g$, are defined to be

$$E[X] = \int_{-\infty}^{\infty} x f(x) \, dx, \qquad E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x) \, dx$$

Note that the same rules for expectation and variance hold in the continuous case as in the discrete case.

A variable is uniformly distributed over $(\alpha, \beta)$ if

$$f(x) = \begin{cases} \frac{1}{\beta - \alpha} & \alpha < x < \beta \\ 0 & \text{otherwise} \end{cases}$$

A variable $X$ is normally distributed with parameters $(\mu, \sigma^2)$ if the density of $X$ is

$$f(x) = \frac{1}{\sqrt{2\pi} \, \sigma} e^{-(x - \mu)^2 / 2\sigma^2}$$

Furthermore, $Z = \frac{X - \mu}{\sigma}$ is a standard normal variable.

The cumulative distribution function of the standard normal will be denoted

$$\Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-y^2/2} \, dy$$

A binomial random variable with parameters $(n, p)$ can be approximated with a normal of $\mu = np$, $\sigma^2 = np(1-p)$. Note that the endpoints must be shifted by $\pm \frac{1}{2}$ to account for continuity correction:

$$P(X = i) \approx \Phi\left( \frac{i + \frac{1}{2} - np}{\sqrt{np(1-p)}} \right) - \Phi\left( \frac{i - \frac{1}{2} - np}{\sqrt{np(1-p)}} \right)$$
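The correction can be checked numerically: $\Phi$ is expressible through the error function available in Python's standard library, so the approximation needs no external packages. A short sketch:

```python
from math import comb, erf, sqrt

def phi(x):
    """Standard normal CDF, written in terms of the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))

# Exact P(X = 50) for Binomial(100, 0.5)...
exact = comb(n, 50) * p**50 * (1 - p)**50
# ...versus the normal approximation of P(49.5 < Y < 50.5):
approx = phi((50.5 - mu) / sigma) - phi((49.5 - mu) / sigma)
assert abs(exact - approx) < 1e-3
```

Without the half-unit shift, $P(X = 50)$ would be approximated by $P(Y = 50) = 0$, which is why the continuity correction is essential for point probabilities.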

An exponential random variable with parameter $\lambda$ has density function $f(x) = \lambda e^{-\lambda x}$ for nonnegative $x$.

This type of variable represents the amount of time before an event occurs, and is uniquely memoryless (the past does not affect the future):

$$P(X > s + t \mid X > t) = P(X > s)$$

For a continuous RV with density $f$ and distribution function $F$, the hazard rate is

$$\lambda(t) = \frac{f(t)}{1 - F(t)}$$

We can recover the distribution function if we integrate from $0$ to $t$:

$$F(t) = 1 - \exp\left( -\int_0^t \lambda(s) \, ds \right)$$
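As a worked example of this recovery, a constant hazard rate $\lambda(t) = \lambda$ gives

$$F(t) = 1 - \exp\left( -\int_0^t \lambda \, ds \right) = 1 - e^{-\lambda t},$$

which is exactly the exponential distribution, consistent with its memoryless property above.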

The gamma function is a generalization of the factorial to the reals, defined

$$\Gamma(\alpha) = \int_0^{\infty} e^{-x} x^{\alpha - 1} \, dx$$

For positive integers $n$, $\Gamma(n) = (n-1)!$.

A gamma distribution with parameters $(\alpha, \lambda)$ is defined by the density

$$f(x) = \frac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}, \quad x \ge 0$$

For integer $\alpha = n$, this represents the amount of time needed for $n$ events to occur in a Poisson process with rate $\lambda$.

Jointly Distributed Random Variables

The joint cumulative distribution function of $X$ and $Y$ is $F(a, b) = P(X \le a, Y \le b)$. The joint probability mass function is $p(x, y) = P(X = x, Y = y)$. In this context, the marginal probability mass functions are

$$p_X(x) = \sum_y p(x, y) \qquad \text{and} \qquad p_Y(y) = \sum_x p(x, y)$$

Independence for joint distributions occurs when the following condition is true for all $a$ and $b$:

$$P(X \le a, Y \le b) = P(X \le a) \, P(Y \le b)$$

Equivalently, $p(x, y) = p_X(x) \, p_Y(y)$ in the discrete case, or $f(x, y) = f_X(x) \, f_Y(y)$ in the continuous case.

The convolution of independent $X$ and $Y$ is the distribution of their sum $X + Y$:

$$F_{X+Y}(a) = \int_{-\infty}^{\infty} F_X(a - y) f_Y(y) \, dy$$

Furthermore, note that

$$f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a - y) f_Y(y) \, dy$$
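In the discrete case the integral becomes a sum over mass functions. A small Python sketch (the helper name `convolve_pmf` is my own) computing the distribution of the sum of two fair dice:

```python
def convolve_pmf(p, q):
    """pmf of X + Y for independent X, Y given as {value: probability} dicts."""
    out = {}
    for x, px in p.items():
        for y, qy in q.items():
            out[x + y] = out.get(x + y, 0.0) + px * qy
    return out

die = {i: 1 / 6 for i in range(1, 7)}
two_dice = convolve_pmf(die, die)
print(two_dice[7])  # most likely sum: 6/36
```

The resulting triangular distribution on $\{2, \ldots, 12\}$ is the familiar two-dice pmf.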

For independent gamma variables $X_i$ with parameters $(\alpha_i, \lambda)$, the sum $\sum_i X_i$ is gamma with parameters $\left( \sum_i \alpha_i, \lambda \right)$. A gamma distribution with $\alpha = \frac{n}{2}$, $\lambda = \frac{1}{2}$ is a chi-square distribution with $n$ degrees of freedom.

A sum of independent normal variables $X_i$ with parameters $(\mu_i, \sigma_i^2)$ is then normally distributed with parameters $\sum_i \mu_i$ and $\sum_i \sigma_i^2$.

The sum of two independent Poisson random variables with parameters $\lambda_1$ and $\lambda_2$ is also Poisson, with parameter $\lambda_1 + \lambda_2$, and independent binomial random variables with parameters $(n, p)$ and $(m, p)$ sum to a binomial with parameters $(n + m, p)$.

The conditional probability mass function of $X$ given $Y = y$ is $p_{X \mid Y}(x \mid y) = \frac{p(x, y)}{p_Y(y)}$. If the two variables are independent, then

$$p_{X \mid Y}(x \mid y) = p_X(x)$$

Conditional densities behave similarly in the continuous case, where

$$f_{X \mid Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)}$$

Joint Probability Distribution Functions

Take $X_1$ and $X_2$ as jointly continuous random variables, with density function $f_{X_1, X_2}$. Let $Y_1 = g_1(X_1, X_2)$ and $Y_2 = g_2(X_1, X_2)$. Note that the Jacobian is defined

$$J(x_1, x_2) = \begin{vmatrix} \dfrac{\partial g_1}{\partial x_1} & \dfrac{\partial g_1}{\partial x_2} \\[1ex] \dfrac{\partial g_2}{\partial x_1} & \dfrac{\partial g_2}{\partial x_2} \end{vmatrix}$$

Then we can come up with a density function for this change of variables:

$$f_{Y_1, Y_2}(y_1, y_2) = f_{X_1, X_2}(x_1, x_2) \, |J(x_1, x_2)|^{-1}$$

where $(x_1, x_2)$ is the solution of $y_1 = g_1(x_1, x_2)$, $y_2 = g_2(x_1, x_2)$.

Expectations

Note that

$$E[g(X, Y)] = \sum_y \sum_x g(x, y) \, p(x, y)$$

and in particular $E[X + Y] = E[X] + E[Y]$.

If $X$ and $Y$ are independent, then

$$E[g(X) \, h(Y)] = E[g(X)] \, E[h(Y)]$$

Covariance

The covariance of $X$ and $Y$ is defined

$$\text{Cov}(X, Y) = E\left[ (X - E[X])(Y - E[Y]) \right] = E[XY] - E[X] \, E[Y]$$

Correlation is defined

$$\rho(X, Y) = \frac{\text{Cov}(X, Y)}{\sqrt{\text{Var}(X) \, \text{Var}(Y)}}$$

Limit Theorems

Markov’s Inequality

For a nonnegative random variable $X$ and any $a > 0$,

$$P(X \ge a) \le \frac{E[X]}{a}$$

Chebyshev’s Inequality

For a random variable $X$ with mean $\mu$ and variance $\sigma^2$, and any $k > 0$,

$$P(|X - \mu| \ge k) \le \frac{\sigma^2}{k^2}$$
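Chebyshev's bound can be compared against exact tail probabilities for a concrete distribution. A short Python sketch using a $\text{Binomial}(100, 0.5)$, whose tails we can sum exactly:

```python
from math import comb

n, p = 100, 0.5
mu, var = n * p, n * p * (1 - p)  # mean 50, variance 25

def binomial_pmf(i):
    return comb(n, i) * p**i * (1 - p)**(n - i)

for k in (5, 10, 15):
    # Exact P(|X - mu| >= k), summed over the pmf...
    tail = sum(binomial_pmf(i) for i in range(n + 1) if abs(i - mu) >= k)
    # ...never exceeds the Chebyshev bound sigma^2 / k^2.
    assert tail <= var / k**2
```

The bound holds but is loose; it uses only the variance, not the shape of the distribution.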