## Foundational Probability Concepts

Probability is a branch of mathematics that deals with calculating the likelihood of a given event's occurrence, expressed as a number between 0 and 1. Here we explore foundational concepts like events, independence, and various rules that guide the calculation of probabilities in different scenarios.

## Basic Definitions

**Experiment:** A procedure that can be infinitely repeated and has a well-defined set of outcomes.

**Sample Space (S):** The set of all possible outcomes of an experiment.

**Event:** A subset of the sample space.

## Types of Events

**Simple Event:** An event with a single outcome.

**Compound Event:** An event made up of two or more simple events.

## Independent and Dependent Events

**Independent Events:** The occurrence of one event does not affect the probability of the other.

**Dependent Events:** The occurrence of one event affects the probability of the other.

## Mutually Exclusive Events

Two events are mutually exclusive if they cannot occur at the same time. This means \(P(A \cap B) = 0\).

## The Complement Rule

The probability of an event not occurring is 1 minus the probability that it does occur. \(P(A^{c}) = 1 - P(A)\)
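As a minimal sketch, consider the (hypothetical) event of rolling a six on a fair die; the complement rule gives the probability of *not* rolling a six:

```python
# Hypothetical example: probability of NOT rolling a six on a fair die.
p_six = 1 / 6
p_not_six = 1 - p_six  # complement rule: P(A^c) = 1 - P(A)
print(round(p_not_six, 4))  # 0.8333
```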

## Addition Rules

**For Mutually Exclusive Events:** \(P(A \cup B) = P(A) + P(B)\)

**For Non-Mutually Exclusive Events:** \(P(A \cup B) = P(A) + P(B) - P(A \cap B)\)
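A quick sketch of the general addition rule, using a hypothetical die-roll example where A = "roll is even" and B = "roll is greater than 3" (these events overlap, so the intersection must be subtracted):

```python
from fractions import Fraction

# Hypothetical example: one roll of a fair six-sided die.
sample_space = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}  # event: roll is even
B = {4, 5, 6}  # event: roll is greater than 3

def prob(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(sample_space))

# General addition rule: P(A or B) = P(A) + P(B) - P(A and B)
p_union = prob(A) + prob(B) - prob(A & B)
assert p_union == prob(A | B)  # matches counting the union directly
print(p_union)  # 2/3
```

Using `Fraction` keeps the arithmetic exact, so the rule can be checked against a direct count of the union.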

## Conditional Probability

The probability of an event given that another event has occurred. \(P(A | B) = \frac{P(A \cap B)}{P(B)}\)
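The definition can be sketched with a hypothetical die-roll example: the probability the roll is even, given that it is greater than 3:

```python
from fractions import Fraction

# Hypothetical example: one roll of a fair six-sided die.
sample_space = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}  # event: roll is even
B = {4, 5, 6}  # event: roll is greater than 3

def prob(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(sample_space))

# Conditional probability: P(A | B) = P(A and B) / P(B)
p_a_given_b = prob(A & B) / prob(B)
print(p_a_given_b)  # 2/3
```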

## Multiplication Rules

**For Independent Events:** \(P(A \text{ and } B) = P(A) \cdot P(B)\)

**For Dependent Events:** \(P(A \text{ and } B) = P(A|B) \cdot P(B)\)
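Both rules can be sketched with standard textbook examples (coin flips and card draws; the specific numbers here are illustrative, not from the text above):

```python
# Independent events (hypothetical): two fair coin flips both landing heads.
p_heads = 0.5
p_both_heads = p_heads * p_heads  # P(A and B) = P(A) * P(B)
print(p_both_heads)  # 0.25

# Dependent events (hypothetical): drawing two aces from a 52-card deck
# without replacement. Using P(A and B) = P(A | B) * P(B), where B is
# "first card is an ace" and A is "second card is an ace".
p_first_ace = 4 / 52
p_second_ace_given_first = 3 / 51  # one ace already removed from the deck
p_two_aces = p_second_ace_given_first * p_first_ace
print(round(p_two_aces, 4))
```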

## Bayes' Theorem

Describes the probability of an event, based on prior knowledge of conditions that might be related to the event. \(P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}\)
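A common illustration is diagnostic testing; the prevalence and test accuracies below are hypothetical numbers chosen only to show the mechanics:

```python
# Hypothetical numbers: a disease with 1% prevalence, a test with 95%
# sensitivity P(pos | disease) and a 5% false-positive rate P(pos | healthy).
p_disease = 0.01
p_pos_given_disease = 0.95
p_pos_given_healthy = 0.05

# Law of total probability: P(pos) = P(pos|D)P(D) + P(pos|not D)P(not D)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(D | pos) = P(pos | D) * P(D) / P(pos)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # 0.161
```

Note how a positive result still leaves the posterior probability well below 1, because the prior \(P(A)\) is small.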

## Random Variables

A random variable is a variable whose possible values are numerical outcomes of a random phenomenon. There are two types: discrete and continuous.

## Expected Value

The expected value of a random variable gives a measure of the center of the distribution of the variable. It is essentially a weighted average: \(E(X) = \sum x_i \cdot P(x_i)\) for discrete variables, or the corresponding integral for continuous variables.
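The discrete formula can be sketched with a fair die, where every outcome has probability 1/6:

```python
from fractions import Fraction

# Hypothetical example: expected value of one fair six-sided die roll.
outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)  # each outcome is equally likely

# E(X) = sum of x_i * P(x_i)
expected = sum(x * p for x in outcomes)
print(expected)  # 7/2
```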

## Probability Distribution

A probability distribution describes how probabilities are distributed over the values of the random variable. Key distributions include Binomial, Normal, and Poisson.

## Binomial Distribution

Models the number of successes in a fixed number of independent trials, each with the same probability of success. The formula is \(P(X = k) = {n \choose k} p^k (1-p)^{n-k}\), where \(n\) is the number of trials, \(k\) is the number of successes, and \(p\) is the probability of success on a single trial.
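A direct sketch of the formula, applied to a hypothetical coin-flip example:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k): probability of exactly k successes in n trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Hypothetical example: probability of exactly 3 heads in 5 fair coin flips.
print(binomial_pmf(3, 5, 0.5))  # 0.3125
```

Summing the PMF over all \(k\) from 0 to \(n\) gives 1, which is a quick sanity check on the formula.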

## Normal Distribution

Describes a symmetrical, bell-shaped curve that is defined by its mean (\(\mu\)) and standard deviation (\(\sigma\)). The probability density function is \(f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{1}{2}(\frac{x-\mu}{\sigma})^2}\).
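The density function translates directly into code; evaluating it at the mean gives the curve's peak, \(1/(\sigma\sqrt{2\pi})\):

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Probability density of the normal distribution at x."""
    return (1 / (sigma * sqrt(2 * pi))) * exp(-0.5 * ((x - mu) / sigma) ** 2)

# For the standard normal (mu=0, sigma=1) the peak is 1/sqrt(2*pi) ~ 0.3989.
print(round(normal_pdf(0.0), 4))  # 0.3989
```

The symmetry of the bell curve shows up as `normal_pdf(mu + d) == normal_pdf(mu - d)` for any offset `d`.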

## Poisson Distribution

Expresses the probability of a given number of events happening in a fixed interval of time or space, given the average number of times the event happens over that interval. \(P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}\), where \(\lambda\) is the mean number of successes in the given interval.
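The PMF is a one-liner; the call-center rate below is a hypothetical example value:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson distribution with mean lam."""
    return lam**k * exp(-lam) / factorial(k)

# Hypothetical example: a call center averaging 4 calls per hour;
# probability of exactly 2 calls in the next hour.
print(round(poisson_pmf(2, 4.0), 4))  # 0.1465
```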

## Z-Scores and the Standard Normal Distribution

A Z-score is a measure of how many standard deviations an element is from the mean. The standard normal distribution is a normal distribution with \(\mu = 0\) and \(\sigma = 1\). The Z-score is calculated by \(Z = \frac{X - \mu}{\sigma}\).
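The calculation is a single expression; the test-score numbers below are hypothetical:

```python
def z_score(x, mu, sigma):
    """Number of standard deviations x lies from the mean."""
    return (x - mu) / sigma

# Hypothetical example: a score of 85 when scores have mean 70 and
# standard deviation 10 lies 1.5 standard deviations above the mean.
print(z_score(85, 70, 10))  # 1.5
```

Standardizing a normal variable this way converts it to the standard normal distribution, which is why Z-tables can be used for any \(\mu\) and \(\sigma\).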

## Central Limit Theorem

The Central Limit Theorem states that the distribution of the sum (or average) of a large number of independent, identically distributed variables will be approximately normal, regardless of the underlying distribution.
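The theorem is easy to illustrate by simulation. The sketch below (an illustrative setup, not from the text above) averages batches of die rolls, whose underlying distribution is uniform, not normal, and shows the sample means clustering around the true mean of 3.5:

```python
import random
import statistics

random.seed(0)  # reproducible hypothetical simulation

# 2000 sample means, each the average of 100 fair die rolls.
sample_means = [
    statistics.mean(random.randint(1, 6) for _ in range(100))
    for _ in range(2000)
]

# Per the CLT, these means are approximately normally distributed,
# centred at the die's expected value of 3.5.
print(round(statistics.mean(sample_means), 2))
```

Plotting a histogram of `sample_means` would show the familiar bell shape emerging even though a single die roll is uniformly distributed.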