These are some of my notes on probability theory.

# Conditional Probability

## Definitions

• The probability of an event occurring, given that another event has already happened.
• For independent events: \begin{alignedat}{1} P(A|B) &= P(A) \\ \\ P(A \cap B) &= P(A) \cdot P(B) \end{alignedat}
• For dependent events: \begin{alignedat}{1} P(A|B) &= \frac{\text{Outcomes in both } A \text{ and } B}{\text{Outcomes in } B} \\ \\ &= \frac{P(A \cap B)}{P(B)} \end{alignedat}
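The definitions above can be checked by direct enumeration. This is a minimal sketch using a hypothetical two-dice example: A is "the sum is 8", B is "the first die shows 3", and P(A|B) is computed by counting outcomes inside B.

```python
from fractions import Fraction

# Sample space: ordered outcomes of two fair dice (hypothetical example)
space = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

A = {o for o in space if o[0] + o[1] == 8}  # event A: sum equals 8
B = {o for o in space if o[0] == 3}         # event B: first die shows 3

p_B = Fraction(len(B), len(space))
p_A_and_B = Fraction(len(A & B), len(space))

# P(A|B) = P(A ∩ B) / P(B): only (3, 5) lies in both A and B
p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)  # 1/6
```

`Fraction` keeps the arithmetic exact, so the result matches the hand computation with no floating-point noise.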

## Law of Total Probability

\begin{alignedat}{1} \text{If } B_1, B_2, \ldots, B_n \text{ are mutually exclusive and } A = B_1 \cup B_2 \cup \ldots \cup B_n \text{, then} \\ \\ P(A) &= P(A \cap B_1) + P(A \cap B_2) + \ldots + P(A \cap B_n) \\ \\ &= P(A|B_1) \cdot P(B_1) + P(A|B_2) \cdot P(B_2) + \ldots + P(A|B_n) \cdot P(B_n) \end{alignedat}
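A quick numerical check of the law, continuing the hypothetical two-dice setup: partition the sample space by the first die's value (B_1 through B_6) and recover P(sum = 8) from the conditional pieces.

```python
from fractions import Fraction

space = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
A = {o for o in space if o[0] + o[1] == 8}  # event A: sum equals 8

# B_1, ..., B_6 partition the sample space by the first die's value
total = Fraction(0)
for i in range(1, 7):
    B_i = {o for o in space if o[0] == i}
    p_B_i = Fraction(len(B_i), len(space))
    p_A_given_B_i = Fraction(len(A & B_i), len(B_i))
    total += p_A_given_B_i * p_B_i  # sum of P(A|B_i)·P(B_i)

print(total)                         # 5/36
print(Fraction(len(A), len(space)))  # 5/36, direct count agrees
```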

## Addition Rule

$P(A \cup B) = P(A) + P(B) - P(A \cap B)$
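As a quick check of the inclusion–exclusion identity above, here is a hypothetical deck-of-cards example: A is "the card is a heart", B is "the card is a face card".

```python
from fractions import Fraction

# Standard 52-card deck (hypothetical example)
p_A = Fraction(13, 52)       # P(A): card is a heart
p_B = Fraction(12, 52)       # P(B): card is a face card (J, Q, K)
p_A_and_B = Fraction(3, 52)  # P(A ∩ B): the three heart face cards

# P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
p_union = p_A + p_B - p_A_and_B
print(p_union)  # 11/26
```

Without the subtraction, the three heart face cards would be counted twice.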

## Multiplication Rule

$P(A \cap B) = P(A|B) \cdot P(B)$
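The multiplication rule is most useful for dependent events. A hypothetical example: drawing two cards without replacement, what is the probability both are aces?

```python
from fractions import Fraction

# B: first card is an ace; A: second card is an ace (dependent events)
p_B = Fraction(4, 52)          # P(B): 4 aces in 52 cards
p_A_given_B = Fraction(3, 51)  # P(A|B): 3 aces left among 51 cards

# Multiplication rule: P(A ∩ B) = P(A|B) · P(B)
p_A_and_B = p_A_given_B * p_B
print(p_A_and_B)  # 1/221
```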

## Bayes’ Theorem

\begin{alignedat}{1} \text{We know } P(A|B) &= \frac{P(A \cap B)}{P(B)} \\ \\ \text{and } P(B|A) &= \frac{P(A \cap B)}{P(A)} \\ \\ \therefore P(A|B) \cdot P(B) &= P(B|A) \cdot P(A) \\ \\ \boxed{P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}} \end{alignedat}
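A worked numerical example of the boxed formula, with entirely made-up numbers for a hypothetical diagnostic test: D is "has the condition", + is "tests positive". The denominator P(+) comes from the law of total probability above.

```python
from fractions import Fraction

# All three numbers below are assumptions chosen for illustration
p_D = Fraction(1, 100)                 # P(D): prevalence
p_pos_given_D = Fraction(99, 100)      # P(+|D): true-positive rate
p_pos_given_not_D = Fraction(5, 100)   # P(+|¬D): false-positive rate

# P(+) by the law of total probability over the partition {D, ¬D}
p_pos = p_pos_given_D * p_D + p_pos_given_not_D * (1 - p_D)

# Bayes' theorem: P(D|+) = P(+|D) · P(D) / P(+)
p_D_given_pos = p_pos_given_D * p_D / p_pos
print(p_D_given_pos)  # 1/6
```

Even with a 99% accurate test, a positive result here only implies a 1 in 6 chance of having the condition, because the low prior P(D) dominates.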