These are some of my notes on probability theory.

Conditional Probability

Definitions

  • The probability of an event occurring, given that another event has already occurred.
  • For independent events: $$\begin{aligned} P(A|B) &= P(A) \\ P(A \cap B) &= P(A) \cdot P(B) \end{aligned}$$
  • For dependent events: $$\begin{aligned} P(A|B) &= \frac{\text{Number of favorable outcomes}}{\text{Sample space}} \\ &= \frac{P(A \cap B)}{P(B)} \end{aligned}$$
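
The conditional-probability formula above can be checked by direct counting. A minimal sketch with two fair dice (the events A and B here are illustrative choices, not from the notes):

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered rolls of two fair dice (36 equally likely outcomes).
omega = list(product(range(1, 7), repeat=2))

A = {(d1, d2) for d1, d2 in omega if d1 + d2 >= 10}  # sum is at least 10
B = {(d1, d2) for d1, d2 in omega if d1 == 6}        # first die shows 6

p = lambda event: Fraction(len(event), len(omega))

# P(A|B) = P(A ∩ B) / P(B), i.e. favorable outcomes counted inside B
p_A_given_B = p(A & B) / p(B)
print(p_A_given_B)  # 1/2
```

Using `Fraction` keeps the arithmetic exact, so the counting definition and the ratio P(A ∩ B)/P(B) agree without floating-point noise.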

Law of Total Probability

If $B_1, B_2, \dots, B_n$ are mutually exclusive events that together cover the sample space (a partition), then $$\begin{aligned} P(A) &= P(A \cap B_1) + P(A \cap B_2) + \dots + P(A \cap B_n) \\ &= P(A|B_1) \cdot P(B_1) + P(A|B_2) \cdot P(B_2) + \dots + P(A|B_n) \cdot P(B_n) \end{aligned}$$
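
A small worked instance of the law, assuming a standard two-urn setup (the urn contents are illustrative, not from the notes):

```python
from fractions import Fraction

# Partition B_i: which urn is chosen (mutually exclusive, exhaustive).
p_urn = {1: Fraction(1, 2), 2: Fraction(1, 2)}

# Conditional probabilities P(red | urn i): urn 1 holds 3 red of 4 balls,
# urn 2 holds 1 red of 4 balls.
p_red_given_urn = {1: Fraction(3, 4), 2: Fraction(1, 4)}

# P(A) = sum over i of P(A|B_i) * P(B_i)
p_red = sum(p_red_given_urn[i] * p_urn[i] for i in p_urn)
print(p_red)  # 1/2
```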

Additive Rule

$$P(A \cup B) = P(A) + P(B) - P(A \cap B)$$
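
The rule can be verified by enumeration. A sketch with one fair die (the events are illustrative choices):

```python
from fractions import Fraction

omega = set(range(1, 7))   # one fair die
A = {2, 4, 6}              # roll is even
B = {4, 5, 6}              # roll is greater than 3

p = lambda event: Fraction(len(event), len(omega))

# Both sides of the additive rule, computed independently.
lhs = p(A | B)                   # P(A ∪ B) by direct counting
rhs = p(A) + p(B) - p(A & B)     # P(A) + P(B) - P(A ∩ B)
print(lhs, rhs)  # 2/3 2/3
```

Subtracting P(A ∩ B) is what prevents the overlap {4, 6} from being counted twice.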

Multiplication Rule

$$P(A \cap B) = P(A|B) \cdot P(B)$$
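
A quick application, assuming the classic example of drawing two cards without replacement (not from the notes):

```python
from fractions import Fraction

# B = first card is an ace, A = second card is an ace.
p_B = Fraction(4, 52)          # P(first card is an ace)
p_A_given_B = Fraction(3, 51)  # P(second ace | first was an ace): 3 aces left in 51 cards

# Multiplication rule: P(A ∩ B) = P(A|B) * P(B)
p_both_aces = p_A_given_B * p_B
print(p_both_aces)  # 1/221
```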

Bayes’ Theorem

$$\begin{aligned} \text{We know } P(A|B) &= \frac{P(A \cap B)}{P(B)} \\ \text{and } P(B|A) &= \frac{P(A \cap B)}{P(A)} \\ \therefore P(A|B) \cdot P(B) &= P(B|A) \cdot P(A) \\ \boxed{P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}} \end{aligned}$$
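
The derivation above, applied to a hypothetical screening test (all the numbers below are illustrative assumptions, not from the notes). The denominator P(B) comes from the law of total probability:

```python
from fractions import Fraction

# Hypothetical screening test. A = has disease, B = test is positive.
p_disease = Fraction(1, 100)              # prior P(A)
p_pos_given_disease = Fraction(95, 100)   # P(B|A), sensitivity
p_pos_given_healthy = Fraction(5, 100)    # P(B|not A), false-positive rate

# P(B) via the law of total probability over {A, not A}.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)  # 19/118, roughly 0.16
```

Even with a fairly accurate test, the posterior stays low because the prior P(A) is small, which is exactly what Bayes' theorem makes explicit.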

