1. Introduction to Probability
Probability is a measure of the likelihood or chance of an event occurring, expressed as a value between 0 (impossible) and 1 (certain).
2. Basic Concepts of Probability
Classical Definition: If an experiment has \(n\) equally likely outcomes and \(k\) of those outcomes are favorable to an event \(E\), the probability of the event is:
\[ P(E) = \frac{k}{n} \]
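For example, when rolling a fair six-sided die, three of the six equally likely outcomes are even, so
\[ P(\text{even}) = \frac{3}{6} = 0.5 \]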
Addition Rule: For two events \(A\) and \(B\):
If \(A\) and \(B\) are mutually exclusive, \(P(A \cup B) = P(A) + P(B)\).
If \(A\) and \(B\) are not mutually exclusive, \(P(A \cup B) = P(A) + P(B) - P(A \cap B)\).
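For example, when drawing one card from a standard 52-card deck, the events "king" and "heart" are not mutually exclusive (the king of hearts belongs to both), so
\[ P(\text{king} \cup \text{heart}) = \frac{4}{52} + \frac{13}{52} - \frac{1}{52} = \frac{16}{52} \approx 0.31 \]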
Multiplication Rule:
For independent events \(A\) and \(B\): \(P(A \cap B) = P(A)\,P(B)\).
For dependent events, \(P(A \cap B) = P(A)\,P(B \mid A)\), where \(P(B \mid A)\) is the conditional probability of \(B\) given \(A\).
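To make the two cases concrete, the sketch below (an illustrative simulation, not part of the original text; names such as `both_heads` and `both_aces` are made up) estimates one independent and one dependent probability by Monte Carlo and compares them with the rules above.

```python
import random

TRIALS = 100_000  # number of simulated experiments

# Independent events: two fair coin flips both landing heads.
# Multiplication rule for independent events: P = 0.5 * 0.5 = 0.25
both_heads = sum(
    random.random() < 0.5 and random.random() < 0.5 for _ in range(TRIALS)
)

# Dependent events: drawing two aces from a 52-card deck without replacement.
# Multiplication rule for dependent events: P = (4/52) * (3/51) ≈ 0.0045
deck = ["ace"] * 4 + ["other"] * 48
both_aces = 0
for _ in range(TRIALS):
    first, second = random.sample(deck, 2)  # two cards, no replacement
    if first == "ace" and second == "ace":
        both_aces += 1

print(f"P(two heads) ≈ {both_heads / TRIALS:.4f} (rule gives 0.2500)")
print(f"P(two aces)  ≈ {both_aces / TRIALS:.4f} (rule gives {4/52 * 3/51:.4f})")
```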
3. Probability Distributions
Probability distributions describe how probabilities are distributed over different outcomes.
a. Binomial Distribution
Definition: Describes the probability of obtaining exactly \(k\) successes in \(n\) independent Bernoulli trials, each with success probability \(p\).
Formula:
\[ P(X = k) = \binom{n}{k} p^k (1-p)^{n-k} \]
where \(\binom{n}{k} = \frac{n!}{k!(n-k)!}\) is the binomial coefficient.
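As a minimal sketch (the helper name `binomial_pmf` and the example numbers are illustrative, not from the text), the formula can be evaluated directly with Python's standard library:

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """P(X = k) for a Binomial(n, p) random variable."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability of exactly 3 heads in 10 flips of a fair coin.
print(binomial_pmf(k=3, n=10, p=0.5))  # ≈ 0.1172
```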
b. Poisson Distribution
Definition: Used for modeling the number of events occurring in a fixed interval of time or space when events occur independently and at a constant average rate.
Formula:
\[ P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} \]
where \(\lambda\) is the mean number of occurrences in the interval.
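A corresponding sketch (again with an illustrative helper name and a made-up rate) evaluates the Poisson PMF:

```python
from math import exp, factorial

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) for a Poisson random variable with mean lam."""
    return lam**k * exp(-lam) / factorial(k)

# If events occur at an average rate of 4 per interval,
# probability of observing exactly 2 events in one interval.
print(poisson_pmf(k=2, lam=4.0))  # ≈ 0.1465
```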
c. Normal Distribution
Definition: A continuous probability distribution shaped like a bell curve. It is symmetrical around the mean \(\mu\), with standard deviation \(\sigma\).
Probability Density Function (PDF):
\[ f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-\frac{(x-\mu)^2}{2\sigma^2}} \]
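A small sketch of the density function (the name `normal_pdf` and the default parameters are illustrative):

```python
from math import exp, pi, sqrt

def normal_pdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """Density of a Normal(mu, sigma^2) distribution at x."""
    return exp(-((x - mu) ** 2) / (2 * sigma**2)) / sqrt(2 * pi * sigma**2)

# Standard normal density at the mean: 1 / sqrt(2*pi) ≈ 0.3989
print(normal_pdf(0.0))
```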
4. Expected Value
The expected value (mean) of a random variable is the weighted average of all possible values, where the weights are the probabilities of those values. For a discrete random variable \(X\) taking values \(x_i\) with probabilities \(P(X = x_i)\):
\[ E[X] = \sum_i x_i \, P(X = x_i) \]
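For example, for a fair six-sided die each face \(1, \dots, 6\) has probability \(\tfrac{1}{6}\), so
\[ E[X] = \sum_{i=1}^{6} i \cdot \frac{1}{6} = \frac{1 + 2 + 3 + 4 + 5 + 6}{6} = 3.5 \]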