
PROBABILITY THEORY

Basic Terminologies:
Random experiment: An experiment whose
outcomes are not known in advance.
Sample space: In a random experiment, the set
of all possible outcomes of interest (to the
experimenter) is referred to as the sample space.
Usually denoted Ω (read omega).
Experiment: Tossing a coin twice
Sample space: Ω = {(H, H), (H, T), (T, H), (T, T)}

Event: A subset of the sample space is
referred to as an event.
Example: Tossing a coin twice. Here,
Ω = {(H, H), (H, T), (T, H), (T, T)}.
Let A = {(H, T), (T, H), (H, H)}. A is the event "at
least one head appears".
Let B = {(H, H), (T, T)}. B is the event "first toss =
second toss".
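The sample space and the two events above can be checked with a short Python sketch (standard library only; the tuple encoding of outcomes is this writer's choice):

```python
# Enumerate the sample space for two coin tosses and check the example events.
from itertools import product

omega = set(product("HT", repeat=2))       # {(H,H), (H,T), (T,H), (T,T)}
A = {("H", "T"), ("T", "H"), ("H", "H")}   # at least one head appears
B = {("H", "H"), ("T", "T")}               # first toss = second toss

print(A <= omega and B <= omega)  # both are subsets of the sample space -> True
print(len(omega))                 # 4
```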

Basic Set Operations:

Union: The union of two sets A and B, denoted A ∪ B,
is the set of all elements that are in A or in B or in
both.
Intersection: The intersection of two sets A and B,
denoted A ∩ B, is the set of all elements that are in
both A and B.

Complement: The complement of a set A, denoted A',
is the set of all elements of Ω that are not in A.

Disjoint or Mutually Exclusive Events: Events A
and B are said to be disjoint or mutually exclusive
if A ∩ B = ∅, i.e., these sets have no
element in common.

In general, a collection of events A1, A2, ..., An
is said to be disjoint or mutually exclusive
if Ai ∩ Aj = ∅ for all i ≠ j.

Partition: A collection of disjoint events A1, A2,
..., An is said to form a partition of the sample
space if A1 ∪ A2 ∪ ... ∪ An = Ω.
De Morgan's Laws:
(A ∪ B)' = A' ∩ B' and (A ∩ B)' = A' ∪ B'.
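A small sketch verifying the set operations and De Morgan's laws on a hypothetical six-outcome sample space (the sets A and B below are illustrative choices, not from the notes):

```python
# Check basic set operations and De Morgan's laws on a small sample space.
omega = {1, 2, 3, 4, 5, 6}   # e.g., outcomes of one die roll
A = {1, 2, 3}                # hypothetical example events
B = {3, 4}

union = A | B                # A ∪ B = {1, 2, 3, 4}
inter = A & B                # A ∩ B = {3}
A_comp = omega - A           # A' = {4, 5, 6}

# De Morgan: (A ∪ B)' = A' ∩ B'  and  (A ∩ B)' = A' ∪ B'
print((omega - (A | B)) == ((omega - A) & (omega - B)))  # True
print((omega - (A & B)) == ((omega - A) | (omega - B)))  # True
```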

σ-Algebra (read as sigma algebra):

Given a sample space Ω, let F be a collection of subsets
of Ω (i.e., F is a collection of events). F is
said to be a σ-algebra if it satisfies the following axioms:

(A1) Ω ∈ F,
(A2) If A ∈ F then A' ∈ F, and
(A3) If A ∈ F and B ∈ F then A ∪ B ∈ F.

Example: Let Ω = {H, T}. Then,
F1 = {{H, T}, {H}, {T}, ∅} is a σ-algebra.

Probability Measure:
Given a pair (Ω, F), a probability measure P is a real-valued function
on the events in F (i.e., P : F → R) satisfying the following axioms:
1. 0 ≤ P(A) ≤ 1 for all A ∈ F,

2. P(Ω) = 1, and
3. If A and B are two disjoint sets in F, then P(A ∪ B) = P(A) +
P(B). (The same applies to more than two disjoint sets.)
Example: Tossing a coin: Ω = {H, T} and F = {{H, T}, {H}, {T}, ∅}.
P({H, T}) = 1, P({H}) = 0.5, P({T}) = 0.5, and P(∅) = 0 is a
probability measure.
In general, P({H, T}) = 1, P({H}) = p, P({T}) = 1 − p, and P(∅) = 0
is a probability measure for any p ∈ [0, 1].
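The coin-toss measure above can be checked against the three axioms with exact rational arithmetic; p = 3/10 is an arbitrary illustrative value:

```python
# Verify that P({H}) = p, P({T}) = 1 - p defines a probability measure
# on F = {Ω, {H}, {T}, ∅}, for an assumed sample value p = 3/10.
from fractions import Fraction

p = Fraction(3, 10)
P = {frozenset({"H", "T"}): Fraction(1),   # P(Ω)
     frozenset({"H"}): p,
     frozenset({"T"}): 1 - p,
     frozenset(): Fraction(0)}             # P(∅)

# Axiom 1: 0 <= P(A) <= 1 for every event A in F
print(all(0 <= v <= 1 for v in P.values()))              # True
# Axiom 2: P(Ω) = 1
print(P[frozenset({"H", "T"})] == 1)                     # True
# Axiom 3: additivity over the disjoint events {H} and {T}
print(P[frozenset({"H"})] + P[frozenset({"T"})]
      == P[frozenset({"H", "T"})])                       # True
```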

Conditional Probability: The conditional
probability of event B given event A,
denoted P(B|A), is defined as:
P(B|A) = P(A ∩ B) / P(A).
If P(A) = 0, then P(B|A) is undefined.
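A counting sketch of the definition on the two-coin-toss sample space, assuming equally likely outcomes:

```python
# P(B|A) = P(A ∩ B) / P(A), computed by counting equally likely outcomes.
from itertools import product
from fractions import Fraction

omega = set(product("HT", repeat=2))
A = {("H", "T"), ("T", "H"), ("H", "H")}   # at least one head
B = {("H", "H"), ("T", "T")}               # first toss = second toss

P = lambda E: Fraction(len(E), len(omega))  # equally likely outcomes
print(P(B & A) / P(A))                      # 1/3
```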

Bayes' Formula: If events A and B are such that
P(A) > 0 and P(B) > 0, then

P(A ∩ B) = P(A)P(B|A) = P(B)P(A|B).

Thus, we have:

P(A|B) = P(B|A)P(A) / P(B).
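A numerical sketch of Bayes' formula with hypothetical numbers (a 1% prevalence and a test with assumed 99%/95% accuracy; none of these figures come from the notes):

```python
# Bayes' formula: P(D|+) = P(+|D) P(D) / P(+), with assumed numbers.
from fractions import Fraction

P_D = Fraction(1, 100)                 # P(disease), assumed prevalence
P_pos_given_D = Fraction(99, 100)      # P(+ | disease), assumed
P_pos_given_not_D = Fraction(5, 100)   # P(+ | no disease), assumed

# Total probability: P(+) = P(+|D)P(D) + P(+|D')P(D')
P_pos = P_pos_given_D * P_D + P_pos_given_not_D * (1 - P_D)

# Bayes: P(D|+) = P(+|D)P(D) / P(+)
P_D_given_pos = P_pos_given_D * P_D / P_pos
print(P_D_given_pos)  # 1/6 (exact, with these assumed numbers)
```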

Independence
Events A and B are said to be independent if
P(A ∩ B) = P(A)P(B).
Similarly, events A, B and C are said to be independent if all of the
following hold:
P(A ∩ B ∩ C) = P(A)P(B)P(C)
P(A ∩ B) = P(A)P(B)

P(A ∩ C) = P(A)P(C)
P(B ∩ C) = P(B)P(C).

In general, A1, A2, ..., An are said to be independent if every
subcollection of these events (of size 2 or more) satisfies the
corresponding product rule.
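Independence can be verified by counting; the two events below ("first toss is H", "second toss is H") are illustrative choices on the two-coin-toss space:

```python
# Check P(A ∩ B) = P(A)P(B) for two events on the two-coin-toss space.
from itertools import product
from fractions import Fraction

omega = set(product("HT", repeat=2))
A = {w for w in omega if w[0] == "H"}   # first toss is H
B = {w for w in omega if w[1] == "H"}   # second toss is H

P = lambda E: Fraction(len(E), len(omega))
print(P(A & B) == P(A) * P(B))  # True: 1/4 == 1/2 * 1/2
```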

Random Variables: Let (Ω, F, P) be a probability
space. A random variable X is a real-valued
function on Ω, i.e., X : Ω → R.

Cumulative Distribution Function (CDF)

Let (Ω, F, P) be given. Then the cumulative
distribution function of a random variable X,
denoted FX, is given by

FX(x) = P(X ≤ x) for all x ∈ R.

Properties of CDF:
1. F is non-decreasing,
2. lim F(x) = 1 as x → ∞, and lim F(x) = 0 as x → −∞,
3. F is right-continuous.
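A sketch illustrating these properties on the step-function CDF of a Ber(0.5) random variable (the example distribution is an assumption for illustration):

```python
# CDF of a Bernoulli(p) random variable: a step function with jumps
# at 0 and 1; check monotonicity and the limiting values numerically.
def F(x, p=0.5):
    """CDF of Ber(p): 0 below 0, 1-p on [0, 1), 1 from 1 onward."""
    if x < 0:
        return 0.0
    if x < 1:
        return 1 - p
    return 1.0

xs = [-2, -0.5, 0, 0.3, 1, 5]
vals = [F(x) for x in xs]
print(vals == sorted(vals))  # non-decreasing -> True
print(F(-1e9), F(1e9))       # 0.0 1.0 (values at the two limits)
```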

Discrete Random Variables:

A random variable X is said to be discrete if and
only if X takes on finitely or countably infinitely
many values. Its distribution is represented using a
Probability Mass Function (PMF).

The probability mass function (PMF) of a
discrete random variable X is given by:

pX(x) = P(X = x).

Some Discrete Random Variables

Bernoulli random variable:

A random variable with two possible values,
1 (denoting Success) and 0 (denoting Failure),
having probabilities:
P(X = 1) = p
P(X = 0) = 1 − p
Representation: X ~ Ber(p),
where p ∈ [0, 1] (sometimes referred to as the
success probability).
Binomial random variable

Suppose that a random variable X represents the number
of successes among n independent trials, where each
trial results in success with probability p and in
failure with probability (1 − p). Then X is a
Binomial random variable. (Representation: X ~ Bin(n, p))

PMF:

P(X = k) = C(n, k) p^k (1 − p)^(n−k), k = 0, 1, ..., n,

where C(n, k) = n! / (k! (n − k)!).
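The PMF formula can be evaluated directly; n = 4 and p = 0.5 below are illustrative choices:

```python
# Bin(n, p) PMF computed from the formula; the probabilities sum to 1.
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 4, 0.5
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]
print(pmf)                        # [0.0625, 0.25, 0.375, 0.25, 0.0625]
print(abs(sum(pmf) - 1) < 1e-12)  # PMF sums to 1 -> True
```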

Geometric Random Variable

Suppose that independent trials, each having
probability p of being a success, are performed
until a success occurs. If we let X be the number
of trials required until the first success, then X is
said to be a Geometric random variable with
parameter p, where p ∈ (0, 1].
Representation: X ~ Geo(p)
PMF:

P(X = k) = (1 − p)^(k−1) p, k = 1, 2, ...

The Geometric is the only discrete RV that exhibits the
Memoryless Property.

Memoryless Property of Geometric RV:
P(X > m + n | X > m) = P(X > n) for all m, n = 0, 1, 2, ...
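Using P(X > k) = (1 − p)^k for a Geometric RV, the memoryless property can be verified exactly with rational arithmetic (p = 1/3, m = 2, n = 5 are arbitrary choices):

```python
# Verify P(X > m + n | X > m) = P(X > n) for X ~ Geo(p),
# using the tail formula P(X > k) = (1 - p)^k.
from fractions import Fraction

p = Fraction(1, 3)                 # illustrative success probability
tail = lambda k: (1 - p) ** k      # P(X > k) for X ~ Geo(p)

m, n = 2, 5
lhs = tail(m + n) / tail(m)        # P(X > m+n | X > m)
print(lhs == tail(n))              # True
```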

Poisson Random Variable

A random variable X is said to be a Poisson RV if it
takes one of the values 0, 1, 2, ... with
probability:

P(X = k) = e^(−λ) λ^k / k!, k = 0, 1, 2, ...

Here, λ (> 0) is the parameter of the Poisson
random variable.
Representation: X ~ Poi(λ)
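A direct evaluation of the Poisson PMF; λ = 2 is an illustrative choice (with this λ, k = 1 and k = 2 happen to have equal probability):

```python
# Poi(lam) PMF computed from the formula.
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poi(lam)."""
    return exp(-lam) * lam**k / factorial(k)

lam = 2.0
# With lam = 2: P(X=1) = 2e^-2 and P(X=2) = (4/2)e^-2 are equal.
print(abs(poisson_pmf(1, lam) - poisson_pmf(2, lam)) < 1e-12)        # True
print(abs(sum(poisson_pmf(k, lam) for k in range(100)) - 1) < 1e-9)  # True
```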

Continuous Random Variable

A continuous RV can take uncountably many values. A
RV X is said to be continuous if and only if there
is a non-negative function fX on R such that:

P(a ≤ X ≤ b) = ∫ from a to b of fX(x) dx, for all a ≤ b.

The function fX is referred to as the probability
density function (PDF) of the random variable X.

Some Continuous Random Variables

Uniform RV:

A random variable is said to be uniformly
distributed over the interval (a, b) if its probability
density function is given by:

fX(x) = 1/(b − a) for a < x < b, and 0 otherwise,

where −∞ < a < b < ∞.
Representation: X ~ Unif(a, b)

CDF of Uniform RV:
FX(x) = 0 for x ≤ a, (x − a)/(b − a) for a < x < b, and 1 for x ≥ b.
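The PDF and CDF of the uniform distribution written out as functions (a = 2, b = 6 are illustrative values):

```python
# PDF and CDF of Unif(a, b), taken directly from the formulas.
def unif_pdf(x, a, b):
    """fX(x) = 1/(b - a) on (a, b), else 0."""
    return 1 / (b - a) if a < x < b else 0.0

def unif_cdf(x, a, b):
    """FX(x): 0 below a, linear on (a, b), 1 above b."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

a, b = 2.0, 6.0
print(unif_pdf(3.0, a, b))   # 0.25
print(unif_cdf(4.0, a, b))   # 0.5 (midpoint of the interval)
```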

Exponential Random Variable

A random variable X is exponentially distributed with
parameter λ > 0 if its PDF is
fX(x) = λ e^(−λx) for x ≥ 0, and 0 otherwise.
Its CDF is FX(x) = 1 − e^(−λx) for x ≥ 0.
Representation: X ~ Exp(λ)
The Exponential is the only RV that exhibits the memoryless property
among continuous RVs:
P(X > s + t | X > s) = P(X > t) for all s, t ≥ 0.

Gaussian Random Variable:
A random variable X is Gaussian (normal) with parameters μ and σ² > 0
if its PDF is
fX(x) = (1 / (σ√(2π))) e^(−(x − μ)² / (2σ²)), −∞ < x < ∞.
Representation: X ~ N(μ, σ²). Here E[X] = μ and Var(X) = σ².

Rayleigh Random Variable
A random variable X is Rayleigh distributed with parameter σ > 0 if
its PDF is
fX(x) = (x/σ²) e^(−x²/(2σ²)) for x ≥ 0, and 0 otherwise.
If X and Y are independent N(0, σ²) RVs, then √(X² + Y²) is
Rayleigh distributed.

Joint Distribution Function:
For random variables X and Y, the joint CDF is given by
FX,Y(x, y) = P(X ≤ x, Y ≤ y) for all x, y ∈ R.

Joint Mass Function

If X and Y are discrete, it is convenient to work
with the joint mass function (or joint PMF):
pX,Y(x, y) = P(X = x, Y = y).

Expectation of a Discrete Random Variable

If X is a discrete RV having a probability mass
function p(x), then the Expected Value of X is
defined by:
E[X] = Σx x p(x).

List of Expectations of Discrete Random Variables:
Ber(p): E[X] = p
Bin(n, p): E[X] = np
Geo(p): E[X] = 1/p
Poi(λ): E[X] = λ
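A check that the definition E[X] = Σx x p(x) reproduces the tabulated value E[X] = np for a Binomial RV (n = 4, p = 0.5 assumed for illustration):

```python
# E[X] = sum over x of x * p(x), for X ~ Bin(n, p); should equal np.
from math import comb

n, p = 4, 0.5
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
EX = sum(x * px for x, px in pmf.items())
print(abs(EX - n * p) < 1e-12)  # True: E[X] = np = 2.0
```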

Expectation of Continuous Random Variables

If X is a continuous random variable having a
probability density function f(x), then the expected
value of X is given by:
E[X] = ∫ x f(x) dx, the integral taken over all of R.

Variance of a Random Variable:
Var(X) = E[(X − E[X])²] = E[X²] − (E[X])².

Properties of Variance:
1. Var(aX + b) = a² Var(X) for constants a and b,
2. Var(X) ≥ 0,
3. If X and Y are independent, then Var(X + Y) = Var(X) + Var(Y).

Covariance:
Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y].
If X and Y are independent, then Cov(X, Y) = 0.

List of MGFs of different Random Variables:
The moment generating function of X is MX(t) = E[e^(tX)].
Ber(p): 1 − p + p e^t
Bin(n, p): (1 − p + p e^t)^n
Geo(p): p e^t / (1 − (1 − p) e^t), for (1 − p) e^t < 1
Poi(λ): e^(λ(e^t − 1))
Unif(a, b): (e^(tb) − e^(ta)) / (t(b − a)), t ≠ 0
Exp(λ): λ / (λ − t), for t < λ
N(μ, σ²): e^(μt + σ²t²/2)
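A check of the variance identity Var(X) = E[X²] − (E[X])² on a Ber(p) RV, whose variance is p(1 − p) (p = 1/4 is an arbitrary choice):

```python
# Var(X) = E[X^2] - (E[X])^2 for X ~ Ber(p); should equal p(1 - p).
from fractions import Fraction

p = Fraction(1, 4)
pmf = {0: 1 - p, 1: p}                         # Ber(p) PMF
EX = sum(x * px for x, px in pmf.items())      # E[X] = p
EX2 = sum(x**2 * px for x, px in pmf.items())  # E[X^2] = p
var = EX2 - EX**2
print(var == p * (1 - p))  # True: Var = 3/16
```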
