

Review of probability and Bayes' theorem


Addition rule, multiplication rule, conditional probability
How one piece of information affects our understanding of the current state

Decision making under uncertainty


Decision analysis framework
Experimentation and value of information

Decision making is important in systems engineering
E.g., at decision gates, to select which system concepts to carry forward for further development

Uncertainty is one important factor that makes a decision difficult
Probability: a mathematical concept to capture uncertainty

Three interpretations of probability:
Classical: probability is the ratio of favorable cases to the total number of possible cases.
E.g., flip a coin, which has only two faces: the probability of getting a head is 0.5.
Frequentist: probability is the limiting value, as the number of trials becomes infinite, of the frequency of occurrence of a well-defined random event.
E.g., flip the coin many times to estimate the probability.
E.g., Team A has won 70% of the games this season.
Subjectivist: probability is an ideal rational agent's degree of belief about an uncertain event (related to Bayesian probability).
E.g., I believe that Team A has a 70% chance of beating Team B tonight.

Axiom = a premise or starting point of reasoning (Wikipedia); common in mathematics
Let S be the sample space and E be any event in a random experiment.
Probability is a number assigned to each member of a collection of events from a random experiment that satisfies the following properties:
P(S) = 1
0 ≤ P(E) ≤ 1
If E1 ∩ E2 = ∅, then P(E1 ∪ E2) = P(E1) + P(E2)

Four outcomes {a, b, c, d} have probabilities 0.1, 0.3, 0.5, and 0.1, respectively.
Let A denote the event {a, b}, B the event {b, c, d}, and C the event {d}. Then determine:
P(A) and P(A′)
P(B) and P(C)
P(A ∪ B) and P(A ∩ B)
P(A ∪ C)
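
A small Python sketch (mine, not from the slides) that computes these quantities by summing outcome probabilities; it prints both the union and the intersection with C for completeness:

```python
# Outcome probabilities from the example
p = {"a": 0.1, "b": 0.3, "c": 0.5, "d": 0.1}

S = set(p)            # sample space {a, b, c, d}
A = {"a", "b"}
B = {"b", "c", "d"}
C = {"d"}

def prob(event):
    # P(event) is the sum of the probabilities of its outcomes
    return sum(p[outcome] for outcome in event)

print(prob(A), prob(S - A))      # P(A) = 0.4,      P(A') = 0.6
print(prob(B), prob(C))          # P(B) = 0.9,      P(C) = 0.1
print(prob(A | B), prob(A & B))  # P(A ∪ B) = 1.0,  P(A ∩ B) = 0.3
print(prob(A | C), prob(A & C))  # P(A ∪ C) = 0.5,  P(A ∩ C) = 0.0
```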

Probability of a union
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

If A and B are mutually exclusive events:
P(A ∪ B) = P(A) + P(B)

The case of three events:
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C)

If all events are mutually exclusive:
P(E1 ∪ E2 ∪ … ∪ Ek) = P(E1) + P(E2) + … + P(Ek)
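
As a quick check (not on the original slide), applying the three-event formula to the example events A = {a, b}, B = {b, c, d}, C = {d} above:
P(A ∪ B ∪ C) = 0.4 + 0.9 + 0.1 − 0.3 − 0 − 0.1 + 0 = 1.0, which equals P(S) = 1 since A ∪ B ∪ C = S.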

The conditional probability of an event B given an event A, denoted as P(B | A), is
P(B | A) = P(A ∩ B) / P(A)

Intuition: if it is known that event A happens, this knowledge affects our belief about the chance of event B
E.g., rolling two dice: if we know that one die shows three, the chance of getting twelve in total is zero.

Formula (multiplication rule):
P(A ∩ B) = P(B | A) P(A) = P(A | B) P(B)

Example
There are two machining stages. The probability of the first stage meeting specifications is 0.90.
Given that the first stage meets specifications, the probability of the second stage meeting specifications is 0.95.
What is the probability that both stages meet specifications?
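
The answer is not printed on the slide, but it follows directly from the multiplication rule. Let A = the first stage meets specifications and B = the second stage meets specifications:
P(A ∩ B) = P(B | A) P(A) = 0.95 × 0.90 = 0.855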

Case of two events:
P(B) = P(B ∩ A) + P(B ∩ A′)
     = P(B | A) P(A) + P(B | A′) P(A′)

Assume E1, E2, …, Ek are k mutually exclusive and exhaustive sets. Then
P(B) = P(B ∩ E1) + P(B ∩ E2) + … + P(B ∩ Ek)
     = P(B | E1) P(E1) + P(B | E2) P(E2) + … + P(B | Ek) P(Ek)
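
A minimal Python sketch of this rule (the function name and the sample numbers are hypothetical, not from the slides):

```python
def total_probability(priors, likelihoods):
    # P(B) = sum over i of P(B | E_i) * P(E_i),
    # for mutually exclusive and exhaustive events E_i
    return sum(likelihoods[e] * priors[e] for e in priors)

# Hypothetical two-event case: P(A) = 0.9, P(B | A) = 0.95, P(B | A') = 0.30
print(total_probability({"A": 0.9, "A'": 0.1}, {"A": 0.95, "A'": 0.30}))  # ≈ 0.885
```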

Two events are independent if any one of the following equivalent statements is true:
(1) P(A | B) = P(A)
(2) P(B | A) = P(B)
(3) P(A ∩ B) = P(A) P(B)

In fact, if (1) is true, then both (2) and (3) must be true. (Why?)
If the events E1, E2, …, Ek are independent, then
P(E1 ∩ E2 ∩ … ∩ Ek) = P(E1) P(E2) … P(Ek)
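
One way to answer the "Why?" above (a sketch, assuming P(A) > 0 and P(B) > 0):
If P(A | B) = P(A), then P(A ∩ B) = P(A | B) P(B) = P(A) P(B), which is (3). Dividing by P(A) then gives P(B | A) = P(A ∩ B) / P(A) = P(B), which is (2).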

If A and B are mutually exclusive events:
P(A ∪ B) = P(A) + P(B)
If A and B are independent events:
P(A ∩ B) = P(A) P(B)

If A and B are NOT mutually exclusive:
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
If we don't know whether A and B are independent:
P(A ∩ B) = P(A) P(B) cannot be assumed

If A and B are mutually exclusive:
P(A ∪ B) = P(A) + P(B)
Implication: P(A ∩ B) = 0, so A and B are NOT independent (when both have positive probability)

Bayes' theorem: famous and important in statistical analysis
E.g., Bayesian networks
How does one piece of information affect our knowledge of other things?
E.g., many people on the platform suggests the metro train will come soon

Formula:
P(A | B) = P(B | A) P(A) / P(B), for P(B) > 0
Basic interpretation: it turns P(B | A) into P(A | B)

Suppose that the product has a probability of 0.2 of being exposed to a high level of contamination. If the product is exposed to high contamination, the probability of product failure is 0.1. Otherwise, the probability of product failure is 0.005. What is the probability of product failure?

Key for problem solving: clearly define the events with symbols and express the given values in terms of them
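
The computation is left to the reader on the slide; defining H = exposed to high contamination and F = product failure, the total probability rule gives
P(F) = P(F | H) P(H) + P(F | H′) P(H′) = 0.1(0.2) + 0.005(0.8) = 0.024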

The probability that the medical test correctly identifies someone with the illness as positive is 0.99. The probability that the test correctly identifies someone without the illness as negative is 0.95. The incidence of the illness in the general population is 0.0001.
Question: if the result of one test is positive, what is the probability that this person has the illness?
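
A worked answer (not shown on the slide): let D = has the illness and + = a positive test result. Bayes' theorem gives
P(D | +) = P(+ | D) P(D) / [P(+ | D) P(D) + P(+ | D′) P(D′)] = (0.99 × 0.0001) / (0.99 × 0.0001 + 0.05 × 0.9999) ≈ 0.002
So even with a positive result, the chance of illness is only about 0.2%, because the illness is so rare.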

Prerequisite of a decision problem:
At least two choices and some measure of goodness

Uncertainty means:
The goodness of the outcomes is uncertain, depending on some other factors

Examples:
Buy a stock or not (we do not know the future)
Choose transportation to the airport (we do not know the exact travel time of each option)

Decision for an oil company: drill for oil or sell the land

Payoff: a quantitative measure of the value to the decision maker of the consequences of the outcome
State of nature: a possible situation that affects the outcome and is revealed only when the action is carried out

Payoff table:
                      Oil (θ1)     Dry (θ2)
Drill for oil (a1)    $700,000     −$100,000
Sell the land (a2)     $90,000       $90,000
Prior probability       1 in 4       3 in 4

The decision maker needs to choose one of the possible actions.
Nature then chooses one of the possible states of nature.
Each combination of an action a and a state of nature θ results in a payoff p(a, θ), which is given as one of the entries in a payoff table.
The payoff table is used to find an optimal action for the decision maker according to an appropriate criterion.

Maximin payoff criterion
For each possible action, find the minimum payoff over all possible states of nature. Then find the maximum of these minimum payoffs. Choose the action whose minimum payoff gives this maximum.
Rationale: provides the best guaranteed payoff.

Maximum likelihood criterion
Identify the most likely state of nature (the one with the largest prior probability). For this state of nature, find the action with the maximum payoff. Choose this action.

Using the best available estimates of the probabilities of the respective states of nature (currently the prior probabilities), calculate the expected value of the payoff for each possible action. Choose the action with the maximum expected payoff.
Motivating example (payoffs in $1,000s):
Action a1: E[p(a1, θ)] = 0.25(700) + 0.75(−100) = 100
Action a2: E[p(a2, θ)] = 0.25(90) + 0.75(90) = 90
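
A Python sketch (mine, not from the slides) that applies the three criteria to the drill/sell payoff table; payoffs are in $1,000s and the action/state labels are just dictionary keys:

```python
# Payoff table in $1,000s: payoff[action][state]
payoff = {
    "drill (a1)": {"oil": 700, "dry": -100},
    "sell (a2)":  {"oil":  90, "dry":   90},
}
prior = {"oil": 0.25, "dry": 0.75}

# Maximin payoff criterion: best worst-case payoff
maximin_action = max(payoff, key=lambda a: min(payoff[a].values()))

# Maximum likelihood criterion: best payoff under the most likely state
most_likely_state = max(prior, key=prior.get)
ml_action = max(payoff, key=lambda a: payoff[a][most_likely_state])

# Maximum expected payoff criterion
expected = {a: sum(prior[s] * payoff[a][s] for s in prior) for a in payoff}
ep_action = max(expected, key=expected.get)

print(maximin_action)        # sell (a2): guaranteed 90
print(ml_action)             # sell (a2): 90 under the most likely state "dry"
print(ep_action, expected)   # drill (a1): expected payoff 100 vs. 90
```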

Comments:
It incorporates all the available information.
These estimates of the probabilities are necessarily largely subjective.

Conduct surveys or experiments to obtain additional information
Reduce the uncertainty for a better decision
A better decision leads to a better expected outcome

We need extra resources (e.g., money) to obtain additional information
Survey or experimental results usually provide only a better approximation
Conditional probabilities express the quality of the survey
Perfect information: the survey results are 100% reliable

Determine posterior probabilities for decision making

Conduct a seismic survey to tell how likely it is that the land has oil
S = result obtained from the seismic survey
S = 0: unfavorable (oil is fairly unlikely)
S = 1: favorable (oil is fairly likely)

Conditional probabilities describing the survey quality (recall θ = 1: oil, θ = 2: dry):
P(S=0 | θ=1) = 0.4 and P(S=1 | θ=1) = 0.6
P(S=0 | θ=2) = 0.8 and P(S=1 | θ=2) = 0.2

Find the posterior probabilities
If S = 0 (oil is fairly unlikely):
P(θ=1 | S=0) = 0.4(0.25) / [0.4(0.25) + 0.8(0.75)] = 1/7
P(θ=2 | S=0) = 6/7
If S = 1 (oil is fairly likely):
P(θ=1 | S=1) = 0.6(0.25) / [0.6(0.25) + 0.2(0.75)] = 1/2
P(θ=2 | S=1) = 1/2

Use the posterior probabilities for decision making
If S = 0, we will sell the land (a2):
E[p(a2, θ) | S=0] = (1/7)(90) + (6/7)(90) = 90
P(S=0) = 0.4(0.25) + 0.8(0.75) = 0.7
If S = 1, we will drill for oil (a1):
E[p(a1, θ) | S=1] = (1/2)(700) + (1/2)(−100) = 300
P(S=1) = 0.6(0.25) + 0.2(0.75) = 0.3

Expected payoff with the survey information = 0.70(90) + 0.30(300) = 153

If the experiment costs 30, should we do the experiment or not?
Expected payoff without the experiment = 100
Expected payoff with the experiment = 153
Since 153 − 30 = 123 > 100, doing the experiment is worthwhile.
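
A Python sketch (illustrative; variable names are my own) that reproduces the posterior probabilities and the 153-versus-100 comparison:

```python
prior = {"oil": 0.25, "dry": 0.75}
survey = {0: {"oil": 0.4, "dry": 0.8},   # P(S = 0 | state)
          1: {"oil": 0.6, "dry": 0.2}}   # P(S = 1 | state)
payoff = {"drill": {"oil": 700, "dry": -100},
          "sell":  {"oil":  90, "dry":   90}}

expected_with_survey = 0.0
for s, lik in survey.items():
    p_s = sum(lik[t] * prior[t] for t in prior)               # P(S = s), total probability
    posterior = {t: lik[t] * prior[t] / p_s for t in prior}   # Bayes' theorem
    best = max(sum(posterior[t] * payoff[a][t] for t in prior) for a in payoff)
    print(s, p_s, posterior, best)   # S=0: 0.7, {oil: 1/7, dry: 6/7}, best 90 (sell)
                                     # S=1: 0.3, {oil: 1/2, dry: 1/2}, best 300 (drill)
    expected_with_survey += p_s * best

print(expected_with_survey)  # 153.0, vs. 100 without the survey; net of cost: 153 - 30 = 123
```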

Forks: nodes of the decision tree
Decision fork (square): a decision needs to be made at that point
Chance fork (circle): a random event occurs at that point

[Decision tree figure: terminal payoffs are 670, −130, and 60 on the survey branch (net of the 30 survey cost) and 700, −100, and 90 on the no-survey branch]
Optimal Policy
Do the seismic survey
If S = 0, sell the land.
If S = 1, drill for oil.
The expected payoff is 123


Benjamin S. Blanchard and Wolter J. Fabrycky, Systems Engineering and Analysis, 4th Edition, 2006. Chapter 9.

Frederick S. Hillier and Gerald J. Lieberman, Introduction to Operations Research. Chapters 15.3 and 15.4.

