
Final Exam II – 5DV019 – Artificial Intelligence

Saturday January 10th, 2009


Examination hall 6 (booking no. 102176)
9.00 - 15.00

NUMBER: _________

Write this number on each sheet in the exam body.


-10 pts if your name appears in the exam body

NAME: _______________________________

PERSON NUMBER: ________________

EMAIL: ________@cs.umu.se

Total extra credit from assignments: _____

Exam point total: _____

Total points: _____

Grade: _____

PROVIDE ANSWERS ON THE GIVEN SHEETS.


NO EXTRA PAGES ALLOWED!


1 True or False - 200 points


Each question is worth 10 pts. If you mark a question wrong, you lose 5 pts. The
lowest possible score on the entire section is 0 pts. Place your answers in the space provided (___).

1. ______ A* using GRAPH-SEARCH always finds an optimal solution.


2. ______ GRAPH-SEARCH may give an exponential speedup over TREE-SEARCH.
3. ______ If α ⊢ β whenever α |= β, then the proof method ⊢ is sound.
4. ______ Some well-formed first-order formula has an infinite number of models.
5. ______ Perceptron learning converges quickly when learning the majority function.
6. ______ STRIPS planners typically employ task decomposition and maximum
expected utility in their planning process.
7. _____ Auto-association is a type of supervised learning.
8. _____ A utility function is a mapping from states to their desirability.
9. _____ You may use negation in the pre-conditions of STRIPS operators.
10. _____ John McCarthy invented LISP.
11. _____ Queries over singly connected (poly-tree) Bayesian networks may be
solved in linear time in the number of nodes in the network.
12. _____ Agent performance measures should be based solely on agent actions.
13. ______ Bayesian networks are a compact way in which to represent a joint
probability distribution.
14. ______ In Bayesian networks, if the set of nodes Y d-separates the set of nodes
X and Z, then P(X|Y) = P(X|Y, Z).
15. ______ To apply resolution, you must convert your formulas to CNF.
16. ______ A consistent heuristic always underestimates the cost to reach the goal.
17. ______ Decision trees may express any propositional formula.
18. ______ The closed world assumption states that those facts not expressly stated
(or entailed from stated sentences) are assumed false.
19. ______ Predicate logic has the quantifiers ∀ and ∃.
20. ______ Depth-first search requires constant memory.


2 Fill in the blank (100 points)


a. How many models are there for (a ∨ b ∨ c) among all of the interpretations over
three Boolean variables? ___________
b. How many bits of information are there in knowing the outcome of a six-sided
die? __________
c. If b is the branching factor and d is the depth of the shallowest goal node, what
is the space complexity of breadth-first search? ___________
d. Define the probability P(w1, w2, . . . , wn | Signal), where w1, w2, . . . , wn is a
sequence of words and Signal is an auditory speech signal. Identify the acoustic
and language model portions of the resulting expression.

3 Short answers: (50 points)


Answer in the space provided:
a. Define alpha-beta pruning.

b. Define the brain prosthesis thought experiment.


4 Search - 100 points


(From Russell and Norvig, 2003) Consider the state space where the start state is number 1
and the successor function for state n returns the two states 2n and 2n + 1.
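
The successor function just described can be sketched in a few lines; this is a minimal
illustration only, assuming Python (the function name successors is not part of the exam):

    # Successor function from the problem statement:
    # state n expands to the two states 2n and 2n + 1.
    def successors(n):
        return [2 * n, 2 * n + 1]

    # For example, successors(1) == [2, 3] and successors(5) == [10, 11].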
a. Draw the state space for states 1 to 15.

b. Suppose the goal state is 11. Give the order in which states will be visited under:

Breadth-first __________________________________
Depth-first __________________________________
Iterative deepening __________________________________

c. Would bidirectional search be appropriate for this problem? If so, why?


5 Logic (200 points)


Over the unary predicate Event(X) and the binary predicate Causes(X, Y) (meaning X
causes Y), express the following:

(5 pts) a. Every event is caused by another event.

(5 pts) b. For arbitrary events X, Y, and Z, if X causes Y and Y causes Z, then X causes Z.

(5 pts) c. An event cannot cause itself.

(5 pts) d. Event A causes event B, and event B causes event C.

(5 pts) e. Event A does not cause event C.


(75 pts) f. Convert the above formulas to CNF.


(50 pts) g. Using resolution, derive the empty clause (you should, of course, restrict your
attention to the clauses for b, d, and e above).

(50 pts) h. Is there a finite model of the formulas a, b, and c above? Discuss.


6 Bayesian Networks (150 points)


Assume that each of the 10 variables in the network is Boolean.
[Bayesian network diagram over the ten variables, laid out in rows A J G / B E / C D H / F I; the arcs are not recoverable from this text-only copy.]

Each question (a-j) is worth 10 points. If you mark a question wrong, you
lose 5 points. The lowest possible score on the entire section is 0 points.

a. Ignoring the network structure, how many parameters are required to specify the
full joint probability distribution over all of the variables? __________.
b. From a., what is the sum of all of these parameters? __________.
c. How many parameters would be required to specify the joint probability given
the network structure above? __________.
d. Is the cost for answering arbitrary queries over this network polynomial or exponential
in the number of variables? __________
e. P(C|B) = P(C|B, H)? __________
f. P(G) = P(G|A)? _________
g. P(D|B, E) = P(D|B, E, F)? _________
h. P(G) = P(G|A, F)? _________
i. P(E|A) = P(E|A, B, F)? _________
j. P(J|A) = P(J|A, I)? _________

(50 pts) k. Using the chain rule and d-separation, write an expression for the full joint
distribution of the Bayesian network.
P(A, B, C, D, E, F, G, H, I, J) =


7 Essay (200 points)


Discuss the idea of the technological singularity. Does it require strong AI?
