# CS156 Fall 2017 Practice Final

To study for the final I suggest you:

1. Know how to do (by heart) all the practice problems.
2. Go over your notes at least three times. On the second and third passes, see how much you can remember from the previous pass.
3. Go over the homework problems.
4. Try to create your own problems, similar to the ones I have given, and solve them.
5. Skim the relevant sections of the book.
6. If you want to study in groups, at this point you are ready to quiz each other.

The practice final is below. Here are some facts about the actual final:

- It is comprehensive.
- It is closed book and closed notes. Nothing will be permitted on your desk except your pen (or pencil) and the test.
- You should bring photo ID.
- There will be more than one version of the test. Each version will be of comparable difficulty.
- It has 10 problems (2 pts each): 6 problems will be on material since the second midterm, and two problems will come from the topics of each midterm (four problems total).
- Two problems will be taken exactly (modulo typos) from this practice final, and one from each practice midterm.

- What new rules of inference are allowed in first-order natural deduction proofs that are not allowed in propositional logic? Give an example of using at least one such rule.
- Consider the task of taking out the garbage: this might involve sorting refuse into garbage or recycle piles, putting the piles into garbage or recycle bins, and taking each bin out to the curb. Model this problem as a PDDL planning problem. Give an example solution plan.
- Give and briefly explain the GraphPlan algorithm presented in class.
- In our categories-and-objects approach to knowledge representation, show how the BunchOf predicate can be defined in terms of the PartOf predicate.
- Give an example of a default logic knowledge base with at least four distinct extensions.
- Give an example of computing probabilities using conditioning.
- Draw an equilateral triangle in the plane with edge length 10. Around each vertex, draw a square of edge length 2. Using all of the vertices you drew, explain how the 3-means clustering algorithm would cluster these points.
- Write down the decision tree learning algorithm. Explain how information gain could be calculated.
- Give an example of a function not computable by a single perceptron gate. Show that it can be computed by a multi-layer perceptron network.
- Give the k-nearest neighbors algorithm. Define non-parametric learning and say why k-nearest neighbors is non-parametric.
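For the BunchOf problem, one common textbook-style definition (following the AIMA treatment of composite objects; check it against your class notes) characterizes BunchOf(s) as the smallest object that has every element of s as a part:

```latex
% Every element of s is part of BunchOf(s):
\forall x\; (x \in s) \Rightarrow \mathit{PartOf}(x, \mathit{BunchOf}(s))
% BunchOf(s) is part of any object that has all elements of s as parts:
\forall y\; \big[\forall x\; (x \in s) \Rightarrow \mathit{PartOf}(x, y)\big]
  \Rightarrow \mathit{PartOf}(\mathit{BunchOf}(s), y)
```

The second axiom is the minimality condition: without it, any sufficiently large object containing s would also qualify.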
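For the garbage-planning problem above, here is a minimal STRIPS-style sketch in Python standing in for PDDL. All the predicate, action, and object names (`unsorted`, `in_pile`, `garbage_bin`, etc.) are my own invention for illustration, not from the course.

```python
# STRIPS-style sketch of the garbage problem. States are sets of ground
# facts; an action is (preconditions, add list, delete list).

def apply(state, action):
    """Apply a STRIPS action if its preconditions hold in the state."""
    pre, add, delete = action
    assert pre <= state, "preconditions not satisfied"
    return (state - delete) | add

# Initial state: two unsorted items (hypothetical objects).
init = frozenset({("unsorted", "banana"), ("unsorted", "bottle")})

def sort_item(item, pile):      # sort refuse into a garbage/recycle pile
    return ({("unsorted", item)}, {("in_pile", item, pile)},
            {("unsorted", item)})

def load_pile(pile, bin_):      # put a pile into a bin
    return (set(), {("in_bin", pile, bin_)}, set())

def take_out(bin_):             # take a bin out to the curb
    return (set(), {("at_curb", bin_)}, set())

# An example solution plan, applied step by step.
plan = [sort_item("banana", "garbage_pile"),
        sort_item("bottle", "recycle_pile"),
        load_pile("garbage_pile", "garbage_bin"),
        load_pile("recycle_pile", "recycle_bin"),
        take_out("garbage_bin"),
        take_out("recycle_bin")]

state = init
for a in plan:
    state = apply(state, a)

goal = {("at_curb", "garbage_bin"), ("at_curb", "recycle_bin")}
print(goal <= state)  # True: the plan achieves the goal
```

In real PDDL the same model would be written as a domain file (action schemas with `:precondition` and `:effect`) plus a problem file (objects, `:init`, `:goal`); the set-based semantics above is the same.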
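For the conditioning problem, here is one worked example of computing a conditional probability by enumeration, using the definition P(A | B) = P(A and B) / P(B). The two-dice scenario is my own choice, not from the course.

```python
# Conditioning by enumeration: roll two fair six-sided dice and compute
# P(sum = 8 | first die = 3).
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))    # 36 equally likely rolls

B = [o for o in outcomes if o[0] == 3]             # first die shows 3
A_and_B = [o for o in B if sum(o) == 8]            # ...and the sum is 8

p_B = Fraction(len(B), len(outcomes))              # 6/36
p_A_and_B = Fraction(len(A_and_B), len(outcomes))  # 1/36
p_A_given_B = p_A_and_B / p_B                      # definition of conditioning
print(p_A_given_B)  # 1/6
```

Conditioning on the first die shrinks the sample space from 36 outcomes to 6, and exactly one of those (3, 5) makes the sum 8.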
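For the clustering problem, a sketch of how 3-means (Lloyd's algorithm) behaves on the twelve square corners. Seeding the means at the triangle vertices is my own choice for the sketch; since the squares are only 2 wide but about 10 apart, any reasonable seeding separates them the same way.

```python
# Build the 12 corners: an equilateral triangle of edge 10, with a square
# of edge 2 centered on each vertex, then run plain 3-means.
import math

tri = [(0.0, 0.0), (10.0, 0.0), (5.0, 5 * math.sqrt(3))]
points = [(vx + dx, vy + dy) for vx, vy in tri
          for dx in (-1, 1) for dy in (-1, 1)]

def dist2(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

means = list(tri)                       # initial means (assumed seeding)
for _ in range(10):                     # Lloyd iterations
    # Assignment step: each point joins its nearest mean.
    clusters = [[] for _ in means]
    for p in points:
        i = min(range(3), key=lambda i: dist2(p, means[i]))
        clusters[i].append(p)
    # Update step: each mean moves to its cluster's centroid.
    means = [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
             for c in clusters]

for c in clusters:
    print(sorted(c))  # each cluster is exactly one square's four corners
```

The algorithm converges after one iteration: each square's four corners are about 1.4 from their own vertex but about 9 or more from the others, so each cluster is one square and each mean lands back on its triangle vertex.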
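For the decision-tree problem, a small sketch of how information gain is calculated: the entropy of the labels minus the weighted entropy after splitting on an attribute. The tiny Boolean dataset is made up for illustration.

```python
# Information gain: Gain(A) = H(S) - sum_v (|S_v| / |S|) * H(S_v).
import math

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((labels.count(v) / n) * math.log2(labels.count(v) / n)
                for v in set(labels))

def info_gain(rows, labels, attr):
    """Entropy reduction from splitting the rows on attribute attr."""
    total = entropy(labels)
    n = len(rows)
    for v in set(r[attr] for r in rows):
        sub = [lab for r, lab in zip(rows, labels) if r[attr] == v]
        total -= len(sub) / n * entropy(sub)
    return total

rows = [{"windy": True},  {"windy": True},
        {"windy": False}, {"windy": False}]
labels = ["no", "no", "yes", "yes"]
print(info_gain(rows, labels, "windy"))  # 1.0: 'windy' splits perfectly
```

The decision-tree learner picks, at each node, the attribute with the largest gain; here `windy` separates the classes perfectly, so its gain equals the full starting entropy of 1 bit.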
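For the perceptron problem, the standard example of a function a single perceptron gate cannot compute is XOR, because it is not linearly separable. A sketch of a two-layer network of threshold gates that does compute it (the specific weights are one choice of many):

```python
# A perceptron gate fires iff the weighted sum plus bias exceeds 0.
def perceptron(weights, bias):
    return lambda *x: int(sum(w * xi for w, xi in zip(weights, x)) + bias > 0)

OR   = perceptron((1, 1), -0.5)
NAND = perceptron((-1, -1), 1.5)
AND  = perceptron((1, 1), -1.5)

def xor_net(x, y):
    # Hidden layer computes OR and NAND; output gate ANDs them:
    # XOR(x, y) = AND(OR(x, y), NAND(x, y)).
    return AND(OR(x, y), NAND(x, y))

for x in (0, 1):
    for y in (0, 1):
        print(x, y, xor_net(x, y))  # matches x XOR y on all four inputs
```

No single gate can do this: any one perceptron draws a line in the plane, and no line puts (0, 1) and (1, 0) on one side with (0, 0) and (1, 1) on the other.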
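For the k-nearest-neighbors problem, a minimal majority-vote sketch with Euclidean distance (the toy training set is made up). It also illustrates why k-NN is non-parametric: the "model" is the entire training set, so its size grows with the data rather than being a fixed number of parameters.

```python
# k-NN classification: label a query by majority vote of its k nearest
# training points under squared Euclidean distance.
from collections import Counter

def knn_classify(train, query, k=3):
    """train: list of ((x, y), label) pairs; returns the majority label."""
    by_dist = sorted(train, key=lambda pl:
                     (pl[0][0] - query[0]) ** 2 + (pl[0][1] - query[1]) ** 2)
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_classify(train, (0.2, 0.2)))  # 'a' (near the first cluster)
print(knn_classify(train, (5.5, 5.5)))  # 'b' (near the second cluster)
```

There is no training step at all: all work happens at query time, which is the hallmark of a non-parametric, instance-based learner.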