Learning Outcomes versus Collected Course Materials
Course Material   Outcomes Assessed (CLO1-CLO8)
HW1               X
HW2               X X X X
HW3               X X X X
MT1               X
MT2               X X
MT3
MT4               X
MT5               X
HW4               X X X
Project           X X X

MTn = Midterm Problem n. FEPn = Final Exam Problem n. Within a class there were two versions of each test; however, the two versions were just permutations of the same problems, and the results above are for the first of these two permutations. The two classes had different tests that were variants of each other and tested the same learning outcomes.

CLO1 (Course Learning Outcome 1) -- Be able to create VR apps using WebVR, a VR mobile SDK, or the Unity game engine.
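
For example, a minimal "hello cube" app along these lines would exercise this outcome. The sketch below is in TypeScript using three.js with its WebXR support (the successor to WebVR); the import paths, scene contents, and stack choice are illustrative, not the course's required setup.

    import * as THREE from 'three';
    // Depending on the three.js version the path may instead be 'three/addons/webxr/VRButton.js'.
    import { VRButton } from 'three/examples/jsm/webxr/VRButton.js';

    // Basic scene, camera, and a WebXR-enabled renderer.
    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 100);
    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.setSize(window.innerWidth, window.innerHeight);
    renderer.xr.enabled = true;                                   // turn on WebXR support
    document.body.appendChild(renderer.domElement);
    document.body.appendChild(VRButton.createButton(renderer));   // adds an "Enter VR" button

    // A single cube floating one meter in front of the viewer.
    const cube = new THREE.Mesh(
      new THREE.BoxGeometry(0.3, 0.3, 0.3),
      new THREE.MeshStandardMaterial({ color: 0x2194ce })
    );
    cube.position.set(0, 1.5, -1);
    scene.add(cube, new THREE.HemisphereLight(0xffffff, 0x444444, 1));

    // The render loop; setAnimationLoop is required for WebXR sessions.
    renderer.setAnimationLoop(() => {
      cube.rotation.y += 0.01;
      renderer.render(scene, camera);
    });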

CLO2 -- Be able to calculate by hand the effects of various VR rendering transformations on different input vectors.
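A typical by-hand calculation for this outcome: apply a rotation about the y-axis to an input point and evaluate it at a specific angle. The particular matrix and vector below are just an illustration, written in LaTeX.

    R_y(\theta)\,p =
    \begin{pmatrix}
      \cos\theta & 0 & \sin\theta \\
      0 & 1 & 0 \\
      -\sin\theta & 0 & \cos\theta
    \end{pmatrix}
    \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}
    =
    \begin{pmatrix} \cos\theta \\ 0 \\ -\sin\theta \end{pmatrix},
    \qquad
    R_y(90^\circ)\,(1,0,0)^T = (0,0,-1)^T.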

CLO3 -- Be able to manipulate 360° and 360° 3D (stereoscopic) image and video resources and programmatically display them in a virtual environment.
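
One common way to display an equirectangular 360° image is to texture the inside of a large sphere centered on the viewer. A sketch in TypeScript/three.js; 'pano.jpg' is a placeholder asset and 'scene' is assumed from the earlier sketch.

    import * as THREE from 'three';

    // Map an equirectangular 360 image onto the inside of a large sphere
    // surrounding the camera.
    const geometry = new THREE.SphereGeometry(500, 60, 40);
    geometry.scale(-1, 1, 1);                       // flip the geometry so the texture faces inward

    const texture = new THREE.TextureLoader().load('pano.jpg');
    const panorama = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));
    scene.add(panorama);                            // 'scene' as set up in the earlier sketch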

CLO4 -- Be able to create models in a popular 3D model format, and to programmatically read them and display them in a VR environment.
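
For instance, glTF/GLB is one popular model format, and three.js's GLTFLoader can read and display it. A sketch in TypeScript; 'model.glb' is a placeholder path and 'scene' is assumed from the earlier sketch.

    // Path may instead be 'three/addons/loaders/GLTFLoader.js' in newer setups.
    import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

    // Load a glTF/GLB model and add it to the scene.
    const loader = new GLTFLoader();
    loader.load(
      'model.glb',
      (gltf) => {
        gltf.scene.position.set(0, 0, -2);   // place the model two meters in front of the origin
        scene.add(gltf.scene);
      },
      undefined,                             // no progress callback
      (err) => console.error('Failed to load model:', err)
    );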

CLO5 -- Be able to code a VR app in which objects move according to some kind of simulated physics and in which collisions are detected.
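A hand-rolled example of what such an app's update step might look like: constant gravity integrated each frame, plus a sphere-versus-floor collision test and bounce. TypeScript/three.js; the ball, floor height, and restitution value are illustrative, and 'scene' comes from the earlier sketch.

    import * as THREE from 'three';

    // A ball that falls under gravity and bounces when its bounding sphere
    // intersects the floor plane at y = 0.
    const radius = 0.1;
    const ball = new THREE.Mesh(
      new THREE.SphereGeometry(radius, 32, 16),
      new THREE.MeshStandardMaterial({ color: 0xff4444 })
    );
    ball.position.set(0, 2, -1);
    scene.add(ball);

    const velocity = new THREE.Vector3(0, 0, 0);
    const gravity = new THREE.Vector3(0, -9.8, 0);

    // Call once per frame from the render loop with the elapsed time in seconds.
    function stepPhysics(dt: number): void {
      velocity.addScaledVector(gravity, dt);
      ball.position.addScaledVector(velocity, dt);

      // Collision with the floor: push the ball out and reflect its velocity.
      if (ball.position.y - radius < 0) {
        ball.position.y = radius;
        velocity.y = -velocity.y * 0.8;      // lose some energy on each bounce
      }
    }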

CLO6 -- Be able to code a VR app that does head motion tracking.
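In a WebXR session three.js applies the headset's head tracking to the camera every frame, so reading the camera's world pose back gives the head position and orientation. A sketch building on the earlier renderer/scene/camera setup (TypeScript).

    import * as THREE from 'three';

    const headPosition = new THREE.Vector3();
    const headQuaternion = new THREE.Quaternion();

    renderer.setAnimationLoop(() => {
      // Camera driven by head tracking (older three.js versions take the camera as an argument).
      const xrCamera = renderer.xr.getCamera();
      xrCamera.getWorldPosition(headPosition);
      xrCamera.getWorldQuaternion(headQuaternion);

      // Example use: keep a HUD-like object half a meter in front of the viewer, e.g.
      // hud.position.copy(headPosition).add(new THREE.Vector3(0, 0, -0.5).applyQuaternion(headQuaternion));

      renderer.render(scene, camera);
    });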

CLO7 -- Be able to code a VR app that syncs 3D audio and 3D video.
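One way to keep 3D audio and video together is to drive both from the same video element: its frames become a VideoTexture and its audio track is routed through a PositionalAudio node attached to the on-screen video surface. A sketch in TypeScript/three.js; 'clip.mp4' is a placeholder file, 'camera' and 'scene' come from the earlier sketch, and browsers may require a user gesture before play() succeeds.

    import * as THREE from 'three';

    // The listener represents the user's ears and rides on the camera.
    const listener = new THREE.AudioListener();
    camera.add(listener);

    const video = document.createElement('video');
    video.src = 'clip.mp4';                             // placeholder media file
    video.loop = true;
    video.play();

    // A screen in the scene showing the video frames.
    const screen = new THREE.Mesh(
      new THREE.PlaneGeometry(1.6, 0.9),
      new THREE.MeshBasicMaterial({ map: new THREE.VideoTexture(video) })
    );
    screen.position.set(0, 1.5, -2);
    scene.add(screen);

    // Route the video's audio through a PositionalAudio node attached to the screen,
    // so volume and panning follow the listener's pose relative to the screen.
    const sound = new THREE.PositionalAudio(listener);
    sound.setMediaElementSource(video);
    sound.setRefDistance(1);
    screen.add(sound);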

CLO8 -- Be able to code a VR app that makes use of hand-based gesture inputs and haptic feedback.
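
A sketch of the input side: listen for the WebXR 'selectstart' (trigger) event on a controller and fire a haptic pulse on that controller's gamepad actuator. Haptic support varies by browser and device, so the calls are guarded with optional chaining. TypeScript/three.js; 'renderer' and 'scene' as in the earlier sketch.

    // A Group whose pose three.js keeps in sync with the physical controller.
    const controller = renderer.xr.getController(0);
    scene.add(controller);

    controller.addEventListener('selectstart', (event: any) => {
      console.log('Trigger pressed');

      // event.data is the XRInputSource; its gamepad may expose haptic actuators.
      const gamepad = event.data?.gamepad;
      gamepad?.hapticActuators?.[0]?.pulse?.(0.8, 100);   // intensity 0..1, duration in ms
    });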