1. Course Outline
VR headsets have become a popular platform for creating new kinds of experiences. In this course we will provide an introduction to all aspects of virtual reality: hardware, software, human perception, and evaluation. The course syllabus includes the following.
a. Introduction and history
b. Geometry of the Virtual World
c. Rendering of the Virtual World
d. VR Hardware: lenses, cameras, displays, controllers
e. Perception in VR: visual and auditory
f. Motion, Tracking, and Interaction
g. Evaluation of VR Experiences
2. What this course will not cover
This course will not cover augmented reality (AR) technology; however, it will be a good precursor to any AR course. It also will not teach students how to build VR hardware, although building a simple VR headset could be an excellent project for a subsequent project course.
3. Interaction with Other Visual Computing courses
Geometry and rendering of virtual worlds are the topics in this course that overlap with CS 112 (Computer Graphics). Although familiarity with illumination, transformations, and animation is important for VR, a 10-week course on VR cannot afford to cover these topics in depth. Therefore, all assignments in this course will be done in Unity, which shields students from implementing these elements of the graphics pipeline themselves. CS 112, by contrast, focuses solely on these topics, giving students a detailed tour of the graphics pipeline, the associated programming of vertex and fragment shaders, and the mathematical details of setting up viewpoints, scenes, illumination, transformations, and animation. Interested students can follow up CS 118 with CS 112 to learn these topics in detail. Students who have taken CS 112 prior to CS 118 will refresh their knowledge during the two weeks we spend on the geometry and rendering of VR worlds and will additionally learn how to apply it in a VR application.
CS 118 will not overlap with any other visual computing course such as CS 111, CS 113, CS 114, CS 116, or CS 117. However, it will complement these courses, strengthening the visual computing concentration and offering students more courses to select from.
4. Programming Assignments
The programming assignments for CS 118 have been designed to augment the course content. All assignments will be completed in Unity.
a. Assignment 1 (Lights, Camera, Action!): As students learn the geometry and rendering of virtual worlds, they will set up a VR scene, illuminate it, place a camera in the scene, and render the scene from the camera's location (a minimal sketch of this kind of setup appears after this list).
b. Assignment 2 (Going on a Trip): The second assignment will have students animate the camera and add realistic sound to create a ride like Disneyland's It's a Small World. This is the first time students will create a complete VR visual experience in the course. They will also learn how to make their VR application run on mobile VR. Designing a full experience by the middle of the course helps keep students interested in exploring further.
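To give a rough sense of the scale of Assignment 1, a minimal Unity C# sketch along the following lines could build and light a simple scene from script. The class name SceneSetup and all object placements here are illustrative assumptions, not part of any actual assignment starter code.

using UnityEngine;

// Illustrative sketch only: builds a tiny scene with a lit cube and a camera,
// roughly the kind of setup Assignment 1 (Lights, Camera, Action!) asks for.
public class SceneSetup : MonoBehaviour
{
    void Start()
    {
        // A simple object to look at.
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 0.5f, 2f);

        // A directional light to illuminate the scene.
        GameObject lightObj = new GameObject("Sun");
        Light sun = lightObj.AddComponent<Light>();
        sun.type = LightType.Directional;
        lightObj.transform.rotation = Quaternion.Euler(50f, -30f, 0f);

        // A camera placed behind the origin, looking at the cube.
        GameObject camObj = new GameObject("SceneCamera");
        camObj.AddComponent<Camera>();
        camObj.transform.position = new Vector3(0f, 1f, -3f);
        camObj.transform.LookAt(cube.transform);
    }
}

Attached to any object in an empty Unity scene, a script like this renders the scene from the scripted camera on Play; in the actual assignment, students would work against a VR camera rig rather than a plain Camera.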
5. Hardware Required: Students will be required to test the output of their programming assignments on a VR headset with the TA. TAs and students will have access to headsets through the iGravi lab in ICS and through the UCI Science Library's VR headset loan program, to which Facebook/ICS donated multiple headsets in 2019.
6. Evaluation: This will be a programming-intensive course, with most of the grade depending on the programming assignments. However, in order to prepare students on the theoretical background of hardware, perception, tracking, interaction, and evaluation (which together form a significant part of the instruction), the course will include a midterm, a final, and a couple of written assignments that prepare students for those exams. The exams will make up the remaining part of the grade. The weights for these activities will be as follows.
a. Programming Assignments: 60%
b. Written Assignments: 5%
c. Midterm: 10%
d. Final: 25%
7. Student Time Commitments: In addition to the 3 hours of lecture every week, students will need to spend about 9 hours a week reading, working on programming assignments, answering written assignment questions, and preparing for the midterm and final exams.
8. Assistance Required: Given that this is a highly hands-on course that also covers substantial theoretical fundamentals, the teaching assistant's responsibilities include the following.
a. Helping with programming assignments
b. Helping test programming assignments on a VR headset from time to time
c. Running a mandatory discussion session for teaching Unity
d. Grading written assignments, midterms, and final exams
e. Helping students understand the theoretical concepts presented in class, during discussion sessions and additional office hours
With this high workload, the class would need 1 TA per 25-30 students.
Adapting to COVID-19
Given the current COVID-19 situation, testing programming assignments on VR headsets will not be possible, and the library is not yet open to provide enough headsets. To address this, all assignments will be modified so that they can be completed on a regular PC or laptop with simulated controls, as sketched below.
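As a rough illustration of what simulated controls could look like, a small Unity C# script along the following lines could stand in for headset tracking on a regular PC. The class name SimulatedHeadset, the key bindings, and the speed values are placeholder assumptions, not part of any official assignment template.

using UnityEngine;

// Illustrative sketch only: approximates head rotation with the mouse and
// controller locomotion with the default Horizontal/Vertical input axes
// (WASD or arrow keys) when no VR hardware is available.
public class SimulatedHeadset : MonoBehaviour
{
    public float lookSpeed = 2f;   // degrees per unit of mouse movement
    public float moveSpeed = 3f;   // meters per second
    private float yaw, pitch;

    void Update()
    {
        // Mouse movement stands in for head rotation.
        yaw += Input.GetAxis("Mouse X") * lookSpeed;
        pitch -= Input.GetAxis("Mouse Y") * lookSpeed;
        pitch = Mathf.Clamp(pitch, -80f, 80f);
        transform.rotation = Quaternion.Euler(pitch, yaw, 0f);

        // Keyboard input stands in for controller locomotion.
        Vector3 move = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
        transform.Translate(move * moveSpeed * Time.deltaTime, Space.Self);
    }
}

Attached to the scene camera, such a script lets students fly through and inspect their scenes with mouse and keyboard instead of a tracked headset.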
TENTATIVE SCHEDULE
Textbook: Virtual Reality by Steven M. LaValle (Cambridge University Press) - http://lavalle.pl/vr
Prerequisites: ICS 6N (Linear Algebra) and ICS 33 (Algorithms and Programming)
Wk | Title | Topics | Textbook Chapters | Programming Assignments | Written Evaluations
1 | Intro to VR | What is VR, history of VR, applications of VR, VR systems overview: hardware, software, human perception in VR | Chapters 1 and 2 | |
2 | Geometric Transformations | Homogeneous coordinates, 3D transformations, coordinate transformations, concatenation of transformations, vertex shaders | Chapter 3 | Lights, Camera, Action!: build a 3D VR scene, illuminate it, and capture it with a camera |
3 | Rendering | Illumination and shading, texture mapping, fragment shaders, additional considerations for VR | Chapter 7 | | Written Assignment 1: on Chapters 1-4 and 11
4 | Audio in VR | Sound, physiology of the ear, auditory perception and rendering | Chapter 11 | Going on a Trip: camera animation, assigning sounds, mobile VR compatibility, an It's a Small World kind of ride |
5 | Optics and the Human Eye | Lenses, optical aberrations, cameras, color, displays, the human eye, implications for VR | Chapters 4 and 5 | |
6 | Visual Perception | The physiology of the human eye; depth, motion, and color perception; implications for VR | Chapters 5 and 6 | An Application of VR: 360-degree immersive video | MIDTERM
7 | Motion and Tracking | Physics of motion in the real world, mismatched motion in VR, tracking of 2D and 3D orientation | Chapters 8 and 9 | | Written Assignment 2: on Chapters 5-10 and 12
8 | Tracking and Interaction | Tracking of position and orientation, tracking of 3D bodies, 3D scene scanning, motor functions and remapping | Chapters 9 and 10 | Track and Interact: eye-gaze tracker, control tracking, and responding with interaction |
9 | Interaction | Locomotion, manipulation, social interaction, additional mechanisms | Chapter 10 | |
10 | Evaluating VR Experiences | Perceptual training, developers' tips, motion sickness, experimentation with human subjects, frontiers and review | Chapter 12 | |
Finals Week | | | | | FINAL EXAM