Students, faculty, and technologists at Washington & Lee’s Integrative and Quantitative (IQ) Center have been experimenting with virtual reality (VR) for a couple of years, beginning with phone-based VR systems like Google Cardboard. This year, W&L upgraded to a dedicated VR headset, the HTC Vive. These new headsets provide a compelling, immersive way to visualize and interact with content, but very little educational content is currently available, especially for higher education. For the time being, getting the most out of these systems requires either creating original content or adapting existing material to work in VR.
Fortunately, when it comes to visualization, many workflows for generating and manipulating 3D content (molecular modeling, 3D animation, motion capture, photogrammetry, geographic information systems, and 360-degree photography and video) translate well to VR platforms with a little work and a healthy respect for the current limitations of the hardware.
According to IQ Center Academic Technologist Dave Pfaff:
“Developing interactive scenes for VR takes a little more work and some specialized skills, but the potential for creating educational tools that facilitate active and blended learning at all levels of education is virtually limitless.”
Faculty and students, including a group from W&L’s Advanced Research Cohort (ARC), have launched a number of explorations this year that are highlighted in more detail on W&L’s Academic Technology Blog, among them:
- Interactive structural biology models (catalyzed phosphorylation reaction)
- Photogrammetry models of campus buildings
- Laser-scan model of a woolly mammoth
- Crystal structures in 3D
- “Grabbable” MRI scans of the brain from the “Glass Brain” project
- Motion capture animation from a dance class
Clip from W&L’s virtual reality lab