I will present a series of research prototypes of immersive virtual reality systems that reproduce not only what users see and hear, but also what users feel. Unlike traditional approaches to VR haptics, such as vibrating gloves, our systems reproduce large physical effects, such as solid walls, splashing water, levers that users can flip, and tornados that actually lift users into the air. To explore the space of large-scale haptic effects quickly, our systems are not based on machines — they are based on people. Our systems orchestrate so-called "human actuators" to do all the physical labor: just in time, these human workers manually lift, tilt, or push the player, or present props to simulate walls or levers. I will show a motion platform based on five human actuators, a real-walking VR setup based on ten human actuators, and a "consumer-scale" version of our concept that runs on a single GearVR headset actuated by a single human actuator.
Haptic Turk: A Motion Platform Based on People (paper at CHI 2014):
TurkDeck: Physical Virtual Reality Based on People (paper at UIST 2015):
Lung-Pan Cheng is a Ph.D. candidate working with Prof. Dr. Patrick Baudisch in the Human Computer Interaction Lab at the Hasso Plattner Institute. His research focuses on virtual reality, specifically haptics and mobile technology. His recent work aims at making immersive haptic experiences available at mass scale.
Prior to his Ph.D. studies, he received a B.S. in Computer Science from National Chiao Tung University and an M.S. in Computer Science from the Mobile HCI Lab at National Taiwan University.