
Color-to-Depth Mappings as Depth Cues in Virtual Reality

Updated: Jun 15

Zhipeng Li, Yikai Cui, Tianze Zhou, Yu Jiang, Yuntao Wang, Yukang Yan, Michael Nebeling, and Yuanchun Shi.




Despite significant improvements to Virtual Reality (VR) technologies, most VR displays are fixed-focus, and depth perception remains a key issue limiting user experience and interaction performance. To supplement humans' inherent depth cues (e.g., retinal blur, motion parallax), we investigate users' perceptual mappings of distance to virtual objects' appearance, with the goal of generating visual cues that enhance depth perception. As a first step, we explore color-to-depth mappings for virtual objects so that their appearance differs in saturation and value to reflect their distance. Through a series of controlled experiments, we elicit and analyze users' strategies for mapping a virtual object's hue, saturation, value, and a combination of saturation and value to its depth. Based on the collected data, we implement a computational model that generates color-to-depth mappings satisfying adjustable requirements on confusion probability, number of depth levels, and a consistent saturation/value changing tendency. We demonstrate the effectiveness of color-to-depth mappings in a 3D sketching task: compared to single-colored targets and strokes, with our mappings, users were more confident in their accuracy without extra cognitive load and reduced the perceived depth error by 60.8%. We also implement four VR applications and demonstrate how our color cues can benefit user experience and interaction performance in VR.
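To make the core idea concrete, below is a minimal Python sketch of a depth-to-color mapping in the spirit described above: depth is normalized, quantized into a fixed number of depth levels, and saturation and value are interpolated monotonically with distance. All parameter names and ranges here (near, far, hue, n_levels, s_range, v_range) are illustrative assumptions, not the fitted model from the paper, which is derived from user-elicited data and confusion-probability constraints.

```python
import colorsys

def depth_to_color(depth, near=0.5, far=5.0, hue=0.6,
                   n_levels=8, s_range=(1.0, 0.2), v_range=(1.0, 0.4)):
    """Map a depth value (in meters) to an RGB color.

    Depth is normalized to [0, 1], quantized into n_levels discrete
    levels (a stand-in for the paper's confusion-probability and
    depth-level requirements), and saturation/value are linearly
    interpolated so both decrease monotonically with distance.
    """
    # Normalize depth to [0, 1] and clamp to the valid range.
    t = min(max((depth - near) / (far - near), 0.0), 1.0)
    # Quantize into discrete levels so adjacent depths stay distinguishable.
    level = min(int(t * n_levels), n_levels - 1)
    t = level / (n_levels - 1)
    # Interpolate saturation and value with a consistent changing tendency.
    s = s_range[0] + t * (s_range[1] - s_range[0])
    v = v_range[0] + t * (v_range[1] - v_range[0])
    return colorsys.hsv_to_rgb(hue, s, v)

# Nearer objects render more saturated and brighter; farther ones
# desaturated and darker.
print(depth_to_color(0.5))  # near object
print(depth_to_color(5.0))  # far object
```

In an actual VR implementation, this mapping would be evaluated per object (or per stroke vertex, as in the 3D sketching task) and the resulting color applied to the object's material.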
