Color-to-Depth Mappings as Depth Cues in Virtual Reality

Zhipeng Li, Yikai Cui, Tianze Zhou, Yu Jiang, Yuntao Wang, Yukang Yan, Michael Nebeling, and Yuanchun Shi. 2022. Color-to-Depth Mappings as Depth Cues in Virtual Reality. In The 35th Annual ACM Symposium on User Interface Software and Technology (UIST '22), October 29–November 2, 2022, Bend, OR, USA. ACM, New York, NY, USA, 14 pages.

Despite significant improvements to Virtual Reality (VR) technologies, most VR displays are fixed-focus, and depth perception remains a key issue limiting user experience and interaction performance. To supplement humans' inherent depth cues (e.g., retinal blur, motion parallax), we investigate users' perceptual mappings of distance to virtual objects' appearance in order to generate visual cues that enhance depth perception. As a first step, we explore color-to-depth mappings for virtual objects, so that their appearance differs in saturation and value to reflect their distance. Through a series of controlled experiments, we elicit and analyze users' strategies for mapping a virtual object's hue, saturation, value, and a combination of saturation and value to its depth. Based on the collected data, we implement a computational model that generates color-to-depth mappings satisfying adjustable requirements on confusion probability, number of depth levels, and consistent saturation/value changing tendency. We demonstrate the effectiveness of color-to-depth mappings in a 3D sketching task: compared to single-colored targets and strokes, our mappings made users more confident in their accuracy without extra cognitive load and reduced perceived depth error by 60.8%. We also implement four VR applications and demonstrate how our color cues can benefit user experience and interaction performance in VR.
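The idea of varying saturation and value with distance can be illustrated with a minimal sketch. This is a hypothetical linear mapping for intuition only, not the paper's computational model: the function name, the quantization into discrete depth levels, and the interpolation ranges are all illustrative assumptions.

```python
import colorsys

def depth_to_color(depth, max_depth, hue=0.6, levels=8,
                   s_range=(1.0, 0.3), v_range=(1.0, 0.5)):
    """Quantize depth into `levels` discrete bins and map each bin to an
    HSV color whose saturation and value decrease monotonically with
    distance (hypothetical linear mapping; the paper's model additionally
    accounts for confusion probability between adjacent levels)."""
    level = min(int(depth / max_depth * levels), levels - 1)
    t = level / (levels - 1)  # 0 = nearest level, 1 = farthest
    s = s_range[0] + t * (s_range[1] - s_range[0])
    v = v_range[0] + t * (v_range[1] - v_range[0])
    return colorsys.hsv_to_rgb(hue, s, v)

# Nearer objects render brighter and more saturated than farther ones.
near_rgb = depth_to_color(0.5, max_depth=10.0)
far_rgb = depth_to_color(9.5, max_depth=10.0)
```

Keeping the saturation/value change monotonic across levels mirrors the "consistent changing tendency" requirement described in the abstract; the number of levels is exposed as a parameter for the same reason.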
