Comparative assessment of spatial perception in augmented reality depending on the consistency of depth cues
pp. 326–332 | DOI: 10.3176/proc.2021.4S.03
Discrepancy between the depth cues of accommodation and vergence is one of the major issues in stereoscopic augmented reality at close viewing distances. It adversely affects not only user comfort but also spatial judgements. The implementation of multifocal architectures in head-mounted displays has made images with consonant cues available at different distances, although their effect on spatial perception has remained unknown. In this psychophysical study, we investigated the effects of consonant and conflicting depth cues on perceptual distance matching in stereoscopic augmented reality, using a head-mounted display driven in two modes: a multifocal mode and a single-focal-plane mode. The participants matched the distance of a real object with images projected at three viewing distances (45 cm, 65 cm, and 115 cm). No significant differences in the accuracy of spatial perception were found depending on the consistency of cues; however, the perceptual tasks were completed faster when the depth cues were consonant. Overall, the results of our experiment show that consonant depth cues facilitate faster judgements of the spatial relations between real objects and images projected in augmented reality, which can be achieved when images are displayed on multiple depth planes in a head-mounted display. Further technological advancement may be required to improve the accuracy of spatial judgements in augmented reality.