Estonian Academy Publishers (published since 1952)

Proceedings of the Estonian Academy of Sciences
ISSN 1736-7530 (Electronic)
ISSN 1736-6046 (Print)
Impact Factor (2020): 1.045

Comparative assessment of spatial perception in augmented reality depending on the consistency of depth cues; pp. 326–332

Full article in PDF format | DOI: 10.3176/proc.2021.4S.03

Authors
Linda Krauze, Tatjana Pladere, Roberts Zabels, Rendijs Smukulis, Viktorija Barkovska, Vita Konosonoka, Ibrahim Musayev, Aiga Svede, Gunta Krumina

Abstract

The discrepancy between depth cues (accommodation and vergence) is one of the major issues in stereoscopic augmented reality at close viewing distances. It adversely affects not only user comfort but also spatial judgements. Images with consonant cues at different distances have become available with the implementation of multifocal architecture in head-mounted displays, although its effect on spatial perception has remained unknown. In this psychophysical study, we investigated the effects of consonant and conflicting depth cues on perceptual distance matching in the stereoscopic environment of augmented reality, using a head-mounted display driven in two modes: multifocal mode and single-focal-plane mode. The participants matched the distance of a real object with images projected at three viewing distances (45 cm, 65 cm, and 115 cm). No significant differences in the accuracy of spatial perception were found depending on the consistency of cues. However, the perceptual tasks were completed faster when the depth cues were consonant. Overall, the results of our experiment show that consonant depth cues facilitate faster judgements of spatial relations between real objects and images projected in augmented reality, which can be achieved when images are displayed using multiple depth planes in the head-mounted display. Further technological advancements might be required to improve the accuracy of spatial judgements in augmented reality.
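The cue conflict discussed in the abstract can be quantified with standard optical relations: accommodative demand is the reciprocal of viewing distance in metres (in diopters), and vergence demand follows from the interpupillary distance (IPD) and the target distance. The sketch below, which is not part of the study itself, computes both demands at the three viewing distances used in the experiment, assuming a typical IPD of 6.3 cm (an illustrative value, not taken from the paper); in a single-focal-plane display the vergence demand varies with the rendered distance while accommodation stays fixed at the focal plane, which is the source of the conflict.

```python
import math

def cue_demands(distance_cm, ipd_cm=6.3):
    """Return (accommodative demand in diopters, vergence demand in degrees)
    for a binocularly fixated target at distance_cm, assuming an
    interpupillary distance of ipd_cm."""
    d_m = distance_cm / 100.0
    accommodation = 1.0 / d_m  # diopters = 1 / distance in metres
    # Each eye rotates by atan((IPD/2) / distance); total vergence is twice that.
    vergence = 2.0 * math.degrees(math.atan((ipd_cm / 2.0) / distance_cm))
    return accommodation, vergence

# The three viewing distances used in the experiment
for d in (45, 65, 115):
    acc, verg = cue_demands(d)
    print(f"{d:>3} cm: accommodation {acc:.2f} D, vergence {verg:.2f} deg")
```

Consonant cues mean both values correspond to the same distance; in a conflicting (single-focal-plane) condition, only the vergence value tracks the rendered image distance.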


References

1. Cutting, J. E. and Vishton, P. M. Perceiving layout and knowing distances: the integration, relative potency, and contextual use of different information about depth. In Handbook of Perception and Cognition: Perception of Space and Motion, vol. 5 (Epstein, W. and Rogers, S., eds). Academic Press, San Diego, CA, 1995, 69–117.
https://doi.org/10.1016/B978-012240530-3/50005-5

2. Viguier, A., Clément, G. and Trotter, Y. Distance perception within near visual space. Perception, 2001, 30(1), 115–124. 
https://doi.org/10.1068/p3119

3. Peli, E. Optometric and perceptual issues with head-mounted displays. In Visual Instrumentation: Optical Design and Engineering Principles (Mouroulis, P., ed.). McGraw-Hill, New York, NY, 1999.

4. Condino, S., Carbone, M., Piazza, R., Ferrari, M. and Ferrari, V. Perceptual limits of optical see-through visors for augmented reality guidance of manual tasks. IEEE Trans. Biomed. Eng., 2020, 67(2), 411–419. 
https://doi.org/10.1109/TBME.2019.2914517

5. Peillard, E., Itoh, Y., Normand, J.-M., Argelaguet, F., Moreau, G. and Lécuyer, A. Can retinal displays improve spatial perception in augmented reality? In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality, Porto de Galinhas, Brazil, November 9–13, 2020. IEEE. 
https://doi.org/10.1109/ISMAR50242.2020.00028

6. Hoffman, D. M., Girshick, A. R., Akeley, K. and Banks, M. S. Vergence-accommodation conflicts hinder visual performance and cause visual fatigue. J. Vis., 2008, 8(3), 33. 
https://doi.org/10.1167/8.3.33

7. Howarth, P. A. Potential hazards of viewing 3-D stereoscopic television, cinema and computer games: A review. Ophthalmic Physiol. Opt., 2011, 31(2), 111–122.
https://doi.org/10.1111/j.1475-1313.2011.00822.x

8. Watt, S. J., Akeley, K., Ernst, M. O. and Banks, M. S. Focus cues affect perceived depth. J. Vis., 2005, 5(10), 7.
https://doi.org/10.1167/5.10.7

9. Naceri, A., Moscatelli, A. and Chellali, R. Depth discrimination of constant angular size stimuli in action space: role of accommodation and convergence cues. Front. Hum. Neurosci., 2015, 9, 511.
https://doi.org/10.3389/fnhum.2015.00511

10. Wee, S. W., Moon, N. J., Lee, W. K. and Jeon, S. Ophthalmological factors influencing visual asthenopia as a result of viewing 3D displays. Br. J. Ophthalmol., 2012, 96(11), 1391–1394.
https://doi.org/10.1136/bjophthalmol-2012-301690

11. Shibata, T., Kim, J., Hoffman, D. M. and Banks, M. S. The zone of comfort: Predicting visual discomfort with stereo displays. J. Vis., 2011, 11(8), 11.
https://doi.org/10.1167/11.8.11

12. Vienne, C., Sorin, L., Blondé, L., Huynh-Thu, Q. and Mamassian, P. Effect of the accommodation-vergence conflict on eye movements. Vision Res., 2014, 100, 124–133.
https://doi.org/10.1016/j.visres.2014.04.017

13. Liversedge, S. P., Holliman, N. S. and Blythe, H. I. Binocular coordination in response to stereoscopic stimuli. In Stereoscopic Displays and Applications XX. Proc. SPIE, 2009, 7237, 72370M.
https://doi.org/10.1117/12.807251

14. Mon-Williams, M. and Tresilian, J. R. Some recent studies on the extraretinal contribution to distance perception. Perception, 1999, 28(2), 167–181.
https://doi.org/10.1068/p2737

15. Naceri, A., Chellali, R. and Hoinville, T. Depth perception within peripersonal space using head-mounted display. Presence: Teleoperators Virtual Environ., 2011, 20(3), 254–272.
https://doi.org/10.1162/PRES_a_00048

16. Page, D., Thomas, T., Kelley, S., Jones, P. G. and Miller, D. A. Vergence and accommodation in simulation and training with 3D displays. In Proceedings of Interservice/Industry Training, Simulation, and Education Conference 2014, 14147.

17. Livingston, M. A., Ellis, S. R., White, S. M., Feiner, S. K. and Lederer, A. Vertical vergence calibration for augmented reality displays. In Proceedings of IEEE Virtual Reality Conference, Alexandria, VA, USA, March 12–15, 2006, 287–288.

18. Rolland, J. P., Krueger, M. W. and Goon, A. Multifocal planes head-mounted displays. Appl. Opt., 2000, 39(19), 3209–3215.
https://doi.org/10.1364/AO.39.003209

19. Zabels, R., Osmanis, K., Narels, M., Gertners, U., Ozols, A., Rutenbergs, K. and Osmanis, I. AR displays: Next-generation technologies to solve the vergence–accommodation conflict. Appl. Sci., 2019, 9(15), 3147.
https://doi.org/10.3390/app9153147

20. Zhan, T., Xiong, J., Zou, J. and Wu, S.-T. Multifocal displays: review and prospect. PhotoniX, 2020, 1, 10.
https://doi.org/10.1186/s43074-020-00010-0

21. Livingston, M. A., Ai, Z. and Decker, J. W. A user study towards understanding stereo perception in head-worn augmented reality displays. In Proceedings of 2009 8th IEEE International Symposium on Mixed and Augmented Reality, Orlando, FL, USA, October 19–22, 2009. IEEE, 53–56.
https://doi.org/10.1109/ISMAR.2009.5336496

22. Bakeman, R. Recommended effect size statistics for repeated measures designs. Behav. Res. Methods, 2005, 37(3), 379–384.
https://doi.org/10.3758/BF03192707

23. Cohen, J. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Lawrence Erlbaum Associates, Hillsdale, NJ, 1988.

24. Ping, J., Weng, D., Liu, Y. and Wang, Y. Depth perception in shuffleboard: Depth cues effect on depth perception in virtual and augmented reality system. J. Soc. Inf. Disp., 2019, 28(2), 164–176.
https://doi.org/10.1002/jsid.840

25. Matsushima, E. H., Vaz, A. M., Cazuza, R. A. and Ribeiro Filho, N. P. Independence of egocentric and exocentric direction processing in visual space. Psychol. Neurosci., 2014, 7(3), 277–284.
https://doi.org/10.3922/j.psns.2014.050

26. Singh, G., Ellis, S. R. and Swan, J. E. The effect of focal distance, age, and brightness on near-field augmented reality depth matching. IEEE Trans. Vis. Comput. Graph., 2018, 26(2), 1385–1398.
https://doi.org/10.1109/TVCG.2018.2869729

27. Drascic, D. and Milgram, P. Perceptual issues in augmented reality. Proc. SPIE, 1996, 2653, 123–134.
https://doi.org/10.1117/12.237425

28. Lin, C. J., Caesaron, D. and Woldegiorgis, B. H. The effects of augmented reality interaction techniques on egocentric distance estimation accuracy. Appl. Sci., 2019, 9(21), 4652.
https://doi.org/10.3390/app9214652

29. Rousset, T., Bourdin, C., Goulon, C., Monnoyer, J. and Vercher, J.-L. Misperception of egocentric distances in virtual environments: More a question of training than a technological issue? Displays, 2018, 52, 8–20.
https://doi.org/10.1016/j.displa.2018.02.004

30. Eckert, M., Volmerg, J. S. and Friedrich, C. M. Augmented reality in medicine: Systematic and bibliographic review. JMIR Mhealth Uhealth, 2019, 7(4), e10967.
https://doi.org/10.2196/10967

31. Uppot, R. N., Laguna, B., McCarthy, C. J., De Novi, G., Phelps, A., Siegel, E. and Courtier, J. Implementing virtual and augmented reality tools for radiology education and training, communication, and clinical care. Radiology, 2019, 291(3), 570–580. 
https://doi.org/10.1148/radiol.2019182210

32. Douglas, D. B., Wilke, C. A., Gibson, J. D., Boone, J. M. and Wintermark, M. Augmented reality: Advances in diagnostic imaging. Multimodal Technol. Interact., 2017, 1(4), 29.
https://doi.org/10.3390/mti1040029

33. O’Shea, R. P., Blackburn, S. G. and Ono, H. Contrast as a depth cue. Vision Res., 1994, 34(12), 1595–1604.
https://doi.org/10.1016/0042-6989(94)90116-3

