We present a state of the art and scoping review of the literature examining embodied information behaviors, as reflected in shared gaze interactions, within co-present extended reality (XR) experiences. The recent proliferation of consumer-grade head-mounted XR displays, situated at multiple points along the Reality–Virtuality Continuum, has increased their application in social, collaborative, and analytical scenarios that utilize data and information at multiple scales. Shared gaze represents a modality for synchronous interaction in these scenarios, yet the implementation of shared eye gaze in co-present extended reality contexts remains poorly understood. We use gaze behaviors as a proxy for examining embodied information behaviors. This review examines the application of eye-tracking technology to facilitate interaction in multiuser XR by sharing a user's gaze, identifies salient themes in research published since 2013 in this context, and identifies patterns within these themes relevant to embodied information behavior in XR. We review a corpus of 50 research papers investigating the application of shared gaze and gaze tracking in XR, assembled using the SALSA framework and searches of multiple databases. The publications were reviewed for study characteristics, technology types, use scenarios, and task types. We construct a state of the field and highlight opportunities for innovation and challenges for future research directions.
Kathryn Hays, Arturo Barrera, Lydia Ogbadu-Oladapo, Olumuyiwa Oyedare, Julia Payne, Mohotarema Rashid, Jennifer Stanley, Lisa Stocker, Christopher Lueg, Michael Twidale, Ruth West, "A state of the art and scoping review of embodied information behavior in shared, co-present extended reality experiences," in Proc. IS&T Int'l. Symp. on Electronic Imaging: Engineering Reality of Virtual Reality, 2022, pp. 298-1–298-19, https://doi.org/10.2352/EI.2022.34.12.ERVR-298