Gaze Analysis System for Immersive 360 Video for Preservice Teacher Education

Unified systems for multi-sensor devices, particularly eye tracking in Virtual Reality (VR), are intricate and often require listening to and streaming multichannel data. In this project, we propose a visual analysis framework that replicates a participant's viewing experience by interpreting head movements as rotations and rendering the point of gaze (POG) as an on-screen indicator. Our solution adds a system layer that processes and analyzes this multi-device data in near real time, connecting the data streams and enabling both near-real-time and subsequent offline playback of an entire VR eye-tracking session. Moreover, our method provides a ready-to-use solution for producing traditional eye-tracking visualizations. Finally, we apply three analysis metrics from prior education-technology research, higher gaze density, shorter fixation duration, and lower fixation-duration variance, to determine participants' expertise levels within the system. We systematically build a ubiquitous, multi-device eye-tracking solution that incorporates this approach. We evaluate the system's effectiveness through a real-world user study with sixty-four participants at both expert and novice levels, surveying them to ascertain the quality of the replicated experience. We demonstrate the application's significance and its potential to integrate the prior analysis metrics using the collected data; the data collection and analysis were approved by an IRB.
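As a hypothetical illustration of the three expertise metrics named above, the following Python sketch computes gaze density, mean fixation duration, and fixation-duration variance from a list of timestamped fixation records. The `Fixation` structure, the grid-cell definition of density, and the parameter names are assumptions for illustration, not the paper's actual data format or definitions.

```python
import statistics
from dataclasses import dataclass

# Hypothetical fixation record; the paper's actual data format is not specified here.
@dataclass
class Fixation:
    x: float            # normalized POG x in [0, 1] on the video frame
    y: float            # normalized POG y in [0, 1]
    duration_ms: float  # fixation duration in milliseconds

def expertise_metrics(fixations: list[Fixation], grid: int = 16) -> dict[str, float]:
    """Compute three expertise indicators: higher gaze density, shorter mean
    fixation duration, and lower fixation-duration variance are taken as
    signals of higher expertise."""
    durations = [f.duration_ms for f in fixations]

    # Gaze density here is the fraction of fixations landing in the single
    # most-visited cell of a grid x grid partition of the frame: a simple
    # concentration proxy (the paper may define density differently).
    counts: dict[tuple[int, int], int] = {}
    for f in fixations:
        cell = (min(int(f.x * grid), grid - 1), min(int(f.y * grid), grid - 1))
        counts[cell] = counts.get(cell, 0) + 1
    density = max(counts.values()) / len(fixations)

    return {
        "gaze_density": density,
        "mean_fixation_ms": statistics.mean(durations),
        "fixation_variance": statistics.variance(durations) if len(durations) > 1 else 0.0,
    }
```

Under this reading, a viewer whose fixations cluster in a few regions with consistently short durations would score as more expert than one whose gaze wanders with highly variable dwell times.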

Lenart, C., Ahadian, P., Yang, Y., Suo, S., Corsello, A., Kosko, K., & Guan, Q. (2023). Gaze analysis system for immersive 360 video for preservice teacher education. In A. El Saddik, T. Mei, & R. Cucchiara (Eds.), Proceedings of ACM Multimedia (pp. 8608-8616). Association for Computing Machinery (ACM). https://doi.org/10.1145/3581783.3613908