Walking Balance Assessment with Eye-tracking and Spatial Data Visualization

Zhu Wang, Anat Lubetzky, and Ken Perlin. 2021. Walking Balance Assessment with Eye-tracking and Spatial Data Visualization. In ACM SIGGRAPH 2021 Immersive Pavilion (SIGGRAPH ’21). Association for Computing Machinery, New York, NY, USA, Article 10, 1–2. DOI:https://doi.org/10.1145/3450615.3464533

See the full publication here.

Author(s): Zhu Wang, Anat Lubetzky, and Ken Perlin

Virtual reality (VR)-based assessment systems can simulate diverse real-life scenarios and help clinicians assess participants’ performance in controlled functional contexts. Our previous work demonstrated an assessment paradigm that provided multi-sensory stimuli and cognitive load, and quantified walking balance during obstacle negotiation via motion capture and pressure sensing. However, two gaps had to be filled to make it more clinically relevant: (1) it required complex offline data processing with external statistical-analysis software, and (2) it used motion tracking but overlooked eye movement. We therefore present a novel walking balance assessment system with eye tracking, to investigate the role of eye movement in walking balance, and spatial data visualization, to better interpret and understand the experimental data. The spatial visualization includes instantaneous in-situ VR replay of gaze, head, and feet, as well as data plots of the outcome measures. The system fills a need by providing eye tracking and intuitive real-time feedback in VR to experimenters, clinicians, and participants.
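As a rough illustration of how in-situ replay of gaze, head, and feet might work, the sketch below (not the authors' implementation; all names and data shapes are assumptions) stores time-stamped tracking samples during a trial and serves the nearest recorded sample for any requested playback time:

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class Sample:
    """One hypothetical tracking frame: time plus gaze/head/feet data."""
    t: float                 # seconds since trial start
    gaze: tuple              # gaze direction, e.g. (x, y, z)
    head: tuple              # head position, e.g. (x, y, z)
    feet: tuple              # (left_foot_pos, right_foot_pos)

class ReplayBuffer:
    """Records samples in time order and answers playback queries
    with the sample closest to the requested time (a simple sketch;
    a real system might interpolate between neighboring samples)."""

    def __init__(self):
        self.samples = []

    def record(self, sample: Sample) -> None:
        # Assumes samples arrive in increasing time order.
        self.samples.append(sample)

    def at(self, t: float) -> Sample:
        times = [s.t for s in self.samples]
        i = bisect_left(times, t)
        if i == 0:
            return self.samples[0]
        if i == len(self.samples):
            return self.samples[-1]
        before, after = self.samples[i - 1], self.samples[i]
        # Pick whichever neighbor is closer in time to t.
        return before if t - before.t <= after.t - t else after
```

During replay, a VR scene could step a playback clock and query `at(t)` each frame to reposition gaze, head, and foot avatars along the recorded trajectory.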

