ShapeSonic: Sonifying Fingertip Interactions for Non-Visual Virtual Shape Perception

Jialin Huang, George Mason University, United States of America,
Alexa Siu, Adobe Research, United States of America,
Rana Hanocka, University of Chicago, United States of America,
Yotam Gingold, George Mason University, United States of America

SA Conference Papers '23: SIGGRAPH Asia 2023 Conference Papers, Sydney, NSW, Australia, December 2023

Computer graphics and virtual reality allow sighted users to model and perceive imaginary objects and worlds. However, these approaches are inaccessible to blind and visually impaired (BVI) users, since they rely primarily on visual feedback. We introduce ShapeSonic, a system designed to convey vivid 3D shape perception using purely audio feedback, or sonification. ShapeSonic tracks users’ fingertips in 3D and provides real-time sound feedback: the shape's geometry and sharp features (edges and corners) are expressed as sounds whose volumes modulate according to fingertip distance. ShapeSonic runs on a mass-produced, commodity hardware platform (the Oculus Quest). In a study with 15 sighted and 6 BVI users, we demonstrate the value of ShapeSonic for shape landmark localization and recognition. ShapeSonic users were able to quickly and relatively accurately “touch” points on virtual 3D shapes in the air.
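The distance-modulated volume described above could be sketched as follows. Note the exponential falloff and its constant are illustrative assumptions; the abstract states only that volume modulates with fingertip distance, not the exact mapping ShapeSonic uses:

```python
import math

def sonify_volume(distance_m, falloff_m=0.15):
    """Map fingertip-to-feature distance (meters) to a playback volume in [0, 1].

    Exponential falloff is a hypothetical choice: volume is loudest (1.0) when
    the fingertip touches the virtual surface and decays smoothly with distance.
    """
    return math.exp(-distance_m / falloff_m)

# Closer fingertips produce louder feedback.
v_contact = sonify_volume(0.0)   # loudest, at contact
v_near = sonify_volume(0.05)
v_far = sonify_volume(0.30)
```

In a real-time loop, each tracked fingertip's distance to the shape surface (and separately to edge/corner features) would be recomputed every frame and used to set the gain of the corresponding sound source.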

CCS Concepts: • Computing methodologies → Virtual reality; • Computing methodologies → Shape modeling; • Hardware → Sound-based input / output; • Hardware → Tactile and hand-based interfaces; • Human-centered computing → Accessibility technologies

Keywords: shape, perception, 3D, virtual reality, sonification, non-visual interfaces

ACM Reference Format:
Jialin Huang, Alexa Siu, Rana Hanocka, and Yotam Gingold. 2023. ShapeSonic: Sonifying Fingertip Interactions for Non-Visual Virtual Shape Perception. In SIGGRAPH Asia 2023 Conference Papers (SA Conference Papers '23), December 12--15, 2023, Sydney, NSW, Australia. ACM, New York, NY, USA, 10 pages.