A University of Illinois research group is defining a new sub-area of mobile computing that they call "earable computing." The team believes that earphones will be the next significant milestone in wearable devices, and that the platform will support new hardware, software, and applications.
The Systems and Networking Research Group (SyNRG) at the University's Coordinated Science Laboratory believes tomorrow's earphones will continuously sense human behavior, run acoustic augmented reality, have Alexa and Siri whisper just-in-time information, track user motion and health, and offer seamless security, among many other capabilities.
Computer science Ph.D. student Zhijian Yang and other members of the SyNRG group, including Yu-Lin Wei and Liz Li, have published a series of papers in this area. In "Ear-AR: Indoor Acoustic Augmented Reality on Earphones," presented at MobiCom '20, the group looks at how smart earphone sensors can track human movement, and, depending on the user's location, play 3D sounds in the ear.
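The core idea of playing location-dependent 3D sound can be illustrated with a minimal spatialization sketch. This is not the Ear-AR system itself, just a textbook rendering of a mono signal at a chosen azimuth using interaural time and level differences (real systems use full head-related transfer functions); the head radius and level-difference values are illustrative assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, rough adult average (assumption)
FS = 44100               # sample rate in Hz

def spatialize(mono, azimuth_deg, fs=FS):
    """Render a mono signal at a given azimuth using interaural
    time and level differences -- a crude stand-in for full HRTFs.
    Negative azimuths are to the listener's left, positive to the right."""
    az = np.deg2rad(azimuth_deg)
    # Woodworth-style interaural time difference
    itd = HEAD_RADIUS / SPEED_OF_SOUND * (abs(az) + abs(np.sin(az)))
    delay = int(round(itd * fs))
    # Simple interaural level difference: attenuate the far ear up to ~6 dB
    ild = 10 ** (-abs(np.sin(az)) * 6 / 20)
    near = mono
    far = np.concatenate([np.zeros(delay), mono])[: len(mono)] * ild
    left, right = (near, far) if azimuth_deg < 0 else (far, near)
    return np.stack([left, right], axis=1)

# A 440 Hz tone rendered 60 degrees to the listener's right:
tone = np.sin(2 * np.pi * 440 * np.arange(FS) / FS)
stereo = spatialize(tone, azimuth_deg=60)
```

In an Ear-AR-style application, the azimuth would come from comparing the tracked user position and heading against the location of a virtual sound source.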
A second paper, "EarSense: Earphones as a Teeth Activity Sensor," also from MobiCom '20, looks at how earphones could sense facial and in-mouth activities such as teeth movements and taps, enabling a hands-free channel for communicating with a smartphone.
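One simple way to think about tap sensing is transient detection in a vibration signal picked up at the ear. The sketch below is a generic threshold detector on synthetic data, not the EarSense pipeline; the sampling rate, threshold, and refractory period are illustrative assumptions.

```python
import numpy as np

def detect_taps(vibration, fs, threshold=3.0, refractory_s=0.1):
    """Flag tap-like transients: samples whose magnitude exceeds
    `threshold` standard deviations of the signal, spaced at least
    `refractory_s` seconds apart. All parameters are illustrative."""
    sigma = np.std(vibration)
    candidates = np.flatnonzero(np.abs(vibration) > threshold * sigma)
    taps, last = [], -np.inf
    for idx in candidates:
        if idx - last > refractory_s * fs:
            taps.append(int(idx))
            last = idx
    return taps

# Synthetic sensor trace: low-level noise plus two sharp "teeth taps"
fs = 1000
rng = np.random.default_rng(0)
signal = 0.01 * rng.standard_normal(fs)
signal[200] += 1.0
signal[700] += 1.0
print(detect_taps(signal, fs))  # two taps detected, at samples 200 and 700
```

A real system would also have to reject chewing, speech, and footstep vibrations, which is where the research gets hard.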
A third publication, "Voice Localization Using Nearby Wall Reflections," investigates the use of algorithms to detect the direction of a sound. This means that if two people are having a conversation, one's earphones would be able to tune into the direction of the speaker's voice.
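The classic building block for detecting the direction of a sound is the time-difference-of-arrival (TDOA) between two microphones, estimated by cross-correlation. The sketch below shows that textbook two-microphone version on a synthetic click; it is not the paper's technique, which goes further by exploiting nearby wall reflections, and the microphone spacing is an assumed value.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def estimate_bearing(left, right, fs, spacing=0.15):
    """Estimate a source bearing (degrees from broadside) from the
    time-difference-of-arrival between two microphones, found via
    cross-correlation. Positive angles mean the source is nearer
    the `right` microphone. Spacing of 0.15 m is an assumption."""
    corr = np.correlate(left, right, mode="full")
    # numpy convention: peak index minus (len(right) - 1) gives the lag
    # by which `left` trails `right`, in samples
    lag = np.argmax(corr) - (len(right) - 1)
    tau = lag / fs
    sin_theta = np.clip(SPEED_OF_SOUND * tau / spacing, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Synthetic check: a click that reaches the right mic 10 samples earlier
fs = 48000
click = np.zeros(1024)
click[100] = 1.0
left = np.roll(click, 10)   # the left mic hears it 10 samples later
right = click
angle = estimate_bearing(left, right, fs)  # roughly 28 degrees to the right
```

With only the two earphone microphones, spacing is small and the geometry is ambiguous front-to-back; using wall reflections as virtual extra microphones is one way around that limitation.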
From University of Illinois at Urbana-Champaign