
Cornell University researchers have developed two new technologies, GazeTrak and EyeEcho, that use sound rather than cameras to track eye movements and read facial expressions. Both rely on sonar-like sensing: a speaker emits acoustic signals, and microphones capture the reflections, from which tiny movements of the eyes and facial muscles can be inferred. GazeTrak tracks gaze using one speaker and four microphones placed around each eye frame of a pair of glasses, while EyeEcho reads facial expressions with just one speaker and one microphone per side, aimed at the wearer's cheek. These technologies could eventually be built into eyewear and VR headsets for hands-free video calls and avatar interactions.
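To make the sonar-like idea concrete, here is a minimal toy sketch (not Cornell's actual pipeline; all names and parameters are hypothetical) of how an emitted chirp and its recorded echo can be cross-correlated to estimate a round-trip delay, the kind of acoustic measurement from which small changes in skin or eye position could be inferred:

```python
# Toy sonar-style echo-delay estimation (illustrative only).
# A speaker emits a short chirp; a microphone records the reflection;
# cross-correlation recovers the round-trip delay in samples.
import math

RATE = 48_000       # sample rate in Hz (hypothetical)
CHIRP_LEN = 256     # length of the emitted chirp in samples

def chirp(n):
    """Linear frequency sweep from 2 kHz to 8 kHz."""
    return [math.sin(2 * math.pi * (2000 + 6000 * i / n) * i / RATE)
            for i in range(n)]

def simulate_echo(signal, delay, attenuation=0.5, total=1024):
    """Simulated microphone input: the chirp reflected back,
    attenuated, arriving `delay` samples later."""
    out = [0.0] * total
    for i, s in enumerate(signal):
        if i + delay < total:
            out[i + delay] += attenuation * s
    return out

def estimate_delay(received, template):
    """Lag of maximum cross-correlation with the emitted chirp."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(received) - len(template) + 1):
        score = sum(template[i] * received[lag + i]
                    for i in range(len(template)))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

tx = chirp(CHIRP_LEN)
rx = simulate_echo(tx, delay=100)
print(estimate_delay(rx, tx))  # recovers the simulated delay of 100
```

In a real system, the received signal would contain noise and overlapping reflections from several skin surfaces, so the raw echoes are typically fed to a learned model rather than a single correlation peak.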

The systems are small, inexpensive, and low-powered, making them suitable for devices like Meta's smart glasses or VR headsets that rely on eye tracking for navigation and control. A machine-learning pipeline processes the recorded sound to infer where the user is looking or what expression they are making. In virtual reality, this could enable avatars with more detailed facial expressions and gaze movements, enhancing interactions with other users. According to Cornell, no other current smart glasses can continuously track facial expressions the way EyeEcho does.

The researchers estimate that these technologies could run for several hours on a typical smart-glasses battery, or a full day on a VR headset battery, and since the systems are still prototypes, they expect battery life to improve with further development. The ability to track eye movements and facial expressions using sound could change how users interact with VR environments and open up new possibilities for hands-free communication through avatars, offering a promising alternative to camera-based surveillance for tracking users' gestures and gaze.

© 2024 Globe Timeline. All Rights Reserved.