A research group at Osaka University has developed a technology that lets androids dynamically express their mood states through facial movements. While androids can look strikingly realistic in photographs, observing them in person is often unsettling because it is difficult to read their emotional state. Previous approaches relied on a “patchwork method” of pre-arranged action scenarios, which made it hard to produce natural motion and smooth transitions between expressions. The new technology uses waveform movements to generate complex facial motion in real time, eliminating the need for elaborate action data and minimizing unnatural motion during transitions.

Lead author Hisashi Ishihara and his research group developed a dynamic facial expression synthesis technology that uses individual waves to represent facial gestures such as “breathing,” “blinking,” and “yawning.” Propagating these waves to related facial areas and overlaying them generates complex facial movements in real time. Waveform modulation then adjusts each individual waveform according to the robot’s internal state, so changes in mood are reflected instantly in facial movements. This dynamic facial expression synthesis aims to let robots exhibit livelier expressions and convey mood changes in response to their surroundings, including interactions with humans.
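To make the wave-superposition idea concrete, the following is a minimal sketch in Python of how such a synthesis loop might look. The gesture names, actuator names, propagation gains, and the mood-scaling rule are illustrative assumptions made for this example, not details of the Osaka University implementation.

```python
import math

# Hypothetical sketch of waveform-based facial motion synthesis.
# Each "gesture wave" is a slow oscillation with its own frequency (Hz)
# and baseline amplitude; it propagates to several facial actuators
# with different gains. All values below are illustrative assumptions.
GESTURE_WAVES = {
    "breathing": {"freq": 0.25, "amp": 1.0,
                  "gains": {"jaw": 0.2, "chest": 1.0, "eyelids": 0.05}},
    "blinking":  {"freq": 0.4,  "amp": 1.0,
                  "gains": {"eyelids": 1.0, "brows": 0.1}},
    "yawning":   {"freq": 0.05, "amp": 0.6,
                  "gains": {"jaw": 1.0, "eyelids": 0.4, "brows": 0.3}},
}

def modulate(amp: float, mood: float) -> float:
    """Waveform modulation: scale a wave's amplitude by an internal
    'mood' value in [-1, 1]. A higher value exaggerates the motion,
    a lower one damps it; the linear mapping is an assumption."""
    return amp * (1.0 + 0.5 * mood)

def actuator_commands(t: float, mood: float) -> dict:
    """Superpose all gesture waves into per-actuator commands at time t."""
    commands = {}
    for wave in GESTURE_WAVES.values():
        value = modulate(wave["amp"], mood) * math.sin(2 * math.pi * wave["freq"] * t)
        for actuator, gain in wave["gains"].items():
            commands[actuator] = commands.get(actuator, 0.0) + gain * value
    return commands

if __name__ == "__main__":
    # Example: sample the face state at 10 Hz for one second in a calm mood.
    for step in range(10):
        t = step / 10.0
        print(f"t={t:.1f}s", actuator_commands(t, mood=-0.3))
```

Because every gesture is just a waveform, shifting the mood parameter between synthesis steps changes the face immediately, without switching between pre-authored action scenarios.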

Senior author Koichi Osuka explains that advancing dynamic facial expression synthesis will allow robots capable of complex facial movements to convey emotion more expressively, enriching emotional communication between humans and robots. Ishihara adds that by building a system in which the android’s internal emotions are reflected in every detail of its actions, androids may come to be perceived as having a heart. This technology could enhance communication robots by allowing them to adaptively adjust and express emotions, enabling more natural exchanges of information with humans.

In short, the new technology lets androids display a wide range of facial expressions in real time without pre-arranged action scenarios. Through waveform movements and waveform modulation, the robots dynamically express their mood states in response to their internal state and surroundings, so they may be perceived as having internal emotions and as adapting their behavior to their mood. The result is more lifelike, humanlike interaction that enriches emotional communication between humans and robots.
