Innovative 'SonicSense' Technology Gives Robots Human-Like Abilities to Feel and Identify Objects

In a major stride towards humanizing robotic interactions, researchers at Duke University have unveiled a revolutionary system called SonicSense. This cutting-edge technology equips robots with the ability to 'hear' and 'feel' objects using acoustic vibrations, similar to human sensory perception.

The breakthrough research will be presented at the Conference on Robot Learning (CoRL 2024), taking place Nov. 6-9 in Munich, Germany.

"This is only the beginning. In the future, we envision SonicSense being used in more advanced robotic hands with dexterous manipulation skills, allowing robots to perform tasks that require a nuanced sense of touch," Boyuan Chen, an assistant professor of mechanical engineering & materials science and computer science at Duke, said in a news release.

The SonicSense system features a robotic hand with four fingers, each embedded with a contact microphone in the fingertip. These microphones detect and record vibrations when the robot interacts with various objects through tapping, grasping or shaking.

This capability closely mimics how humans interact with the physical world, enabling robots to identify objects from acoustic cues while filtering out ambient noise for more precise analysis.
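To make the idea concrete, here is a minimal sketch of the first step: capturing a fingertip vibration signal and band-pass filtering it to suppress ambient noise. Everything below, the sample rate, the band edges and the function names, is an illustrative assumption rather than a detail from the Duke system.

```python
# Hypothetical sketch: isolating contact vibrations from one fingertip
# microphone. The sample rate and band edges are illustrative assumptions,
# not parameters reported by the Duke team.
import numpy as np
from scipy.signal import butter, sosfiltfilt

SAMPLE_RATE = 44_100  # Hz, typical for commodity audio hardware

def bandpass(signal: np.ndarray, low_hz: float = 50.0, high_hz: float = 8_000.0) -> np.ndarray:
    """Suppress low-frequency rumble and out-of-band ambient noise,
    keeping the band where tap and scratch vibrations carry most energy."""
    sos = butter(4, [low_hz, high_hz], btype="band", fs=SAMPLE_RATE, output="sos")
    return sosfiltfilt(sos, signal)

raw = np.random.randn(SAMPLE_RATE)  # stand-in for one second of raw fingertip audio
clean = bandpass(raw)
```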

"Robots today mostly rely on vision to interpret the world," lead author Jiaxun Liu, a first-year doctoral student in Chen's laboratory, said in the news release. "We wanted to create a solution that could work with complex and diverse objects found on a daily basis, giving robots a much richer ability to 'feel' and understand the world."

Using acoustic vibrations to understand object properties is not entirely new, but SonicSense significantly enhances the technique. While previous attempts used a single finger and struggled in noisy environments, SonicSense uses four fingers and sophisticated AI algorithms, allowing it to accurately interpret complex objects in various conditions.
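As a rough illustration of what the multi-finger advantage could look like in code, the hypothetical sketch below fuses four channels of vibration data into a single feature vector that a learned classifier could consume. The feature choice and the fusion-by-concatenation step are assumptions for illustration, not the authors' architecture.

```python
# Hypothetical sketch: fusing the four fingertip channels into one feature
# vector for an object classifier. The feature choice (coarse log spectra)
# and the concatenation step are illustrative, not the paper's model.
import numpy as np

def spectral_features(channel: np.ndarray, n_bins: int = 64) -> np.ndarray:
    """Coarse log-magnitude spectrum of one finger's vibration signal."""
    spectrum = np.abs(np.fft.rfft(channel))
    bins = np.array_split(spectrum, n_bins)
    return np.log1p(np.array([b.mean() for b in bins]))

def fuse_fingers(channels: list[np.ndarray]) -> np.ndarray:
    """Concatenate per-finger features: four microphones give four views of
    the same interaction, which is what helps in noisy environments."""
    return np.concatenate([spectral_features(c) for c in channels])

# features = fuse_fingers(four_recordings)  # a (4 * 64,) vector for a classifier
```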

"SonicSense gives robots a new way to hear and feel, much like humans, which can transform how current robots perceive and interact with objects," Chen added. "While vision is essential, sound adds layers of information that can reveal things the eye might miss."

The practical applications of SonicSense are vast. For instance, the technology enables a robot to count the number of dice in a box by shaking it or to determine the amount of liquid in a bottle. Moreover, it can reconstruct an object's 3D shape and identify its material composition by tapping on it, even if the object has complex geometries or is made from multiple materials.
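The physics behind the tapping trick is intuitive: stiff materials tend to ring at higher frequencies than soft ones. The toy example below makes that idea concrete with a crude dominant-frequency check; the cutoffs are invented for illustration, and SonicSense itself relies on learned AI models rather than hand-set thresholds.

```python
# Toy, hypothetical example of why tapping reveals material: stiffer objects
# ring at higher frequencies. The frequency cutoffs are invented for
# illustration; SonicSense uses learned models, not thresholds.
import numpy as np

def dominant_frequency(tap: np.ndarray, sample_rate: int = 44_100) -> float:
    """Frequency (Hz) of the strongest spectral component in a tap recording."""
    spectrum = np.abs(np.fft.rfft(tap))
    freqs = np.fft.rfftfreq(len(tap), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])

def guess_material(tap: np.ndarray) -> str:
    f = dominant_frequency(tap)
    if f > 4_000:   # hard, stiff surfaces ring high (illustrative cutoff)
        return "metal or ceramic"
    if f > 1_000:
        return "hard plastic or wood"
    return "soft material"
```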

A critical aspect of SonicSense's development is its ability to perform in real-world settings, as opposed to the controlled lab environments typically used to develop robotic sensing technologies.

"While most datasets are collected in controlled lab settings or with human intervention, we needed our robot to interact with objects independently in an open lab environment," Liu added. "It's difficult to replicate that level of complexity in simulations. This gap between controlled and real-world data is critical, and SonicSense bridges that by enabling robots to interact directly with the diverse, messy realities of the physical world."

Looking ahead, the research team aims to enhance SonicSense's capabilities, including integrating object-tracking algorithms so the system can better handle dynamic environments. The system is also cost-effective, built from commercially available contact microphones similar to those used by musicians, which adds to its potential for widespread adoption.

The development of more sophisticated robotic hands is also on the horizon, enabling SonicSense to perform tasks requiring delicate and precise touch.

"We're excited to explore how this technology can be further developed to integrate multiple sensory modalities, such as pressure and temperature, for even more complex interactions," added Chen.

This pioneering work could very well shape the future of robotic interaction, bringing machines a step closer to human-like adaptability and intuition.
