A highly skilled robot hand can operate in the dark, just like us.

Think about what your hands do when you press the buttons on a TV remote at home at night, or handle unfamiliar cutlery and glassware in a restaurant. These skills rely on touch: your eyes stay on the show or the menu while your fingers do the work. Our hands and fingers are incredibly dexterous mechanisms, and extremely sensitive to boot.

Robotics researchers have long tried to create “real” dexterity in robot hands, but the goal has been frustratingly elusive. Automated grippers and suction cups can pick and place items, but more dexterous tasks such as assembling, inserting, reorienting, and packing remained the domain of human hands. However, driven by advances in both sensing technology and machine learning techniques for processing sensory data, the field of robotic manipulation is changing very rapidly.

The highly dexterous robot hand works even in the dark

Researchers at Columbia Engineering have demonstrated a highly dexterous robot hand that combines an advanced sense of touch with motor learning algorithms to achieve a high level of dexterity.

As a demonstration of skill, the team chose a challenging manipulation task: executing an arbitrarily large rotation of an irregularly shaped object in hand while always keeping it in a stable, secure grasp. This is very difficult because it requires constantly repositioning a subset of the fingers while the remaining fingers keep the object stable. Not only was the hand able to perform this task, it did so without any visual feedback at all, based solely on tactile sensing.

In addition to these new levels of dexterity, the hand works without any external cameras, so it is immune to lighting, occlusion, and similar issues. The fact that it does not rely on vision to manipulate objects means it can operate in extremely difficult lighting conditions that would confuse vision-based algorithms; it can even work in the dark.

“Some of the immediate uses could be in logistics and material handling, helping to alleviate supply chain problems like those that have plagued our economy in recent years, and in advanced manufacturing and assembly in factories,” said Matei Ciocarlie, associate professor in the Departments of Mechanical Engineering and Computer Science.

Utilizing optics-based tactile fingers

In previous work, Ciocarlie’s group collaborated with Ioannis Kymissis, professor of electrical engineering, to develop a new generation of optics-based tactile robot fingers. These were the first robot fingers to achieve sub-millimeter accuracy in contact localization while providing complete coverage of a complex multi-curved surface. In addition, the fingers’ compact packaging and low wire count allowed for easy integration into a complete robot hand.

Teach the hand to perform complex tasks

For this new work, led by Ciocarlie’s PhD researcher Gagan Khandate, the team designed and built a robot hand with five fingers and 15 independently actuated joints, with each finger equipped with the team’s touch-sensing technology. The next step was to test whether the tactile hand could perform complex manipulation tasks. To do this, the researchers turned to motor learning: the ability of a robot to learn new physical tasks through practice. In particular, they used a method called deep reinforcement learning, augmented by new algorithms they developed for efficiently exploring possible motor strategies.
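The scale of the actual training is far beyond a short example, but the core idea of reinforcement learning from touch, practicing actions and reinforcing the ones that keep the object moving toward the goal, can be sketched with a toy tabular version. Everything below (the discretized contact states, the actions, the reward, and the slip dynamics) is a hypothetical stand-in, not the team's environment or algorithm:

```python
import random

# Toy sketch of reinforcement learning from tactile input only.
# States, actions, rewards, and dynamics here are all illustrative,
# not the authors' actual environment or method.

random.seed(0)

N_CONTACT_STATES = 8   # discretized tactile contact patterns
N_ACTIONS = 4          # e.g. which finger to reposition next
GOAL_STATE = 7         # "object rotated to target" in this toy model

def step(state, action):
    """Hypothetical dynamics: one action per state advances the
    rotation; any other lets the object slip back a step."""
    if action == state % N_ACTIONS:          # the right finger to move
        next_state = min(state + 1, GOAL_STATE)
    else:
        next_state = max(state - 1, 0)       # object slips
    reward = 1.0 if next_state > state else -1.0
    return next_state, reward

# Tabular Q-learning: many episodes of practice, as the real hand
# practiced in simulation (there, with deep networks instead of a table).
Q = [[0.0] * N_ACTIONS for _ in range(N_CONTACT_STATES)]
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for episode in range(3000):
    state = 0
    for _ in range(50):
        if random.random() < epsilon:
            action = random.randrange(N_ACTIONS)
        else:
            action = max(range(N_ACTIONS), key=lambda a: Q[state][a])
        next_state, reward = step(state, action)
        Q[state][action] += alpha * (
            reward + gamma * max(Q[next_state]) - Q[state][action]
        )
        state = next_state
        if state == GOAL_STATE:
            break

# After practice, the greedy policy should rotate the object to the goal.
state, steps = 0, 0
while state != GOAL_STATE and steps < 20:
    action = max(range(N_ACTIONS), key=lambda a: Q[state][a])
    state, _ = step(state, action)
    steps += 1
print(state, steps)
```

Here the "tactile state" is just an index and the dynamics are invented, but the structure, observe touch, pick a finger action, reinforce what works, is the same trial-and-error loop that deep reinforcement learning scales up with neural networks and far richer sensor input.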

The robot completed nearly one year of practice in just hours of real time

Inputs to the motor learning algorithms consisted exclusively of the team’s tactile and proprioceptive data, without any vision. Using simulation as a training ground, the robot completed nearly one year of practice in just hours of real time, thanks to modern physics simulators and highly parallel processors. The researchers then transferred the manipulation skill trained in simulation to the real robot hand, which achieved the level of dexterity the team had hoped for. Ciocarlie noted, “The guiding goal of the field remains home assistance robots, the proving ground for true dexterity. In this study, we showed that robot hands can also be highly dexterous based on touch sensing alone. Once we also add visual feedback to the mix along with touch, we hope to achieve even more dexterity, and one day start to approach the replication of the human hand.”
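The "year of practice in hours" figure comes down to parallelism and faster-than-real-time simulation. A back-of-the-envelope calculation shows the rough scale involved; all of the numbers below are illustrative assumptions, not figures from the paper:

```python
# Back-of-the-envelope: how much parallel simulation does it take to
# compress roughly a year of practice into a few hours of wall-clock time?
# The training budget and per-environment speedup are assumed values.

SECONDS_PER_YEAR = 365 * 24 * 3600         # 31,536,000 s of experience

wall_clock_hours = 10                      # assumed: total training budget
per_env_speedup = 2.0                      # assumed: each sim runs 2x real time

wall_clock_seconds = wall_clock_hours * 3600
envs_needed = SECONDS_PER_YEAR / (wall_clock_seconds * per_env_speedup)
print(round(envs_needed))                  # prints 438
```

Hundreds of simulated hands practicing at once, each faster than real time, is exactly the regime that modern physics simulators on highly parallel processors make practical.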

The ultimate goal: to combine abstract intelligence with embodied intelligence

Ultimately, Ciocarlie notes, to be useful in the real world a robot needs both abstract, semantic intelligence (to understand how the world works conceptually) and embodied intelligence (the skill of physically interacting with the world). Large language models such as OpenAI’s GPT-4 or Google’s PaLM aim to provide the former, while the manipulation dexterity achieved in this study represents a complementary advance in the latter.

For example, when asked how to make a sandwich, ChatGPT writes a step-by-step plan in response, but it takes a skilled robot to take that plan and actually make the sandwich. In the same way, researchers hope that physically adept robots will be able to take semantic intelligence out of the purely virtual world of the internet and put it to good use on physical tasks in the real world, perhaps even in our homes.

The paper has been accepted for publication at the upcoming Robotics: Science and Systems conference (Daegu, Korea, July 10-14, 2023), and is currently available as a preprint.

Video: https://youtu.be/mYlc_OWgkyI
