AI “Mini-Brains” Help Robots Detect Damage and Self-Repair
Robots should be inspected, repaired, and maintained on a regular basis: at least once a year, and usually every six months if they are in full-time use. But when maintenance happens that infrequently, minor performance issues can go unaddressed even if major repairs aren’t required.
Now, using a brain-inspired approach, scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed a way to give robots artificial intelligence (AI) that lets them recognize pain and self-repair when damaged.
The system uses AI-enabled sensor nodes to process and respond to ‘pain’ arising from pressure exerted by a physical force. It also allows the robot to detect and repair minor damage on its own, without human intervention.
Currently, robots use a network of sensors to generate information about their immediate environment. For example, a disaster-rescue robot uses camera and microphone sensors to locate a survivor under debris and then pulls the person out with guidance from touch sensors on its arms. A factory robot working on an assembly line uses vision to guide its arm to the right location and touch sensors to determine whether an object is slipping as it is picked up.
Today’s sensors typically do not process information themselves but send it to a single large, powerful central processing unit, where learning occurs. As a result, existing robots are usually heavily wired, which results in delayed response times. They are also susceptible to damage that requires maintenance and repair, which can be lengthy and costly.
The new NTU approach embeds AI into the network of sensor nodes, which are connected to multiple small, less powerful processing units that act like mini-brains distributed on the robotic skin. This means learning happens locally, and the robot’s wiring requirements and response time are reduced by five to ten times compared with conventional robots, the scientists say.
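To make the contrast concrete, here is a minimal software sketch of the distributed idea described above: each sensor node makes its own threshold decision about whether a pressure reading counts as ‘pain’ and only reports a compact event to the central controller, instead of streaming raw data for central processing. The class names, thresholds, and reaction logic are illustrative assumptions, not the NTU team’s actual implementation.

```python
# Illustrative sketch (not NTU's code): sensor nodes as local "mini-brains"
# that process pressure readings on the spot and report only pain events.

from dataclasses import dataclass

@dataclass
class PainEvent:
    node_id: int
    intensity: float  # normalized 0..1

class SensorNode:
    """A local 'mini-brain': thresholds and classifies pressure locally."""

    def __init__(self, node_id: int, pain_threshold: float = 0.6):
        self.node_id = node_id
        self.pain_threshold = pain_threshold

    def process(self, pressure: float) -> PainEvent | None:
        # Local decision: only pressures above the threshold count as 'pain'.
        if pressure >= self.pain_threshold:
            return PainEvent(self.node_id, min(pressure, 1.0))
        return None  # nothing worth reporting; no traffic to the controller

class Controller:
    """Central unit coordinates a reflex instead of crunching raw sensor data."""

    def react(self, event: PainEvent) -> str:
        return f"withdraw limb near node {event.node_id} (intensity {event.intensity:.2f})"

# Usage: three skin nodes, one of which detects a painful press.
nodes = [SensorNode(i) for i in range(3)]
controller = Controller()
for node, pressure in zip(nodes, [0.2, 0.9, 0.4]):
    event = node.process(pressure)
    if event:
        print(controller.react(event))
```

Because only the second node crosses its threshold, the controller receives a single small event rather than three continuous sensor streams, which is the kind of wiring and latency saving the researchers describe.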
Combining the system with a type of self-healing ion gel material means that the robots, when damaged, can recover their mechanical functions without human intervention.
The NTU scientists’ research was published in Nature Communications in August.
Co-lead author of the study, Associate Professor Arindam Basu from the School of Electrical & Electronic Engineering said, “For robots to work together with humans one day, one concern is how to ensure they will interact safely with us. For that reason, scientists around the world have been finding ways to bring a sense of awareness to robots, such as being able to ‘feel’ pain, to react to it, and to withstand harsh operating conditions. However, the complexity of putting together the multitude of sensors required and the resultant fragility of such a system is a major barrier for widespread adoption.”
Basu, who is a neuromorphic computing expert, added, “Our work has demonstrated the feasibility of a robotic system that is capable of processing information efficiently with minimal wiring and circuits. By reducing the number of electronic components required, our system should become affordable and scalable. This will help accelerate the adoption of a new generation of robots in the marketplace.”
Robust system enables injured robot to self-repair
To teach the robot how to recognize pain and learn damaging stimuli, the research team fashioned memtransistors, which are brain-like electronic devices capable of memory and information processing, as artificial pain receptors and synapses.
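A rough way to picture what an artificial pain receptor with built-in memory does is a software element that both responds to a stimulus and remembers past noxious ones, so repeated damaging inputs produce a stronger response. The toy model below is only an analogy under that assumption; it does not reproduce the memtransistor device physics used in the study, and all names and parameters are hypothetical.

```python
# Toy analogue of an artificial pain receptor with memory: repeated noxious
# stimuli are 'learned' and amplify the pain signal for later ones.
# Illustrative only; not the memtransistor devices from the NTU study.

class ArtificialPainReceptor:
    def __init__(self, threshold: float = 0.5, decay: float = 0.95):
        self.threshold = threshold
        self.memory = 0.0   # analogue of a stored synaptic weight
        self.decay = decay  # memory slowly fades without new stimuli

    def stimulate(self, intensity: float) -> float:
        """Return a pain signal; damaging stimuli are remembered and amplify later ones."""
        self.memory *= self.decay
        if intensity >= self.threshold:
            self.memory += intensity - self.threshold  # remember the damaging stimulus
        # Output combines the immediate stimulus with what the receptor remembers.
        return max(0.0, intensity - self.threshold) * (1.0 + self.memory)

# Usage: a gentle touch produces no pain; repeated hard presses produce
# a growing response as the receptor 'learns' the damaging stimulus.
receptor = ArtificialPainReceptor()
for step, poke in enumerate([0.3, 0.8, 0.8, 0.8]):
    print(f"step {step}: pain = {receptor.stimulate(poke):.3f}")
```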
Through lab experiments, the research team demonstrated how the robot was able to learn to respond to injury in real time. They also showed that the robot continued to respond to pressure even after damage, proving the robustness of the system.
When injured by a cut from a sharp object, the robot quickly loses mechanical function. But the molecules in the self-healing ion gel begin to interact, causing the robot to stitch its ‘wound’ together and restore its function while maintaining high responsiveness.
First author of the study, Rohit Abraham John, who is also a Research Fellow at the School of Materials Science & Engineering at NTU, said, “The self-healing properties of these novel devices help the robotic system to repeatedly stitch itself together when ‘injured’ with a cut or scratch, even at room temperature. This mimics how our biological system works, much like the way human skin heals on its own after a cut.
“In our tests, our robot can ‘survive’ and respond to unintentional mechanical damage arising from minor injuries such as scratches and bumps, while continuing to work effectively. If such a system were used with robots in real world settings, it could contribute to savings in maintenance.”
Associate Professor Nripan Mathews, who is co-lead author and from the School of Materials Science & Engineering at NTU, said, “Conventional robots carry out tasks in a structured programmable manner, but ours can perceive their environment, learning and adapting behaviour accordingly. Most researchers focus on making more and more sensitive sensors, but do not focus on the challenges of how they can make decisions effectively. Such research is necessary for the next generation of robots to interact effectively with humans.
“In this work, our team has taken an approach that is off-the-beaten path, by applying new learning materials, devices and fabrication methods for robots to mimic the human neuro-biological functions. While still at a prototype stage, our findings have laid down important frameworks for the field, pointing the way forward for researchers to tackle these challenges.”
Building on their previous body of work on neuromorphic electronics, such as using light-activated devices to recognize objects, the NTU research team is now looking to collaborate with industry partners and government research labs to enhance their system for larger-scale applications.
Source: NTU Singapore