3D Printed Robotic Arm Performs Sign Language
A group of engineering students from the University of Antwerp in Belgium has created a 3D-printed robotic arm prototype that can perform sign language.
This is the Aslan Project’s sign language robot. It takes the form of a robotic arm with articulated fingers, controlled by dedicated software. (Aslan stands for Antwerp’s Sign Language Actuating Node.)
“Designed and built over three years, the arm can form the gestures that make up the letters and numbers of basic sign language,” said Adam Westlake of SlashGear.
“When the user types text into the software, the robotic hand translates the text into sign language,” said 3ders.org.
Project Aslan was started by a team of three engineering master’s students: Guy Fierens, Stijn Huys, and Jasper Slaets.
In a video, Huys said: “I was talking to friends about the shortage of sign language interpreters in Belgium, especially in Flanders for Flemish sign language.”
He said he wanted to work on robotics for his master’s, “so we combined the two.”
One key advantage is its low cost. “The project uses 3D printing combined with readily available components to make the robot affordable and easily manufacturable,” said 3D Hubs, a network of 3D printing services.
“This emphasis on using easily accessible materials is important, as the team plans to make the final design open source with the hope that the robot can be produced anywhere in the world where people are in need of a sign language interpreter,” Westlake said.
Technical details: The first prototype featured 25 3D-printed parts, which took 139 hours to print. Also in the mix were 16 servo motors, three motor controllers, an Arduino Due and other components. The robot was printed with PLA filament. Once printed, assembling the arm takes around ten hours, reports said. And once the mechanical design and software of the robot have reached a sufficiently advanced level, said 3D Hubs, all designs will be made open source.
The report from the 3D Hubs newsroom discussed how Aslan works: users connected to the network can send messages, which then activate the hand, elbow and finger joints to sign out the text.
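The reports do not include the robot’s control code, but the pipeline they describe (typed text in, one fingerspelled letter out at a time) can be pictured with a short, hypothetical Arduino-style sketch like the one below, in which each incoming character is looked up in a table of servo angles and played back on the 16 motors. Every pin number, joint angle and timing value here is an assumption for illustration, not the Aslan team’s actual firmware.

// Illustrative sketch only: the pin numbers, joint angles and serial protocol
// below are assumptions for explanation, not Project Aslan's published firmware.
#include <Servo.h>
#include <ctype.h>

const int NUM_SERVOS = 16;                     // the prototype uses 16 servo motors
const int SERVO_PINS[NUM_SERVOS] = {           // hypothetical pins on the Arduino Due
  2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 22, 23, 24, 25
};
Servo joints[NUM_SERVOS];                      // finger, wrist and elbow joints

// A pose is one set of joint angles; a full table would cover the whole
// fingerspelling alphabet and the numbers. The angles here are made up.
struct Pose { char letter; int angles[NUM_SERVOS]; };
const Pose POSES[] = {
  {'a', {170, 170, 170, 170, 30, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90}},
  {'b', { 10,  10,  10,  10, 150, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90}},
};

void applyPose(const Pose &p) {
  for (int i = 0; i < NUM_SERVOS; i++) {
    joints[i].write(p.angles[i]);              // move each joint to the letter's angle
  }
  delay(800);                                  // hold the letter long enough to be read
}

void setup() {
  Serial.begin(9600);                          // text arrives from the companion software
  for (int i = 0; i < NUM_SERVOS; i++) {
    joints[i].attach(SERVO_PINS[i]);
    joints[i].write(90);                       // start every joint at a neutral angle
  }
}

void loop() {
  if (Serial.available() > 0) {
    char c = tolower(Serial.read());           // read the next typed character
    for (const Pose &p : POSES) {
      if (p.letter == c) {                     // if the letter is in the table,
        applyPose(p);                          // spell it, then wait for the next one
        break;
      }
    }
  }
}

In the actual system, the dedicated software rather than a simple on-board table handles the translation; the team’s forthcoming open-source designs would be the reference for anyone reproducing the arm.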
One point made quite clear in reports is that the arm is not intended to replace interpreters. As Westlake said, it “can’t make the complicated gestures of sign language that require two hands.” The 3D Hubs newsroom article said the goal of Project Aslan was not to replace human sign language translators but to help ease the short supply of sign language interpreters.
The Aslan project team has an interesting list of avenues it wishes to explore. The future of the project involves research topics to be picked up by incoming master’s students: optimizing the design for a two-arm setup, adding facial expressions to the design, and investigating whether a webcam can be used to teach new gestures to the robot.
On the latter, 3ders.org said “a webcam could potentially be integrated in order to teach a more developed robot the nuances of facial expressions and shoulder movement.”