
AI helps robots collaborate with a gentle touch

Scientists at the University of Bristol are using AI to help two robot arms share the handling of objects – something they call bi-manual manipulation.

University of Bristol robots holding a crisp. Credit: Yijiong Lin

“With our system, we can train AI agents in a virtual world within a couple of hours to achieve bi-manual tasks that are tailored towards the touch,” said research engineer Yijiong Lin. “More importantly, we can directly apply these agents from the virtual world to the real world without further training. The tactile bi-manual agent can solve tasks even under unexpected perturbations and manipulate delicate objects in a gentle way.”

Tactile feedback comes from TacTips (see photo) – small high-resolution sensors that include a tiny camera that watches the way a soft contact dome distorts from the inside.
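The principle can be sketched in a few lines. This is a hypothetical illustration, not the TacTip software: the internal camera tracks markers on the soft dome, and their displacement from the rest pose approximates local deformation, i.e. contact.

```python
import numpy as np

def tactile_features(markers_rest, markers_now):
    """Illustrative TacTip-style feature extraction.

    markers_rest, markers_now: (N, 2) arrays of marker pixel positions
    seen by the internal camera. Returns the mean displacement magnitude
    (how hard the dome is pressed) and the mean displacement vector
    (roughly which way it is pushed).
    """
    disp = markers_now - markers_rest          # per-marker displacement
    mag = np.linalg.norm(disp, axis=1)         # deformation per marker
    return float(mag.mean()), disp.mean(axis=0)

# Toy check: contact near two of four markers shifts them sideways
rest = np.zeros((4, 2))
now = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 0.0], [0.0, 0.0]])
strength, direction = tactile_features(rest, now)
```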


In parallel, the researchers built a real-world tactile dual-arm robot, and an equivalent simulation of the two robot arms with their TacTips. Reward functions and a goal-update mechanism were designed for the virtual environment to encourage robot agents to achieve the bimanual tasks.
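A minimal sketch of those two ingredients (illustrative only, not the authors' code): a shaped reward that penalises distance to the goal while rewarding gentle contact, and a goal update that re-samples the target once it has been reached so training can continue.

```python
import numpy as np

def reward(obj_pos, goal_pos, contact_ok):
    """Hypothetical shaped reward: penalise distance of the object from
    the goal, with a small bonus for maintaining gentle two-arm contact."""
    dist = np.linalg.norm(obj_pos - goal_pos)
    return -dist + (0.1 if contact_ok else -0.1)

def update_goal(obj_pos, goal_pos, tol=0.01, rng=np.random.default_rng()):
    """Hypothetical goal update: once the object is within tolerance of
    the current goal, sample a new nearby goal; otherwise keep it."""
    if np.linalg.norm(obj_pos - goal_pos) < tol:
        return goal_pos + rng.uniform(-0.05, 0.05, size=goal_pos.shape)
    return goal_pos
```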


“The robot learns bimanual skills through deep reinforcement learning [Deep-RL],” according to the University. “It is designed to teach robots to do things by letting them learn from trial and error – the robot learns to make decisions by attempting various behaviours to achieve designated tasks.”
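Trial-and-error learning looks like this in miniature. The Bristol agents use deep RL with neural networks, but the same core update rule can be shown with tabular Q-learning on a toy one-dimensional "move to the goal" task (everything below is an illustrative sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, goal = 5, 4
q = np.zeros((n_states, 2))   # value estimates for actions: 0=left, 1=right

for episode in range(200):
    s = 0
    while s != goal:
        # epsilon-greedy: mostly exploit current knowledge, sometimes explore
        a = rng.integers(2) if rng.random() < 0.1 else int(q[s].argmax())
        s2 = max(0, min(n_states - 1, s + (1 if a == 1 else -1)))
        r = 1.0 if s2 == goal else -0.01        # designated task: reach goal
        # Q-learning update: nudge the estimate toward reward-plus-lookahead
        q[s, a] += 0.5 * (r + 0.9 * q[s2].max() - q[s, a])
        s = s2
```

After training, the greedy policy at every state moves right, toward the goal – the behaviour was never programmed in, only discovered by attempting actions and observing rewards.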

The AI agent has no vision, just proprioceptive feedback – knowing initial positions, the angles of the various arm joints, plus the tactile feedback.
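That observation can be pictured as a single flat vector fed to the policy. A sketch with illustrative names and sizes (a 6-joint arm and 4 tactile features per side are assumptions, not the paper's figures):

```python
import numpy as np

def build_observation(joints_left, joints_right, tactile_left, tactile_right):
    """Concatenate proprioceptive signals (joint angles of both arms)
    with tactile features into one vector for a policy network.
    No camera images are included: the agent is blind but can feel."""
    return np.concatenate([joints_left, joints_right,
                           tactile_left, tactile_right])

obs = build_observation(np.zeros(6), np.zeros(6), np.zeros(4), np.zeros(4))
```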

It worked, and the real-world arms could handle individual Pringles crisps without breaking them (see photo).

“Our system shows a promising approach with affordable software and hardware for learning bimanual behaviours with touch in simulation, which can be directly applied to the real world,” said fellow researcher Professor Nathan Lepora. “Our developed tactile dual-arm robot simulation allows further research on more different tasks as the code will be open-source, which is ideal for developing other downstream tasks.”

The team has branded its scheme Bi-Touch, and sees it being used in "fruit picking, domestic service" and, eventually, to "recreate touch in artificial limbs".

The work is described in 'Bi-Touch: Bimanual tactile manipulation with sim-to-real deep reinforcement learning', published in IEEE Robotics and Automation Letters.

Steve Bush

Steve Bush is the long-standing technology editor for Electronics Weekly, covering electronics developments for more than 25 years. He has a particular interest in the Power and Embedded areas of the industry. He also writes for the Engineer In Wonderland blog, covering 3D printing, CNC machines and miscellaneous other engineering matters.
