A robot using touch to examine an object

We may do it subconsciously, but when we pick up an unfamiliar object, we feel our way around it in order to recover its shape through our sense of touch.

Touch is vital for manipulation – as important as sight – and, so far, even the most sophisticated robots can’t match human ability because their touch sensors are far less sensitive.

World-leading AI expert Professor Jeremy Wyatt and his team have successfully built a system to improve robot manipulation by recovering the shape of novel objects through tactile exploration.

‘Robot touch is really primitive – nowhere near as good as human touch – and, as a consequence, manipulation is difficult,’ explains Jeremy, Professor of Robotics and Artificial Intelligence whose specialism is creating algorithms to enable robots to work in uncertain and unfamiliar worlds. ‘If you go to a car factory such as Jaguar Land Rover, the first half of the factory floor is all robots, but the second is all people. That’s because the more nuanced manipulation required in the later stages of car production is too difficult for robots. They don’t have the necessary sense of touch, or control of force required.

‘As human beings we think sight is very important – and so it is. But for the grasping and manipulation of objects, touch is much more important than we might think; it’s just as important as sight.’

So, being able to build robots with a better sense of touch and the ability to control that touch will lead to the development of machines with increased dexterity.

Jeremy and Birmingham colleague Dr Claudio Zito, a Lecturer in AI and Robotics in the School of Computer Science, together with researchers from the University of Pisa, detail their latest achievement in designing next-generation AI-based manipulation systems in the paper ‘GPAtlasRRT: A local tactile exploration planner for recovering the shape of novel objects’. Published in the International Journal of Humanoid Robotics, it recently won the College of Engineering and Physical Sciences’ Paper of the Month award.

‘The baseline is that in order to recover the precise shape of an object, you have to have some notion of its rough shape,’ explains Jeremy. ‘Also, you need to know just how uncertain you are about different parts so as to know where to explore. Our paper deals with the problem of choosing where on the object to explore, depending on how uncertain the robot is about the shape of different parts of the object.’
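The core idea – touch next wherever the robot is least certain – can be sketched in a few lines. This is not the authors’ code; the per-patch uncertainty scores below are made-up values standing in for a shape model’s posterior variance:

```python
import numpy as np

# Hypothetical uncertainty scores for five patches of an object's surface
# (stand-ins for a shape model's posterior variance). Higher = less known.
patch_uncertainty = np.array([0.02, 0.35, 0.07, 0.91, 0.15])

# Send the finger to the patch the robot is least certain about.
next_patch = int(np.argmax(patch_uncertainty))
print(next_patch)  # prints 3: the patch with the highest uncertainty
```

Each touch then updates the scores, so the most uncertain patch changes as exploration proceeds.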

Claudio continues: ‘When you look at the object you see roughly the front surface. But some parts of the object are not visible – the back of a mug or a bowl, for example – so you’re trying to figure out, via tactile exploration, the shape of the “dark side of the moon”.’

Although researchers have devised ways to perform tactile exploration, Jeremy, Claudio and their colleagues are the first to use an algorithm that combines probabilistic AI to represent the surface uncertainty with state-of-the-art robot path-planning to plan progressive movements of the finger across the object surface. Because of this combination, their approach enables a robot to hold an object in one ‘hand’ and explore it with its other – which is also something new.

‘We’ve used a well-known mathematical trick to support tactile exploration for complex surfaces on which it is hard to plan motion,’ says Jeremy. ‘When you look at an atlas of the world, all the maps are presented as flat, even though the world itself is spherical. Imagine you are a 16th-century sailor. You have a flat map of a small part of a much larger curved world. If you want to explore, you use the map to plan, and then you move across the surface of the earth discovering new places. When you reach the edge of your chart you need to create another one so as to map your discoveries accurately. What we have done is similar to this: the probabilistic AI method, called a Gaussian Process, predicts the most likely shape of the unexplored surface of an object and the uncertainty in that prediction. Then a path planner, called an AtlasRRT, creates a tactile exploration path across this predicted surface, driving the finger towards the region of greatest uncertainty. This means that the robot explores efficiently.’
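The Gaussian Process half of this idea can be sketched with plain numpy on a one-dimensional ‘surface’ for clarity. This is an illustrative sketch, not the authors’ system: the touch measurements are invented, and the AtlasRRT planner that actually routes the finger across the predicted surface is too involved for a short example. The GP posterior variance is small near touched points and large in unexplored gaps, and the next touch is chosen where that variance peaks:

```python
import numpy as np

def rbf(a, b, length=0.4):
    """Squared-exponential kernel between two sets of 1D inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# Hypothetical touch measurements: positions x along the object, heights y.
x_touched = np.array([0.0, 0.2, 0.9])
y_touched = np.array([0.10, 0.15, 0.05])

# Candidate points where the finger could be sent next.
x_candidates = np.linspace(0.0, 1.0, 101)

# Standard GP posterior: predicted mean and variance at each candidate.
noise = 1e-6
K = rbf(x_touched, x_touched) + noise * np.eye(len(x_touched))
K_inv = np.linalg.inv(K)
K_star = rbf(x_candidates, x_touched)          # (101, 3) cross-covariances
mean = K_star @ K_inv @ y_touched              # most likely surface height
var = 1.0 - np.einsum('ij,jk,ik->i', K_star, K_inv, K_star)  # uncertainty

# Drive exploration towards the region of greatest uncertainty.
next_touch = x_candidates[np.argmax(var)]
print(float(next_touch))
```

With these made-up contacts, the chosen point lands in the large unexplored gap between x = 0.2 and x = 0.9, where the predictive variance is highest.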

As the robot explores along the planned path, the areas of uncertainty gradually diminish until it has recovered the object’s shape to a good degree of accuracy.

‘The goal – to get to the dark side of the moon – is achieved by moving from one chart to another to find which areas you’re uncertain about,’ adds Claudio. ‘This reduces the number of touches needed, so exploration takes far less time than probing the object at random. If you know where to go, you obtain better accuracy in a shorter period of time. And that’s what our system does. This will help to lead to next-generation robots that are more efficient and more dexterous.’