Robot Life: A User's Manual

Following his Inaugural Lecture, Professor Jeremy Wyatt discusses his research.

Professor Jeremy Wyatt is a leading robotics scientist – but the field of artificial intelligence almost lost him to the priesthood.

‘My first degree was in theology and I originally intended to join the Church,’ he says, adding with a wry smile: ‘I’d have made a terrible vicar!’

Jeremy, Professor of Robotics and Artificial Intelligence, then realised his true vocation lay in engineering.

‘I studied theology at Bristol, but I didn’t really know what I wanted to do with my life. I was offered various options, but essentially I think it came down to the fact that I realised I should have been an engineer. My dad is an engineer; one of my uncles was an engineer and so were both my grandfathers. But my dad always used to tell me engineers weren’t valued in this country. Eventually I woke up and realised that despite that, it was what I would enjoy the most.’

But then came another twist. ‘The mother of a friend said to me: “There’s this subject called artificial intelligence (AI) that I think you might be interested in”. She lived in New Jersey and posted me by airmail a Reader’s Digest book about AI. I read it cover-to-cover in about 45 minutes; I was hooked.’

That was 23 years ago – and Jeremy hasn’t looked back. After gaining an MSc in Intelligent Knowledge-based Systems from the University of Sussex, he moved to the University of Edinburgh for his doctorate, working on how agents such as robots or animals should gather information during learning to perform their task.

Since coming to Birmingham as a lecturer in 1997, he has helped to drive robotics and AI to new, exciting levels. A mind-blowing example is Boris, one of the first robots in the world to be able to intelligently and dexterously pick up and manipulate unfamiliar objects in a humanlike grasp.

His unveiling at the British Science Festival, held at the University in September, made national and international headlines.

Developed by Jeremy, his colleague Prof. Ales Leonardis, and their team as part of the multi-university European PaCMan project, Boris can grasp objects in various ways. These include a power grip, using his whole hand to curve around an object, and a pinch grip, which uses two or three fingers. Boris is then able to learn the different grips and adapt them to other unfamiliar objects without breaking or dropping them.

The goal, says Jeremy, is for Boris to be loading a dishwasher by next year.

‘It is fairly commonplace to program robots to pick up particular objects and move them around – factory production lines are a good example of this – but when those objects vary in size or shape, robots tend to get clumsy,’ he explains. ‘The system we’ve developed allows the robot to assess the object and generate hundreds of different grasp options. It also selects the right type of grasp on the fly. That means it’s able to make choices about the best grasp for the object it has been told to pick up and it doesn’t need to be continually re-trained each time the object changes.’
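The generate-then-select idea Jeremy describes can be sketched in a few lines of Python. Everything here – the grip names, the scoring rule, the parameters – is invented for illustration and is not the PaCMan system itself: the point is simply that the robot proposes many candidate grasps for a new object and picks the best-scoring one on the fly, rather than being re-trained per object.

```python
import random

def generate_grasps(n=100, seed=0):
    """Propose n hypothetical grasp candidates.

    Each candidate is a (grip_type, aperture_cm) pair; the types and
    aperture range are illustrative assumptions, not PaCMan's.
    """
    rng = random.Random(seed)
    return [(rng.choice(["power", "pinch"]), rng.uniform(2.0, 20.0))
            for _ in range(n)]

def score_grasp(grip_type, aperture_cm, obj_width_cm):
    """Toy score: prefer a hand opening slightly wider than the object,
    plus a bonus for a pinch grip on small objects and a power grip on
    large ones. A real system would score stability from 3D sensing."""
    fit = -abs(aperture_cm - (obj_width_cm + 1.0))  # closer fit is better
    if obj_width_cm < 5.0:
        fit += 2.0 if grip_type == "pinch" else 0.0
    else:
        fit += 2.0 if grip_type == "power" else 0.0
    return fit

def select_grasp(obj_width_cm):
    """Pick the best-scoring candidate 'on the fly' for a new object."""
    candidates = generate_grasps()
    return max(candidates, key=lambda g: score_grasp(g[0], g[1], obj_width_cm))

grip, aperture = select_grasp(3.0)  # e.g. a small, mug-sized object
```

Because the scoring happens per object, changing the object only changes the inputs to `select_grasp`, not the trained system – which is the property Jeremy highlights.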

Another robot Jeremy was involved with, Bob, also hit the headlines this year when he became the first autonomous robot to be deployed in a working office environment to do a real job. Bob is part of a project called Strands, led by long-time collaborator Dr Nick Hawes. In May Bob joined the workforce of security solutions group G4S at its headquarters in Gloucestershire, where for two weeks he patrolled the offices, mapping the rooms in 3D and storing the information on his internal hard-drive. He then analysed the data and if he found any anomalies since his previous visit, he reported them to his human colleagues.

Although Boris and Bob are among the most sophisticated robots ever created, their capabilities are still a world away from those of their human counterparts. They can’t think subtly or intuitively, for instance; they can’t even walk up a flight of stairs.

Yet despite real-life robots still being a long way from those depicted in sci-fi blockbuster movies, the work being done by Jeremy and his colleagues in the School of Computer Science’s Intelligent Robotics Laboratory (IRLab) is taking them to another level – making them significantly ‘smarter’ so that instead of being capable of performing only one specific task, they can carry out multiple or complex activities.

This means they can be used not to replace humans, but to work alongside us in unstructured settings such as offices, and in hazardous environments such as deep-sea exploration.

However, although the intelligence and robustness of robots have increased hugely over the past 15 years, there remains a plethora of challenges to overcome before they become properly integrated into the human world. Yet again, Jeremy and his team are stepping up to the challenges.

‘What we want is robots that can operate in unstructured, uncertain and unfamiliar environments – and we’ve been developing technologies that address some of these problems.’

Dora the Explorer is an example of a robot designed to deal with just such environments. The culmination of a four-year project called CogX, Dora is a task-driven robot capable of ‘epistemic planning’, which means she can reason about what she knows, what she doesn’t know and how what she knows changes under the actions she performs. This level of AI – a combination of vision and laser-based mapping – incorporates ways to deal with uncertainty, such as where a book or a CD might be located (in a bedroom or a lounge).
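The flavour of that reasoning can be illustrated with a minimal sketch: the robot keeps a probability for each room where the target object might be, searches the most likely room first, and updates its beliefs when a search comes up empty. The rooms, probabilities and update rule below are invented for illustration – this is not the CogX/Dora planner itself.

```python
def normalise(beliefs):
    """Rescale probabilities so they sum to one."""
    total = sum(beliefs.values())
    return {room: p / total for room, p in beliefs.items()}

def search_plan(beliefs):
    """Return rooms in the order to search them: most likely first."""
    return sorted(beliefs, key=beliefs.get, reverse=True)

def observe_absent(beliefs, room):
    """The object was not found in `room`: drop that hypothesis
    and renormalise what the robot now knows."""
    updated = {r: p for r, p in beliefs.items() if r != room}
    return normalise(updated)

# Illustrative prior: the book is probably in the bedroom.
beliefs = {"bedroom": 0.6, "lounge": 0.3, "kitchen": 0.1}
plan = search_plan(beliefs)                 # search the bedroom first
beliefs = observe_absent(beliefs, "bedroom")
# After an empty bedroom, the lounge becomes the best guess:
# 0.3 / (0.3 + 0.1) = 0.75
```

The key point is the `observe_absent` step: the robot’s plan changes because what it knows changes under its own actions, which is what ‘epistemic planning’ refers to.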

Another major challenge is manipulation, says Jeremy, who recently delivered his inaugural lecture, entitled Robot Life: A User’s Manual. ‘Once you begin to have contact with the world, everything changes. So we are now looking at the dynamics necessary for robots to have contact with the world.

‘That leads on to another of the things that remains hard – issues of scaling. We have many of the right pieces in order to build robots that can move around in the world, but there are some pieces missing. Vision, for instance, remains extremely hard.’

In view of the extraordinary leaps forward that have been made over the past few years, one suspects it won’t be long before Jeremy and his team find ways to overcome some of these remaining difficulties to make even smarter, more human-like robots. 

Jeremy's Inaugural Lecture: