A team of scientists from the University of Birmingham has successfully demonstrated how virtual and augmented reality technologies can enable boats, submarines and other autonomous and semi-autonomous systems to be controlled remotely.

A unique demonstration was recently staged by the University of Birmingham’s Department of Electronic, Electrical & Systems Engineering (EESE). Video and sensor data were transmitted from a remotely controlled vessel deployed at sea off Plymouth back to a Virtual Reality “Science Station” located in the Gisbert Kapp Building on campus.

Image: The Human Interface Technologies Team’s bait boat, fitted with sensors.

Led by Professor Robert Stone and Professor Peter Gardner, a small group of academics, technicians and postgraduate students from the Human Interface Technologies (HIT) Team and the Wireless Communications and Remote Sensing Group collaborated on this highly innovative project. The result was the successful completion of a six-month “telepresence” research and development project, part-funded by the Engineering and Physical Sciences Research Council’s (EPSRC) Institutional Sponsorship Fund.

Image: The Human Interface Technologies Team’s remote station, showing its virtual and augmented reality displays.

This research shows the potential for using Virtual and Augmented Reality technologies to support remote surveillance for both autonomous and semi-autonomous systems in the future. The trial indicates that the situational awareness of a human supervisor overseeing complex remote systems can be enhanced by presenting data clearly in an immersive, interactive “wrap-around” dataspace. This was achieved using commercially available components and by paying close attention to the human factors of advanced workstation design.

As a proof-of-concept demonstration, the trials vessel took the form of a modified angler’s “bait boat”: a small catamaran with an integrated echo sounder and colour camera, controlled by a radio system with an operational range of up to 4,000 metres.

The EESE team fitted the catamaran with a gimbal-stabilised GoPro camera to record footage at sea, alongside a webcam that streamed live video over a Wi-Fi link to the HIT Team’s Science Station at the University. A sensor module was also included, capturing data such as GPS position, temperature, pressure, humidity and the output of a digital compass. The catamaran was launched into the waters around Cawsand Bay and Plymouth’s famous Breakwater and Fort, previously the focus of HIT Team research.

This research was inspired by the Mayflower Autonomous Ship, a crewless vessel designed to sail across the Atlantic in 2020 to celebrate the original vessel’s 1620 voyage with its Pilgrim crew. Research into the concept of an advanced VR Science Station was initially carried out by the HIT Team under sponsorship from BAE Systems and the Ministry of Defence. This work continues to grow, and will allow members of the public and schoolchildren to experience the historic transatlantic crossing and to conduct future science experiments based on the myriad sensors onboard the vessel.

Reflecting on the demonstration, Professor Stone said, “We were delighted with the outcome of this trial. So many things could have gone wrong, especially when placing a complex set of electronics into a small boat at sea and relying on telephone communications and Wi-Fi links. Instead, the trial was a complete success and will now help both EESE groups to collaborate in a variety of future high-tech domains where, despite advancements in autonomous technologies, the role and involvement of the human in system supervision and safety will always be paramount.”