Retracing the Journey of the Pilgrims, autonomously.

In 1620, over 100 English Separatists embarked on a life-changing adventure, sailing from Plymouth in South West England to the New World on the other side of the Atlantic.  The arrival of the Mayflower and the establishment of the New Plymouth Colony are today celebrated as important chapters in the history of both England and the United States of America.

Autonomous ship project

Now, Professor Bob Stone of the Department of Electronic, Electrical & Systems Engineering’s HIT Team has teamed up with Mayflower Autonomous Ship Ltd in an ambitious project to sail a totally unmanned vessel across the North Atlantic in 2020.  The initial voyage will follow in the footsteps of the Pilgrim Fathers and will form part of the Mayflower 400th anniversary celebrations.  There are even longer-term plans for the autonomous vessel to retrace Drake’s circumnavigation of the globe, completed in 1580.

The extraordinary-looking, 100ft Mayflower Autonomous Ship (MAS) will be propelled by advanced sail and solar technology, and will be fully equipped with the most modern navigation aids.  In addition, this revolutionary trimaran will carry a variety of equally advanced sensors and research equipment, including Autonomous Underwater Vehicles, to conduct oceanographic research en route.

Together with researchers from Professor Peter Gardner’s Communications & Sensors group, the HIT Team aims to develop a “Mixed Reality” (MxR) system that will allow the remote ‘crew’ of the Mayflower not only to see the view from on board, but also to operate onboard systems and conduct remote diagnoses.  A similar system will allow members of the public and schoolchildren to experience the historic transatlantic crossing.

Remote diagnosis and navigation

The MxR system design is based on a number of successful projects undertaken over the past two years for BAE Systems and the Ministry of Defence.  Addressing the future of command and control for military tactics and counter-insurgency, the HIT Team developed a unique test bed designed to investigate the human factors aspects of MxR.  End users are able to don a head-mounted display and, courtesy of an optical motion capture system, visualise an evolving 3D scenario on an otherwise empty table in front of them.  They can interact with the 3D display using gestures or special 3D-printed “pucks”, and are presented with other sources of data, including video from remote aerial and land-based robots and even television news feeds, in the form of “floating windows” appearing around the immediate workspace.
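As an illustrative sketch only, the “floating windows” idea can be modelled as a set of data feeds anchored at 3D positions around the workspace centre.  Nothing here comes from the HIT Team’s actual software: the feed names, class, and layout function are all hypothetical, and a real MxR system would anchor windows in the optical motion-capture reference frame rather than compute static positions.

```python
import math
from dataclasses import dataclass

@dataclass
class FloatingWindow:
    """A 2D data feed (e.g. robot video or a news channel) anchored in 3D space."""
    label: str
    position: tuple  # (x, y, z) in metres, relative to the workspace centre

def arrange_windows(labels, radius=0.6, height=0.3):
    """Place one window per feed, evenly spaced on a circle around the table.

    Hypothetical layout helper: positions are fixed relative to the workspace
    centre, whereas a tracked system would update them each frame.
    """
    n = len(labels)
    windows = []
    for i, label in enumerate(labels):
        angle = 2 * math.pi * i / n
        pos = (radius * math.cos(angle), radius * math.sin(angle), height)
        windows.append(FloatingWindow(label, pos))
    return windows

feeds = ["aerial robot video", "ground robot video", "television news"]
layout = arrange_windows(feeds)
```

Spacing the windows evenly on a ring keeps every feed in the user’s peripheral field without occluding the central 3D scenario on the table.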