A New Human Interface Concept for Subsea Unmanned Operations
The University of Birmingham’s Human Interface Technologies (HIT) Team has recently demonstrated an advanced human-system interface concept to support future operations employing unmanned vehicles (UxVs). The work was funded through the Defence and Security Accelerator as part of a Ministry of Defence project led by ATLAS ELEKTRONIK UK (AEUK).
It is widely accepted that the introduction of more complex and capable systems to UxVs will result in increasing pressure on operators as they attempt to process and interpret large volumes of data in a timely fashion. For the purpose of this demonstration, the team looked at Unmanned Underwater Vehicles (UUVs) equipped with multiple advanced sonars. Sonar displays tend to be one of two types: a “waterfall” display (a graphical representation of signal amplitude or strength across a fixed frequency range varying with time), or, for hull-mounted sonars, a 360° plan position indicator display (a radar-like representation, presented in polar coordinates of range and angle, with the origin representing the source of the signal). Current operational solutions are typically based on a single sonar or single sensor system, feeding one display monitored by one operator. As more devices are added in the future, each with significantly enhanced capability, the result may be a need for more operators and shipborne workstations, with associated cost, space and human-system performance implications. In the past, very little attention has been paid to how best to engage the human operator within integrated UUV and support vessel systems, or to how adaptive machine learning can be assimilated within changeable operating conditions.
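To illustrate the plan position indicator geometry described above, the following sketch maps a sonar contact at a given range and bearing onto a radar-like circular display. The function name and the clockwise-from-ahead bearing convention are illustrative assumptions, not part of any AEUK or HIT Team system:

```python
import math

def ppi_to_screen(range_m: float, bearing_deg: float,
                  max_range_m: float, radius_px: int) -> tuple[float, float]:
    """Map a contact at (range, bearing) onto a plan position indicator.

    The screen origin (0, 0) is the signal source, as on a radar-like
    PPI display; bearing is assumed to be measured clockwise from the
    top of the screen (0 deg = straight ahead).
    """
    scale = radius_px / max_range_m          # pixels per metre of range
    theta = math.radians(bearing_deg)
    x = range_m * scale * math.sin(theta)    # clockwise-from-ahead convention
    y = range_m * scale * math.cos(theta)
    return x, y
```

A contact dead ahead at half the maximum range, for instance, lands at the top of the display halfway between the origin and the rim.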
Building on previous experience with command and control projects for the Ministry of Defence, the HIT Team researchers collaborated with AEUK subject matter experts to develop an early proof-of-concept Mixed Reality subsea mine detection workstation demonstrator. The demonstrator is based on a simulated holographic/volumetric interface “command table” with virtual and mixed media displays, located within a generic military unmanned systems cabin. It provides an operator with high-quality, timely mission information to support the assessment of multiple targets once UUV data collection and Automatic Target Recognition (ATR) analysis is complete. At any stage in the mission, the operator can select a target from among many (sorted by probability ranking), at which point supporting information is displayed on multiple virtual screens, organised in space around the command table. Such information may include:
- Statistics (e.g. the probability of the object matching a known mine type, as recognised via ATR automated matching)
- Synthetic Aperture Sonar (SAS) imagery
- Bathymetric imagery
- Forward-looking sector sonar
- Upward-looking sonar (for surface or water column targets)
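The target-selection workflow above can be sketched as a ranked data structure: each ATR-classified contact carries a match probability and a set of supporting display layers, and the operator picks from a list sorted by that probability. All names here are illustrative assumptions; the demonstrator's actual data model is not public:

```python
from dataclasses import dataclass, field

@dataclass
class TargetContact:
    """One ATR-classified contact and its supporting display layers.

    `panels` holds the per-target imagery shown on the virtual screens,
    e.g. SAS imagery, bathymetry, sector sonar (keys are illustrative).
    """
    contact_id: str
    match_probability: float  # ATR confidence of a known mine-type match
    panels: dict = field(default_factory=dict)

def ranked_targets(contacts):
    """Return contacts sorted by ATR match probability, highest first,
    mirroring the ranked list the operator selects from."""
    return sorted(contacts, key=lambda c: c.match_probability, reverse=True)
```

Selecting the top entry of `ranked_targets(...)` would then drive which panels are laid out around the command table.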
As far as the team could ascertain, this is the first time that this technology has been employed with military unmanned systems; as these platforms proliferate, it could prove a real game changer for future defence capability.