BAE Systems: Future Mission Systems 2035

We developed a Mixed Reality interface test bed for BAE Systems to evaluate advanced human interaction techniques in a command and control context.

[Image: BAE Systems Future Mission Systems 2035 Mixed Reality 'command table' interface]

The end user’s motions and gestural input commands are tracked by a motion capture system, and the captured data is used to display, in real time, a range of tactical information types. These include a cityscape that appears to exist in 3D on an otherwise empty “command table”; the locations of multiple unmanned air vehicles (UAVs); “floating” menu screens, enabling the user to tailor the amount and quality of the data presented; and simulated aerial 3D “keyhole”/laser sensor scans of suspect terrorist assets. An avatar provides additional spoken information and interface configuration support.
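As a purely illustrative sketch of this interaction loop (every name, threshold and data structure below is our own assumption, not the project’s actual code), a pinch-drag gesture reported by a motion capture system might drive the table’s view transform along these lines:

```python
# Hypothetical sketch: mapping tracked hand poses from a motion-capture feed
# to real-time updates of a "command table" view transform.
from dataclasses import dataclass

@dataclass
class HandPose:
    x: float; y: float; z: float  # tracked position in table coordinates (metres)
    pinched: bool                 # thumb/index pinch reported by the tracker

@dataclass
class TableView:
    pan_x: float = 0.0
    pan_y: float = 0.0
    zoom: float = 1.0

def update_view(view: TableView, prev: HandPose, curr: HandPose) -> TableView:
    """Translate the tabletop scene while the user holds a pinch 'drag';
    raising or lowering the pinched hand is treated as zoom."""
    if prev.pinched and curr.pinched:
        view.pan_x += curr.x - prev.x
        view.pan_y += curr.y - prev.y
        # Vertical motion scales the scene, clamped to a sane range.
        view.zoom = max(0.25, min(4.0, view.zoom * (1.0 + (curr.z - prev.z))))
    return view

# Example: one frame of a rightward pinch-drag.
view = TableView()
view = update_view(view, HandPose(0.10, 0.20, 0.50, True),
                         HandPose(0.14, 0.20, 0.50, True))
print(view)  # pan_x ~= 0.04, pan_y and zoom unchanged
```

In practice the classifier would run per tracked marker frame and smooth the pose stream, but the core mapping from gesture deltas to scene transform is this simple.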

Other applications for the Mixed Reality “command table” include an aerial 3D representation of a fictional mansion house being used for a “G8 Summit”, and images from ground-based cameras (police-worn, social media-sourced, robotic patrol systems and so on) monitoring a protest in real time as it develops into something more sinister.

Another scenario is based on a small city, this time with information relating to the predicted early spread of contamination from a “dirty bomb” detonated outside a metro station, monitored by small UAVs.
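For illustration only, a toy advection-diffusion model of the kind that could generate such a predicted-spread overlay (the project’s actual prediction model is not described here, and the values of Q, D and u below are invented):

```python
# Toy sketch, not the project's model: predicted early spread of a release as a
# 2-D advection-diffusion Gaussian, the sort of field a command table could
# overlay on the city map for UAVs to sample.
import math

def concentration(x, y, t, Q=1.0, D=5.0, u=2.0):
    """Concentration at (x, y) metres, t seconds after an instantaneous
    release at the origin, drifting downwind along +x at u m/s."""
    if t <= 0:
        return 0.0
    r2 = (x - u * t) ** 2 + y ** 2
    return Q / (4 * math.pi * D * t) * math.exp(-r2 / (4 * D * t))

# Predicted field 60 s after release, sampled where a UAV might be tasked.
print(f"{concentration(120.0, 10.0, 60.0):.2e}")
```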

In all of these scenarios, the user is free to translate, rotate and zoom into or out of the command table image in real time using gestures tracked by the motion capture cameras. They can also point to any of a range of land, air and sea assets, using “drag”-like gestures (similar to those used with mobile phones, tablets and so on) to direct them towards the focal point(s) of the developing incidents.
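A similarly hedged sketch of that asset-tasking interaction, assuming (our invention, not the project’s code) that a drag beginning on an asset and released at an incident location appends a waypoint for that asset:

```python
# Hypothetical sketch: a "drag" that starts on an asset re-tasks it towards
# the release point, mirroring the touch-style gestures described above.
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    x: float
    y: float
    waypoints: list = field(default_factory=list)

def nearest_asset(assets, x, y, radius=0.05):
    """Return the asset closest to the pointing position, if within reach."""
    best, best_d2 = None, radius * radius
    for a in assets:
        d2 = (a.x - x) ** 2 + (a.y - y) ** 2
        if d2 <= best_d2:
            best, best_d2 = a, d2
    return best

def handle_drag(assets, start, end):
    """On drag start, pick up the asset under the hand; on release, send it
    to the release point (e.g. the focal point of a developing incident)."""
    asset = nearest_asset(assets, *start)
    if asset is not None:
        asset.waypoints.append(end)
    return asset

uavs = [Asset("UAV-1", 0.10, 0.10), Asset("UAV-2", 0.80, 0.70)]
tasked = handle_drag(uavs, start=(0.11, 0.09), end=(0.50, 0.50))
print(tasked)  # UAV-1 now carries waypoint (0.5, 0.5)
```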