AirMed&Rescue spoke to Professor Bob Stone of the University of Birmingham about how technology is enhancing simulated training for military personnel
It cannot have gone unnoticed over the past five years or so that Virtual Reality (VR) – technology that attempts to immerse the wearers of often quite outlandish wearable devices, such as head-mounted displays (HMDs) and gloves, into computer-generated worlds – has experienced something of a resurrection, appearing regularly on technology sites and social media, in the newspapers and on TV. Unlike the field’s rollercoaster ride of success and failure in the 1990s, there is, today, evidence that the current and emerging generations of wearable devices have the potential to offer a much more acceptable and affordable alternative to conventional flat-screen interactive experiences, particularly for computer-based training. There has also been a significant and positive change in affordability, both for these new devices and for the underlying software that supports the development of quite impressive virtual environments.
Some of these features will be demonstrated during a presentation by Professor Bob Stone at the 2018 Medical Innovation event in October, at the Edgbaston Stadium in Birmingham, UK. Director of the Human Interface Technologies Team at the University of Birmingham, Professor Stone will be presenting a new and unique training concept for future Medical Emergency Response Team (MERT) personnel. Sponsored by the Royal Centre for Defence Medicine (RCDM), the challenge to the University team is to develop a technology-based training solution to replace the small number of expensive wooden mock-ups of the rear cabin of a Chinook helicopter currently used for training by the Armed Forces. The team’s solution has to be affordable, transportable and, importantly, easily reconfigurable to represent a range of MERT platforms, not just the rear cabin of a Chinook. The solution also has to support the training of groups of two to three personnel at a time, not in the clinical skills necessary for pre-hospital trauma care, but in small-team activities – and it has to provide as realistic a simulated operational environment as possible.
Learning from history
In the early phases of the project, Professor Stone turned to the results of a previous search and rescue VR project with which he was closely involved in the late 1990s. This project, undertaken in collaboration with instructors at RAF Valley and Shawbury, addressed the training of Voice Marshalling Aircrew located in the rear cabin of the Griffin (Bell 412) helicopter, whose task is to monitor the external environment through the open cabin door and verbally relay important flight commands to the pilot in order to guarantee an accurate and safe approach of the aircraft to a landing site or target object. The output from that project was a successful series of Mixed Reality (MR) trainers, although this term was not widely used at the time. In contrast to VR (in which the end user interacts in real time with totally computer-generated scenes), MR exploits the existence of real-world objects in order to enhance the believability, and indeed usability, of the virtual elements of a simulated scene. The Voice Marshalling training solution, then, enabled trainees to experience flight in a simulated helicopter whilst wearing a VR HMD. At the same time, they were able to physically experience the constraints and safety features of moving within a wooden mock-up representing the door space available within the Griffin platform.
Working closely with RCDM stakeholders and medical instructors at the Tactical Medical Wing (TMW) of RAF Brize Norton, Professor Stone and his team undertook observations of MERT training sessions and rapidly came to the conclusion that any technology-based training solution had to include an appropriate blend of the virtual with the real, especially given the constraints in which the medics operate, the casualties they attend to and the extensive equipment they use. The team’s early concept solution employed an inflatable enclosure, ‘cluttered’ with a range of physical objects, including replica weapons, military Bergens and a highly realistic casualty mannequin. A VR headset enabled the end users to experience the interior of a Chinook in flight (the illusion enhanced with external ‘in-flight’ effects based on video captured from one of the team’s drones flying backwards over a barren region of Dartmoor).
However, due to registration and tracking problems between the VR user’s real and virtual hands, and the subsequent lack of credible ‘touch’ experiences with the physical enclosure contents, it was necessary to revise the technical solution to take advantage of ‘blue-screen’ chroma key effects, as used in the film industry for computer-generated imagery, and the integration of a camera onto the VR headset. The resulting effect is that users can now see their own bodies and limbs and can, more importantly, touch those physical objects that are defined as essential to the training process, including medical equipment and the casualty mannequin. The chroma key effect means that anything blue that is picked up by the camera within the enclosure will be presented as the virtual Chinook interior. Recent usability trials with TMW personnel have confirmed that this MR approach offers significant benefits over purely VR-based training; the ability to interact with important and training-relevant physical objects in particular attracted positive feedback from the participants.
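The principle behind this chroma key step can be illustrated with a minimal sketch in Python using NumPy. This is an assumption-laden simplification for illustration only – the function name, the pure-blue key colour and the distance threshold are hypothetical, not details of the team’s actual headset pipeline:

```python
import numpy as np

def chroma_key_composite(camera_frame, virtual_frame,
                         key_rgb=(0, 0, 255), tolerance=90):
    """Illustrative blue-screen compositing (not the team's real pipeline).

    camera_frame, virtual_frame: H x W x 3 uint8 RGB arrays of equal shape.
    key_rgb: the 'blue-screen' colour to key out (pure blue assumed here).
    tolerance: maximum Euclidean distance in RGB space from key_rgb for a
        pixel to count as enclosure background.
    """
    cam = camera_frame.astype(np.int16)  # avoid uint8 wrap-around on subtraction
    # Per-pixel distance from the key colour
    dist = np.linalg.norm(cam - np.array(key_rgb, dtype=np.int16), axis=-1)
    mask = dist < tolerance  # True where the blue enclosure is visible
    out = camera_frame.copy()
    # Blue pixels are replaced by the rendered virtual Chinook interior;
    # everything else (hands, equipment, mannequin) passes through unchanged.
    out[mask] = virtual_frame[mask]
    return out
```

In practice, a real-time system would run a comparable per-pixel test on the headset camera feed every frame, so that the physical objects the trainee touches remain visible while the blue enclosure walls become the virtual cabin.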
Already, new VR interiors of MERT platforms are being developed, including Royal Marines’ hovercraft and landing craft. External ‘in-transit’ effects have been captured using a combination of GoPro and 360° ‘panoramic’ cameras attached to the outside of the vehicles, filming high-quality video whilst undertaking at-sea, river and beach-landing manoeuvres. A similar exercise has been undertaken with a British Army Mastiff land vehicle. Additional training-relevant effects are also being developed, such as sound (including attack by small-arms fire and discharge of onboard weapons), cabin incursion by dust during a brownout landing and night vision. Even simulated smells are being considered, although technologies in this area are still at a low level of maturity. Motion capture (MOCAP) techniques are also being exploited to animate avatars – other platform crewmembers (pilots, loadmasters, force protection soldiers, etc.) – performing functions that are background to the main medical training scenarios.
Although the team does not expect the MR simulation to be complete and ready for additional testing by the Armed Forces until the middle of 2019, interest is already being shown in transferring the results of this research to the civilian sector, for example for training in support of major incidents, and for familiarisation with new medical equipment fits for helicopters used by air ambulance teams across the UK.