The INGENIOUS 1st FSX was successful
The 1st Full Scale Exercise (FSX) of the INGENIOUS project was hosted by our partner APHP-SAMU in Villejust near Paris, France, on May 25, 2022. The APHP-SAMU team supported the exercise sessions with their first response teams and the standard equipment they use.
The 1st FSX demonstrated how the various tools and components of the INGENIOUS Next Generation Integrated Toolkit (NGIT) are integrated and used in combination to achieve an effective emergency response in a realistic scenario environment. The 2nd Prototype of the components, which together form the 1st NGIT, was tested in a scenario involving a large earthquake in an urban area with cascading effects, e.g. destroyed/collapsed buildings, loss of communication, energy grid failure, suspected gas leaks, etc. The operational teams of practitioners used the technologies in realistic conditions, providing valuable feedback that the technology-provider partners will use to further improve the tools, address the practitioners' needs, and develop the 3rd and final Prototype of the components, which will form the INGENIOUS 2nd Toolkit.
The teams deployed in the field
A total of two teams of First Responders (FRs), with four members each, were deployed in the field. There were also two extraction teams of three FRs each, a K9 Unit, and a medical team of four FRs and one doctor located in the triage tent to provide first aid to the victims and prioritise their treatment and transportation to the hospital.
1st FSX deployment
Deployment of the INGENIOUS technologies outdoors
According to the scenario, several collapsed buildings were reported to the Command and Control Center. The scenario began with the registration of the components in the central system and their association with each member of the teams; after the emergency notification was received, the components self-verified and connected automatically, and the teams deployed to the field in a coordinated way. The INGENIOUS First Responder and technical teams arrived at the scene to map the area and provide assistance to the injured. The Field Communication system (ICCS), the Common Operational Picture and Command, Control and Coordination System (COP/C3) (STWS) and the INGENIOUS smart mapping and localization devices were the first tools deployed in the field.
The need for rapid disaster assessment was urgent, so DLR’s Modular Aerial Camera System – Search and Rescue (MACS-SaR), which consists of a drone and a special camera system, flew over the site to map the area and send this map to the Ground Control Station.
Our partners from FOI demonstrated the Multi-purpose Autonomous eXploring drone (MAX), intended as an autonomous helper for the FR team that can explore potentially dangerous buildings. The demonstration was divided into two steps highlighting different functionalities. First, a flight was performed outdoors along a building facade, showing the possibility of rapidly alternating between manual and automatic control with the flick of a switch on the operator's control unit. During this flight, visual and thermal images from MAX's onboard cameras were sent in real time via the INGENIOUS Field Communication System to the Ground Control Station at the command post, where they were analyzed for the presence of persons/victims in the inspected building. Temperature sensors onboard the drone sent data as well, illustrating the potential to detect the presence of fire at the scene.
Then, a demonstration was given of MAX's navigation and path-planning functionality in indoor environments. In this step, a flight was simulated by a person carrying MAX by hand along the path suggested by the planning algorithms running in real time on MAX's onboard computers. Carrying MAX was a safety measure, both because persons (simulated victims) were close to the drone and because the collision-avoidance mechanisms had not yet been fully tested, which could have led to MAX crashing and being damaged. Integrating a reliable "emergency brake" function is the focus of the last part of the project and should enable completely autonomous flight in complex environments at the 2nd FSX in November in Spain.
At this stage, INGENIOUS team 1 was deployed outside the collapsed building and used the Multilingual Operations App (MOA) (developed by UPF) to facilitate communication between FRs from different countries, while the Worksite App was used to maintain an overview of the mission's status. The Worksite App, developed by EXUS, assists with the in-field resource and casualty management of a mission. The INGENIOUS mobile and web applications, designed to enhance situational awareness and facilitate decision making during worksite operations, were all tested and evaluated.
The MOA was tested as a potential addition or alternative to traditional radio communications, facilitating coordination between teams from different countries. First responders from Sweden, Spain and France used the app to send voice messages in their own languages, which were received by the other teams as spoken messages translated into English (in addition to a written transcript in the app and on the COP). The FSX was a step towards showing the viability of such an approach as an alternative to using a lingua franca (typically English) for communication between multilingual teams, thus eliminating a language barrier that can often be a significant obstacle. Results using Spanish as the source language were very promising, even in noisy environments, and showed great improvement over earlier versions of the system. As a next step, we hope to transfer these improvements to the other languages, as well as enable direct translation into different target languages (instead of using English as the common target as in this prototype).
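Conceptually, such a pipeline chains speech recognition, machine translation (with English as the pivot target in this prototype), and speech synthesis. The following Python sketch uses placeholder functions to illustrate that flow; the function names, sample phrases, and behaviour are illustrative assumptions, not UPF's actual models.

```python
def transcribe(audio: bytes, source_lang: str) -> str:
    # Placeholder ASR: a real system would run a speech recogniser here.
    return "hay dos victimas en el edificio"  # assumed Spanish sample input

def translate(text: str, source_lang: str, target_lang: str = "en") -> str:
    # Placeholder MT with English as the common pivot target.
    samples = {"hay dos victimas en el edificio":
               "there are two victims in the building"}
    return samples.get(text, text)

def synthesise(text: str, lang: str = "en") -> bytes:
    # Placeholder TTS: a real system would return synthesised audio.
    return text.encode()

def relay_voice_message(audio: bytes, source_lang: str) -> tuple[bytes, str]:
    # ASR -> MT (pivot to English) -> TTS; the English transcript is also
    # shown in the app and forwarded to the COP as text.
    transcript = transcribe(audio, source_lang)
    translated = translate(transcript, source_lang)
    return synthesise(translated), translated

audio_out, text_out = relay_voice_message(b"<voice message>", "es")
print(text_out)  # "there are two victims in the building"
```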
In parallel, the INGENIOUS Social Media App (CERTH), integrated in the COP, was used to collect information related to the earthquake scenario from social media. FR teams used this information to their advantage before they arrived in the field (for potentially closed roads, fallen bridges, etc.). SAMU partners checked the tweets that provided information on the earthquake event and informed the other FRs via walkie-talkie. In fact, real tweets, collected from a past earthquake incident, were posted in real time on the Twitter platform. However, as Twitter is a public platform, the tweets were encoded with particular hashtags and codes to avoid false alerts to the platform's users. In the Social Media App, incoming tweets are stored and processed in near-real time by the appropriate services (irrelevant tweets are filtered out and the referenced location is detected). The event detection method then groups the relevant tweets based on posting time and location; each group of tweets corresponds to an event detection message. The detected events, together with the post-analysis information, were sent from the Fusion Engine (FE) to the COP to be monitored on the COP platform and forwarded to the appropriate FRs.
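To illustrate the grouping step, here is a minimal Python sketch that clusters geotagged posts by posting time and location. The data fields, thresholds, and greedy strategy are illustrative assumptions, not the CERTH implementation.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Post:
    text: str
    timestamp: float  # seconds since epoch
    lat: float
    lon: float

@dataclass
class Event:
    posts: list = field(default_factory=list)

def _distance_km(a: Post, b: Post) -> float:
    # Haversine great-circle distance between two posts, in kilometres.
    r = 6371.0
    p1, p2 = math.radians(a.lat), math.radians(b.lat)
    dp, dl = p2 - p1, math.radians(b.lon - a.lon)
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

def group_posts(posts, window_s=900, radius_km=2.0):
    """Greedily assign each post to the first event whose latest post is
    within window_s seconds and radius_km kilometres; otherwise start a
    new event. Both thresholds are assumptions for illustration."""
    events = []
    for post in sorted(posts, key=lambda p: p.timestamp):
        for ev in events:
            last = ev.posts[-1]
            if (post.timestamp - last.timestamp <= window_s
                    and _distance_km(post, last) <= radius_km):
                ev.posts.append(post)
                break
        else:
            events.append(Event(posts=[post]))
    return events
```

Each resulting Event would then stand in for one event detection message sent onward for monitoring.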
Deployment in the collapsed building
The K9 Unit began searching for victims and located two of them in the debris. The Command Center was able to confirm this through the live footage from the thermal camera of the INGENIOUS Helmet (KIRO). The INGENIOUS team used the Smart Insole Boots, the Augmented Reality Glasses and the Uniform to send data to the Command Center.
Based on the exercise scenario, the FR wearing the Smart Insole for Boots entered an area where gas emissions knocked her and her team unconscious. The Smart Insole Boots detected the immobilisation and automatically notified the Command and Control Center via the INGENIOUS NGIT, which in turn responded by sending a rescue team to extract the distressed FRs.
The information flow involved several of the NGIT tools from different INGENIOUS partners and can be summarised in the following steps (a minimal code sketch of the flow follows the list):
- Smart Insole Boots (CyRIC) detect critical events (FR immobilised)
- Smart Uniform (TEK) collects and forwards such events using the Field Comms (ICCS)
- Fusion Engine (EXUS) and Expert Reasoning (CERTH) process events to generate Alerts
- Common Operational Picture (SATWAYS) visualises Alerts
- Field Teams receive Alerts and information needed to respond to and assist the immobilised FR
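The following minimal Python sketch traces those steps end to end. The message shapes and field names are assumptions for illustration, not the project's actual schemas.

```python
import json
import time

def boot_event(fr_id: str) -> dict:
    # Step 1: Smart Insole Boots detect a critical event (assumed schema).
    return {"source": "smart_insole", "type": "FR_IMMOBILISED",
            "fr_id": fr_id, "timestamp": time.time()}

def uniform_forward(event: dict) -> str:
    # Step 2: the Smart Uniform wraps the event and forwards it over the
    # Field Comms; JSON serialisation stands in for the real transport here.
    return json.dumps({"channel": "field_comms", "payload": event})

def fuse_and_reason(message: str) -> dict | None:
    # Step 3: Fusion Engine / Expert Reasoning turn raw events into alerts.
    event = json.loads(message)["payload"]
    if event["type"] == "FR_IMMOBILISED":
        return {"alert": "FR_DOWN", "fr_id": event["fr_id"],
                "action": "dispatch extraction team",
                "timestamp": event["timestamp"]}
    return None

def cop_visualise(alert: dict) -> None:
    # Steps 4-5: the COP visualises the alert; field teams receive it.
    print(f"[COP] {alert['alert']} for {alert['fr_id']}: {alert['action']}")

cop_visualise(fuse_and_reason(uniform_forward(boot_event("FR-07"))))
```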
The Smart Insole Boots are one of the innovative wearable technologies developed within INGENIOUS to support the FR of the Future. The patented smart insole technology is developed by CyRIC, an SME based in Nicosia, Cyprus. Placing the insoles inside operational FR boots instantly turns them into smart boots. The Smart Insole Boots integrate sensors and edge-processing algorithms to detect critical and life-threatening events for the FR. Their wearable and completely wireless operation makes the Smart Insole Boots a non-intrusive and indispensable tool for First Responders.
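As an illustration only (CyRIC's patented algorithms are not public), a simple edge heuristic for immobilisation detection could flag a responder whose foot-motion energy stays below a threshold for a sustained period:

```python
from collections import deque

class ImmobilisationDetector:
    """Flags immobilisation when accelerometer motion energy stays below a
    threshold over a full window. Window size and threshold are assumptions
    for illustration, not CyRIC's actual parameters."""

    def __init__(self, window_samples: int = 300, energy_threshold: float = 0.05):
        self.window = deque(maxlen=window_samples)  # e.g. 30 s at 10 Hz
        self.energy_threshold = energy_threshold

    def update(self, ax: float, ay: float, az: float) -> bool:
        # Motion energy: deviation of total acceleration from 1 g (units of g).
        energy = abs((ax**2 + ay**2 + az**2) ** 0.5 - 1.0)
        self.window.append(energy)
        window_full = len(self.window) == self.window.maxlen
        # Immobilised only if the entire window shows near-zero motion.
        return window_full and max(self.window) < self.energy_threshold
```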
Augmented Reality (AR) was the next step in providing Situational Awareness and operational assistance to the field units, as it offered new possibilities in terms of information display and contextual communication. During the 1st FSX, CS GROUP deployed the Augmented Reality Response Platform (ARRP) and provided the First Responders with an AR application deployed on a HoloLens 2 device, used in indoor/outdoor search and rescue scenarios. The goal of the ARRP is to enhance, in a non-intrusive fashion, the situational awareness of FRs in the field while keeping the FRs’ hands free of additional devices. Proposed features were:
- The capability to share geo-localized annotations (such as identified hazards and victims) between the COP and the AR application (through the Fusion Engine). The annotations are displayed as AR holograms in the field and as icons on an AR map; an assumed message shape for such an annotation is sketched after this list.
- The display of a mission sent from the COP and the update of the mission status from the FR using the AR application.
- The display of the user's, teammates', and annotations' positions on an AR map.
- The capability to interact with the AR application using voice commands or by pressing holographic buttons displayed in front of the user.
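An annotation of this kind can be thought of as a small geo-referenced message. The field names and structure in the following sketch are assumptions for illustration, not the ARRP's actual schema.

```python
import json
import time
import uuid
from dataclasses import dataclass, asdict

@dataclass
class GeoAnnotation:
    """Assumed shape of a geo-localised annotation exchanged between the
    COP and the AR application via the Fusion Engine (illustrative only)."""
    kind: str        # e.g. "hazard" or "victim"
    lat: float
    lon: float
    alt_m: float     # altitude, needed to place the hologram in 3D
    label: str
    created_by: str
    annotation_id: str = ""
    timestamp: float = 0.0

    def to_message(self) -> str:
        # Fill in identity fields lazily, then serialise for transport.
        self.annotation_id = self.annotation_id or str(uuid.uuid4())
        self.timestamp = self.timestamp or time.time()
        return json.dumps(asdict(self))

# A hazard marked in the COP and pushed to the HoloLens as a hologram
# (coordinates are made up for the example):
print(GeoAnnotation("hazard", 48.685, 2.236, 85.0,
                    "suspected gas leak", "COP-operator").to_message())
```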
Feedback from FRs was overall positive, and some improvements were requested, such as the capability to update AR annotations directly from the field (not just create them) and making the holographic buttons easier to press in all circumstances.
The INGENIOUS uniform (TEKNIKER) was integrated with the rest of the INGENIOUS components in this joint exercise, where the FRs were able to verify the result of incorporating the developed technology in a hands-on experience close to reality. In particular, in combination with the Smart Insole Boots, the ability to identify that a member of the rescue team is unconscious was demonstrated: the situation is automatically detected and sent to the command centre, the data are aggregated, and an alert condition is raised and notified to the person in charge, who confirms its reception and takes the corresponding actions. At the same time, continuous monitoring of operating conditions and self-recovery in case of loss of communication were tested. During the exercise, miniaturized physical prototypes were used, with robust, comfortable casings and simple, effective user interfaces. The Triage App (ICCS) was used to simplify the triage procedure in the field, transferring the triage information and victims' details to the command and control centre in real time.
After the completion of the Search and Rescue mission in the 1st building, the victims were taken to the triage tent, where their level of emergency was assessed. The victims were identified using the Face Recognition App (ICCS).
Deployment in a hotel partially destroyed by a fire
The mission continued in a 2nd building, a hotel partially destroyed by a fire that had broken out in the front part. The building was marked using the Augmented Reality Glasses, and the K9 Unit began the search. Mission members who entered this building collapsed due to a gas leak. After receiving data from the Boots and the Bracelet, Expert Reasoning generated an alert, which was relayed through the Fusion Engine to the COP.
The INGENIOUS Fusion Engine (FE) integrates multiple sources through its infrastructure and utilizes the gathered data to extract conclusions, providing real-time situational awareness and decision-support functionalities to crisis operators. The Expert Reasoning Engine infers useful insights that enhance the management of FRs and their resources. The tools and algorithms were combined during the 1st FSX, and the resulting information was visualized in the COP to assist in the overall handling of incidents.
Expert Reasoning in this exercise aimed to monitor the health status of the FRs in the field. To do so, it analysed sensor data, measurements, and relations from the Uniform's sensors (vitals) and from the Boots' sensors that the FRs were wearing during their involvement in the disaster. In the context of the 1st FSX, we tested some of the new semantic-querying frameworks, the methodology for eliciting the ontology's requirements, and the protocol for the alerts that our component will generate. In particular, Expert Reasoning used semantic reasoning to formulate an alert that combines data from different sensors: the vitals and the boots. We tested the condition in which a first responder has a severe accident that leaves them immobile and with abnormal heart-rate values. An example of this condition is neurogenic shock, caused by a severe spinal cord injury, which can result in dangerously low blood pressure and a slowed heart rate. The analysis results from the sensors were published to the knowledge base via the Kafka broker hosted in the Fusion Engine. The boots event came from a real immobilisation scenario (a first responder from SBFF was wearing the boots), while the abnormal heart rate was produced by a data generator after the immobilisation was detected. The conditions were then met, and a specific alert was generated and sent to the COP, where the necessary data can be retraced (e.g., device, location, timestamp). The alert was monitored in the COP and observed by the SAMU partners.
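The alert condition itself can be read as a rule over two event streams. The sketch below shows that combination in plain Python; the event shapes and the 50 bpm bradycardia threshold are illustrative assumptions, not the actual ontology or Kafka topic layout.

```python
from dataclasses import dataclass

@dataclass
class FRState:
    immobilised: bool = False
    heart_rate_bpm: float | None = None

def update_state(state: FRState, event: dict) -> FRState:
    # Events arrive (e.g. via a Kafka topic) from the boots and the
    # uniform's vitals sensors; shapes here are assumed for illustration.
    if event["source"] == "boots" and event["type"] == "IMMOBILISED":
        state.immobilised = True
    elif event["source"] == "uniform" and event["type"] == "HEART_RATE":
        state.heart_rate_bpm = event["bpm"]
    return state

def neurogenic_shock_alert(state: FRState) -> bool:
    """Fire when the FR is immobilised AND bradycardic (illustrative
    threshold of < 50 bpm, consistent with a slowed heart rate)."""
    return (state.immobilised
            and state.heart_rate_bpm is not None
            and state.heart_rate_bpm < 50)

state = FRState()
for ev in [{"source": "boots", "type": "IMMOBILISED"},
           {"source": "uniform", "type": "HEART_RATE", "bpm": 42}]:
    state = update_state(state, ev)
if neurogenic_shock_alert(state):
    print("ALERT: possible neurogenic shock -> send to COP")
```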
The INGENIOUS K9 vest (ICCS) was tested with a rescue dog and a dog handler provided by I.S.A.R. Germany. The three days of the 1st FSX were first used to familiarise the dog with the K9 vest, because rescue dogs usually wear no vest when working on rubble or in damaged buildings. The K9 vest was tested outside on the rubble as well as inside the damaged buildings. The thermal and RGB cameras of the K9 Vest and the localisation of the dog were tested successfully. The thermal video stream and the locations were made available to the Command and Control Center, and the victims were visible on the command monitor. The location of the dog was also available in the ARRP of CS GROUP for the other FRs. Voice communication with the victim was also tested; only one-way communication from the Command Center to the victim was possible. Part of the test was a discussion between ICCS and I.S.A.R. Germany about possible improvements to the camera position, as the dog's back is not an optimal placement.
DLR’s Deployable Positioning System (IPS) was used by the second team searching for and rescuing victims in the partially damaged hotel. The position of the member wearing the IPS Helmet was transmitted to the COP in real time and displayed there during the exercise. As the building was rather small, the scenario was not suitable for demonstrating the capability of IPS to localize a first responder over long distances in GPS-denied areas. However, the integration of the component was demonstrated successfully. Further development will also allow the (already transmitted) georeferenced images and point clouds from IPS to be displayed in the COP.
Next, the Ground Control Station (GCS) of ITC ran AI-based image-processing algorithms, including scene semantic analysis and victim detection. Both algorithms exploited RGB images taken by MAX, and the victim-detection algorithm was able to detect victims who were partially buried under debris. In addition, ITC tested data communication through the GCS, which enabled the drones (MAX, MINs, MACS-SaR), IPS and the Worksite Communications to register in the same coordinate system and share data both among themselves and with the other INGENIOUS components. For example, MAX generated and shared an OctoMap, which the Micro Indoor droNes (MINs) (developed by SINTEF) will use as a prior map for navigation in the next tests. The locations of detected victims and of the FR wearing IPS were sent to the other INGENIOUS components and could therefore be visualized on a global map, helping the FRs make a rescue plan. ITC will optimize the algorithms to run faster and, as the leader of the above components, will continue coordinating data integration between them and preparing for the next test.
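Registering components in a shared coordinate system boils down to applying each component's registered pose to its local observations. The 2D sketch below shows the idea (the real system works in 3D with full poses); the names and values are illustrative.

```python
import math

def to_global(x: float, y: float,
              pose: tuple[float, float, float]) -> tuple[float, float]:
    """Transform a point from a component's local frame into the shared
    global frame. pose = (gx, gy, heading_rad) is the component's
    registered origin and orientation in the global frame."""
    gx, gy, theta = pose
    xg = gx + x * math.cos(theta) - y * math.sin(theta)
    yg = gy + x * math.sin(theta) + y * math.cos(theta)
    return xg, yg

# A victim detected at (3.0, 1.5) in a drone's local map, with that map
# registered at global origin (120.0, 45.0) and rotated 90 degrees:
print(to_global(3.0, 1.5, (120.0, 45.0, math.pi / 2)))
```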
All the victims located in the building were transferred to the triage tent, where they were given first aid and received appropriate treatment.
The 1st Full Scale Exercise was successfully completed. All the INGENIOUS technologies were tested and evaluated and worked as intended, with significant results that will be leveraged for the 2nd FSX in Spain, from 7 to 10 November 2022. End users provided useful feedback on the functionality and usability of the tools, testing them in real-world conditions through a number of scenarios. Comments were overall positive.
In conjunction with the 1st FSX, the INGENIOUS 1st International Workshop on the “Tools for the First Responder of the Future” took place in hybrid mode, focusing on the different needs and challenges faced by First Responders today. Prominent speakers from industry and First Responder agencies, both from within and outside the INGENIOUS consortium, as well as key experts from other European projects, presented the latest technologies developed for the First Responder of the Future. The dialogue will continue at the INGENIOUS 2nd International Workshop on “Tools for the First Responder of the Future”, which will be held at the same time as the 2nd FSX. Stay tuned for more to come!