
The INGENIOUS 2nd Full Scale eXercise (2nd FSX)
The 2nd Full Scale eXercise (2nd FSX) of the INGENIOUS project took place from the 7th to the 10th of November 2022 at ERTZAINTZA’s training facilities in Brigada Movil, Bilbao, Spain, with the aim of presenting the final prototype of the INGENIOUS Next Generation Integrated Toolkit (NGIT) for the First Responder of the Future. The INGENIOUS NGIT is a novel toolkit of technologies that ensures a high level of protection and augmented operational capacity for First Responders (FRs) during a natural or man-made disaster.
The goal of the 2nd FSX was to demonstrate the complete and integrated NGIT of the project to a broad audience of First Responders (FRs), ranging from Search and Rescue (SAR) to Police and Fire Service personnel. The NGIT itself includes novel technologies and tools such as Unmanned Aerial Vehicles (UAVs) for mapping, victim search and indoor positioning, Augmented Reality (AR) visors and camera-equipped helmets, wearables with biometric sensors for FR monitoring during field operations, an integrated Common Operational Picture (COP) platform for the field commanders, as well as several front-end and back-end solutions for reliable field communications, and decision-support tools.
The technology providers and the practitioners of the project worked closely together to test and evaluate the effectiveness of the INGENIOUS final solution in real conditions during the 2nd FSX. The scenario simulated an emergency response during a terrorist attack in a highway tunnel. Significant feedback was received and the efficiency of the INGENIOUS innovations was assessed.

Deployment of the INGENIOUS technologies
According to the scenario, several cars were damaged by a detonation during the attack. The Social Media App – SMA (CERTH) received alerts that something was happening. The tweets relevant to the incident were filtered and delivered to the End Users through the Common Operational Picture – COP (STWS).
In detail, ERTZ had provided synthetic tweets referring to a hypothetical terror attack in Iurreta. The tweets were divided into two groups: one with the first reports and observations from citizens about the incident, and a second with more precise details on what had happened and on the situation in the field after the attack.
These tweets were posted on Twitter with coded location names in order to avoid the spread of misinformation and panic. The posted tweets were collected by the crawler, which analysed them to estimate their reliability, detect the location names mentioned in the tweet text and connect them with coordinates. The event detection module that is part of the SMA then clustered the collected tweets based on the time they were posted and their geoinformation. Two events were detected, one for the start of the incident and one for the aftermath of the attack. Each event contains the location where it was detected, a short description, the event type, and the individual tweets that compose it. The detected events were sent to Kafka, from where the COP read and displayed them. The image below shows how the individual tweets of an event are displayed on the map (on the COP) as pins with the Twitter logo.
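
As a rough illustration of this pipeline, the sketch below clusters tweets by posting time and location and publishes each detected event to Kafka using the kafka-python client. The thresholds, topic name and message fields are our assumptions for the example, not the project's actual schema.

```python
# Minimal sketch: cluster tweets into events by time and geolocation,
# then publish each event to Kafka for the COP to read.
import json
from datetime import datetime, timedelta
from kafka import KafkaProducer  # kafka-python client

TIME_GAP = timedelta(minutes=15)  # hypothetical: max time gap inside one event
DIST_DEG = 0.05                   # hypothetical: max lat/lon spread in degrees

def cluster_tweets(tweets):
    """Group tweets whose posting times and coordinates lie close together."""
    events, current = [], []
    for t in sorted(tweets, key=lambda tw: tw["posted_at"]):
        if current and (t["posted_at"] - current[-1]["posted_at"] > TIME_GAP
                        or abs(t["lat"] - current[-1]["lat"]) > DIST_DEG
                        or abs(t["lon"] - current[-1]["lon"]) > DIST_DEG):
            events.append(current)
            current = []
        current.append(t)
    return events + ([current] if current else [])

tweets = [
    {"posted_at": datetime(2022, 11, 10, 9, 0), "lat": 43.16, "lon": -2.67,
     "text": "Explosion heard near the tunnel"},
    {"posted_at": datetime(2022, 11, 10, 9, 5), "lat": 43.16, "lon": -2.66,
     "text": "Smoke rising, several cars damaged"},
]

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # hypothetical broker address
    value_serializer=lambda v: json.dumps(v, default=str).encode())

for event in cluster_tweets(tweets):
    producer.send("sma.detected_events", {   # hypothetical topic name
        "location": {"lat": event[0]["lat"], "lon": event[0]["lon"]},
        "description": event[0]["text"],
        "type": "incident_report",
        "tweets": event,
    })
producer.flush()
```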

Before any 112 calls about the incident were received, the INGENIOUS Modular Aerial Camera System – Search and Rescue – MACS-SaR (DLR) was deployed to the scene to assess the situation and create a map of the incident.



DLR’s MACS was used to provide the large-scale operational picture (LSOP) of the scene and transmit it to the COP. Only a short amount of time (10 min) was needed to prepare the drone, acquire the aerial images and send them as planned to the Common Operational Picture. In contrast to Small Scale Test #5 (SST#5), a tri-copter was used as the camera carrier instead of a fixed-wing VTOL, due to strict boundaries on where the drone was allowed to operate.


In addition to the LSOP, coded markers (QR codes/AprilTags) were also used during the 2nd FSX to co-register all the INGENIOUS smart mapping and localisation devices. MACS-SaR was able to survey and extract them automatically, triangulate their positions and subsequently provide them to all the other tools via the Ground Control Station – GCS (ITC).
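
For intuition, the sketch below triangulates a detected marker's 3D position from two camera rays using the standard least-squares midpoint method. The camera positions and ray directions are invented for the example; in the real pipeline they would come from the drone's imagery and navigation data.

```python
# Minimal sketch: triangulate a tag's 3D position from two viewing rays.
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Closest point between two rays c_i + t_i * d_i (least-squares midpoint)."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Solve for t1, t2 minimising |(c1 + t1 d1) - (c2 + t2 d2)|^2
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    return ((c1 + t1 * d1) + (c2 + t2 * d2)) / 2.0

# Two hypothetical camera positions (e.g. from the drone's GNSS/INS) and the
# viewing rays towards the same detected tag:
p = triangulate_midpoint(np.array([0.0, 0.0, 50.0]), np.array([0.1, 0.0, -1.0]),
                         np.array([10.0, 0.0, 50.0]), np.array([-0.1, 0.0, -1.0]))
print(p)  # -> approximately [5, 0, 0], the tag's ground position
```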

In this Exercise, ITC’s Ground Control Station (GCS) played an important role, as it provided the data communication and integration for the INGENIOUS smart mapping and localisation devices. All these components sent their collected geo-referenced sensor data to the GCS and, according to the FRs’ interests, the GCS forwarded selected data to the Fusion Engine. The COP visualised the data (point clouds, trajectories, detected victims, etc.) to provide the FRs with enhanced situational awareness.
In addition, the GCS ran AI algorithms to analyse the scene and provide useful semantic information. Objects or building structures of interest to the FRs were marked in different colours on a semantic map. The GCS also ran a detection algorithm on RGB images to help FRs find victims. As shown in Figure 10, the output of the algorithm was visualised on the COP. As discussed with the COP team during SST#14, when the GCS sent an image with detected victims it also set the message field “has_victim” to True. The COP was therefore able to filter and show only these victim images to the FRs (Figure 11). In conclusion, ITC’s algorithm detected the victim as expected, resulting in a successful project test.
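
The flag-based filtering can be pictured with the small sketch below; apart from the "has_victim" field named in the text, the message structure and field names are our assumptions for illustration.

```python
# Minimal sketch: the GCS tags each analysed image with a "has_victim" flag
# so the COP can filter which images to show to the FRs.
from dataclasses import dataclass, field

@dataclass
class ImageMessage:
    source: str                                      # e.g. "GCS"
    image_uri: str                                   # where the RGB frame is stored
    has_victim: bool = False                         # set True when the detector fires
    detections: list = field(default_factory=list)   # bounding boxes (x, y, w, h)

def victim_images(messages):
    """COP-side filter: keep only images in which a victim was detected."""
    return [m for m in messages if m.has_victim]

msgs = [ImageMessage("GCS", "img/001.jpg"),
        ImageMessage("GCS", "img/002.jpg", has_victim=True,
                     detections=[(120, 80, 60, 140)])]
print([m.image_uri for m in victim_images(msgs)])  # -> ['img/002.jpg']
```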


In the meantime, police forces from ERTZ arrived at the scene with a K9 unit equipped with the K9 vest (ICCS), which includes bidirectional audio, GPS location and two cameras (thermal/visible), and searched for the terrorist. The cameras provided video streams to the COP for the mission commander. The terrorist was arrested and identified through the Facial Recognition App (ICCS).



The area was unsafe, and the Multi-purpose Autonomous eXploring drone – MAX (FOI) was sent in to map the area and search for victims. MAX acts as an extra teammate that first responders can send in to explore unknown and potentially dangerous environments. The 2nd FSX offered several opportunities to showcase the role of the MAX drone within the context of INGENIOUS, including its ability to fly autonomously while sending sensor data (e.g. visual, thermal and 3D data) to the Ground Control Station for analysis and subsequent visualisation in the COP. During the Exercise there were also generous time slots in which our partners at FOI demonstrated MAX and presented more details to End Users, focusing on the capabilities of the drone as well as the performance of the algorithms embedded on board.

More specifically, during this demonstration MAX flew autonomously through a door(-like opening) and navigated on its own through the environment. The navigation was achieved through real-time 3D mapping, reliable target-point selection in the 3D map, planning of collision-free paths, and path following including obstacle detection and local re-planning around obstacles. While in the air, MAX continuously sent images from its on-board visual and thermal cameras to the ground computer. Below is a picture from one of the demonstration sessions, where interested End Users got the opportunity to ask questions and give feedback. Overall, the feedback on the MAX concept and on the achievements in this project was very positive: the result is a fully working prototype that shows how novel technological aids can support first responders in future missions.
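
To make the collision-free planning step concrete, here is a minimal sketch of grid-based A* planning in 2D. FOI's planner works on a 3D map in real time, so this only illustrates the principle, on an invented toy grid.

```python
# Minimal sketch: A* path planning on a small 2D occupancy grid.
import heapq

def astar(grid, start, goal):
    """A* on an occupancy grid; grid[r][c] == 1 marks an obstacle."""
    rows, cols = len(grid), len(grid[0])
    open_set = [(0, start)]
    came_from, g = {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:                      # reconstruct the path backwards
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and not grid[nxt[0]][nxt[1]]:
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt], came_from[nxt] = ng, cur
                    h = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])  # Manhattan
                    heapq.heappush(open_set, (ng + h, nxt))
    return None  # no collision-free path exists

grid = [[0, 0, 0],
        [1, 1, 0],   # obstacle row forces a detour
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # -> path around the obstacles
```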

The scenario continued, and a victim was located using the INGENIOUS tools. A request for an extraction team was transmitted in French. The Multilingual Operations App – MOA (UPF) translated the request into Spanish and then back into French, facilitating communication among First Responders of different nationalities.

The Micro Indoor drones – MINs (SINTEF) were deployed to enrich the map of the area and provide the location of the First Responders involved in the mission. Our partners at SINTEF showed the feasibility of the incrementally deployed localisation system for First Responder localisation in GNSS-denied areas. More specifically, they used some pre-deployed anchors and performed a live deployment of only 3 additional MINs to speed up the demo. Together with a tag worn by one of the First Responders, the deployed swarm could locate that FR and show his live position in global coordinates in the COP, demonstrating and verifying the concept of the MIN system.
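
The core localisation idea, estimating a tag's position from range measurements to deployed anchors, can be sketched as linear least-squares trilateration. The anchor layout and tag position below are invented for the example; this is not SINTEF's actual solver.

```python
# Minimal sketch: estimate a tag's 2D position from ranges to known anchors.
import numpy as np

def trilaterate(anchors, ranges):
    """Linearise |x - a_i|^2 = r_i^2 against the last anchor and solve."""
    a, r = np.asarray(anchors, float), np.asarray(ranges, float)
    A = 2.0 * (a[:-1] - a[-1])
    b = (r[-1] ** 2 - r[:-1] ** 2
         + np.sum(a[:-1] ** 2, axis=1) - np.sum(a[-1] ** 2))
    return np.linalg.lstsq(A, b, rcond=None)[0]

anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]   # hypothetical MIN anchor positions
tag = np.array([3.0, 4.0])                       # "true" FR position for the demo
ranges = [np.linalg.norm(tag - np.array(a)) for a in anchors]
print(trilaterate(anchors, ranges))              # -> approximately [3. 4.]
```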



They could also verify improvements in flight and localisation stability that had been implemented in the few weeks before the FSX, targeting problems they experienced during SST#14 in September. Automatic deployment of a swarm of MINs was tested both in the scenario area and in a larger free space before and after the FSX.

Due to the focus on fixing the deployment problems experienced earlier, they unfortunately could not implement and show a more autonomous deployment using cradles that would charge and start up the individual MINs in sequence. However, as in the SST, their portable Base-Station, together with a newly implemented small user interface, again proved to ease the setup significantly compared to previous tests. Thanks to the automatic configuration of newly activated MINs and FR-Tags, together with the automatic calculation of the next target for each MIN (optimising area coverage and connectivity), the general workflow to deploy the swarm is now, under ideal conditions, as easy as 4 clicks to set up the base station and 2 more clicks for each MIN after switching it on.



After the MIN deployment, a rescue team wearing the Boots (CyRIC), Uniform (TEK), Bracelet (ICCS) and Gas sensor (TUW, ALPES LASERS) was deployed to extract the victim. The victim was heavily injured in the face and needed to be identified via the Facial Recognition App. The team carried the victim away to the triage area and put the triage necklace on. After triage, the information was available in the Worksite Operations app – WOA (EXUS), allowing the team leader to have an overview of the mission status.
The Boots and the Uniform worked in harmony to identify a series of incidents such as crawling, immobilisation, limping, heavy load, hanging, running and walking. The team leader received each alert and pressed a button to confirm that he had been informed of the incident.
The Fusion Engine (EXUS) was running in the backend, collecting all the information and forwarding the fused data to the relevant components. The Expert Reasoning Engine – ERE (CERTH) analysed the raw data from the Boots and the Gas sensor to identify potential threats and sent alerts to the COP. Communications ran in the background to provide a common network over which all the components exchanged information.
The Expert Reasoning Engine functionalities that were tested concerned mainly the population of the knowledge base, the semantic reasoning on top of it, and the new alerting rules that came out of the reasoning process for the gas sensor, the battery levels and the boot events (a minimal sketch of such a rule follows the list):
- Gas level alerts: The gas sensor sent measurements of two poisonous gases, ammonia and carbon monoxide. These sequences of measured values were aggregated and stored in a semantic database (GraphDB), and rules with specific threshold values (duration of exposure and concentration for each gas, e.g. 300 ppm for max 3 min) were applied on top of these data. A rule fired when these values were exceeded, and an alert describing the severity of the situation was visualised.
- Battery level alerts: The ERE constantly received indications of the battery status of the equipment, linking each source with information already stored in the Knowledge Base; when a battery reached “LOW” status, an alert was visualised in the COP.
- Boot event alerts: The ERE also constantly received the events from the Boot sensors, which could be either alerting events or cancelling events. The service again linked the data with the boots’ source and sent an alert that additionally included the severity and the type of the event (e.g. “FR limping”, “Moderate”).
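
As an illustration of the gas rule, the sketch below raises an alert when a concentration stays above a threshold for longer than the allowed exposure time. The 300 ppm / 3 min figures follow the example in the text, but the rule structure and names are our assumptions; the ERE actually evaluates such rules semantically on top of GraphDB.

```python
# Minimal sketch: a duration-over-threshold gas exposure rule.
from dataclasses import dataclass

@dataclass
class GasRule:
    gas: str
    max_ppm: float      # concentration threshold
    max_minutes: float  # allowed exposure time above the threshold

def check(readings, rule):
    """readings: (minute, ppm) pairs; alert if the concentration stays above
    the threshold for longer than the allowed exposure time."""
    over_since = None
    for minute, ppm in readings:
        if ppm > rule.max_ppm:
            over_since = minute if over_since is None else over_since
            if minute - over_since >= rule.max_minutes:
                return {"alert": f"{rule.gas} exposure limit exceeded",
                        "severity": "Critical"}  # hypothetical severity label
        else:
            over_since = None
    return None

co_rule = GasRule("Carbon monoxide", max_ppm=300.0, max_minutes=3.0)
print(check([(0, 120), (1, 350), (2, 360), (3, 340), (4, 330)], co_rule))
```
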
The results were quite encouraging: the integration with the other components was successful, the tested alerts were visualised in the COP, and the overall evaluation from the users was positive.
As for the Gas Sensor (ALPES LASERS, TUW), it was used in a scenario where the toxic gas levels were below the critical thresholds. The position of the FR carrying the gas sensor was recorded and transmitted to the COP, then displayed as a green area, indicating that the region was free of toxic gases. Furthermore, the low-battery warnings and the toxic-gas alerts were also tested in mock-up conditions.
The Smart Insole Boots are one of the innovative wearable technologies used to support the FR of the Future. The patented smart insole technology is developed by CyRIC, an SME based in Nicosia, Cyprus. By placing the insoles inside operational FR boots, they instantly become smart boots. The Smart Insole Boots integrate sensors and edge-processing algorithms to detect critical and life-threatening events for the FR. Their wearable and completely wireless operation makes the Smart Insole Boots a non-intrusive and indispensable tool for First Responders.


The First Responders responsible for wearing the insoles during the 2nd FSX were members of the Hellenic Rescue Team of Attica (HRTA). Having acquired the appropriate training in the previous SSTs and the 1st FSX, the users utilised the tool effectively and demonstrated its effectiveness by performing gait-related events such as crawling and heavy-load activities during the demonstration. The heavy-load event was triggered while the user was lifting a victim and carrying him away from the affected area. The insoles successfully detected the events and forwarded them through the INGENIOUS NGIT to the ERE and the COP, informing the command center and the team leader about the situation and enhancing situational awareness.

The information flow involved several of the NGIT tools from different INGENIOUS partners and can be summarised in the following steps (a minimal sketch of the flow is given after the list):
- Smart Insole Boots (CyRIC) detect critical events (FR immobilised)
- Smart Uniform (TEK) collects and forwards such events using the Field Comms (ICCS)
- Fusion Engine (EXUS) and Expert Reasoning (CERTH) process events to generate Alerts
- Common Operation Picture (STWS) visualises Alerts
- Field Teams receive Alerts and information needed to respond and assist the immobilised FR
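
Purely as an illustration of this flow; every function and field name below is our invention, not the project's API:

```python
# Minimal sketch of the event flow:
# boot event -> fusion/reasoning -> alert -> COP visualisation.
def detect_event(insole_samples):                 # Smart Insole Boots (CyRIC)
    """Flag the FR as immobilised if no step is seen in the sample window."""
    return {"type": "FR_IMMOBILISED"} if not any(insole_samples) else None

def reason(event):                                # Fusion Engine (EXUS) / ERE (CERTH)
    if event and event["type"] == "FR_IMMOBILISED":
        return {"alert": "FR immobilised", "severity": "Critical"}

def visualise(alert):                             # COP (STWS)
    if alert:
        print(f"COP alert: {alert['alert']} ({alert['severity']})")

visualise(reason(detect_event([0, 0, 0, 0])))     # -> COP alert: FR immobilised ...
```
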
HRTA participated not only in the Boots testing but also throughout the field activities, from initial setup and testing during the first days to the official FSX on the main event day on the 10th of November. HRTA acted as a field team during the scenarios, primarily for FR biometrics monitoring via wearables, e.g. health status, heavy load and possible injury/incapacitation, providing automatic alerts to the team leader in the field as well as to the team commander via the COP platform.
One more INGENIOUS tool played a significant role in the 2nd FSX. The Augmented Reality Response Platform – ARRP (CS GROUP) was used to demonstrate near-real-time annotation and status updates between the field team and the command center, improving the situational awareness, safety and efficiency of the FR team deployed inside the “hot zone”. Two types of FRs used the AR application deployed on the HoloLens. The AR device display was also shown on an external screen so the audience could witness what the FR was seeing and doing in AR.

First, a member of the rescue team took part in the extraction of the victim. The FR equipped with the AR device used a voice command to annotate the specific location of the victim. The FR also marked the positions of hazards encountered during the mission, using the AR hand menu to select the appropriate risk annotation. The AR annotations were also shared with the COP and displayed on its map.


Then, once the incident was secured, a member of the police investigation team used the AR application to handle the crime scene and search for evidence. The investigator was able to see, both in the environment and on the AR map, previously created annotations (from the rescue team) and new ones created by the COP operator (for instance, to indicate the victim’s and suspect’s cars and the “common approach path”). He then used the AR application to mark and share points of interest linked to possible evidence, and was also able to see the position of one of his team members (who was using the IPS INGENIOUS component) on the AR map.

After the execution of the scenario, the participants were able to test the AR application for themselves to gain a deeper understanding of the tool’s capabilities and its potential to share geo-localised information in real time, keeping in mind the current limitations of the hardware (the AR device) itself.

The police team arrived at that point and used the ARRP and the Integrated Positioning System – IPS (DLR) to investigate the incident and collect information. IPS was demonstrated to transmit the carrier’s position and user-triggered images to the COP in real time. A point cloud generated by IPS was also successfully displayed in the COP.


First, IPS was referenced to the global coordinates of the AprilTags provided by MACS-SaR in front of the main hall, as already successfully practised during SST#14. After co-registration, IPS was given to a first responder, who examined the two cars while taking photos with annotated position and angle. The user was able to demonstrate that the photos appeared in the COP in real time at the corresponding locations on the map (as position tags with a photo symbol, as shown in Figure 37) and that each photo could be clicked to view its content. The position of the first responder carrying IPS was also displayed in real time on the COP during the exercise (blue police symbol in the center of Figure 37). In addition, the point cloud recorded by IPS was displayed in the COP as a coloured point cloud. Figure 37 clearly shows the two cars (in red, orange and yellow points) at the correct position; even the open door of the exploded (left) car is visible in the point cloud.


The COP was the central platform that collected all the information from the different tools in real time during the operation and helped the command center to manage the mission.

A final search for victims was conducted using the Helmet (KIRO), which provided live footage to the Command Center. All victims were triaged via the Triage application (ICCS) and prioritised for hospitalisation.




End Users’ perspective from using, testing and validating the INGENIOUS NGIT
End Users provided useful feedback regarding the functionality and usability of the tools, testing them in real-world conditions. Comments were overall positive. For our partners at Södertörns brandförsvarsförbund (SBFF), who are also the lead beneficiary for deliverable D1.9 End User Requirements (Revisited), the challenge of managing expectations became clear, as the tools are at different stages of development, ranging from laboratory prototypes to field-ready equipment. Despite these differences, the project managed to create a proof-of-concept toolkit that combines all the input from the smart tools into one main situational-awareness system, which was successfully tested during the 2nd FSX.

Officers from the Police Service of Northern Ireland (PSNI) tested and validated the IPS, which they found simple to use; the information, when displayed in the COP, was accurate and correlated with data obtained from the other pieces of equipment tested during the exercise. DLR explained that, although the tool they were demonstrating was a prototype, a deployable system for First Responders would be much more compact and fully integrated into a bespoke helmet.
As a Law Enforcement Agency, PSNI considered a few fundamental questions such as: “Where could we use this technology?”
They identified two main areas of business:
- Firstly, in locations where no GPS location data is available. An example would be a tunnel or a collapsed structure where voids are present and cannot be correlated accurately for excavation or extraction; this could easily apply to natural disasters such as an earthquake or subsidence after heavy rainfall. A collapsed building such as an underground car park would likewise provide ideal conditions for this type of mapping if the incident were of a terrorist or criminal nature.
- Secondly, when an incident has progressed from the post-rescue to the recovery and investigatory phase. An example of this could be a crime scene where there is a real and present danger to responders entering to carry out an assessment of the scene. The IPS data could be overlaid with previously known data to find anomalies and structural weaknesses. During the 2nd FSX testing, DLR and First Responders discussed future iterations of the technology, suggesting that the IPS could be mounted on an unmanned ground vehicle to further protect First Responders from entering a heavily contaminated scene. This would have applications at a CBRN incident or an industrial accident; working in sites such as nuclear power plants poses significant issues, where shielding materials make accurate localisation data difficult to obtain.

PSNI’s tester noted that the application provided by DLR would be a valuable tool to protect officers in their initial response to a major incident. The tool’s capabilities can be utilised to better inform and assist all First Responders and government agencies from the recovery and rescue process through to the investigation phase. Ultimately, this will increase confidence in the responding emergency services and help the local population resume normal life as quickly and efficiently as possible.
PSNI’s second evaluation involved the Augmented Reality goggles. Their tester noted that the goggles have evolved throughout the course of the project into a tool that will bring real benefit to many different First Responders. He was particularly impressed with the ease with which he was able to add and remove annotations, which could be displayed both on the map and on an external screen, and noted that this functionality will enhance situational awareness for commanders and other agencies when they enter a disaster or crime scene. The ease of use of this tool was impressive, and it responded instantly to voice commands. Moving forward, PSNI would be very interested in having this technology integrated into a helmet, which would greatly add to the safety of responders in the field.

For our partners at Ertzaintza (ERTZ), who were the hosts of the 2nd FSX, testing the INGENIOUS tools and components has been both an enriching and a challenging experience. Although the technical solutions and tools dazzle individually, seeing them act together in practice demonstrates the triumph of, and the need for, multidisciplinary and transversal activity in research and development. Not all the solutions presented are applicable to security agencies such as Ertzaintza; then again, the police are not the only ones involved in real practical situations.
The different departments of the Basque police have positively evaluated the INGENIOUS solutions that are of interest to them and applicable to their daily work and the special cases they manage. Although they are aware that not all the solutions are at a stage of development and maturity that makes them deployable in the short or medium term, they will monitor those considered most relevant to their work with society.
Both the preparation and the development of the activity in the final exercise demonstrated the know-how, professionalism, human quality and empathy of all participants, as well as respect for fundamental rights and ethical values. Technology alone can bring knowledge and solutions, but if it does not take people into account it is neither valid nor useful.

Tabletop demonstration of the INGENIOUS technologies
Following the successful completion of the 2nd FSX, the INGENIOUS technologies were made available to interested stakeholders in a tabletop demonstration. The technical partners responsible for their development provided clarifications and bilateral demonstrations of all the advanced features and benefits of the technologies.
2nd FSX Video presentation
The following video presents the results of the 2nd FSX. Watch it for more!
INGENIOUS 2nd International Workshop on “Tools for the First Responder of the Future”
The INGENIOUS partners had the opportunity to present the INGENIOUS NGIT during the INGENIOUS 2nd International Workshop on “Tools for the First Responder of the Future”, which took place one day earlier, on the 9th of November, in hybrid mode. The presentations focused mainly on the final functionalities and benefits for the First Responders. Videos documenting the results achieved during the project’s Integration, Testing and Field Validation activities were also presented. The INGENIOUS End Users also actively participated in the Workshop, sharing their experiences and impressions from using the advanced prototypes of the INGENIOUS tools and components in the testing field. The second part of the Workshop included an open discussion with 4 cluster projects whose technologies are at a similar stage of development. Overall, both the 2nd FSX and the 2nd International Workshop were successfully conducted, yielding valuable feedback and insights for the future steps of the project.