3rd round of INGENIOUS Laboratory Integration Tests (LITs) #19-22, #23, #24-27
The 3rd round of Laboratory Integration Tests (LITs) of the INGENIOUS project has been successfully carried out. LIT#23 was held with physical participation on 28-30 March 2022, and the other LITs (#19-22, #24-27) were conducted remotely on 5-7 April 2022. This is the last round planned in the INGENIOUS project's incremental and iterative integration path, which runs from Laboratory Integration Tests (LITs) through Small Scale Tests (SSTs) to Full Scale Exercises (FSXs). On this occasion, unlike the previous ones, the scope was extended beyond the laboratory: given the level of development, interdependence, and integration reached by the components, the tests focused not only on connecting specific tools but also on preparing for the 1st FSX.
LITs #19-22, #24-27
Given the restrictions imposed by the pandemic, most of the integration activities were carried out with remote participation of all technological partners and their corresponding tools, operating in a joint scenario and conducted as a sequence of different use cases that verified the correct interaction between all the elements. Specifically, the LITs and the components tested remotely were: LIT #19 Helmet – Augmented Reality Services; LIT #20 Gas Sensor; LIT #21 Uniform – Boots; LIT #22 K9 Vest; LIT #24 Triage App / DVI & Face Recognition App; LIT #25 Fusion Engine & Expert Reasoning – Worksite Operations App; LIT #26 Social Media App; LIT #27 COP Platform & C3 – Multilanguage Operations App.
Main use cases included: element registration; association to First Responders (FRs); field deployment; search for people in dense smoke through normal and infrared (IR) video transmission; marking of dangerous conditions, victims, and annotations via the Triage App and Augmented Reality Services; automatic identification of FRs in distress, with data aggregation and the generation, signalling, and recognition of alarms through the Boots, Uniform, Expert Reasoning, and Fusion Engine; search and rescue of people with K9 units; facial recognition; automatic translation of messages; extraction of information from social networks; coordination and control; and status monitoring. The verification was carried out live, with active participation and immediate resolution of minor issues.
LIT#23 MAX – MINs – MACS-SaR – IPS – Situational Awareness Algorithm
LIT#23 was hosted at our partner DLR’s site in Berlin, Germany. The test emphasised the integration of the INGENIOUS “Smart” Devices in the air and on the ground: the Multi-purpose Autonomous eXploring drone (MAX), the Micro Indoor droNes (MINs), the Modular Aerial Camera System – Search and Rescue (MACS-SaR), the Deployable Integrated Positioning System (IPS), and the Situational Awareness Algorithm.
During LIT#23 the MAX drone (developed by FOI) captured data using its on-board lidar, cameras and inertial sensors. The sensor data was processed in real time to estimate the drone’s position, build a 3D map of the environment and provide images for further analysis and scene assessment. Data was shared in collaboration with ITC, DLR and SINTEF, resulting in geo-referenced images, reference 3D point clouds and 3D occupancy grids, produced using a combination of reference AprilTags and MAX’s on-board positioning system.
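To make the mapping step concrete, the sketch below shows one common way to derive a 3D occupancy grid from a lidar point cloud by voxelisation. It is a minimal Python illustration, not the actual INGENIOUS pipeline; the array shapes and voxel size are assumptions.

```python
import numpy as np

def occupancy_grid(points, voxel_size=0.1):
    """Build a sparse 3D occupancy grid from an N x 3 point cloud.

    Each occupied voxel is identified by its integer (i, j, k) index;
    the set can later be rasterised or meshed for visualisation.
    """
    # Quantise each point to the index of the voxel that contains it.
    indices = np.floor(points / voxel_size).astype(np.int64)
    # Unique voxel indices correspond to occupied cells.
    occupied = np.unique(indices, axis=0)
    return occupied

# Example: a synthetic point cloud of 1000 random points in a 5 m cube.
cloud = np.random.rand(1000, 3) * 5.0
grid = occupancy_grid(cloud, voxel_size=0.25)
print(f"{len(grid)} occupied voxels out of {int(5 / 0.25) ** 3} possible")
```

The occupied voxel indices can then be visualised alongside the geo-referenced images and reference point clouds.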
The MIN testing (SINTEF) during LIT#23 had two main objectives: (1) to show the deployment of the drones and the tracking of a moving target and (2) to verify the communication protocols. Several MIN drones were deployed from a referenced point by the base station and commanded to fly a path between waypoints that were manually inserted on a map provided by the MAX drone. After landing at the targeted waypoint, each MIN’s onboard Ultra-Wideband (UWB) beacon automatically changed from receiver mode to transmitter mode, so that the MINs could form a mesh network to enable localisation of a moving First Responder (FR). The MIN drones then relayed the position of an FR wearing a localisation device to the central WP3 station.
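Localisation with such a UWB mesh is typically done by multilateration: each landed MIN acts as an anchor at a known position, and the FR’s position is solved from the measured ranges. Below is a minimal least-squares sketch in Python; the anchor layout and range values are illustrative, not measurements from the test.

```python
import numpy as np
from scipy.optimize import least_squares

# Known positions of landed MIN anchors (metres, local frame) and the
# ranges they measured to the FR's UWB device. Values are illustrative.
anchors = np.array([[0.0, 0.0, 0.5],
                    [10.0, 0.0, 0.5],
                    [10.0, 8.0, 0.5],
                    [0.0, 8.0, 0.5]])
ranges = np.array([6.4, 7.8, 5.9, 4.1])  # noisy UWB range measurements

def residuals(p):
    # Difference between predicted anchor-to-FR distances and measurements.
    return np.linalg.norm(anchors - p, axis=1) - ranges

# Solve the nonlinear least-squares problem from the anchor centroid.
solution = least_squares(residuals, x0=anchors.mean(axis=0))
print("Estimated FR position:", solution.x)
```

With four or more anchors the problem is overdetermined, which lets the solver average out range noise.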
The MACS-SaR drone and its camera system, developed by our partner DLR, can be used to create a large-scale map (e.g. 1 square kilometer at 1.5 cm ground resolution in less than 20 minutes) for fast disaster overview and assessment. The system consists of a vertical take-off and landing (VTOL) drone stored in a ruggedized box (see Figure 1), an (already integrated) camera system inside the front of the drone and a ground control station (see Figure 2).
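A quick back-of-the-envelope calculation shows what the quoted figures imply. The camera resolution and overlap used below are illustrative assumptions, not MACS-SaR specifications:

```python
# Back-of-the-envelope check of the quoted mapping performance:
# 1 km^2 at 1.5 cm ground sampling distance (GSD) in under 20 minutes.
area_m2 = 1_000_000          # 1 square kilometre
gsd_m = 0.015                # 1.5 cm per pixel
ground_pixels = area_m2 / gsd_m ** 2
print(f"Ground pixels required: {ground_pixels:.2e}")   # ~4.4e9 pixels

# Assuming (illustratively) a 50 MP camera and enough forward/side
# overlap that each image contributes roughly 40% new ground coverage.
camera_px = 50e6
effective_coverage = 0.4
images_needed = ground_pixels / (camera_px * effective_coverage)
print(f"Approx. images needed: {images_needed:.0f}")    # ~220 images
```

Covering on the order of a couple of hundred images in under 20 minutes is what makes the fast post-landing processing described below worthwhile.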
During LIT#23, MACS-SaR delivered a large-scale operational picture, which was sent to the Ground Control Station (see Figure 3) for visualization and fast disaster assessment.
MACS-SaR was also used to triangulate coded markers and tie them to precise coordinates in a world reference frame (e.g. WGS84, UTM, …). These markers can then be observed by all the other drones and components tested (MINs, MAX, IPS) to transfer and visualize all the data products combined in a common coordinate system. The marker coordinates are derived as part of a fast and robust aerotriangulation directly after landing and delivered within a few minutes for further use (see Figure 4).
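Transferring marker coordinates between the reference frames mentioned above (WGS84 latitude/longitude and UTM) is a standard map projection step. The sketch below uses the pyproj library as one common choice; whether the project uses pyproj, and the marker coordinates shown, are assumptions:

```python
from pyproj import Transformer

# Transform a georeferenced marker from WGS84 (lat/lon) to UTM zone 33N,
# which covers Berlin. EPSG:32633 is WGS 84 / UTM zone 33N.
to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32633", always_xy=True)

marker_lon, marker_lat = 13.4050, 52.5200  # illustrative Berlin coordinates
easting, northing = to_utm.transform(marker_lon, marker_lat)
print(f"UTM 33N: E={easting:.2f} m, N={northing:.2f} m")

# The inverse transform recovers lat/lon for components that expect WGS84.
to_wgs84 = Transformer.from_crs("EPSG:32633", "EPSG:4326", always_xy=True)
lon, lat = to_wgs84.transform(easting, northing)
```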
ITC is the partner responsible for the development of the Ground Control Station (GCS), which was designed to provide FRs with enhanced situational awareness. Researchers at ITC designed a novel method that uses harmonious composite victim images to train a victim detection tool [1]. This tool was also tested during LIT#23, and it proved useful for finding partially buried victims.
Besides providing scene understanding algorithms, the GCS also served as a platform for data exchange and integration in this test. It received the global coordinate system created by MACS-SaR and converted MAX’s local positions into global ones, thus registering two of the INGENIOUS drones in the same coordinate system. In addition, its internal SFTP server enabled other devices to exchange data; for example, it allowed the MINs to download MAX’s point clouds as a prior map for navigation. As the only data interface for communication with other INGENIOUS components, such a data registration and integration scheme is important because it simplifies data fusion and visualization across the whole INGENIOUS project. In conclusion, ITC’s GCS played an important role in this LIT, and the test was successful. The GCS’s inter-WP communication functions will be tested in further full-scale tests.
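As an illustration of the SFTP-based exchange, the following Python sketch shows how a MIN could fetch MAX’s point cloud from the GCS using the paramiko library. The host name, credentials and file paths are hypothetical placeholders, not project endpoints:

```python
import paramiko

# Hypothetical endpoint and paths; the real GCS addresses are internal
# to the project and not published.
GCS_HOST = "gcs.example.org"
REMOTE_MAP = "/data/max/pointcloud_latest.ply"
LOCAL_MAP = "max_prior_map.ply"

# Open an SFTP session to the GCS and fetch MAX's point cloud, which a
# MIN can then load as a prior map for waypoint navigation.
transport = paramiko.Transport((GCS_HOST, 22))
try:
    transport.connect(username="min_drone", password="********")
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.get(REMOTE_MAP, LOCAL_MAP)
    sftp.close()
finally:
    transport.close()
```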
Reference: [1] Zhang, Ning, Francesco Nex, George Vosselman, and Norman Kerle. “Training a Disaster Victim Detection Network for UAV Search and Rescue Using Harmonious Composite Images.” Remote Sensing 14, no. 13 (2022): 2977.
The deployable Integrated Positioning System (IPS) for self-localization and environmental mapping was the last component tested during LIT#23. IPS is designed to track and trace first responders and offer seamless navigation indoors and outdoors in an absolute reference system, while also creating a map of the surrounding environment and detecting and tracking persons and assets in real time.
IPS is a portable, passive system for ego-motion determination (position and attitude), which is being developed by our partners at DLR. It is designed for non-cooperative environments (without infrastructure, external referencing or maps). IPS delivers an accurate trajectory through optimal fusion of stereo vision and inertial navigation and is suitable for indoor and outdoor applications in real time. The developed workflow supports continuous FR self-localization and environmental mapping both indoors and outdoors.
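IPS’s actual fusion is an optimal filter over stereo vision and inertial measurements; the toy Python class below only illustrates the underlying predict/correct pattern, with a simple complementary blend standing in for the real estimator:

```python
import numpy as np

class SimpleVioFusion:
    """Toy complementary fusion of inertial dead reckoning and stereo
    visual odometry. Illustrative only; IPS uses an optimal filter."""

    def __init__(self, blend=0.05):
        self.position = np.zeros(3)
        self.velocity = np.zeros(3)
        self.blend = blend  # weight given to the visual position fix

    def predict(self, accel, dt):
        # Inertial prediction: integrate acceleration (drifts over time).
        self.velocity += accel * dt
        self.position += self.velocity * dt

    def correct(self, visual_position):
        # Visual correction: pull the drifting inertial estimate towards
        # the stereo-vision position fix.
        self.position += self.blend * (visual_position - self.position)
```

Inertial dead reckoning drifts quickly on its own, while visual fixes arrive at a lower rate but drift far more slowly; combining the two is what yields an accurate continuous trajectory.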
During LIT#23, IPS was tested both in its own functions and in its integration with the other INGENIOUS “Smart” devices in the air and on the ground. In order to transmit its position, local point cloud and georeferenced images in global coordinates, the test run started in front of the building, where the tag plates (AprilTags) georeferenced by MACS-SaR were imaged for an automatic coregistration of IPS. From this point on, IPS navigated without any external localisation system (using only its IMU and cameras) while still transmitting global coordinates to the Common Operational Picture (COP) system.
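The coregistration step can be summarised with homogeneous transforms: observing a tag whose world pose is known fixes the camera’s world pose, and every subsequent locally tracked pose can then be lifted into world coordinates. A minimal Python sketch, with illustrative frame conventions:

```python
import numpy as np

def coregister(T_world_tag, T_cam_tag):
    """Return the camera pose in world coordinates from a tag sighting.

    T_world_tag: 4x4 pose of the AprilTag in the world frame, as
                 georeferenced by MACS-SaR.
    T_cam_tag:   4x4 pose of the same tag in the camera frame, as
                 reported by an AprilTag detector.
    """
    # T_world_cam = T_world_tag * inv(T_cam_tag)
    return T_world_tag @ np.linalg.inv(T_cam_tag)

def lift_to_world(T_world_cam0, T_local_cam0, T_local_cam_now):
    """Convert a pose from the local tracking frame to world coordinates,
    using the camera pose at coregistration time expressed in both frames.
    """
    T_world_local = T_world_cam0 @ np.linalg.inv(T_local_cam0)
    return T_world_local @ T_local_cam_now
```

After this one-time alignment, the purely IMU- and camera-based trajectory can be reported in global coordinates without any further external reference.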
A test run was performed through the first floor and the basement of the building, continuing through the round meeting room and two small rooms next to the staircase.
During the walk, the IPS position and georeferenced, user-triggered images were transmitted to the WP3 server, which will relay them to the COP in the future. As the large WiFi antenna was not available for this LIT, the connection dropped during the run because of the large distances travelled. However, Small Scale Test #5 (SST#5) had already shown that transmission over such distances is possible with the large WiFi antenna.
Overall, the 3rd round of the INGENIOUS LITs was successfully conducted. With the execution of these tests, notable progress has been made in integrating the latest advances achieved in each of the tools (functionalities, modularity, miniaturization, enclosures, increased processing power, new sensing capabilities, etc.) towards the joint and coordinated operation demonstrated in the 1st FSX with the participation of end users.