Paulini, M. S., Duran, D., Rice, M., Andrekanic, A., & Suri, N. (2023). KENNEL Threat Detection Boxes for First Responder Situational Awareness and Risk Management. In Jaziar Radianti, Ioannis Dokas, Nicolas Lalone, & Deepak Khazanchi (Eds.), Proceedings of the 20th International ISCRAM Conference (pp. 208–219). Omaha, USA: University of Nebraska at Omaha.
Abstract: KENNEL is a deployable IoT-based system consisting of a network of unattended ground sensors, known as Threat Detection Boxes (TDBs), which may be outfitted with a variety of custom and commercial off-the-shelf sensors for hazard detection. The KENNEL system fills a technological gap in sensor fusion, interpretation, and real-time alerting via existing information management systems such as the Team Awareness Kit (TAK). First responders face a critical need for improved situational awareness, detection, and response to hazardous events. KENNEL provides a first-of-its-kind, low-cost sensing and data fusion platform that is highly extensible, configurable, and self-sustaining, opening modernization and innovation possibilities across the first responder domain. TDBs may be deployed statically or ad hoc, improving flexibility, stand-off hazard detection, and resilience in the operational domain. From critical infrastructure monitoring to wearables, the system delivers critical information in a timely manner for effective risk management and increased personnel safety.
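To make the TAK integration concrete, the following is a minimal, hypothetical sketch of how a TDB might fuse sensor readings and emit a Cursor-on-Target (CoT) event, the XML message format consumed by TAK. The fusion rule, sensor names, thresholds, and CoT type string are illustrative assumptions, not the KENNEL implementation, which the paper does not publish.

```python
# Hypothetical sketch (not the KENNEL implementation): fuse readings from a
# Threat Detection Box and emit a TAK-compatible Cursor-on-Target (CoT)
# alert when a hazard threshold is crossed.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
import xml.etree.ElementTree as ET

@dataclass
class SensorReading:
    sensor: str       # e.g. "co_ppm" (assumed sensor name)
    value: float
    threshold: float  # alert when value exceeds this (assumed rule)

def fuse(readings: list[SensorReading]) -> bool:
    # Naive fusion rule for illustration: alert if any sensor exceeds its threshold.
    return any(r.value > r.threshold for r in readings)

def cot_alert(uid: str, lat: float, lon: float) -> bytes:
    # Minimal CoT 2.0 event; TAK clients consume XML messages of this shape.
    now = datetime.now(timezone.utc)
    event = ET.Element("event", {
        "version": "2.0", "uid": uid,
        "type": "a-h-G",  # assumed "hostile/ground" type; real typing is mission-specific
        "time": now.isoformat(), "start": now.isoformat(),
        "stale": (now + timedelta(minutes=5)).isoformat(), "how": "m-g",
    })
    ET.SubElement(event, "point", {"lat": str(lat), "lon": str(lon),
                                   "hae": "0", "ce": "10", "le": "10"})
    return ET.tostring(event)

readings = [SensorReading("co_ppm", 120.0, 35.0)]
if fuse(readings):
    print(cot_alert("tdb-07", 41.26, -96.01).decode())
```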
Tolt, G., Rydell, J., Bilock, E., Eek, J., Andersson, P., & Nygårds, J. (2023). Real-time Multi-Sensor Positioning for First Responders. In Jaziar Radianti, Ioannis Dokas, Nicolas Lalone, & Deepak Khazanchi (Eds.), Proceedings of the 20th International ISCRAM Conference (pp. 177–187). Omaha, USA: University of Nebraska at Omaha.
Abstract: This paper describes a concept for real-time positioning of first responders that combines several complementary sensors worn by the first responder to increase accuracy and robustness in indoor and complex environments. By using sensors of different types, each with its own strengths and limitations, and fusing their respective outputs, the goal is to increase the usability of positioning information in time-critical and risky operations. This facilitates synchronization of activities and increases safety during the operation. The sensors included in the proposed real-time positioning module are shoe-mounted inertial measurement units, ultra-wideband radio, thermal and visual cameras, and GNSS. The fusion framework is based on factor graphs. This work-in-progress paper describes the individual sensor components and presents preliminary findings on the potential to improve position estimation through sensor fusion.
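As a toy illustration of the factor-graph idea (not the authors' framework), the sketch below fuses relative step measurements, as from a shoe-mounted IMU, with a single absolute GNSS-like fix by solving a weighted linear least-squares problem; each row of the whitened system corresponds to one factor. All values and noise levels are assumed.

```python
# 1-D factor-graph toy: IMU step lengths give relative factors between
# consecutive poses; one GNSS-like fix gives an absolute factor. The MAP
# estimate of a linear-Gaussian factor graph is the weighted least-squares
# solution. All numbers are made up.
import numpy as np

n = 4                                  # poses x0..x3
steps = [1.0, 1.1, 0.9]                # IMU step lengths (relative measurements)
gnss = (0, 0.2)                        # absolute fix: pose 0 observed at 0.2 m
sigma_step, sigma_gnss = 0.1, 0.5      # assumed measurement noise (m)

rows, rhs, weights = [], [], []
for i, s in enumerate(steps):          # relative factor: x_{i+1} - x_i = s
    r = np.zeros(n); r[i], r[i + 1] = -1.0, 1.0
    rows.append(r); rhs.append(s); weights.append(1.0 / sigma_step)
r = np.zeros(n); r[gnss[0]] = 1.0      # absolute factor: x_0 = 0.2
rows.append(r); rhs.append(gnss[1]); weights.append(1.0 / sigma_gnss)

A = np.array(rows) * np.array(weights)[:, None]   # whiten each row by 1/sigma
b = np.array(rhs) * np.array(weights)
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)   # fused pose estimates along the trajectory
```

Production positioning systems replace this linear toy with nonlinear factors over full 6-DoF poses and dedicated solvers such as GTSAM or Ceres.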
Lorscheidt, J., Wehbe, B., Cesar, D., Becker, T., & Vögele, T. (2023). Increasing diver safety for heavy underwater works by Sonar-to-Video Image Translation. In Jaziar Radianti, Ioannis Dokas, Nicolas Lalone, & Deepak Khazanchi (Eds.), Proceedings of the 20th International ISCRAM Conference (pp. 166–176). Omaha, USA: University of Nebraska at Omaha.
Abstract: Supervision of technical dives is particularly important in emergency and disaster response operations to ensure the safety of divers in unexplored locations with uncertain conditions. Diver monitoring relies primarily on voice communication and a video stream that gives the operator a first-person view of the diver. In many cases, however, underwater visibility can drop to just a few centimeters, leaving the diver able only to feel their way with their hands and the operator dependent solely on voice communication, making it very difficult for either of them to identify upcoming hazards. In the DeeperSense research project, we attempt to reduce the limitations caused by poor underwater visibility by using sonar in combination with an AI-based algorithm that translates the sonar signal into a visual image, one that is independent of water turbidity and gives an overview of the situation where the eye can no longer see anything. Laboratory results show that visual information can be recovered from sonar data.
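The paper does not detail its network architecture here, so the sketch below is only a generic convolutional encoder-decoder of the kind commonly used for image-to-image translation, with assumed tensor shapes: a one-channel sonar frame in, a three-channel visual image out.

```python
# Minimal sonar-to-image translation sketch in PyTorch (generic architecture,
# not the DeeperSense model): a convolutional encoder-decoder mapping a
# 1-channel sonar frame to a 3-channel visual image.
import torch
import torch.nn as nn

class Sonar2Video(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(                  # downsample sonar frame
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(                  # upsample to RGB image
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, sonar: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(sonar))

model = Sonar2Video()
sonar_frame = torch.randn(1, 1, 128, 128)   # batch of one 128x128 sonar image
rgb = model(sonar_frame)                     # -> (1, 3, 128, 128) in [-1, 1]
print(rgb.shape)
```

In practice, such translators are typically trained on paired sonar/camera recordings, often with an adversarial (pix2pix-style) objective; the paper's laboratory results indicate that the necessary visual information is recoverable from sonar data.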