Next Generation Connected and Smart Cyber Fire Fighter System

Project Details
Lead PI: Manel Martinez-Ramon
Co-PI(s): Yin Yang, Ramiro Jordan
Performance Period: 07/01/16 - 06/30/18
Institution(s): University of New Mexico
Sponsor(s): National Science Foundation
Award Number: 1637092
Abstract: The goal of this project is to demonstrate that advanced information and sensor technology can improve operational efficiency and increase the security and safety of fire fighters. Existing firefighting systems will be augmented by exploiting the information capabilities of hardware and software components that can be attached to existing fire fighter equipment, with minimal physical burden and required training. The system will provide a model of the emergency scenario that allows the commander to evaluate possible alternative actions based on their experience and available resources. This situation awareness will be created from the data provided by the fighter gear (microphones, cameras, body and ambient sensors), and will include estimation of each fighter's situation (including incidents, oxygen reserve, and estimated time left to leave the scenario) and of the scenario itself (including the presence of victims and the evaluation of hazardous objects or environments, such as hot surfaces, toxic gases and others).

This proposal is highly relevant to smart and connected communities. It addresses a problem space of great relevance in emergency operations with a technology solution that faces significant research and operational challenges. The project engages technical communities, non-profit partners and local government institutions. It is a cooperation among various departments of the University of New Mexico, in collaboration with the City of Santa Fe and City of Albuquerque Fire Departments and the National Fire Protection Association. The project integrates a hardware layer that collects data from each fire fighter on duty with a software engine for data extraction and processing.
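To make the "estimated time left to leave the scenario" idea concrete, here is a minimal sketch of a per-firefighter status record and a remaining-air estimate. All names, fields, and the constant-consumption model are illustrative assumptions, not the project's actual design.

```python
from dataclasses import dataclass

@dataclass
class FirefighterStatus:
    # Hypothetical record; fields are assumptions for illustration only.
    firefighter_id: str
    oxygen_reserve_l: float   # litres of breathable air left in the tank
    consumption_lpm: float    # current consumption rate, litres per minute
    egress_time_min: float    # estimated minutes needed to exit the scene

    def minutes_of_air_left(self) -> float:
        """Remaining air time under a constant-consumption assumption."""
        if self.consumption_lpm <= 0:
            return float("inf")
        return self.oxygen_reserve_l / self.consumption_lpm

    def must_leave_now(self, safety_margin_min: float = 5.0) -> bool:
        """Alert when remaining air barely covers egress plus a safety margin."""
        return self.minutes_of_air_left() <= self.egress_time_min + safety_margin_min

status = FirefighterStatus("ff-01", oxygen_reserve_l=600.0,
                           consumption_lpm=40.0, egress_time_min=8.0)
print(status.minutes_of_air_left())   # 15.0
print(status.must_leave_now())        # False (15 > 8 + 5)
```

A real system would of course replace the constant-rate model with sensor-driven estimates, but the alerting structure would be similar.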
Data from infrared cameras and body and ambient sensors will be interfaced to a communication node that transmits extracted and compressed information over a mesh communication structure based on software defined radio, supporting heterogeneous communication assets, instant deployment, hot reconfiguration, and resiliency and recovery abilities. The software engine will integrate machine learning based feature extraction and prediction methods that process the ambient data, audio and speech to sense each fighter's condition, detect relevant keywords whose meaning can be transmitted, or give orders to the system. Video will be processed locally to extract relevant features (civilians, hot surfaces, hazardous objects and others). Machine learning algorithms will then be used to construct the situational awareness served to the commander and fire fighters, covering both the scenario and the fire fighters' situation. The system will be tested in a variety of operational scenarios to evaluate its potential for transition and application to other emergency domains.
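The keyword-detection step described above can be sketched as a simple lookup over transcribed speech. The keyword list and event names below are illustrative assumptions; a deployed system would run this on the output of a speech recognizer rather than on plain text.

```python
# Hypothetical keyword-to-event table; the entries are assumptions
# chosen only to illustrate the keyword-spotting step.
KEYWORDS = {
    "mayday": "FIREFIGHTER_DOWN",
    "victim": "CIVILIAN_PRESENT",
    "backdraft": "FLASHOVER_RISK",
    "evacuate": "EVACUATION_ORDER",
}

def spot_keywords(transcript: str) -> list[str]:
    """Return the alert events triggered by keywords in a transcript."""
    tokens = transcript.lower().split()
    return [KEYWORDS[t] for t in tokens if t in KEYWORDS]

print(spot_keywords("Mayday mayday victim on the second floor"))
# ['FIREFIGHTER_DOWN', 'FIREFIGHTER_DOWN', 'CIVILIAN_PRESENT']
```

Only the short event codes, rather than raw audio, would then need to cross the bandwidth-constrained mesh network, which is the point of extracting and compressing information at the node.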