French, S., & Niculae, C. (2004). Believe in the model: Mishandle the emergency. In B. Carlé & B. Van de Walle (Eds.), Proceedings of ISCRAM 2004 – 1st International Workshop on Information Systems for Crisis Response and Management (pp. 9–14). Brussels: Royal Flemish Academy of Belgium.
Abstract: During the past quarter century there have been many developments in scientific models and computer codes to help predict the ongoing consequences in the aftermath of many types of emergency: e.g. storms and flooding, chemical and nuclear accidents, epidemics such as SARS, and terrorist attacks. Some of these models relate to the immediate events and can help in managing the emergency; others predict longer term impacts and thus can help shape the strategy for the return to normality. But there are many pitfalls in the way of using these models effectively. Firstly, non-scientists and, sadly, many scientists believe in the models' predictions too much. The inherent uncertainties in the models are underestimated, sometimes almost unacknowledged. This means that initial strategies may need to be revised in ways that unsettle the public and erode their trust in the emergency management process. Secondly, the output from these models forms an extremely valuable input to the decision-making process, but only one such input. Most emergencies are events that have huge social and economic impacts alongside their health and environmental consequences. While we can model the latter passably well, we are not so good at modelling economic impacts and very poor at modelling social impacts. Too often our political masters promise the best 'science-based' decision making and realise too late that the social and economic impacts need addressing. In this paper, we explore how model predictions should be drawn into emergency management processes in more balanced ways than has often occurred in the past.
|
Rustenberg, K., Radianti, J., & Gjøsæter, T. (2023). Exploring Demons for the Establishment of Team Situational Awareness. In J. Radianti, I. Dokas, N. Lalone, & D. Khazanchi (Eds.), Proceedings of the 20th International ISCRAM Conference (pp. 636–648). Omaha, USA: University of Nebraska at Omaha.
Abstract: Individual situational awareness (SA) is crucial for building team SA, which is necessary for achieving a shared understanding of a situation, making informed decisions, and taking appropriate actions. This article examines the communication barriers that emerge when transitioning from individual to team SA in emergency management scenarios. We observed two emergency exercises: one on “ongoing life-threatening violence” and one on a dam failure causing hospital congestion. The study was complemented with interviews with participants in these exercises, aimed at identifying barriers known as SA-demons in the team setting. We discovered barriers that hinder the establishment of team SA, including a vicious cycle of mistrust, a fragmented information trap, a false feeling of mastery trap, and a decaying memory trap. These barriers can stem from individual, organizational, or technological factors. To complement existing SA theories, we applied the Cynefin framework and found that standard operating procedures can become barriers when transitioning into chaotic or complex domains.
|