While scientists have explored methods for modeling and simulating such events and their effects, significant limitations remain. Much of the work, for example, has been done on a case-by-case basis, with little effort to integrate data on historical and unfolding hazards. Moreover, the data that is accumulated is often incomplete, recorded in diverse forms, or proprietary – making reliable damage assessment difficult.
To address these limitations, researchers in the Mathematics and Computer Science Division at Argonne National Laboratory have developed a flexible framework, called the Hazard Impact Framework, or HIF, that assesses the impact of hazards on key infrastructure and describes the initial state of that infrastructure – for example, which power lines are down or which plants are damaged. This description can then be used for dynamic simulations to determine the level of service remaining in the damaged system, identify cascading effects, evaluate likely outcomes, and guide infrastructure planning strategies.
Central to HIF are two curated collections. One collection includes information about the type of asset (e.g., a wind turbine), its function, and its physical characteristics. The other collection includes “fragility” data, which provides a quantitative mapping between the local effect of the hazard and the viability of the asset.
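The two collections might be pictured as follows. This is a minimal sketch, not HIF's actual data model: the asset fields, the fragility curve, and the catalog keyed by asset and hazard type are all illustrative assumptions.

```python
# Hypothetical sketch of HIF's two curated collections: an asset catalog
# (type, function, physical characteristics) and a fragility catalog that
# maps local hazard intensity to the probability the asset fails.

from dataclasses import dataclass

@dataclass
class Asset:
    asset_id: str
    asset_type: str      # e.g., "wind_turbine"
    function: str        # role the asset plays in the infrastructure
    height_m: float      # a physical characteristic relevant to fragility

def fragility_wind_turbine(wind_speed_ms: float) -> float:
    """Toy fragility curve: failure probability vs. local wind speed (m/s)."""
    # Piecewise-linear mapping between local hazard effect and viability.
    if wind_speed_ms <= 30.0:
        return 0.0
    if wind_speed_ms >= 70.0:
        return 1.0
    return (wind_speed_ms - 30.0) / 40.0

# Fragility catalog: (asset type, hazard type) -> fragility function.
# Linking an asset to its fragility is done once, here, not per scenario.
FRAGILITY = {("wind_turbine", "hurricane_wind"): fragility_wind_turbine}

turbine = Asset("WT-017", "wind_turbine", "generation", height_m=90.0)
p_fail = FRAGILITY[(turbine.asset_type, "hurricane_wind")](55.0)
print(round(p_fail, 3))  # 0.625
```

Because the asset-to-fragility link lives in data rather than code, a new scenario, hazard, or asset type amounts to new catalog entries.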
“We realized that the connection between an asset and its fragility is difficult to create from disparate and incomplete data sources, so we wanted to do this only once,” said Mark Hereld, an experimental systems engineer in the MCS Division. “With this configuration, HIF can assess a wide range of scenarios by changing data, not code. Moreover, new hazards and new infrastructure components can easily be added.”
The researchers were also concerned about automation. “Decision-makers need to be able to determine the possible effects of an emergency rapidly, so that they can have more time to consider and implement the best response,” said Kibaek Kim, an assistant computational mathematician in the MCS Division. To this end, the researchers developed flexible interfaces to a variety of data sources.
Another important feature of HIF is its clear demarcation between details specific to a given asset’s response to a particular hazard and the general notion of whether and by how much the asset has been degraded. HIF achieves this feature by exchanging generic quantities between the various software components – for example, probabilistic descriptions of infrastructure responses to stresses – rather than passing physical quantities such as surge level specific to a hazard. With this new framework, scientists can now rapidly test the impact of hypothetical scenarios and probe potential infrastructure weaknesses, gaining insights that can guide infrastructure planning and strategies.
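This separation can be illustrated with a short sketch. The function names, the surge-based fragility curve, and the capacity calculation below are assumptions for illustration, not HIF's interfaces; the point is only that hazard-specific physical quantities stay in one layer, while the rest of the system sees a generic degradation probability.

```python
# Hypothetical sketch: a hazard-specific layer converts a physical quantity
# (storm-surge depth in meters) into a generic failure probability, which is
# the only thing the downstream simulation component ever sees.

# Hazard-specific layer: knows about surge levels.
def substation_surge_fragility(surge_m: float) -> float:
    """Map surge depth (m) to a generic probability of failure."""
    return min(1.0, max(0.0, (surge_m - 0.5) / 2.0))

# Generic layer: consumes a probability; works unchanged for any hazard.
def expected_capacity(nominal_mw: float, p_failed: float) -> float:
    """Expected remaining capacity given a generic degradation probability."""
    return nominal_mw * (1.0 - p_failed)

# The hazard-specific detail is bound once, at the component boundary.
p = substation_surge_fragility(1.5)       # 0.5
print(expected_capacity(100.0, p))        # 50.0
```

Swapping in an earthquake or wildfire model changes only the first function; the generic layer is untouched, which is what lets scientists rapidly test hypothetical scenarios.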
For further information, see the paper: “Disaster = Infrastructure + Hazard,” by Mark Hereld and Kibaek Kim, Resilience Week (RWS) 2017, DOI: 10.1109/RWEEK.2017.8088668.