The flooding of rivers and coastlines is the most frequent and damaging of all natural hazards. Between 1980 and 2016, total direct damages exceeded $1.6 trillion, and at least 225,000 people lost their lives. Recent events causing major economic losses include the 2011 river flooding in Thailand ($40 billion) and the 2012 coastal floods in the United States caused by Hurricane Sandy (over $50 billion). Flooding also triggers great humanitarian challenges. The 2015 Malawi floods were the worst in the country’s history and were followed by food shortages across large parts of the country. Flood losses are increasing rapidly in some world regions, driven by economic development in floodplains and by increases in the frequency of extreme precipitation events and in global sea level due to climate change. The largest increase in flood losses is seen in low-income countries, where population growth is rapid and many cities are expanding quickly. At the same time, evidence shows that adaptation to flood risk is already happening, and a large proportion of losses can be contained successfully by effective risk management strategies. Such strategies may include floodplain zoning, construction and maintenance of flood defenses, reforestation of land draining into rivers, and use of early warning systems.

To reduce risk effectively, it is important to know the location and impact of potential floods under current and future social and environmental conditions. In a risk assessment, models can be used to map the flow of water over land after an intense rainfall event or storm surge (the hazard). Modeling many different potential events provides estimates of potential inundation depth in flood-prone areas. Such maps can be constructed for various scenarios of climate change based on specific changes in rainfall, temperature, and sea level.
To assess the impact of the modeled hazard (e.g., cost of damage or lives lost), the potential exposure (including buildings, population, and infrastructure) must be mapped using land-use and population density data and construction information. Population growth and urban expansion can be simulated by increasing the density or extent of the urban area in the model. The effects of floods on people and different types of buildings and infrastructure are determined using a vulnerability function. This indicates the damage expected to occur to a structure or group of people as a function of flood intensity (e.g., inundation depth and flow velocity). Potential adaptation measures such as land-use change or new flood defenses can be included in the model in order to understand how effective they may be in reducing flood risk. This way, risk assessments can demonstrate the possible approaches available to policymakers to build a less risky future.
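The hazard-exposure-vulnerability chain described above can be sketched numerically. In the sketch below, modeled inundation depth (hazard) is combined with building values (exposure) through a depth-damage curve (vulnerability); the curve, depths, and values are hypothetical placeholders, not taken from any real assessment:

```python
# A minimal sketch of a flood damage estimate. All numbers below are
# hypothetical, chosen only to illustrate the calculation.
import numpy as np

def damage_fraction(depth_m: np.ndarray) -> np.ndarray:
    """Piecewise-linear depth-damage curve: no damage at 0 m, total loss at 6 m."""
    depths = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])      # inundation depth (m)
    fractions = np.array([0.0, 0.2, 0.4, 0.6, 0.85, 1.0])  # damaged share of value
    return np.interp(depth_m, depths, fractions)

# Modeled inundation depth (m) at four buildings and their replacement
# values (in millions) from a land-use/exposure map.
depth = np.array([0.3, 1.5, 0.0, 2.5])
value = np.array([1.0, 2.0, 0.5, 3.0])

damage = damage_fraction(depth) * value
print(f"Total direct damage: {damage.sum():.2f} million")
```

Rerunning the same calculation with a denser urban exposure layer, or with depths reduced by a proposed flood defense, is how such a model compares adaptation options.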
Brenden Jongman, Hessel C. Winsemius, Stuart A. Fraser, Sanne Muis, and Philip J. Ward
Throughout history, flood management practice has evolved in response to flood events. This heuristic approach has yielded some important incremental shifts in both policy and planning (from the need to plan at a catchment scale to the recognition that flooding arises from multiple sources and that defenses, no matter how reliable, fail). Progress, however, has been painfully slow and sporadic. A new, more strategic approach is now emerging, one that does not simply sustain an acceptable level of flood defense. Strategic Flood Risk Management (SFRM) is an approach that relies upon an adaptable portfolio of measures and policies to deliver outcomes that are socially just (when assessed against egalitarian, utilitarian, and Rawlsian principles), contribute positively to ecosystem services, and promote resilience. In doing so, SFRM offers a practical policy and planning framework to transform our understanding of risk and move toward a flood-resilient society. A strategic approach to flood management involves much more than simply reducing the chance of damage through the provision of “strong” structures, and it recognizes adaptive management as much more than simply “wait and see.” SFRM is inherently risk based and is implemented through a continuous process of review and adaptation that seeks to actively manage future uncertainty, a characteristic that sets it apart from the linear flood defense planning paradigm based upon a more certain view of the future. In doing so, SFRM accepts that there is no silver bullet for flood issues and that people and economies cannot always be protected from flooding. It accepts that flooding is an important ecosystem function and that its contribution to flood risk management is a legitimate ecosystem service.
Perhaps most importantly, however, SFRM enables the inherent conflicts as well as opportunities that characterize flood management choices to be openly debated, priorities to be set, and difficult investment choices to be made.
Floods affect more people worldwide than any other natural hazard. Flood risk results from the interplay of a range of processes. For river floods, these are the flood-triggering processes in the atmosphere, runoff generation in the catchment, flood waves traveling through the river network, possibly flood defense failure, and finally, inundation and damage processes in the flooded areas. In addition, ripple effects, such as regional or even global supply chain disruptions, may occur. Effective and efficient flood risk management requires understanding and quantifying the flood risk and its possible future developments. Hence, risk analysis is a key element of flood risk management. Risk assessments can be structured according to three questions: What can go wrong? How likely is it that it will happen? If it goes wrong, what are the consequences? Before answering these questions, the system boundaries, the processes to be included, and the detail of the analysis need to be carefully selected. One of the greatest challenges in flood risk analyses is the identification of the set of failure or damage scenarios. Often, extreme events beyond the experience of the analyst are missing from this set, which may bias the risk estimate. Another challenge is the estimation of probabilities. There are at most a few observed events for which data on the flood situation, such as inundation extent, depth, and loss, are available. That means that even in the most optimistic situation there are only a few data points against which to validate the risk estimates. The situation is even more delicate when the risk has to be quantified for important infrastructure, such as the breaching of a large dam or the flooding of a nuclear power plant. Such events are practically unrepeatable. Hence, estimating probabilities needs to be based on all available evidence, using observations whenever possible, but also including theoretical knowledge, modeling, specific investigations, experience, or expert judgment.
As a result, flood risk assessments are often associated with large uncertainties. Examples abound where authorities, people at risk, and disaster managers have been taken by surprise by unexpected failure scenarios. This is not only a consequence of the complexity of flood risk systems but may also be attributed to cognitive biases, such as overconfidence in the risk assessment. Hence, it is essential to ask: How wrong can the risk analysis be and still guarantee that the outcome is acceptable?
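One common way to condense the answers to the three questions above into a single number is the expected annual damage, the area under the loss-exceedance (risk) curve. The sketch below uses invented scenario probabilities and losses purely for illustration:

```python
import numpy as np

# Annual exceedance probability and direct loss (in millions) for a few
# modeled flood scenarios, from frequent/small to rare/large events.
# The numbers are invented for illustration.
exceedance_prob = np.array([0.1, 0.02, 0.01, 0.002])  # 10-, 50-, 100-, 500-year floods
loss = np.array([5.0, 40.0, 80.0, 200.0])

# Expected annual damage (EAD) is the area under the loss-exceedance
# curve, approximated here with the trapezoidal rule over ascending
# exceedance probabilities.
p = exceedance_prob[::-1]
dmg = loss[::-1]
ead = float(np.sum(0.5 * (dmg[1:] + dmg[:-1]) * np.diff(p)))
print(f"Expected annual damage ≈ {ead:.2f} million")
```

Note how truncating the scenario set (omitting the rare, large events beyond the analyst's experience) would directly bias this estimate downward, as the text warns.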
Marian Muste and Ton Hoitink
With a continuing global increase in flood frequency and intensity, there is an immediate need for new science-based solutions for flood mitigation, resilience, and adaptation that can be quickly deployed in any flood-prone area. An integral part of these solutions is the availability of river discharge measurements delivered in real time with high spatiotemporal density and over large areas. Stream stages and the associated discharges are the most readily perceived variables of the water cycle and the ones that ultimately determine the level of hazard during floods. Consequently, the availability of discharge records (a.k.a. streamflows) is paramount for flood-risk management because they provide actionable information for organizing activities before, during, and after floods, and they supply the data for planning and designing floodplain infrastructure. Moreover, discharge records represent the ground-truth data for developing and continuously improving the accuracy of the hydrologic models used for forecasting streamflows. Acquiring discharge data for streams is critically important not only for flood forecasting and monitoring but also for many other practical uses, such as monitoring water abstractions to support decisions in various socioeconomic activities (from agriculture to industry, transportation, and recreation) and to ensure healthy ecological flows. All these activities require knowledge of past, current, and future flows in rivers and streams. Given its importance, the ability to measure the flow in channels has preoccupied water users for millennia. Starting with the simplest volumetric methods of estimating flow, the measurement of discharge has evolved through continued innovation into sophisticated methods, so that today data can be acquired and communicated continuously in real time. There is no essential difference between the instruments and methods used to acquire streamflow data during normal conditions and those used during floods.
The measurements made during floods are, however, complex, hazardous, and of limited accuracy compared with those acquired during normal flows. The essential differences in the configuration and operation of the instruments and methods for discharge estimation stem from the type of measurements they acquire: discrete, autonomous measurements (which can be taken at any time and place) versus continuous measurements (estimates based on indirect methods developed for fixed locations). Regardless of the measurement situation and approach, the main concern of data providers for flooding (as for other areas of water resource management) is the timely delivery of accurate discharge data at flood-prone locations across river basins.
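The indirect method used at fixed locations is typically a stage-discharge rating curve, which converts a continuously recorded stage into an estimated discharge. The power-law form below is standard hydrometric practice, but the gaugings and the zero-flow stage are invented for illustration:

```python
import numpy as np

# Direct (discrete) gaugings at one station: stage h (m) and the
# discharge Q (m^3/s) measured at that stage. Values are illustrative.
stage = np.array([0.8, 1.2, 1.8, 2.5, 3.4])
discharge = np.array([4.1, 12.0, 33.0, 75.0, 160.0])

h0 = 0.5  # stage of zero flow, assumed known from the channel geometry

# Fit the rating curve Q = a * (h - h0)^b by least squares in log space:
# log Q = log a + b * log(h - h0).
b, log_a = np.polyfit(np.log(stage - h0), np.log(discharge), 1)
a = np.exp(log_a)

def rating_curve(h: float) -> float:
    """Convert a continuously recorded stage to an estimated discharge."""
    return a * (h - h0) ** b

print(f"Q at stage 2.0 m ≈ {rating_curve(2.0):.1f} m^3/s")
```

The limited accuracy during floods mentioned above shows up here as extrapolation: flood stages usually lie beyond the highest direct gauging used to fit the curve.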
Prediction of floods at locations where no streamflow data exist is a global issue because most countries do not have adequate streamflow records. The United States Geological Survey developed regional flood frequency (RFF) analysis to predict annual peak flow quantiles, for example, the 100-year flood, in ungauged basins. RFF equations are purely statistical characterizations that use historical streamflow records and the concept of “homogeneous regions.” Because limited record lengths constrain the accuracy of flood quantile estimates, a physically based solution is required; this requirement is further reinforced by the need to predict the potential impacts of a changing hydro-climate system on flood frequencies. A nonlinear geophysical theory of floods, or a scaling theory for short, focuses on river basins and abandons the “homogeneous regions” concept in order to incorporate flood-producing physical processes. Self-similarity in channel networks plays a foundational role in understanding the observed scaling, or power law, relations between peak flows and drainage areas. The scaling theory of floods offers a unified framework to predict floods both in rainfall-runoff (RF-RO) events and in annual peak flow quantiles in ungauged basins.
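As a minimal illustration of what an annual peak flow quantile such as the 100-year flood means, the sketch below fits a Gumbel (EV1) distribution to a synthetic annual peak-flow record by the method of moments; this is a textbook simplification, not the operational procedure (U.S. practice, for instance, uses log-Pearson Type III):

```python
import numpy as np

# Synthetic 60-year annual peak-flow record (m^3/s); in a real analysis
# this would be an observed gauge record.
rng = np.random.default_rng(42)
peaks = rng.gumbel(loc=300.0, scale=80.0, size=60)

# Method-of-moments Gumbel parameters.
scale = np.sqrt(6.0) * peaks.std(ddof=1) / np.pi
loc = peaks.mean() - 0.5772 * scale  # 0.5772 ≈ Euler-Mascheroni constant

def flood_quantile(return_period_years: float) -> float:
    """Peak flow exceeded on average once per return period."""
    p_exceed = 1.0 / return_period_years
    return loc - scale * np.log(-np.log(1.0 - p_exceed))

print(f"100-year flood ≈ {flood_quantile(100):.0f} m^3/s")
```

The dependence of `loc` and `scale` on a single short record is exactly the limitation the text notes: with few data points, rare quantiles are poorly constrained, and at ungauged sites no such fit is possible at all.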
Theoretical research in the course of time clarified several key ideas: (1) to understand scaling in annual peak flow quantiles in terms of physical processes, it was necessary to consider scaling in individual RF-RO events; (2) a unique partitioning of a drainage basin into hillslopes and channel links is necessary; (3) a continuity equation in terms of link storage and discharge was developed for a link-hillslope pair (to complete the mathematical specification, another equation for a channel link involving storage and discharge can be written that gives the continuity equation in terms of discharge); (4) self-similarity in channel networks plays a pivotal role in solving the continuity equation, which produces scaling in peak flows as drainage area goes to infinity (scaling is an emergent property that was shown to hold for an idealized case study); (5) a theory of hydraulic geometry in channel networks is summarized; and (6) highlights of a theory of biological diversity in riparian vegetation along a network are given. The first observational study, in the Goodwin Creek Experimental Watershed, Mississippi, discovered that the scaling slopes and intercepts vary from one RF-RO event to the next. Subsequently, diagnostic studies of this variability showed that it reflects variability in the flood-producing mechanisms. This finding led to the development of a model that links the scaling in RF-RO events with the annual peak flow quantiles featured here. Rainfall-runoff models in engineering practice use a variety of techniques to calibrate their parameters against observed streamflow hydrographs. In ungauged basins, streamflow data are not available, and in a changing climate, the reliability of historical data becomes questionable, so calibration of parameters is not a viable option. Recent progress on developing a suitable theoretical framework to test RF-RO model parameterizations without calibration is briefly reviewed.
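The event-by-event scaling slopes and intercepts discussed above come from fitting the power-law relation Q_p = alpha * A^theta between peak flow and drainage area in log-log space. A sketch with synthetic data for a single RF-RO event (all numbers hypothetical):

```python
import numpy as np

# Drainage areas (km^2) of nested gauges in a basin and the observed
# peak flows (m^3/s) for one rainfall-runoff event; values are synthetic.
area = np.array([2.0, 10.0, 50.0, 250.0, 1200.0])
peak = np.array([1.2, 3.0, 7.0, 17.0, 40.0])

# Fit log(Q_p) = log(alpha) + theta * log(A): theta is the scaling slope
# and log(alpha) the intercept, re-estimated for each event.
theta, log_alpha = np.polyfit(np.log(area), np.log(peak), 1)
alpha = np.exp(log_alpha)
print(f"scaling slope theta ≈ {theta:.2f}, intercept alpha ≈ {alpha:.2f}")
```

Repeating this fit over many events, and relating the variability of `theta` and `alpha` to the flood-producing mechanisms, is the empirical core of the diagnostic studies described in the text.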
Contributions to generalizing the scaling theory of floods to medium and large river basins spanning different climates are reviewed. Two studies that have focused on understanding floods at the scale of the entire planet Earth are cited. Finally, two case studies on the innovative applications of the scaling framework to practical hydrologic engineering problems are highlighted. They include real-time flood forecasting and the effect of spatially distributed small dams in a river network on real-time flood forecasting.