Risk Perception and Its Impacts on Risk Governance
- Ortwin Renn, Department of Technology and Environmental Sociology, University of Stuttgart
- Andreas Klinke, Department of Political Science, Memorial University of Newfoundland
Risk perception is an important component of risk governance, but it cannot and should not determine environmental policies. The reality is that people suffer and die as a result of false information or perception biases. Awareness of intuitive heuristics and common biases in making inferences from information is particularly important in situations where personal or institutional decisions have far-reaching consequences. The gap between risk assessment and risk perception is an important aspect of environmental policymaking. Communicators and risk managers, as well as representatives of the media, stakeholders, and the affected public, should be well informed about the results of risk perception and risk response studies. They should be aware of typical patterns of information processing and reasoning when they design communication programs and risk management measures. At the same time, the potential recipients of information should be cognizant of the major psychological and social mechanisms of perception as a means of avoiding painful errors.
To reach this goal of mutual enlightenment, it is crucial to understand the mechanisms and processes of how people perceive risks (with emphasis on environmental risks) and how they behave on the basis of their perceptions. Based on the insights from cognitive psychology, social psychology, micro-sociology, and behavioral studies, one can distill some basic lessons for risk governance that reflect universal characteristics of perception and that can be taken for granted in many different cultures and risk contexts.
This task of mutual enlightenment on the basis of evidence-based research and investigations is constrained by complexity, uncertainty, and ambiguity in describing, assessing, and analyzing risks, in particular environmental risks. The idea that the “truth” merely needs to be framed in a way that the targeted audience understands the message is far too simple. In a stochastic and nonlinear understanding of (environmental) risk, there are always several (scientifically) legitimate ways of representing scientific insights and causal inferences. Much knowledge in risk and disaster assessment is based on incomplete models, simplified simulations, and expert judgments with a high degree of uncertainty and ambiguity. The juxtaposition of scientific truth, on the one hand, and erroneous risk perception, on the other, does not reflect the real situation and lends itself to a vision of expertocracy that is neither functionally correct nor democratically justified. The main challenge is to initiate a dialogue that incorporates the limits and uncertainties of scientific knowledge and also starts a learning process by which obvious misperceptions are corrected and the legitimate corridor of interpretation is jointly defined.
In essence, expert opinion and lay perception need to be seen as complementing, rather than competing with, each other. The very essence of responsible action is to make viable and morally justified decisions in the face of uncertainty, based on a range of scientifically legitimate expert assessments. These assessments have to be embedded in the context of criteria for acceptable risks, trade-offs between risks to humans and ecosystems, fair distribution of risks and benefits, and precautionary measures. These criteria closely reflect the main concerns of lay perception. For a rational politics of risk, it is therefore imperative to combine ethically justifiable evaluation criteria and standards with the best available systematic knowledge about the performance of each risk source or disaster-reduction option, judged against criteria that have been identified and approved in a legitimate due process. Ultimately, decisions on acceptable risks have to be based on a subjective mix of factual evidence, attitudes toward uncertainty, and moral standards.