Dominic Moran and Jorie Knook
Climate change is already having a significant impact on agriculture through greater weather variability and the increasing frequency of extreme events. International policy is rightly focused on adapting and transforming agricultural and food production systems to reduce vulnerability. But agriculture also has a role to play in climate change mitigation. The agricultural sector accounts for approximately a third of global anthropogenic greenhouse gas emissions, including related emissions from land-use change and deforestation. Farmers and land managers have a significant part to play because emissions reduction measures can be taken to increase soil carbon sequestration, manage fertilizer application, and improve ruminant nutrition and waste management. There is also potential to improve overall productivity in some systems, thereby reducing emissions per unit of product. The global significance of such actions should not be underestimated. Existing research shows that some of these measures are low cost relative to the costs of reducing emissions in other sectors such as energy or heavy industry. Some measures are apparently cost-negative or win–win, in that they have the potential both to reduce emissions and to save production costs. However, the mitigation potential is hindered by the biophysical complexity of agricultural systems and by institutional and behavioral barriers limiting the adoption of these measures in developed and developing countries. These barriers include the lack of formal agreement on how agricultural mitigation should be treated in national obligations, commitments, or targets, and uncertainty about the policy incentives that can be deployed in different farming systems and along food chains beyond the farm gate. These challenges also overlap with growing concern about global food security, which highlights additional stressors, including demographic change, natural resource scarcity, and economic convergence in consumption preferences, particularly for livestock products.
The focus on reducing emissions through modified food consumption and reduced waste is a recent agenda that is proving more controversial than dealing with emissions related to production.
James B. London
Coastal zone management (CZM) has evolved since the enactment of the U.S. Coastal Zone Management Act of 1972, which was the first comprehensive program of its type. The newer iteration of Integrated Coastal Zone Management (ICZM), as applied in the European Union (2000, 2002), establishes priorities and a comprehensive strategy framework. While coastal management was established in large part to address issues of both development and resource protection in the coastal zone, conditions have changed. Accelerated rates of sea level rise (SLR), together with continued rapid development along the coasts, have increased vulnerability. The article examines changing conditions over time and the role of CZM and ICZM in addressing increased climate-related vulnerabilities along the coast.
The article argues that effective adaptation strategies will require a sound information base and an institutional framework that appropriately addresses the risk of development in the coastal zone. The information base has improved through recent advances in technology and geospatial data quality. Decision-makers will need sound information to identify vulnerabilities, formulate options, and assess the viability of a set of adaptation alternatives. The institutional framework must include the political will to act decisively and to send the right signals to encourage responsible development patterns. At the same time, because communities are likely to bear higher costs for adaptation, it is important that they are given appropriate tools to weigh alternatives effectively, including the cost avoidance associated with corrective action. Adaptation strategies must be proactive and anticipatory; failure to act strategically will be fiscally irresponsible.