African financial history is often neglected in research on the history of global financial systems, and research on African financial systems in the past, in turn, often fails to explore links with the rest of the world. However, African economies and financial systems have been linked to the rest of the world since ancient times. Sub-Saharan Africa was a key supplier of the gold that underpinned the monetary systems of Europe and the North from the medieval period through the 19th century. It was West African gold, rather than slaves, that first brought Europeans to the Atlantic coast of Africa during the early modern period. Within sub-Saharan Africa, currency and credit systems reflected both internal economic and political structures and international links. Before the colonial period, indigenous currencies were often tied to particular trades or trade routes. These systems did not immediately cease to exist when colonial governments introduced territorial currencies. Rather, the two systems coexisted, often leading to shocks and localized crises during periods of global financial uncertainty. At independence, African governments had to contend with a legacy of financial underdevelopment left from the colonial period. Their efforts to address this have, however, been shaped by global economic trends. Despite recent expansion and innovation, limited financial development remains a hindrance to economic growth.
Moussa P. Blimpo, Admasu Asfaw Maruta, and Josephine Ofori Adofo
Well-functioning institutions are essential for stable and prosperous societies. Despite significant improvement over the past three decades, the consolidation of coherent and stable institutions remains a challenge in many African countries. There is a persistent wedge between the de jure rules, the observance of those rules, and actual practices at many levels. The wedge largely stems from the fact that the analysis and design of institutions have focused mainly on a top-down approach, which gives more prominence to written laws. Over the past two decades, however, a new strand of literature has emerged that focuses on accountability from the bottom up and on making institutions more responsive to citizens’ needs. This literature designs and evaluates a mix of interventions, including information provision to local communities, training, and outright decentralization of decision-making to the local level. In theory, accountability from the bottom up may pave the way for shaping the nature of institutions at the top, driven by superior localized knowledge. The empirical findings, however, have shown limited positive impact and remain mixed at best. Some of the early emerging regularities showed that information and transparency alone are not enough to generate accountability; the reasons include a lack of local ownership and the power asymmetry between local elites and the people. Some studies have addressed many of these constraints to varying degrees without much improvement in outcomes. A simple theoretical framework with multiple equilibria helps make sense of this literature. In this framework, the literature consists of attempts to mobilize, gradually or at once, a critical mass to shift from one set of norms and practices (an inferior equilibrium) to another (a superior equilibrium). Shifting an equilibrium requires large and/or sustained shocks, whereas most interventions tend to be small in scope and short-lived.
In addition, accountability at the bottom is often neglected relative to rights. If norms and practices within families and communities carry features similar to those observed at the top (e.g., abuse of one’s power), then the core of the problem goes beyond just a wedge between the ruling elite and the citizens.
Bénédicte Apouey, Gabriel Picone, and Joshua Wilde
Malaria is a potentially life-threatening disease transmitted through the bites of female anopheline mosquitoes infected with protozoan parasites. It remains one of the major causes of mortality from infectious disease: in 2015 there were an estimated 212 million cases and 429,000 deaths globally, according to the 2016 World Malaria Report. Children under 5 years of age in sub-Saharan Africa bear the greatest burden of the disease worldwide. However, most of these cases could be prevented or treated. Several methods are highly effective in preventing malaria: in particular, sleeping under an insecticide-treated mosquito net (ITN), indoor residual spraying (IRS), and intermittent preventive treatment for pregnant women (IPTp). Regarding treatment, artemisinin-based combination therapy (ACT) is recommended as the first-line treatment in many countries. Compared with other health actions, malaria prevention behaviors have some specific features. In particular, they produce public health externalities. For example, bed net usage creates positive externalities: bed nets not only directly protect the user but also reduce transmission probabilities by reducing the number of disease hosts and, in the case of ITNs, the vector itself. In contrast, ACT uptake creates both positive externalities, when individuals with malaria are treated, and negative externalities, when overtreatment speeds the spread of long-run parasite resistance. Moreover, ITNs, IPTp, and ACTs are experience goods (individuals ascertain their benefits only upon usage), which implies that current preventive actions are linked to past preventive behaviors. Malaria prevention and eradication produce unambiguous benefits across various domains: economic conditions, educational outcomes, survival, fertility, and health.
However, despite the high private returns to prevention, the adoption of antimalarial products and behaviors remains relatively low in malaria-affected areas. A variety of explanations have been proposed for low adoption rates, including financial constraints, high prices, and a lack of information. While recent studies highlight that all of these factors play a role, the main barrier to adoption is probably financial constraints. This finding has implications for the appropriate pricing policy for these health products. In addition, there is a shortage of causally identified research on the effect of cultural and psychological barriers to the adoption of preventive behaviors. The literature that does exist draws on a few randomized controlled trials with small samples in very specific geographic and cultural contexts, and may not be generalizable. As a result, there are still ample opportunities for research applying the insights of behavioral economics to malaria-preventive behavior in particular. Moreover, little research has been done on the supply side: whether free or heavily subsidized distribution of prevention technologies is fiscally sustainable; how to solve the logistical problems that lead to shortages and to the use of ineffective alternative treatments to fill the gap; and how to train sufficient healthcare workers to ensure smooth and effective delivery. Given these gaps in the literature, there remain multiple fruitful avenues for research that may have a first-order effect on reducing the prevalence of malaria in the developing world.
Diane McIntyre, Amarech G. Obse, Edwine W. Barasa, and John E. Ataguba
Within the context of the Sustainable Development Goals, it is important to critically review research on healthcare financing in sub-Saharan Africa (SSA) from the perspective of the universal health coverage (UHC) goals of financial protection and access to quality health services for all. There is a concerning reliance on direct out-of-pocket payments in many SSA countries, accounting for an average of 36% of current health expenditure, compared with only 22% in the rest of the world. Contributions to health insurance schemes, whether voluntary or mandatory, make up a small share of current health expenditure. While domestic mandatory prepayment mechanisms (tax and mandatory insurance) are the next largest category of healthcare financing in SSA (35%), a relatively large share of funding in SSA (14%, compared with less than 1% in the rest of the world) is attributable to sometimes unstable external sources. There is growing recognition of the need to reduce out-of-pocket payments and increase domestic mandatory prepayment financing to move towards UHC. Many SSA countries have declared a preference for achieving this through contributory health insurance schemes, particularly for formal sector workers, with service entitlements tied to contributions. Policy debates about whether a contributory approach is the most efficient, equitable, and sustainable means of financing progress to UHC are emotive and infused with “conventional wisdom.” A range of research questions must be addressed to build a more comprehensive empirical evidence base for these debates and to support progress towards UHC.