1-10 of 54 Results for: Quantitative Political Methodology

Article

Judicial Dissent in Collegial Courts: Theory and Evidence  

Nuno Garoupa and Catarina Santos Botelho

In collegial courts, disagreements are inevitable. Are these disagreements advantageous or disadvantageous for lawmaking? Why, when, and how do judges decide to disagree with one another? The literature on collegial courts includes extensive normative and positive theories about separate opinions and about how such decisions are made. Scholars offer explanations grounded in distinct frameworks: cost–benefit analysis (within rational-choice theory), the principal–agent model, and legal culture. By considering the complexity of separate opinions in style, substance, collegiality, and frequency, it is possible to find compromises between the two (normative and positive) strands of the literature. These compromises reflect a fundamental divergence between private (individual) and social motivations for writing separate opinions.

Article

Red Teaming and Crisis Preparedness  

Gary Ackerman and Douglas Clifford

Simulations are an important component of crisis preparedness because they allow for training responders and testing plans before a crisis materializes. However, traditional simulations can all too easily fall prey to a range of cognitive and organizational distortions that tend to reduce their efficacy. These shortcomings become even more problematic in the increasingly complex, highly dynamic crisis environment of the early 21st century. This situation calls for the incorporation of alternative approaches to crisis simulation, ones that by design incorporate multiple perspectives and explicit challenges to the status quo. As a distinct approach to formulating, conducting, and analyzing simulations and exercises, the central distinguishing feature of red teaming is the simulation of adversaries or competitors (or at least the adoption of an adversarial perspective). In this respect, red teaming can be viewed as a set of practices that simulate adversary or adversarial decisions or behaviors, where the purpose is to inform or improve defensive capabilities and the outputs are measured. Red teaming, according to this definition, significantly overlaps with, but does not directly correspond to, related activities such as wargaming, alternative analysis, and risk assessment. Some of the more important additional benefits provided by red teaming include the following:

▪ The explicit recognition and amelioration of several cognitive biases and other critical-thinking shortfalls displayed by crisis decision makers and managers, both in their planning processes and in their decision-making during a crisis.

▪ The ability to robustly test existing standard operating procedures and plans at the strategic, operational, and tactical levels against emerging threats and hazards by exposing them to the machinations of adaptive, creative adversaries and other potentially problematic actors.

▪ The instilling of more flexible, adaptive, and in-depth sense-making and decision-making skills in crisis response personnel at all levels, by focusing the training aspects of simulations on iterated, evolving scenarios with high degrees of realism, unpredictability through exploration of nth-order effects, and multiple stakeholders.

▪ The identification of new vulnerabilities, opportunities, and risks that might otherwise remain hidden if relying on traditional, nonadversarial simulation approaches.

Key guidance for conducting red teaming in the crisis-preparedness context includes avoiding mirror imaging, setting clear objectives and simulation parameters, remaining independent of the organizational unit being served, applying red teaming judiciously in terms of frequency, and properly recording and presenting red-teaming simulation outputs. Overall, red teaming, as a specific species of simulation, holds much promise for enhancing crisis preparedness and the crucial decision-making that attends a variety of emerging issues in the crisis management context.

Article

Migration Causes, Patterns, and Consequences: Contributions of Location Networks  

Justin Schon

The interdisciplinary field of migration studies is broadly interested in the causes, patterns, and consequences of migration. Much of this work, united under the umbrella of the “new economics of migration” research program, argues that personal networks within and across households drive a wide variety of migration-related actions. Findings from this micro-level research have been extremely valuable, but the approach has struggled to develop generalizable lessons and to aggregate into meso- and macro-level insights. In addition, at the group, region, and country levels, existing work is often limited by considering only total migration inflows and/or total outflows. This focus misses many critical features of migration. Using location networks, network measures such as preferential attachment, preferential disattachment, transitivity, betweenness centrality, and homophily provide valuable information about migration cascades and transit migration. Some insights from migration research aggregate tidily from personal networks up to location networks, whereas others uniquely originate from examining location networks.
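
The network measures named in this abstract are straightforward to compute once flows are represented as a location network. Below is a minimal, hypothetical sketch (not from the article) using the Python networkx library; the locations and flows are invented for illustration.

```python
# A minimal sketch (invented data): migration flows as a directed
# location network, analyzed with the networkx library.
import networkx as nx

# Each edge is (origin, destination); the routes are illustrative only.
flows = [
    ("Aleppo", "Gaziantep"), ("Gaziantep", "Istanbul"),
    ("Istanbul", "Berlin"), ("Aleppo", "Beirut"), ("Beirut", "Berlin"),
]

G = nx.DiGraph(flows)

# Betweenness centrality: locations lying on many shortest paths between
# other locations are candidate transit hubs in a migration cascade.
betweenness = nx.betweenness_centrality(G)
print(sorted(betweenness.items(), key=lambda kv: -kv[1]))

# Transitivity: how often flows among locations close triangles.
print(f"transitivity = {nx.transitivity(G.to_undirected()):.2f}")
```

In this representation, total inflows and outflows are just node degrees; measures such as betweenness and transitivity recover the transit and cascade structure that degree counts alone miss.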

Article

Nuclear Proliferation: The Next Wave in 2020  

Rupal N. Mehta and Rachel Elizabeth Whitlark

What will nuclear proliferation look like in the future? While the quest for nuclear weapons has largely quieted since the turn of the 21st century, states are still interested in acquiring nuclear technology. Nuclear latency, an earlier step on the proliferation pathway, defined here as an operational uranium-enrichment or plutonium-reprocessing capability, is increasingly likely to be the next phase of proliferation concern. The drivers of nuclear latency, namely security factors, including rivalries with neighboring adversaries and the existence of alliances, are especially consequential in an increasingly challenging geopolitical environment. Though poised to play a significant role in international politics moving forward, latency remains a core area of exploration and a subject of debate within the nuclear weapons literature writ large. While in many ways similar to nuclear weapons proliferation, the pursuit of nuclear latency has distinct features that merit further attention from scholars and policymakers alike.

Article

Bayesian Analyses of Political Decision Making  

Kumail Wasif and Jeff Gill

Bayes’ theorem is a relatively simple equation but one of the most important mathematical principles ever discovered. It is a formalization of a basic cognitive process: updating expectations as new information is obtained. It was derived from the laws of conditional probability by the Reverend Thomas Bayes and published posthumously in 1763. In the 21st century, it is used in academic fields ranging from computer science to social science. The theorem’s most prominent use is in statistical inference. In this regard, there are three essential tenets of Bayesian thought that distinguish it from standard approaches. First, any quantity that is not known as an absolute fact is treated probabilistically, meaning that a numerical probability or a probability distribution is assigned to it. Second, research questions and designs are based on prior knowledge, expressed as prior distributions. Finally, these prior distributions are updated by conditioning on new data through Bayes’ theorem, yielding a posterior distribution that is a compromise between prior and data knowledge. This approach has a number of advantages, especially in social science. First, it gives researchers the probability of the parameter given the observed data, which is the inverse of what frequentist inference provides and more appropriate for social-scientific data and parameters. Second, Bayesian approaches excel at estimating parameters for complex data structures and functional forms, and they provide more information about these parameters than standard approaches. This is made possible by stochastic simulation techniques called Markov chain Monte Carlo. Third, Bayesian approaches allow for the explicit incorporation of previous estimates through the prior distribution, which provides a formal mechanism for incorporating earlier results and a means of comparing potential results. Bayes’ theorem is also used in machine learning, a subset of computer science that focuses on algorithms that learn from data to make predictions. One such algorithm is the naive Bayes classifier, which uses Bayes’ theorem to classify objects such as documents based on prior relationships. Bayesian networks can be seen as a more sophisticated version of the naive Bayes classifier that maps, estimates, and predicts relationships in a network, making them useful for more complicated prediction problems. Lastly, the theorem has even been used by qualitative social scientists as a formal mechanism for stating and evaluating beliefs and updating knowledge.
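
To make the “compromise between prior and data knowledge” concrete, here is a minimal sketch (not from the article) of conjugate Bayesian updating for a proportion, with invented numbers; conjugacy makes the posterior available in closed form, without MCMC.

```python
# A minimal sketch of conjugate Bayesian updating (numbers invented).
# Prior belief about a proportion theta (e.g., support for a policy):
#   theta ~ Beta(a, b), with prior mean a / (a + b).
a, b = 8.0, 12.0                    # prior mean = 0.40

# New data: k "successes" among n survey respondents.
k, n = 30, 50                       # sample proportion = 0.60

# Bayes' theorem with a conjugate prior reduces to simple addition:
#   posterior: theta ~ Beta(a + k, b + n - k)
a_post, b_post = a + k, b + (n - k)

print(f"prior mean      = {a / (a + b):.3f}")                 # 0.400
print(f"data proportion = {k / n:.3f}")                       # 0.600
print(f"posterior mean  = {a_post / (a_post + b_post):.3f}")  # 0.543
```

The posterior mean (0.543) falls between the prior mean and the sample proportion; for nonconjugate models, the same compromise is computed by Markov chain Monte Carlo simulation rather than in closed form.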

Article

Computational Models of Political Decision Making  

Sung-youn Kim

A growing body of research uses computational models to study political decision making and behavior, including voter turnout, vote choice, party competition, social networks, and cooperation in social dilemmas. Advances in the computational modeling of political decision making are closely related to the idea of bounded rationality: models of full rationality can usually be analyzed by hand, but models of bounded rationality are complex and require computer-assisted analysis. Most computational models used in the literature are agent based; that is, they specify how decisions are made by autonomous, interacting computational objects called “agents.” However, an important distinction can be made between two classes of models based on the approaches they take: behavioral and information processing. Behavioral models specify relatively simple behavioral rules to relax the standard rationality assumption and investigate the system-level consequences of these rules in conjunction with deductive, game-theoretic analysis. In contrast, information-processing models specify the underlying information processes of decision making (the way political actors receive, store, retrieve, and use information to make judgments and choices) within the structural constraints on human cognition, and they examine whether and how these processes produce the observed behavior in question at the individual or aggregate level. Compared to behavioral models, information-processing computational models are relatively rare, new to political scientists, and underexplored. However, by focusing on the underlying mental processes of decision making that must occur within the structural constraints on human cognition, they have the potential to provide a more general, psychologically realistic account of political decision making and behavior.
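
As an illustration of the behavioral class of models, the following minimal sketch (not any specific published model) has agents decide whether to vote using a simple win-stay, lose-shift rule in place of full rationality, then reports the aggregate turnout that emerges; all parameters are invented.

```python
# A minimal agent-based sketch of boundedly rational turnout
# (hypothetical rule and parameters, for illustration only).
import random

random.seed(42)
N_AGENTS, ROUNDS, VOTE_COST = 1000, 50, 0.1

# Each agent supports one of two parties and currently votes or abstains.
agents = [{"party": i % 2, "votes": random.random() < 0.5}
          for i in range(N_AGENTS)]

for _ in range(ROUNDS):
    # Tally the votes cast for each party and declare a winner.
    tally = [0, 0]
    for agent in agents:
        if agent["votes"]:
            tally[agent["party"]] += 1
    winner = 0 if tally[0] >= tally[1] else 1

    # Win-stay, lose-shift: an agent satisfied with its payoff keeps its
    # action; a dissatisfied agent flips it with probability 0.5.
    for agent in agents:
        payoff = ((1.0 if agent["party"] == winner else 0.0)
                  - (VOTE_COST if agent["votes"] else 0.0))
        if payoff < 0.5 and random.random() < 0.5:
            agent["votes"] = not agent["votes"]

turnout = sum(agent["votes"] for agent in agents) / N_AGENTS
print(f"turnout after {ROUNDS} rounds: {turnout:.2%}")
```

The individual rule is deliberately simple; the analytical interest lies in the system-level turnout dynamics it generates, which is exactly the question behavioral computational models are built to answer.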

Article

Demobilization Challenges After Armed Conflict  

Margit Bussmann

Demobilization of ex-combatants is a major challenge in the transition to a stable postconflict society. Combatants must be convinced to abandon the armed confrontation and hand over their weapons despite security concerns and a lack of alternative means of income. The difficulty of overcoming this commitment problem, and the number of combatants who must be demobilized, differ between conflicts that end in a decisive victory and conflicts that reach a military stalemate. Peace agreements can offer several solutions to the parties’ commitment problems, but implementation of their provisions is often incomplete. Third parties can offer to monitor an agreement and provide security guarantees. International actors increasingly assist with demobilization and reintegration programs for former combatants, helping to overcome security-related concerns and economic challenges. Another solution is military power-sharing arrangements and the integration of rebel fighters into the national military. These measures are intended to shrink the pool of potential recruits for existing or new rebel groups. If ex-combatants are left without means of income to support themselves and their families, the risk is higher that they will remobilize and that conflict will recur. Reintegration into the civilian labor market, however, is often difficult in the weak economies of war-affected countries.

Article

Fast and Frugal Heuristics  

Konstantinos V. Katsikopoulos

The polymath (and political scientist) Herbert Simon dared to point out that the amounts of time, information, computation, and other resources required for maximizing utility far exceed what is possible when real people have to make real decisions in the real world. In psychology, there are two main approaches to studying actual human judgment and decision making: the heuristics-and-biases and the fast-and-frugal-heuristics research programs. A distinctive characteristic of the fast-and-frugal-heuristics program is that it specifies formal models of heuristics and attempts to determine when people use them and what performance they achieve. These models rely on a few pieces of information that are processed in computationally simple ways. The information and computation are within human reach, meaning that people rely on information they can access relatively easily and employ simple operations such as summing or comparing numbers. Research in the laboratory and in the wild has found that most people use fast and frugal heuristics most of the time when a decision must be made quickly, when information is expensive (financially or cognitively) to gather, or when a single attribute or a few attributes of the problem strongly point toward an option. The ways in which people switch between heuristics are studied in the framework of the adaptive toolbox. Work employing computer simulations and mathematical analyses has uncovered conditions under which fast and frugal heuristics achieve higher performance than benchmarks from statistics and machine learning, and vice versa. These conditions constitute the theory of ecological rationality. This theory suggests that fast and frugal heuristics outperform complex optimization models when the available information is of low quality or scarce, or when there are dominant options or attributes. The bias-variance decomposition of statistical prediction error, which is explained in layperson’s terms, underpins these claims. Research on fast and frugal heuristics suggests a governance approach based not on nudging but on boosting citizen competence.
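
One well-known formal model from this research program is the take-the-best heuristic, which consults cues in order of validity and stops at the first cue that discriminates between the options. A minimal sketch in Python, with invented cues and values:

```python
# A minimal sketch of the take-the-best heuristic (invented example).
def take_the_best(option_a, option_b, cues_by_validity):
    """Return the option favored by the first discriminating cue.

    option_a, option_b: dicts mapping cue name -> 0/1 cue value.
    cues_by_validity: cue names ordered from most to least valid.
    """
    for cue in cues_by_validity:
        a, b = option_a[cue], option_b[cue]
        if a != b:            # the cue discriminates: stop searching
            return "A" if a > b else "B"
    return "guess"            # no cue discriminates

# Hypothetical task: which of two cities is larger?
city_a = {"capital": 0, "airport": 1, "university": 1}
city_b = {"capital": 0, "airport": 0, "university": 1}
print(take_the_best(city_a, city_b,
                    ["capital", "airport", "university"]))  # -> "A"
```

Note what the heuristic does not do: no weighting, no summing of all cues. The ecological-rationality results described above concern exactly when this kind of lexicographic stopping rule matches or beats weighting-and-summing benchmarks.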

Article

Genetics and Heritability Research on Political Decision Making  

Levente Littvay

In 2005, political scientists claimed that parent-child similarities are driven not only by parenting, socialization, and other social factors shared by the family but also by genetic similarity. This claim upended a century of orthodoxy in political science. Many social scientists are uncomfortable with this concept, and the discomfort often stems from a multitude of misunderstandings. Claims about the genetics and heritability of political phenomena predate 2005, and the wave of studies over the following decade swept through political science and then died down as quickly as it came. The behavior genetic research agenda faces several challenges within political science, including (a) resistance to these ideas across the social sciences, (b) difficulties faced by scholars in producing meaningful theoretical and empirical contributions, and (c) developments in the field of genetics and their (negative) impact on related scholarship within the study of politics.
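
The abstract gives no formulas, but the classic twin-design logic behind such heritability claims can be sketched with Falconer’s decomposition; the correlations below are invented for illustration.

```python
# A minimal sketch of Falconer's twin-design decomposition
# (correlations invented for illustration).
r_mz = 0.62   # trait correlation between identical (MZ) twins
r_dz = 0.40   # trait correlation between fraternal (DZ) twins

h2 = 2 * (r_mz - r_dz)   # A: additive genetic share ("heritability")
c2 = 2 * r_dz - r_mz     # C: shared-environment share
e2 = 1 - r_mz            # E: unique environment (and measurement error)

print(f"A = {h2:.2f}, C = {c2:.2f}, E = {e2:.2f}")  # A = 0.44, C = 0.18, E = 0.38
```

The comparison works because identical twins share roughly twice the segregating genetic material of fraternal twins while both kinds share a family environment, so the excess MZ similarity is attributed to genes.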

Article

International Crises Interrogated: Modeling the Escalation Process with Quantitative Methods  

Evgeniia Iakhnis, Stefanie Neumeier, Anne Van Wijk, and Patrick James

Quantitative methodology in crisis studies is a topic of substantial scope. The principal rallying point for such research is the long-standing International Crisis Behavior (ICB) Project, which from 1975 onward has produced a comprehensive and heavily accessed data set for the study of conflict processes. A prehistory of crisis studies based on statistical methods, which identified connections between and among various conflict-related events, pointed increasingly toward the need for a program of research on escalation. The potential of quantitative methodology to contribute seriously to crisis studies has been realized along multiple dimensions, by the ICB Project in particular. For example, quantitative methods have been applied productively to study the effects of global and regional organizations, along with individual states, on the process of crisis escalation. Current research in crisis studies is based on the premise that research designs so far have covered only one of the multiple relevant stages of the escalation process. This is where the concept of a “near crisis” becomes relevant: a near crisis entails the perception of threat and finite time, but not an increased likelihood of military hostilities. Data analysis pertaining to multiple stages of escalation is at an early stage of development, but initial results are intriguing. A further critique of quantitative research begins with the observation that it is mostly state centered and reductionist in nature. A key question emerges: How can the concept of crisis, and the associated data collection, be revised to include a humanistic element that would entail new and potentially more enlightening configurations of independent and dependent variables?