Social Complexity, Crisis, and Management

Summary and Keywords

Because social complexity is rarely defined beforehand, social science discussions often default to natural language concepts and synonyms. Assert that a large sociotechnical system is complex or “increasingly complex,” and notions of many unknowns, out-of-sight causal processes, and a system difficult to comprehend fully are triggered. These terms, however, also suggest the potential for, if not the actuality of, catastrophes and their unmanageability in the sociotechnical systems. It is not uncommon to find increasing social complexity credited with the generation or exacerbation of major crises, such as nuclear reactor accidents and global climate change, and with the need to manage them better, even as the crises are said to be far more difficult to manage because of that complexity.

The costs of leaving discussions of “complexity, crisis, and management” to natural language are compared here to the considerable benefits that accrue to analysis from one of the few definitions of social complexity developed and used over the last 40 years, that of political scientist Todd R. La Porte. Understanding that a large sociotechnical system is more or less complex depending on the number of its components, the different functions each component has, and the interdependencies among functions and components underscores key issues that are often missed within the theory and practice of large sociotechnical systems, including society’s critical infrastructures. Over-complexifying the problems and issues of already complex systems, in particular, is just as questionable as oversimplifying that complexity for policy and management purposes.

Keywords: complexity, crisis, management, infrastructures, policy messes, crisis analysis

Introduction

The social sciences differ nowhere more starkly from the humanities and fine arts than in their respective approaches to human complexity. One need only read a great novel to see the world is more complex than supposed. Read the best essays of George Steiner, John Berger, Isaiah Berlin, Adam Phillips, Joseph Brodsky, T. S. Eliot, Erich Heller, Hans Magnus Enzensberger, or Lionel Trilling, and you confront unique analytic sensibilities grappling with insight and some purchase on all manner of human complexity. Or perhaps your list of essayists would instead begin with Anne Carson, Helen Vendler, Marguerite Yourcenar, and Jane Hirshfield. Whatever the case, the names will inevitably differ, but that is the point: our respective encounters with each analytic sensibility are sui generis. In none of this do the humanities and fine arts need a collective point of departure for human complexities.

Not so for the social sciences. Unique intelligence on its own is rarely enough in assaying human complexity as a social phenomenon. In policy studies and the social sciences, the better minds are more likely first to define complexity and then draw out implications and modifications along the way. Definitions vary—after all, complex with respect to what social phenomena?—though, truth be told, many discussions of social complexity do not get even that far in specifying any respect-to-what. The aim of this entry is to signal to the reader why a focus on crisis and its management depends critically on how social complexity is framed.

The everyday hubris of assuming there is no need to define the term at the outset, with any redefinitions drawn out thereafter, is everywhere evident. There is no better way to demonstrate what goes wrong when social scientists and policy types do not first define complexity than to observe how their analysis defaults to natural language synonyms for “complexity,” “crisis,” and “management.”

To see how, begin with arguably the best-known definition of social complexity, that of political scientist Todd R. La Porte and his notion of organized social complexity: “The degree of complexity of organized social systems (Q) is a function of the number of system components (Ci), the relative differentiation or variety of the components (Di), and the degree of interdependence among these components (Ik). Then, by definition, the greater Ci, Di, and Ik, the greater the complexity of the organized system (Q)” (La Porte, 2015, p. 6). Without prejudging this long-standing definition to be the best in the social sciences, its merit lies in highlighting features often missed or passed over in natural language assumptions about what social complexity is (and, by extension, what crisis and management are). Four implications of the La Porte definition can be noted.
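Rendered schematically (a minimal restatement using only the symbols the quotation itself supplies, and not a formalization La Porte develops further), the definition reads:

\[
Q = f(C_i, D_i, I_k), \qquad f \ \text{increasing in each of}\ C_i,\ D_i,\ I_k ,
\]

with the functional form f left unspecified; only the ordinal, comparative claim is carried by the definition.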

First, complex is a comparative feature of systems, that is, a system is more or less complex than another system in terms of their respective number of components, the differentiation of said components, and their interrelatedness. Hence, while in natural language it is common enough to say “this or that is complex,” such statements leave open the question of more or less complex than what: just what is the implicit baseline (if any) for establishing “complex”? To put it the other way around, a formal starting definition may still have ambiguities—just what is a “system” that it is complex?—while being less ambiguous than natural language discourse.

Second, the LaPortean definition shows why, say, the Earth is the most complex ecosystem among all ecosystems on this planet (i.e., the Earth consists of all its ecosystems, all their differing ecosystem functions and services, and all their interconnections). The methodological point is not that you “aggregate up to complexity,” but rather that, comparatively, the system of interest becomes more (or less) complex. Third, the LaPortean definition illustrates how difficult it is to quantify complexity beyond the number of components and the type and degree of functions attributed to each component (no easy matter either). There is no compelling, broadly accepted quantitative measure of interdependence (better for our purposes, interconnectivity, as some relationships are not bidirectional but unidirectional). That said, various natural language terms, such as “increasing resource scarcity,” do capture some sense of the interconnectivity at the global system level.
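To make the counting point concrete, consider the minimal sketch below. It is illustrative only: the two systems, their component names, and the bare tally of directed links are hypothetical and are not an operationalization proposed by La Porte. Components and their attributed functions can be enumerated; interconnectivity is reduced here to a naive link count precisely because no compelling quantitative measure of interdependence exists:

# Illustrative sketch only: hypothetical systems and a deliberately crude tally
# of the three LaPortean dimensions (components, functions, interconnections).
from dataclasses import dataclass, field

@dataclass
class SociotechnicalSystem:
    name: str
    functions: dict = field(default_factory=dict)  # component -> set of functions
    links: set = field(default_factory=set)        # directed links among components

    def profile(self):
        """Return (number of components, number of distinct functions, number of links)."""
        distinct_functions = set().union(*self.functions.values()) if self.functions else set()
        return len(self.functions), len(distinct_functions), len(self.links)

grid = SociotechnicalSystem(
    "regional power grid",
    functions={
        "generators": {"generate"},
        "transmission": {"transmit", "balance load"},
        "control room": {"monitor", "dispatch"},
    },
    links={("generators", "transmission"),
           ("control room", "generators"),
           ("control room", "transmission")},
)
hand_pump = SociotechnicalSystem("village hand pump", functions={"pump": {"lift water"}})

# The comparison is ordinal, not cardinal: the grid is more complex than the pump
# on all three dimensions, which is all the LaPortean definition licenses.
print(grid.profile())       # (3, 5, 3)
print(hand_pump.profile())  # (1, 1, 0)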

Fourth, and arguably most significant, to distinguish a system’s components from each other, then the different functions each component has, and thereafter the interdependencies (along with other interconnections) between and among the functions and components is its own methodological imperative: first, differentiate! The more you differentiate the case at hand, the less likely you are to end up with reduced-form crisis narratives such as the Global Financial Crisis (the most salient feature of the 2008 financial mess was that it was not global) or the Tragedy of the Commons (its operative premise of a homogeneous pasture open to herders who are all alike is exactly what must not be assumed empirically).

Before drawing out the implications of the LaPortean definition of complexity for crisis and management, turn to another description, Charles Perrow’s well-known typology of technological complexity and coupling. For purposes here, note only that both tightly and loosely coupled systems, largely technological, can be complexly or linearly interactive. With an extraordinary range of examples, Perrow defines and illustrates complexly interactive systems as those with “unfamiliar sequences or unplanned and unexpected sequences, and either not visible or not immediately comprehensible” (1999, p. 78). The sequences of system activities in a linearly interactive system are, by contrast, more familiar, expected, and visible, even if unplanned or unintended.

Without detracting from the virtues of the typology and its widespread applications, contrast the Perrovian notion of system complexity with that of La Porte. A key feature of LaPortean complexity is the aforementioned absence of compelling empirical indices of interdependence; a key feature of Perrovian complexity is the just-quoted unexpectedness or invisibility of the causal factors at work in the sequences of activities of interest. Clearly, though, “difficult to measure” and “lack of causal understanding” are not everywhere equivalent or necessarily overlapping. Just because interdependence cannot be measured does not mean that causal understanding is insufficient for the development of better practices (for more, see the section on Contingency and Practice Precede Theory). Both the lack of sufficient indicators and the lack of causal understanding may hold for complex social systems, but even so, both gaps need the hard work of further case-by-case analysis.

Why? Because no sooner does one read the Perrovian synonyms for complex—unexpected, unplanned, not automatically visible (let alone comprehended)—than one is in the world of crisis narratives and calls for urgent response. The optimistic Hiding Hand principle of political economist A. O. Hirschman—only by not knowing in advance how difficult some things are to achieve do we achieve them or something even better—might as well have been banished to another planet. On this one, in contrast, mention of the Perrovian synonyms often triggers their cognates, most immediately: “unpredictability,” “unknowns,” “incomprehensibility,” and that fearsome “unmanageable.” All constellate into crisis scenarios emerging around and out of unstudied (unstudiable?) conditions, which, while calling for better management, are no longer open to management as humans know it, or so Perrow has argued for the large sociotechnical systems he researched.

“With good reason!” the reader might respond, thinking of a host of large-scale technological disasters. But it is precisely at this point that the analytic virtue of having defined system complexity from the outset is clearest. Before invoking one’s exemplar of disaster, one should pause and ask: Where in La Porte’s definition of organized social complexity is the negative inevitability of any such crisis scenario? Yes, a key feature of LaPortean complexity is surprise (for more, see the section Contingency and Practice Precede Theory); but since when do all surprises end up as bad crises? True, La Porte focused later on complexity’s rude surprises; and true, Perrow extended his notion of complexly interactive and applied it to all manner of catastrophes (La Porte, 2007; Perrow, 2007). But where is that slippery-slope crisis narrative to be found in the La Porte definition? How do we get from high degrees of social complexity to this conclusion: Today’s exceptional complexities give rise to extraordinary threats and “thus” to emergency measures which necessarily end up as precedents for first-ever policies?

Crisis analysts and managers need to think deeply about the terms of that conclusion (exceptional, extraordinary, “thus,” necessarily, first-ever), as each puts them (and us) at the very limits of human comprehension; of infrastructure reliability and crisis management; and of the societal values driving relevant policy, management, and their regulation. Thought about more deeply, those terms look less like Today’s Big Policy Crisis than the Core Existential Threat to any kind of crisis analysis and management. To be sure, when ensuring societal safety, values of caution and precaution may well be encouraged. Nevertheless, it is passing odd that what is beyond good and bad must be worse; passing odd that, at the limit, there is no limit to the absurdly numerous possibilities of disaster.

None of this is to discourage reading the valuable natural language discussions of social science and policy complexity (e.g., complex adaptive systems, complexity sciences, Snowden’s Cynefin framework, and social complexity as emergent properties) as well as related literatures on crisis and management associated with system complexity. The reader is referred to summaries and references of other entries in the Encyclopedia of Crisis Analysis.

Rather than repeating that material, the remainder of this entry singles out key insights and implications missed or under-acknowledged in the absence of a more defined orientation to social complexity. The social complexity focused on here is system complexity or issue complexity in the LaPortean vein (departures from La Porte are also made clear). Note that issues are not perforce complex simply because they are evinced by complex systems; rather, the issues are (more or less) complex in terms of the number of points at issue, the functions performed by each of these points, and the interrelatedness among the points.

The distinction between issue complexity and system complexity is crucial for what follows. Perrow famously suggested that centralized and decentralized control mechanisms conflict in complexly interactive, tightly coupled technologies with high risks. Tight coupling entailed a requirement for centralized mechanisms of authority and operations, with unquestioned compliance and immediate response capability. Complex interactivity, on the other hand, required, also in Perrow’s view, decentralized mechanisms of authority to cope with unplanned interactions on the ground by those closest to the system. But from a LaPortean sociotechnical perspective, complex issues and complex systems differ with respect to their management. Indeed, managing centralization and decentralization together is precisely what the control rooms of large sociotechnical systems are mandated to do in real time (for more, see Roe & Schulman, 2008, 2016, 2018). Control rooms are centralized for system-wide response and management, but that centralization entails the rapid management of system control variables—such as electricity frequency, natural gas pressures, and waterflows—whose movements can have immediate decentralized (localized) effects. Again, it is management with respect to what that must be distinguished from the get-go and case by case (e.g., obviously not all critical infrastructures have centralized control centers).
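A toy sketch of that architectural point follows; it is illustrative only, is not drawn from Roe and Schulman, and the control variables, operating bands, and dispatch routine named here are hypothetical. Monitoring is centralized and system-wide, while the corrective actions dispatched take effect locally:

# Toy sketch (hypothetical values throughout): centralized scanning of
# system-wide control variables, with corrective actions that land locally.
CONTROL_BANDS = {
    "frequency_hz": (59.95, 60.05),    # electricity frequency
    "gas_pressure_bar": (40.0, 60.0),  # natural gas pipeline pressure
    "waterflow_m3s": (10.0, 120.0),    # waterflow
}

def dispatch_local_adjustment(variable, reading, low, high):
    """Stand-in for the decentralized (localized) response described in the text."""
    direction = "raise" if reading < low else "lower"
    return f"{direction} {variable} via local assets (reading={reading})"

def centralized_scan(readings):
    """Centralized, system-wide monitoring; out-of-band variables trigger local action."""
    actions = []
    for variable, reading in readings.items():
        low, high = CONTROL_BANDS[variable]
        if not low <= reading <= high:
            actions.append(dispatch_local_adjustment(variable, reading, low, high))
    return actions

print(centralized_scan({"frequency_hz": 59.90, "gas_pressure_bar": 45.0, "waterflow_m3s": 130.0}))

The design point mirrors the prose: the scan is one centralized loop over the whole system, but each action it returns is addressed to a specific, local part of that system.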

With that distinction in mind, no claim to comprehensiveness is made for what follows; and in the interest of parsimony, the insights and implications are limited to three of paramount import to students of large sociotechnical systems.

Simplifying Away Complexity Is Problematic, but So Too Is Over-Complexification

Complex is about as simple as it gets for many controversial sociotechnical systems and issues of politics and public policy. In effect, students of policy and the social sciences have a duty of care to decision-makers not to treat an already complex issue as if it were simple. Equally, the duty of care means not to over-complexify an already complex issue. The litmus test that an issue has been over-complexified or oversimplified is whether or not it can be recast in ways that open up fresh options for intervention without gainsaying its complexity. If a simplification can be recast as complex in ways that make new interventions plausible, or if an issue thought to be so complex that no further action is possible can be recast to show otherwise, then the truth of the matter has been pushed and pulled beyond the current exaggerations.

The best examples of over-complexifying already complex issues are to be found in that widening delta of “wicked policy problems” (the locus classicus is Rittel & Webber, 1973). Wicked problems are said to epitomize social complexity, crises, and their management—especially when cast in natural language terms as unmanageable crises arising out of that social complexity. To see with what effect, return to the earlier “conclusion”: exceptional complexities give rise to extraordinary threats and thus to emergency measures which necessarily end up as precedents for first-ever policies. Left this way, the problem of interest ends up intractable, and there are people who know—for certain, know—that major political and policy problems are more and more wicked ones by virtue of their being impossible to solve, let alone analyze or manage. Academics and scientists alike can be found now talking about devilishly wicked problems or the most wicked of wicked problems.

But such haute vulgarizations are themselves the artifact of having no default option when at the limits of thinking and analyzing. What seem to be analytically insoluble threats call for all manner of response, some of which entail moving well beyond conventionalized analysis and management. One under-acknowledged response is appealing to the background condition for taking action even when analysis and management are otherwise dead-ended into crises about which management has little effect. Humans have always been many-sided, and so must our responses be, where that background condition of having many sides frames the action we take. When at the limits of thinking and faced with analytic or management intractability—right now, not later when the promises of big-data algorithmization, AI, machine learning, and the like are to be fulfilled—the challenge is to find those other sides with which to recast and render the issue more tractable to analysis and management.

How does this appeal to the background condition of many-sidedness work? Bad policy mess: at one point, 3 to 4 billion people—up to two-thirds of the world’s population—lived in regions without adequate water supplies or sanitation. Good policy mess: Now that is a very, very large number of people, right? The distribution of people without adequate water supplies is so huge that some of them must be doing much better than others. Without being Darwinian about it, this means there are tens of millions—more?—of people who actually have things to say, to the millions more also trying to survive without adequate water, about how to survive better without it.

What is going on in the acrolect of wicked problems is exaggerating the complexity of one human facet and simplifying away the other facets. Labeling something a wicked problem creates the Ultimate One-Sided Problem—it’s, well, intractable—for humans who are anything but one-sided. The one-siders of intractability have taken the generous complexity of being intractably human and scalped it.

In the absence of definitions of social complexity, the choice of synonyms, analogies, or other cognates necessarily moves center stage as a deliberate choice among many with which to cast and recast complexity along with “associated” crisis and management. Whether you define complexity beforehand or not, synonyms and analogies are habitually entrained. The problem is immeasurably more difficult, however, if no starting definition is provided. When you don’t define social complexity, your choice of “its” synonyms, analogies, and other cognates inescapably becomes the focus of far deeper analysis, the aim of which must be to defamiliarize any taken-for-grantedness one may have about social complexity, crisis, and management.

Instead of assuming social complexity entails the dreaded features of “unplanned,” “unexpected,” or “not immediately comprehended,” one could just as plausibly insist that complex issues are, on further reflection and more importantly: variously difficult, whether planned or not; issues about which we have little experience, whether or not they are expected; and issues whose non-comprehension concerns unknown-unknowns that, in hindsight and upon more experience, prove to have been “visible right in front of us” all along.

Once one reflects on this taken-for-granted knowledge, one realizes that there are many options with which to recast natural language notions of social complexity and its sequelae. You could just as well say that the LaPortean understanding of complexity better explains policy messes, which can go good—unlike so-called wicked problems!—or bad depending on how they are managed, and that bad messes can be managed in ways that do not treat them as ipso facto crises that must be responded to. (For more on policy messes, their management, and an alternative lexicon for crisis management, see Roe, 2013, 2016.)

Nor is the LaPortean definition immune from challenge. Interdependence is very much a contemporary social science term, and one need not look far for an alternative formulation whose slight difference recasts the notion significantly. What if, instead of social science interdependence, we had all along been talking about, say, the Taoist notion of resonance (kan-ying, ganying; see Le Blanc, 1985; Loewe, 2003)? Things in this complex world reverberate like tuning forks oscillating next to each other; to be complex is to move and act in response to and in concert with other things. It is not to assume, as in physics, that the oscillation is causally amplifying and destructive. What, after all, are all those correlation coefficients that populate statistical analyses of, say, income inequality within and across nations, if not correlation-as-resonance rather than correlation-as-proxy-for-causation? The slightest difference, to repeat, in what is taken as complexity’s synonyms in natural language can have major import for subsequent analysis.

Analogies can be just as decisive as the synonyms. Famously, “bricolage” and “garbage can processes” served, for a time, to foreground the complexity and mess of public policy. Yet the social sciences and policy studies are strewn with alternative analogies for making sense of complexity, crisis, and management. More, their respective applications have substantially different implications in the absence of any overarching or starting definition of key terms. One illustration will have to suffice.

The notion of “policy palimpsest” arose early on in policy studies (Simmons, 1982), but never gained much traction thereafter. The upshot of a policy palimpsest is that any current policy statement is itself the composite of policy arguments and narratives that have overwritten each other across time. A composite argument read off a policy palimpsest reads sensibly—nouns and verbs appear in order and sense-making is achieved—but none of the previous inscriptions or points shine clear and whole through the layers, effacements, and erasures in the policy palimpsest being read from. Earlier arguments have been blurred, intertwined, and re-rendered for current (at times controverted) purposes. We want policy to come to us as instantly recognizable, just as immediately legible as the writing on this page. That instantaneity is the aim of any composite argument; the point of the policy palimpsest analogy, however, is to frustrate that taken-for-granted legibility. The palimpsest insists that policy always comes with fractured backstories.

From the policy palimpsest optic, policy statements not only determine or otherwise affect policies, but the policies actually implemented are also themselves written onto the policy palimpsest and blur away elements of the preceding policy statements. The analytic challenge is to read any new composite argument with the blurred-away elements now made visible in order to acknowledge and probe what has been rendered missing in the composite of interest now. Once you have identified what is missing from the composite but was in the palimpsest being read off (no guarantees here), you have identified means to recast the complex issue in new (renewed) ways.

To see how, turn to the well-regarded journal, Foreign Affairs, and a much-cited 2014 critique of the failed-states rationale put forth in the Bush Administration’s 2002 National Security Strategy (Mazarr, 2014). The Bush Doctrine, not to put too fine a point on it, argued that failed states were an important cause of international terrorism. The Mazarr critique, including its review of the literature, underscored profound problems with the doctrine’s assumptions. Yet even if the critique and others like it are true as far as they go, analysis of the failed-states argument needs to go further, not just to identify what was effaced in the terrorism palimpsest at that time, but also what was effaced in these failed-states critiques which have become part of the very same palimpsest since then. The infamous example of what has been erased, at least in such journals, has been the polemical avowal that America deserved 9/11 as a nation and, now that it had happened, here was the ideal opportunity for the nation to take the lead in a new rapprochement with the Islamic world. The least recognized erasure, however, but the one that would have been most visible had such an attempt at rapprochement taken place, was the centrality of the following question for international policy etched off the policy horizon at the time of the burning Twin Towers: Where are this century’s new democracies to come from, if not from failed states, including—dare we say—parts of the United States?

Note the implications of treating complexity, crisis, and management in terms of a policy palimpsest rather than, by way of example, the analogy offered up in garbage can processes. J. W. Kingdon (1984) notably adapted the garbage can model to highlight opportunistic decision-making, where persistence is key to taking advantage of that rare policy window opening when the separate streams of problems, policies, and politics happen to couple and converge together in real time. Perhaps that’s true as far as it goes, but does this argument go far enough? The policy palimpsest notion by contrast asks what’s missing in all this—rendered missing in the streams as they flow decoupled and rendered missing in order to have the infrequent coupling. From the perspective of the garbage can, missing the window of opportunistic coupling is a failure of management through lack of decision-maker persistence or poor timing on the part of the decision-maker; from the perspective of the policy palimpsest, what is missing are the opportunistic windows for earlier coupling (think: the aforementioned rapprochement) that were erased out of the decoupled streams so that they flow separately and do not converge for the time being. All natural language discourses about “being the right person at the right time for the right problem” miss possible opportunities for making things better that are right in front of us, were we to change our optic for seeing and thereby acting.

No single analogy for the socially complex and any related notions of crisis and management is being privileged here. But in the spirit of La Porte’s definition, what is insisted upon is that social complexity is better thought of comparatively. Multiple analogies and synonyms are needed in order to draw out and highlight differences across them, and such comparison and differentiation are more consistent with an understanding of LaPortean complexity.

In Social Complexity, Contingency and Practice Precede Theory

It does not take a genius to figure out that practice precedes theory in a world of social complexity, be it LaPortean, Perrovian, or another formulation. Comprehensive yet parsimonious theory requires causal knowledge in ways that effective practices prove not to need. While one can think of exceptions, the philosopher Gilbert Ryle (1949, p. 30) famously put the matter:

Efficient practice precedes the theory of it; methodologies presuppose the application of the methods, of the critical investigation of which they are the products . . . It is therefore possible for people intelligently to perform some sorts of operations when they are not yet able to consider any propositions enjoining how they should be performed. Some intelligent performances are not controlled by any anterior acknowledgement of the principles applied to them.

These are better practices, not “best practices.” A number of “best practices” in policy and management mistake what may have worked well in a single case (at one site or via a gazillion Monte Carlo simulations) for those better practices that emerge and are improved across multiple, real-world cases. In contrast to those one-off prototypes selectively identified as best practice, actually existing better practices emerge not only from and across a run of different cases, but even then typically have to be modified for any new case so as to be effective. That there are no guarantees in any of this—better practices may not be emerging or, if already emergent, the practices may not be modifiable to the specific scenarios being faced then and there—should be clear, however. (For more on these processes of repeated application and modification, along with examples of how better practice thereby emerges, see Roe, 2013.)

The implications for management and crisis in the midst of social complexity are many. Three deserve mention. First, practice not only precedes causal understanding of social complexity, but may well be possible only because of that complexity; that is, the cases need to be multiple and different if better practices are to emerge. Second, a focus on better practices means that the three questions in conventional risk analysis—What could go wrong? How likely is it? What are the consequences if so?—are in the wrong order and need to be augmented: What’s working here? What’s even better by way of practice elsewhere? How do we get from here to there? And only then: What could go wrong (in getting there)? How likely is that? What are the consequences if so?

Third, better practices import into decision-making the scale of management for which they work. It should be no surprise that crisis or management failure can occur when better practices are applied to scales not represented in the run of cases from which they emerge. (A key filter in the search for better practices is that asking, “Could we be doing even better than industry-wide standards and practices?” is always couched with the caveat: “Not if it means getting worse in order to get better!”)

Better practices are, to sum up, contingent on the cases across which they emerge and by which they are modified. So too do the complex cases themselves reflect myriad contingencies at work, if both the sciences and the humanities are our guide. Cases are contingent on all manner of factors—societal, political, economic, historical, cultural, legal, scientific, geographical, philosophical, governmental, psychological, neurological, technological, religious, and what-not. In fact, why close off analysis of social complexity, crisis, and management at “what-not,” when we can press understanding further with “yes, but” or “yes, and”? Once again, in contrast to those policy types fond of the rhetoric about the right policy at the right time, complexity’s barrel is so full of fish it’s very difficult indeed to track anything like the right one when it matters most, right now.

To think otherwise, it needs pointing out, is delusion and denial. How many times have we heard something like “If implemented as planned . . . ,” “If done right . . . ,” or “Given market-clearing prices . . .”—thereby instantiating the exaggerations that lead to mismanagement and failure? “If implemented as planned,” when we know that is the one assumption we cannot make. “If done right,” when we know that “technically right” is unethical without specifying just what the ethics are, case by case. “Given market-clearing prices,” when we know that markets in the real world often do not clear (supply and demand do not equate at a single price), and that even when they do, their “efficiencies” can undermine the very markets that produce those prices. We might as well believe that engineering the steering wheel closer to the engine gets us to our destination sooner.

Another virtue of the LaPortean definition of social complexity is its highlighting how complexity is itself specifically contingent on the number of system components, their degree of differentiation, and their interconnectivity, system by system. As such, “contingent” means more than the dictionary definition of “having no logical or definitional necessity” with respect to the desideratum of interest. In LaPortean complexity, contingency is as real as are material power interests. Here again analogies are decisive in pushing thinking further.

To give one illustration, the social contingency discussed in this entry can be conceived as neither solely an object in the world nor solely a construct in the brain or consciousness of the subject. It is what Michel Serres termed a quasi-object in his 1982 book The Parasite:

[Serres] explained the quasi-object with the metaphor of a ball in a sports game. Take, for example, a football [soccer] match. The ball becomes an object when the player kicks it. During the kick, the player then becomes subject. But the ball is also a subject because where and how it moves determines the movement and intensity of the players who, accordingly, are the objects of the play. The ball weaves this collective as the players configure themselves around its movement. So the ball is neither fully subject nor fully object. It is a quasi-subject and a quasi-object and so are the players.

(Philippe Parreno, quoted in Obrist, 2015, p. 57)

For Serres, this “quasi-object, when being passed, makes the collective, if it stops it makes the individual” (quoted in Brown, 2013, p. 96). To deepen the analogy, contingency comes into view most prominently as a quasi-object at the point of a sudden switchover. Omschakeling, the term as used in Dutch football, means the moment of a changeover when, say, a team abruptly loses control of the ball to the other team, with the defense possibly out of position at the same time (Kuper, 2010, p. 8). Contingency in this way becomes a very real, shared opportunity to manage surprise.1

So too for social complexity writ large. A chief effect of LaPortean complexity is surprise, as Demchak (1991) stated long ago. Such surprise is as material as power interests, classically (simplistically?) defined as the ability of A to get B to do what B would not otherwise have done. Other synonyms for contingency and surprise—luck, coincidence, happenstance—are quasi-objects as well, binding and defining an individual and his or her network together. Among the many implications that follow from the materiality of contingency in social complexity, three stand out.

First, arguably the greatest surprise is how many recastings are possible for complex issues having many components, high differentiation, and high interconnectivity. You see jewelry where I see sculpture on a small scale; you see the orchestra conductor conducting, I see that conducting more as a dance. I witness the birth of the family’s first child, you see the first child give birth to a family. You see the sketched outline of a toy sailboat (or other desideratum), I point out that the boat’s image is the space left behind after all the other images have inlined it. We both, on the other hand, see the hole without its doughnut. I ask, when is biotechnology bestiality? You ask, are gardens zoos without the cruelty? Isn’t heroism first violence to oneself? Is burglary a kind of architectural criticism? Doesn’t our continuing inability to safely store nuclear waste from our weapons arsenal reveal the Cold War to be the first war in modern times where the United States took direct hits because of an enemy? Further, what does the United States look like when one realizes it is likely a country where more men are raped than women? (Think: its male prison populations.)2

Second, where the materiality of contingency drives politics and policy, the analytic focus shifts to those in the social sciences and policy studies who have the aptitude for recasting the issues that sift out of and emerge from complexity (for one such recasting of crisis management, see again Roe, 2013, 2016). Third and most important, the management focus shifts to identifying, analyzing, and learning from those whose really existing better practices center on managing reliable operations in the face of high and variable contingency, be that materiality unpredictable, uncontrollable, or both. Operators in the centralized control rooms of major critical infrastructures routinely demonstrate skills for managing real-time reliability in the midst of many unpredictable shocks and surprises. These skills in turn have served as a template for the better management of policy and management messes (Roe, 2013).

The upshot of these three factors is that a host of natural language concepts core to crisis analysis and management—risk, threat, vulnerability, exposure—are at best quasi-objects (neither “just” socially constructed nor “things-in-themselves out there”), which is to insist that the concepts are deeply contingent, case by case, on real-time managers, their practices, and the optics for recasting. It has been said that the policy world is complex because it is full of unwanted interruptions, and that the interruptions have made for much unfinished business in that world. Think instead of a policy world defined as complex because the interruptions are wanted (here called recastings) and because the unfinished business of the present opens up to those recastings.

Conclusion

One final issue must be emphasized by way of pulling the preceding points together for future thinking about social complexity, crisis, and management.

Question: What is the perfect recipe for producing worst-case scenarios for unmanageable crises? Answer: Ensure first that the complex system has no boundaries, ensure then that this system cannot be managed to any standard of reliability and safety known to human beings, and thereafter you can be sure almost anything and everything is at risk and catastrophically so. If that is your answer, it is proof-positive social complexity has not been defined beforehand, let alone in anything like LaPortean terms. Nor has your answer been rooted empirically in really existing large sociotechnical systems, particularly critical infrastructures whose control centers routinely manage in the midst of unpredictable or uncontrollable circumstances. All that this recipe for crisis kitsch does is default to deliberately restrictive constellations of natural language synonyms and cognates (for more on crisis kitsch, see Roe, 2018).

References

Brown, S. (2013). In praise of the parasite: The dark organizational theory of Michel Serres. Porto Alegre, 16(1), 83–100.

Demchak, C. (1991). Military organizations, complex machines: Modernization in the U.S. armed services. Ithaca, NY: Cornell University Press.

Glazek, C. (2012). Raise the crime rate. n+1, 13 (Winter).

Kingdon, J. W. (1984). Agendas, alternatives and public policies. Boston, MA: Little, Brown.

Kuper, S. (2010, March 27–28). The secrets of Bayern’s ungainly schoolmaster. Financial Times, p. 8.

La Porte, T. R. (2007). Anticipating rude surprises: Reflections on “crisis management” without end. In L. Jones (Ed.), Communicable crises: Prevention, management and resolution in the global arena (pp. 27–46). Amsterdam, The Netherlands: Elsevier.

La Porte, T. R. (Ed.). (2015). Organized social complexity: Challenge to politics and policy. Princeton, NJ: Princeton University Press.

Le Blanc, C. (1985). Huai-nan Tzu: Philosophical synthesis in early Han thought: The idea of resonance (kan-ying) with a translation and analysis of chapter six. Hong Kong: Hong Kong University Press.

Loewe, M. (2003, June 27). Nothing positive. TLS, 28.

Mazarr, M. (2014). The rise and fall of the failed-state paradigm. Foreign Affairs, 113–121.

Obrist, H. U. (2015). Rethinking the ritual of the exhibition: An interview with Philippe Parreno and Paul Preciado. Mousse Magazine: Contemporary Art Magazine, 47, 48–61.

Perrow, C. (1999). Normal accidents. Princeton, NJ: Princeton University Press.

Perrow, C. (2007). The next catastrophe: Reducing our vulnerabilities to natural, industrial, and terrorist disasters. Princeton, NJ: Princeton University Press.

Rittel, H., & Webber, M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4(2), 155–169.

Roe, E. (2013). Making the most of mess: Reliability and policy in today’s management challenges. Durham, NC: Duke University Press.

Roe, E. (2016). Policy messes and their management. Policy Sciences, 49(4), 351–372.

Roe, E. (2018). Licking the sharp edge of the sword. Journal of Contingencies and Crisis Management, 27(1), 1–7.

Roe, E., & Schulman, P. R. (2008). High reliability management: Operating on the edge. Stanford, CA: Stanford University Press.

Roe, E., & Schulman, P. R. (2016). Reliability and risk: The challenge of managing interconnected infrastructures. Stanford, CA: Stanford University Press.

Roe, E., & Schulman, P. R. (2018). A reliability & risk framework for the assessment and management of system risks in critical infrastructures with central control rooms. Safety Science, 110(Part C), 80–88.

Ryle, G. (1949). The concept of mind. London, U.K.: Hutchinson’s University Library.

Sarotte, M. E. (2014). The collapse: The accidental opening of the Berlin Wall. New York: Basic Books.

Serres, M. (1982). The parasite (L. R. Schehr, Trans.). Baltimore, MD: Johns Hopkins University Press.

Simmons, H. (1982). From asylum to welfare. Ontario, Canada: National Institute on Mental Retardation.

Snowden, D. J., & Boone, M. E. (2007). A leader’s framework for decision making. Harvard Business Review, 85(11), 68.

Notes:

(1.) To highlight surprise and contingency is, of course, not to dismiss the role of material power and money. Professional football is notorious for the role of such interests when it comes to its players. But the suzerainty of material interests holds only so far, and power-and-money explanations do not go far enough, either in the example before us or for social complexity, case by case. What is of concern are the moments when really existing contingency affects—with pith, variety, and force—the way this or that specific game (think now of language games as well) is configured in real time, given whatever rules of the game, external conditions, and the myriad other factors also at play.

Examples of the decisive role of contingency and the unexpected in the course of human history abound. One momentous event, perhaps not fully appreciated, was the fall of the Berlin Wall on November 9, 1989, which we now know to have been the unintended consequence of a run of accidents and errors on the part of the East Germans (e.g., Sarotte, 2014).

(2.) “The Justice Department now seems to be saying that prison rape accounted for the majority of all rapes committed in the US in 2008, likely making the United States the first country in the history of the world to count more rapes for men than for women” (from Glazek, 2012).