Article
Board Processes and Performance: The Impact of Directors’ Social and Human Capital
Morten Huse
What do we know about actual board behavior and board performance? How can we develop our knowledge about board processes and board members’ capabilities? As a research field matures, we learn to see nuances, and the vocabulary used becomes richer and more detailed. However, the development of a consistent and nuanced language in research about board processes and performance has lagged behind.
How have research streams and individual scholars influenced how we do research today, and why are these stories not included in most of the published literature reviews on this topic? What distinguishes research about boards and governance across various disciplines? How do we find research about board processes and board capital, and how has groundbreaking research on the human side of corporate governance developed? The groundbreaking research of Myles Mace was conducted more than half a century ago, and we need to understand what has taken place since the seminal 1989 contribution of Zahra and Pearce. For decades, research about actual board behavior and processes was not published in leading management and strategy journals.
Most published research about board processes and board capital is formulaic, leans on proxies rather than direct observation, and makes only incremental, if any, practical contributions. The message is thus that we should strive for more groundbreaking studies that challenge existing knowledge and practice, including our research practice. A research agenda about board processes and board capital should be influenced by some of the following suggestions:
• It should go beyond formulaic and incremental studies. We should challenge existing wisdom and practice and search for alternative ways of doing research.
• It should include more processual studies and fewer archival-data studies that rely on proxies.
• We should learn from the scholars doing groundbreaking research before us.
• We should learn by comparing experiences from various types of organizations.
• We must include lessons and publications not found in leading English-language journals.
• We should apply a sharing philosophy and a programmatic approach in which we as researchers contribute to developing future generations of scholars.
Article
Content and Text Analysis Methods for Organizational Research
Rhonda K. Reger and Paula A. Kincaid
Content analysis is to words (and other unstructured data) as statistics is to numbers (also called structured data)—an umbrella term encompassing a range of analytic techniques. Content analyses range from purely qualitative analyses, often used in grounded theorizing and case-based research to reduce interview data into theoretically meaningful categories, to highly quantitative analyses that use concept dictionaries to convert words and phrases into numerical tables for further quantitative analysis. Common specialized types of qualitative content analysis include methods associated with grounded theorizing, narrative analysis, discourse analysis, rhetorical analysis, semiotic analysis, interpretative phenomenological analysis, and conversation analysis. Major quantitative content analyses include dictionary-based approaches, topic modeling, and natural language processing. Though the specific steps vary across types of content analysis, a prototypical content analysis requires eight steps, beginning with defining coding units and ending with assessing the trustworthiness, reliability, and validity of the overall coding. Furthermore, while most content analysis evaluates textual data, some studies also analyze visual data such as gestures, videos, and pictures, and verbal data such as tone.
Content analysis has several advantages over other data collection and analysis methods. It provides a flexible set of tools suitable for many research questions where quantitative data are unavailable. Many forms of content analysis provide a replicable methodology for accessing individual and collective structures and processes. Moreover, content analysis of documents and videos that organizational actors produce in the normal course of their work provides unobtrusive ways to study sociocognitive concepts and processes in context, and thus avoids some of the most serious concerns associated with other commonly used methods. At the same time, content analysis requires significant researcher judgment, such that inadvertent biasing of results is a common concern. On balance, content analysis is a promising approach for the rigorous exploration of many important but difficult-to-study issues that are not easily examined via other methods. For these reasons, content analysis is burgeoning in business and management research as researchers seek to study complex and subtle phenomena.
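As a minimal illustration of the dictionary-based approach mentioned above, the following Python sketch converts raw text into category counts. The concept dictionary, category names, and example sentence are hypothetical; published studies rely on validated word lists.

```python
from collections import Counter
import re

# Hypothetical concept dictionary mapping categories to indicator words;
# real studies use validated, pretested word lists.
dictionary = {
    "uncertainty": {"may", "might", "possibly", "risk", "uncertain"},
    "optimism":    {"growth", "improve", "opportunity", "confident"},
}

def code_document(text: str) -> dict:
    """Convert a document's words into counts per dictionary category."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    return {cat: sum(counts[w] for w in words)
            for cat, words in dictionary.items()}

print(code_document("We are confident that growth may improve, despite risk."))
# {'uncertainty': 2, 'optimism': 3}
```

The resulting numerical table can then feed the further quantitative analysis the abstract describes.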
Article
Corporate Ethics
Thomas Donaldson and Diana C. Robertson
Serious research into corporate ethics is nearly half a century old. Two approaches have dominated research: one is normative, the other empirical. The former, the normative approach, develops theories and norms that are prescriptive, that is, ones designed to guide corporate behavior. The latter, the empirical approach, investigates the character and causes of corporate behavior by examining corporate governance structures, policies, corporate relationships, and managerial behavior, with the aim of explaining and predicting corporate behavior. Normative research has been led by scholars in the fields of moral philosophy, theology, and legal theory. Empirical research has been led by scholars in the fields of sociology, psychology, economics, marketing, finance, and management.
While utilizing distinct methods, the two approaches are symbiotic. Ethical and legal theory are irrelevant without factual context. Similarly, empirical theories are sterile unless translated into corporate guidance. The following description of the history of research in corporate ethics demonstrates that normative research methods are indispensable tools for empirical inquiry, even as empirical methods are indispensable tools for normative inquiry.
Article
Entrepreneurial Teams
Nicola Breugst
Entrepreneurial teams develop and exploit ideas in order to turn them into entrepreneurial ventures that they jointly own and manage. While these teams are crucial drivers for the success of their ventures, their work can be challenging because they operate under conditions of high autonomy, uncertainty, and interdependence. Thus, it is important to understand how entrepreneurial teams work together and jointly advance their ventures. Research has followed three overarching approaches to explore how entrepreneurial teams can succeed in their endeavors. First, one stream of research has aimed at connecting team inputs, such as team members’ experiences, to firm-level outcomes. In a second stream of research, scholars have focused on what happens within entrepreneurial teams in terms of team processes and emergent states. This approach has identified various mechanisms that translate inputs into outcomes. Third, an increasing number of studies have started to unravel the complexities that entrepreneurial teams experience in their work. Specifically, this research has considered the mutual influence of team members and has explored how teams work on their tasks and are shaped by this work. Despite these advancements, entrepreneurial team research faces numerous challenges arising from the complex interplay of team members and their ventures as well as from access to high-quality data. Because of these and other challenges, many research questions around entrepreneurial teams still need to be addressed to better understand their work. These emerging research efforts are likely to be facilitated by additional data sources, such as educational programs devoted to advancing entrepreneurial teams and modern technologies promising better access to rich data. Overall, entrepreneurial team research not only contributes to a more nuanced understanding of the entrepreneurial process but also provides support for these teams as they create and nurture their ventures.
Article
Experience Sampling Methodology
Joel Koopman and Nikolaos Dimotakis
Experience sampling is a method aimed primarily at examining within-individual covariation of transient phenomena using repeated measures. It can be applied to test nuanced predictions of extant theories and can provide insights that are otherwise difficult to obtain. It does so by examining the phenomena of interest close to where they occur, thus avoiding issues with recall and similar concerns. Alternatively, data collected through the experience sampling method (ESM) can be aggregated into highly reliable measures for investigating between-individual phenomena.
A number of decisions need to be made when designing an ESM study. Study duration and intensity (that is, total days of measurement and total assessments per day) represent a tradeoff between data richness and participant fatigue that needs to be carefully weighed. Other scheduling options need to be considered, such as triggered versus scheduled surveys. Researchers also need to be aware of the generally high cost of this approach, in both monetary and nonmonetary resources.
The intensity of this method also requires special consideration of the sample and the context. Proper screening is invaluable; ensuring that participants and their context are applicable and appropriate to the design is an important first step. The next step is ensuring that the surveys are planned in a way compatible with the sample, and that they are designed to appropriately and rigorously collect data that can accomplish the aims of the study at hand.
Furthermore, ESM data typically require careful consideration of how the data will be analyzed and how results will be interpreted. Proper attention to analytic approaches (typically multilevel) is required. Finally, when interpreting results from ESM data, one must not forget that these effects typically represent processes that occur continuously across individuals’ working lives—effect sizes thus need to be considered with this in mind.
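Because ESM observations are nested within persons, a standard first analytic step is to separate within-person fluctuation from between-person differences. The Python sketch below uses hypothetical daily reports and invented variable names; a full analysis would fit a multilevel model rather than a raw correlation.

```python
import pandas as pd

# Hypothetical ESM data: repeated daily reports nested within persons.
esm = pd.DataFrame({
    "person":  ["p1"] * 4 + ["p2"] * 4,
    "stress":  [2, 4, 3, 5, 1, 2, 2, 3],
    "fatigue": [3, 5, 4, 6, 2, 2, 3, 4],
})

# Person-mean centering separates within-person fluctuation from
# stable between-person differences.
for var in ["stress", "fatigue"]:
    person_mean = esm.groupby("person")[var].transform("mean")
    esm[f"{var}_within"] = esm[var] - person_mean
    esm[f"{var}_between"] = person_mean

# Within-individual covariation, the typical ESM estimand; a multilevel
# model would be used in practice instead of this raw correlation.
print(esm["stress_within"].corr(esm["fatigue_within"]))
```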
Article
Individualism-Collectivism: A Review of Conceptualization and Measurement
Chao C. Chen and Ali F. Unal
The concept of individualism-collectivism (I-C) has been a prominent construct in philosophy, political science, sociology, psychology, and organization and management. Its meaning may vary greatly in scope, content, and levels of analysis, depending on the fields of inquiry and the phenomenon of interest. We focus on I-C as it relates to values, identities, motives, and behaviors in the context of organization and management. At its core, I-C is about self-collective relationships and the impact they have on the relational dynamics and outcomes at various levels of analysis. Theory and research have identified patterns of contrasts between individualism and collectivism. While the individualist orientation emphasizes individual self-identity, personal agency, and values that tend to prioritize individuals over collectives, the collectivist orientation emphasizes individuals’ collective identity, collective agency, and values that tend to prioritize collectives over individuals.
Various I-C conceptions have been critically evaluated with a focus on basic assumptions regarding the nature of individualism and collectivism as unidimensional, bidimensional, or multidimensional constructs, and on whether individualism and collectivism are conceived as inherently oppositional or as complementary parts of a higher-order construct. Specifically, previous reviews of culture and value studies in general, and of I-C studies in particular, acknowledge the possibility that individualist and collectivist orientations may coexist within a diverse society, organization, or group, and that those orientations may change over time or evolve to tackle emergent survival challenges. However, most previous reviews continue to treat I-C as a unitary construct composed of two polar opposites, such that being high on one implies being low on the other. Over time, instead of or in addition to the initial unidimensional conception of I-C, research has adopted bidimensional or multidimensional conceptions. Furthermore, most bi- or multidimensional conceptions have adopted a unipolar approach. That is, while maintaining I-C as a higher-order construct, individualism and collectivism are conceived as independent dimensions, each varying on a separate continuum, making it possible to characterize individuals, groups, and societies by various combinations of individualism and collectivism.
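The measurement difference between the bipolar and unipolar conceptions can be made concrete. The Python sketch below uses hypothetical scale scores and an arbitrary midpoint of 4 on a 1–7 scale; it illustrates only the two scoring logics, not a validated measurement procedure.

```python
import pandas as pd

# Hypothetical scores on separate individualism (IND) and
# collectivism (COL) scales, each rated 1-7.
people = pd.DataFrame({"IND": [6.2, 2.8, 5.9, 3.1],
                       "COL": [5.8, 6.1, 2.4, 2.9]})

# Unidimensional (bipolar) scoring: one score, so high individualism
# necessarily implies low collectivism.
people["bipolar"] = people["IND"] - people["COL"]

# Bidimensional (unipolar) scoring: keep both continua, so high-high
# and low-low profiles can exist (the midpoint of 4 is arbitrary).
people["profile"] = (
    (people["IND"] >= 4).map({True: "high-IND", False: "low-IND"})
    + "/"
    + (people["COL"] >= 4).map({True: "high-COL", False: "low-COL"})
)
print(people)
```

Note how the first person, high on both scales, looks merely neutral under the bipolar score but is classified high-IND/high-COL under the unipolar approach.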
The advantages of the multidimensional approach have been emphasized, but issues of conceptual muddiness have also been raised, together with the challenges of theory-based research. It is recommended that I-C researchers be mindful of conceptual equivalence in developing I-C constructs and measurements and consider optimal distinctiveness theory and the dialectical perspective as two potential overarching perspectives for comparative research on I-C. Finally, areas of future research have been identified as fertile fields for generating knowledge and understanding of I-C.
Article
Institutional Logics
Heather A. Haveman and Gillian Gualtieri
Research on institutional logics examines systems of cultural elements (values, beliefs, and normative expectations) by which people, groups, and organizations make sense of and evaluate their everyday activities, and organize those activities in time and space. Although there were scattered mentions of this concept before 1990, the literature really began with the 1991 publication of a theory piece by Roger Friedland and Robert Alford. Since that time, it has become a large and diverse area of organizational research. Several books and thousands of papers and book chapters have been published on this topic, addressing institutional logics in sites as different as climate change proceedings of the United Nations, local banks in the United States, and business groups in Taiwan. Several intellectual precursors to institutional logics provide a detailed explanation of the concept and the theory surrounding it; these literatures developed over time within the broader framework of theory and empirical work in sociology, political science, and anthropology. Papers published in ten major sociology and management journals in the United States and Europe between 1990 and 2015 provide a basis for identifying trends in theoretical development and empirical findings. Evaluating these trends suggests three gentle corrections and potentially useful extensions to the literature that can help guide future research: (1) limiting the definition of institutional logic to cultural-cognitive phenomena, rather than including material phenomena; (2) recognizing both “cold” (purely rational) cognition and “hot” (emotion-laden) cognition; and (3) developing and testing a theory (or multiple related theories), that is, a logically interconnected set of propositions concerning a delimited set of social phenomena, derived from assumptions about essential facts (axioms), that details causal mechanisms and yields empirically testable (falsifiable) hypotheses. The third correction requires being more consistent about how concepts are used in theoretical statements, assessing the reliability and validity of empirical measures, and conducting meta-analyses of the many inductive studies that have been published in order to develop deductive theories.
Article
Intersectionality Theory and Practice
Doyin Atewologun
Intersectionality is a critical framework that provides us with the mindset and language for examining interconnections and interdependencies between social categories and systems. Intersectionality is relevant for researchers and for practitioners because it enhances analytical sophistication and offers theoretical explanations of the ways in which heterogeneous members of specific groups (such as women) might experience the workplace differently depending on their ethnicity, sexual orientation, and/or class and other social locations. Sensitivity to such differences enhances insight into issues of social justice and inequality in organizations and other institutions, thus maximizing the chance of social change.
The concept of intersectional locations emerged from the racialized experiences of minority ethnic women in the United States. Intersectional thinking has gained increased prominence in business and management studies, particularly in critical organization studies. A predominant focus in this field is on individual subjectivities at intersectional locations (such as examining the occupational identities of minority ethnic women). This emphasis on individuals’ experiences and within-group differences has been described variously as “content specialization” or an “intracategorical approach.” An alternate focus in business and management studies is on highlighting systematic dynamics of power. This encompasses a focus on “systemic intersectionality” and an “intercategorical approach.” Here, scholars examine multiple between-group differences, charting shifting configurations of inequality along various dimensions.
As a critical theory, intersectionality conceptualizes knowledge as situated, contextual, relational, and reflective of political and economic power. Intersectionality tends to be associated with qualitative research methods due to the central role of giving voice, elicited through focus groups, narrative interviews, action research, and observations. Intersectionality is also utilized as a methodological tool for conducting qualitative research, such as by researchers adopting an intersectional reflexivity mindset. Intersectionality is also increasingly associated with quantitative and statistical methods, which contribute to intersectionality by helping us understand and interpret the individual and combined (additive or multiplicative) effects of various categories (privileged and disadvantaged) in a given context. Future considerations for intersectionality theory and practice include managing its broad applicability while attending to its sociopolitical and emancipatory aims, and theoretically advancing understanding of the simultaneous forces of privilege and penalty in the workplace.
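The contrast between additive and multiplicative effects can be expressed in regression terms. The Python sketch below uses simulated data and hypothetical variable names (pay, gender, ethnicity); the interaction term in the second model is what lets the penalty attached to one category depend on the other.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
# Hypothetical workplace survey: two social categories and a pay outcome.
gender = rng.choice(["man", "woman"], n)
ethnicity = rng.choice(["majority", "minority"], n)
# Simulated penalties: each category carries a penalty, and the
# intersectional location carries an extra (multiplicative) penalty.
pay = (50.0
       - 3.0 * (gender == "woman")
       - 2.0 * (ethnicity == "minority")
       - 2.5 * ((gender == "woman") & (ethnicity == "minority"))
       + rng.normal(0, 4, n))
df = pd.DataFrame({"gender": gender, "ethnicity": ethnicity, "pay": pay})

# Additive model: assumes category effects simply sum.
additive = smf.ols("pay ~ C(gender) + C(ethnicity)", data=df).fit()
# Multiplicative model: the interaction term captures the extra
# penalty (or privilege) attached to the intersectional location.
multiplicative = smf.ols("pay ~ C(gender) * C(ethnicity)", data=df).fit()
print(multiplicative.params)
```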
Article
Mediation: Causal Mechanisms in Business and Management
Patrick J. Rosopa, Phoebe Xoxakos, and Coleton King
Mediation refers to causation. Tests for mediation are common in business, management, and related fields. In the simplest mediation model, a researcher asserts that a treatment causes a mediator and that the mediator causes an outcome. For example, a practitioner might examine whether diversity training increases awareness of stereotypes, which, in turn, improves inclusive climate perceptions. Because mediation inferences are causal inferences, it is important to demonstrate that the cause actually precedes the effect, the cause and effect covary, and rival explanations for the causal effect can be ruled out.
Although various experimental designs for testing mediation hypotheses are available, single randomized experiments and two randomized experiments provide the strongest evidence for inferring mediation compared with nonexperimental designs, where selection bias and a multitude of confounding variables can make causal interpretations difficult. In addition to experimental designs, traditional statistical approaches for testing mediation include causal steps, difference in coefficients, and product of coefficients. Of the traditional approaches, the causal steps method tends to have low statistical power; the product of coefficients method tends to provide adequate power. Bootstrapping can improve the performance of these tests for mediation. The general causal mediation framework offers a modern approach to testing for causal mechanisms. The general causal mediation framework is flexible. The treatment, mediator, and outcome can be categorical or continuous. The general framework not only incorporates experimental designs (e.g., single randomized experiments, two randomized experiments) but also allows for a variety of statistical models and complex functional forms.
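To make the product-of-coefficients logic concrete, the following Python sketch estimates the indirect effect a*b for a simple x -> m -> y model and bootstraps a percentile confidence interval. The data are simulated under an assumed mediation structure, so this illustrates the mechanics rather than establishing causal inference, which still depends on design.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Simulated data consistent with x -> m -> y
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + rng.normal(size=n)

def coef(design_cols, outcome, col):
    """OLS coefficient for column `col` (0 = intercept)."""
    X = np.column_stack([np.ones(len(outcome))] + design_cols)
    return np.linalg.lstsq(X, outcome, rcond=None)[0][col]

a = coef([x], m, 1)       # treatment -> mediator
b = coef([x, m], y, 2)    # mediator -> outcome, controlling for treatment
indirect = a * b

# Percentile bootstrap of the indirect effect
boot = []
for _ in range(2000):
    i = rng.integers(0, n, n)
    boot.append(coef([x[i]], m[i], 1) * coef([x[i], m[i]], y[i], 2))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect = {indirect:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```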
Article
Organizational Neuroscience
Sebastiano Massaro and Dorotea Baljević
Organizational neuroscience—a novel scholarly domain using neuroscience to inform management and organizational research, and vice versa—is flourishing. Still missing, however, is a comprehensive treatment of organizational neuroscience as a self-standing scientific field: a foundational account of the potential that neuroscience holds to advance management and organizational research. This gap can be addressed by reviewing the field’s main methods and systematizing the existing scholarly literature across domains including entrepreneurship, strategic management, and organizational behavior, among others.
Article
Qualitative Designs and Methodologies for Business, Management, and Organizational Research
Robert P. Gephart and Rohny Saylors
Qualitative research designs provide future-oriented plans for undertaking research. Designs should describe how to effectively address and answer a specific research question using qualitative data and qualitative analysis techniques. Designs connect research objectives to observations, data, methods, interpretations, and research outcomes. Qualitative research designs focus initially on collecting data to provide a naturalistic view of social phenomena and to understand the meaning the social world holds from the point of view of social actors in real settings. The outcomes of qualitative research designs are situated narratives of people’s activities in real settings, reasoned explanations of behavior, discoveries of new phenomena, and the creation and testing of theories.
A three-level framework can be used to describe the layers of qualitative research design and conceptualize its multifaceted nature. Note, however, that qualitative research is a flexible and not fixed process, unlike conventional positivist research designs that are unchanged after data collection commences. Flexibility provides qualitative research with the capacity to alter foci during the research process and make new and emerging discoveries.
The first or methods layer of the research design process uses social science methods to rigorously describe organizational phenomena and provide evidence that is useful for explaining phenomena and developing theory. Description is done using empirical research methods for data collection including case studies, interviews, participant observation, ethnography, and collection of texts, records, and documents.
The second or methodological layer of research design offers three formal logical strategies to analyze data and address research questions: (a) induction to answer descriptive “what” questions; (b) deduction and hypothesis testing to address theory oriented “why” questions; and (c) abduction to understand questions about what, how, and why phenomena occur.
The third or social science paradigm layer of research design is formed by broad social science traditions and approaches that reflect distinct theoretical epistemologies—theories of knowledge—and diverse empirical research practices. These perspectives include positivism, interpretive induction, and interpretive abduction (interpretive science). There are also scholarly research perspectives that reflect on and challenge, or seek to change, management thinking and practice, rather than producing rigorous empirical research or evidence-based findings. These perspectives include critical research, postmodern research, and organization development.
Three additional issues are important to future qualitative research designs. First, there is renewed interest in the value of covert research undertaken without the informed consent of participants. Second, there is an ongoing discussion of the best style to use for reporting qualitative research. Third, there are new ways to integrate qualitative and quantitative data. These are needed to better address the interplay of qualitative and quantitative phenomena found in everyday discourse, an interplay that has so far been overlooked.
Article
Social Network Analysis in Organizations
Jessica R. Methot, Nazifa Zaman, and Hanbo Shim
A social network is a set of actors—that is, any discrete entity in a network, such as a person, team, organization, place, or collective social unit—and the ties connecting them—that is, some type of relationship, exchange, or interaction between actors that serves as a conduit through which resources such as information, trust, goodwill, advice, and support flow. Social network analysis (SNA) is the use of graph-theoretic and matrix algebraic techniques to study the social structure, interactions, and strategic positions of actors in social networks. As a methodological tool, SNA allows scholars to visualize and analyze webs of ties to pinpoint the composition, content, and structure of organizational networks, as well as to identify their origins and dynamics, and then link these features to actors’ attitudes and behaviors. Social network analysis is a valuable and unique lens for management research; there has been a marked shift toward the use of social network analysis to understand a host of organizational phenomena. To this end, organizational network analysis (ONA) is centered on how employees, groups, and organizations are connected and how these connections provide a quantifiable return on human capital investments. Although criticisms have traditionally been leveled against social network analysis, the foundations of network science have a rich history, and ONA has evolved into a well-established paradigm and a modern-day trend in management research and practice.
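As a minimal illustration of the graph-theoretic techniques SNA relies on, the Python sketch below builds a small hypothetical advice network with the networkx library and computes two standard centrality indices; the names and ties are invented.

```python
import networkx as nx

# Hypothetical advice network among six employees (tie = "seeks advice from").
G = nx.Graph()
G.add_edges_from([
    ("Ana", "Ben"), ("Ana", "Cruz"), ("Ben", "Cruz"),
    ("Cruz", "Dina"), ("Dina", "Eli"), ("Eli", "Fay"),
])

# Degree centrality: the share of possible ties an actor actually has.
print(nx.degree_centrality(G))
# Betweenness centrality: how often an actor sits on shortest paths
# between others, a proxy for brokerage of information flows.
print(nx.betweenness_centrality(G))
```

Linking such indices to attitudes and behaviors, as the abstract describes, is then a matter of merging them with actor-level data.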
Article
Structural Equation Modelling
Wayne Crawford and Esther Lamarre Jean
Structural equation modelling (SEM) is a family of models in which multivariate techniques are used to simultaneously examine complex relationships among variables. The goal of SEM is to evaluate the extent to which proposed relationships reflect the actual pattern of relationships present in the data. SEM users employ specialized software to develop a model, which then generates a model-implied covariance matrix. The model-implied covariance matrix is based on the user-defined theoretical model and represents the user’s beliefs about relationships among the variables. Guided by the user’s predefined constraints, SEM software employs a combination of factor analysis and regression to generate a set of parameters (often through maximum likelihood [ML] estimation) that produce the model-implied covariance matrix representing the relationships between the variables included in the model. Structural equation modelling capitalizes on the benefits of both factor analysis and path analytic techniques to address complex research questions. Structural equation modelling consists of six basic steps: model specification; identification; estimation; evaluation of model fit; model modification; and reporting of results.
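The core idea, that a set of structural parameters implies a covariance matrix which estimation then matches to the sample covariance matrix, can be shown in a few lines. The Python sketch below computes the implied covariance for a simple recursive path model x -> m -> y with hypothetical parameter values, using the standard reduced-form identity Sigma = (I - B)^-1 Psi (I - B)^-T.

```python
import numpy as np

# Hypothetical path coefficients for x -> m -> y
a, b = 0.5, 0.4
# B[i, j] holds the path from variable j to variable i (order: x, m, y)
B = np.array([[0.0, 0.0, 0.0],
              [a,   0.0, 0.0],
              [0.0, b,   0.0]])
# Psi: variance of x and residual variances of m and y (hypothetical)
Psi = np.diag([1.0, 1.0, 1.0])

# Reduced form: v = (I - B)^-1 e  =>  Cov(v) = (I - B)^-1 Psi (I - B)^-T
inv = np.linalg.inv(np.eye(3) - B)
implied = inv @ Psi @ inv.T
print(np.round(implied, 3))  # model-implied covariance matrix
```

Estimation in SEM software amounts to adjusting a, b, and the variances in Psi until this implied matrix is as close as possible, under ML or another discrepancy function, to the observed covariance matrix.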
Conducting SEM analyses requires certain data considerations, as data-related problems are often the reason for software failures. These considerations include sample size, data screening for multivariate normality, examining outliers and multicollinearity, and assessing missing data. Furthermore, three notable issues SEM users might encounter are common method variance, subjectivity and transparency, and alternative model testing. First, addressing common method variance requires recognizing three types of variance: common variance (variance shared with the factor); specific variance (reliable variance not explained by common factors); and error variance (unreliable and inexplicable variation in the variable). Second, SEM still lacks clear guidelines for the modelling process, which threatens replicability. Decisions are often subjective and based on the researcher’s preferences and knowledge of what is most appropriate for achieving the best overall model. Finally, reporting alternatives to the hypothesized model is another issue that SEM users should consider when analyzing structural equation models. When testing a hypothesized model, SEM users should consider alternative (nested) models derived by constraining or eliminating one or more paths in the hypothesized model. Alternative models offer several benefits; however, they should be driven and supported by existing theory. It is important for the researcher to clearly report and provide findings on the alternative model(s) tested.
Users of SEM often encounter common model-specific issues. Heywood cases, nonidentification, and nonpositive definite matrices are among the most frequent. Heywood cases arise when negative variances or squared multiple correlations greater than 1.0 appear in the results; the researcher can often resolve this by constraining the offending residual to a small plausible value. Nonpositive definite matrices result from linear dependencies and/or correlations greater than 1.0. To address this, researchers can attempt to ensure all indicator variables are independent, inspect output manually for negative residual variances, evaluate whether the sample size is appropriate, or re-specify the proposed model. When used properly, structural equation modelling is a powerful tool that allows for the simultaneous testing of complex models.
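A quick diagnostic for the nonpositive definite problem is to inspect the eigenvalues of the input covariance or correlation matrix. The Python sketch below constructs a hypothetical correlation matrix containing a linear dependency (a correlation of 1.0 between two indicators) and shows how a negative eigenvalue flags it.

```python
import numpy as np

# Hypothetical correlation matrix with a linear dependency between
# the first two indicators (r = 1.0), a classic nonpositive definite case.
R = np.array([[1.0, 1.0, 0.3],
              [1.0, 1.0, 0.4],
              [0.3, 0.4, 1.0]])

eigenvalues = np.linalg.eigvalsh(R)
print(eigenvalues)              # a negative eigenvalue flags the problem
print(np.all(eigenvalues > 0))  # False: R is not positive definite
```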