1-15 of 15 Results

  • Keywords: data analysis

Article

The role of theory in qualitative data analysis is continually shifting and offers researchers many choices. The dynamic and inclusive nature of qualitative research has encouraged the entry of a number of interested disciplines into the field. These discipline groups have introduced new theoretical practices that have influenced and diversified methodological approaches. In addition, broader chronological shifts in theoretical orientation can be seen in four waves of paradigmatic change. The first wave showed a developing concern with the limitations of researcher objectivity and with empirical observation of evidence-based data, leading to the second wave, with its focus on realities mutually constructed by researcher and researched, on participant subjectivity, and on the remedying of societal inequalities and maldistributed power. The third wave was prompted by the advent of Postmodernism and Post-structuralism, with their emphasis on chaos, complexity, intertextuality, and multiple realities. Most recently, the fourth wave brought a focus on visual images, performance, both an active researcher and an interactive audience, and the crossing of the theoretical divide between social science and classical physics. The methods and methodological changes that have evolved from these paradigm shifts have followed a similar pattern of change. The researcher now has multiple paradigms, co-methodologies, diverse methods, and a variety of theoretical choices to consider. This continuum of change has shifted the field of qualitative research dramatically from limited choices to multiple options, requiring clarification of researcher decisions and transparency of process. There remains, however, the difficult question of the role that theory will now play in such complex designs and in critical researcher reflexivity.

Article

Micah Dillard and Jon C.W. Pevehouse

Scholarship in international relations has taken a more quantitative turn in the past four decades. The field of foreign policy analysis was arguably the forerunner in the development and application of quantitative methodologies in international relations. From public opinion surveys to events data to experimental methods, many of the earliest uses of quantitative methodologies can be found in foreign policy analysis. On substantive questions ranging from the causes of war to the dynamics of public opinion, the analysis of data quantitatively has informed numerous debates in foreign policy analysis and international relations. Emerging quantitative methods will be useful in future efforts to analyze foreign policy.

Article

James A. Muncy and Alice M. Muncy

Business research is conducted both by businesspeople, who have informational needs, and by scholars, whose field of study is business. Though some of the specifics of how research is conducted differ between scholarly research and applied research, the general process they follow is the same. Business research is conducted in five stages. The first stage is problem formation, where the objectives of the research are established. The second stage is research design. In this stage, the researcher identifies the variables of interest and possible relationships among those variables, decides on the appropriate data source and measurement approach, and plans the sampling methodology. It is also within the research design stage that the role time will play in the study is determined. The third stage is data collection. Researchers must decide whether to outsource the data collection process or collect the data themselves. Data quality issues must also be addressed during collection. The fourth stage is data analysis. The data must be prepared and cleaned. Statistical packages or programs such as SAS, SPSS, STATA, and R are used to analyze quantitative data. In the case of qualitative data, coding, artificial intelligence, and/or interpretive analysis is employed. The fifth stage is the presentation of results. In applied business research, the results are typically limited in their distribution and must address the immediate problem at hand. In scholarly business research, the results are intended to be widely distributed through journals, books, and conferences. As a means of quality control, scholarly research usually goes through a double-blind review process before it is published.
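The fourth stage above (prepare, clean, then analyze the data) can be sketched in a few lines. The abstract names SAS, SPSS, STATA, and R; Python is used here purely as an illustrative stand-in, and the survey values, the 1-7 Likert range, and all names are hypothetical assumptions, not anything specified in the text.

```python
from statistics import mean, stdev

def clean_responses(raw):
    """Convert raw survey entries to floats, dropping missing or
    malformed values and entries outside an assumed 1-7 Likert range."""
    cleaned = []
    for value in raw:
        try:
            x = float(value)
        except (TypeError, ValueError):
            continue  # blank strings or None: treated as missing
        if 1 <= x <= 7:
            cleaned.append(x)
    return cleaned

def describe(values):
    """Basic descriptives of the kind a statistical package reports."""
    return {"n": len(values), "mean": mean(values), "sd": stdev(values)}

raw = ["5", "6", "", "7", "4", "99", None, "3"]  # "99" is an out-of-range code
stats = describe(clean_responses(raw))
```

Cleaning here keeps five valid responses; the out-of-range "99" and the two missing entries are dropped before any statistics are computed, mirroring the "prepared and cleaned" step the abstract describes.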

Article

Making appropriate methodological and analytic decisions in educational research requires a thorough grounding in the literature and a thorough understanding of the chosen methodology. Detailed preplanning is important for all method types and includes an understanding of the assumptions, limitations, and delimitations of the study. For quantitative research, researchers should be cautious with data analysis decisions that give preference to statistically significant results, noting that quantitative research can proceed with intents other than confirmatory hypothesis testing. Decisions and procedures that are used to search for low p values, rather than to answer the driving research question, are especially problematic. Presentation of quantitative results should include components that clarify and account for analytic choices, that report all relevant statistical results, and that provide sufficient information to replicate the study. Consideration should also be given to joining recent initiatives for more transparency in research through the use of preregistered studies and open data repositories. For qualitative research, researchers should be thoughtful about choosing a specific method for their project that appropriately matches the method’s framework and analytic procedures with the research aim and anticipated sample. Qualitative researchers should also strive for transparency in their method description by allowing a view of the analytic process that drove the data collection and iterative dives into the data. Presentation of qualitative results requires a balance between providing a compelling narrative that establishes the trustworthiness of results and making judicious use of participant voices. Mixed methods research also requires appropriate integration of different data types.

Article

Paul R. Hensel

The International Studies Association’s (ISA) Scientific Study of International Processes (SSIP) section is dedicated to the systematic analysis of empirical data covering the entire range of international political questions. Drawing on the canons of scientific inquiry, SSIP seeks to support and promote replicable research in terms of the clarity of a theoretical argument and/or the testing of hypotheses. Journals that have been most likely to publish SSIP-related research include the top three general journals in the field of political science: the American Political Science Review, American Journal of Political Science, and Journal of Politics. A number of more specialized journals frequently publish research of interest to the SSIP community, such as Conflict Management and Peace Science, International Interactions, International Organization, International Studies Quarterly, Journal of Conflict Resolution, and Journal of Peace Research. Together, these journals published a total of 1,024 qualifying articles between 2003 and 2010. These articles cover a wide range of topics, from armed conflict and conflict management to terrorism, international political economy, economic development or growth, monetary policy, foreign aid, sanctions, human rights and repression, international law, international organizations/institutions, and foreign policy attitudes and beliefs. Data users who are interested in conducting their own research must choose the most appropriate data set(s); become familiar with what the data set includes and how its central concepts are measured; evaluate multipurpose data sources; investigate missing data; and assess robustness across multiple data sets.

Article

Kyle Beardsley, Patrick James, Jonathan Wilkenfeld, and Michael Brecher

Over the course of more than four decades the International Crisis Behavior (ICB) Project, a major and ongoing data-gathering enterprise in the social sciences, has compiled data that continues to be accessed heavily in scholarship on conflict processes. ICB holdings consist of full-length qualitative case studies, along with an expanding range of quantitative data sets. Founded in 1975, the ICB Project is among the most visible and influential within the discipline of International Relations (IR). A wide range of studies based either primarily or in part on the ICB’s concepts and data have accumulated and cover subjects that include the causes, processes, and consequences of crises. The breadth of ICB’s contribution has expanded over time to go beyond a purely state-centric approach to include crisis-related activities of transnational actors across a range of categories. ICB also offers depth through, for example, potential resolution of contemporary debates about mediation in crises on the basis of nuanced findings about long- versus short-term impact with regard to conflict resolution.

Article

A researcher’s methodological approach is guided by his or her orientation toward three major philosophical assumptions: epistemological assumptions (i.e., what the nature of truth or knowledge is and how it can be pursued), ontological assumptions (i.e., what the nature of reality is and how it can be understood), and axiological assumptions (i.e., what the researcher’s position in the world is and responsibilities to it). Qualitative inquiry is largely guided by methodological beliefs that hold truth and reality as socially constructed, that value subjectivity over objectivity, that explore questions of “how” or “why” over questions of “what,” and that value participants’ voices and experiences. Broadly, qualitative inquiry seeks to describe the world as it is experienced and lived in by the participants under study. With respect to intergroup communication, qualitative inquiry takes an in-depth approach to understanding how members of a community or culture enact the behaviors of everyday life relevant to their group. Qualitative inquiry comprises several methodologies or methodological approaches including ethnography, autoethnography, and ethnography of communication; narrative paradigm and narrative theory; grounded theory; phenomenology; and case studies. Each methodology employs one method or a combination of methods to collect qualitative data. Methods refer to the tools used to collect data for the purposes of informing research and answering research questions. Qualitative methods include tools for the collection of descriptive, largely non-numeric data, including several types of interviews, observations, and interactions, and the collection of meaningful texts, documents, and objects. The collection of qualitative data often requires the researcher to establish a trusting relationship (rapport) with participants and gain an insider’s (emic) perspective of the context for study. 
In many cases, this is established through prolonged engagement in the field and carefully crafting interview questions that encourage detailed disclosures. Qualitative data are analyzed through a process of dissection, up-close examination, contrast, and comparison between units of data and then putting pieces back together in a synergetic way that represents data holistically. Most qualitative data analysis involves some form of coding: a process of identifying units of data that are relevant to the research questions, assigning them a short label or code, then clustering similar codes into increasingly abstract thematic categories. Researchers establish trustworthiness in qualitative reports through descriptive writing that preserves the voices of the participants, that reflects the social realities of the participants, and that contextualizes results within broader scholarly discourse by tying findings to previous theory or research. Qualitative research reports can take many forms that range from creative forms of writing and representation including poetry and photographs to more conventional forms of writing that fit expectations of social scientific academic journals. When applied to intergroup contexts, qualitative inquiry can make evident the language and communication patterns and social behaviors that distinguish one group from another. Field observations can reveal identity performance and group behavior. Interviews can solicit information from participants about in-group or out-group perceptions and experiences. And the collection and analysis of texts and documents can establish the means through which group identity is preserved and transferred.
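The coding process described above (labeling units of data with short codes, then clustering similar codes into more abstract thematic categories) can be sketched minimally. The excerpts, codes, and theme names below are hypothetical illustrations invented for this sketch, not examples from the article.

```python
from collections import Counter

# Units of data (e.g., interview excerpts) each assigned a short code.
coded_units = [
    ("We always eat together on Sundays",     "family ritual"),
    ("Grandma corrects how we speak",         "language policing"),
    ("The festival is when we feel most us",  "family ritual"),
    ("Outsiders never get the jokes",         "in-group humor"),
]

# Similar codes clustered into increasingly abstract thematic categories.
themes = {
    "family ritual":     "identity maintenance",
    "language policing": "identity maintenance",
    "in-group humor":    "boundary marking",
}

# Tallying themes shows how the coded units aggregate upward.
theme_counts = Counter(themes[code] for _, code in coded_units)
```

In practice this clustering is iterative and interpretive rather than a fixed lookup table; the dictionary simply makes the code-to-category relationship explicit.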

Article

The evidence produced by healthcare economic evaluation studies is a key component of any Health Technology Assessment (HTA) process designed to inform resource allocation decisions in a budget-limited context. To improve the quality (and harmonize the generation process) of such evidence, many HTA agencies have established methodological guidelines describing the normative framework inspiring their decision-making process. The information requirements that economic evaluation analyses for HTA must satisfy typically involve the use of complex quantitative syntheses of multiple available datasets, handling mixtures of aggregate and patient-level information, and the use of sophisticated statistical models for the analysis of non-Normal data (e.g., time-to-event, quality of life and costs). Much of the recent methodological research in economic evaluation for healthcare has developed in response to these needs, in terms of sound statistical decision-theoretic foundations, and is increasingly being formulated within a Bayesian paradigm. The rationale for this preference lies in the fact that by taking a probabilistic approach, based on decision rules and available information, a Bayesian economic evaluation study can explicitly account for relevant sources of uncertainty in the decision process and produce information to identify an “optimal” course of actions. Moreover, the Bayesian approach naturally allows the incorporation of an element of judgment or evidence from different sources (e.g., expert opinion or multiple studies) into the analysis. This is particularly important when, as often occurs in economic evaluation for HTA, the evidence base is sparse and requires some inevitable mathematical modeling to bridge the gaps in the available data. 
The availability of free and open source software in the last two decades has greatly reduced the computational costs and facilitated the application of Bayesian methods, and it has the potential to improve the work of modelers and regulators alike, thus advancing the field of economic evaluation of healthcare interventions. This chapter provides an overview of the areas where Bayesian methods have contributed to addressing the methodological needs that stem from the normative framework adopted by a number of HTA agencies.
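The probabilistic decision logic described above (accounting for uncertainty and identifying an "optimal" course of action) is often operationalized as a Monte Carlo probabilistic sensitivity analysis. The sketch below, under assumed distributions and a hypothetical willingness-to-pay threshold (all parameter values are invented for illustration), computes the probability that a new intervention is cost-effective.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def simulate_prob_cost_effective(n_draws=10_000, wtp=25_000):
    """Monte Carlo sketch: draw incremental effects (QALYs) and
    incremental costs from assumed distributions, compute the
    incremental net benefit INB = wtp * dE - dC for each draw,
    and return the share of draws with INB > 0."""
    favourable = 0
    for _ in range(n_draws):
        d_effect = random.gauss(0.10, 0.05)  # hypothetical QALY gain
        d_cost = random.gauss(1500, 600)     # hypothetical extra cost
        if wtp * d_effect - d_cost > 0:
            favourable += 1
    return favourable / n_draws

prob_cost_effective = simulate_prob_cost_effective()
```

In a full Bayesian analysis the draws would come from posterior distributions synthesizing the available evidence rather than from fixed Normals, but the decision rule (probability of positive incremental net benefit at a given threshold) is the same.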

Article

Research in educational leadership and management spans settings from early childhood to tertiary education and life-long learning. From its mid-20th-century beginnings as a tool for organizing educational systems, the wide range of methodologies in present use reflects the shifting focus of the field. The current mix of quantitative and qualitative approaches indicates differing epistemological stances and a range of purposes from instrumental responses to government policy initiatives, through investigation of issues of social justice, to personal enquiry into leadership influence on environments for learning. Research in the field encompasses the values and dilemmas underpinning educational leadership roles, the enactment of middle leadership, teacher leadership and student leadership, and includes leaders conducting research to improve their own practice. Multiple aspects of decision-making are involved in educational leadership research. The philosophical assumptions of researchers inform their positivist or interpretivist stance and the associated choices of quantitative or qualitative methodology. The external drivers of the investigation, together with its purpose and scope, influence the choice of research approach (for example, data-mining, survey, case study, or action research) and technique (interview, questionnaire, documentary analysis, narrative, and life-history). These approaches and techniques in turn invite a range of analytical methods, from statistical modeling, systematic qualitative data analysis, and discourse analysis to auto-ethnographic critical reflection and reflective narrative. The interpretation of the analysis hinges on the purpose of the research: to understand, inform, improve, or bring about change.
Twenty-first-century challenges for the field include expanding theory beyond a largely Western-centric focus; responding to the development of new theories of leadership, including the voice of non-leaders in perspectives on leadership; ensuring that research informs policy rather than vice versa; and addressing the sheer volume and nature of data available through emerging technologies.

Article

Flooding remains one of the globe’s most devastating natural hazards and a leading driver of natural disaster losses across many countries, including the United States. As such, a rich and growing literature aims to better understand, model, and assess flood losses. Several major theoretical and empirical themes emerge from the literature. Fundamental to the flood damage assessment literature are definitions of flood damage, including a typology of flood damage, such as direct and indirect losses. In addition, the literature theoretically and empirically assesses major determinants of flood damage including hydrological factors, measurement of the physical features in harm’s way, as well as understanding and modeling protective activities, such as flood risk mitigation and adaptation, that all co-determine the overall flood losses. From there, common methods to quantify flood damage take these factors as inputs, modeling hydrological risk, exposure, and vulnerability into quantifiable flood loss estimates through a flood damage function, and include both ex ante expected loss assessments and ex post event-specific analyses. To do so, high-quality data are key across all model steps and can be found across a variety of sources. Early 21st-century advancements in spatial data and remote sensing push the literature forward. While topics and themes apply more generally to flood damage across the globe, examples from the United States illustrate key topics. Understanding main themes and insights in this important research area is critical for researchers, policy-makers, and practitioners to better understand, utilize, and extend existing flood damage assessment literatures in order to lessen or even prevent future tragedy.
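The flood damage function described above combines hazard (flood depth), exposure (the value of assets in harm's way), and vulnerability (the fraction of value lost at a given depth) into a loss estimate. The sketch below uses a hypothetical piecewise-linear depth-damage curve; the curve points and asset values are illustrative assumptions, not values from any published damage model.

```python
def damage_fraction(depth_m,
                    curve=((0.0, 0.0), (0.5, 0.2), (1.5, 0.5), (3.0, 0.8))):
    """Piecewise-linear depth-damage curve (hypothetical points):
    maps flood depth in meters to the fraction of asset value lost."""
    if depth_m <= curve[0][0]:
        return curve[0][1]
    for (d0, f0), (d1, f1) in zip(curve, curve[1:]):
        if depth_m <= d1:
            # linear interpolation between adjacent curve points
            return f0 + (f1 - f0) * (depth_m - d0) / (d1 - d0)
    return curve[-1][1]  # depths beyond the curve cap at maximum damage

def total_loss(assets):
    """Exposure times vulnerability, summed over assets.
    `assets` is a list of (asset_value, flood_depth_m) pairs."""
    return sum(value * damage_fraction(depth) for value, depth in assets)

# Two hypothetical structures: one flooded to 1.0 m, one untouched.
loss = total_loss([(200_000, 1.0), (350_000, 0.0)])
```

Ex ante expected-loss assessments would integrate this calculation over a probability distribution of flood depths rather than a single event, but the hazard-exposure-vulnerability composition is the same.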

Article

Real groups constitute themselves as representatives of social structures, that is, of communicative processes in which it is possible to identify patterns and a certain model of communication. This model is not random or incipient; rather, it documents collective experiences as well as the social characteristics of these groups, their representations of class, social environment, and generational belonging. In the context of qualitative research methods in the fields of social sciences and education, group discussions gained prominence mainly from research conducted with children and young people. As a research method, they constitute an important tool in the reconstruction of milieux and collective orientations that guide the actions of the subjects in the spaces in which they live. This article begins with some considerations about group interviews, highlighting the Anglo-Saxon model of focus groups, the Spanish tradition of group discussions from the School of Qualitative Critics in Madrid, and group discussions conceived in the 1950s at the Frankfurt School in Germany. Next, the theoretical-methodological basis of group discussions and the documentary method developed in Germany in the 1980s by Ralf Bohnsack are presented. Both procedures are anchored mainly in Karl Mannheim’s sociology of knowledge, but also in Pierre Bourdieu’s ethnography and sociology of culture. Finally, the potential of this research approach to data analysis is assessed using the results of three research projects in education carried out in Mexico, Chile, and Brazil. Based on the principle of abduction, the documentary method inspires the creation of analytical instruments that are rooted in praxis and that can delineate educational experiences in different contexts.

Article

Philip B.K. Potter

Foreign policy analysis (FPA) is the study of how states, or the individuals that lead them, make foreign policy, execute foreign policy, and react to the foreign policies of other states. This topical breadth results in a subfield that encompasses a variety of questions and levels of analysis, and a correspondingly diverse set of methodological approaches. Four methods have become central in foreign policy analysis: archival research, content analysis, interviews, and focus groups. The first major phase of FPA research is termed “comparative foreign policy.” Proponents of comparative foreign policy sought to achieve comprehensive theories of foreign policy behavior through quantitative analysis of “events” data. An important strand of this behavioral work addressed the relationship between trade dependence and foreign policy compliance. Second-generation FPA methodology, by contrast, largely abandoned universalized theory-building in favor of historical methods and qualitative analysis. Second-generation FPA researchers place particular emphasis on developing case study methodologies driven by social science principles. Meanwhile, the third generation of FPA scholarship combines innovative quantitative and qualitative methods, including computer-assisted coding, experiments, simulation, surveys, network analysis, and prediction markets. Ultimately, additional attention should be given to determining the degree to which current methods of foreign policy analysis allow predictive or prescriptive conclusions. FPA scholars should also focus more on reengaging foreign policy analysis with the core of international relations research.

Article

Lifespan development is embedded in multiple social systems and social relationships. Lifespan developmental and relationship researchers study individual codevelopment in various dyadic social relationships, such as dyads of parents and children or romantic partners. Dyadic data refers to types of data for which observations from both members of a dyad are available. The analysis of dyadic data requires the use of appropriate data-analytic methods that account for such interdependencies. The standard actor-partner interdependence model, the dyadic growth curve model, and the dyadic dual change score model can be used to analyze data from dyads. These models allow examination of questions related to dyadic associations such as whether individual differences in an outcome can be predicted by one’s own (actor effects) and the other dyad member’s (partner effects) level in another variable, correlated change between dyad members, and cross-lagged dyadic associations, that is, whether one dyad member’s change can be predicted by the previous levels of the other dyad member. The choice of a specific model should be guided by theoretical and conceptual considerations as well as by features of the data, such as the type of dyad, the number and spacing of observations, or distributional properties of variables.
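The actor and partner effects described above can be illustrated with a deliberately simplified least-squares sketch: each dyad member's outcome is regressed on their own predictor (actor effect) and their partner's predictor (partner effect). Full APIM estimation uses multilevel or structural equation models that handle the correlated errors within dyads; the simulation below ignores that structure, and the effect sizes (0.5 actor, 0.3 partner) and all names are hypothetical.

```python
import random

random.seed(7)

def solve3(A, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system
    (intercept plus two slopes)."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def apim_effects(own_x, partner_x, y):
    """Least-squares sketch of the actor-partner interdependence model:
    y ~ intercept + own predictor (actor) + partner's predictor (partner)."""
    X = [[1.0, a, p] for a, p in zip(own_x, partner_x)]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    _intercept, actor, partner = solve3(XtX, Xty)
    return actor, partner

# Simulated dyads with known actor (0.5) and partner (0.3) effects.
own = [random.gauss(0, 1) for _ in range(500)]
oth = [random.gauss(0, 1) for _ in range(500)]
y = [0.5 * a + 0.3 * p + random.gauss(0, 0.2) for a, p in zip(own, oth)]
actor_b, partner_b = apim_effects(own, oth, y)
```

With 500 simulated observations and modest noise, the recovered coefficients land close to the generating values, which is the basic logic the standard APIM formalizes.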

Article

Paul Sebastian Ruppel and Günter Mey

Grounded theory methodology is one of the most widely used approaches to collect and analyze data within qualitative research. It can be characterized as a framework for study design, data collection, and analysis, which aims at the development of middle-range theories. The final result of such a study is called a “grounded theory,” and it consists of categories that are related to each other. Health and risk message design researchers working with grounded theory methodology are explicitly invited to use any kind of data they consider suitable for a particular project. Grounded theory methodology studies were originally based on intense fieldwork data, but in the meantime, interviews have become the most widely used type of data. In addition, there is a growing interest in using visual data such as pictures or film. Grounded theory methodology originated from sociology, but has since been applied in many different disciplines. This widened application went along with modifications, new developments, and innovations, and led to several current variants of grounded theory methodology. Basic features of grounded theory methodology include theoretical sampling, specific coding procedures with a comparative approach to analysis, and memo writing. The strategy of theoretical sampling requires that theoretical insights gained from the analysis of initially collected data guide subsequent data collection. Hence, during the research process data collection and analysis alternate and interact. For data analysis, different ways of coding enable the researcher to develop increasingly abstract conceptual ideas and reflections, first embodied in codes, later in categories. This analytical process allows for a step-by-step development of categories that are grounded in data. Category development entails comparisons at all stages, for example, of different cases during sampling, of different data pieces, and of different codes and categories during analysis. 
As a result, grounded theory methodology is also known as the constant comparative method. Throughout the research process the researcher writes memos and keeps track of the development of conceptual ideas, methodological reflections, and practical to-dos. Today, many researchers use software specifically developed to assist the process of qualitative data analysis.

Article

Investigative practices, including research methodologies, approaches, and processes, as well as knowledge dissemination efforts, continue to evolve within inclusive or special education. So too do such practices evolve within related fields such as nursing, psychology, community-based care, and health promotion. Several research approaches provide the tools required to effect inclusive education, such as evidence-based practice (EBP), EBP in practice, creative secondary uses of (anonymous) data, collective impact, qualitative evidence synthesis (QES), and lines of action (LOA). Other approaches that promote a more inclusive education research agenda more generally include action research and participatory action research, inclusive research, appreciative inquiry, and arts-based educational research.