
Printed from Oxford Research Encyclopedias, Education. Under the terms of the licence agreement, an individual user may print out a single article for personal use (for details see Privacy Policy and Legal Notice).

date: 28 January 2023

Qualitative Data Analysis and the Use of Theory

  • Carol Grbich, Flinders University


The role of theory in qualitative data analysis is continually shifting and offers researchers many choices. The dynamic and inclusive nature of qualitative research has encouraged the entry of a number of interested disciplines into the field. These discipline groups have introduced new theoretical practices that have influenced and diversified methodological approaches. In addition, broader chronological shifts in theoretical orientation can be seen in four waves of paradigmatic change. The first wave showed a developing concern with the limitations of researcher objectivity and of empirical observation of evidence-based data, leading to the second wave, with its focus on realities mutually constructed by researcher and researched, on participant subjectivity, and on the remedying of societal inequalities and maldistributed power. The third wave was prompted by the advent of Postmodernism and Post-structuralism, with their emphasis on chaos, complexity, intertextuality, and multiple realities. Most recently, the fourth wave brought a focus on visual images, on performance, on both an active researcher and an interactive audience, and on the crossing of the theoretical divide between social science and classical physics. The methods and methodologies that have evolved from these paradigm shifts have followed a similar pattern of change. The researcher now has multiple paradigms, co-methodologies, diverse methods, and a variety of theoretical choices to consider. This continuum of change has shifted the field of qualitative research dramatically from limited choices to multiple options, requiring clarification of researcher decisions and transparency of process. However, there remains the difficult question of the role that theory will now play amid such complex design and critical researcher reflexivity.


  • Research and Assessment Methods

Theory and Qualitative Data Analysis

Researchers new to qualitative research, and particularly those coming from the quantitative tradition, have often expressed frustration at the need for what appears to be an additional and perhaps unnecessary process—that of the theoretical interpretation of their carefully designed, collected, and analyzed data. The justifications for this process have tended to fall into one of two areas: the need to lift data to a broader interpretation beyond the Monty Pythonesque “this is my theory and it’s my very own,” and the illumination of findings from another perspective—by placing the data in its relevant discipline field for comparison with previous theoretical interpretations, while possibly adding something original to the field.

“Theory” is broadly seen as a set of assumptions or propositions, developed from observation or investigation of perceived realities, that attempts to provide an explanation of relationships or phenomena. The framing of data via theoretical imposition can occur at different levels. At the lowest level, various concepts such as “role,” “power,” “socialization,” “evaluation,” or “learning styles” refer to limited aspects of social organization and are usually applied to a specific group of people.

At a more complex level, theories of the Middle Range, identified by Robert Merton to link theory and practice, are used to build theory from empirical data. These tend to be discipline specific and incorporate concepts plus variables such as “gender,” “race,” or “class.” Concepts and variables are then combined into meaningful statements, which can be applied to more diverse social groups. For example, in education an investigation of student performance could emphasize such concepts as “safety,” “zero bullying,” “communication,” and “tolerance,” with variables such as “race” and “gender” to lead to a statement that good microsystems and a focus on individual needs are necessary for optimal student performance.

The third and most complex level uses established or grand theories, such as Sigmund Freud’s stages of children’s development, Jean Piaget’s theory of cognitive development, or Urie Bronfenbrenner’s ecological systems, which have been widely accepted as meaningful across a number of disciplines and provide abstract explanations of the uniformity of aspects of social organization, social behavior, and social change.

The trend in qualitative research regarding the application of chosen levels of theory has generally been toward either theory direction/verification or theory generation, although the two are often intertwined. In the first, a relevant existing theory is chosen early and acts as a point of critical comparison for the data to be collected. This approach requires the researcher to think theoretically as s/he designs the study, collects data, and collates it into analytical groupings. The danger of theory direction is that an overemphasis on a chosen theoretical orientation may limit what the researcher can access or “see” in the data; on the upside, this approach can also enable the generation of new theoretical aspects, as it is rare that findings will fall precisely within the implications of existing statements. Theory generation is a much looser approach and involves either one or a range of relevant levels of theory being identified at any point in the research process, from which, in conjunction with data findings, some new combination or distillation can enhance interpretation.

The question of whether a well-designed study should negate the need for theoretical interpretation has been minimally debated. Mehdi and Mansor (2010) identified three trends in the literature on this topic: that theory in qualitative research relates to integrated methodology and epistemology; that theory is a separate and additional element to any methodological underpinnings; and that theory has no solid relationship with qualitative research. No clear agreement on any of these is evident. Overall, there appears to be general acceptance that the process of using theory, whether etically (imposed) or emically (integrated), enhances outcomes and moves research away from being a-theoretical or unilluminated by other ideas. However, regarding praxis, a closer look at the issue of the use of theory and data may be in order. Theoretical interpretation, as currently practiced, has limits. To begin with, the playing field is not level. In the grounded theory tradition, Glaser and Strauss (1967) were initially clear that, in order to prevent undue influence on design and interpretation, the researcher should avoid reviewing the literature on a topic until after some data collection and analysis had been undertaken. The presumption that most researchers would already be well versed in theory/ies and would have a broad spectrum to draw on in order to facilitate the constant comparative process from which data-based concepts could be generated was found to be incorrect. Glaser (1978) suggested this lack could be remedied at the conceptual level via personal and professional reflexivity.

This issue became even more of a problem with the entry of practice-led disciplines such as education and health into the field of qualitative research. These groups had not been widely exposed to the theories of the traditional social sciences such as sociology, psychology, and philosophy, although in education they would have been familiar with John Dewey’s concept of “pragmatism” linking learning with hands-on activity, and were more used to developing and using models of practice for comparison with current realities. By the mid-20th century, education was more established in research and had moved toward the use of middle range theories and the late 20th-century grand theorists: Michel Foucault, with his emphasis on power and knowledge control, and Jurgen Habermas, with his focus on pragmatism, communication, and knowledge management.

In addition to addictive identification with particular levels of theory and with discipline-preferred theories and methods, activity across qualitative research seems to fall between two extremes. At one end, data collection and analysis are undertaken as separate processes before a theoretical framework is sought within which to discuss the findings—often a framework that has gained traction in a specific discipline. This “best/most acceptable fit” approach often adds little to the relevant field beyond repetition and appears somewhat forced. At the other extreme are those who weave methods, methodologies, data, and theory throughout the whole research process, actively critiquing and modifying them as they go, usually with the outcome of creating some new direction for both theory and practice. The majority of qualitative research practice, however, falls somewhere between these two.

The final aspect of framing data lies in the impact of researchers themselves, and the early-21st-century emphasis is on exposing relevant personal frames, particularly those of culture, gender, socioeconomic class, life experiences such as education, work, and socialization, and the researcher’s own values and beliefs. The twin purposes of this exposure are to create researcher awareness of, and accountability for, the researcher’s impact on the data, and to allow the reader to assess the value of research outcomes in terms of potential researcher bias or prejudice. This critical reflexivity is supposed to be undertaken at all stages of the research, but it is not always clear that it has occurred.

Paradigms: From Interactionism to Performativity

It appears that there are potentially five sources of theory: that which is generally available and can be sourced from different disciplines; that which is embedded in the chosen paradigm/s; that which underpins particular methodologies; that which the researcher brings; and that which the researched incorporate within their stories. Of these, the paradigm/s chosen are probably the most influential in terms of researcher position and design. The variety of the sets of assumptions, beliefs, and researcher practices that comprise the theoretical paradigms, perspectives, or broad worldviews available to researchers, and within which they are expected to locate their individual position and their research approach, has shifted dramatically since the 1930s. The changes have been distinct and identifiable, with their roots located in the societal shifts prompted by political, social, and economic change.

The First Wave

The Positivist paradigm dominated research, largely unquestioned, prior to the early 20th century. It emphasized the distancing of the researcher from his/her subjects; researcher objectivity; a focus on objective, cause–effect, evidence-based data derived from empirical observation of external realities; experimental quantitative methods involving testing hypotheses; and the provision of finite answers and unassailable future predictions. From the 1930s, concerns about the limitations of findings and the veracity of research outcomes, together with improved communication and exposure to the worldviews of other cultures, led to the advent of the realist/post-positivist paradigm. Post-positivism, or critical realism, recognized that certainty in proving the truth of a hypothesis was unachievable and that outcomes were probably limited to falsification (Popper, 1963), that true objectivity was unattainable and that the researcher was most likely to impact on or to contaminate data, that both qualitative and quantitative approaches were valuable, and that methodological pluralism was desirable.

The Second Wave

Alongside the worldwide political shifts toward “people power” in the 1960s and 1970s, two other paradigms emerged. The first, the Interpretivist/Constructivist, focused on the social situations in which we as humans develop and how our construction of knowledge occurs through interactions with others in these contexts. This paradigm also emphasized the gaining of an understanding of the subjective views or experiences of the participants being researched, and recognized the impact of the researcher on researcher–researched mutually constructed realities. Here, theory generation is the preferred outcome to explain the what, how, and why of the findings. This usually involves the development of a conceptual model, forged from both the data gained and from the application/integration of relevant theory, to provide explanations for and interpretations of findings, together with a new perspective for the field/discipline.

The second paradigm, termed the Critical/Emancipatory, focused on locating, critiquing, and changing inequalities in society. The identification of systemic power discrepancies or systematic power misuse in situations involving gender, sexuality, class, and race is expected to be followed by moves to right any oppression discovered. Here, the use of theory has been focused more on predetermined concept application for “fit.” This is because the very strong notions of problematic societal structures and inappropriately wielded power have been the dominant underpinnings.

In both the Interpretive and Critical paradigms, researcher position shifted from the elevated and distant position of positivism, to one of becoming equal with those being researched, and the notion of researcher framing emerged to cover this shift and help us—the readers—to “see” (and judge) the researcher and her/his processes of data management more clearly.

The Third Wave

In the 1980s, the next wave of paradigmatic options—postmodernism and poststructuralism—emerged. Postmodernism, with its overarching cultural implications, and poststructuralism, with its focus on language, severely challenged the construction, limitations, and claims to veracity of all knowledge, and in particular the use of theory derived from siloed disciplines and confined research methods. Regardless of whether the postmodern/poststructural label is attached to grounded theory, ethnography, phenomenology, action, or evaluative designs, one general aspect that prevails is a focus on language. Language has come to be viewed as dubious, with notions of “slippage”—the multiple meanings of individual words—and “différance”—the difference and deferral of textual meaning (Derrida, 1970, 1972)—adding complexity. Double coding, irony, and juxtaposition are encouraged to further identify meaning, and to uncover aspects of social organization and behavior that have been previously marginalized or made invisible by existing discourses and discursive practices. Texts are seen as complex constructions, and intertextuality is favored, resulting in multiply constructed texts. The world is viewed as chaotic and unknowable; individuals are no longer seen as two-dimensional—they are viewed as multifaceted with multiple realities. Complex “truths” are perceived as limited by time and context, requiring multiple data sets and many voices to illuminate them, and small-scale focused local research is seen as desirable. The role of the researcher has also changed: the politics of position and self-reflexivity dominate, and the researcher needs to clearly expose past influences and formerly hidden aspects of his/her life. S/he inhabits the position of an offstage or decentered facilitator, presenting data for the reader to judge.

Theory is used mainly at the conceptual level, with no particular approach being privileged. The researcher has become a “bricoleur” (Levi-Strauss, 1962) or handyman, using whatever methods or theories are within reach to adapt, craft, and meld technological skills with mythical intellectual reflection in order to create unique perspectives on the topic. Transitional interpretations dominate, awaiting further challenges and deconstruction by the next researcher in the field.

The need for multifaceted data sets in the 1990s led inevitably to a search for other research structures, and mixed and multiple methods have become topical. In crossing the divide between qualitative and quantitative approaches, mixed methods research initially developed its own sub-paradigms: pragmatist (complementary communication and shared meanings) and transformative/emancipatory (inequalities in race, class, gender, and disability, to be righted). An increasing focus on multiple methods led to the advent of dialectics (multiple paradigm use) and critical realism (the acceptance of divergent results) (Shannon-Baker, 2016). The dilemmas of theory use raised by these changes include whether to segregate data sets and try to explain disparate outcomes in terms of diversity using different theories; whether to integrate them through a homogeneous “smoothing” process—one theory fits all—in order to promote a singular interpretation; or whether to let the strongest paradigm—in terms of data—dominate the theoretical findings.

The Fourth Wave

During the early 21st century, as the third wave was becoming firmly established, the Performative paradigm emerged. The incorporation of fine art–based courses into universities has challenged the prescribed rules of the doctoral thesis, initially resulting in a debate—with echoes of Glaser and Strauss—as to whether theory, if used initially, is too directive, thereby potentially contaminating the performance; whether theory application should be an outcome to enhance performances; or even whether academic guidelines regarding theory use need to be changed to accommodate these disciplines (Bolt, 2004; Freeman, 2010; Riley & Hunter, 2009). Performativity is seen in terms of “effect,” a notion derived from John Austin’s (1962) assertion that words and speech utterances do not just act as descriptors of content; they have social force and impact on reality. Following this, a productive work is seen as capable of transforming reality (Bolt, 2016). The most frequently raised issue here is how to judge this form of research when traditional guidelines of dependability, transferability, and trustworthiness appear to be irrelevant. Barbara Bolt suggests that drawing on Austin’s (1962) terms “locutionary” (semantic meaning), “illocutionary” (force), and “perlocutionary” (effect achieved on receivers), together with the mapping of these effects in material, effective, and discursive domains, may be useful, despite the fact that transformation may be difficult to track in the short term.

During the second decade of the 21st century, however, discussions relating to the use of theory have increased dramatically in academic performative research, and a variety of theoreticians apart from John Austin are now cited. These include Maurice Merleau-Ponty (1945) on the spatiality of lived events; Jacques Derrida (1982) on iterability, simultaneous sameness, and difference; Gilles Deleuze and Félix Guattari (1987) on rituals of material objects and transformative potential; Jean-François Lyotard (1988) on the plurality of micro-narratives, “affect,” and its silent disruption of discourse; and Bruno Latour (2005) with regard to actor-network theory—where theory is used to engage with rather than to explain the world in a reflective political manner.

In performative doctoral theses, qualitative theory and methods are being creatively challenged. For example, from the discipline of theater and performance, Lee Miller and Joanne/Bob Whalley (2010) disrupt the notion of usual spaces for sincere events by taking their six-hour-long performance Partly Cloudy, Chance of Rain, involving a public reaffirmation of their marriage vows, out of its usual habitats to a service station on a highway. The performance involves a choir, a band, a pianist, 20 performers dressed as brides and grooms, photographers, a TV crew, an Anglican priest, plus 50 guests. The theories applied to this event include an exploration of Marc Augé’s (1992) conception of the “non-place”; Mikhail Bakhtin’s (1992) concepts of “dialogism” (many voices) together with “heteroglossia” (juxtaposition of many voices in a dialogue); and Ludwig Wittgenstein’s (1953) discussion of the “duck rabbit”—once the rabbit is seen (participatory experience), the duck (audience) is always infected by its presence. The couple further challenged the guidelines of traditional doctoral theses by successfully negotiating two doctoral awards for a joint piece of research.

A more formal example of a doctoral thesis using traditional qualitative approaches (Reik, 2014) examined, at the school level, the clash between performative, creative styles of teaching and the neoliberal focus on testing, curriculum standardization, and student outcomes.

Leah Mercer (2012), an academic in performative studies, used the performative paradigm in her doctoral thesis to challenge and breach not only the methodological but also the theoretical silos of the quantitative–qualitative divide. The physics project is an original work using live performances of personal storytelling with video and web streaming to depict the memories, preoccupations, and formative relationship of two women, an Australian and an American, living in contemporary mediatized society. Using scientific theory, Mercer explores personal identity by reframing the principles of contemporary physics (quantum mechanics and the uncertainty principle) as aesthetic principles (uncertainty and light), with the physics of space (self), time (memory), light (inspiration), and complementarity (the reconciliation of opposites) used to illuminate these experiences.

The performative paradigm has also shifted the focus on the reader, developed in postmodernism, to a broader group—an active audience. Multi-methods have been increased to include symbolic imagery, in particular visual images, as well as sound and live action. The researcher’s role here is often that of performer within a cultural frame, creating and investigating multiple realities and providing the link between the text/script and the audience/public. Theory is either minimized to the level of concepts or used to break through the silos of different disciplines to integrate and reconcile aspects from long-lasting theoretical divides.

In these chronological lines of paradigm shifts, changes in researcher position and changes in the application of theory can clearly be seen. The researcher has moved out of the shadows and into the mainstream; her/his role has shifted from an authoritarian collector and presenter of finite “truths” to a creator and often performer of multiple and disparate data images for the audience to respond to. Theory options have shifted from direction and generation within existing perspectives to creative amalgamations of concepts from disciplines previously rarely combined.

Methodologies: From Anthropology to Fine Arts

It would be a simple matter if all the researcher had to contend with was situating oneself in a particular paradigm/s. Unfortunately, not only have paradigms shifted in terms of researcher position and theoretical usage, but so also have methodological choices and research design. One of the most popular methodologies, ethnography, with its roots in classical anthropology and its fieldwork-based observations of action and interaction in cultural contexts, can illustrate the process of methodological change following paradigm shift. If a researcher indicates that he/she has undertaken an ethnographic study, the reader will most likely query “which form?”: classical? critical? auto? visual? ethno drama? cyber/net? or performative? The following examples from this methodology should indicate how paradigm shifts have resulted in increasing complexity of design, methods, and interpretive options.

In classical ethnography the greatest borrowing is from traditional anthropology in terms of process and tools. This can be seen in the inclusion of initial time spent in the setting to learn the language of the culture and to generally “bathe” oneself in the environment, often with minimal data collection. This process is supposed to increase researcher understanding of the culture and minimize the problem of “othering” (treating participants as a different species/alien). A fairly lengthy amount of time is then usually spent in the cultural setting, either as an observer or as a participant observer, to collect as much data as is relevant to answer the research question. This is followed by a return visit to check whether the findings previously gathered have stood the test of time. The analytical toolkit can involve domain analysis, freelists, pilesorts, triads and taxonomies, frame and social network analysis, and event analysis. Truncated mini-ethnographies became more common as time became an issue, but these can still involve years of managing descriptive data, often collected by several participating researchers, as seen in Douglas, Rasmussen, and Flanagan’s (1977) study of the culture of a nudist beach. Shorter versions undertaken by one researcher, for example Sohn (2015), have explored strategies of teacher and student learning in a science classroom. Theoretical interpretation can be by conceptual application for testing, such as Margaret Mead’s (1931) testing of the concept of “adolescence”—derived from American culture—in Samoan culture, or, more generally, by concept generation. The latter can be seen in David Rosenhan’s (1973) investigation of the experience of a group of researcher pseudo-patients admitted to hospitals for the mentally ill in the United States. The main concepts generated were labeling, powerlessness, and depersonalization.

De-colonial ethnography recognizes the “othering” frames of colonial and postcolonial research and takes the position that past colonial supremacy over Third World countries persists in political, economic, educational, and social constructions. Decolonizing requires a critical examination of language, attitudes, and research methods. Kakali Bhattacharya (2016) has exposed the micro-discourses of the continuing manifestation of colonial power in a parallel narrative written by a South Asian woman and a white American male. Concepts of colonialism and patriarchy, displayed through the discourses exposed, provide a theoretical critique.

Within critical ethnography, with its focus on power location and the alleviation of oppression, Dale Spender (1980) used structured and timed observations of the styles, quality, and quantity of interaction between staff and students in a range of English classrooms. The theory-directive methodological frames of feminism and gender inequality were applied to identify and expose the lesser time and lesser quality of interaction that teachers had with female students in comparison with that assigned to male students. Widespread distribution of these results alerted education authorities and led to change, in some environments, toward introducing single-sex classrooms for certain topics. This was seen as progress toward alleviating oppressive behaviors. This approach has produced many excellent educational studies, including Paul Willis (1977) on the preparation of working-class kids for working-class jobs; Michelle Fine (1991) on African American and Latino students who dropped out of a New York high school; Angela Valenzuela (1999) on immigrant and other underachievers in American schools; Lisa Patel (2013) on the inclusion and exclusion of immigrants in education; and Jean Anyon (1981) on the social stratification of identical curriculum knowledge in different classrooms.

A less concept-driven and more descriptive approach to critical ethnography was emphasized by Phil Carspecken’s hermeneutic approach (1996), which triggered a move toward data-generated theoretical concepts that could then be used to challenge mainstream theoretical positions.

Post-critical ethnography emphasizes power and ideology and the social practices that contribute to oppression, in particular objectivity, positionality, representation and reflexivity, and critical insufficiency or “antipower.” Responsibility is shifted to researchers for the world they create and critique when they interpret their research contexts (Noblit, Flores, & Murillo, 2004).

Autoethnography emerged from the postmodern paradigm, with its search for different “truths” and different relationships with readers, and prompted an emphasis on personal experience and documentation of the self in a particular cultural context (Ellis, 2004). In order to achieve this, the researcher has to inhabit dual positions: being the focus of activities, feelings, and emotions experienced in the setting while at the same time being positioned distantly—observing and recording the behaviors of the self in that culture. Well-developed skills of critical reflexivity are required. The rejection of the power-laden discourses/grand theories of the past and the emphasis on transitional explanations have resulted in minimal theorizing and an emphasis on data display, the reader, and the reader’s response. Open presentations of data can be seen in the form of narrative storytelling, or re-presentations in the form of fiction, dramatic performances, and poetry. Carolyn Ellis (2004) has argued that “story is theory and theory is story” and that our “making sense of stories” contributes to a broader understanding of human existence. Application/generation of concepts may also occur, and the term “Critical Autoethnography” has been used (Hughes & Pennington, 2017), particularly where inequalities of race, class, or gender are experienced. Jennifer Potter (2015) used the concept “whiteness of silence” to introduce a critical race element into her autoethnographic account of black–white racial hatred within a university class on African American communication in which she was a student.

Visual ethnography uses a variety of tools, including photography, sketches, movies, social media, the Web and virtual reality, body art, clothing, painting, and sculpture, to demonstrate and track culture. This approach has been available for some time both as a methodology in its own right and as a method of data collection. An example of this approach, which mixes classical and visual ethnography, is Philippe Bourgois and Jeff Schonberg’s 12-year study of two dozen homeless heroin injectors and crack smokers living under a freeway overpass in San Francisco (2009). Their data comprised extensive black and white photos, dialogue, taped conversations, and fieldwork observation notes. The themes of violence, race relations, family trauma, power relations, and suffering were theoretically interpreted through reworked notions of “power” that incorporated Pierre Bourdieu’s (1977, 1999) concepts of “symbolic violence”—linking observed practices to social domination, and “habitus”—an individual’s personal disposition comprising unique feelings and actions grounded in biography and history; Karl Marx’s “lumpen” from “lumpenproletariat” (1848), the residual class—the vagrants and beggars together with criminal elements that lie beneath the labor force; and Michel Foucault’s “biopower” (1978, 2008)—the techniques of subjugation used by the state on the population, and “governmentality” (1991)—where individuals are disciplined through institutions and the “knowledge–power” nexus. The ideas of these three theorists were used to create and weave a theory of “lumpen abuse” to interpret the lives of the participants.

Ethno Drama involves transforming the results of an ethnographic study into a performance to be shared, for example the educational experiences of children and youth (Gabriel & Lester, 2013). The performance medium can vary from a film (Woo, 2008) to an article presented in dramatic form (Carter, 2014) or, more usually, a play script to be staged for an audience in a theater (Ethno Theater). One of the main purposes is to provide a hearing space for voices that have been marginalized or previously silenced. These voices and their contexts can be presented by research participants, actors, or the research team, and are often directed at professionals from the field. Audience-based meetings to devise recommendations for further action may follow a performance. Because of the focus on inequality, critical theory has been the major theoretical orientation for this approach. The structure of the presentation invites audiences to identify situations of oppression, in the hope that this will inform them sufficiently to enable modification of their own practices or participation in the development of recommendations for future change.

Lesnick and Humphrie (2018) explored the identity perceptions of LGBTQ+ youth between 14 and 24 years of age via interviews and online questionnaires, the transcriptions of which were woven into a script performed by actors presenting stories not congruent with their own racial/gender identities, in order to challenge audience expectations and labels. The research group encouraged the schools where they performed to structure discussion groups to follow the school-located performances. The scripts and discussions revealed, and were lightly interpreted through, concepts of homelessness, racism, and the “oppression Olympics”—the way oppressed people sometimes view one another as competitors rather than in solidarity. These issues were found to be relevant to both school and online communities. Support for these young people was discovered to come mostly from virtual sources, particularly dialogues within Facebook groups.

Cyber, net, or virtual ethnographies involve the study of online communities within particular cultures. Problems that have emerged from the practice of this approach include: discovery of the researcher lurking without permission on sites; gaining prior permission, which often disturbs the threads of interaction; gaining permission post–data collection but having many furious people decline participation; the “facelessness” of individuals, who may have uncheckable multiple personas; and trying to make sense of very disparate data in incomplete and non-chronological order. There has been acceptance that online and offline situations can influence each other. Dibbell (1993) demonstrated that online sexual violence toward another user’s avatar in a text-based “living room” reduced the violated person to tears as she posted pleas for the violator to be removed from the site. Theoretical interpretation at the conceptual level is common; Michel Foucault’s concept of heterotopia (1967, 1984) was used to explain such spatio-temporal prisons as online rooms. Heterotopic spaces are seen as having the capacity to reflect and distort real and imagined experiences.

Poststructural ethnography tracks the instability of concepts both culturally and linguistically. This can be demonstrated in the deconstruction of language in education (Lather, 2001), particularly the contradictions and paradoxes of sexism, gender, and racism both in texts and in the classroom. These discourses are implicated in relations of power that are dynamic and within which resistance can be observed. Poststructuralism accepts that texts are multiple, as are the personas of those who created them, and that talk such as that which occurs in a classroom can be linked with knowledge control. Walter Humes (2000) discovered that the educational management discourses of “community,” “leadership,” and “participation” could be disguised by such terms as “learning communities” and “transformational leadership.” He analyzed the results with a conceptual framework derived from management theory and policy studies and linked the findings with political power.

Performative ethnography, from the post-postmodern paradigm, integrates the performances of art and theater with the focus on culture of ethnography (Denzin, 2003). A collaborative performance ethnography (van Katwyk & Seko, 2017) used a poem re-presenting themes from a previous research study on youth self-harming to form the basis of the creation of a performative dance piece. This process enabled the researcher participants to explore less dominant ways of knowing through co-learning and through the discovery of self-vulnerability. The research was driven by a social justice-derived concern that Foucault’s notion of “sovereignty” was being implemented through a web of relations that commodified and limited knowledge, and sanctioned the exploitation of individuals and communities.

This exploration of the diversity in ethnographic methods, methodologies, and interpretive strategies would be repeated in a similar trek through the interpretive, critical, postmodern, and post-postmodern approaches currently available for undertaking the various versions of grounded theory, phenomenology, feminist research, evaluation, action, or performative research.

Implications of Changes for the Researcher

The onus is now less on finding the “right” (or most familiar in a field) research approaches and following them meticulously, and much more on researchers making their own individual decisions as to which aspects of which methodologies, methods, and theoretical explanations will best answer their research question. Ideally this should not be constrained by the state of the discipline they are part of; it should be as easy for a fine arts researcher to carry out a classical ethnography with a detailed theoretical interpretation derived from one or more grand theorists as for a researcher in law to undertake a performative study with the minimum of conceptual insights and the maximum of visual and theoretical performances. Unfortunately, the reality is that trends within disciplines dictate publication access, thereby reinforcing the prevailing boundaries of knowledge.

However, the current diversity of choice has indeed shifted the field of qualitative research dramatically away from the position it was in several decades ago. The moves toward visual and performative displays may challenge certain disciplines but these approaches have now become well entrenched in others, and in qualitative research publishing. The creativity of the performative paradigm in daring to scale the siloed and well-protected boundaries of science in order to combine theoretical physics with the theories of social science, and to re-present data in a variety of newer ways from fiction to poetry to researcher performances, is exciting.

Given that the theoretical, methodological, and methods domains are now wide open for researchers to pick and choose from, two important aspects—justification and transparency of process—have become essential elements in convincing the reader.

Justification incorporates the why of decision-making. Why was the research question chosen? Why was the particular paradigm, or combination of paradigms, the best choice for the question? Why were the methodology and methods chosen the most appropriate for both the paradigm(s) and research question(s)? And why were the concepts used the most appropriate and illuminating for the study?

Transparency of process not only requires that researchers clarify who they are in the field in relation to the research question and the participants chosen, but also demands an assessment of what impact their background and personal and professional frames have had on research decisions at all stages, from topic choice to theoretical analysis. Problems faced in the research process, and how they were managed or overcome, also require exposition, as does the chronology of decisions made and changed at all points of the research process.

Now to the issue of theory and the question of “where to?” This brief walk through the paradigmatic, methodological, and theoretical changes has demonstrated a significant move from the use of confined paradigms with limited methodological options to the availability of multiple paradigms, co-methodologies, and methods of many shades for the researcher to select among. Regarding theory use, there has been a clear move away from grand and middle-range theories toward the application of individual concepts drawn from a variety of established and minor theoreticians and disciplines, which can be amalgamated into transitory explanations. The examples of theoretical interpretation presented in this article, in my view, considerably extend, frame, and often shed new light on the themes that have been drawn out via analytical processes. Well-argued theory at any level is a great enhancer, lifting data to heights of illumination and comparison, but it could equally be argued that in the presence of critical researcher reflexivity; complex, layered, longitudinal, and well-justified design; meticulous analysis; and monitored audience response, it may no longer be essential.
