Volume 107, Issue 4 pp. 526-530
Guest Editorial

Striving for Methodological Integrity in Mixed Methods Research: The Difference Between Mixed Methods and Mixed-Up Methods

Elizabeth G. Creamer

Virginia Polytechnic Institute

First published: 29 November 2018

Tacit misconceptions about mixed methods research are evident not only in the way research is conceptualized, executed, and reported but also in the way it is embodied in day-to-day practice. In the name of efficiency, for example, teams are often divided into those with expertise in quantitative research and those who feel more confident in their qualitative expertise, with little thought to the role a mixed methods person might play in bringing the two together. Each group is responsible for a separate stream of publications. Similarly, researchers in the health fields who are implementing research with an experimental design are encouraged to add a qualitative phase to tap into participants' views, but there is little evidence that those views are integrated into the quantitative phase in a way that has a significant impact on the design of the intervention or the way it is implemented. Each of these practices not only limits the likelihood of leveraging the full potential of mixed methods but also counters some of the most fundamental principles that drive its practice.

Much has been written about quality in mixed methods research, most of it attending to standards of reporting and prioritizing methodological transparency. Work in this vein, including my own Mixed Method Evaluation Rubric (Creamer, 2018a), places so much emphasis on transparency because it reflects a mandate to communicate the results of research with enough precision and clarity to build a case for the credibility of the results and to allow others to build on them in another study. The recent release of a report from a task force established by the American Psychological Association (Levitt et al., 2018) also prioritizes methodological transparency. Among the most salient standards are the expectation that a report contains explicit language that provides a rationale for the benefits expected to accrue from a mixed methods approach, identification of one of the widely recognized designs (e.g., concurrent, sequential, or multi-phase), and explicit commentary about how the qualitative and quantitative strands, data, and/or analytical procedures were integrated or mixed. Reflecting on these kinds of concerns in a manuscript goes a long way toward warranting the conclusions of a study and is very likely to facilitate communication with an audience that extends beyond a single discipline or content area.

Guidelines or standards for reporting are not without their detractors, who charge that they push a drive for standardization that smothers innovation and that they default to a prescriptive checklist mentality. Although the two are not unrelated, adhering to standards for reporting is not the same thing as producing empirically sound research. Good reporting can camouflage poor research, just as poor reporting can downplay research that is both innovative and solidly grounded in content and methodological knowledge.

My aim in this editorial is to offer practical guidance to newcomers to mixed methods research as they design, implement, and publish the results of a study using mixed methods. My comments will also be helpful to reviewers of mixed methods manuscripts, as well as to those undertaking the task of preparing a credible mixed methods proposal for external funding. Drawing on my experience in teaching graduate-level courses in mixed methods research design and in producing a textbook about it, I set out to address common misconceptions that are evident in the way mixed methods research is often practiced. In doing so, I depart from the convention of equating quality with a set of guidelines about how to achieve methodological transparency in reporting. Instead, I argue for what I am referring to as methodological integrity. By methodological integrity, I mean that research is conceptualized and implemented in ways that resonate with the fundamental assumptions or core principles of a methodology. In qualitative research, this means that there is an inductive or exploratory component. In quantitative research, hypothesis testing is a core principle. In mixed methods, methodological integrity means embracing the principal argument for its use: integrating qualitative (open-ended, small sample) and quantitative (closed-ended, large sample) data produces stronger conclusions than can be achieved with a single method alone. Thinking from the perspective of methodological integrity enhances quality while at the same time leaving room for innovation. It is the difference between mixed methods and mixed-up methods.

Misconceptions about mixed methods research can influence daily practice as well as the potential to leverage research to generate innovative insight and nuanced explanatory power. The research practices of engineering education researchers can be advanced by reflecting on four misconceptions: (a) a project with a qualitative and quantitative strand is necessarily mixed methods; (b) the only place to bring the qualitative and quantitative strands together is at the final interpretive stage of drawing conclusions; (c) discordance and incongruity between findings from the qualitative and quantitative data primarily signal a weakness in the instruments, methods, or sampling procedures; and (d) it is possible to implement a mixed methods research project with little or no familiarity with the philosophical and procedural assumptions that guide any research method, including mixed methods.

QUAL + QUANT Is Not Always Mixed Methods

The single most common misconception about mixed methods is that the label is appropriate whenever a study has a qualitative phase and a quantitative phase, regardless of whether there is any opportunity to integrate the two. This insidious logic is evident in commonly accepted search terms for mixed methods publications that equate mixed methods with QUAL + QUANT (mixed method* or [Qualitative + Quantitative]). It is often evident in practice in what is referred to as either a concurrent or parallel design, where the qualitative and quantitative data collection and analysis are kept entirely separate. From the perspective of mixed methods, what is missing, of course, is the epistemological precept that places a premium on the potential gains in knowledge when different sources and types of data are actively and systematically engaged.

Ways to Integrate Qualitative and Quantitative Data Are Not Limited to the Final Stage of Drawing Conclusions

The original framing of mixed methods research designs largely positioned the act of integrating or mixing qualitative and quantitative data as something that occurs at the final stage of a study, during the interpretive work of drawing conclusions. This thinking is evident in early visualizations of the core designs, where the qualitative and quantitative strands are kept separate and only intermingle as a final step in the research process. More recent depictions of the core designs situate mixing as something that also occurs during analysis. This might be the case, for example, when items from a survey are used to test themes that emerged through the qualitative analysis. Or it might be seen in a table referred to as a joint display that arrays qualitative themes or quotes alongside scores from a quantitative index.
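For readers unfamiliar with the format, the short sketch below (in Python, using the pandas library) shows one hypothetical way a joint display might be assembled; the participants, themes, quotes, and the influence_index variable are invented for illustration and are not drawn from any actual study.

    import pandas as pd

    # Hypothetical qualitative themes and quotes coded from three interviews
    qual = pd.DataFrame({
        "participant": ["P01", "P02", "P03"],
        "theme": ["family influence", "teacher encouragement", "self-reliance"],
        "quote": [
            "My mother was the one who pushed me toward engineering.",
            "A teacher kept telling me I was good at math.",
            "Honestly, I made the decision completely on my own.",
        ],
    })

    # Hypothetical scores on a quantitative index of external influence (1-5 scale)
    quant = pd.DataFrame({
        "participant": ["P01", "P02", "P03"],
        "influence_index": [4.2, 3.8, 1.9],
    })

    # The joint display: one row per case, with the qualitative and quantitative
    # evidence arrayed side by side so convergence or dissonance is visible
    joint_display = qual.merge(quant, on="participant")
    print(joint_display.to_string(index=False))

Even in this toy form, reading across a row makes it possible to see where the two strands converge and where they pull apart, which is the kind of dissonance taken up later in this editorial.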

Fully integrated mixed methods research (FIMMR) has been described as the most dynamic and interactive of the mixed methods designs. It provides opportunities to integrate qualitative and quantitative data and procedures at every stage of the research project: from the research questions, through data collection, sampling, and analysis, to drawing conclusions. This type of design is necessarily sequential and recursive. It prioritizes an ongoing dialectical engagement between the qualitative and quantitative strands of a study. FIMMR resonates with the philosophical or theoretical assumption that the mixed methods label is most apt when there is an intent to communicate that the interface between the qualitative and quantitative strands is key to understanding the way the research was conducted and the conclusions that were drawn (Creamer, 2016). It can reflect a mindset that frames a project from the outset, but it also often emerges only later as the best label for a complex project that evolves in unexpected ways.

Incongruities Between Findings from Different Sources of Data Are Not Necessarily a Methodological Limitation

It is very common for a researcher in the midst of a mixed methods project to find that the qualitative and quantitative data tell very different stories, even about the same construct. Qualitative data, for example, often reveal differences between individuals or sites that are not evident in the quantitative data. Researchers frequently conclude that intermingling qualitative data with the results from a quantitative instrument reveals that a phenomenon is far more complex than the instruments used to measure it. In a funded project I was involved in, for example, we found striking differences between how undergraduate women responded on a questionnaire and what they reported 3 months later during an interview about who was influential in their choice to pursue engineering.

While this kind of situation might challenge the researcher who assumes there is always a “right answer” and that the answer is always singular, I am among those who have written with enthusiasm about the potential for theoretical insight that arises from pursuing unexpected and even counterintuitive findings with additional analysis (Creamer, 2018b). Seeking information about unanticipated outcomes of an intervention is an example of this, as is being intentional about pursuing an explanation for negative or extreme cases. The potential explanatory implications of dissonance that emerges during analysis, including its contribution to theoretical insight, are only evident in light of an informed grounding in the literature.

Writing in an Informed Way Requires Grounding in the Methodological Literature

In the process of scouring the literature in diverse fields for exemplars to use in my writing, I have been struck by the amount of innovation in the use of mixed methods that is evident in practice, and by how far that innovation extends into new and emerging fields of inquiry. I have uncovered many examples of innovative strategies for mixing that have yet to surface in textbooks. This is a case where practice seems to have outpaced the literature. Examples of innovative insights, including those that challenge long-standing theory, seem to occur when exigencies in the data drive researchers to seek explanations from new sources of literature or additional data collection. Reading diverse perspectives in the foundational literature promotes innovation and agency because it underscores that there are many different ways to approach mixed methods research.

That said, it is my firm conviction that producing nuanced research requires both sophistication about a content area and expertise in the research methods that are used. While initial reading of the relevant methodological literature may help in framing and designing a study, it is often necessary to reinvest time in the literature to help understand unexpected results once the study is underway. At the very least, some familiarity with textbooks and articles about the methods provides the foundation required to accurately use the language appropriate to each method, a precision that is key to methodological integrity.

Concluding Thoughts

I noted that my purpose in this editorial was to offer practical insight that would be useful to a researcher inexperienced with mixed methods. I have based my comments on the idea that methodological integrity is a useful umbrella for understanding quality in mixed methods research. Sandelowski (1996) refers to a similar idea when she argues that researchers using mixed methods have an obligation to ensure the integrity of each set of techniques. I add to her views by suggesting that, as challenging as it may be, ensuring methodological integrity requires attention to three methods (qualitative, quantitative, and mixed methods) and not just two. Part of the challenge is that in mixed methods, methodological integrity requires deploying technical language associated with each of the three methods involved.

Only two assumptions lie at the methodological foundation of mixed methods: first, that both qualitative and quantitative data and analytical procedures are used and, second, that these are integrated in some meaningful way during the research process. I join those who posit an additional stipulation that mixed methods involve the active engagement of different perspectives and standpoints (e.g., Greene, 2007; Johnson & Onwuegbuzie, 2004). The “mixed-up methods” moniker may be deserved when these foundational assumptions are disregarded, or when the mixed methods label is applied to research that defies the foundational assumptions of both qualitative and quantitative approaches, including their indicators of quality.

Partly because it often involves complex, multidimensional constructs operating in a multilevel environment, mixed methods research invites creativity when it is built on the foundation of the basic assumptions I have mentioned. Creativity in mixed methods is often achieved by integrating different sources of data, and frequently different viewpoints, in innovative ways. As takeaways from this editorial, I suggest that the potential for creativity and innovative new insights can be advanced by (1) taking advantage of opportunities to engage the qualitative and quantitative strands at multiple points in the research process; (2) pursuing dissonance and incongruities between sources of data; and (3) reading broadly from the methodological literature to gain a sense of the diversity of viewpoints. These takeaways underscore that quality in mixed methods is best achieved not with slavish attention to a recipe but with foundational knowledge that spans different viewpoints and that can serve as a platform for generating innovative new perspectives, methods, and methodological combinations.

Biography

  • Elizabeth G. Creamer is the author of the 2018 Sage textbook, An Introduction to Fully Integrated Mixed Methods Research, and Professor Emerita in the School of Education at Virginia Polytechnic Institute, 3012 Lancaster Drive, Blacksburg, VA 24060; [email protected].
