man at dusk against a dark sky standing on a mountain peak, holding a flashlight that shines into the dark

Tools for using Critical Theory in interpretivist research

My goal is to offer a tool that helps qualitative, interpretivist researchers use Kincheloe & McLaren’s 2011 chapter to apply Critical Theory to relativist research studies. I believe Critical Theory has much to offer less “realist” researchers such as me, and I would like to share this with anyone who might find it useful.

I’m going to address a few background issues to explain what I mean (so I don’t get an angry horde on the internet, should anybody ever read this post – lol), and then I’ll offer the source and my tool.

First, some background

When I was writing my dissertation proposal in leadership theory, I planned to use constructivist Grounded Theory (in the tradition of Kathy Charmaz) to study my question. My study morphed away from that design (and the various jokes about how dissertations start grand but end …. far less grand have certainly been true in my case).

Screenshot of a Tweet by Daniel Bolnick with a photo and the caption, "Thesis Proposal. Thesis." The photo includes two images side-by-side. The left is a beautifully carved wooden staircase railing featuring a majestic stallion rearing up on its hind legs. The right shows a plain stair railing ending in a plastic toy horse strapped to it. Expectations vs. reality.

In my original proposal, I included a discussion of why Critical Theory adds value to my inquiry. I wasn’t intending to do a CT study, but leadership studies that don’t at least pause to consider issues of power and structural disenfranchisement (and gaining some wisdom from our CT friends on that) strike me as naive. Has the researcher considered who is “in” and who’s “out” and how that affects the validity of their research data (even if they’re doing a wholly qualitative study)?

My husband — who was trained at UGA as a qualitative researcher in education — challenged me on this point. He pointed out that Critical Theory rests on a realist approach to epistemology, and my study was stating a constructivist foundation (to stay in line with the approach I wanted to use for GT). He argued that CT demands a realist framework, and I had proposed something with an inherent contradiction.

OK, fair point.

He also suggested that I move instead to “being ‘informed’ by CT” — sensitized to its concepts and questions. This offered me a way to keep an eye on those critical questions which mean a lot to me as a researcher. My committee didn’t push me too hard on that point, and the compromise made it into my proposal.

I’m currently elbows-deep in writing my results and discussions chapters, analyzing the data from what ended up being an exploratory case study (Covid conditions and the short timeframe imposed by my institution for this program made grounded theory impractical). I am definitely at the “plastic horse” stage of the writing. lol

In my data analysis, I ran across Kincheloe and McLaren’s excellent 2011 book chapter on the useful applications of modern Critical Theory to qualitative, interpretivist research. (I resisted the urge to print all 50 pages and slap it down on my husband’s desk in triumph.)


The original source for my tool

Kincheloe, J. L., & McLaren, P. (2011). Rethinking Critical Theory and Qualitative Research. In K. Hayes, S. R. Steinberg, & K. Tobin (Eds.), Key Works in Critical Pedagogy (pp. 285–326). SensePublishers. https://doi.org/10.1007/978-94-6091-397-6_23

(PDF of the chapter)

We can be against critical theory or for it, but especially at the present historical juncture, we cannot be without it.

Kincheloe & McLaren, 2011, p. 286

Kincheloe and McLaren argue that many dimensions of critical inquiry align with the overarching purposes of the qualitative researcher who seeks to shake up established “truths” and challenge norms. The purpose of their chapter is to work through a variety of dimensions in an attempt to give an “idiosyncratic” definition of what Critical Theory is (and isn’t) as we move deeper into the 21st century. “The following points briefly delineate our interpretation of a critical theory for the new millennium” (p. 288).

A critical postmodern research requires researchers to construct their perception of the world anew, not just in random ways but in a manner that undermines what appears natural, that opens to question what appears obvious. … [I]nsurgent researchers ask questions about how what is has come to be, whose interests are served by particular institutional arrangements, and where our own frames of reference come from. Facts are no longer simply “what is”; the truth of beliefs is not simply testable by their correspondence to these facts. To engage in critical postmodern research is to take part in a process of critical world making, guided by the shadowed outline of a dream of a world less conditioned by misery, suffering, and the politics of deceit. It is, in short, a pragmatics of hope in an age of cynical reason.

Kincheloe & McLaren, 2011, p. 315 (emphasis mine)

The tool: Use a Critical lens and reflect on your research design and data

So how could this treatise be of use? I immediately saw the value in taking notes on each sub-section, skimming through the critical dimensions to remind myself of elements that might be useful to me later. I wrote a series of questions aligned to the article and saved my notes for later.

This weekend, as I’ve been reflecting on the entirety of my MAXQDA codes from my subject interviews, I returned to my notebook with its handwritten list of questions and created a Word document. Not all of these questions were relevant to my needs, but the exercise helped me clarify the role of power and power-structures in the context of my research. Lots of good “future directions for research” ideas on there too.

Here is my tool in document form:

Link to Word file | Link to PDF file

The Word file will be more useful than the PDF, but both are here.

CC-BY license logo: Creative Commons - free to use, share, and remix with attribution

Creative Commons – BY license: You are free to share, remix, and use this tool freely, with attribution. Thanks!

Draw what you read

I ran across Jeremy Waite’s brilliant book sketches on LinkedIn (don’t laugh; it was my once-per-six-months jaunt through old messages and eyerolls, because LinkedIn’s ad structure is just so … obnoxious). He reads and then draws visual notes and representations of concepts to help him remember what he read.

I like to say “All good ideas are stolen” — by that, I mean that we’re always borrowing good ideas we see out in the world, adding them to the ideas we already have, and remixing them into new things. I enjoy visual notetaking.

What I like about Jeremy’s style:
– it’s very neat and readable; he can save these and refer back to them or share them with others, and they’ll instantly be useful
– the colors truly help because the whole page isn’t colored; he uses color to draw the eye back to headings or key ideas
– the mix of text and visual elements is much more text-heavy than what I usually see in people’s visual note-taking, but it’s more in line with what my visual notes look like (except mine are a hot mess compared to this)

Personal goal

I don’t think I should commit time to anything new right now, but I might try sketching a few more visual notes as I read core books and posting them here. No promises on this one; my bandwidth is mostly claimed and I’d lose myself in the joy of colored pencils rather than slaving in the salt mines of reading articles from my Zotero pile. 😀

A short, helpful overview of one qualitative data analysis method

Sarah J Tracy, 2018: “A phronetic iterative approach to data analysis in qualitative research”

Source:

Tracy, S. (2018). A phronetic iterative approach to data analysis in qualitative research. Journal of Qualitative Research, 19(2), 61–76. https://doi.org/10.22284/qr.2018.19.2.61

Full-text available on Tracy’s website


I absolutely adore the writing and work of Dr. Sarah Tracy, a professor of communications theory and qualitative research at Arizona State, and author of my favorite textbook on doing qualitative research:

link to book on Amazon

Seriously, if you are heading into a qual research project, buy a copy of this book. It’s worth every penny.

So I was thrilled to run across an article on her website which offers a much shorter overview of her recommended method for qualitative data analysis. Link and source are all above.

This is the perfect article to put into your files if you work with undergraduates, grad students, or new researchers. It explains Tracy’s simple-to-follow “phronetic iterative” approach to moving from initial research question(s) to simple first-level coding, then analytical coding and memo-writing.

Jargon-free and straightforward, the article is a great example of how Tracy understands (probably through her experience teaching) the need to explain research methods in terms that people can grasp, so they can spend more time on building their actual research skills.


If, like me, you had to pause on the word “phronetic” for a moment, this might be a helpful explanation of the term. From Tracy’s article:

Phronēsis is an ancient Greek word that is typically translated to mean “prudence” or “practical wisdom” (Aristotle). Phronēsis prioritizes examination of contextual knowledge. Social action is always changing; therefore, situated meanings are crucial for making sense of any given social phenomenon. Phronēsis also focuses on the way that data can be systematically gathered, organized, interpreted, analyzed, and communicated so that it illuminates significant problems and can contribute to transformation and improvement in relationships, organizations, and societies

(Tracy, 2018, 62).

“On the scientific study of small samples” (McDermott 2023)

Source:

McDermott, R. (2023). On the scientific study of small samples: Challenges confronting quantitative and qualitative methodologies. The Leadership Quarterly, 34(3), 101675. https://doi.org/10.1016/j.leaqua.2023.101675

Full text here: https://www.sciencedirect.com/science/article/pii/S1048984323000012?via%3Dihub

The gist:

On research vs science as a general introduction: McDermott writes from a positivist perspective, but she is very careful in her opening discussion to define what she sees as the difference between “research” and “science” in a way that doesn’t bash qualitative or interpretivist researchers. (I don’t fully agree with her opening discussion, but I definitely appreciate it.) She critiques poor research design on the part of both quant and qual researchers, and delivers well-deserved criticism of interpretivist researchers attempting to use qual methods to establish causation. (I’m not sure how often this happens in interpretivist research, but the pressure to “prove something” through one’s work is immense.)

On avoiding key research errors when working with small sample sizes: This is where the article really shines. McDermott reviews six key problems and offers a solution for each. Her suggestions are summarized in this screenshot of Table 1 (the entire article is free online), but I encourage you to read the full discussion, as she provides many helpful examples to illustrate both the problems and the solutions.

screenshot of Table 1 from McDermott 2023; article is available online full-text for free at LQ

My thoughts on McDermott’s discussion

I don’t agree with every point here; as a qualitative researcher, I find single-case research to hold value even if it does not fit McDermott’s definition of “scientific,” and her recommendation to add cases to avoid a single-case study doesn’t fly in many real-world settings (like examining a particular unique and non-reproducible instance of failure or success).

That nit-pick aside, I took to heart many of her critiques.

For example, it is tempting as a qualitative researcher to use the same limited set of data both to generate a theory or hypothesis and to draw conclusions about it. However, more robust research would use an initial study to identify key ideas and themes for follow-up research, building study upon study. Of course, funding is the bugaboo here; we can invest only the time and resources we can afford, and those are finite (even when the money is available, which it often is not, especially for non-quantitative approaches). But interviewing over Zoom costs only time, effort, and outreach. An investment, yes, but one that could pay off with a much more valuable dataset and better-established conclusions.

One of McDermott’s primary suggestions is to seek out opportunities for counterfactual observations. This advice lines up with a central theme from my reading of constructivist grounded theorists like Kathy Charmaz, whose 2014 handbook on grounded theory continually emphasizes the need to go beyond simplistic explanations and to seek out participants whose experiences or contributions would challenge the emergent theory. McDermott’s examples drawn from historical research were particularly useful here, such as her critique of some classic leadership studies that made assumptions about leadership causes and effects too hastily.

One example of bad research that I’d not read about before: the famous “queen bee” study, which proposed that powerful women inhibit the growth and development of younger, less established women in their orbit to stifle competition for their power, was based on a single set of data, with no counterfactual investigation. Later research refuted the whole idea! I’ve heard that “queen bee” research cited many times; it has probably passed into headcanon for many folks who read HBR and track equity issues in management. Launching a splashy article on a single dataset is always suspect, but current publishing pressures favor “new” research over the crucial work of verifying and replicating proposed ideas.

Readability & Value

I found this article to be highly readable, accessible, and clear. I would expect any graduate student to be able to read these few pages (the PDF is only 10 pages) and follow McDermott’s argument and recommendations. Upper-level undergraduate students should be able to work through the article with scaffolding from their professor as needed to provide context on the issues discussed. (I would particularly like to see more of this discussion in undergraduate psych programs.)

I would include this as required reading in any graduate course on research methods, whether quant or qual, because she discusses both methodological traditions and offers a working definition of positivist “scientific” research in the opening discussion. She also overviews key techniques for improving any study design. I’d love for this article to be one that doctoral students discuss with their advisors when they are planning their methods section.

Interpretivist scholars may bristle at being excluded from the “scientific” wing of “research,” but we should be used to that by now. And I’m not mad about it; qualitative research has the most to offer when it’s doing something quant methods cannot do.

Leaders shape how well an organization can learn and adapt

Two women meeting at a table. Photo by Christina @wocintechchat.com courtesy of Unsplash.

Power is a leadership tool that should be used to enhance the knowledge-processing capacity of the organization.

Martin & Marion, 2005, p. 149 (article link)

In my dissertation research, I studied the flow of information and influence within undergraduate STEM departments where faculty were interested in moving to more learner-centered teaching approaches. (I am almost able to share the ProQuest link – will add when it’s live.)

My research shows that faculty hold more power in the middle than they may realize – they are integral partners in the vital leadership work of information flow and of processing new ideas into the organization’s cultural repository.

Some background on how ideas flow through organizations and why you should care

At a deep level, organizations run on how quickly good ideas can disseminate through people (teams, departments, individuals) and be adopted. Anytime something goes wrong or a challenge emerges, people begin working to solve it – this process could be formal or informal, quiet or shouted from the rooftops.

What matters, according to the authors of one of the foundational theories in my research (McElroy & Firestone), is that this Knowledge Life Cycle (their term) does run at every level of an organization. Whether yours is functional or broken can have a huge impact on how well the organization is able to respond to threats, challenges, and new problems.

Martin and Marion did the first research into how this concept might be applied to higher education institutions by studying the various roles that top leaders play in shaping the KLC of their institution. The researchers settled on six roles that a leader plays to facilitate an open and helpful flow of knowledge throughout an organization at all levels.

See, leaders can’t actually put their hands on this Knowledge Life Cycle process. They can’t force people to come up with new ideas or implement them. (Well, leaders can force implementation to a point, but they definitely can’t make people “like” it or remember to care.) What leaders can do is build an environment which encourages everyone to own the problems that emerge in their areas and disseminate that knowledge quickly and helpfully across the organization (as appropriate).

Leaders can and should build an organizational environment that upholds the cultural values they want to see. They should (in the words of Jim Collins, Good to Great) put the right people in the right seats on the “bus” to create healthy departments. And they can take specific steps to improve the flow of crucial problem-solving knowledge and solutions across their whole organization.

The idea here is not that individuals can fix any organizational mess simply by “knowing things.” Rather, it is to acknowledge that learning (ideas, processes, skills, data, etc) is foundational to solving any problem, and the most effective solutions arise out of humans working together in an environment of transparency, good information flow, and safety. I am using “knowledge” loosely as a term for what we need to know or be able to do in order to accomplish the work of the organization.

A flowing rainbow of light down a dark image. Photo by Tobias Carlsson on Unsplash

Leadership roles in organizational information flow

According to Martin and Marion, there are six key roles for leaders. I will discuss each of these.

Environment manager

Leaders need to break negative patterns of behavior and replace them with new methods of solving problems, identify the gaps in organizational knowledge that are causing challenges, and create the kind of workplace where people feel safe to speak up about problems. (LJR: My study of leadership and my work experience lead me to believe that this is one of the most crucial leadership roles, as it is nearly impossible for people outside the power structure to change the environment in significant ways, outside of perhaps large-scale solidarity movements.)

Network manager

People exist within social networks in every organization. Complexity Theory provides key insights into how many organizational behaviors are emergent from the relationships that exist between employees. It is the job of the leader to ensure networks within the organization are healthy and broad enough to enable information to flow among departments. Martin and Marion write, “The strength of organizational networks is much larger than individual relationships; it is a collusion of multiple roles and expertise bound together to … enable creative thinking and [strengthen] collaboration and knowledge processing networks” (Martin & Marion, 2005, p. 144).

Policy manager

Here is where the leader’s power is most obvious: policy decisions are often under the purview of leadership, so leaders are ultimately responsible for ensuring that the organization benefits from clear guidance that empowers individuals to identify a problem, start finding solutions, and then share those solutions so the entire organization benefits. Likewise, leaders need to break down clunky bureaucratic processes that inhibit people from acting rapidly in the face of big problems. Nobody said this would be easy, but a healthy environment for knowledge and problem-solving is usually one where people aren’t so boxed in that they can’t interact with others across the entire organization.

Crisis manager

We learn the most after something really breaks. Leaders must initially manage crises to get things back on track, but they must also take the opportunity to break apart processes and internal structures that led to problems blowing up instead of being recognized and solved. Repeated crises may be a symptom of a deeply dysfunctional knowledge life cycle — problems are not being recognized, or people do not feel empowered to attempt to solve them, or individuals are too afraid to speak up and identify challenges.

Knowledge gap identifier

This is perhaps my favorite of the six roles, as I often find myself taking on this work naturally within any organization I’m part of. (My research shows that some of these roles, including this one, should be inhabited by front-line employees as much as by the leader.) Martin and Marion write, “The ability to identify knowledge gaps is a critical leadership skill.” Ideally, many people in the organization are spotting gaps in its abilities or information and bringing them to light to be solved, but it is absolutely on the leader’s shoulders to stay aware of what’s happening across all levels.

Future leader preparation

Leaders must be role models of the behaviors that lead to an open environment where people feel safe to point out problems and work together to solve them. (Ron Heifetz’s books on Adaptive Leadership Theory are very helpful here.) But no one person can conquer this task alone. Leaders must help mentor promising leadership candidates and ensure that everyone, especially potential leaders, understands the importance of knowledge processing for the health and success of the organization.

people with hands on a tree trunk, shot from the roots at an upward angle so you see only people's hands and the tree reaching toward the sky. Photo by Shane Rounce on Unsplash

What you can do with this theory

I love finding practical theory and thinking about how I might apply it in my role or imagine how I would lead a team in light of what I’ve learned. I find the Knowledge Life Cycle to be an interesting way to approach visualizing the way an organization is empowered to solve problems for itself.

Some “problems” are tiny and personal. YOU have your own knowledge cycle; you knock it into gear every time you realize you’re going to need to change something about how you work to make things run more smoothly or in response to a change in your environment.

Individually, I try to run a “stop / start / continue” exercise for my personal practices at work every 6-12 months. I have often abandoned tasks I had created for myself “to be helpful” or “to stay organized” which ended up costing me more time than they were worth. Ultimately, the knowledge life cycle is about recognizing a gap, testing solutions, and picking the “best fit” for the circumstances. As things change, our practices often need to be updated.

Every level of an organization has its own KLC: individuals, the teams they work with, departments, large business units, and the organization as a whole. Anyone who aspires to be a leader should begin honing their ability to recognize a healthy knowledge cycle vs a dysfunctional one.

Application to higher education

My research suggests that faculty in higher education inhabit three of these roles at times, at least within their own departments: environment manager, network manager, and knowledge gap identifier. (Deans and department heads should take on all six roles as core to their work.) Frontline faculty, even brand-new folks early in their careers, are integral to building a healthy departmental environment where people model recognizing challenges, speaking up about what they see, and sharing ideas — what they’ve tried, what’s worked or not worked.

Likewise, faculty can and should work to build strong social connections within their departments and beyond, reaching across the university to get to know a variety of faculty and staff partners who can be instrumental in solving instructional problems. While it’s true that, say, chemistry professors are not going to ask someone in the history department how to teach labs, all professors are educators at the core, and students benefit from faculty who are trained in instructional techniques and supported by their peers in learning and growing as teachers.

Every employee can and should recognize knowledge gaps in the processes they encounter every day. I should clarify that a “knowledge gap” isn’t some fancy event — you hit a knowledge gap whenever you realize a process has broken down, or a new challenge has emerged, or a problem isn’t going to be solved via some off-the-shelf simple fix. Who is best positioned to see these emergent challenges? The people who do the work of the organization!

Resources you can use

I’d like to conclude by recommending some good reads.

For more on higher education, read the article by Martin and Marion which I’ve linked here. It’s not long, and it’s a helpful model for strengthening higher education institutions.

Of course, I’d love for you to read chapter 2 or chapter 5 of my dissertation if you want to think more about how faculty participate in the knowledge life cycles of their departments and institutions.

For the foundational theory, check out this book:

  • McElroy, M. W. (2003). The new knowledge management: Complexity, learning, and sustainable innovation. KMCI Press. You can get a used copy for a dollar or two on Amazon (my affiliate link to the book).