The epistemic fallacy in education: Beyond ‘what works’

In contemporary educational discourse, the push for evidence-based practice (EBP) and the ‘what works’ policy agenda has gained significant traction. Policymakers and educational leaders emphasise measurable outcomes, scientific methodologies and the application of research findings to inform teaching and learning strategies (for a positive take, see Sanders & Whelan, 2022; for a critical overview in relation to education, see Jones, 2025). 

While this approach seems logical, it is fraught with epistemological challenges, particularly the epistemic fallacy. This fallacy – where knowledge (epistemology) is conflated with reality (ontology) – has profound implications when applied to education, especially in understanding teaching and learning as complex social phenomena (Bhaskar, 1989).

This blog explores how the epistemic fallacy manifests in the ‘what works’ approach to EBP, critically examining its limitations through the lens of critical realism. It highlights the risks of reducing complex educational realities to empirical data and argues for a more nuanced approach that accounts for the multifaceted and context-dependent nature of education.

Understanding the Epistemic Fallacy

The epistemic fallacy, a term introduced by Roy Bhaskar in his development of critical realism, refers to the erroneous reduction of what exists (ontology) to what can be known or measured (epistemology) (Bhaskar, 1989). In the context of education, this manifests in the assumption that measurable outcomes, such as test scores, fully capture the effectiveness of teaching methods or educational interventions.

For instance, when policymakers advocate for a ‘what works’ approach, they often rely on randomised controlled trials (RCTs) and large-scale quantitative studies. These methodologies, while valuable, assume that the causal mechanisms underlying educational success can be directly observed and quantified (Goldacre, 2013; Slavin, 2002). This often overlooks deeper, potentially unobservable factors such as social structures, teacher-student relationships, institutional cultures and broader socio-cultural influences (Huang & Pu, 2024; Tikly, 2015).

Critical realism challenges this assumption by advocating for a stratified ontology, recognising three levels of reality (Bhaskar, 1975; Tikly, 2015):

  1. Empirical Level: Direct observations and experiences (e.g., student test scores).
  2. Actual Level: Events that occur, regardless of whether they are observed (e.g., classroom interactions).
  3. Real Level: Underlying causal mechanisms that shape these events (e.g., systemic inequalities, pedagogical philosophies, cultural identity).

By collapsing these levels into a single empirical layer, ‘what works’ EBP risks misrepresenting the complexity of educational reality (Bhaskar, 1998; Wrigley, 2019).

The ‘What Works’ Agenda: A Problematic Framework

The ‘what works’ policy agenda is built on the assumption that effective educational strategies can be universally identified and applied across diverse contexts. This approach has been influential in shaping education policy, funding decisions and teacher training programmes. However, it exhibits three key manifestations of the epistemic fallacy:

1. Reductionism: Simplifying Complex Realities

A major flaw in the ‘what works’ agenda is its reductionist approach to education. Teaching and learning are context-dependent processes influenced by culture, history and individual student needs. Yet the science-centric ‘what works’ configuration of EBP often treats them as if they unfolded under laboratory-like conditions, where interventions can be isolated and tested in controlled environments.

For example, an RCT might conclude that a particular reading intervention improves literacy scores in a number of settings. However, this does not mean the same intervention will be equally effective in every school, given distinct socio-economic and socio-cultural conditions and varying levels of teacher expertise and experience (Huang & Pu, 2024). The assumption that educational strategies can be universally applied disregards the reality that schools function as open systems with multiple interacting variables (Bhaskar, 1998).

Terry Wrigley critiques this reductionism, arguing that such approaches often oversimplify the complexities inherent in educational settings, leading to policies that may not address the nuanced needs of diverse learning environments (Wrigley, 2004, 2019). This echoes Biesta’s (2007, 2010) argument that education is not simply about discovering ‘what works’ but about understanding what is educationally desirable.

2. Empirical Bias: Privileging Measurable Outcomes

Another manifestation of the epistemic fallacy in the ‘what works’ approach to EBP is the prioritisation of empirical data over qualitative insights. Standardised test scores, attendance rates and other quantifiable metrics dominate policy discussions, often at the expense of richer, context-driven understandings of learning (Hammersley, 2005, 2013; Wrigley, 2016).

Critical realism argues that not all causal mechanisms are directly observable. A teacher’s ability to inspire critical thinking, a student’s engagement with learning materials, or the long-term impact of a curriculum cannot always be captured through short-term data collection (Bhaskar, 2008). By overemphasising what can be measured, the ‘what works’ notion of EBP marginalises aspects of education that are crucial for meaningful learning (Newman, 2020).

3. Neglecting Agency and Context

‘What works’ EBP often fails to account for teacher and student agency – the capacity to act within and shape educational environments. It assumes that successful strategies can be imposed top-down, disregarding the role of educators in adapting methodologies to their unique classrooms (Archer, 1995).

For instance, a study may find that direct instruction is the most effective teaching method for improving maths scores. However, in a real-world setting, the effectiveness of direct instruction depends on how teachers implement it, their relationships with students and the broader cultural context of learning (Pu, 2022). Critical realism emphasises that agency is shaped by structural conditions but also has the power to modify those conditions (Bhaskar, 1989).

Towards a More Realistic Educational Research Paradigm

A more effective approach to educational research and policy must move beyond the epistemic fallacy by integrating critical realism’s ontological depth. This means recognising that:

  1. Evidence Should Be Context-Sensitive: Rather than searching for universal ‘best practices,’ educational research should focus on what works for whom, in what contexts, and under what conditions (Tikly, 2015).
  2. Qualitative and Quantitative Research Should Be Complementary: While quantitative methods provide valuable insights, they should be complemented by qualitative approaches such as ethnography, case studies and critical discourse analysis (CDA) (Scott, 2010; Hammersley, 2013; Wrigley, 2016; Huang & Pu, 2024).
  3. Teachers and Students Must Be Seen as Active Agents: Policies should not merely dictate strategies based on large-scale studies but should empower educators to interpret and adapt research findings to their specific classrooms (Archer, 1998).
  4. Education Should Be Viewed as an Open System: Unlike controlled experiments in the natural sciences, education is influenced by a web of interconnected factors (Pawson, 2013; Scott, 2010).

Conclusion: Beyond ‘What Works’ to ‘What’s Real’

The push for evidence-based practice in education is well-intentioned, aiming to improve teaching and learning through scientific research. However, when it succumbs to the epistemic fallacy, it risks oversimplifying educational reality, privileging quantifiable empirical data over deeper causal explanations, and marginalising the agency of educators and students.

To build a truly effective educational system, we need to move from asking ‘what works’ to exploring ‘what’s real’ (Bhaskar, 2008). By adopting a critical realist perspective, we can ensure that educational research and practice contribute to meaningful and transformative learning experiences.

References

  • Archer, M. (1995). Realist social theory: The morphogenetic approach. Cambridge University Press.
  • Archer, M. (1998). Being human: The problem of agency. Cambridge University Press.
  • Bhaskar, R. (1975). A realist theory of science. Routledge.
  • Bhaskar, R. (1989). Reclaiming reality: A critical introduction to contemporary philosophy. Verso.
  • Bhaskar, R. (1998). The possibility of naturalism: A philosophical critique of the contemporary human sciences (3rd ed.). Routledge.
  • Bhaskar, R. (2008). Dialectic: The pulse of freedom. Routledge.
  • Biesta, G. (2007). Why ‘what works’ won’t work: Evidence-based practice and the democratic deficit in educational research. Educational Theory, 57(1), 1–22.
  • Biesta, G. (2010). Why ‘what works’ still won’t work: From evidence-based education to value-based education. Studies in Philosophy and Education, 29(5), 491–503.
  • Fairclough, N. (1989). Language and power. Longman.
  • Goldacre, B. (2013). Building evidence into education. Department for Education.
  • Hammersley, M. (2005). The myth of research-based practice: The critical case of educational inquiry. International Journal of Social Research Methodology, 8(4), 317–330.
  • Hammersley, M. (2013). The myth of research-based policy and practice. SAGE.
  • Huang, P., & Pu, S. (2024). Towards an explanatory critique of social reality: How critical realism can frame the application of critical discourse analysis in educational research. Cambridge Journal of Education.
  • Jones, A. (2025). Pedagogical prophet? David Hargreaves’ 1996 vision for a research-based profession: A 2024 reality check. PRISM, Early View.
  • Newman, J. (2020). Critical realism, critical discourse analysis, and the morphogenetic approach. Journal of Critical Realism, 19(5), 433–455.
  • Pawson, R. (2013). The science of evaluation: A realist manifesto. SAGE.
  • Pu, S. (2022). Critical thinking in academic writing: A cultural approach. Routledge.
  • Sanders, M., & Whelan, E. (2022). What Works, faster: Towards a “rapid method”. The Policy Institute, King’s College London.
  • Scott, D. (2010). Education, epistemology and critical realism. Routledge.
  • Slavin, R. E. (2002). Evidence-based education policies: Transforming educational practice and research. Educational Researcher, 31(7), 15–21.
  • Tikly, L. (2015). What works, for whom, and in what circumstances? International Journal of Educational Development, 40, 237–249.
  • Wrigley, T. (2004). ‘School effectiveness’: The problem of reductionism. British Educational Research Journal, 30(2), 227–244.
  • Wrigley, T. (2016). Not so simple: The problem with ‘evidence-based practice’ and the EEF toolkit. FORUM: for promoting 3-19 comprehensive education, 58(2), 237–252.
  • Wrigley, T. (2019). The problem of reductionism in educational theory: Complexity, causality, values. Power and Education, 11(2), 145–162.

Picture credit: Wikicommons (Used under a Creative Commons licence)
