Fluency First or Problem-Solving First? What the Evidence Says for KS1

The debate between "fluency first" and "problem-solving first" is one of the longest-running arguments in primary maths education. It's also one of the most consequential, because the answer changes what teachers do with 30 minutes of numeracy time for every child in Years 1 and 2.

The fluency-first position: children need to automatise basic arithmetic facts before cognitive resources are available to reason about more complex problems. Problem-solving before fluency overloads working memory and teaches nothing durable. The problem-solving-first position (often associated with enquiry-based learning): children construct mathematical understanding by working through problems, and drilling facts before developing problem-solving intuition creates students who can compute but can't think.

Both positions have serious proponents. Both have research supporting them. The disagreement is real, not manufactured. Here's what the strongest evidence actually shows.

What "Fluency" Actually Means in Research Terms

Fluency in maths research doesn't mean memorised recitation. It means accuracy, efficiency, and flexibility in applying procedures and facts. A student with number fluency can add 7+8 without counting, can choose between strategies (decomposition, recall, counting-on) based on what's most efficient for a given problem, and can explain what they're doing.

This definition matters because it's often conflated with rote memorisation, which is a narrower and less useful thing. When researchers find that fluency supports problem-solving, they're finding that accurate, flexible, automatic access to number facts frees working memory for higher-order reasoning — not that memorised chanting produces good mathematicians.
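To make the three-strategy picture concrete, here is a toy sketch in Python (purely illustrative; the function names and the tiny fact table are ours, not from any curriculum or research instrument) of the routes a fluent child can choose between for 7 + 8:

```python
# Three routes to 7 + 8. All reach 15; they differ in cognitive cost.

def counting_on(a, b):
    """Start at a and count up b times -- slow, occupies working memory."""
    total = a
    for _ in range(b):
        total += 1
    return total

def decomposition(a, b):
    """Near-doubles: 7 + 8 = (7 + 7) + 1, leaning on a known double."""
    double = a + a           # a known fact: 7 + 7 = 14
    return double + (b - a)  # adjust: one more makes 15

def recall(a, b):
    """Direct retrieval from memorised facts -- fast, frees working memory."""
    known_facts = {(7, 8): 15, (7, 7): 14, (6, 7): 13}
    return known_facts[(a, b)]

print(counting_on(7, 8), decomposition(7, 8), recall(7, 8))  # 15 15 15
```

Fluency, in the research sense, is having all three routes available and knowing which one is cheapest for the problem at hand.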

The Working Memory Case for Fluency First

The strongest theoretical argument for fluency before problem-solving comes from cognitive load theory, developed by John Sweller at UNSW Sydney and refined by researchers including Paul Kirschner. The core claim: human working memory has a limited capacity. When it's occupied by the mechanical computation of basic facts (what is 6+7?), there are fewer cognitive resources available for the higher-level reasoning required by a word problem (if James has 6 apples and gets some more, and ends up with 13, how many did he get?).

The empirical support for this claim is strong. A widely cited 2006 review by Kirschner, Sweller, and Clark surveyed a substantial body of research and concluded that minimally guided problem-solving is less effective than explicit instruction for novice learners — including young children. The authors found that enquiry-based and discovery approaches consistently underperformed direct instruction on measures of both procedural skill and conceptual understanding in learners who did not yet have robust domain knowledge.

More recent work by cognitive scientist Sian Beilock at the University of Chicago has linked arithmetic fact automaticity specifically to the ability to handle multi-step problems: students who automatically recall single-digit addition and subtraction facts make fewer errors on multi-step problems than students with equivalent accuracy but slower fact retrieval — even when controlling for general cognitive ability. The difference is that slower retrievers are using working memory capacity to compute basic facts, leaving less for the problem structure.

The Problem-Solving Advocate's Response

The strongest version of the problem-solving-first argument doesn't dispute that working memory is limited. It disputes what happens when children are forced to compute before they understand. The claim: children who are drilled on facts before they've developed any intuition for what numbers mean produce fragile, procedure-dependent knowledge that doesn't transfer to novel problem contexts. They can compute but they can't reason.

This is not a fringe position. It's associated with researchers including Jo Boaler (Stanford), who has published extensively on the relationship between maths anxiety, timed practice, and conceptual understanding. Boaler's work suggests that procedural drill without conceptual grounding specifically disadvantages students who don't already have informal number sense — which correlates with disadvantaged socioeconomic backgrounds. The drill-first approach, in this view, is not just less effective — it's inequitable.

The empirical evidence here is also real, though contested. Studies showing positive effects of enquiry-based learning in primary maths exist; so do studies showing null or negative effects. The research quality varies substantially across studies.

What the Strongest Studies Actually Show

When we restrict to high-quality studies (randomised controlled trials or quasi-experimental designs with adequate control groups), the picture becomes clearer. The Education Endowment Foundation's Teaching and Learning Toolkit, which is based on a systematic evidence review, rates mathematics teaching approaches at KS1 as follows: "Explicit, structured instruction" has "strong evidence" of positive impact (typically +5 months additional progress). "Enquiry-based learning" has "moderate evidence" of positive impact (+4 months) but with wider variance in outcomes.

Critically, the EEF notes that the best results come from structured programmes that combine explicit instruction with meaningful application — not either pure fluency practice or pure discovery learning. The mastery approach adopted in the Singapore and Shanghai systems, which UK schools have increasingly adopted since the National Centre for Excellence in the Teaching of Mathematics began promoting it in 2015, explicitly combines fluency development with conceptual understanding work and problem-solving — sequenced carefully rather than treated as alternatives.

The Mastery Evidence

The most rigorous UK evidence on this question comes from the Randomised Controlled Trial of the Mathematics Mastery programme, published by the EEF in 2015 and updated with longitudinal follow-up in 2019. The trial covered 83 primary schools in England and found small positive effects for mastery approaches on standardised mathematics assessments, covering both fluency and problem-solving measures.

The mastery approach is explicitly sequenced: concrete-pictorial-abstract, with fluency work at each stage before moving to the next level of abstraction. It does not present fluency and problem-solving as alternatives; it presents them as a developmental progression. A child builds fluency with concrete objects before abstract symbols, and builds problem-solving ability on a foundation of automatised concrete-then-abstract fluency.

This is not quite the same as "fluency first." It's more like "fluency at each level, before moving to the next" — a spiral rather than a linear sequence.

What This Means for KS1 Practice

For teachers working with Year 1 and Year 2 students, the practical implications of this evidence are:

Number sense before symbol work. Before Year 1 students encounter written addition, they should have substantial experience with concrete quantity — counting objects, comparing amounts, decomposing and recomposing small quantities physically. Rushing to written arithmetic before number sense is established produces the fragile procedural knowledge that the problem-solving advocates are rightly concerned about.

Fluency built on understanding. When KS1 students practise addition and subtraction facts, they should understand what those facts represent — not just recite them. The goal is not memorisation for its own sake but automaticity built on a conceptual foundation. Students who can explain that 7+8=15 because 7+7=14 and one more is 15 have a different kind of knowledge from students who have only memorised the fact.

Problem-solving that matches current fluency level. Word problems and mathematical reasoning tasks should be pitched at a level where the computational demands are within the student's automatic range. Asking a Year 1 student to solve a two-step problem involving subtraction facts they can't yet retrieve automatically is not developing reasoning — it's using up all available working memory on basic computation.

The practical answer to "fluency first or problem-solving first?" is neither: it's both, deliberately sequenced so that fluency at each level unlocks the problem-solving capacity at that level. That's what mastery teaches. It's also what our platform implements — not drill separate from application, but drill interspersed with application tasks that use exactly the facts currently being consolidated.
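That interleaving idea can be sketched in a few lines of Python. This is our own invented structure for illustration, not the platform's actual implementation: a hypothetical session builder that alternates fact drill with word problems drawn from the same set of facts currently being consolidated, so the computational load of every application task stays within the student's automatic range.

```python
import random

def build_session(facts_in_progress, n_items=6):
    """Alternate drill and application items over one fact set.

    facts_in_progress: list of (a, b) addition facts under consolidation.
    Even-numbered slots get bare drill; odd-numbered slots get a word
    problem that uses exactly the same facts.
    """
    session = []
    for i in range(n_items):
        a, b = random.choice(facts_in_progress)
        if i % 2 == 0:
            session.append(f"Drill: {a} + {b} = ?")
        else:
            session.append(
                f"Problem: Sam has {a} apples and is given {b} more. "
                "How many does Sam have now?"
            )
    return session

for item in build_session([(6, 7), (7, 8), (8, 8)]):
    print(item)
```

The design choice the sketch illustrates is the key one: the word problems are generated from the drill set, never from facts the student has yet to automatise, so working memory is spent on problem structure rather than computation.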

A Caveat on the Research

None of the evidence described above is definitive. Educational research is methodologically challenging, and the "what works" literature in primary maths education is smaller and more contested than its advocates sometimes admit. The EEF's evidence ratings are useful heuristics, not scientific laws.

Teachers who've developed effective practice through experience, reflection, and observation of their own students have knowledge that no meta-analysis can capture. The research should inform practice, not replace professional judgment. The most useful role for evidence is to challenge assumptions — including the assumption that "the way I was taught to teach" is the best available approach — not to issue instructions.