Three Assessment Butterflies

Winston Churchill once said ‘success is stumbling from failure to failure without losing enthusiasm.’

Looking back on assessment in our first year at Michaela, I can now see what I was blind to then: we stumbled and blundered. What mistakes did we make, and how did we stumble?

We spent hours marking. We spent ages inputting data. And we didn’t design assessments cumulatively.

  1. Marking

First mistake: we spent exorbitant amounts of time in the first year on marking, in particular English and History essays and paragraphs. We wrote comments, we set targets, we tried individualised icons, we corrected misspellings, we corrected grammatical errors, we judged and scored written accuracy, we wrote and shared rubrics with pupils. We spent hours every week on this. Over the year, we must have spent hundreds of hours on it.

The hidden pitfall of marking is opportunity cost. Every hour that a teacher spends marking is an hour they can’t spend on renewable resourcing: resourcing that endures for years. Marking a book is useful for one pupil, once; creating a knowledge organiser is useful for every pupil (and every teacher) that ever uses it. Marking is a hornet; knowledge organisers are a butterfly. Hornets are high-effort, low-impact; butterflies are high-impact, low-effort. We had been blind to just how badly the hornet’s nest of marking was stinging us. So we cut marking altogether: we no longer mark at all.

[Image: MarkingOrganisers.png]

  2. Data

Our second mistake: we spent far too much time in the first few years on data input. We typed in multiple scores for pupils that we didn’t use. Preoccupied with progress, we thought we needed as many numbers as we could get our hands on. But the simplistic equation of ‘more data, better progress’ didn’t hold up under scrutiny. Every teacher typed in multiple scores for each assessment, which were then collated so we could analyse the breakdowns. We were deluged with data, but thirsting for insight. There was far too much data to possibly act on. My muddled thinking left us mired in mediocrity, and we had invested hundreds of hours for little long-term impact.

What we realised is this: data must serve teachers, rather than teachers serving data. Our axiom now is that we must only collect data that we use. There’s no point in drowning in data, or killing ourselves to input data that we don’t use.

  3. Design

Our third mistake was this: we had forgotten about forgetting. We designed end-of-unit assessments that tested what pupils had only just learned, then congratulated ourselves when they did well whilst it was still fresh in their memory. We had pupils write essays just after they had finished the unit. We coached them to superb performances – but they were performances that they would not be able to repeat on that text in English or that period of History even a few weeks later. Months later, they certainly wouldn’t stand a chance. Just as I would flunk my Physics GCSE badly if I had to retake it tomorrow, so our pupils, just one year on, would flunk the exact assessment that they had aced a year earlier.

With hindsight, these three mistakes – on marking, data and design – helped us realise our two great blind spots in assessment: workload and memory. We didn’t design our assessments with pupils’ memory and teachers’ workload in mind.

We were creating unnecessary and unhelpful workload for teachers that prevented them from focusing on what matters most. Marking and data were meant to improve teaching and assessment, but teaching and assessment had ended up being inhibited by them.

We were forgetting just how much our pupils were forgetting. Forgetting is a huge problem amongst pupils and a huge blind spot in teaching. If pupils have forgotten the Shakespeare play they were studying last year, can they really be said to have learned it properly? What if they can’t remember the causes or course of the war they studied last year in History? Learning counts for nothing if it’s all forgotten.

The Battle of the Bridge

Assessment is the bridge between teaching and learning. There’s always a teaching-learning gap: just because we’ve taught it doesn’t mean pupils have learned it. The best teachers close the teaching-learning gap so that their pupils learn – and remember rather than forget – what they are being taught. We’ve found the idea of assessment as a bridge to be a useful analogy for curriculum and exam design. Once you see assessment as a bridge, you can begin to ask new questions that generate new insights: what principles in teaching are equivalent to the laws of physics that underpin the engineering and construction of the bridge? How can we design and create a bridge that is built to endure? How can we create an assessment model that bridges the teaching-learning gap?

We’ve found two assessment solutions with exciting potential. Here’s why I’m excited about them:

  • They have absolutely no cost.
  • They are low-effort for staff to create.
  • They have high impact on pupils’ learning.
  • They are not tech-dependent at all.
  • They are based on decades of scientific research.
  • They can be immediately implemented by any teacher on Monday morning.
  • They have stood the test of time over the years: we’ll still be using them in three, six and even ten years’ time, and beyond.

In short: no cost, low effort, high impact, research-based, long-term solutions.

Two of the most effective assessment tools we’ve found for closing the teaching-learning gap are daily recaps and knowledge exams.

Over 100 years of scientific research evidence suggests that the testing effect has a powerful impact on remembering and forgetting. If pupils are to remember and learn what we teach them in the subject curriculum, assessment must be cumulative and revisit curriculum content. The teaching-learning gap gets worse if pupils forget what they’ve learned. As cognitive science has shown, ‘if nothing has been retained in long-term memory, nothing has been learned’. Assessment, by ensuring pupils revisit what they’re learning, can help ensure they remember it.

Pupils forget very swiftly. We use daily recaps and knowledge tests to boost pupils’ long-term memory retention and prevent forgetting.

  1. Daily recaps

Daily recaps are a butterfly: low-effort, high-impact. Departments create recap questions for every single lesson, and every single lesson starts with a recap. They are easy to resource, and they consolidate pupils’ learning so they don’t forget. Every day, pupils spend up to 20 minutes in each lesson applying what they’ve learned before. In English, for example, we spend those 20 minutes on grammar recaps, spelling recaps, vocabulary recaps and literature recaps (with questions on characters, themes, plots, devices and context). We do recaps on the unit pupils have been studying over the last few weeks. We do recaps on the previous unit and on previous years’ units. This daily habit builds very strong retention and motivation: pupils see how much they are remembering and how much more they are learning than ever before. All recaps are open questions, and weaker forms might be given clues. The recaps are always written; they are no-stakes, with no data collected; and they give instant feedback, as they are swiftly marked, corrected and improved by pupils themselves. Afterwards, we ask: ‘hands up who got 4 out of 5? Hands up who got 5 out of 5, 100%?’ Pupils achieving 100% feel successful and motivated to work hard at revising.


  2. Knowledge Tests

Knowledge tests are another butterfly: high-impact, low-effort. What I love about our knowledge tests is that they are cumulative, so pupils revise and remember what they’ve learned. We set GCSE-style exams for depth, and we set knowledge exams to test a much fuller breadth of the knowledge pupils have learned. Knowledge exams are 35-question papers that take 60 minutes to complete. They are beautifully simple: each is organised onto one sheet of A4, and pupils can answer on one double-sided sheet of A4. The breadth we can achieve with these exams is staggering. By Year 9, we have three knowledge tests per subject, organising 35 questions on what pupils learned in Year 7 and 35 questions on what they learned in Year 8, centred on those years’ knowledge organisers. Pupils are challenged to revise and remember what they’ve learned over all the years they have spent in secondary school. I am willing to bet that many of our teachers could not beat even our Year 7 pupils on these exams across all subjects! The result: pages packed with answers for every pupil in the school. The humble knowledge test is a great catcher of knowledge.

As for marking them? We simply sort them into three piles: excellent, pass and fail. We don’t even record the marks. Teachers just note the names of pupils who failed multiple knowledge tests so we know who’s struggled.

Knowledge tests solve the breadth-depth tradeoff of exams. They give pupils maximum practice with minimum marking burden on teachers.

[Image: 3AssmntButterflies.png]

Simplicity must cut through assessment complexity. We should practise what we preach about cognitive overload, for teachers as well as pupils. Assessment resources must be renewable, replicable, sustainable, scalable and enduring for the long term.

And the impact of recaps and knowledge tests? Well, it’s very early days yet, but we’ve had some (very weak) Y8 or Y9 pupils miss an entire term through unavoidable long-term illness, only to return fully remembering what they’d been taught the previous term and previous year. It’s an early indicator that the assessment strategy is bridging the teaching-learning gap and overcoming the savage forgetting curve. The real test of its impact will be GCSE results in 2019, A-level results in 2021, and university access and graduation beyond.

Blind, still

The two blind spots we’ve discovered – memory and workload – provide us with ways of interrogating our teaching and assessment practice:

  • How much are pupils remembering?
  • Where are they forgetting?
  • Where are teachers overloaded?

And I still think that we can do more and find better ways of creating assessments with memory and workload in mind. I’m sure our pupils are not yet remembering as much as we’d like them to. Just this week I had a conversation with Jonny Porter, our Head of Humanities, about ramping up the previous-unit daily recaps we do. In this sense, even at Michaela we still feel blind to the blind spot of memory: pupils are still forgetting some of what we are teaching, and we want them to remember what they are learning for the very long term. Our ambition is that they retain what we’ve taught for years to come: for five, ten, twenty years.

Every day, teachers and pupils at Michaela see the words on the wall: ‘success is never final; failure never fatal; it’s the courage that counts.’ It takes courage to radically simplify assessment – and courage to continually confront our workload and memory blind spots.

About Joe Kirby

School leader and education writer; Director of Education and co-founder, Athena Learning Trust; Deputy Head and co-founder, Michaela Community School; English teacher.

9 Responses to Three Assessment Butterflies

  1. Excellent Joe, I am trying to get my science department to buy into this, especially designing assessments that are cumulative across year groups. Do you have an example of these knowledge assessments that give so much breadth?

  2. Andy McHugh says:

    I would love not to have to mark. There are so many better ways to give feedback that increase the joy instead of sucking the joy out of teaching. Any tips on selling this idea to SLT?

  3. Elaine Taylor says:

    Spot on Joe. I’ve been shouting the message about assessment being not fit for purpose for years now. Sadly, too many and too narrow accountability measures have resulted in the tail wagging the dog. Assessment is the driver, not the guard.

  4. Pingback: Three Assessment Butterflies – THIS Education

  5. Gareth Lewis says:

    Thanks Joe. A really informative blog. I have shared via Thiseducationblog. I hope to use some of this in my teaching. Gareth.

  6. OK – this is very clear. In primary (particularly with the little ones) marking doesn’t necessarily have to take a long time (you can actually just use a red pen and put a tick or cross). It is the comments that take a long time (I think). Or thinking that you can write down instructions/feedback that will transform a child who writes in poorly punctuated, simplistic sentences into one who uses complex sentences and great vocabulary.

    But I had not quite thought about how we prioritise accuracy over frequency of feedback. We’d rather have all the books/quizzes accurately marked (and then the children have to wait several days or weeks to get the results…and the feedback) than have the books less accurately marked and the children get feedback the next lesson.

    Interestingly, lots of teachers start marking by putting the books into two piles – good/not good. Then you go ahead and mark the work.

    Next thought – I would really like to hear a bit more on why you think the focus should be on identifying the struggling children. Not because I disagree but because I want to know why you think so. Might give me some food for thought.

    Finally, I’m really curious about how old-fashioned teachers used to mark. I’ve looked through some old books and found that there was a lot less marking, yet pupils’ work was much more accurate (fewer spelling/grammar mistakes and much neater handwriting). Some of the work is clearly just paraphrased from an encyclopedia (or maybe what the teacher wrote on the board)…and I don’t necessarily think this is wrong. But even in the ‘free’ writing the pupils’ work is better than that of pupils today – who get much more …stuff written in their exercise books.
    A mystery that I am still investigating…

  7. Pingback: Are teachers satisfied with their pay? Plus two other findings from Teacher Tapp Ghana! - Teacher Tapp

  8. Pingback: Golden needles in a haystack: Assessment CPD trove #4 | Joe Kirby
