“…Day after day, day after day,
We stuck, nor breath nor motion;
As idle as a painted ship
Upon a painted ocean.
“Water, water, everywhere,
And all the boards did shrink.
Water, water, everywhere,
Nor any drop to drink…”
Samuel Taylor Coleridge,
The Rime of the Ancient Mariner, 1797–98.
“Data, data, everywhere,
But try as hard they might,
Data, data, everywhere,
Nor any real insight…”
Schools could use data to determine whether and to what extent
students have learned what they need to succeed.
In 2005, the world created 150 billion gigabytes of data. By 2011, it was 1,200 billion. Global data volume is now growing tenfold every five years. McKinsey labels big data ‘the next great frontier for innovation’. I have said before on this blog that big data will affect teaching: the consultancy Cairneagle predicts here that advanced analytics will change student assessment. David McCandless offers some colourful analogies for the advent of big data:
“I’ve been working as a data journalist for about a year and there’s a phrase I keep hearing all the time, which is this: ‘Data is the new oil’: data is a ubiquitous resource that we can shape to provide new insights and innovations and it’s something that can be mined very easily. But I would perhaps adapt this metaphor slightly and say that ‘Data is the new soil’. For me it feels like it’s a fertile creative medium: over the years online we’ve laid down huge amount of information, irrigated networks and connectivity, a landscape worked and tilled by unpaid workers. It feels like a fertile medium, and data visualisations and infographics are flowers blooming from this medium. If you start working and playing with it, interesting patterns can be revealed”.
A couple of things got me interested in data in teaching. The first was reading Teaching as Leadership by Teach For America’s Steven Farr before I started teaching. ‘More successful teachers,’ he says, ‘are nearly obsessive progress trackers. Tracking begins with a list of objectives students need to master and the tracking system indicates the extent to which each student has mastered those objectives over time. Students are motivated by the clear and transparent display and communication of their progress towards their goals.’ There must, I thought, be something to this business of tracking.
The second was hearing how my school managed to close the gap between poorer pupils and their wealthier counterparts to 1% in GCSE attainment, from over 25% nationally. They used a data tracking and intervention program called ‘Going For Gold’. The whole point of this program, it was agreed, was that it wasn’t just intervening for kids with labels like G&T. It expected everyone in the year to be in an intervention program, so no one felt singled out; every kid was motivated to improve. And it got great results.
Recently, I’ve been impressed by the clarity of the publications of the North Star schools network. Paul Bambrick-Santoyo has presided over dramatic achievement gains at the schools he leads in Newark, New Jersey, and has helped around 1,000 schools serving half a million children make similar gains in New York City, Chicago, New Orleans, Oakland, and elsewhere around the US. He condenses the most powerful levers of leadership into cultural levers on student ethos, staff culture, and coaching, and instructional levers on data, observations, planning and training. These make sense to me. So I’m writing a series of posts on each one on this blog. This phrase from his book Leverage Leadership particularly struck me:
“After spending ten years observing schools, I am convinced that data-driven instruction is the single most effective use of a school leader’s time”.
This got me reading an earlier book he’d written, Driven by Data. Here is my summary of his main points.
Data-driven instruction is the idea that schools should focus on a few simple questions: how much are our students learning? How can we ensure our students learn what they need to succeed? Bambrick-Santoyo identifies the main pitfalls that schools stumble into in using data: inferior, secretive or infrequent interim assessments; a disconnect between curriculum and assessment, or between teaching and analysis; delayed results, ineffective follow-up, and insufficient time for data. All of these blight the chances of schools using data effectively to improve student achievement. He also sets out commonly travelled but distracting false paths that end in failure: the pursuit of total buy-in (a waste of time), reliance on poorly implemented ‘professional learning communities’ (often not focused enough), and year-end assessment analysis (like an autopsy, too late to make an impact). Instead, he distils four critical drivers of school success in the data arena.
1. Assessment: Create rigorous interim assessments that provide meaningful data. Assessments are not the end of the teaching and learning process; they’re the starting point; not the destination, but the roadmap and milestones.
2. Analysis: Examine the results of assessments to identify the causes of strengths and shortcomings. Run data analysis meetings starting with the questions: ‘So what’s the data telling you? Why did so many struggle with this question? What errors did they make?’ User-friendly data reports combine question-level, objective-level, individual and whole-class data succinctly on one page.
3. Action: Assessment is useless until it affects instruction. Convert analysis into action plans that adjust teaching effectively towards what students most need to learn. Fast turnaround of feedback is a priority. Engaged students know the end goal, how they did, and what actions they are taking to improve.
4. Application: Create a conducive climate of professional development, with induction and ongoing training in the principles, and an implementation calendar for assessment design, analysis meetings, action planning meetings and re-teaching time. Rather than a lecture format for CPD, use an act-reflect-frame-apply model.
Data is no panacea. A cautionary tale from Tessa Matthews tells us that it can become a millstone round our necks: all pressure, no insight. Since Samuel Taylor Coleridge’s epic ballad 215 years ago, the idea of an “albatross around the neck” has become idiomatic: a heavy burden of guilt, an obstacle to success. Data doesn’t have to be an albatross round our necks. If we use it wisely, it can drive rather than drag on school improvement.
The problem with data? Politicisation, both locally and nationally. Over the years it has become clear to me (in my professional field) that data generated for one purpose doesn’t easily lend itself to analysis for another (related, but different) purpose. Design the data collection process for the analysis you have in mind, and then don’t try to use that data for another purpose. For example, formative assessment processes (designed to inform near-future teaching) are not suitable for assessing teacher performance, partly because the secondary purpose will distort the data and render it much less useful for its primary purpose. Find another way to assess teacher performance. Data generation and collection need to be made as focused and ‘pure’ as possible, with politicisation reduced to nothing if it can be. Use of flawed data, or flawed data collection and processing, is worse than no data at all, because it gives a flawed picture.
Thanks for your thoughts, Joe.
I was involved in a headship appointment recently (external adviser to the panel of govs) and when asked about use of data one of the candidates said, ‘I sometimes worry that many schools are data-rich but analysis-poor,’ and then went on to explain what he meant by that. I thought it was a good line!
This blog really resonates with me. When I visit other schools as an SLE for science, it is nearly always the case that the tracking of student progress is not fit for purpose. Science is a particularly difficult subject to manage because so many different courses are run in each school. Frequently students will move from one course to another, and some may have grades ‘banked’ while studying for a second grade. It is very easy to get lost in meaningless numbers. When the tracking is right, the subject management becomes much simpler. The tracking process itself becomes a powerful tool to identify areas of concern and really support development for all.