We have news of two new podcasts released this month. The first is the Thinking Deeply about Primary Education podcast which features Richard Selfridge and James Pembroke discussing all things assessment with host Kieran Mackle. It's a long listen packed with help and advice and we hope that you find it useful. We've had a lot of good feedback already, as you can see below:
The second podcast is the return of the Databusters for the 2022-23 academic year, as we discuss the current picture for Statutory Assessment as well as looking at discussions around assessing the wider curriculum.
In other news, Richard Selfridge has authored an online 'Using Data Effectively to Support Pupil Development' course for TES which is available now. Part of a series of online leadership courses aimed at aspiring, middle and senior leaders, the course is suitable for both primary and secondary schools and further details are available here.
With the new academic year upon us, I delivered a session entitled ‘How and Why to Dataproof Your School’ at ResearchEd’s National Conference in London on 3rd September. My talk summarised the key message from Dataproof Your School: Take control of your data, maximise effectiveness and minimise workload.
Having worked with MATs, primaries and secondary schools on developing the strategic use of data, I am pleased to see that my work with Driver Youth Trust - most recently writing DYT’s Literacy Difficulties Framework based on the work we have done with schools over the last ten years - ties neatly into a clear summary of the effective use of data in school: Build pictures to identify where changes need to be made and make those changes effectively.
My work with schools on data and SEND is underpinned by the EEF’s excellent ‘Putting Evidence to Work – A School’s Guide to Implementation’ guidance. Making changes which are sustained over the long term is complex but eminently possible with expert guidance and support.
As an independent advisor with wide ranging experience working with schools in a variety of contexts, I am always open to helping schools who are looking for strategic support to effect change. If that sounds like you or someone you know, please do get in touch.
In the meantime, the Databusters Podcast marches on and we are looking forward to welcoming a series of guests onto the podcast over the course of the coming year. In addition, the Databusters will be out on tour so if you’d like to host us or to attend one of our events, please do get in touch.
Finally, Sage Publishing are offering 30% off both Databusting for Schools and Dataproof Your School with the code RESED30 at www.sagepub.co.uk until 1 October 2022:
We’re pleased to announce that Dataproof Your School was published in February and has been generating a great deal of positive comment. As well as lots of Twitter feedback, the Amazon ratings are looking good too…
We discuss the book on the most recent Databusters Podcast, and here’s an outline of the chapters, which gives you a good sense of the book’s contents:
1 An Introduction to Using Data in School
2 A Licence to Change
3 Generating and Collating High-Quality Data
4 Standardised Tests
5 Teacher Assessment
7 Tracking Systems
8 Developing a Data Strategy
9 Dataproofing in Action
Amazon’s preview function gives you an opportunity to read the first few pages, and the book is available in all good book stores and online via Wordery, Waterstones and Sage Publications.
We've been working remotely since March 23rd, when the UK went into lockdown due to Covid-19. Luckily, we have a book to write - Dataproof Your School, which is due to be published by Sage Publications in Spring 2021. Developed by Richard Selfridge and James Pembroke, we have been road-testing concepts throughout 2019-20, developing a clear guide to how to take control of your education data.
Two Databusters podcasts have been recorded during lockdown, What Just Happened? and Staying Alert. Join Richard and James as we mull over the latest issues in education data.
While many schools are working hard simply to keep ahead of the latest advice and regulations, there is clearly a huge opportunity within the current crisis. Many schools are using this time to consider how they will develop their assessment systems for the time when all pupils are back in school, and Databusting for Schools is helping various organisations develop data strategies so that they can be confident they are driving their data use rather than being driven by it.
If you are interested in finding out what we could do for you, please get in touch.
Launching The Databusters, crunching numbers on SEND and more data literacy
I’m extremely pleased to be able to announce that James Pembroke, data guru extraordinaire, and I have launched The Databusters – a new venture supporting teachers and schools in their drive to develop data literacy and to improve the use of data in the school system. James has unparalleled knowledge of all things statutory data and I bring my teaching experience and knowledge of testing and statistics; together we hope to keep on working on our joint mission to demystify and improve the use of data in the school system.
We’re currently preparing a series of training events around the UK and have recently recorded and published the first in our new monthly podcasts: Databusters. Episode 1 looks at the revised Ofsted Education Inspection Framework, developments to the Inspection Data Summary Report and the first in our ‘What everyone needs to know about’ series, which looks at standardised scores.
The podcast is currently available on Spotify and Anchor, and will be rolling out onto other platforms over the coming weeks.
As part of my work with the Driver Youth Trust, this summer Karen Wespieser and I looked closely at SEND statistics and what these mean for the school system. We presented our findings at the Hallam Festival of Education, ResearchEd Rugby and the Festival of Education and I have contributed a chapter to the forthcoming researchED Guide to SEND.
In November, I am in Manchester and Central London running sessions on Understanding and using standardised assessment and standardised scores to support teaching and learning for Rising Stars – places are booking up, so put it in your diary and I hope to see you there.
If you'd like to know more about Databusting for Schools or The Databusters, please get in touch.
A new book, crunching numbers on SEND and developing data literacy throughout Europe
Feedback from those who have read Databusting for Schools and those who have attended sessions I have hosted since its publication has led to the development of a new book for those who simply want to know what they should be doing with their data. Due in 2020, Dataproof Your School will provide answers to the pressing questions schools ask about collecting and using information.
As part of my work with the Driver Youth Trust, I have been looking closely at SEND statistics and what these mean for the school system. I'll be presenting my findings alongside Karen Wespieser at the Hallam Festival of Education, ResearchEd Rugby and the Festival of Education.
In addition, I am presenting my session on Assessment 101: 10 Things You Should Know About Assessing Children at ResearchEd Rugby, and I am part of a panel hosted by CEM at the Festival of Education, Measuring Progress in Education - The Good, the Bad and the Future.
Looking into Autumn 2019, I am in Manchester and Central London running sessions on Understanding and using standardised assessment and standardised scores to support teaching and learning for Rising Stars, as part of a programme which includes training by Becky St John and James Pembroke.
In the meantime, I am looking forward to the first international conference for Databusting for Schools, ResearchEd Haninge in March 2020.
Databusting for Schools, Summer 2019:
Hallam Festival of Education: 14 June
ResearchEd Rugby: 15 June
Festival of Education: 20 June and 21 June
If you'd like to know more about Databusting for Schools, please get in touch.
I’m hugely excited about the forthcoming Education Show, at ExCeL London on Thursday and Friday 24th and 25th January, where I’ll be chairing the Policy in Practice theatre, one of three different strands of the UK’s go-to annual education gathering. This is the first time I’m going to be able to attend BETT, too, the UK’s premier EdTech show, which this year runs alongside the Education Show.
In another first, the Education Show has moved to London for the first time in 28 years, and tickets – which are free to those working in education – are available here. Whilst I’m looking forward to hearing from the speakers I’m introducing – who include Laura McInerney, Drew Povey and Sherry Coutu – I’m also looking forward to the opportunity to see what is new and innovative from the show’s many exhibitors.
There are two further strands in the Education Show’s programme of speakers, with both a School Business and a Pedagogy in Practice series of presentations. These strands include the incomparable Becky Allen, as well as award-winning teachers, researchers and those at the heart of education policy.
Trade shows can often be overwhelming in my experience, and careful planning is the key to getting the most out of them. Both the Education Show and Bett make that easy, of course, by including extensive details on their websites, which are up and running with all the details you might possibly need.
One of the joys of education gatherings like the Education Shows is the opportunity to stumble across things of which you were previously blissfully unaware, and with the sheer number of speakers and exhibitors at ExCeL, I’m looking forward to discovering the delights in store.
A New Year always brings thoughts for the future, and now is a great time to take the opportunity to explore the challenges and inspirations which the next year will bring; events such as the Education Show and Bett pack a huge amount into a short space of time, making them an extremely efficient way to take a fresh look at what you are currently doing, and to consider your next move…
Register for your free tickets here, and I look forward to seeing you at the Education Show!
Where and when:
Hall S4, ExCeL London, London E16 1XL
Thu 24th Jan, 10:00 - 18:00
Fri 25th Jan, 10:00 - 17:00
Several schools have asked about the interaction between ‘Scaled Scores’ (as reported following End of Key Stage tests) and Standardised Scores (as reported by standardised tests such as those provided by Rising Stars, GL Assessment, CEM or NFER). Whilst these scores look similar, most people in school are by now aware that they are actually quite different. Given the superficial similarity between scores reported as a Scaled Score of ‘100’, say, and a Standardised Score of ‘100’, it isn’t hard to see why.
I wrote a blog for CEM which explains how Standardised Scores are created. As I said in the blog, ‘For nationally standardised tests, a mean and standard deviation based on a representative sample of the population will give an indication of a student’s position within the national population of those taking the test.’ This means that a Standardised Score of 100 tells you that the underlying test score is the same as the mean score on the test recorded by a reasonable national sample of those taking the test.
Standardised Scores have various limitations, but in principle they are effective when it comes to ranking children against a reference group of children. They do not, however, give any information about the performance of children against a set of standards.
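To make the mechanics concrete, here is a minimal Python sketch of how a standardised score is derived from a raw score and how it ranks a child against the reference group. The mean-100/SD-15 scale is the convention described above; the function names are mine, and the normal-distribution assumption is the usual one behind standardisation, not a guarantee for any particular test.

```python
from statistics import NormalDist

def standardise(raw, ref_mean, ref_sd):
    """Map a raw score onto the conventional scale: mean 100, SD 15."""
    return 100 + 15 * (raw - ref_mean) / ref_sd

def national_percentile(standardised_score):
    """Approximate position within the reference population, assuming
    standardised scores are roughly normally distributed."""
    return 100 * NormalDist(100, 15).cdf(standardised_score)

# A child scoring exactly the reference-group mean gets 100,
# which sits at the 50th percentile of the national sample.
print(standardise(32, ref_mean=32, ref_sd=8))   # 100.0
print(round(national_percentile(100)))           # 50
```

A score one reference-group standard deviation above the mean becomes 115, placing a child at roughly the 84th percentile – which is exactly the ranking-against-a-reference-group use the paragraph above describes.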
Scaled Scores as reported for Key Stage 2 are used to place children on a scale from 80 to 120. These scores are intended to provide a numerical indication of children’s performance, largely so that a Value Added calculation can be made to be used within the government’s accountability structure for schools.
A panel of experts is convened to designate the raw test score which is deemed to indicate ‘the expected standard’. Anything above is higher than the expected standard; anything below is not. The raw scores are then converted onto an 80 to 120 scale, where 100 is the ‘expected standard’. The tables for the most recent KS2 Scaled Score conversions can be found here.
The government has muddied the waters a little more – or made things easier to understand, depending on your view – by introducing the terms ‘working towards the expected standard’, ‘working at the expected standard’ and ‘working at greater depth than the expected standard’. Any score between 80 and 99 is ‘working towards the expected standard’, between 100 and 109 is ‘working at the expected standard’ and a score from 110 to 120 is ‘working at greater depth than the expected standard’.
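Those three bands can be expressed as a simple lookup. This is an illustrative sketch (the function name is mine) using the cut-offs just described:

```python
def ks2_band(scaled_score):
    """Label a KS2 scaled score (80-120) with its reported standard."""
    if not 80 <= scaled_score <= 120:
        raise ValueError("KS2 scaled scores run from 80 to 120")
    if scaled_score < 100:
        return "working towards the expected standard"
    if scaled_score < 110:
        return "working at the expected standard"
    return "working at greater depth than the expected standard"

print(ks2_band(99))   # working towards the expected standard
print(ks2_band(100))  # working at the expected standard
print(ks2_band(110))  # working at greater depth than the expected standard
```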
How Scaled Scores and Standardised Scores interact
This is where some interpretation of the two different types of scores is necessary. Head teacher Michael Tidd notes that standardised tests which report using Standardised Scores are different to statutory End of Key Stage tests which report Scaled Scores, saying that “while only 50% of children can score over 100 on the standardised test, around ¾ can – and do – on the statutory tests.”
As Michael notes, “Scoring 95 on one year’s standardised test is no more an indicator of SATs success than England winning a match this year means they’ll win the World Cup next year.”
We do have some data which helps to understand the interaction between Scaled Scores and Standardised Scores. Data analyst James Pembroke has produced blogs on converting the 2017 and 2018 KS2 scaled scores to standardised scores, the latest of which suggests that a Standardised Score between 90 (most generous) and 95 (least generous) is – very roughly – likely to be similar to a Scaled Score of 100.
Rising Stars (producers of PIRA, PUMA and GAPS tests) suggest a Standardised Score of 94 and above indicates ‘working at the expected standard/greater depth’. They also suggest that ‘Greater Depth’ is indicated by a Standardised Score of 115 and above.
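If you wanted to turn Rising Stars' suggested thresholds into a rule of thumb, it might look like the sketch below. To be clear: these cut-offs (94 and 115) are one provider's suggestion quoted above, the mapping is rough at best, and other providers' tests may behave quite differently.

```python
def likely_ks2_outcome(standardised_score):
    """Rough indication only, using Rising Stars' suggested thresholds:
    94+ ~ working at the expected standard, 115+ ~ greater depth.
    Not a prediction of statutory test results."""
    if standardised_score >= 115:
        return "working at greater depth"
    if standardised_score >= 94:
        return "working at the expected standard"
    return "below the expected standard"

print(likely_ks2_outcome(94))   # working at the expected standard
print(likely_ks2_outcome(115))  # working at greater depth
```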
What should Databusting Schools do?
Broadly, schools should use Standardised Tests where possible to generate unbiased pupil performance data. This data can then be used (alongside the various other sources of information) in discussions about children’s development. Administering Standardised Tests should generally be done in Year 3 and above, and – unless you have a particular reason to do so – it should be done no more than once a year.
Children in each cohort should then be placed into three broad groups:
All of these groups should be expected to make good progress through good classroom teaching. Children in Group C will generally need additional targeted support, with the aim where possible of moving into Groups B/A over time.
The cut-offs for each of these groups are broadly as follows:
With interesting noises coming from Ofsted, Primary Schools are thinking more and more about what progress and attainment actually mean for their school, and what they should be doing to ensure that they have a sensible system for monitoring children’s development as they move through school.
Using Standardised Scores to generate unbiased indications of children’s relative performance, and grouping children into three broad categories each year, will help schools to build up a picture of a child’s relative performance over time. Linking Standardised Scores to the standards expected at the end of key stage is not without issues, but can help Databusting schools to direct their resources to best support the children in their care.
Following the publication of Databusting for Schools in July, I've continued to travel the country raising awareness and understanding of data in education. I held a Masterclass at the EduTech show at Olympia in London, which was titled 'An Insider's Guide to the Numbers in School'. This focused on the use of standardised scores, and the understanding of the mathematics behind these incredibly useful statistics. I also spoke at the School Data Conference, leading a workshop on effectively analysing and interpreting data within a primary school, and led a session for North Lincolnshire Primary Heads Consortium.
Exploring standardised scores, which I covered in depth in Databusting for Schools, has been well received at conferences; the presentation I did at researchEd's national conference ('Assessment 101 – Ten things everyone should know about assessing children') will be repeated at researchEd Durham later this month (details here), and I've written a blog for CEM which was published this week.
The piece for CEM highlights the insight which standardised scores can give you over and above raw test scores:
"If a student scores 65% on a test, what does this tell you? Is this mark good? Bad? Average? If it is deemed to be a good/bad/average mark, against whom is this judgement being made – the other children in a class, in a school, or similar children across the country?
These fairly obvious questions are what led to the development of Standardised Scores; numbers which not only tell you how a child performed in a test, but also give you some information as to where their score sits within the range of scores recorded by other children who have taken the same test.
So, if a child scored 65% on a test in which the average child scored 70%, their score might be reported as a standardised score of ‘95’; if the average child scored 60%, their score might be reported as ‘105’.
If you know that standardised scores are created such that the mean score is allocated a score of 100, that two in three standardised scores are between 85 and 115, and that 95% of scores are between 70 and 130, you can make much more sense of a child’s test score reported as a standardised score than you can from a test result reported as a percentage or a raw score."
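The worked example in the quoted passage can be checked with the same arithmetic. The cohort standard deviation of 15 percentage points used here is purely an illustrative assumption – it happens to make the 65%-versus-70% case come out at exactly 95, as in the quote.

```python
def standardised_score(pct, cohort_mean_pct, cohort_sd_pct=15):
    # The SD of 15 percentage points is an assumption chosen so the
    # quoted worked example comes out exactly; real tests vary.
    return 100 + 15 * (pct - cohort_mean_pct) / cohort_sd_pct

print(standardised_score(65, cohort_mean_pct=70))  # 95.0
print(standardised_score(65, cohort_mean_pct=60))  # 105.0
```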
With the growing move to understand the benefits and limitations of data in education – including extremely useful insights into the national picture from Ofsted's Towards the education inspection framework 2019 and the DfE Teacher Workload Advisory Group's Making Data Work report – I'm continuing to run sessions on the current pupil performance data landscape, looking at the recent history and future direction of the use of numerical data in schools.
Please get in touch if you have any comments, feedback or requests for further information.
Following the publication of Databusting for Schools in July, the book is proving to be extremely popular, and feedback continues to come in.
Databusting for Schools was written to make the world of education data accessible to those who might find the subject daunting, so my favourite bit of feedback so far is probably from Primary headteacher Darren Norman: 'Not dry at all, would highly recommend to leaders and governors'. That was exactly the aim for the book, and it's good to see that it is being received so well.
I spoke at the ResearchEd National Conference in September, in a session which I called 'Assessment 101 – Ten things everyone should know about assessing children'. This session took various themes covered in Databusting for Schools and laid them out for those new to the issues in psychometrics.
These kinds of introductions to assessment are proving to be very popular, as are the sessions I run on the current pupil performance data landscape, looking at the recent history and future direction of the use of numerical data in schools.
I'm speaking at a couple of public events in October, if you'd like to hear me talk.
EduTech Show, Olympia, London, October 12th (details here).
School Data Conference, London, November 7th (details here).
ResearchEd Durham, Durham, November 24th (details here).
I also wrote a piece, 'Tracking Pupil Progress Doesn't Always Mean Using Data' for Teach Primary, which you can find here.
As my Teach Primary piece concludes, "Many schools have stopped allocating dubious numbers to children, embracing standardised tests and comparative judgement instead.
Lots have embraced the idea that assessing attainment and monitoring progress are separate endeavours, and have worked hard to ensure that children are properly supported in learning those things they have not yet mastered, rather than pushed on before they have grasped the curriculum appropriate to their age.
When Sean Harford, Ofsted’s engaging national director of education, recently tweeted that tracking pupil progress “doesn’t necessarily mean ‘use data’”, the odd hissing sound you might have heard probably came from the offices of those who have been tasked with managing data in their schools, as the air slowly leaked from the tyres of their data juggernauts.
Simply put, progress happens when a child’s knowledge and understanding advances, rather than when a number has been generated. It’s about what children can do now that they couldn’t do before, not simply whether the figures have changed. That, more than anything, is progress."
Please get in touch if you have any comments, feedback or requests for further information.