School Data – more, more, more? Part 2

September 5, 2019
By: Chris Beeden, Educational Data Consultant, School Data Managed Ltd

In this series of blogs, Chris Beeden from School Data Managed Ltd, discusses the use of data in schools, from humble beginnings to today’s measures and systems. Click here to read part 1.

Ofsted – Progress of current Pupils

In 2015 this bomb was dropped on pupil data:

  • ‘Throughout each year group and across the curriculum, including in English and mathematics, current pupils make substantial and sustained progress, developing excellent knowledge, understanding and skills, considering their different starting points.’
  • ‘The progress across the curriculum of disadvantaged pupils and pupils who have special educational needs and/or disabilities currently on roll matches or is improving towards that of other pupils with the same starting points.’


When Mr Wilshaw announced that schools' performance would be measured by the progress of pupils currently at the school, rather than that of leavers, it felt fair.  From 2015 schools had to record and show the progress of pupils in every year group and every subject, split into lots of subgroups, however small.  This came at a time when subjects were being reformed and 'no levels' was being forced on schools.

Assessments should be simple: a test of knowledge producing a score which you can, if needed, convert to a grade. But comments like these started to appear:

‘You can’t record a D because last term they were a D and they need to have made expected progress in the term’

‘This pupil’s grade cannot be less than their FFT target’

‘This pupil was a 3 at Key Stage 1 so cannot be developing in year 5’

‘These pupils are not making expected national progress’

‘What is expected national progress?’

‘2.32 points per year, and they made 2.24 points progress’.

Teachers were, and still are, being asked to make an assessment and then to record an outcome based not on that assessment but on one taken three years earlier.  (If you want to know about Key Stage 1 issues, compare Bedford Borough's Key Stage 1 results from 2014 with its Key Stage 2 results from 2018 – that is for another blog!)
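For what it's worth, the 'points per year' arithmetic behind that exchange can be sketched in a few lines. The point scores and the 2.32 expected rate below are the figures quoted in the conversation, used purely for illustration – they are not official DfE constants.

```python
# Illustrative sketch of the "expected national progress" arithmetic
# quoted above: average points gained per year since a baseline
# assessment, compared against an expected annual rate.

def points_per_year(start_points, current_points, years):
    """Average points gained per year since the baseline assessment."""
    return (current_points - start_points) / years

# A pupil assessed three years ago at 21 points, now at 27.72 points:
rate = points_per_year(21.0, 27.72, 3)
expected = 2.32

print(f"{rate:.2f} points per year")  # 2.24
print("expected progress met" if rate >= expected
      else "below expected national progress")
```

The absurdity the quotes capture is exactly this: a pupil 0.08 points per year short of an arbitrary rate is flagged as failing to make "expected national progress".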

I understand the need to amend the Inspection framework, but it should have been amended, not reformulated into a new one.  The current framework is not that bad, and the change has cost – and will continue to cost – the sector a lot of money that it does not have.  Saying to schools "Don't prepare or change practice for inspections" is like telling pupils not to revise, or to study the wrong text for the exam.  I have been involved in about 20 inspections in the last 8 years (they are tough, but a million times better since "educational people" started leading them).

Just a note though: Ofsted has said it won't look at internal data, and that frees up data to be used as it should be. But outcomes have moved from the fourth section to the first section of the new framework, so outcomes from September will be more important than ever before. Schools will need to analyse their outcomes thoroughly and be careful to complete the Performance Tables checking exercise in June.

Reformed Assessments – No levels

With systems now in place, 4G internet available, educational experts advising lots of assessments and Ofsted wanting to see progress data, schools were then asked to remove levels. Systems were developed to record individual units, and this took pupil data collection to another level.  In the confusion, schools created their own 'no level' systems, all while stakeholders continued to demand plenty of data. The removal of levels seemed a good idea, but really it is madness. Look at the performance tables. Primary schools are judged by the percentage meeting the expected standard. At GCSE, the Basics measure is the percentage achieving at or above a set grade in English and maths.  You do not get onto a sixth form course with a Progress 8 score of 0.3; you get on the course with a grade 6. Schools were advised not to use levels, but at the same time they are judged by levels. When you give feedback on a pupil in year 4, you need something everyone (the parent, the CEO, etc.) will understand.


Progress 8

Then there is the move to grouping subjects' progress together. Subjects are different and have different levels of difficulty; it is stupid to try to make them all the same. Pupils need courses that are more accessible – just don't then judge them all together!  ECDL is a great course but cannot be compared to GCSE Music; the decline in ICT courses is a crazy result of such a system.  GCSE French is different from Combined Science.  The move by SISRA to show subject progress was fantastic, but it is mad that the large DfE data team churn out all the ASP and IDSR analysis yet you cannot find any subject progress apart from maths. Thank you SISRA for creating the Subject Progress Index in 2018.  When doing the summer analysis there are three questions to answer: how did we do overall; how did each subject do; how did our vulnerable groups do? The ASP only answers one of these.
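To see why Progress 8 hides subject differences, here is a sketch of the calculation as the DfE publishes it: a pupil's Attainment 8 total (English and maths double-weighted across ten slots) is compared to the national estimate for their KS2 starting point and the difference is divided by ten. The estimate figure below is hypothetical, not a real DfE prior-attainment estimate.

```python
# Sketch of the Attainment 8 / Progress 8 calculation: every subject,
# whatever its difficulty, is folded into one averaged number.

def attainment8(english, maths, ebacc_slots, open_slots):
    """English and maths count double; three EBacc and three open slots."""
    return 2 * english + 2 * maths + sum(ebacc_slots) + sum(open_slots)

def progress8(a8_actual, a8_estimated):
    """Difference from the estimate, averaged over the ten slots."""
    return (a8_actual - a8_estimated) / 10

a8 = attainment8(english=6, maths=5, ebacc_slots=[5, 4, 4],
                 open_slots=[6, 5, 3])           # grades on the 9-1 scale
p8 = progress8(a8, a8_estimated=46.0)            # hypothetical estimate

print(a8)            # 49
print(round(p8, 2))  # 0.3
```

Note how a strong grade 6 in one subject and a weak grade 3 in another simply average out: the single 0.3 score tells you nothing about which subjects performed, which is exactly why subject-level analysis like SISRA's matters.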

The way forward

Maybe we need to go back to where we were in the early 2000s, but use the new technology better and let schools do what they think best for the pupil and the other stakeholders – i.e. regular low-stakes assessments, with an assessment score converted, at most twice a year, into an overall level for parents, HoDs, HoYs, SLT, governors, MAT boards, CEOs, Ofsted and RSCs.  The overall can be a 'working at' level or an end-of-course estimate. Moderation is important for these, and/or the use of standardised assessments such as GL Assessments, which help check the validity of any data.

The overall level is then analysed against benchmarks.  These can be set using pupils' prior attainment and subgroup information. Let's use attainment benchmarks and move away from progress numberwangs. When setting targets, prior attainment can be used, but I would avoid other subgroup information – I do not like the fact that FFT estimates are gender specific.  Us boys are way behind but need high expectations. These benchmarks should be set with the pupil. When judging schools, compare contextualised outcomes against actual outcomes.
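A minimal sketch of the attainment-benchmark idea above: give each pupil a benchmark grade from prior attainment alone, then report attainment against it. The prior-attainment bands and benchmark grades here are invented for illustration – they are not FFT or DfE values.

```python
# Attainment benchmarks instead of progress scores: each pupil gets a
# benchmark grade for their prior-attainment band, and the analysis is
# simply "at or above benchmark, or not".

BENCHMARKS = {"low": 3, "middle": 5, "high": 7}  # hypothetical grades

pupils = [
    {"name": "A", "prior": "high",   "grade": 7},
    {"name": "B", "prior": "middle", "grade": 4},
    {"name": "C", "prior": "low",    "grade": 4},
]

at_or_above = [p for p in pupils
               if p["grade"] >= BENCHMARKS[p["prior"]]]

print(f"{len(at_or_above)}/{len(pupils)} at or above benchmark")  # 2/3
```

The output is something every stakeholder can read – a grade against a benchmark – rather than a decimal progress score that needs a technical guide to interpret.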

Chris Beeden

Chris runs School Data Managed Limited, established three years ago. Previously, he was a school data manager for over a decade, after working for Capita SIMS in a local support unit.  He is a timetabler, has a working knowledge of assessments from Early Years to Post 16, supports school census completion, and is a qualified GDPR Practitioner and practising DPO.

Chris tweets from @ChrisBeeden