Are you giving your governing body accurate data? Does it allow them to see the full picture? As a governor, I am often presented with a snapshot of the latest assessment, but many of us have full-time jobs, and for most these are not within education. So, does a small snapshot of your current assessment give them enough information to challenge? That is, after all, the role of the governing body, isn't it?
It's increasingly frustrating to hear of schools using old measures to communicate their headline data, and of others bandying Progress 8 figures around as though they are accurate. It is difficult enough for school staff to understand that P8 will not be accurate until the validated results for the national cohort are released, so imagine how hard it is for a governor with no education background. I do believe Progress 8 has a place, but it is just not suitable for whole-school accountability. For example, it can be used to rank students and investigate areas of weakness for intervention, and to look at the gaps between groups of students over time.
So, if we can't share old measures or the new Progress 8 measure, what information can school staff give their governing body? I've been thinking about this a lot recently and thought I'd share a few of my ideas. The examples below use data based on 'Working At' grades, but they would be just as powerful if you are using 'Predictions'.
Threshold in English and maths
These charts offer a very visual picture of school performance at this point. In this case, I've compared them to the school targets to give a better indication of how far the school is from expectations.
The more data-savvy governors will probably be champing at the bit for more detailed information – this example shows the key groups within a cohort and how each of these groups is performing against the basic measure. This gives governors an opportunity to see the strengths and weaknesses within your school in this key headline measure.
Is this still enough data, though? From this you would deduce that the one student from 'White and any other background' is a key concern and intervention is required. However, if you add the targets in, you can see that this student has not been targeted to achieve the threshold measure – another conversation altogether!
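The threshold comparison above can be sketched in a few lines of code. This is purely illustrative: the pupil groups, grades, targets and the grade-4 threshold are all invented for the example, not taken from any real school data.

```python
# Illustrative sketch: % of each pupil group achieving a threshold (assumed
# here to be grade 4+) in BOTH English and maths, for working-at grades and
# for targets. All names and numbers are invented for this example.

students = [
    # (group, working-at English, working-at maths, target English, target maths)
    ("Pupil Premium", 4, 3, 4, 4),
    ("Pupil Premium", 5, 5, 5, 5),
    ("White and any other background", 3, 3, 3, 3),
    ("Non Pupil Premium", 6, 5, 6, 6),
    ("Non Pupil Premium", 4, 4, 5, 5),
]

def threshold_pct(rows, eng_idx, maths_idx, threshold=4):
    """Percentage of rows meeting the threshold in both English and maths."""
    hit = sum(1 for r in rows if r[eng_idx] >= threshold and r[maths_idx] >= threshold)
    return round(100 * hit / len(rows), 1)

for group in sorted({s[0] for s in students}):
    rows = [s for s in students if s[0] == group]
    current = threshold_pct(rows, 1, 2)   # working-at grades
    target = threshold_pct(rows, 3, 4)    # target grades
    print(f"{group}: working-at {current}% vs target {target}% (n={len(rows)})")
```

Note how the invented 'White and any other background' student shows 0% against a target of 0% – exactly the "not targeted to achieve the threshold" conversation, rather than an intervention alarm.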
Below are some other charts that give a good indication of the cohort performance:
You'll notice that I included the Progress 8 chart, despite saying that it shouldn't be used as a figure. Provided that the targets and assessments both use the same A8 Estimates, this can be used to measure the gap from this assessment to the target, as long as a little caution is taken over the actual number being produced! That leads me nicely on to how Progress 8 can be presented to the governing body.
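A minimal sketch of that gap idea: if assessment grades and targets are both converted to Attainment 8 points against the same set of A8 estimates, the difference between the two P8-style figures is a meaningful gap even though neither number will match the eventual validated P8. The students, points and estimates below are invented for illustration.

```python
# Hypothetical sketch: assessment-vs-target gap using the same A8 estimates.
# A per-student P8-style score is (Attainment 8 points - A8 estimate) / 10,
# and the cohort figure is the mean. All numbers below are invented.

students = [
    # (name, A8 points from this assessment, A8 points from targets, A8 estimate)
    ("Student A", 48.0, 52.0, 50.0),
    ("Student B", 60.0, 58.0, 55.0),
    ("Student C", 35.0, 42.0, 44.0),
]

def p8_style_score(a8_points, a8_estimate):
    """Per-student Progress 8-style score against a fixed estimate."""
    return (a8_points - a8_estimate) / 10

assessment_p8 = sum(p8_style_score(a, est) for _, a, _, est in students) / len(students)
target_p8 = sum(p8_style_score(t, est) for _, _, t, est in students) / len(students)

print(f"Assessment P8-style figure: {assessment_p8:+.2f}")
print(f"Target P8-style figure:     {target_p8:+.2f}")
print(f"Gap to target:              {assessment_p8 - target_p8:+.2f}")
```

Because both figures lean on the same estimates, the gap survives even when the estimates themselves turn out to be out of date.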
This report allows the governors to see that there have been no dips in overall performance, but they can still see that there is some way to go from the target as they have the charts to reference.
Another question that I've been asking myself is whether it is appropriate to give qualification-level data to the governing body. My conclusion is 'yes', as it allows them to question the data in more detail. If you are using pathways in your analysis and tracking, then being able to combine that with your gap analysis is going to give a clear indication of where subjects are having success with a particular group of students. This information can then be used to inform teaching & learning across other subject areas.
Governors will be able to see that English have the greatest percentage of pupil premium students who are below where they should be at this point in time. You could take it one step further and look at the bottom 5 qualifications against targets. This is an example of the English Language Spring assessments vs the KS4 Targets – by presenting this, it is easy to see the numbers of students and how far away they are.
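The subject-level gap analysis described above can be sketched as a simple ranking. Again, the subjects, grades and targets here are invented purely to show the shape of the analysis.

```python
# Illustrative sketch: for each qualification, the percentage of pupil
# premium students whose working-at grade is below their target, ranked
# worst-first. All records below are invented for this example.

records = [
    # (subject, pupil premium?, working-at grade, target grade)
    ("English Language", True, 3, 5),
    ("English Language", True, 4, 4),
    ("English Language", True, 3, 4),
    ("Maths", True, 4, 5),
    ("Maths", True, 5, 5),
    ("History", True, 6, 5),
]

def pp_below_target(rows):
    """% of pupil premium rows where the working-at grade is below target."""
    pp = [r for r in rows if r[1]]
    if not pp:
        return 0.0
    below = sum(1 for r in pp if r[2] < r[3])
    return round(100 * below / len(pp), 1)

ranked = sorted(
    ((pp_below_target([r for r in records if r[0] == s]), s)
     for s in {r[0] for r in records}),
    reverse=True,
)
for pct, subject in ranked:
    print(f"{subject}: {pct}% of pupil premium students below target")
```

In this made-up data, English Language tops the list, which is the kind of picture that lets governors ask "what is happening in English, and what can other subjects learn from the ones doing better?"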
As a governor myself, this is the type of information that I would hope to be presented with at meetings, but it’s still very bitty. I would suggest putting together a governor return so that the information is consistent at each meeting and the comparable data is readily available for questioning!
If your analysis tool allows you to set up bespoke permissions, you could consider giving your governors access (and an assessment timetable!). You will need to train them fully on how to use the system, and this will take more than one session. But allowing governors to interrogate the data for themselves, and come to meetings with questions at the ready, will lead to a more productive meeting and keep it to time.
For SISRA Analytics customers, keep an eye out for our Governor Training Pack that is coming soon!