Author: Emily Kirkby

Life at SISRA HQ with Nathan, Senior Support

Our support team answer your queries on a daily basis, so we thought you might be interested to hear what they have to say about life at SISRA HQ! Our third Q&A session is with Nathan Coyle, Senior Support.


Q: What were you doing before you joined SISRA?

A: I moved from Ireland to Liverpool in 2011 to study. After finishing my degree in Business Management, I worked full-time at Marks & Spencer in Liverpool for a few years, in the food hall – the discount there was a great perk! I’ve now been at SISRA for two and a half years.

Q: What’s your favourite thing about your job?

A: The staff here at SISRA, who made it really easy to settle in. In terms of the role, we speak to lots of different people on live chat, so I really enjoy the opportunity to get out and meet you all at our training days, DataMeets and other events.

Q: Any memorable moments from working on the support team you’d like to share?

A: There isn’t one that stands out in particular, but we do often get some funny comments on chat that keep us entertained. We gave ourselves festive names over the Christmas period and that seemed to encourage a few amusing comments that got a good laugh from the support team.

Q: What is the greatest challenge the support team face on a day-to-day basis?

A: Probably balancing the workload. We’re quite a small team, and chats can often take up the majority of the day, so updating guides, creating videos and other tasks can often be tough to get through. Always plenty to do!

Q: Tell us a fun fact about yourself

A: My brother (who is 4 years older than me) refused to call me Nathan after I was born and called me Gary instead. He even told all of his school teachers he had a new brother called Gary! This continued for months until he was bought two goldfish, one of which he could name Gary (the other was named Paul).

by Nathan Coyle, Senior Support

The power of together

I’ve spent a lot of time over the last few years working with school leaders, civil servants and others on the Progress 8 measure, hoping to improve everyone’s understanding of how it works, what we can and cannot infer from it and what its pitfalls are.

No single performance measure is ever going to be perfect, but Progress 8 has a lot more going for it than its predecessor, 5+ A*-C including English and maths. For example, its inclusive nature, with every pupil’s grades contributing something to the school’s performance and every grade counting, is a much better reflection of the moral purpose behind school leadership. So is the fact that schools with different levels of prior attainment now have a chance to do well; it’s been great seeing that schools of all types can get very high scores under the new measure.

Those of us who have been involved with performance measures for some time anticipated some of the problems there would be with Progress 8. When contextual value added (CVA) was the headline measure, schools found that pupils with very low scores affected overall results disproportionately; a single pupil who leaves with next to no qualifications can carry a negative score extreme enough to drag a whole cohort’s average down noticeably. This has proved to be true with Progress 8 too, and the DfE has committed to working with the profession to address it in its most recent Statement of Intent.

Another problem was the long delay between the publication of exam results in August and schools finding out what this meant for their performance. As progress measures are relative and depend on the performance of all other schools, schools have to wait until the DfE tells them how well they’ve done.

Or do they?

Unlike CVA, Progress 8 does not have complex statistical modelling underpinning it; it’s based on the simple average of the results of students with the same Key Stage 2 test score. All you need is each child’s results matched to prior attainment – no other information enters the calculation. You cannot know the true national averages until everyone’s results are in but, crucially, you can estimate them quite accurately if you have enough results, and this was the driving force behind the collaboration between ASCL, SISRA and well over a thousand schools this year. With around 180,000 students’ results, the estimates this collaboration produced matched the figures published by the DfE several weeks later impressively closely, which was incredibly helpful to schools – click here to read Graeme Smith’s blog post to see why.
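To make the arithmetic concrete, here is a minimal sketch in Python of the calculation just described. It is an illustration under simplifying assumptions, not the DfE’s exact methodology: the function names and data layout are invented, pupils are grouped on a single prior attainment value, and the detailed rules for building each pupil’s Attainment 8 score from eight subject slots are left out.

```python
# Minimal sketch of the Progress 8 arithmetic (illustrative only).
# Each pupil is a (ks2_prior, attainment8) pair; the real measure
# uses fine-graded KS2 scores and detailed subject-slot rules.

from collections import defaultdict
from statistics import mean

def estimate_benchmarks(pupils):
    """Average Attainment 8 for each prior attainment group, estimated
    from a large pooled sample (such as the ~180,000 results in the
    collaboration) rather than the full national dataset."""
    groups = defaultdict(list)
    for ks2_prior, attainment8 in pupils:
        groups[ks2_prior].append(attainment8)
    return {prior: mean(scores) for prior, scores in groups.items()}

def school_progress8(school_pupils, benchmarks):
    """School Progress 8: the mean of each pupil's Attainment 8 minus
    the average for pupils with the same prior attainment."""
    return mean(a8 - benchmarks[prior] for prior, a8 in school_pupils)

# Toy numbers purely for illustration.
pooled = [(4.5, 52.0), (4.5, 48.0), (5.0, 60.0), (5.0, 56.0)]
school = [(4.5, 55.0), (5.0, 57.0)]
print(school_progress8(school, estimate_benchmarks(pooled)))  # 2.0
```

The point of the sketch is the second function: once the per-group averages are known, or well estimated from a large enough pool, a school’s Progress 8 is nothing more than a mean of differences.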

But why stop there? As powerful as it was, the collaboration between schools in 2017 could be just the tip of the iceberg. We can (and I hope we will) do the same again in 2018, but we should not feel limited to the calculation of early estimates of DfE measures. If that is all we do, we will be missing a golden opportunity.

For me the really important and deeply reassuring lesson from this collaboration is that schools and their leaders will gladly work together when they see purpose and value in doing so. We now have a chance to reclaim accountability so that it is properly focussed on what it needs to be: our students. Let’s work together to do just that.


Duncan Baldwin, ASCL Deputy Director of Policy

Life at SISRA HQ with Heather, Senior Support

Our support team answer your queries on a daily basis, so we thought you might be interested to hear what they have to say about life at SISRA HQ! Our second Q&A session is with Heather Smyth, Senior Support.


Q: What were you doing before you joined SISRA?

A: Before SISRA I worked as a bar and restaurant supervisor in a few different places. I lived in London while studying for two years, then moved back up to the north of England and worked in Frankie & Benny’s before joining the Support team at SISRA!

Q: How have you found working with schools?

A: I love working with schools, as I feel like I’m really making a difference. Although we don’t work directly with teachers or students, it’s nice to think that we’re helping in the background to improve the education of students all around the world.

Q: Have you noticed any changes in chats over the years?

A: The main changes in chats have been driven by the big changes in Analytics and Observe, or in education itself. For example, when I first started in 2014, Progress 8 was only just being introduced! Recently, the majority of our chats have been about setting up EAP mode or KS5 Legacy to calculate the new performance measures, such as L3 Value Added.

Q: What do you enjoy the most about being on the support team?

A: That’s a tricky one – there are so many things I love about this job! I think the main one would be how well everyone gets along with each other at SISRA. Everyone in the company is so friendly and helpful, and we’re all quite close on the Support team, so it’s lovely to spend the majority of the week with my friends.

Q: Have you had any particularly difficult chats?

A: I wouldn’t say there have been difficult chats as such, but we do enjoy unusual queries, when a school is looking for a particular figure in the Reports or trying to set up something in a specific way. We always enjoy the challenge of these, as it allows us all to help each other to think outside the box and come up with the best solution available.

Q: Can you tell us something unusual about you?

A: Some people do think that I’ve given my dog quite an unusual name. She’s a Dachshund/Chihuahua cross called Pam!

by Heather Smyth, Senior Support

How does data influence Ofsted judgements?

So, what better thing to do on a Sunday evening than to write some comments about data and the like! There has been a lot of talk of late about the reliability and validity of attempting to predict outcomes, and the Ofsted School Inspection Update (March 2017) wrote…

’As inspectors, we can help schools by not asking them during inspections to provide predictions for cohorts about to take tests and examinations. It is impossible to do so with any accuracy until after the tests and examinations have been taken, so we should not put schools under any pressure to do so – it’s meaningless. Much better to ask schools how they have assessed whether pupils are making the kind of progress they should in their studies and if not, what their teachers have been doing to support them to better achievement.’

I understand the sentiment of this statement and to some extent support it, especially when one looks at the reliability and accuracy of some Ofsted judgements made prior to results actually being published! I am very much aware that a school should not be judged on outcomes alone, but it is interesting to note the percentage of schools that have been judged as good (based on predictions) and then to observe their Progress 8 scores, as illustrated in the diagram below. All the green dots represent schools judged as ‘good’ as of January 2017.

There is such a wide range of outcomes here, yet some schools whose outcomes look, on the face of it, ‘inadequate’, and conversely some whose outcomes look ‘outstanding’, have all been labelled as ‘good’.

So where does the data influence judgements, and what do Ofsted mean by ‘assessed whether pupils are making the kind of progress they should in their studies’? Is this an objective process or a subjective one? I have seen first-hand how some inspectors have made judgements about ‘the quality of teaching and learning’ (and standards) because a child had drawn in their English book…and it wasn’t a very good drawing! So how can we make this process more objective? I would suggest considering and reflecting on the notion of ‘assessment without levels’, mapping the curriculum to the programme of study from year 7 through to year 11. This could then be matched to what a child should know, understand and be able to do in each year, backed up with evidence of testing. Taken together, these measures would allow for pretty objective formative assessments.

These are testing times: because of the changes in the examination system, schools really do not know what the outcomes will be, apart from the fact (as announced by Ofqual) that a certain percentage of students will obtain each grade, to some extent regardless of what the standard is, as illustrated in this diagram:
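Setting the diagram aside, the mechanism can be illustrated with a toy sketch in Python. The grade shares below are invented for illustration, not Ofqual’s actual figures; the point is simply that when boundaries are set on the rank order of marks, fixed proportions of the cohort receive each grade, broadly regardless of where the raw standard sits.

```python
# Toy comparable-outcomes style awarding (the shares below are
# invented, not Ofqual's real figures): grade boundaries follow the
# rank order of raw marks, so a fixed share of the cohort receives
# each grade, broadly regardless of the raw standard.

def award_by_proportion(marks, shares):
    """shares: list of (grade, proportion) pairs from the top grade
    down, with proportions summing to 1. Returns {student: grade}."""
    order = sorted(range(len(marks)), key=lambda i: marks[i], reverse=True)
    grades, pos = {}, 0
    for grade, share in shares:
        take = round(share * len(marks))
        for i in order[pos:pos + take]:
            grades[i] = grade
        pos += take
    for i in order[pos:]:  # mop up any rounding remainder
        grades[i] = shares[-1][0]
    return grades

marks = [72, 65, 64, 58, 51, 47, 40, 33, 21, 10]
shares = [("9", 0.1), ("7", 0.2), ("5", 0.4), ("3", 0.3)]  # invented
print(award_by_proportion(marks, shares))
# {0: '9', 1: '7', 2: '7', 3: '5', 4: '5', 5: '5', 6: '5', 7: '3', ...}
```

Add ten marks to every script and the grades awarded do not change at all, because the rank order is unchanged; that is the sense in which the percentages hold regardless of the standard.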

The new EAP area in SISRA Analytics is allowing me (for the first time ever) to see whether my cohort is on track, regardless of which year they are in, by matching teacher assessments to where students should be in any particular year group or at any particular time (subject to how a school decides to assess, i.e. current/predicted/end of year/different KS3 and KS4 grading systems etc.). However, I do feel the SISRA developers have been as flexible as possible in creating a system that caters for most (if not all) methods, which is pretty impressive. I suppose Ofsted inspectors can then observe what is going on in the classroom and in books, and look to triangulate those observations with some objective assessments, as illustrated by the data, in whatever year group they like (mind you, that’s what they were meant to have done before, wasn’t it? But how then does that account for the wide range of ‘goods’ based on the outcomes above?).
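Stripped right down, an ‘on track’ check of this general kind can be sketched in a few lines. This is a hypothetical illustration only, not SISRA’s EAP implementation, and every name in it is invented: it simply interpolates an expected grade between a Year 7 starting point and a Year 11 target, then compares the latest teacher assessment against that expected point.

```python
# Hypothetical sketch of an "on track" check (not SISRA's actual EAP
# logic): interpolate an expected grade along a linear flightpath
# from Year 7 to Year 11, then compare the latest assessment to it.

def expected_grade(start, target, year):
    """Expected grade in the given year, assuming steady progress
    from the Year 7 baseline to the Year 11 target."""
    return start + (target - start) * (year - 7) / 4

def on_track(current, start, target, year):
    return current >= expected_grade(start, target, year)

# Toy example: baseline grade 2.0, GCSE target 7.0, pupil in Year 9
# currently assessed at grade 4.0.
print(expected_grade(2.0, 7.0, 9))  # 4.5
print(on_track(4.0, 2.0, 7.0, 9))   # False -> slightly behind
```

In practice, as the author notes, the expected position would depend on how the school assesses (current versus predicted grades, different KS3 and KS4 grading systems, and so on), which is exactly the flexibility a real system has to cater for.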

by Nigel Sheppard, Deputy Head Teacher, Horndean Technology College

How to train and engage teaching staff in the use of in-school data

I am guessing that some of you may work in schools where data is used by every member of staff, and is the basis of effective intervention and strategic planning. However, some of you may work in schools where data is only looked at by the Head, or perhaps the wider SLT, and maybe a few eager beavers around the school. Are you struggling to get your staff engaged, and to really see the benefit of using your student data in the wider context of the school? In this article I will take a look at some of the negative attitudes to data and data analysis that I have seen, and how they can be changed through good, well-planned and targeted training.

Attitudes to data

Let’s have a look at some of the attitudes I have come across towards in-school data:

“I don’t know what it’s used for”

“It’s just paperwork and admin that gets in the way of teaching”

“I’m not a ‘data person’”

“I haven’t time to look at it”

“I am here to teach, not produce data”

“Data is boring.”

What do we want these attitudes towards data to become?

“My assessments inform intervention and action”

“Data supports my teaching”

“It is intuitive and accessible”

“It is interesting because it tells me so much about my class, my qualifications or my year group”

“I don’t produce data, I use it.”

In order to do this, we need to show staff that the analysis of their data is not just a box-ticking exercise. We need to ensure that they know what the data can do for them, and what value it holds for them.

Time your training carefully

How do we start? Well, through working with schools in my capacity as a data consultant for SISRA, I have learned that it is easiest to engage staff when they are shown recent, relevant data about the actual children they teach.

Data is worth nothing on its own. Its value comes when it is considered together with the back story. When looking at these numbers and charts, we need to remember that this is all about children, about students and about lives. When teachers are able to relate the data to the students that they know, that they teach, they understand the point. It also needs to be fresh data. There is no point trying to interest a teacher in the situation last term, or last summer. You have to work with fresh assessments so that something can actually be done with the knowledge the staff take away with them, and so that the exercise of training the staff can immediately lead to impact in the classroom.

For example, when the History teacher sees that Courtney, who they see as a problematic student, is currently achieving about a grade higher in their qualification than she is on average across all her subjects, or conversely that Kurt, who they thought was doing just fine, is working at over a grade lower in their qualification than he is overall, they start to get interested.

When that same History teacher sees in black and white (or possibly red) the gap between the students on pupil premium and the students that are not on pupil premium in their class, they start to ask questions.

I have led staff training in schools where I have had to stop talking at a certain point and let the teachers run with it for a while. While I want to move them on to the next report, they want to have a good discussion with their neighbour about the students in their class, about what the data is telling them about those students, and about what they’re going to do about it.

If the teachers are looking at old data, or data about students they don’t know, they are not going to be engaged. Where’s the ‘hook’? What’s the point?

So, make sure that you set the training dates just after an assessment point, when the data the staff are looking at is fresh and relevant.

Make it interactive

This also leads into another point, which is to make it interactive. This is the only way to ensure the staff you are training are looking at data that is completely relevant to them. Make sure you have enough computers for everyone: either hold the session in an ICT suite, or ask everyone to bring their laptops. Each member of staff needs to have access to their own data on your chosen analysis system and to be able to look at their own students. This way, they can all look at the same information at the same time, but each teacher can look at it for their own qualification, class, or year group. There’s no point in all of them sitting looking at a whiteboard showing reports about the Maths department; they are going to be way more engaged if they are looking at their own department or class. Also, don’t make it a ‘speech’. Once you’ve shown them what can be done, set them tasks, give them quizzes, and create discussion.

Target your sessions carefully

The next trick with training staff is to train small groups, not the whole staff at once, and to tailor each training session to a specific group of staff. Let’s have a look at who these groups could be:

  • SLT
  • Subject Leaders
  • Class Teachers
  • Teaching Assistants
  • Inclusion/SEN
  • Heads of Year
  • Form Tutors
  • Support Staff
  • Governors

Each group only needs to see the information that is relevant to their role. Tailoring training to their specific needs will avoid your trainees losing interest, ‘drifting off’ and checking their email. Run a variety of training sessions to cover your entire staff, but do it in role-specific groups.

Continue the discussion

Now, at this point all our staff are trained in how to analyse their data and how it can be useful for them, so we can sit back, relax and wait for it all to lead to school improvement. Right? Wrong. As with any learning, if the learners don’t actively practise what they have learned, they will promptly forget it. And it would be oh so easy for everyone to forget about actually using the data and carry on with the way things were, entering assessment grades into the MIS once per half term and forgetting about them. So, the next job is to make sure the information resulting from the data is being accessed, discussed and used; that it leads to intervention, to differentiation, and has an impact in the classroom. Otherwise, there is no point in collecting it in the first place.

This is where what I call the ‘Informing Improvement’ plan comes into play. This is going to involve allowing ‘faculty time’, but if we are all in agreement that the point of collecting the assessments is to lead to intervention and impact in the classroom, then it is time well spent. What I am proposing is a regular programme of discussion and action after each assessment point.

Informing Improvement!

Following each assessment point, at each departmental meeting, your school’s data analysis system should be opened up and shown on a whiteboard, and the teachers teaching a particular qualification should discuss, together with the Head of Department, the data they can see and what it tells them. Discussion could be centred around a short questionnaire, but the point of the meeting would be to identify problem areas and formulate a plan of action and intervention. Heads of Department would then take this intervention plan to their SLT link for agreement.

Similarly, Heads of Year could also meet with tutors to discuss any students who are falling behind across all or several of their qualifications. They can then decide on some suggestions for pastoral support for those students, which again should then be agreed with the relevant SLT member.

The last ‘pathway’ on the plan is the SLT themselves. It would be the SLT’s job to look at any high-risk groups in their school and check progress, formulating a plan for student support where relevant and necessary.

So it is important to embed the discussion about data into your staff meetings, ensuring that the data going in is being accessed and is leading to action. Put together, the aim of all this is to ensure that each and every student is learning and progressing.

Ongoing support

Lastly, we can’t assume that, just because they came to your training, all of the staff are going to remember how to find the information they need. There needs to be an ongoing programme of support to ensure that nobody is left struggling.

I visited a school recently where the SLT data lead is doing this very well, using SISRA Analytics as her data analysis system. Firstly, she regularly goes along to staff meetings and presents ‘what Analytics can do’ in 5-minute bitesize chunks. Alternatively, after an assessment has been uploaded, you could let staff know it’s there via email, and at the same time give a ‘hot tip’: which report to look at, which filter to use, something specific to investigate. Next, she has set up ‘Data Friday’, a regular drop-in support session – she very sensibly bribes staff to come with the offer of food and drink. Lastly, she has a prize of sweets for the most regular user of Analytics every month!

You can use your own strategies, but the aim is to make sure the teachers know what they’re doing, or if they don’t, they know where to find help.


The collection of student data is only worthwhile if it leads to something. The data needs to lead to information, which leads to insight, which leads to impact. A cliché perhaps, but true. In order for this to happen, your staff need to understand it and know how to use it. To help you make sure your staff are engaged with your in-school data you could:

  • Choose a time for your training when you have fresh, relevant data
  • Make it interactive, so each teacher has access to their own data
  • Do it in small, role-specific groups
  • Continue the discussion, making sure the data is leading to action and impact
  • Ensure that targeted ongoing support is offered

by Becky St.John, Principal Consultant