Author: Emily Kirkby

DATA (SERVICE DESK) ANALYST

THE ROLE:

SISRA Limited is seeking enthusiastic and friendly individuals to join the growing Support Team, to help with the company’s continued rapid expansion within the Education Sector. The main responsibilities of a data analyst at SISRA are:

Identifying and assessing customer needs, and supporting users of the SISRA services with their data queries (via a Live Chat facility and email)
Troubleshooting setup, calculations or queries relating to the education sector / services
Creating and developing help resources, including video tutorials and guides
Providing accurate and reliable information to customers and colleagues

Experience of the SISRA services is not required as full training will be given. However, successful candidates are required to become proficient in supporting SISRA services. The learning period is lengthy and it generally takes employees 2+ months to be able to support our customer base with queries. As the role is diverse, once proficient, other opportunities within the company may be presented to you, including testing, assisting at events or hosting online training sessions.

ATTRIBUTES AND SKILLS:

Very logical train of thought and quick thinking
Excellent Maths and English language/grammar skills – you MUST have studied Maths to a good standard (at least GCSE grade C or equivalent)
Strong written communication skills
You must be computer literate and have great attention to detail
Customer orientation and ability to build rapport

Experience in a similar role is not required; this role is so unique and challenging that we welcome applicants from any kind of background.

To apply for this position please email your C.V. to jobs@sisra.com.

All applications will be treated in confidence.

Data collaboration. The year that was, and what next?

It’s just a year since I met the development team at SISRA and we discussed what seemed a rather crazy idea at the time: what if we asked schools whether they were willing to collaborate by sharing their data? If we did, we could solve one of the current headaches of school leadership: knowing, with reasonable confidence and as soon as possible, how well your school has done against the Progress 8 performance measure, without having to wait months for the official figures.

With hindsight it proved to be not that crazy.

The most affirming thing I learned from this exercise was that schools can, and will, cooperate with each other where there is both trust and a common purpose. In the end, over 1,100 schools covering more than 180,000 pupils agreed to share their data, which was anonymised at the point of collection. Schools of all types opted in: academies and maintained schools, selective and non-selective, girls’ schools, boys’ schools and mixed. When we think about the accountability climate in recent years and how it has driven competition between schools, this is incredibly reassuring.

The average line we estimated proved to be remarkably close to the initial DfE figure and even closer to the validated version. Schools reported that their collaboration figure was very close indeed to the final published score, far better than I had anticipated. It was that close because so many schools opted in; this only works when everyone takes part.

News of this collaboration has spread widely. I’ve discussed it with several civil servants at the DfE – and with representatives from the Japanese government. A senior figure from Ofsted asked me the other day whether we would be repeating it this year.

Which brings me nicely on to what we should aim for in 2018.

The need for collaboration is just as high as it was last year. With many more reformed GCSEs on offer, Attainment 8, Progress 8 and EBacc will all continue to be in flux. Our 2017 collaboration picked up a big drop in Attainment 8 because of the points allocated to legacy GCSE subjects. If I were a betting man I would have a £1 wager that Attainment 8 averages will go up in 2018, particularly for middle-ability pupils. This is because a grade C in a legacy GCSE was allocated 4 points last year, and this year about one third of equivalent students will get a 5, which will push up the average score.
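To make the arithmetic behind that wager concrete, here is a rough back-of-envelope sketch in Python. The one-third proportion and the point values (a legacy C worth 4 points, 9-1 grades scoring their face value) are the assumptions stated above, not a forecast.

    # Rough sketch of the wager above, using the assumed figures:
    # last year a legacy grade C earned 4 Attainment 8 points; this year
    # suppose a third of comparable students achieve a 5 (worth 5 points).
    old_avg = 4.0
    new_avg = (2 / 3) * 4 + (1 / 3) * 5    # ~4.33 points per former-C student
    print(round(new_avg - old_avg, 2))     # ~0.33 extra points per subject slot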

The really big change this year will be at individual subject level. We will want to know as soon as possible the transition matrices showing the percentage of pupils awarded each grade split by prior attainment so we can be clear how well individual subjects have performed, and get a stronger idea about target setting for Year 10 pupils. This is particularly true in science, where the double award has a whole new grade structure.

But we really need to understand this early in the autumn term, when evaluation of performance takes place. To look at individual subjects early enough, we need more schools to share so we can get to that finer detail. And we need to be really confident that schools are recording and sharing subject data consistently if we are to drill into this level.

For me, this collaboration between schools is rather more than just working out some averages and cells in a table. The whole basis of our accountability system is misdirected at the moment. School leaders spend so much time agonising about headline measures that it’s easy to forget the real purpose and direction of accountability – your school’s pupils and their grades. By working together in this way we have taken an important first step to recapture accountability and root it firmly in a school-led system.

The team at SISRA are keen to have me point out that SISRA administrators need to take a few extra admin steps this year to ensure that their data is ready for the extended collaboration.  This is vital not only to allow SISRA to deliver the subject-level figures schools so crave; the free additional setup checks included will also help ensure all figures are as accurate as possible.  Please don’t leave this until results day – I’m sure that time is stressful enough!

So I hope that 2018 will mean even more of you opt in to your provider’s data sharing arrangements. The more we collaborate with each other, the more we can learn, the faster we can learn it, and the better we can lead our schools.

ASCL – New to Managing Data Training

ASCL have released a brand new one-day training course that will define a data manager’s tasks and responsibilities throughout the year, and provide new data managers with the toolkit they need to carry out their role effectively.

Delegates will:

  • learn about key moments in the data calendar: what to expect, when and where to look
  • investigate options for robust and effective target setting
  • examine school performance tables and the calculation of key headline measures
  • discuss best practice for assessment tracking in a world without levels
  • learn which groups of students need to be monitored in school and why
  • understand how good data management can support and contribute to whole school improvement while considering workload

Suitable for new data managers or other support staff with involvement in the administration and/or analysis of student assessment, tracking and performance data.

For more information or to book your place click here to visit the ASCL website.

Please note, this training course is run by ASCL. If you have any queries, contact ASCL directly.

JOHN SEDDON

Prior to joining SISRA, John had a varied and interesting career path. Following a practical apprenticeship in plumbing after leaving school, his love of both computers and music led him to study music recording technology. Whilst still a personal passion, his career path deviated towards data analysis, first for a utilities maintenance organisation, and then for a leading firm of solicitors.

John started his journey into the world of education at a Manchester school in 2013, working in a small data and exams department with a focus on using SISRA Analytics for the school’s data analysis. During his time there he perfected the school’s use of SISRA, as well as becoming an adept user of SIMS for data and exams purposes.

John has developed and delivered many in-house training programmes on the use of SISRA Analytics. This experience and knowledge of the system opened the door for John to join the consultancy team at SISRA in April 2017, covering schools across North West England, Yorkshire, and Derbyshire.

The power of together

I’ve spent a lot of time over the last few years working with school leaders, civil servants and others on the Progress 8 measure, hoping to improve everyone’s understanding of how it works, what we can and cannot infer from it and what its pitfalls are.

No single performance measure is ever going to be perfect, but Progress 8 has a lot more going for it than its predecessor, 5+ A*-C including English and maths. For example, its inclusive nature, with every pupil’s grades contributing something to the school’s performance, is a much better reflection of the moral purpose behind school leadership. So is the fact that schools with different levels of prior attainment now have a chance to do well; it’s been great seeing schools of all types achieve very high scores under the new measure.

Those of us who have been involved with performance measures for some time anticipated some of the problems there would be with Progress 8. When contextual value added (CVA) was the headline measure, schools found that pupils with very low scores affected overall results disproportionately. This has proved to be true with Progress 8 and the DfE has committed to working with the profession to address this in its most recent Statement of Intent.

Another problem was the long delay between the publication of exam results in August and finding out what they meant for school performance. As progress measures are relative and depend on the performance of all other schools, schools have to wait until the DfE tells them how well they’ve done.

Or do they?

Unlike CVA, Progress 8 does not have complex statistical modelling underpinning it; it’s based on the simple average of the results of students with the same Key Stage 2 test score. All you need is each child’s results matched to prior attainment and you can calculate it without any other information. Crucially, however, you can estimate it quite accurately if you have enough results, and this was the driving force behind the collaboration between ASCL, SISRA and well over a thousand schools this year. With around 180,000 students’ results, the accuracy of this collaboration compared to the figures published by the DfE several weeks later was very impressive and incredibly helpful to schools – click here to read Graeme Smith’s blog post to see why.
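For readers who like to see the mechanics, here is a minimal Python sketch of that idea – not SISRA’s actual implementation. It assumes pooled, anonymised pupil records with hypothetical fields for school, KS2 prior-attainment group and Attainment 8 points, and applies the published Progress 8 definition: a pupil’s score is their Attainment 8 minus the average for pupils with the same prior attainment, divided by ten, and a school’s score is the mean across its pupils.

    from collections import defaultdict

    def estimate_progress8(pupils):
        # pupils: list of dicts with hypothetical keys 'school',
        # 'ks2' (prior-attainment group) and 'a8' (Attainment 8 points).

        # Step 1: the "average line" - mean Attainment 8 for each
        # KS2 group across the whole pooled dataset.
        sums, counts = defaultdict(float), defaultdict(int)
        for p in pupils:
            sums[p['ks2']] += p['a8']
            counts[p['ks2']] += 1
        avg_line = {group: sums[group] / counts[group] for group in sums}

        # Step 2: each pupil's P8 score, then the mean per school.
        by_school = defaultdict(list)
        for p in pupils:
            by_school[p['school']].append((p['a8'] - avg_line[p['ks2']]) / 10)
        return {school: sum(s) / len(s) for school, s in by_school.items()}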

But why stop there? As powerful as it was, the collaboration between schools in 2017 could be just the tip of the iceberg. We can (and I hope we will) do the same again in 2018 but we should not feel limited to the calculation of early estimates of DfE measures. If that is all we do we will be missing a golden opportunity.

For me the really important and deeply reassuring lesson from this collaboration is that schools and their leaders will gladly work together when they see purpose and value in doing so. We now have a chance to reclaim accountability so that it is properly focussed on what it needs to be, our students. Let’s work together to do just that.

Q&As with Heather Smyth, Senior Support

Our support team answer your queries on a daily basis, so we thought you might be interested to hear what they have to say about life at SISRA HQ! Our second Q&A session is with Heather Smyth, Senior Support.

Q: What were you doing before you joined SISRA?

A: Before SISRA I worked as a bar and restaurant supervisor in a few different places. I lived in London while studying for two years, then moved back up to the north of England and worked in Frankie & Benny’s before joining the Support team at SISRA!
 

Q: How have you found working with schools?

A: I love working with schools, as I feel like I’m really making a difference. Although we don’t work directly with teachers or students, it’s nice to think that we’re helping in the background to improve the education of students all around the world.
 

Q: Have you noticed any changes in chats over the years?

A: The main changes in chats have been based around the big changes in Analytics and Observe, or in education itself. For example, when I first started in 2014 Progress 8 was only just being introduced! Recently, the majority of our chats have been about setting up EAP mode or KS5 Legacy to calculate the new performance measures, such as L3 Value Added.
 

Q: What do you enjoy the most about being on the support team?

A: That’s a tricky one; there are so many things I love about this job! I think the main one would be how well everyone gets along with each other at SISRA. Everyone in the company is so friendly and helpful, and we’re all quite close on the Support team, so it’s lovely to spend the majority of the week with my friends.
 

Q: Have you had any particularly difficult chats?

A: I wouldn’t say there have been difficult chats as such, but we do enjoy unusual queries when a school is looking for a particular figure in the Reports or trying to set up something in a specific way. We always enjoy the challenge of these, as it allows us to all help each other to think outside the box and come up with the best solution available.
 

Q: Can you tell us something unusual about you?

A: Some people do think that I’ve given my dog quite an unusual name.  She’s a Dachshund/Chihuahua cross called Pam!

Q&As with Bex Heenan, Support Manager

Our support team answer your queries on a daily basis, so we thought you might be interested to hear what they have to say about life at SISRA HQ! We’re starting off with Bex Heenan, Support Manager, who has been working at SISRA for over 5 years.


 

Q: What do you love most about your role as the Support Manager at SISRA?

A: All of you! I would definitely say helping and getting to know the SISRA Admins at our schools. It’s really rewarding when we can help someone find what they’re looking for or, better yet, speed up their data analysis. The lovely comments you give us really make our day!

Q: Do you have any favourite moments from speaking to users over the years?

A: There are so many! We get to know our data managers really well, and this means some of our conversations can get a bit distracted. We end up going a bit off topic: I’ve had chats where we pretended to be pirates, discussed The Great British Bake Off and even shared some song lyrics! Not to mention, we get to see how inventive our SISRA Admins are when they change their name in the service; we’ve had Edward SISRAhands, Donald Duck, SISRA Lord and Q Branch, to name a few.

I’ve also had a few touching chats over the years with data managers who were retiring; after speaking to them for so long, it’s quite difficult! That’s definitely stuck with me.

Q: What is the greatest challenge you face in your day-to-day role?

A: Problem solving and trying to find different ways to achieve what our schools are looking for. We often get complex queries on chat and we always want to give the best possible outcome or solution. So I would say trying to think outside the box, but it’s also the most enjoyable thing – feel free to ask us anything!

Q: Are there any exciting plans for the year ahead?

A: As you probably know, this summer we launched our Data Collaboration programme and were able to produce accurate Attainment 8 estimates long before the DfE, and we’re really looking forward to building on this success. We’ve also just released the re-design of Analytics and the KS3/4 Reports, and are planning on improving the usability and friendliness of the reports still further.

Support have started working on a new project to introduce bitesize videos. They’re expected to be super short (1-2 minutes) and give quick overviews of where to find snippets of information in the KS3/4 Reports.

Observe has also been redesigned, with new features being introduced every month. We’re currently working on a new approach to our Observe user guides, which involves releasing shorter, to-the-point documents. We’ve also re-organised Help > Guides & Handouts to make it easier to navigate and to find our new resources.

Q: Tell us something unusual about you?

A: I’d have to say (and I think my team would agree) that the most unusual thing is that I will dip anything in tea (from chocolate to biscuits and even ham!).

The Science of Science

Confused about Science? There’s a strong chance you are not alone! I’ve had many discussions with schools recently in relation to the reformed Science qualifications, which affect all current cohorts.

What’s changed?

As part of the Government’s introduction of tougher and more challenging linear GCSEs, new specifications for Science came into effect in September 2016. As the content is dictated by the Government there will, to a certain extent, be a lot of similarity between exam boards and indeed between specifications from the same board.
Core and Additional Science are no more! They have been replaced by a single Combined Science qualification worth two GCSEs (a double award). Separate sciences continue to be an option as always, with 9-1 grades.

It doesn’t help that the most common course for the new Combined Science qualification is AQA’s ‘Trilogy’ (QN 601/8758/X). As Trilogy means ‘a group of three things’, you could quite easily be forgiven for thinking it might comprise three sciences. AQA also offers ‘Synergy’ (QN 601/8760/8), OCR offers ‘Gateway A’ (601/8687/2) and ‘21st Century B’ (601/8690/2), whilst Edexcel keeps it simple (hooray!) with ‘Combined Science’ (601/8765/7)!
Further details for each exam board can be found here:

http://www.aqa.org.uk/subjects/science/gcse

http://www.ocr.org.uk/qualifications/by-type/gcse-related/science/

http://qualifications.pearson.com/en/qualifications/edexcel-gcses/sciences-2016.html

But what happens to the grading system?

As Combined Science is now a double award, a 17-point grading system has been introduced. The two grades will never be more than one grade apart and (we understand) the first grade will always be the higher of the two. For example:

9-9, 9-8, 8-8, 8-7, 7-7, 7-6, 6-6, 6-5, 5-5, 5-4, 4-4, 4-3, 3-3, 3-2, 2-2, 2-1, 1-1

Grades of 1-1 to 5-5 can be awarded in the foundation tier, with the higher tier offering grades from 4-4 to 9-9. A fail grade would simply be a U, but worth two entries.
I haven’t been able to find an official explanation of the grading system, other than this document:

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/443980/gcse-science-decisions-on-conditions-and-guidance.pdf

Exam boards, such as AQA, also offer (limited) information on their websites.

What does this mean for the headline figures?

Combined Science can take up to two slots in either the EBacc or Open elements of Progress 8 where this represents the highest relevant grades achieved. One grade from this qualification can fill one slot if higher grades are achieved in other qualifications.

However, the points awarded to Combined Science are averaged – because of this it is crucial that you set this qualification up correctly in any data analysis tool. For example, grades of 6 and 5 would be averaged to 5.5 points per slot, filling either one or two slots as appropriate (a maximum of 11 points across two slots). Let’s take a look at how this will work for a current Year 11 student:



Drama does not contribute towards the student’s P8 score
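As a quick illustration of the averaging rule above, here is a small Python sketch (a hypothetical helper, not part of SISRA Analytics) that turns a double-award grade pair into the points counted per Progress 8 slot:

    def combined_science_points(grade_pair):
        # Average the two 9-1 grades of a double-award result to get the
        # points counted in each of up to two Progress 8 slots.
        first, second = grade_pair
        per_slot = (first + second) / 2
        return per_slot, per_slot * 2   # per-slot points, max across two slots

    print(combined_science_points((6, 5)))   # (5.5, 11.0) - as in the example above
    print(combined_science_points((5, 5)))   # (5.0, 10.0) - a single 5 doubled to 5-5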

How do I ensure my set up is correct in SISRA Analytics?

Firstly, you will need to set up a 9-1 Double grade method in CONFIG > KS3/4 Grade Methods. Here’s an example (sub grades are optional):

A top tip when naming your grade methods is to look at the order in which they appear in the Config area – their names and order appear in the reports, so do use appropriate grade method names (BTEC rather than D*DMP) and think about the order in which you would like them to appear (most common at the top, for example).
Secondly, when setting up an EAP, ensure that the Combined Science EAP uses the 9-1 Double grade method:

The EAP determines which grade method Analytics will use when calculating your grades data – this is just as important for all other subjects! Another top tip for naming your EAPs is to include references to their baseline and end point, e.g. ‘Avg KS2 to 9-1 Double’ if using KS2 as the baseline:

Or ‘Combined Science 9-1 Double’ if using FFT estimates:

Including the grade method in either example has the advantage of not only helping you when cloning EAPs, but also ensuring the ‘KS4 MEASURE’ column on the matching page is accurate:

Finally, to the matching page! KS3/4 EAP allows you to correctly set up Combined Science as ONE qualification (but as a double award). You should have just one (green) row using an EAP that uses the 9-1 Double grade method. It will also need to be nominated as ‘Science’ in the Special column if it is an approved qualification that counts towards the Science specific measures, such as EBacc (all the QN codes above do).

If your Science staff only award single grades (common in Year 10), these grades are effectively doubled by Analytics, e.g. if a single grade of 5 is uploaded, it will be treated as a 5-5 (providing the 9-1 Double grade method is used). This will save you time by not having to ‘double up’ the grades.

We do occasionally see Combined Science set up on two rows – as Combined Science 1 and Combined Science 2 for example.

Whilst this will not affect your Attainment 8 headlines and entries, it will have an effect on your Attainment 8 & Progress 8 elements – there’s an example further below.

How do I know my headlines are correct?

Once you’ve followed all of the above steps, take a look at your Reports. There are potentially three different set-ups for Combined Science. The first uses the correct set-up (one row for a double award qualification), the second has Combined Science set up as two different qualifications (like the old Core and Additional), and the third has it as one qualification using the 9-1 Single grade method.

This example school has a cohort of 210, and one student has a grade which could affect the Attainment 8/Progress 8 EBacc and Open elements. When looking at the EBacc and Open Attainment 8/Progress 8 elements below, we can see slight differences between the two set-ups. Only slight, but bear in mind this relates to just one student and will mean your figures do not match the DfE’s – imagine if the grades of half your cohort affected these figures!

We can see a huge difference for the EBacc Attainment 8/Progress 8 element when we look at the figures where the 9-1 Single grade method has been used – it’s now negative!

Your overall Attainment 8 and Progress 8 figures should not be affected unless you use the 9-1 single method. Checking this table in the Headlines Summary Report will give you an indication of whether you are using the correct grade method:

We’ve seen how this affects our headlines, so let’s take a look at how it would affect an individual student. In this example using the Student Headlines Report, we can see that this student’s Combined Science grades have been averaged from his total of 9 Combined Science points, with 4.5 points falling into each of the EBacc and Open baskets.

When set up as two separate qualifications, the lower Combined Science grade does not contribute to the Open basket:

Whilst in this example, his Attainment 8 score will remain the same, we can see here the difference it has made to his EBacc and Open Progress 8 scores:

What if my school tracks Biology, Chemistry and Physics grades for Combined Science?

This can still be done. There’s an example in the screenshot below. Just ensure that your Combined Science separate qualifications are set to Unapproved. This way, you will be able to analyse the data for the separate qualifications, classes, and students, without it affecting your headline figures.

For any schools still using Legacy mode, please note that it does not cater for 9-1 double grades. Setting Combined Science up as two separate qualifications will ensure the number of entries is correct; however, it will count the qualifications separately towards Attainment 8/Progress 8 calculations and the average points will not be used (just like in the example shown above).

Hopefully your own SISRA Analytics set up will be spot on, but it’s always worth checking! Our Live Support team can always assist any SISRA Administrators if help is needed.

 

Update May 2018

Guidance has today (23/5/18) been published by the DfE:
https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/693519/Combined_science_grading.pdf
https://ofqual.blog.gov.uk/2018/03/23/grading-the-new-gcse-science-qualifications/

 

Attainment 8 Estimates Data Collaboration

‘The goal is to turn data into information and information into an insight’
– Carly Fiorina

If you already use SISRA Analytics you will know what a valuable tool it can be for monitoring students’ progress, both their current tracking and their potential performance. The features to model different groups and compare with past data are also useful.

I’ve been an advocate of SISRA Analytics since I first encountered the service in 2012, and have introduced it in my last two schools as a tool for all members of staff.

Therefore, it was with great interest that my colleagues and I learned of SISRA’s collaboration plans (led by ASCL) to create their own Attainment 8 estimates. We signed up immediately, and were keen to encourage others to do the same. For a school in our position (with a low P8 score in 2016), this has been invaluable.

We have been able to demonstrate that, when compared with 1,177 other schools*, we are likely to have made substantial improvements on 2016. This gave us the confidence to review our own self-evaluation at an early stage. The shared data has been very close to the data within the checking exercise too (slightly conservative in comparison, but still closer than anything we’ve had before).

‘Without data you’re just another person with an opinion’
– W. Edwards Deming

That was true of us. We thought our progress had improved, but in this brave new world of 4s and 5s, we had no reassurance. Sharing our data with all these other schools gave credence to our opinion.

As a school we are committed to sharing our data and working with other schools where we can – this supports our value of aspiration (by sharing and learning with others we are supported to be the “best we can be”).

After the success of the 2017 opt-in I’m very much looking forward to taking part in the 2018 opt-in, which SISRA is suggesting may well stretch beyond A8/P8 data. I would strongly urge all schools to opt in once this becomes available. The bigger the data set, the more reliable the findings, and the more insight you will get into the progress of your students.

*as of October 2017

How Ransomware can cost you your data

On Friday 15th April 2016 the school email systems, the internet and SIMS went down; we then systematically lost access to the servers throughout the day. IT confirmed early on that we had been hit by a virus and that they were working on clearing it. The virus could be identified by a name it added to files, and all staff were advised to save unaffected files as soon as possible. As an aside, I mentioned to a colleague that so long as it wasn’t Ransomware we were fine. This proved to be prophetic in all the wrong ways. By the end of the day IT had confirmed that it was Ransomware and that all the servers were affected and encrypted. The attack had come in through the new BT server, which had weaker passwords than the rest of the network, via a brute force assault – a password cracker run for a month against the BT lines.

No chance of a quick recovery

On Monday, it was clear that there were significant recovery problems, as the server backups were saved onto the servers themselves and as such were compromised as well. By this point we were copying the paper standby fire registers to use for am/pm registration, and this situation was to continue for the next six weeks. All of the servers were unrecoverable and had to be rebuilt; Capita came in on contract to assist. One external backup from the previous year had been located and could be reinstalled, but it dated from early August 2015. As a consequence it would be missing any data for the current academic year, the results data and the year’s timetable.

Over the next few weeks the servers were rebuilt, SIMS was reinstalled and the only backup uploaded. This left us with a draft timetable too unstable to amend, no attendance data, and none of the current SIMS calculations for the first year of the new GCSE grade scores. Fortunately, the student and staff drives and the work in them could be recovered to a degree. The Local Authority was initially unable to return the attendance data, and we ended up having to pay for them to reverse engineer the upload; this did not come back until the start of the next academic year. The exams data could be reimported, and as we use SISRA Analytics, the collection data was secure on their website and could be recovered and dropped back into our systems.

The effect on exams and data

During those six weeks, life was not pleasant in the student office: all our systems were down, and very little could be done to rebuild the missing SIMS calculations, input the growing pile of paper registers for all year groups, or check the behaviour logs. The exams officer had to recreate approximately 4,500 individual exam entries and special circumstances, as our records for the GCSE and A level exams no longer existed. Curriculum staff were informed that SIMS was back up and running after about five weeks, though this statement omitted the level of damage to the records and led to some acrimonious exchanges when we were asked to provide attendance or behaviour data.

The timetable was frankly shot, and could not be amended in any way for the rest of the academic year without causing its total collapse. Every calculation needed in SIMS had to be rebuilt, from creating grade sets onwards. All of that year’s data had to be reimported from SISRA Analytics, and at the same time the final data collection and reports for the year had to be collated and disseminated. Timetable migration and the results day preparation also had to be completed before the end of July.

The damage sustained continued to be felt into the next academic year, with errors being found in SIMS, calculations and processes. Initial repairs had focused on the areas in use, and as new events rolled round more compromised systems were found. The class migration completed in July failed on 1st September, leading to all groups having to be recreated by hand from records.

What did we learn?

A number of lessons came from this process. External backups are essential, as are strong passwords and not opening suspicious emails. Essential data, documents and reports should be saved and exported regularly. Likewise, class membership can be exported and saved, which provides an extra level of backup when the year dates roll over. Finally, cloud based or externally hosted systems like SISRA Analytics are fantastic, because you don’t lose all your data!

Identifying Students Not On Track

My school started using SISRA Analytics approximately two years ago, initially just for external exam results. Once we saw the power of the analysis we began tracking internal assessment grades for all year groups. We also find being able to look at other datasets such as FFT estimates very useful.

The EAP area is very easy to set up and the reports allow us to easily identify students who are not on track to achieve their projected grades throughout all year groups. There is also the added bonus of being able to track progress throughout both Key Stage 3 and 4.

Our assessment policy has changed for the new academic year and I am reassured that I need to make very few changes in EAP. It really is very adaptable and flexible to schools’ needs.

We have enjoyed exploring the new functionality within the reports. The reports have improved the way we report to parents too, so they can quickly see whether students are on track to reach their end of year target.

SLT really like using the power of the Qualification Totals report which provides them with an instant overview of on track performance across all subjects. It’s really quick to drill into faculty, class or student data too.

 

Victoria Kirkwood, Data Manager, Spalding Grammar School

Using Data sets for ‘What if’ scenarios

Using SISRA Analytics for exams analysis? Check. Using it for assessments? Check. Using it to see how things might work out? Read on.

The benefits for your SLT, middle managers, and class teachers in using SISRA Analytics are well known. Hopefully they are well trained on using it to investigate and highlight under-performing groups on a regular basis. But I’d like to talk about another use for SISRA Analytics: the ‘what if’ scenarios.

In SISRA EAP, you can create up to five regular data sets in addition to ‘Exams’ and ‘Assessment’. And within ‘Assessment’, you can create as many data sets as you like (publishing up to six in a particular year). So at our schools, we often use one of these spare data sets to try things out, either with grades for all subjects, or for selected subjects. Some possible scenarios include:

  • A member of SLT is interested in how the FFT Benchmarks compare to the targets your students have.
  • The head of maths wants to analyse the results of a recent test to see if the Pupil Premium interventions have worked.
  • A set of ‘aspirational targets’ are suggested by a deputy head and they want to see how the headlines would look before committing to using them.
  • A target review has taken place, and class teachers have made some suggested changes. Before they accept the changes, SLT want to see how things would be affected.
  • A set of nationally recognised assessments are undertaken – SLT want to see how a cohort has performed using CATS predictors, MidYIS assessments etc.
  • It can be done in the KS5 Legacy area of SISRA too, to see how ALPS/Alis/Level 3 transition estimates would stack up in the sixth form.

Y11 Reports Homepage with minimum and aspirational target datasets, and also FFT Estimates

Sometimes these data sets are analysed and compared for a short time, and so you can publish them as ‘Locked’ data sets to avoid confusion for other SISRA users. You can set up a ‘Locked’ authority group in the ‘USERS’ section of SISRA Analytics to give selected users access to these locked reports.

Aspirational targets published with ‘Locked’ status

Remember to review the use of these temporary ‘what if’ data sets regularly, as keeping the vast majority of your data sets available to all users is important for data transparency.

So next time you find yourself creating complicated SIMS marksheets or Excel spreadsheets to figure something out about a set of grades, consider popping them into SISRA Analytics which can do all the heavy lifting for you. SISRA is not just for the regular data cycle, it is there for those one-off occasions too.

Hiding behind your Data? A Governor’s perspective

Are you giving your governing body accurate data? Does it allow them to see the full picture? As a governor, I am often presented with a snapshot of the latest assessment, but many of us have full time jobs, and for most these are not within education. So, does a small snapshot of your current assessment give governors enough information to challenge? That is, after all, the role of the governing body, isn’t it?

It’s increasingly frustrating to hear of schools using old measures to communicate their headline data, and of others bandying Progress 8 figures around as though they are accurate. It is difficult enough for school staff to understand that P8 is not going to be accurate until the validated results of the national cohort are released, so can you imagine how hard it is for a governor with no education background? I do believe that Progress 8 can have a place, but it is just not suitable for whole school accountability. For example, it can be used to rank students and investigate areas of weakness for intervention, and to look at the gaps between groups of students across time.

So, if we can’t share old measures or the new Progress 8 measure, what information can school staff give their governing body? I’ve been thinking about this a lot recently and thought I’d share a few of my ideas. The examples below use data based on ‘Working At’ grades, but the approach is just as powerful if you are using ‘Predictions’.

Threshold in English and maths

These charts offer a very visual picture of the school performance at this point. In this case, I’ve compared them to the school targets to give a better indication of how far they are from expectations.

The more data-savvy governors will probably be champing at the bit for more detailed information – this example shows the key groups within a cohort and how each of these groups is performing against the basic measure. This would give governors an opportunity to see the strengths and weaknesses within your school in this key headline measure.

Is this still enough data though? From this you would deduce that the one student from ‘White and any other background’ is a key concern and intervention is required. However, if you add the targets in, you can see that this student has not been targeted to achieve the threshold measures – another conversation altogether!

Below are some other charts that give a good indication of the cohort performance:

You’ll notice that I included the Progress 8 chart, despite saying that it shouldn’t be used as a figure. Provided that the targets and assessments both use the same A8 estimates, it can be used to measure the gap from this assessment to the target, so long as you don’t read too much into the actual number being produced! That leads me nicely on to how Progress 8 can be presented to the governing body.

This report allows the governors to see that there have been no dips in overall performance, but they can still see that there is some way to go from the target as they have the charts to reference.

Another question that I’ve been asking myself is whether it is appropriate to give qualification-level data to the governing body. My conclusion is ‘yes’, as it allows them to question the data in more detail. If you are using pathways in your analysis and tracking, then being able to combine that with your gap analysis is going to give a clear indication of where subjects are having success with a particular group of students. This information can then be used to inform teaching & learning across other subject areas.

Governors will be able to see that English have the greatest percentage of pupil premium students who are below where they should be at this point in time. You could take it one step further and look at the bottom 5 qualifications against targets. This is an example of the English Language Spring assessments vs the KS4 Targets – by presenting this, it is easy to see the numbers of students and how far away they are.

As a governor myself, this is the type of information that I would hope to be presented with at meetings, but it’s still very bitty. I would suggest putting together a governor return so that the information is consistent at each meeting and the comparable data is readily available for questioning!

If your analysis tool allows you to set up bespoke permissions, you could consider giving your governors access (and an assessment timetable!). You will need to train them fully on how to use the system, and this will take more than one session. But allowing governors to interrogate the data for themselves, and come to the meetings with questions at the ready, will lead to a more productive meeting and keep them to time.

For SISRA Analytics customers, keep an eye out for our Governor Training Pack that is coming soon!

Life After Levels

My school first started using SISRA Online in March 2013 before moving to SISRA Analytics in June 2014. We started using the Analytics EAP mode, which was specifically designed for Life after Levels, in July 2016.

I found EAP mode straightforward to set up, and made use of the materials available in the HELP area, such as the videos and help guides.
My school developed its own KS3 competency grading system, which was easy to set up as EAP mode allows you to create your own bespoke grade methods – ideal for our grades of Exceptional / Extending / Secure / Developing / Beginning. Previously we had to use Excel, which was time consuming and unwieldy.

We were also able to use flat lines as flightpaths, which was my school’s preference. The EAP area is very similar to Legacy mode in Analytics with regard to uploading data, so this has helped with the transition too and prevented me from having to create additional reports and marksheets in SIMS.

We also use EAP mode to analyse ‘Attitude to Learning’ grades which is excellent and has been really useful for my school.

Due to the many recent changes implemented by the DfE, greater understanding is required of data managers, but using SISRA Analytics saves time processing and publishing the analysis compared to other methods. Thank goodness for the SISRA help guides, which digest the DfE documents into palatable ‘what to do’ guidance!

I am also thankful to Jon Williams, SISRA Technical Director, for his webinars as we went through the phased releases.

Ruth Williams, Data Manager, The Minster School

Changing Data Management

We began using SISRA Analytics in September 2016. It was brought in by our new Head and we needed to be up and running with it almost immediately. We were, to be honest, slightly reluctant to change our ways with data management, as we had very effective Excel spreadsheets in place with which we were able to accurately produce all our headline figures and student analysis. However, after some SISRA training and using the guide sheets, we soon had some data in the Legacy area and could immediately see that the speed with which SISRA produces all of our analysis was going to improve our efficiency immensely.

We have found SISRA to be very user friendly in terms of data management. The help guides are very clear, and by simply working through these we have been able to set up a complete system and get into an efficient routine of using it. The support from SISRA has been great, being very quick to help with any issues, and the DataMeets have also been very useful for gaining know-how. We are extremely impressed with how SISRA keeps up to date with all national developments and delivers its upgrades exactly on time.

We have been running both the EAP and Legacy areas in tandem. This has enabled us to have different Progress 8 scenarios in place for Year 11, based on 2016 or 2017 points. It is also very quick to see other ‘What If’ situations by reconfiguring and republishing. EAP is more logical to set up, being split out into Students, KS2 and Grades, and much quicker to publish! We have up to 5 data collection points per year group, and we are now able to have unchecked data in SISRA within a couple of hours of the collection deadline. This gives teachers and Heads of Department immediate access to their data for checking purposes, easily spotting any errors, omissions and trends, before we republish the checked data.

The possibilities for analysing and viewing data in the EAP area are vast, especially now with the added dashboards and charts
– we are definitely SISRA converts!

Sue Lefley, Data Administrator, The Deepings School

How to be a great Data Manager

I stumbled into the role of data manager. I was coming to the end of a fixed-term contract and saw a position advertised for an exams officer and data manager which I thought sounded interesting. It seemed to bring together all of the skills I’d acquired in my previous roles: data, Excel, information systems, training, data quality, and data accuracy. I had no knowledge of education or the education sector, and certainly didn’t realise at the time that the data manager position existed as a vital role within all schools, which meant Google quickly became my best friend!

The role of the data manager varies from school to school; some are involved in exams, supervising cover, timetabling or MIS management. Some do all of these and more! Even the job title varies across the country: data officer, data lead, assessment officer, to name a few. In my three years as a SISRA Consultant, I’ve learnt that the pathway that leads people to becoming data managers also varies. Some come from a data and IT background in another sector, others develop into the role having worked in a school in an administrative position, and others “just end up doing it”.

However you end up in this role and whatever your experience, I strongly believe that training, a good level of communication with key staff at school, and being supported are vital in helping you to do your job well. There are lots of things that you can do yourself to make your life easier, so I’ve listed my top tips below:

Understanding Assessment Data

Make sure you understand the assessment cycle within your school:

  • What data is your school collecting?
  • Why are you collecting this data?
  • What happens with the data that is collected?

Understanding Education

  • What type of school are you working for?
  • What Key Stages does your school have?

If you have come into the role from another industry, you will come across plenty of educational acronyms and terminology. You may find it useful to keep a list of these to refer to:


 

Be Organised

  • Do you know when assessment data is due?
  • Do you know when reports are due to be sent out to parents?

Being organised is key to good data management. I used to keep a chart on my office wall of when each year group had reports due, similar to the two examples below. Collate this information and update it for each academic year. If you are also involved in other things such as exams or the census, include these key dates as well.


 

Implement Processes

I love flow charts or diagrams of steps that need to be taken when following a specific process or procedure.
When you are first involved in assessment data and reports, you may find it useful to map out the school procedure as below:

Understanding Headline data and DfE Performance tables

If you are new to education, how do you find out what figures your school is measured on?
The DfE School Performance website is a good place to start. Search for your school and view the latest headline figures. Make sure that you have an understanding of how these figures are calculated by accessing the relevant DfE guides, and sign up to receive email notifications about new releases and updates.

Training

If you’ve taken up this role without previous experience of the education system you must be trained or be able to spend sufficient time with key staff in school who can further explain to you what you need to know. If you are struggling, try to think of the things that you feel would help you to do your job more effectively and if possible, ask for additional training.

Stay up-to-date

Things in the education sector are always changing. Stay up to date by:

  • Joining forums and networks
  • Building relationships with data managers at other local schools
  • Attending SISRA DataMeets (even if you don’t use SISRA Analytics you are very welcome!)
  • Joining the Data Managers Facebook group
  • Signing up to receive DfE GOV.UK email alerts

 

And remember…

  • Take your time
  • Be accurate
  • Be inquisitive
  • Don’t be afraid to ask questions!

We’d love to hear any best practice, hints, tips or ideas from your experience of data management, so feel free to comment below!

Getting the right results (PT.3)

Welcome to my third and final blog post for this academic year. Hopefully you will have found the tips in my first post Getting Results Day Ready and second post Comparing Against Targets useful.

If you find yourself asking ‘Why don’t our headline figures match the DfE’s?’ don’t panic – here are 4 key areas to check:

1) Cohort total and Key Stage 2 data

Always check the number of pupils on roll in the DfE June Tables Checking Exercise (a letter is sent to the headteacher in May).  You should also check which pupils are classed as SEN.  Ensuring eligible pupils and their characteristics are correct can make a significant difference to the performance tables, as this data effectively determines the calculations.

Finally, check the KS2 data you have matches that supplied by the DfE.  It has been known for the DfE to have KS2 data for some students which you have never been able to track down.  Ensure the data matches exactly to 5 decimal places – the odd student with a slight difference can affect your Attainment 8 and Progress 8 figures.
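If you export both sets of KS2 fine scores (for example, to CSV), a few lines of Python can flag any mismatches for you. The UPNs and scores below are made up purely for illustration:

    # Compare your KS2 fine points with the DfE's to 5 decimal places
    # (hypothetical UPNs and scores, purely for illustration).
    school_ks2 = {'UPN001': 28.50000, 'UPN002': 29.16667}
    dfe_ks2    = {'UPN001': 28.50000, 'UPN002': 29.16666}

    for upn, ours in school_ks2.items():
        theirs = dfe_ks2.get(upn)
        if theirs is None or round(ours, 5) != round(theirs, 5):
            print(f'{upn}: school {ours} vs DfE {theirs}')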

Remember, exam results are based on the cohort at the time of the January Census – you can ask for certain pupils to be removed if they meet specific criteria, e.g. ‘admitted from abroad with English not first language’ or ‘permanently left England’.  Guidance can be found in the documentation on the Tables Checking website: https://tableschecking.education.gov.uk

2) Non-curricular subjects 

 It’s good practice to always check with your exams officer whether any students are sitting exams for non-curricular subjects.  At the school in which I worked, we often entered pupils for examinations in their home language.  These can make a significant difference to your school’s headline figures.  Below we can see the Attainment 8, Progress 8 and EBacc data for our school, compared against targets:

 

The above data excludes the 6 home language grades which were all B grades.

The above data includes the 6 home language grades which were all B grades.

I also found that the odd student took Grade 6, 7, or 8 music qualifications. Whilst these exams were sat externally, they are still included in the performance tables. A Grade 6 Pass is worth 7 points in the open basket! This has the benefit of increasing the Average Attainment 8 figure and grade for both the student and school. It might be slight but every little helps! For some students, it may even fill an empty slot in the open basket.

A GCSE Music grade could also count in the open basket.  I’d recommend you always check the discount codes here though: https://www.gov.uk/government/publications/key-stage-4-qualifications-discount-codes-and-point-scores

I used to ask form tutors to enquire whether their tutees had external music awards and tried to obtain a copy of any certificates.

Beware of reformed and unreformed qualifications, particularly with home languages – if a student takes an unreformed exam prior to Year 11 and it is reformed in Year 11, it will count for student performance but NOT for school performance!  This is often an issue for home languages, which will not be fully reformed until Summer 2020.  Unreformed qualifications can still be analysed within Analytics, but should always have ‘Unapproved’ as the method so they do not affect the headline figures.

A timeline of reformed qualifications can be found here: https://www.gov.uk/government/publications/get-the-facts-gcse-and-a-level-reform/get-the-facts-gcse-reform

You may wish to consider AS levels for some students who have already taken an unreformed GCSE.  These will count in the appropriate Progress 8 ‘basket’ for their subject.  If a GCSE in the same subject has been taken, the AS will always discount the GCSE – grades A and B at AS will score higher points than an A* at GCSE!
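To see why, here is the arithmetic as a short sketch, assuming the point allocations we believe applied from 2017 (AS grades A-E worth 10.75/9.25/7.75/6.25/4.75 points, a legacy GCSE A* worth 8.5); do check these against the current DfE Progress 8 guidance before relying on them.

    # Assumed 2017 point allocations - verify against current DfE guidance.
    as_points = {'A': 10.75, 'B': 9.25, 'C': 7.75, 'D': 6.25, 'E': 4.75}
    gcse_a_star = 8.5
    for grade, points in as_points.items():
        print(grade, points, points > gcse_a_star)   # only A and B beat a GCSE A*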

Finally, in the October Tables Checking Exercise you will be able to access a CSV file which contains details of the grades awarded.  There’s also a handy Progress 8 column – an easy way of checking your data is to compare individual students’ P8 scores to the Progress 8 figures in your own internal software, whether it’s SISRA Analytics, Excel, or otherwise.

3) The grades data

Have any students previously been examined in any qualifications?  The DfE’s early entry/first entry rules came into effect on 29th September 2013, and from this point ‘only a student’s first entry to a GCSE examination will count in their school’s performance tables’.  Any subsequent entries (in the same or any related qualifications) are ineligible to count towards school performance measures, although they will still count for the student.  Guidance can be found here:

https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/651207/Key_stage_4_discounting_and_early_entry_guidance_2017.pdf

During the autumn term, you may have some grades which need to be updated due to remarks.

4) Incorrectly set up qualifications

Finally, always check that qualifications are set up correctly.  Start by checking the measure is correct – this affects how a qualification counts towards your KS4 measures and headlines.  Are GCSEs correctly set up as either 9-1 or A*-G, and BTECs and Cambridge Nationals as Non-GCSEs?

As mentioned above, AS qualifications discount GCSEs so ensure discount codes are used (you do not need to use the official ones in Analytics e.g. for Polish GCSE you could use POL, and for Polish AS *POL – the asterisk ensures the AS takes priority).  Also check the discount codes for qualifications which are similar to one another.

Are your grade methods using the correct points?  These can be found here:

https://www.gov.uk/government/publications/progress-8-school-performance-measure or in SISRA Analytics > HELP > VIDEOS AND GUIDES > Useful Links.

Now it’s time to check which subjects count towards the EBacc (this is known as the ‘Special’ column in Analytics).  It is really important to ensure qualifications are correctly set up, as it can make a significant difference to your figures if they are not.  For example, home languages such as Polish, Urdu etc. should be set to ‘Language’ in the ‘Special’ column.  Computer Science is another subject often omitted from the EBacc (it should be set to ‘Comp Sci’).  Conversely, Religious Studies is often incorrectly included as a subject counting towards the EBacc as a ‘Humanity’ – it doesn’t count towards the EBacc, but can count in the open basket.  A list of the qualifications which count towards the EBacc can be found here:

https://www.gov.uk/government/publications/english-baccalaureate-eligible-qualifications

Last but not least, are you using the most recent DfE Rules, Attainment 8 and VA estimates when publishing?

At this point your headlines should match 😊

This post was originally published on 19th June 2017 and was last updated on 31st May 2018.

Comparing against targets (PT.2)

With Results Day fast approaching, we will soon be analysing our exam grades against targets at school, faculty, subject and student level.  As discussed in my earlier blog ‘Getting Results Day Ready’, preparation is key! With this in mind, it’s a good time to ensure that the number of targets tallies with the number of results you are expecting.  Let’s take a look at some data to see how important this is.

Below we can see Attainment 8, Progress 8 and EBacc data taken from the SISRA Analytics Headlines Dashboard (you can also use Headlines Charts for more detailed analysis). I have compared the Y11 Spring data against school targets.  All of the school’s timetabled qualifications have been included in both datasets.

Imagine we have just found out that 6 students will take a GCSE in Polish.  The Head of MFL expects they will all achieve a grade B (this qualification remains unreformed for Summer 2018*).  I have added these grades to our Y11 Spring collection so it mirrors the number of entries we expect to see in the exams dataset.  As a result, the headline figures see a small but positive effect 😊.  For some schools this could be the difference between a negative and positive P8 figure though!

Is there anything else we need to consider? Yes; for complete accuracy we should also ensure that any datasets we compare against have the same number of grades uploaded.  For the 6 exam grades I have entered, I should also enter 6 target grades to enable me to make accurate comparisons (all targets entered as B grades).

See how this has affected the Attainment 8, Progress 8 and EBacc target figures:

This is extremely important both at qualification and class level, particularly if the data forms part of a teacher’s performance management.  Here we are looking at the cumulative pass for the MFL faculty without the Polish grades:

Once the Polish grades have been added to the spring collection, the data will appear in two tables, one per grade method, because French and Spanish are reformed while Polish is unreformed.  A great way to analyse the data for a faculty in one table, regardless of different grade methods, is to use the OPTIONS functionality and select ‘All A8 Quals’ as the Grade Type.

Using the functionality mentioned above ensures you get maximum benefit of the summary rows.

When we factor the targets in too, see how the figures change again.

As Heads of Department and Class Teachers can be judged on performance (e.g. percentage of students achieving 9-7, 9-5 and 9-4 grades), ensuring all non-timetabled results and targets are added can have a positive effect on their data. This could be the difference between a pay increase and not!  The school benefits from improved headlines too.

Always ensure the figures tally for all other datasets you compare against – whether it’s FFT estimates, performance management targets, CATs, MIDYIS or YELLIS grades.  Also, when you are modelling targets or producing forecasts, a complete set of grades is essential for accuracy.

A simple check can be made in Analytics to see if your datasets tally; simply compare the grades data for an assessment collection against your targets and check the ‘Total Grades’ column.  Your colleagues may just thank you for it!
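If you prefer to automate this tally check outside Analytics, a minimal sketch using pandas might look like the following – the column names and figures are invented for illustration:

import pandas as pd

# Grades per qualification in the assessment collection vs the targets dataset.
assessment = pd.DataFrame({
    "qualification": ["French", "Spanish", "Polish"],
    "total_grades": [54, 48, 6],
})
targets = pd.DataFrame({
    "qualification": ["French", "Spanish", "Polish"],
    "total_grades": [54, 48, 0],
})

check = assessment.merge(targets, on="qualification",
                         suffixes=("_exam", "_target"))
check["mismatch"] = check["total_grades_exam"] != check["total_grades_target"]
print(check[check["mismatch"]])  # Polish: 6 exam grades but no targets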

Do also check that when comparing datasets they are using the same DfE Rules, Attainment 8 and Value Added Estimates – we shouldn’t try comparing apples with pears 😊

Another common mistake is that qualifications are not correctly nominated as EBacc subjects.  Here we can see the effect of Computer Science on some of the key headlines when it is incorrectly set up, compared with when it is correctly set up as a ‘special’.

Another subject often incorrectly nominated is RE as a humanity. This has the opposite effect to Computer Science on the EBacc basket.

Hopefully, having read my earlier blog as well as this one, you will now feel more confident about results days and the accuracy of your data.  Why not read part 3 of this Results Day series of blogs? It looks at troubleshooting if there are discrepancies between your figures and the DfE’s.

Links:

* https://www.gov.uk/government/publications/get-the-facts-gcse-and-a-level-reform/get-the-facts-gcse-reform

This post was originally published on 16th June 2017 and was last updated on 31st May 2018.

Getting Results Day Ready (Pt.1)

It’s almost that time of year when we are filled with a mix of dread and nervous anticipation.
Whilst I was chatting with some Data Managers, someone new to the role asked for some results day survival tips. Our top tip was preparation, preparation, preparation! Even for those who have been in the role for years, this still applies. Ensuring you are well prepared does take time, but it will make embargo days less fraught.

I think Abraham Lincoln had a very good point when he said:

“Give me six hours to chop down a tree and I will spend the first four sharpening the axe.”

Of course, we cannot always prepare for all eventualities but it is good to have a back-up plan in case something does go wrong, such as staff illness or no internet connection.

As we all love a good spreadsheet, it is handy to prepare a schedule similar to the one shown below. It ensures all staff are aware of their responsibilities and that the exam period runs as smoothly as possible. I used to break mine down into pre-results days, A-level embargo and results day, and GCSE embargo and results day.

If you are new to the role, you could visit another local school to discuss their procedures, or attend any local data staff meetings. Some Local Authorities host these or there are the SISRA DataMeets too (you don’t need to be a customer to attend) and you can find out more about these here.


 …so what kind of preparation can you do?


  • Check A2C is working and you can connect to the exam boards. Install it on a second machine too just in case!
  • Refresh exams base data.
  • Set up embargoes in your MIS – don’t forget to include yourself and be mindful of JCQ regulations!
  • Upload any banked exams (with the appropriate date for first entry rules) to SISRA Analytics.
  • Ensure your student data reflects the January Census (students, KS2, SEN & FSM Ever 6 information).
  • Check the number of student targets match the number of entries for each qualification (you can read more about this in my ‘Comparing Against Targets’ blog).
  • Arrange to acquire results from any external providers.
  • Partially complete any LA or Trust returns with as much information as you can (e.g. cohort numbers, PP, SEN) to cut down on paperwork on the day.
  • Ensure all upgrades are applied to your MIS.
  • Speak to site staff and ensure you will be able to get into school (not fun waiting outside the school gates at 5am – yes it happened to me!).


Padlock / barricade the office door – I say this in jest, but politely reiterate to overly keen members of staff that producing accurate headlines is easier when you are left to work undisturbed and not rushed.


  • Check the IT staff have nothing planned which will disrupt your internet connection or access to the server.
  • Have a list of entries ready so you can check all the results are in. Ensure your number of targets matches this too, so that your analysis is accurate.
  • If you are expecting any results for home languages, external music qualifications etc., create these manually in Analytics in readiness.
  • If you use student admission numbers in Analytics as the student ID, prepare a look-up if you want to use the certification broadsheet in SIMS Exams Organiser (as it only contains either the UPN or exam number).
  • Remind all staff to have their logins ready (there will still be one that asks for a password reminder!).
  • Fill your supplies drawer. If you are going to have a long day, have some lunch and snacks ready.
  • Stock up on paper and any other stationery you will need.

There could, of course, be things that happen which are outside our control.

However, we should treat these and any mistakes as something we can learn from. This blog is the first in a series of three – look out for the others which discuss how to ensure your results day analysis is accurate!

SISRA Analytics administrators can also find results day resources within the HELP section of SISRA Analytics.

Throughout this term, I will be tweeting top tips on getting results day ready – follow me on Twitter @EmmaSISRA to find out more.
Good luck!

This post was originally published on 14th June 2017 and was updated on 30th May 2018.

Shadow Data

The DfE recently released schools’ shadow data, which applied the 2017 changes in point scores to 2016 results.

There were some interesting differences and implications generated by the shadow data. If schools continue to use old point scores to calculate or predict attainment scores, they will be inaccurate and probably over-inflated. If Governors and/or the Leadership Team are ‘exposed’ to Attainment 8 scores, I would recommend the shadow data be explained to prepare them for a drop in attainment and progress scores – although in reality it would not be a drop, just a different way the data has been calculated.
It would be interesting to find out how many schools have dropped as a result of the new point scores. It would also be very interesting to see how many (if any?) schools have increased their Attainment 8 or Progress 8 based on this shadow data. National data suggests not many?!

For my school we took a hit of -0.2 in Progress 8 and just over 5 points in Attainment 8, going from 53.07 to 47.87. Make sure your governors are aware! If you are unsure where to get this information, the DfE sent it to all schools on the 4th April and it is available on the Tables Checking website… or similar!

No Grades? No Problem.

After almost ten years in education, my overall use of data is at an all-time low. I know many people have a need to quantify; whether it’s the number of steps taken with a pedometer, followers on social media, or mentally totting up the number of cups of coffee consumed before midday – if we can add it up, average it, or worse, we will! Sometimes people don’t even mind if the resulting figure is made up, as long as it gives them some information to paste into their latest pack or document.

Money? Love? No… DATA makes the world go round!

I too enjoy a graph, a table of data or an investigative hour with a spreadsheet. I can’t lie about that, but if you need your fix, the Office for National Statistics website* has plenty of data to play with, so it might be best to use that as your sandbox rather than stuff which might actually affect someone’s life. Such horror stories include (but are not limited to) appraisal ‘targets’ for teachers based on Progress 8 predictions for Year 7. If you fancy a laugh at data’s expense, there are some incredible examples which tie in with this sentiment on Tyler Vigen’s popular site*, Spurious Correlations. *See links at the bottom of the page.

I frequently devote time to thinking of ways to monitor the quality of teaching and learning without applying the traditional outstanding-to-inadequate scale. In fact, how to evaluate teaching and learning without grades is the topic I am asked about most whenever I’m working with a school or showing Observe to a school that’s never seen it before.

For schools who have stopped grading altogether, there are still plenty of ways to analyse what’s observed while steering clear of whole grades for lessons.

All the analyses below have been taken from real schools using SISRA Observe for their observations (identifying data removed).

Question Statements
Lots of schools have opted for a set of questions based on what they expect to see in a lesson. They then go on to say ‘Yes, I’ve seen it’ or ‘No, I didn’t see it’.

Quantifying observation data then becomes straightforward, if a little limited. Over time, this approach can certainly give a good overall picture of what’s being done consistently, and what needs to be embedded further. It’s also reasonable to apply targets to this type of information, as it is objective in nature and works well at whole school level.

Question statement                                      % Yes   % No
Teacher in class to greet pupils on arrival?              75      25
Objectives displayed?                                     60      40
Are folders set up for each lesson within the unit?       66      34
Differentiated outcomes?                                  93       7
Resources ready so pupils begin to learn on entry?       100       0


Further analysis could then be performed at faculty or subject level to provide more detail and allow targeted support. Analysing the data in this way takes the focus off what’s not being done and can show where standards are consistently high.
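For those who export their observation data, here is a short sketch of how the percentages above could be computed at faculty level – it uses pandas, and the field names are illustrative rather than Observe’s actual data model:

import pandas as pd

# Invented observation records: one row per question answered per observation.
observations = pd.DataFrame([
    {"faculty": "Maths",   "question": "Objectives displayed?", "seen": True},
    {"faculty": "Maths",   "question": "Objectives displayed?", "seen": False},
    {"faculty": "English", "question": "Objectives displayed?", "seen": True},
    {"faculty": "English", "question": "Objectives displayed?", "seen": True},
])

# Percentage of observations in which each statement was seen, per faculty.
summary = (observations
           .groupby(["faculty", "question"])["seen"]
           .mean()
           .mul(100)
           .rename("% Yes"))
print(summary)  # English 100.0, Maths 50.0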

Areas of Strength and Development
Another effective approach is to decide which areas to focus on, taken from a set of whole school priorities. This works best when those areas are well thought out and relate specifically to a range of objectives for the year. This could provide a direct link back to the whole school improvement plan and individual faculty improvement plans, as well as the teacher standards if desired.

Observers choose one or more strengths of the lesson from a finite list and then choose one or more areas of development from the same list. Accompanied by written and verbal feedback, over time this can provide a wealth of information including matching up staff who have complementary strengths and areas for development. Schools using Observe tell us that they use this information to pair up teachers for joint observing and lesson study.

The school in the example above can immediately see what they do well (climate, relationships) and in which areas staff require more support (challenge, assessment).

Student Based Questions
‘Were students focused on learning?’

‘Did students have the right equipment?’

Some schools prefer to use a scale such as: All Students, Most Students, Some Students, No Students. Taking this approach with multiple drop-ins or learning walks can allow a leadership team to see who is consistently meeting the expectations and who may benefit from some support. Once a school is satisfied that an area is being met consistently, they may decide that observing it regularly is not necessary, and this could allow them to concentrate their efforts elsewhere.
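As an illustration of tallying scale responses across several drop-ins, here is a small sketch – the data, and the idea of treating ‘Most’ or ‘All’ as meeting the expectation, are invented:

from collections import Counter

# Invented drop-in responses for one question, per teacher.
drop_ins = {
    "Teacher A": ["All Students", "Most Students", "All Students"],
    "Teacher B": ["Some Students", "Most Students", "Some Students"],
}

for teacher, responses in drop_ins.items():
    counts = Counter(responses)
    # In this sketch, 'Most' or 'All' counts as meeting the expectation.
    meeting = counts["All Students"] + counts["Most Students"]
    print(teacher, f"met the expectation in {meeting}/{len(responses)} drop-ins")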

In my experience, it’s certainly possible to use data in a reasonable and sensible way in relation to observations of all types. If you’ve moved away from grading whole lessons, these approaches can allow you to measure impact over time and also shout about what’s going well.

Links:

https://www.sisra.com/observe/what-is-sisra-observe

https://www.ons.gov.uk

http://www.tylervigen.com/spurious-correlations

How does data influence Ofsted judgements?

So, what better thing to do on a Sunday evening than to write some comments about data and the like! There has been a lot of talk of late about the reliability and validity of attempting to predict outcomes, and the Ofsted School Inspection Update (March 2017) stated:

’As inspectors, we can help schools by not asking them during inspections to provide predictions for cohorts about to take tests and examinations. It is impossible to do so with any accuracy until after the tests and examinations have been taken, so we should not put schools under any pressure to do so – it’s meaningless. Much better to ask schools how they have assessed whether pupils are making the kind of progress they should in their studies and if not, what their teachers have been doing to support them to better achievement.’

I understand the sentiment of this statement and to some extent support this, especially when one looks at the reliability and accuracy of some Ofsted judgements prior to results actually being published! I am very much aware that a school should not be judged on outcomes alone, but it is interesting to note the percentage of schools that have been judged as good (based on predictions) and then observe the Progress 8 scores as illustrated in the diagram below. All the green dots represent schools judged as ‘good’ as of January 2017.

There is such a wide range of outcomes here: some schools which, on the face of it, have ‘inadequate’ outcomes, and conversely some which have ‘outstanding’ outcomes, have all been labelled ‘good’.

So where does the data influence judgements, and what do Ofsted mean by ‘assessed whether pupils are making the kind of progress they should in their studies’? Is this an objective process or a subjective one? I have seen first-hand how some inspectors have made judgements about ‘the quality of teaching and learning’ (and standards) because a child had drawn in their English book… and it wasn’t a very good drawing! So how can we make this process more objective? I would suggest considering and reflecting on the notion of ‘assessment without levels’, mapping the curriculum to the programme of study from Year 7 through to Year 11. This could then be matched to what a child should know, understand and be able to do in the various years, and backed up with evidence of testing. Taken together, these measures would then allow for pretty objective formative assessments.

These are testing times, and because of the changes in the examination system schools really do not know what the outcomes will be, apart from the fact (as announced by Ofqual) that there will be a percentage who obtain the various grades regardless, to some extent, of what the standard is, as illustrated in this diagram:

The new EAP area in SISRA Analytics is allowing me (for the first time ever) to see if my cohort are on track regardless of which year they are in, by matching up teacher assessments to where they should be in any particular year group or time (subject to how a school decides to assess, i.e. current/predicted/end of year/different KS3 and KS4 grading systems etc.). I do feel the SISRA developers have been as flexible as possible in creating a system that caters for most (if not all) methods, which is pretty impressive. I suppose Ofsted inspectors can then observe what is going on in the classroom and in books, and look to triangulate those observations with some objective assessments, as illustrated by the data, in whatever year group they like (mind you, that’s what they were meant to have done before, wasn’t it – but how then does that account for the wide range of ‘goods’ against the outcomes above?).

Use of data in the classroom

When I was thinking about topics for this post, I was reading some of the other blogs that my colleagues and I have written recently and realised that the focus has been on whole school or year 11 data. None of us have actually talked about the data that is used in the classroom. My strong belief is that data should start at classroom level and work up. Data staff in schools should ensure that the data they provide to teachers will help them deliver a better learning experience to the young people in front of them.

Before I joined SISRA, I worked as a Data Manager in a large Outstanding school where all teachers would have what we referred to as “Standards Folders”. These contained a profile of all of their classes showing contextual information such as gender, pupil premium, special educational needs & disabilities (need and type), EAL, % attendance etc. I also provided class photos as well as templates for seating plans. There wasn’t any expectation on how teachers used their Standards Folders, but they were expected to know the contents and ensure that it was kept up to date following any class changes or assessment points.

This class profile shows all students within a class together with contextual information (gender, latest attendance, ethnicity, SEND etc.), as well as KS2 fine level, CAT test results and target grade.

An excerpt from my school’s Ofsted inspection when they were judged Outstanding states:

“Systems used to ensure the rigorous collection, analysis and use of student performance data are exceptional. They allow teachers to plan effectively for all students’ individual needs. This key feature is an essential component of the excellent teaching which the students experience and the rapid progress they make.”

There are so many different software systems available for class profiles and seating plans out there (many are free to download) so I’m not going to comment on them here. A simple Word document, Excel spreadsheet or a mark sheet set up from your MIS would be enough, so long as it contains relevant information. Don’t overload staff with anything that is not relevant or helpful to them (most schools have some staff that are data shy and you don’t want to scare them).

Have a think about what a class teacher may need to be able to find out from the data that is given to them. If we take a look at some students in the class profile shown above:

Damon A

  • Attendance is 91%, and therefore a concern
  • He has a special need of Behaviour, Emotional and Social Difficulty
  • Middle ability
  • Armed forces

What could be done to help Damon? The Service Pupil Premium is designed so that schools can offer mainly pastoral support during challenging times and to help mitigate the negative impact on service children. Could a mentor be appointed to ensure that Damon’s attendance improves? Is he accumulating behaviour points because of his special needs? Is there a 6 week targeted programme that he could be involved in? Or any parental workshops and engagement?

Jennifer A

  • 100% attendance
  • More Able
  • High Prior Attainment

Is Jennifer being challenged in class? Are additional tasks being provided to her to ensure that she doesn’t get bored?

Jane A

  • Attendance good
  • Pupil Premium
  • FSM

Jane A has a high CAT Verbal score whereas her CAT Quantitative score is below average. This will mean that her ability to use numerical skills to solve problems is weaker than her ability to comprehend words. A class teacher should be aware of this when setting work.

We have looked at three individual students here but this kind of information should be sought for all students in every class.

Once an assessment is done, the pastoral information should be used alongside the assessment data to identify:

  • Who is furthest away from expectations?
  • Who is and who is not improving as expected?
  • Does this relate to the topics taught?
  • Who is making less progress than expected or than their peers?

Once a teacher has identified this, they can then move on and see if there are any patterns forming. For example:

  • What are the trends for key groups – are girls outperforming boys?
  • Are pupil premium students doing worse than non-pupil premium students – if so, is there any PP money available for intervention?
  • Is work differentiated for more able/SEND students?
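To make the checks in the two lists above concrete, here is a minimal sketch comparing current grades with targets and looking at one key-group gap – all names, grades and column names are invented:

import pandas as pd

df = pd.DataFrame({
    "student": ["Damon A", "Jennifer A", "Jane A", "Kurt B"],
    "pp":      [True, False, True, False],   # pupil premium flag
    "target":  [5, 8, 6, 6],                 # 9-1 target grades
    "current": [3, 8, 6, 4],                 # latest assessment grades
})

# Who is furthest away from expectations?
df["gap"] = df["current"] - df["target"]
print(df.sort_values("gap").head(2)[["student", "gap"]])

# Are pupil premium students doing worse than non-pupil premium students?
print(df.groupby("pp")["gap"].mean())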

So in conclusion, have a think about what data you are giving to staff to ensure that the children or young people get the education that they deserve.

My colleague Becky’s blog ‘How to Train and Engage Teaching Staff in the Use of In-School Data’ covers the next steps.

How to train and engage teaching staff in the use of in-school data

I am guessing that some of you may work in schools where data is used by every member of staff and is the basis of effective intervention and strategic planning. However, some of you may work in schools where data is just looked at by the Head, or perhaps the wider SLT, and maybe a few eager beavers around the school. Are you struggling to get your staff engaged, and to really see the benefit of using your student data in the wider school context?  In this article I will take a look at some of the negative attitudes to data and data analysis that I have seen, and how they can be changed through good, well-planned and targeted training.

Attitudes to data

Let’s have a look at some of the attitudes I have come across towards in-school data:

“I don’t know what it’s used for”

“It’s just paperwork and admin that gets in the way of teaching”

“I’m not a ‘data person’”

“I haven’t time to look at it”

“I am here to teach, not produce data”

“Data is boring.”

So what do we want these attitudes towards data to become?

“My assessments inform intervention and action”

“Data supports my teaching”

“It is intuitive and accessible”

“It is interesting because it tells me so much about my class, my qualifications or my year group”

“I don’t produce data, I use it.”

In order to do this, we need to show staff that the analysis of their data is not just a box-ticking exercise. We need to ensure that they know what the data can do for them and what value it holds for them.

Time your training carefully

How do we start? Well, through working with schools in my capacity as a data consultant for SISRA, I have learned that it is easiest to engage staff when they are shown recent, relevant data about the actual children they teach.

Data is worth nothing on its own. Its value comes when it is considered together with the back story. When looking at these numbers and charts, we need to remember that this is all about children, about students and about lives. When teachers are able to relate the data to the students that they know, that they teach, they understand the point. It also needs to be fresh data. There is no point trying to interest a teacher in the situation last term, or last summer. You have to work with fresh assessments so that something can actually be done with the knowledge the staff take away with them, and so that the exercise of training the staff can immediately lead to impact in the classroom.

For example, when the History teacher sees that Courtney, whom they see as a problematic student, is currently achieving about a grade higher in their subject than she is on average across all her subjects, or conversely that Kurt, who they thought was doing just fine, is working at over a grade lower in their subject than he is overall, they start to get interested.
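The Courtney/Kurt comparison boils down to measuring each student’s grade in one subject against that student’s own cross-subject average. A small illustrative sketch with invented data:

import pandas as pd

grades = pd.DataFrame({
    "student": ["Courtney", "Courtney", "Courtney", "Kurt", "Kurt", "Kurt"],
    "subject": ["History", "Maths", "English"] * 2,
    "grade":   [6, 4, 5, 4, 6, 6],
})

# Each grade relative to the student's own average across all subjects.
grades["avg_all_subjects"] = grades.groupby("student")["grade"].transform("mean")
grades["vs_own_average"] = grades["grade"] - grades["avg_all_subjects"]

history = grades[grades["subject"] == "History"]
print(history[["student", "grade", "vs_own_average"]])
# Courtney is a grade above her own average; Kurt is over a grade below his.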

When that same History teacher sees in black and white (or possibly red) the gap between the students on pupil premium and the students that are not on pupil premium in their class, they start to ask questions.

I have led staff training in schools where I have had to stop talking at a certain point and let the teachers run with it for a while. While I want to move them on to the next report, they want to have a good discussion with their neighbour about the students in their class, about what the data is telling them about those students, and about what they’re going to do about it.

If the teachers are looking at old data, or data about students they don’t know, they are not going to be engaged. Where’s the ‘hook’? What’s the point?

So, make sure that you set the training dates just after an assessment point, when the data the staff are looking at is fresh and relevant.

Make it interactive

This also leads into another point, which is to make it interactive. This is the only way to ensure the staff you are training are looking at data that is completely relevant to them. Make sure you have enough computers for everyone: either hold the session in an ICT suite, or ask everyone to bring their laptops. Each member of staff needs to have access to their own data on your chosen analysis system and to be able to look at their own students. This way, they can all look at the same information at the same time, but each teacher can look at it for their own qualification, class or year group. There’s no point in all of them sitting looking at a whiteboard showing reports about the Maths department; they are going to be far more engaged if they are looking at their own department or class. Also, don’t make it a ‘speech’. Once you’ve shown them what can be done, set them tasks, give them quizzes, and create discussion.

Target your sessions carefully

The next trick with training staff is to train small groups, not the whole staff at once, and to tailor each training session to a specific group of staff. Let’s have a look at who these groups could be:

  • SLT
  • Subject Leaders
  • Class Teachers
  • Teaching Assistants
  • Inclusion/SEN
  • Heads of Year
  • Form Tutors
  • Support Staff
  • Governors

Each group only needs to see the information that is relevant to their role. Tailoring training to their specific needs will avoid your trainees losing interest, ‘drifting off’ and checking their email. Run a variety of training sessions to cover your entire staff, but do it in role-specific groups.

Continue the discussion

Now, at this point all our staff are trained in how to analyse their data and how it can be useful for them, so we can sit back, relax and wait for it all to lead to school improvement. Right? Wrong. As with any learning, if the learners don’t actively practise what they have learned, they will promptly forget it. And it would be oh so easy for everyone to forget about actually using the data and carry on with the way things were, entering assessment grades into the MIS once per half term and forgetting about them. So, the next job is to make sure the information resulting from the data is being accessed, discussed and used; that it leads to intervention, to differentiation, and has an impact in the classroom. Otherwise, there is no point in collecting it in the first place.

This is where what I call the ‘Informing Improvement’ plan comes into play. This is going to involve allowing ‘faculty time’, but if we are all in agreement that the point of collecting the assessments is to lead to intervention and impact in the classroom, then it is time spent well. What I am proposing is a regular programme of discussion and action, after each assessment point.

Informing Improvement!

Following each assessment point, at each departmental meeting, your school’s data analysis system should be opened up and shown on a whiteboard, and the teachers teaching a particular qualification should discuss, together with the Head of Department, the data they can see and what it tells them. Discussion could be centred around a short questionnaire, but the point of the meeting would be to identify problem areas and formulate a plan of action and intervention. Heads of Department would then take this intervention plan to their SLT link for agreement.

Similarly, Heads of Year could also meet with tutors to discuss any students who are falling behind across all or several of their qualifications. They can then decide on some suggestions for pastoral support for those students, which again should then be agreed with the relevant SLT member.

The last ‘pathway’ on the plan is the SLT themselves. It would be the SLT’s job to look at any high-risk groups in their school and check progress, formulating a plan for student support where relevant and necessary.

So it is important to embed the discussion about data into your staff meetings, ensuring that the data that is going in is being accessed and is leading to action. All put together, the aim of this is to ensure that each and every student is learning and progressing.

Ongoing support

Lastly, we can’t assume that, just because they came to your training, all of the staff are going to remember how to find the information they need. There needs to be an ongoing programme of support to ensure that nobody is left struggling.

I visited a school recently where the SLT data lead is doing this very well, using SISRA Analytics as her data analysis system. Firstly, she regularly goes along to staff meetings and presents ‘what Analytics can do’ in 5-minute bitesize chunks. Alternatively, after an assessment has been uploaded, you could let staff know it’s there via email, and at the same time give a ‘hot tip’ for which report to look at, which filter to use, or something specific to investigate. Next, she has set up ‘Data Friday’, a regular drop-in support session – she very sensibly bribes staff to come with the offer of food and drink. Lastly, she has a prize of sweets for the most regular user of Analytics every month!

You can use your own strategies, but the aim is to make sure the teachers know what they’re doing, or if they don’t, they know where to find help.

Conclusion

The collection of student data is only worthwhile if it leads to something. The data needs to lead to information, which leads to insight, which leads to impact. A cliché perhaps, but true. In order for this to happen, your staff need to understand it and know how to use it. To help you make sure your staff are engaged with your in-school data you could:

  • Choose a time for your training when you have fresh, relevant data
  • Make it interactive, so each teacher has access to their own data
  • Do it in small, role specific groups
  • Continue the discussion, making sure the data is leading to action and impact
  • Ensure that targeted ongoing support is offered

Measuring Staff Performance

TELL US A BIT ABOUT YOUR SCHOOL

Taaleem is a company that owns and runs schools in the United Arab Emirates. The group consists of seven K-12 schools and four early years centres. Our schools offer a range of curricula, including American, British and International Baccalaureate (IB). Schools in Dubai and Abu Dhabi are inspected by the regulatory authorities, and the quality of teaching and learning is a major focus of the inspection process. Our schools also undertake lesson observations as part of a teacher’s probation or performance review.

WHAT PROBLEMS DID YOU HAVE BEFORE YOU STARTED USING SISRA OBSERVE?

Lesson observations have always been a critical process for Taaleem schools. Different schools had different systems in place and in some cases there were differences between the secondary and primary stages in the same school. As a group, we wanted to be able to measure staff performance and compare one school with another. We also wanted to be able to see which areas of the inspection framework were strong and which needed more work. Getting Senior and Middle Leaders on board with the lesson observation process was also a priority, but because of the variations in the system, this wasn’t happening at the pace we wanted.

WHAT ARE THE MAIN BENEFITS OF USING SISRA OBSERVE?

  • It was quick and easy to use
  • Setting up the system for our school was straightforward
  • The templates provided were relevant to our observation process
  • We can extract data really quickly and easily, and for different levels – from faculty or department to individual teachers

WOULD YOU RECOMMEND SISRA OBSERVE TO OTHER SCHOOLS?

Yes! Absolutely. It’s helped us manage lesson observations all in one place. SISRA Observe stores information that can be readily retrieved and used for probation reviews or performance appraisal meetings. It is also useful for strategic planning and for targeting professional development courses. We wouldn’t be without it.

Richard Drew, Education Manager, Taaleem

Driving School Improvement

WHY USE SISRA OBSERVE?

SISRA use the slogan ‘Empowering Improvement’, and since we began working with them over a year ago, that is exactly what they have enabled us to do. Using SISRA Observe has allowed us to become a more efficient organisation and, as a result, has created more time for us to focus on our core purpose: improving the quality of teaching and learning. Gone are the days of collating endless paperwork, reminding staff to send their documents and filing any number of different pro formas as if my life depended on it. All the time that has been saved has now been invested in our key improvement area: improving the quality of our classroom provision.

WHAT ARE THE BENEFITS?

Using SISRA Observe has created greater ownership of teaching and learning at all levels of our organisation. Teachers use it to reflect on their own practice, ensuring they have a clear understanding of the strengths and development areas in their own teaching. They use this information to engage with a range of personalised CPD opportunities within the school, either sharing their best practice or further developing their own teaching. SISRA Observe is at the heart of creating a self-improvement culture within our school, where teachers take ownership of their own professional development. SISRA Observe allows teachers to make informed choices about their own professional learning, thus making a significant contribution to our culture of ‘continually learning, continually improving’.

FOR MIDDLE LEADERS

Middle leadership has been strengthened significantly by the fact that Heads of Department have a clear understanding of the strengths and priority areas for their teams. Middle leaders are often stretched to capacity and presenting information in a ‘ready-to-go’ format is crucial in enabling them to act on teaching and learning data in a timely and meaningful way. The accessibility of the reports available in SISRA Observe means that they do not have to spend time analysing teaching and learning data because it is done for them, so they can focus on developing the quality of provision instead. It allows them to identify and systematically share best practice easily across their teams, and to act swiftly to strengthen practice if and when required. Heads of Department can develop CPD at department level to ensure that pupils get the best possible learning experiences when they visit their curriculum areas. As a result, SISRA Observe empowers our middle leaders to be the driving force for school improvement at Holy Trinity.

FOR SENIOR LEADERS

Senior Leaders have total clarity on the quality of teaching across our school. Each and every member of the Senior Leadership Team clearly understands our strengths and priority improvement areas. SISRA Observe also creates a transparency within teaching and learning that has served to strengthen the line management process across the school, resulting in ‘real-time’ conversations with Middle Leaders about the quality of provision in the areas they oversee. As a result, SISRA Observe allows underperformance to be identified and challenged efficiently, whilst (more importantly) best practice can be recognised, championed and celebrated.

YOU CAN’T FATTEN A PIG BY SIMPLY WEIGHING IT!

This is my favourite analogy with regard to teaching and learning. In a nutshell, it means that we do not improve the quality of teaching and learning by simply monitoring it. However, through SISRA Observe, we carefully and systematically monitor teaching and learning to identify our major strengths and key priority areas. It is what we do with the information that SISRA Observe provides that is crucial to our school on its journey of sustainable improvement.
If you strip away all the intervention strategies, revision sessions and one-to-one support, you are left with the need for the very best classroom practice. Therefore, how we develop our teachers at Holy Trinity is vital. Through analysing the reports from SISRA Observe, I can create a whole-school CPD programme to address our priority improvement areas. More importantly, it allows me to create personalised pathways for our teachers, tailored to their individual needs to maximise their professional learning. Increasing collaboration between colleagues is a key component for improving the quality of teaching and learning at Holy Trinity. We are embedding peer coaching across the school as a mechanism to create more opportunities for reflection and collaboration. SISRA Observe allows me to successfully match teachers to the right coach to ensure that they support each other’s practice and maximise learning opportunities for both colleagues.

In summary, SISRA Observe has helped create something that is so rare in schools… time. Time that can be invested in ensuring the quality of our teaching is the very best it can be. Time that can be given to teachers and leaders to ensure that, above all else, high-quality teaching and learning is the cornerstone for driving school improvement. Put simply, it allows us to ‘keep the main thing the main thing’.

Stuart Voyce, Assistant Head Teacher, Holy Trinity

Our Journey to Outstanding

Kelvin Hall is a comprehensive school in Hull, East Yorkshire and has 1,400 students. They were graded outstanding in all areas under the new Ofsted framework in February 2015, yet 18 months previously had been graded ‘requires improvement’.

WHAT PROBLEMS DID YOU FACE BEFORE USING SISRA OBSERVE?

Before using SISRA Observe, the monitoring of teaching and learning was conducted by one person using a single MS Excel spreadsheet. The information was not shared with any other member of staff, not even the Senior Leadership Team. When I took over this role, I knew I had to make changes as this was not the way I worked. My philosophy was that we had to work together to improve teaching and learning and not in isolation. It had to be a shared vision.
It was at this point that the flyer for SISRA Observe appeared on my desk. I was immediately drawn to the idea of no paperwork and a way of collecting, analysing and sharing useful data centrally. I could see the potential for everyone to be involved in their own professional development and the positive impact this could have on staff and students’ progress.

WHAT ARE THE BENEFITS?

  • Easy to personalise and set up the observation forms
  • Accessible for everyone at some level (you can decide who sees what)
  • Staff have instant access to any judgements made on them
  • Staff can use data to inform their own CPD needs and appraisal targets
  • Leaders can analyse data to inform target-setting, improvement plans and coaching
  • School can analyse data to inform CPD sessions
  • Once it is set up, there is little to do
  • Can develop ways of involving non-teaching staff, e.g. Year leaders conducting drop-ins

WHY WOULD YOU RECOMMEND SISRA OBSERVE TO ANOTHER SCHOOL?

For all of the above reasons and the fact that it makes it so easy to demonstrate where you are at in terms of teaching and learning.
When I met with the Ofsted inspector to talk about teaching and learning, it was easy to show him where we were with regards to standards. I could show him book scrutiny, learning walk and formal observation data for each staff member, department, the school as a whole and even look at the trend over the past year(s). He was impressed with how we used it and could see the impact it has had on our journey to outstanding. It had been used as a tool to focus our priorities and drive the standards of teaching and learning to an outstanding level.

I cannot wait to get to grips with the new version and the anticipated flexibility for it!

Claire Turnbull, Assistant Head Teacher, Kelvin Hall School

Detailed Tracking for New 9-1 Grades

WHAT ARE THE BENEFITS?

SISRA Analytics enables us to effectively identify those students who would benefit from intervention in any given subject and easily identify those who have made good progress, who could be challenged to push on and make more than expected progress.

The filters within SISRA Analytics allow us to monitor the progress of specific cohorts, such as students who have joined the school late, those with low attendance and even those with summer birthdays.

HOW IS SISRA ANALYTICS USED BY STAFF?

  • Form Tutors are able to monitor the progress of students in their form group, recognising any issues early and providing extra support where necessary.
  • Class Teachers can monitor the performance of their students over a longer period of time and put systems in place to enhance progress.
  • Curriculum Leaders can see at a glance which classes are making good progress and identify those where support may be needed.
  • Achievement Coordinators can view students individually or in selected focus groups, which allows them to monitor progress and enables early intervention if required.

ANY OTHER COMMENTS?

As we move through this academic year and future years, building our student history, we will be able to set benchmark comparators. It will be interesting to see, as students move through the grades, how this information informs our curriculum as we adapt our systems and intervention methods to ensure that all students make the best possible progress.

By using SISRA Analytics to ensure that all students are being challenged and making the best possible progress, the impact on the school performance as a whole will be unmistakable.

Gill Edwards, Academy Data Manager, Sandbach High School & Sixth Form College

An Accurate Match with DFE Publications

WHAT ARE THE BENEFITS?

Highfields School started using SISRA Online as an exams analysis tool, as we felt our management information system was failing to provide us with a complete picture. When SISRA Analytics was launched, we soon realised the wealth of additional features available, i.e. the opportunity to upload our termly assessment data to track, analyse and inform intervention throughout the year, as well as analyse summer results.

Following each data capture, I upload the grades from SIMS into SISRA Analytics and usually within 10 minutes I have matched qualifications, checked for errors and published the reports for the staff to view. Data is timely and we can act on it immediately. The reports are split sensibly into three views: Headlines, Qualifications and Students.

Following data capture, we use the student reports in yearly achievement group meetings. The Deputy Head, Head of Year, Data Manager and Faculty Leaders meet to discuss the progress of individual students. These conversations revolve around reports produced in SISRA Analytics to highlight underperforming students.

HOW DO YOU USE THE REPORTS IN SISRA ANALYTICS?

Qualification reports are an integral part of line management meetings. Subject attainment and progress data from SISRA Analytics can be analysed and compared with previous data points, and projected summer outcomes are discussed. Vulnerable groups are easily identified or can be customised through ‘focus groups’. One feature particularly liked by the Heads of Department is the Progress Matrix tables (just like the DfE Transition Matrices), which can be produced for subject areas or even individual class groups.

Headline Reports are used by the Senior Leadership Team and the Governing Body with Ofsted, the school improvement partner and the local authority. We are required to project summer outcomes and are held to account for the accuracy of these projections. Without SISRA Analytics I don’t think we would be able to produce these figures with such confidence. SISRA Analytics deals with the accredited / non-accredited qualifications, discounting rules, first and best results, Attainment 8 and Progress 8, to name but a few.

HAS SISRA ANALYTICS MADE A DIFFERENCE ON RESULTS DAY?

Results Days are now ‘easy’. SISRA Analytics will analyse first and best results separately and allows us to make direct comparisons with projections and to analyse trends. I am proud to say that the results in SISRA Analytics have matched the data on the Forvus tables checking website for a number of years now.
The next challenge for us will be ‘life beyond levels’ and I am confident that whatever system we move to, SISRA Analytics will be right alongside me!

Becky Hill, Data and Curriculum Manager, Highfields School

Instant Trend Analysis

TELL US A BIT ABOUT YOUR SCHOOL

The Burgate is an 11–18 comprehensive school in the New Forest with 12% of students identified as disadvantaged. Our year groups average 155 in the main school and 160 in the sixth form. Before using SISRA Analytics we used SISRA Online, and have been SISRA users for over six years. SISRA Analytics has made a huge difference to the accessibility and readiness of data at our fingertips, alongside giving the school the ability to analyse data in a very in-depth way that was not previously possible without a large amount of extra time and effort.

WHAT ARE THE BENEFITS?

The speed of access on Results Day is excellent, and the support offered by SISRA in the run-up to Results Day has been comprehensive and worthwhile. This has enabled us to share results quickly, highlight the positives and focus on the areas for improvement with ease. Teachers and governors enjoy being able to access results quickly and accurately soon after release, even though they may be 1,000 miles away.

SISRA Analytics allows us to compare datasets and trends in an instant, and has been instrumental in allowing us to find our gaps and act on them. My favourite feature has to be the new graphical representation of the headline report, which was warmly received by governors for Results Day last summer. However, SISRA Analytics is for life, not just results! I warmly recommend this product to other schools, and frequently do so.

Dr Karen Riding, Assistant Head Teacher, The Burgate School

Improved Ofsted Outcomes

WHAT PROBLEMS DID YOU HAVE BEFORE YOU STARTED USING SISRA ANALYTICS?

We first started using SISRA Online in 2013 following an Ofsted inspection that had placed us as ‘Requiring Improvement’. One of the actions that came out of this inspection was to improve the tracking and monitoring of assessment data across the school. Previously we had been using the assessment programme within our management information system, and although this generated useful data, it was not being used by staff with any degree of success due to the fact that it was so difficult for them to access.

SISRA Online was a good step forward in that it enabled staff to look at their data far more easily, although this tended to be on a fairly basic level, and was predominantly just by the Senior Leadership Team, Subject Leaders and Heads of House.

HOW HAS SISRA ANALYTICS HELPED YOU?

SISRA Analytics has enabled another significant step forward in the effectiveness of the tracking and analysis of our data. We now collect Key Stage 4 data monthly. Subject Leaders and Heads of House meet with the Head Teacher each month to discuss changes since the previous month. Target groups are identified and monitored through focus groups, and form tutors have had training in SISRA Analytics to facilitate regular mentoring discussions. We have uploaded a set of FFT estimates, which are used for comparisons when talking to individual students.

In addition to our work at KS4, we have now been working for a year with GCSE Flightpaths for KS3 in place of the National Curriculum levels. These are easily reported and analysed in SISRA Analytics on a termly basis.

Last summer we were re-inspected and found to be ‘Good’ with some outstanding features. This was ahead of our summer results but our tracking and analysis of the data in SISRA Analytics allowed us to clearly demonstrate that we understood where our strengths and weaknesses were. Inspectors were satisfied that we had a good grasp of how well our students are doing and that we could clearly demonstrate progress. There is no doubt that SISRA Analytics has been a significant element in securing our school improvement, and has now become an integral and embedded part of our school culture.

Simon Tong, Assistant Head Teacher, Tiverton High School

An Invaluable Tool for Intervention

WHAT ARE THE BENEFITS?

We used SISRA right back in the old ‘SISRA Online’ days. It helped us evaluate our data drops and annual exam results forensically. But not until the introduction of SISRA Analytics and its added features did we refine our retrospective analysis and, more importantly, gain an invaluable tool for intervention, identifying underachievers so that appropriate action could be taken.

HAS IT HELPED YOU IDENTIFY AREAS FOR IMPROVEMENT?

We’ve used the Grade Search facility extensively to create KS4 focus groups, based on children who are predicted to achieve English but not maths, or maths but not English. We then group these children into Gold, Silver or Bronze categories based on their KS2 entries. Bronze pupils have a higher entry level and Gold pupils a lower one. The idea is that Gold pupils are harder to convert – hence the gold medal for a more demanding intervention. Silver pupils are less hard to convert, and Bronze pupils should achieve most easily.
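A minimal sketch of this banding – the KS2 thresholds below are invented purely for illustration, as each school would set its own:

def band(ks2_fine_points: float) -> str:
    # Gold = lower KS2 entry (hardest to convert); Bronze = higher entry.
    if ks2_fine_points < 4.0:
        return "Gold"
    elif ks2_fine_points < 4.7:
        return "Silver"
    return "Bronze"

# Invented focus group of students predicted English-but-not-maths.
focus_group = {"Student A": 3.8, "Student B": 4.5, "Student C": 5.1}
for name, ks2 in focus_group.items():
    print(name, band(ks2))  # Student A Gold, Student B Silver, Student C Bronze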

We also use SISRA Analytics to identify the foundation subjects in which these pupils are underachieving. With the new Progress 8 measure, we recognise that ‘every grade matters’, and so this type of intervention is essential. In KS3, we use SISRA Analytics to create focus groups for intervention by building a league table of points differences between current achievement and targets. SISRA Analytics works this out straightaway, identifying current KS3 underachievers at any data drop.

All staff in our school have engaged with SISRA Analytics and find the interface intuitive and easy to use. Years 7 to 10 have three data drops a year and Year 11 has four. All staff contribute to the drop and also use SISRA Analytics to check their predictions against datasets and benchmarks.

DID YOU USE SISRA ANALYTICS ON RESULTS DAY?

On Results Day, SISRA Analytics comes into its own. Data can be input easily and quickly, giving us instant headline stats. The system has also given us instant accurate value-added scores – so no more waiting months for RAISEonline to appear. Levels of progress and transition matrices are instantly accessed and we now have Progress 8 and Attainment 8 results readily available. As in many schools nationally, ‘closing the gap’ has been an issue for us. SISRA Analytics identifies our disadvantaged children and provides a gap analysis, enabling us to pinpoint areas to develop.

So SISRA Analytics has become part of everyday life here at Eggar’s School, changing how we look at achievement data and how we use it to enhance attainment and progress. It’s now hard to imagine life without it.

Neil Waite, Assistant Head Teacher, Eggar’s School

Streamlined Assessment Analysis

WHAT ARE THE BENEFITS?

SISRA Analytics has been an invaluable tool for Fullhurst Community College since we joined two years ago. It has enabled us to streamline our analysis of data for all assessment points, and has therefore increased our efficiency in informing staff, students and parents of achievement and improved the effectiveness of our interventions, as these are now targeted on the right areas at the right times.

WHAT IMPACT HAS SISRA ANALYTICS HAD ON RESULTS DAY?

The speed with which data is analysed on GCSE Results Day has been vital. Previously I manually calculated headline, attainment and progress figures using spreadsheets, which took a considerable amount of time. I am not sure this would have been possible with the introduction of the new Progress 8 measures. SISRA Analytics now allows our Senior Leadership Team instant access to headline figures and lets Heads of Faculty see data on attainment and progress at whole-subject, class and student-by-student level weeks in advance in comparison with my previous manual system.

The reports are endless and have saved me hours of work. You can add as many filters as you need and this has enabled staff to drill down on key groups such as Pupil Premium, English as an Additional Language and Special Needs. The ‘Tracker’ options allow staff to see how the cohort has performed in comparison with previous assessment points and ‘Compare With’ allows analysis of results against target grades – this is something we track closely.

ANY OTHER COMMENTS?

SISRA regularly updates Analytics in line with changes from the DfE, which ensures that the information we are using is as accurate as possible. The announcements on the Home Page are always worth reading to keep abreast of any changes in the data world. The Help section is particularly well stocked with guidance both in paper and video format. These have been so helpful, but if you can’t find the answer to a question, there is the friendly Live Support team, who are always on hand to give guidance.

Sam Gray, Data Manager, Fullhurst Community College

Examination Analysis in Record Time

WHAT ARE THE BENEFITS?

We started using SISRA Analytics the day it was released in March 2014. The difference was felt straightaway. Not only did it free up my own time to develop other data tools for staff to use within school, but the speed and accuracy of its analysis meant we were able to catch students who needed additional support within hours of the assessment data being captured. Prior to that, it may have been at least a week before we really got to grips with what the data was telling us.

HOW WAS YOUR FIRST RESULTS DAY USING SISRA ANALYTICS?

GCSE exam results day demonstrated SISRA Analytics’ capabilities perfectly: by 11am, the grades were analysed, SLT had met and some were able to get back to their holidays! The figures even married up perfectly with the Department for Education’s own figures.

HOW EASY IS IT TO TRAIN STAFF?

The aim here at Alsager has been to build staff’s understanding of data gradually in order to eventually utilise it to support teaching and learning. We have eased our staff gradually into SISRA Analytics through providing bespoke training and their enthusiasm for it has taken even me by surprise, with everyone now using it regularly.
Data is no longer the mysterious, threatening thing some staff felt it was; rather, staff can finally see that it is a vital tool to support their work in the classroom. SISRA Analytics has been the key to this!

David Meller, SIMS and Data Manager, Alsager School

EMMA MALTBY

After a decline in her school’s results, Emma’s eye for data resulted in the school creating a new post of Curriculum and Data Manager for her, which included setting up and managing the data and assessment systems for Years 7 to 13. She quickly recognised that SISRA Analytics would be an invaluable tool for staff, became an expert in the use of SIMS, and successfully completed the SSAT National Data Managers Award.

Emma encouraged and trained staff in the effective use of both SIMS and SISRA Analytics, not just at her own school, but within the large multi academy trust when her school’s model was rolled out to other schools. She promoted the use of data and also wrote her own school’s target setting policies. Even Ofsted recognised that ‘The availability and use of school information has improved dramatically’.

Before joining SISRA, Emma was already closely involved with SISRA, hosting DataMeets and acting as a guru at training events. Her love of helping others in the use of systems and data led to her joining the Consultancy Team at SISRA in July 2016.

“Emma was very patient with me and answered any questions I had – though there weren’t many as she had pre-empted a lot of what I was going to ask. Emma is a credit to SISRA – as are all the consultants!”
Data Manager, Cambridgeshire School

“Emma was absolutely fantastic. I can’t fault her in any way. Her depth of knowledge and ability to go above and beyond are a real credit to your company.”
Assistant Head Teacher, Nottingham School

“Emma was very knowledgeable about all aspects of SISRA as well as SIMS. The training was pitched at just the right level for all 3 of the staff involved.”
Administration Team Leader, Derby School

“It was a very useful day indeed, and Emma coped admirably with the difficulties presented by our MIS malfunctioning.”
Deputy Head Teacher, Hackney School

“Emma was very professional and was able to provide information and training on the systems. I learned a lot and with her guidance was able to apply that knowledge on the day.”
Head of ICT, Nottingham School

KATE MOON

Kate’s previous roles have given her seven years’ experience of working in data management. Before working in education, Kate supported and trained clinical and non-clinical healthcare staff using information systems across Liverpool Community Services. She would travel between different healthcare centres, hospitals and prisons ensuring the accuracy and consistency of the data being recorded. In 2010, Kate moved into a role in education as an Exams Officer and Data Manager where she gained further experience of managing data and multiple spreadsheets before becoming a SISRA user herself. She was solely responsible for streamlining the tracking processes within her school. She improved the quality of the data recorded in school by facilitating behavioural change and encouraging and training staff to use both SIMS and SISRA. Kate has experience of working closely with Senior Leadership Teams and Heads of Department and understands the effectiveness of data within education.
Kate has recently completed the Ordinary Certificate in Statistics with the Royal Statistical Society.

“We are very grateful for the speedy response to our request which was at short notice. Kate quickly understood our needs and spent the day with my colleague completing the required data analysis and training. We are delighted with the analysis she was able to produce at the end of the day.”
Data Manager, Wirral School

“Kate was fantastic when helping me set up SISRA from scratch. She was a great help in areas where our School structure on SIMS did not match up with what needed to be uploaded to SISRA. Kate gave me a great “foot up” to starting to use SISRA.”
Data Manager, Bradford School

“Having the training in house allowed us to use live data and made the process real. Kate was very clear in her delivery. We were able to ask questions throughout the day and Kate was able to answer any queries we had. The planning before and follow-up support meant that the course met our unique requirements.”
Senior Leader, Sefton School

“Kate’s background in school data clearly helped her to see potential problems from our point of view and add context to everything delivered, which made the training more appropriate for us and extremely valuable.”
Data Manager, Sheffield School

“I found Kate easy to work with and she made me feel comfortable during the training.”
Data Manager, Stockport School

“Kate was fantastic. Great sense of humour, very knowledgeable and we made great progress by setting up and creating all the assessment for Years 10 & 11. What is even better is that afterwards, without help, I was able to complete Years 7-9.”
Assistant Head Teacher, Cumbria School

MATT O’BRIEN

Matt’s SISRA adventure started in 2008 when his school signed up to SISRA Online. In a dual role as a Data Manager and Assistant Exams Officer, Matt used SISRA Online for nearly 4 years alongside SIMS before moving to join SISRA in January 2012. Utilising his SISRA Online knowledge, Matt joined as the first of a team of consultants responsible for training users both within schools and also at SISRA training days. Experienced in various modules of SIMS, including Assessment, Exams Organiser and Course Manager, Matt can assist schools with both their SIMS setup and SISRA Analytics.

“Superb as always and very flexible. Matt always goes the extra mile and his support is very much appreciated by everyone. A real pleasure to work with.”
Assistant Principal, Manchester School

“Matt’s level of expertise and knowledge was fantastic. What really stood out beyond even that was his commitment towards us for the day and addressing the work that we needed to do. This commitment went far above and beyond what we could have expected. All this combined with his good humour and approachable manner made for an absolutely fantastic service.”
Data Manager, Cheshire West and Chester School

“Matt is a pleasure to have supporting the school. Both knowledgeable and able to adapt quickly to changes on the day! Thank you for all your help and support.”
Assistant Head Teacher, Trafford School

“Matt was excellent. With his extensive knowledge within SIMS he was an excellent help! We are now fully up and running with SISRA Analytics which wouldn’t have been possible without him. Thank you!”
Deputy Head Teacher, Cumbria School

“The service provided by Matt was superb. He was extremely knowledgeable and patient. Nothing was too much trouble and he gave everyone involved with data the confidence to utilise new systems.”
Assistant Principal, Manchester School

“Matt was very patient and explained some very technical and complex matters in a very simple and accessible way. As a team, we were really happy with the quality of the training and thought the day was very worthwhile.”
Assistant Head Teacher, Liverpool School

HELEN CONLEY

Helen joined SISRA in 2014, having worked in schools for over ten years with students aged 11 to 19, most recently in her role as an Examinations Officer and Data Manager. She is an expert in SIMS.net, with a wide knowledge of all modules, ranging from attendance and behaviour management to assessment and examinations. Helen works closely with Senior and Middle Leaders using SISRA to gauge the individual needs of the school and its students and to raise achievement. Helen has received high praise from Ofsted for the systems she has put in place. Outside work, Helen is a qualified football referee and regularly officiates for contributory leagues, such as the Evo-Stik Premier Division and the Conference Premier, the fifth tier of English football.

“Helen was prompt, efficient and very personable, with an effective style of delivery. Communication before and after the session was excellent.”
Assistant Head Teacher, East Riding of Yorkshire School

“Our Assistant Head and I had different agendas to meet on the day but Helen expertly switched between both. She was extremely knowledgeable and supportive on the day and has continued to support us since. She was open to discussions and debates and she happily took any of our queries or ideas back to SISRA for further discussion. I would highly recommend her.”
Data Manager, Northumberland School

“The training session was great; there was no need for improvement. Helen was great, very knowledgeable and patient. She answered all our questions!”
Assessment Officer, Bradford School

“Helen was incredibly knowledgeable, pleasant and delivered the training with considerable expertise.”
Executive Head Teacher, South Tyneside School

“Helen gave clear guidance in answer to all queries raised. The information she had prepared and delivered was exactly in tune with the needs of the staff. We found the session very useful and now feel more confident in using SISRA going forwards.”
Data Manager, Cumbria School

“I had no knowledge of SISRA. Helen explained very clearly in an email (in advance) what was needed for the day, so I had everything in hand during the training. She was an excellent trainer, and with only that training I have been able to set up the whole system on my own.”
Information System Manager, Stockton-on-Tees School

CLAIRE SPENCER

Claire began working as a Data Manager in a large, failing secondary school in 2005. During her 10 years with the school, she was responsible for setting up and managing all data systems, and quickly became an expert in many modules of SIMS.

Claire introduced SISRA to the school in 2011 and was responsible for training staff at all levels in the effective use of data, SIMS and SISRA Online and Analytics. The school was judged Outstanding in all five categories by Ofsted in November 2013. The inspection report states that the school’s systems to ensure the rigorous collection, analysis and use of student performance data were ‘exceptional’.

In October 2011, Claire was part of the first cohort nationally to gain the SSAT National Data Managers Award.

Claire joined the Consultancy Team at SISRA in 2015, supporting different schools across the Midlands and was invited to line-manage the team in July 2017.

Claire is also a Safeguarding Governor of a local secondary school.

“Claire, as always, was fantastic. She spotted areas where our knowledge of results day was lacking and ensured all the holes were plugged. SISRA support is brilliant, and her offer to look over our setup after a few weeks was typical of an organisation that really cares for its clients.”
Senior Leader, Ealing School

“To have Claire come in to help us set up was invaluable. Claire is incredibly knowledgeable with SIMS and our existing systems as well as some of the more ‘school specific’ problems we encounter with timetabling/codes etc. Claire has great interpersonal skills and is a real credit to your company. Very impressed and even more grateful for her support.”
Assistant Head Teacher, Essex School

“Claire’s help has been invaluable in ensuring the school’s data needs are met. Her knowledge, flexibility and passion for data have enabled our school to cover significant distance in a short space of time, ensuring all key stakeholders are engaged in the school’s data management strategy.”
Assistant Principal, Nottinghamshire School

“We were over the moon with the training Claire provided; she was very friendly and nothing but helpful. Since the training, Claire has continued to support us by checking our system to ensure we are doing it correctly, and thanks to her we are now feeling very confident in using SISRA on this coming results day.”
Data Administrator, Lewisham School

“Claire’s help was outstanding and thoroughly appreciated by the admin, support, teaching and leadership bodies. Her presentation and engagement with the staff was excellent and consistent. She ensured that staff at all levels, using SISRA in all their different ways, understood its power, how to use it, and how it really benefits them in their roles. A thoroughly valuable day for our school.”
Data Manager, Nottinghamshire School

“Claire is very knowledgeable in SIMS which really helped us get through the day. She knew how to tackle every discrepancy involved in set up and no issue was unsolvable. She was informative and really cared about ensuring we made massive progress throughout the day.”
Senior Leader, Ealing School

Too much of a good thing – when data stops being useful

I am very aware that, in our blogs, we are busy giving you lots of ideas about how you can analyse, evaluate and use your data in school. However, possibly the most important point is that we are not suggesting that you do it all, and we are not suggesting you do it all the time.

In March, GL Assessment published a report called ‘Smart Data’, which, amongst other things, looked at the attitudes of teachers to data.

Greg Watson, Chief Executive of GL Assessment, said the findings showed that teachers accepted that data was essential in the classroom but that far too much of it was superfluous or poorly applied. “Too much data is about control not improvement, too much of it is misused and far too much of it is pointless. As a consequence, an awful lot of the benefits of assessment are lost,” he said.

One of the key points of the report is: don’t over assess. Too much assessment adds to the teacher’s workload and takes time away from teaching but won’t produce any added insight.

It is very easy, in school, to assume that more data is automatically a good idea. We recently worked with a school that updates their assessment grades in Analytics on a weekly basis. Admittedly this means that the data is always up to date, but what is it actually used for? The Heads of Department don’t have time to analyse the data this frequently, or to use it to inform intervention or adjust activity in the classroom. Simply having up-to-date data doesn’t benefit anybody; it just adds to the workload of the teachers without any benefit to the students.

The recent study of workload commissioned by the government, ‘Eliminating unnecessary workload associated with data management’ backs up the GL Assessment findings. The report voices concern that ‘too often, the collection of data becomes an end in itself, divorced from the core purpose of improving outcomes for pupils, increasing the workload of teachers and school leaders for little discernible benefit.’

It follows up its concerns with the advice that ‘government, school leaders, and teachers, rather than starting with what is possible in collecting data, should challenge themselves on what data will be useful and for what purpose, and then collect the minimum amount of data required to help them evaluate how they are doing.’

They finish with five overarching principles that they feel should be applied to data activities in school:

  • Be streamlined: eliminate duplication – ‘collect once, use many times’
  • Be ruthless: only collect what is needed to support outcomes for children.
  • The amount of data collected should be proportionate to its usefulness. Always ask why the data is needed.
  • Be prepared to stop activity: do not assume that collection or analysis must continue just because it always has.
  • Be aware of workload issues: consider not just how long it will take, but whether that time could be better spent on other tasks.

The point is really to collect only as much data as can actually be used to drive action and intervention that benefits the students. Never collect data just for the sake of it; on the whole, bear in mind that less is more.

There is one particular section in the report in which it recommends the use of electronic tracking and analysis packages, with a caveat that I wholeheartedly support: that the electronic package should be used to support the process of data management, not define it. This is why our new Life After Levels system in SISRA Analytics has been developed to be so flexible, and can be used by any school with any assessment system and grade type. As the Workload Review Group quite rightly say, the tail should not wag the dog.

Read the original reports by clicking on the links below:

Smart Data – Study by GL Assessment

Eliminating unnecessary workload associated with data management – Report of the Independent Teacher Workload Review Group

The dangers of calculating a ‘Subject Progress 8’ score

Here at SISRA, we are currently receiving a few enquiries about how to calculate a ‘Subject Progress 8’ score. I have some serious doubts about the validity of such a score, and if your school is using it, I would urge that it is treated with caution. We could calculate one, but does that mean that we should? It can be done for Maths, since Maths populates its own individual slots, but for the other baskets I’m not so sure.

For a start, let’s have a look at the ways that a subject-specific Attainment 8 estimate for the EBacc and Open baskets could be calculated:

  • Overall A8 divided by 10 (2 x English, 2 x Maths, 3 x EBacc and 3 x Open)
  • Element A8 divided by 3
  • Element A8 divided by national average number of slots filled

And now let’s have a look at an example. If we calculated the 2015 EBacc subject estimate by dividing the EBacc element estimate by 3, students with a KS2 level of 4b would be expected to attain a high D. According to the national transition matrices in the RAISE Online library, in 2015, 95% of Physics students nationally with a 4b attained at least a D, while only 68% of Computer Science students did.
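To see the inconsistency in numbers, here is a quick Python sketch of the three options above. Every figure in it is invented for illustration; none of these are DfE estimates.

  # Three candidate 'per subject' estimates - all numbers are invented.
  overall_a8_estimate = 48.2        # hypothetical overall A8 estimate for one KS2 band
  ebacc_element_estimate = 14.6     # hypothetical EBacc element estimate
  national_avg_slots_filled = 2.7   # hypothetical national average of EBacc slots filled

  option_1 = overall_a8_estimate / 10                            # overall divided by 10
  option_2 = ebacc_element_estimate / 3                          # element divided by 3
  option_3 = ebacc_element_estimate / national_avg_slots_filled  # element / average slots

  print(round(option_1, 2), round(option_2, 2), round(option_3, 2))  # 4.82 4.87 5.41

Three methods, three different expectations for the same students, quite apart from the subject-to-subject variance just described.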

We have the same issue with the Open basket. For simplicity I have again looked at 4b students gaining a D. The variance between the attainment of students with the same starting point in different qualifications means that the expectation of the English Language or Art teacher that their 4b students will attain a D is relatively realistic, but to expect the same of the Business or ICT teacher is not in any way fair. Calculating the estimate via either of the other options above doesn’t help either; the whole idea is inherently flawed.

This is compounded by the fact that we don’t actually know, on a national basis, how each qualification has contributed to the calculation of the overall or element ‘estimate’.

Imagine a student has a B in each of four EBacc subjects. According to the DfE, the three of those subjects that contribute to the basket are selected by ‘result number’, which is allocated at the time. The selection is basically arbitrary, so we can never know how any qualification contributes to the EBacc basket on a national level, again making it less than fair to apply equal expectations to the students and teachers of each one. In my capacity as a Data Manager in school, I was advised by my contact in the DfE School Performance Data Unit that, due to the way subjects are selected for each basket, it is not advisable to calculate subject-specific A8 scores.

Lastly, note how volatile the Attainment 8 estimates are. They currently change from year to year, to such an extent that any projections for the following year are highly unreliable.

The DfE has issued no guidance on how to calculate a subject Progress 8 score, because it is basically incalculable. If you wish to try, please ask the DfE to specify how it should be done. If they can provide a fair and realistic way of calculating it, we’ll do it!

Why your Progress 8 score has gone down

The DfE released the 2016 Attainment 8 estimates on Monday (27/9), and schools across the country recalculated their Progress 8 scores, previously calculated using the estimates from summer 2015. Most schools found that this made their Progress 8 scores go down, which was obviously disappointing.

What is even more disappointing, however, is the stories I am hearing from Data Managers who are being challenged about the accuracy of their initial Progress 8 calculations, and even being blamed for having ‘got it wrong all year’.

Actually, when you think about it, nobody’s Progress 8 score has ‘gone down’. The score could only be calculated once that year’s provisional “estimates” were released, so in fact on 27/9, when the 2016 provisional estimates were released, your school’s Progress 8 score was calculated for the first time. However, since all schools had been projecting the score using the 2015 estimates, the actual score was lower than some people were expecting.

Before 27/9 your Progress 8 figure calculation was most likely: 2016 exam results (or assessments if looking back through the year) compared with 2015 Attainment 8 estimates, so basically your 2016 cohort’s attainment compared with the attainment of the 2015 national cohort.

Today your Progress 8 figure calculation is: 2016 exam results compared with 2016 Attainment 8 estimates, so basically your 2016 cohort’s attainment compared with the attainment of the same, 2016, national cohort.
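To see the mechanics of what changed on 27/9, here is a minimal Python sketch. The A8 scores and estimates in it are invented; only the formula (per pupil, P8 = (actual A8 - estimated A8) / 10, averaged across the cohort) reflects how the measure is defined.

  # The exam results are fixed; only the estimates they are compared against change.
  pupils = [
      # (A8 from 2016 exam results, 2015 estimate, 2016 estimate) - invented figures
      (52.0, 48.0, 50.5),
      (41.0, 44.0, 46.0),
      (60.0, 55.0, 57.5),
  ]

  def cohort_p8(estimate_index):
      # Per pupil, P8 = (actual A8 - estimated A8) / 10; the cohort score is the mean.
      scores = [(p[0] - p[estimate_index]) / 10 for p in pupils]
      return sum(scores) / len(scores)

  print('projected against 2015 estimates:', round(cohort_p8(1), 2))   # 0.2
  print('calculated against 2016 estimates:', round(cohort_p8(2), 2))  # -0.03

The exam results never changed; only the yardstick did.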

Various educational bodies, including the DfE and ourselves, have been warning school leaders against relying on projected Progress 8 scores as a measure of expected student or cohort success, due to the expectation that the Attainment 8 estimates would change. Due to curriculum changes across the country, it was expected that at least the EBacc estimates would go up between 2015 and 2016, so the drop in P8 figure should not be a surprise to anyone.
Below are some charts that show the difference between the estimates based on 2015 national performance, and those recently released based on 2016 national performance:

From these charts we can clearly see that it is the EBacc element attainment that has gone up considerably, and therefore affected the overall estimates. This is due to schools changing their curricula to encourage more students, especially lower ability students, to take the EBacc, as is shown by the comparison between number of slots filled in 2015 and 2016.

One last word of warning. The 2016 estimates released on Monday were provisional. This means there may yet be changes to the national picture, and therefore the estimates, once the September checking exercise has been completed and re-marks have been taken into account.

Effectively tracking and evaluating your new Y11 (pt.2)

As discussed in part 1 of this article, ‘Data to avoid for your new Y11’, calculating useful headline figures for our new Y11 students seems to be a problem.
However, there are many ways we can analyse the data to ensure that we know our students are making good progress without the need to project headlines. If we look after the students and departments, the overall performance measures will look after themselves. We need to focus on the small data, rather than the big data.

Let’s start by looking at useful data we have for the non-reformed subjects that will give us a good indication of how our new Y11 students are, in fact, doing. Note that the illustrations are taken from SISRA Analytics, but you can do all these calculations using any analysis system, SIMS or Excel.

Grade banding and point scores

The percentage of students gaining A*-A, A*-C and A*-G for each unreformed qualification, and the overall average point score for each subject are really useful figures. With the exception of English and maths, our Y11s are doing unreformed GCSEs, so for these subjects these figures can be compared to previous cohorts’ achievement. Even if you didn’t opt into Progress 8 last year, you know from your shadow data in RAISE how your school did on the new performance measures, so you know whether you need to hold firm or raise the game for 2016.
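These figures are easy to calculate in any tool; here is a short Python sketch with invented grades, using the 2016-style point values (A* = 8 down to G = 1).

  # Banding percentages and average point score for one subject - grades invented.
  points = {'A*': 8, 'A': 7, 'B': 6, 'C': 5, 'D': 4, 'E': 3, 'F': 2, 'G': 1}
  grades = ['A*', 'B', 'C', 'C', 'D', 'A', 'C', 'E', 'B', 'U']

  def pct(band):
      return 100 * sum(g in band for g in grades) / len(grades)

  print('A*-A :', pct({'A*', 'A'}))            # 20.0
  print('A*-C :', pct({'A*', 'A', 'B', 'C'}))  # 70.0
  print('A*-G :', pct(set(points)))            # 90.0 (one U grade)
  print('APS  :', sum(points.get(g, 0) for g in grades) / len(grades))  # 4.9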

We can, and should, also break these figures down into our groups. Check how your Pupil Premium students are faring, or your students on the SEN register, and also break them down into high, middle and low ability students. From this we can get a good picture of where there might be issues.

Our department leaders can do the same thing for their classes, to really get to the nub of where there might be problems.
We can also look at how they compare to national achievement from last year; for example, 90.7% of our Art students are assessed as attaining a C or above, compared to 75% nationally in 2015.

We can then compare them to our in-school targets. The red box here shows that despite the fact that the % achieving C+ is above national, our Art department is still below target, because we are expecting 100%.

Transition matrices

The transition matrices that can be found on the RAISE website can also really help us here. If we compare our own in-school matrices to the national ones from last year, we can get a very detailed idea of how the progress of our current year group, broken down into KS2 sublevels, compares to that of last year’s national cohort. Again, for new Y11, for everything apart from English and maths, we are looking at unreformed GCSEs, so the comparison is valid.

This is also an easy way to check which KS2 level 4 students are not making 3 levels of progress, or which KS2 level 5 students are not making 4 or 5 levels, or to look for the rogue students who are not achieving as well as the rest of their department, class or group.
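If you want to build your own in-school matrix, a pandas crosstab does it in a couple of lines. The data below is invented; compare each row with the national matrix from RAISE.

  import pandas as pd

  # In-school transition matrix: KS2 sublevel against current assessment,
  # shown as a percentage of each KS2 row.
  df = pd.DataFrame({
      'ks2':   ['4b', '4a', '5c', '4b', '4a', '5b', '4b', '5c'],
      'grade': ['C',  'B',  'B',  'D',  'C',  'A',  'C',  'C'],
  })
  matrix = pd.crosstab(df['ks2'], df['grade'], normalize='index') * 100
  print(matrix.round(1))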

Progress 8 – EBacc and Open baskets

While the Maths, English and overall P8 scores cannot really be calculated with any accuracy (see previous article), we can look at the baskets other than English and Maths, since for these we can compare like with like, as long as we look at in-school scores using 2016 points compared with national estimates using 2016 points. We need to be careful, because changing entry patterns may have some effect on the EBacc basket estimates, though not so much on the Open. This is also not an exercise just for the performance tables: if a student has a negative score for the Open basket, they are not achieving the median score for this basket achieved by students with the same KS2 fine level last year, and they have the potential to do that.

Here we are looking at the latest predicted grades for current Year 10, Open basket scores. Overall, we are looking at a slightly negative progress score for the open basket. We need to investigate to find out where the problem is.

By looking at the student list, we can identify which students have negative scores, and investigate further. Remember that it might be worth comparing with 2014 estimates rather than 2015 for students with a KS2 of 5.5, due to the boycott effect. I would also suggest looking at who has an only slightly positive score as well, in case national achievement is inflated this year.

By looking at the grades in the Open basket for a student (basket 3), we can easily check in which subject the student needs support. Adam Ant is a 4a student but is only getting a D, which is 2 levels of progress (LOP), in English Language and Literature. He is assessed at D+ though, so he has a chance of pulling it up to a C with help. He also has a D+ for Spanish. The B at the bottom is not being included in the Open basket, because it is a short course. Perhaps Adam should have gone for full course RE!
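For anyone checking these by hand, here is a sketch of how the Open basket fills up. The grades are invented, the 2016 point values are used, and the eligibility rules are reduced to a simple flag, so treat it as an illustration rather than the full DfE methodology.

  # The best three eligible grades fill the three Open slots.
  points = {'A': 7, 'B': 6, 'C': 5, 'D': 4, 'E': 3}
  results = [
      # (qualification, grade, eligible for the Open basket?)
      ('English Literature', 'D', True),   # the unused English grade can count here
      ('Spanish',            'D', True),
      ('Art',                'C', True),
      ('Drama',              'E', True),
      ('RE short course',    'B', False),  # excluded, as in the example above
  ]
  eligible = sorted((points[g] for _, g, ok in results if ok), reverse=True)
  open_basket_score = sum(eligible[:3])
  print(open_basket_score)  # 13 - compare with the national estimate for this KS2 level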

We can also do this for the EBacc basket, but as I have said, we should really avoid projecting P8 scores for the English/Maths basket for the 2017 cohort. So what should we be looking at for English and Maths for this year group that will mean something to us?

Threshold in English and Maths

(previously known as the Basics)

This is not points related, so we can very easily establish who is and who isn’t projected to attain it. We can’t compare the overall figure to previous year groups’ achievement, but we can use it to identify which students need help to get it this year. Achieving a ‘good pass’ in English and Maths is not only of benefit to the school performance tables, but also to the student. And if they don’t achieve it and they want to carry on to sixth form they’re going to have to retake it, possibly more than once.

Here we can see that only 56.9% of my students are attaining it. Who are they? We can create a list of which students are on track to get a 5 in English but not maths, and give it to the Head of maths, and vice versa for English. One thing to remember is that students only need one entry in either Language or Literature for their grade to be counted, unlike the 5A*-C including EM measure.
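The cross-over lists themselves are trivial to produce; here is a Python sketch with invented names and grades.

  # Students on track for a 5 in one subject but not the other.
  on_track = {
      'Adam':  {'english': 5, 'maths': 4},
      'Beth':  {'english': 4, 'maths': 6},
      'Carla': {'english': 6, 'maths': 5},
  }
  eng_not_maths = [n for n, g in on_track.items() if g['english'] >= 5 > g['maths']]
  maths_not_eng = [n for n, g in on_track.items() if g['maths'] >= 5 > g['english']]
  print('For the Head of Maths:', eng_not_maths)    # ['Adam']
  print('For the Head of English:', maths_not_eng)  # ['Beth']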

EBacc

Let’s find out who is getting a good pass in the five EBacc subject areas, and who is not. Which subjects do they need in order to do so? In this case we can include English and maths, because the EBacc is grade based, not points based.

Here we can see that Chris Cornell is getting a ‘good pass’ (C or 5) in all subjects except English. So it is easy enough to pass that information on to the teacher of that subject. Since he is currently assessed at a 4, there is a good chance of him being able to move it up to a 5. Likewise Mariah Carey’s D+ in History or Geography.
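Here is the same per-student check as a quick sketch; the grades are invented, and the ‘good pass’ is treated as a 5 in English and maths and a C elsewhere, as above.

  def is_good_pass(area, grade):
      # English and maths are 9-1 (need a 5); the other areas are A*-G (need a C).
      if area in ('english', 'maths'):
          return grade >= 5
      return grade in ('A*', 'A', 'B', 'C')

  students = {
      'Chris Cornell': {'english': 4, 'maths': 5, 'science': 'B',
                        'humanities': 'C', 'language': 'C'},
  }
  for name, grades in students.items():
      missing = [area for area, g in grades.items() if not is_good_pass(area, g)]
      print(name, '->', missing or 'on track for the EBacc')
  # Chris Cornell -> ['english']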

Let’s get started!

So, as you can see, there are plenty of ways to make sure you know whether your Y11 students are making good progress. Just focus on the small data. Look after those pennies (students) and the pounds (headline figures) will look after themselves!

Data to avoid for your new Y11 (pt.1)

To project or not to project, that is the question… Are you worried about how best to evaluate your 2017 cohort? You may be finding that your headline measures don’t seem to be comparable to previous years, and that Progress 8 scores and EBacc pass percentages seem to be awfully low.

What is the problem? The issue, as you are no doubt aware, lies in the mismatch between the 9-1 and A*-G grades, and between the 2016 and 2017 points, which makes the calculation and use of some of the new performance measures, though not all, quite problematic.

Let’s look at the measures one by one. I’m going to start with the EBacc and work up.

EBacc

At a headline level for Year 10, any projected % achieving the EBacc is going to be accurate, since it is easy to calculate who has got a 5 in English and maths, and a C in the other relevant subjects. However, it’s not going to be that useful. It isn’t comparable to previous years, since in order to achieve it, your students need to attain a 5 in English and Maths, whereas previously they would have had to attain a C.

As we can see from the Ofqual postcard, only one third of those who would have attained a C will get the new 5, so you can expect your EBacc attainment to drop.

Threshold in English & maths

The next measure, the % achieving the threshold in English and maths, is also an easy measure to project. Again though, it’s not comparable to previous year groups due to the difference between a C and a 5, so the jury is still out as to how you work out whether your percentage for Year 10 is a good one or not!

Attainment 8

Your Attainment 8 figures can also be calculated quite easily, by allocating the new 2017 points to the A*-G grades for your unreformed qualifications. Once again, the resulting scores are not comparable to previous years, so be careful how you use them.
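As a sketch, here is that calculation with invented grades. The point values are the 2017 mapping as I understand it (legacy A* = 8.5 down to G = 1, with 9-1 grades scoring their face value); do check the DfE’s technical guidance before relying on them.

  # A8 on 2017 points: 9-1 grades score face value, legacy grades are mapped.
  points_2017 = {'A*': 8.5, 'A': 7, 'B': 5.5, 'C': 4,
                 'D': 3, 'E': 2, 'F': 1.5, 'G': 1}

  def pts(grade):
      return grade if isinstance(grade, (int, float)) else points_2017[grade]

  slots = [(5, 2),                        # English (9-1), double weighted
           (4, 2),                        # maths (9-1), double weighted
           ('B', 1), ('C', 1), ('C', 1),  # EBacc slots (unreformed)
           ('A', 1), ('C', 1), ('D', 1)]  # Open slots (unreformed)

  print(sum(pts(g) * w for g, w in slots))  # 45.5 for this invented pupil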

Of course the Attainment 8 scores are also used to calculate Progress 8, and it’s here that things get really tricky.

Progress 8

This is because the only national estimates we have are based on 2016 exam results, and therefore are calculated using 2016 points. A comparison of in-school figures using 2017 points with national figures using 2016 points is not going to produce Progress 8 figures that mean anything. I have heard data managers and members of SLT say ‘but even if they are not accurate, aren’t they better than nothing?’ I personally think that it’s the other way round: ‘nothing’ is better than data that is at best misleading and at worst downright wrong. The change in the points, as you can see from the graphic, is not even linear, so there is no simple conversion between the two scales.

An alternative option would seem to be to stick with 2016 points across the board. Surely we could compare in-school Attainment 8 based on 2016 points with national estimates based on 2016 points, and it will at least give us an idea of where we stand?

Sadly not. We can’t actually choose to stick with 2016 points in school, due to the 9-1 grades in English and maths. The 8 points a student will be allocated for a grade 8, 7 points for a grade 7 etc. are the 2017 points, and therefore don’t equate directly to the available English/maths estimates, which are based on 2016 points. The easiest way to demonstrate this is by way of an example. Let’s have a look at the English basket, and how the change from A*-G to 9-1 would affect Attainment and Progress 8 figures for similar students.

One of last year’s Y11, Bobby Smith, arrived in Year 7 with a KS2 English/maths average of 4.4, giving him an English basket estimate of 9.88. He is predicted a low to middling grade C in English language or literature (it doesn’t matter which, as they’re now equally weighted), giving him an English basket score of 10, i.e. 5 doubled. This gives him an English Progress 8 score of +0.06, just slightly positive.

However, a Y11 student this year with the same KS2 level and the same projected English achievement (a low to middling C, which will actually give him a 4 in 2017) will end up with an English basket score of 8. If we compare this to the 2015 estimate, he gets a pretty hefty negative English basket progress score, which is clearly wrong. This will in turn affect the overall P8 calculation.
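The arithmetic behind these two pupils, using the figures from the example, is worth setting out:

  estimate = 9.88   # English basket estimate for a KS2 fine level of 4.4

  # Bobby, last year: a C on 2016 points (C = 5), double weighted.
  print(round((5 * 2 - estimate) / 2, 2))   # +0.06 per slot - slightly positive

  # The same standard of work this year: a grade 4 on 2017 points.
  print(round((4 * 2 - estimate) / 2, 2))   # -0.94 - a hefty, and spurious, negative

Same student profile, same standard of work, wildly different ‘progress’, purely because the points changed underneath the estimate.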

What would make all our lives a lot easier would be a set of ‘shadow’ estimates based on the 2016 national achievement, but calculated using the 2017 points, to enable projection of P8 scores for current Y11. However, the DfE have said that they are not going to produce any ‘shadow’ estimates, because they actually don’t WANT us to project our scores for the new measures.

Workarounds

Now, if I am completely honest, there are a couple of ways in which you could get around this. Firstly, you could create your own set of national estimates based on 2017 points, using a formula to adjust each estimate. Then you could compare your in-school 2017 Attainment 8 scores to your fake 2017 national estimates to get Progress 8 scores based on 2017 points. Secondly, you could take your 9-1 grades, and adjust the points allocated to them, to give them equivalency to the 2016 points allocated to the A*-G grades. You would then have a set of in-school 2016-style Attainment 8 scores to compare to the actual 2016-style national estimates. Due to enormous pressure from our customers, SISRA are actually looking into the least misleading way of doing it.
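To be clear about what option 2 involves, here is a sketch. The re-pointing map below is my own crude invention for illustration, and choosing those values is exactly the unreliable step warned about in the next paragraph.

  # Workaround option 2: re-point 9-1 English/maths grades onto the 2016-style
  # scale so they can be compared with the 2016-style national estimates.
  # Nobody knows the 'right' values for this mapping - that is the problem.
  crude_9_to_2016_points = {9: 8, 8: 8, 7: 7, 6: 6, 5: 5.5,
                            4: 5, 3: 4, 2: 3, 1: 2}

  english_grade = 4                                          # a low-to-middling C equivalent
  basket_score = crude_9_to_2016_points[english_grade] * 2   # double weighted
  print(basket_score)  # 10 - now roughly comparable with a 2016-style estimate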

However, I have concerns about using any kind of workaround. Both these options, and any others that could be available, carry the risk of encouraging you, as school leaders, to base decisions on unreliable data, because it is calculated using figures that are not real. Just because you can do it doesn’t mean you should. Doing this could lead to some major surprises come summer 2017. If you really have to use a workaround, I would use option 2, since it is almost impossible to know by how much to adjust the national estimates.

So what can you do?

For our new Y11 students calculating useful headline figures seems to be a problem. In some ways, I can’t help feeling that this should be liberating for school leaders. Instead of constantly worrying about the headline figures, the focus can shift to the departments, the classes and the individual students. There are many ways we can analyse the data to ensure that we know our students are making good progress. We are going to have to adopt the idea that, in the same way that if we look after the pennies the pounds will look after themselves, if we look after the students and departments, the overall performance measures will look after themselves. We need to focus on the small data, rather than the big data.

For details of how, see part two of this blog, ‘Effectively tracking and evaluating your new Y11’.

Using data effectively to prepare for an Ofsted inspection

The word ‘Ofsted’ can strike fear into most and leave even the strongest of people a quivering wreck. If this is you, then you’re not alone! Between them, the consultants at SISRA have been involved in nearly 20 Ofsted inspections, including HMI monitoring visits, section 8 inspections and no notice inspections. We understand how terrifying a visit from Ofsted can be, and with this blog we aim to help you prepare and use your in-school data effectively, so that an inspection is not a dark cloud looming over you.

What are Ofsted expecting?

Before any inspection, always ensure that you have read the latest Ofsted Inspection Handbook as well as the Ofsted myths that are available from the same link (surprisingly, there are quite a few myths!). Don’t let an Inspector force you into giving them data and figures that you believe could be inaccurate; stick with the facts that you know are accurate. Ofsted do not expect performance or pupil-tracking information to be presented in a particular format. The information should simply be provided to the Inspector in the format that the school would ordinarily use to monitor the progress of its pupils. Neither do Ofsted require schools to undertake additional work specifically for the inspection, which is why they normally contact the school by telephone during the afternoon of the working day before the inspection. On some occasions, Ofsted may conduct inspections without notice, although when this happens, the lead inspector will normally telephone the school 15 minutes before arriving on site.

There are too many paragraphs within the Inspection Handbook to comment on all of them, but what we believe is important to note is that inspectors will evaluate evidence relating to the achievement of specific groups of pupils and individuals, including disadvantaged pupils, the most able pupils and pupils who have special educational needs and/or disabilities.

Disadvantaged Students

With regard to disadvantaged students, they state that they will gather evidence about the use of the pupil premium in relation to –

‘Any difference made to the learning and progress of disadvantaged pupils as shown by outcomes data and inspection evidence’

I will be using SISRA Analytics to demonstrate how we can prepare for this, but it can of course be done using Excel or within your MIS. For example, within SIMS Assessment Manager, an extra student information column can be added to a marksheet by right-clicking on the student name column and ticking Pupil Premium Indicator. Alternatively, the marksheet can be filtered using the Group Filter icon to show just Pupil Premium students or non-Pupil Premium students.
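Outside SIMS, the same split is a one-line filter; here is a sketch in Python with invented data.

  import pandas as pd

  pupils = pd.DataFrame({
      'name':          ['Pupil A', 'Pupil B', 'Pupil C'],
      'pupil_premium': [True, False, True],
      'grade':         ['C', 'B', 'D'],
  })
  print(pupils[pupils['pupil_premium']])    # Pupil Premium students only
  print(pupils[~pupils['pupil_premium']])   # non-Pupil Premium students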

In SISRA Analytics, if we take a look at our Y11 Spring assessment (fig 01) we can see at a glance what KS2 baseline our disadvantaged students entered our school with compared to non-disadvantaged and National.

fig 01

If we now look at the Attainment 8 score in the Year 11 Spring term (fig 02), our disadvantaged students are falling well below National, whereas our non-disadvantaged students are achieving well.

fig 02

But if we look at our SISRA Analytics tracker report (fig 03) and export it to Excel (fig 04), we can apply a simple formula showing progress between each assessment point, and also progress from the Y10 Autumn assessment. We can see that progress for non-disadvantaged students is 6.18 and for disadvantaged students it is 6.17, so progress is actually very similar. So we need to be asking: what happened in Years 7 to 9? If we had a flightpath model or Expected Attainment Pathway model tracking progress from Year 7 through to Year 11, we would be able to closely monitor any dips in progress.

fig 03

fig 04
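The exported-tracker calculation is just as simple outside Excel; here is a sketch with invented figures chosen to echo the numbers above.

  import pandas as pd

  # A8 progress between two assessment points, split by disadvantaged status.
  df = pd.DataFrame({
      'disadvantaged': [True, True, False, False, False],
      'a8_y10_autumn': [38.0, 42.5, 47.0, 51.0, 44.0],
      'a8_y11_spring': [44.0, 48.84, 53.2, 57.1, 50.24],
  })
  df['progress'] = df['a8_y11_spring'] - df['a8_y10_autumn']
  print(df.groupby('disadvantaged')['progress'].mean().round(2))
  # False 6.18, True 6.17 - very similar, as in the text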

Most Able

The same applies for Most Able. In fig 05 below, we are looking at the Y11 Spring assessments for our Upper/High ability students. We can see that 32.9% are achieving an A/A* compared with 17.1% nationally. Also, our average points are significantly higher than national average points.

fig 05

Pupils with Special Educational Needs and/or disabilities.

It is important to be able to determine how SEND pupils are doing academically. Again, this can be done via Excel or within your MIS, but it can also be filtered very easily within SISRA Analytics. In fig 06 below, we can see at a glance how students with different special educational needs are doing compared with National.

Fig 06 shows how our SEND pupils are doing in relation to the EBacc.

fig 06

Fig 07 shows the same pupils in relation to the Basics measure.

fig 07

What NOT to provide to an inspector

Finally, it is worth noting that on no occasion should you be asked to predict a Progress 8 score for your cohort of students.

In fact, RAISEOnline state within their FAQ that

‘it is not possible in advance to estimate what the actual Progress 8 scores will be as they are based on the national average results of the same cohort.’

They go on to say

‘trying to predict Progress 8 scores based on previous years’ Attainment 8 estimates is likely to be time consuming and inaccurate and is not something that the Department wants to incentivise’

In March 2016 Charlotte Harling, our Principal Consultant, helped out at a school during their inspection. The school was previously judged as ‘Requires Improvement’. They are now judged ‘Outstanding’. A quote from their report says

‘procedures to monitor pupils’ progress are first class. Leaders accurately identify pupils who are not performing at their best and swiftly intervene to secure improvements’

During the inspection, Charlotte was asked what the school were predicting for 5A*-C(EM) for the current Year 10. Charlotte explained that it was not a measure that they were looking at and would not be able to give a figure for this. This answer was accepted by the Inspector.

She was also asked to predict a Progress 8 score for the Year 10; again she explained that this was not possible, but she did give them a threshold figure for English and maths.

A confident approach

These are just a few things to consider when preparing for an Ofsted inspection. An Ofsted inspection should not be scary if you’re prepared. Of course nerves are natural but are those butterflies in your stomach really just adrenalin?

If you have any comments or have recently been through an Ofsted inspection and have anything that would be useful to share with other schools, we’d love to hear your thoughts.

Finally, if you’re expecting Ofsted any time soon, good luck!