
Life at SISRA HQ with Bex, Support Manager

Our support team answer your queries on a daily basis, so we thought you might be interested to hear what they have to say about life at SISRA HQ! We’re starting off with Bex Heenan, Support Manager, who has been working at SISRA for over 5 years.


Q: What do you love most about your role as the Support Manager at SISRA?

A: All of you! I would definitely say helping and getting to know the SISRA Admins at our schools. It’s really rewarding when we can help someone find what they’re looking for or better yet, speed up their data analysis. The lovely comments you give us really make our day!

Q: Do you have any favourite moments from speaking to users over the years?

A: There are so many! We get to know our data managers really well, which means some of our conversations can go a bit off topic: I’ve had chats where we pretended to be pirates, discussed The Great British Bake Off and even shared some song lyrics! Not to mention, we get to see how inventive our SISRA Admins are when they change their name in the service; we’ve had Edward SISRAhands, Donald Duck, SISRA Lord and Q Branch, to name a few.

I’ve also had a few touching chats with data managers retiring over the years; after speaking to them for so long, it’s quite difficult! That’s definitely stuck with me.

Q: What is the greatest challenge you face in your day-to-day role?

A: Problem solving and trying to find different ways to achieve what our schools are looking for. We often get complex queries on chat and we always want to give the best possible outcome or solution. So I would say thinking outside the box – but it’s also the most enjoyable part of the job. Feel free to ask us anything!

Q: Are there any exciting plans for the year ahead?

A: As you probably know, this summer we launched our Data Collaboration programme and were able to produce accurate Attainment 8 estimates long before the DfE, and we’re really looking forward to building on this success. We’ve also just released the re-design of Analytics and the KS3/4 Reports, and are planning on improving the usability and friendliness of the reports still further.

Support have started working on a new project to introduce bitesize videos. They’re expected to be super short (1-2 minutes) and give quick overviews of where to find snippets of information in the KS3/4 Reports.

Observe has also been redesigned, with new features being introduced every month. We’re currently working on a new approach to our Observe user guides, which involves releasing shorter, to-the-point documents. We’ve also re-organised Help > Guides & Handouts to make it easier to navigate and to use our new resources.

Q: Tell us something unusual about you?

A: I’d have to say (and I think my team would agree) that the most unusual thing is that I will dip anything in tea (from chocolate to biscuits and even ham!).

by Bex Heenan, Support Manager

The Science of Science

Confused about Science? There’s a strong chance you are not alone! I’ve had many discussions with schools recently in relation to the reformed Science qualifications, which affect all current cohorts.

What’s changed?

As part of the Government’s introduction of tougher, more challenging linear GCSEs, new specifications for Science came into effect in September 2016. As the content is dictated by the Government there will, to a certain extent, be a lot of similarity between exam boards and indeed between specifications from the same board.
Core and Additional Science are no more – farewell! They have been replaced by a single Combined Science qualification worth two GCSEs (a double award). Separate sciences continue to be an option as always, now with 9-1 grades.

It doesn’t help that the most common course for the new Combined Science qualification is AQA’s ‘Trilogy’ (QN 601/8758/X). As Trilogy means ‘a group of three things’, you could easily be forgiven for thinking this comprises three GCSEs – it covers three sciences, but is worth two. AQA also offers ‘Synergy’ (QN 601/8760/8), OCR offers ‘Gateway A’ (601/8687/2) and ‘21st Century B’ (601/8690/2), whilst Edexcel keeps it simple (hooray!) with ‘Combined Science’ (601/8765/7)!
Further details for each exam board can be found here:

But what happens to the grading system?

As Combined Science is now a double award, a 17-point grading system has been introduced. These grades will never be more than a grade apart and (we understand) the first grade will always be the higher of the two – so a student can be awarded a 4-3, but never a 3-4 or a 5-3.


Grades of 1-1 to 5-5 can be awarded in foundation tier, with higher tier offering grades from 4-4 to 9-9. A fail grade would simply be a U but worth two entries.
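As a quick illustration (a hypothetical sketch, not anything from SISRA or the DfE), those two rules are enough to generate the whole scale:

```python
# Generate the 17 possible grades on the 9-1 double award scale.
# Rules: the two grades are never more than one apart, and the
# first grade is always the higher of the two.
double_grades = []
for high in range(9, 0, -1):          # higher grade: 9 down to 1
    for low in (high, high - 1):      # equal to, or one below, the higher grade
        if low >= 1:
            double_grades.append(f"{high}-{low}")

print(double_grades)       # ['9-9', '9-8', '8-8', '8-7', ..., '2-1', '1-1']
print(len(double_grades))  # 17
```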
I haven’t been able to find an official explanation of the grading system, other than this document:

Exam boards, such as AQA, also offer (limited) information on their websites.

What does this mean for the headline figures?

Combined Science can take up to two slots in either the EBacc or Open elements of Progress 8 where this represents the highest relevant grades achieved. One grade from this qualification can fill one slot if higher grades are achieved in other qualifications.

However, the points awarded to Combined Science are averaged – because of this it is crucial that you set this qualification up correctly in any data analysis tool. For example, a 6-5 would be averaged to two lots of 5.5 points to fill either one or two slots as appropriate (a maximum of 11 points across two slots). Let’s take a look at how this will work for a current Year 11 student:

Drama does not contribute towards the student’s P8 score
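For illustration, here is a minimal sketch of the averaging arithmetic described above – hypothetical code, not SISRA’s implementation:

```python
def combined_science_points(double_grade: str) -> float:
    """Average the two grades of a 9-1 double award, e.g. '6-5' -> 5.5."""
    high, low = (int(g) for g in double_grade.split("-"))
    return (high + low) / 2

# A grade of 6-5 is worth 5.5 points in each Progress 8 slot it fills,
# up to a maximum of 11 points across two slots.
per_slot = combined_science_points("6-5")
print(per_slot, per_slot * 2)  # 5.5 11.0

# A single grade uploaded in Year 10 is treated as doubled (5 -> 5-5),
# so the points per slot are simply the grade itself.
print(combined_science_points("5-5"))  # 5.0
```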

How do I ensure my set up is correct in SISRA Analytics?

Firstly, you will need to set up a 9-1 Double grade method in CONFIG > KS3/4 Grade Methods. Here’s an example (sub grades are optional):

A top tip when naming your grade methods is to look at the order in which they appear in the Config area – their names and order appear in the reports, so do use appropriate grade method names (BTEC rather than D*DMP) and think about the order in which you would like them to appear (most common at the top, for example).
Secondly, when setting up an EAP, ensure that the Combined Science EAP uses the 9-1 Double grade method:

The EAP determines which grade method Analytics will use when calculating your grades data – this is equally important for all other subjects! Another top tip for naming your EAPs is to include references to their baselines and end points, e.g. ‘Avg KS2 to 9-1 Double’ if using KS2 as baselines:

Or ‘Combined Science 9-1 Double’ if using FFT estimates:

Including the grade method in either example has the advantage of not only helping you when cloning EAPs, but also ensuring the ‘KS4 MEASURE’ column on the matching page is accurate:

Finally, to the matching page! KS3/4 EAP allows you to correctly set up Combined Science as ONE qualification (but as a double award). You should have just one (green) row using an EAP that uses the 9-1 Double grade method. It will also need to be nominated as ‘Science’ in the Special column if it is an approved qualification that counts towards the Science specific measures, such as EBacc (all the QN codes above do).

If your Science staff only award single grades (common in Year 10), then these grades are effectively doubled by Analytics, e.g. if a single grade of 5 is uploaded, it will be treated as a 5-5 (providing the 9-1 Double grade method is used). This will save you time by not having to ‘double up’ the grades.

We do occasionally see Combined Science set up on two rows – as Combined Science 1 and Combined Science 2 for example.

Whilst this will not affect your Attainment 8 headlines and entries, it will have an effect on your Attainment 8 & Progress 8 elements – there’s an example further below.

How do I know my headlines are correct?

Once you’ve followed all of the above steps, take a look at your Reports. There are potentially three different set-ups for Combined Science. The first uses the correct set-up (one row for a double award qualification), the second has Combined Science set up as two different qualifications (like the old Core and Additional), and the third has it as one qualification using the 9-1 Single grade method.

This example school has a cohort of 210 and one student has a grade which could affect the Attainment 8/Progress 8 EBacc and Open elements. When looking at the EBacc and Open Attainment 8/Progress 8 elements below, we can see slight differences between the two set-ups. Only slight, but bear in mind this relates to just one student and will mean your figures do not match the DfE’s – imagine if the grades of half your cohort affected these figures!

We can see a huge difference for the EBacc Attainment 8/Progress 8 element when we look at the figures where the 9-1 Single grade method has been used – it’s now negative!

Your overall Attainment 8 and Progress 8 figures should not be affected unless you use the 9-1 Single grade method. Checking this table in the Headlines Summary Report will give you an indication of whether you are using the correct grade method:

We’ve seen how this affects our headlines, so let’s take a look at how this would affect an individual student. In this example using the Student Headlines Report, we can see this student’s Combined Science points (a total of 9) have been averaged, so 4.5 points fall into both the EBacc and Open baskets.

When set up as two separate qualifications, the lower Combined Science grade does not contribute to the Open basket:

Whilst in this example, his Attainment 8 score will remain the same, we can see here the difference it has made to his EBacc and Open Progress 8 scores:

What if my school tracks Biology, Chemistry and Physics grades for Combined Science?

This can still be done. There’s an example in the screenshot below. Just ensure that your Combined Science separate qualifications are set to Unapproved. This way, you will be able to analyse the data for the separate qualifications, classes, and students, without it affecting your headline figures.

For any schools still using Legacy mode, please note it does not cater for 9-1 double grades. Setting Combined Science up as two separate qualifications will ensure the number of entries is correct, however it will count the qualifications separately towards Attainment 8/Progress 8 calculations and the average points will not be used (just like in the example shown above).

Hopefully your own SISRA Analytics set up will be spot on, but it’s always worth checking! Our Live Support team can always assist any SISRA Administrators if help is needed.

by Emma Maltby, Data Consultant

Attainment 8 Estimates Data Collaboration

‘The goal is to turn data into information and information into an insight’
– Carly Fiorina

If you already use SISRA Analytics you will know what a valuable tool it can be for monitoring students’ progress, both in current tracking and in potential performance. The features to model different groups and compare with past data are also useful.

I’ve been an advocate of SISRA Analytics since I first encountered the service in 2012, and have introduced it in my last two schools as a tool for all members of staff.

Therefore, it was with great interest that my colleagues and I learned of SISRA’s collaboration plans (led by ASCL) to create their own Attainment 8 estimates. We signed up immediately, and were keen to encourage others to do the same. For a school in our position (with a low P8 score in 2016), this has been invaluable.

We have been able to demonstrate that, when compared with 1,177 other schools*, we are likely to have made substantial improvements on 2016. This gave us the confidence to review our own self-evaluation at an early stage. The shared data has been very close to the data within the checking exercise too (slightly conservative in comparison, but still closer than anything we’ve had before).

‘Without data you’re just another person with an opinion’
– W. Edwards Deming

That was true of us. We thought our progress had improved, but in this brave new world of 4s and 5s, we had no reassurance. Sharing our data with all these other schools gave credence to our opinion.

As a school we are committed to sharing our data and working with other schools where we can – this supports our value of aspiration (by sharing and learning with others we are supported to be the “best we can be”).

After the success of the 2017 opt-in I’m very much looking forward to taking part in the 2018 opt-in, which SISRA is suggesting may well stretch beyond A8/P8 data. I would strongly urge all schools to opt in once it becomes available. The bigger the data set, the more reliable the conclusions, and the more insight you will get into the progress of your students.

*as of October 2017

Graeme Smith, Principal, Derby Moor Community Sports College

How Ransomware can cost you your data

On Friday 15th April 2016 the school email systems, the internet and SIMS went down; we then progressively lost access to the servers throughout the day. IT confirmed early on that we had been hit by a virus and that they were working on clearing it. The virus could be identified by a name on the affected files, and all staff were advised to save unaffected files as soon as possible. As an aside, I mentioned to a colleague that so long as it wasn’t Ransomware we were fine. This proved to be prophetic in all the wrong ways. By the end of the day IT had confirmed that it was Ransomware and all the servers were affected and encrypted. The attack had come through the new BT server, which had passwords that were not as strong as the rest of the network, via a brute force assault – a password cracker run for a month against BT lines.

No chance of a quick recovery

On Monday, it was clear that there were significant recovery problems, as the server backups were saved onto the servers themselves and as such were compromised as well. By this point we were copying the paper standby fire registers to use for am/pm registration, and this situation was to continue for the next 6 weeks. All of the servers were unrecoverable and had to be rebuilt; Capita came in on contract to assist. One external backup from the previous year had been located and could be reinstalled, but it dated from early August 2015. As a consequence it was missing all data for the current academic year, including results data and the year’s timetable.

Over the next few weeks the servers were rebuilt, SIMS was reinstalled and the only backup uploaded. This gave us a draft timetable that was too unstable to amend, no attendance data, and none of the current SIMS calculations for the first year of the new GCSE grade scores. Fortunately, the student and staff drives and the work in them could be recovered to a degree. The Local Authority was initially unable to return the attendance data, and we ended up having to pay for them to reverse engineer the upload. This did not come back until the start of the next academic year. The exams data could be reimported, and as we use SISRA Analytics, the collection data was secure on their website and could be recovered and dropped back into our systems.

The effect on exams and data

During those six weeks, life was not pleasant in the student office, as all our systems were down, and very little could be done to rebuild the missing SIMS calculations, input the growing pile of paper registers for all year groups, or check the behaviour logs. The exams officer had to recreate approximately 4,500 individual exam entries and special circumstances, as our records for the GCSE and A level exams no longer existed. Curriculum staff were informed SIMS was back up and running after about five weeks, though this statement omitted the level of damage to records and led to some acrimonious exchanges when we were asked to provide attendance or behaviour data.

The timetable was frankly shot, and could not be amended in any way for the rest of the academic year without causing its total collapse. Every calculation needed in SIMS had to be rebuilt, from creating grade sets onwards. All of that year’s data had to be reimported from SISRA Analytics, and at the same time the final data collection and reports for the year had to be collated and disseminated. Timetable migration and the results day preparation also had to be completed before the end of July.

The damage sustained continued to be felt into the next academic year, with errors being found in SIMS calculations and processes. Initial repairs had focused on those areas in use, and as new events rolled round more compromised systems were found. The class migration completed in July failed on 1st September, leading to all groups having to be recreated by hand from records.

What did we learn?

A number of lessons came from this process. External backups are essential, as are strong passwords and not opening any suspicious emails. Essential data, documents and reports should be saved and exported regularly. Likewise, class membership can be exported and saved, which provides an extra level of backup when the year dates roll over. Finally, cloud based or externally hosted systems like SISRA Analytics are fantastic, because you don’t lose all your data!

Nathan Page, Data Manager, The Ferrers School

Identifying Students Not On Track

My school started using SISRA Analytics approximately two years ago, initially just for external exam results. Once we saw the power of the analysis we began tracking internal assessment grades for all year groups. We also find being able to look at other datasets such as FFT estimates very useful.

The EAP area is very easy to set up and the reports allow us to easily identify students who are not on track to achieve their projected grades throughout all year groups. There is also the added bonus of being able to track progress throughout both Key Stage 3 and 4.

Our assessment policy has changed for the new academic year and I am reassured that I need to make very few changes in EAP. It really is very adaptable and flexible to schools’ needs.

We have enjoyed exploring the new functionality within the reports. The reports have improved the way we report to parents too, so they can quickly see whether students are on track to reach their end of year target.

SLT really like using the power of the Qualification Totals report which provides them with an instant overview of on track performance across all subjects. It’s really quick to drill into faculty, class or student data too.


Victoria Kirkwood, Data Manager, Spalding Grammar School

Using Data sets for ‘What if’ scenarios

Using SISRA Analytics for exams analysis? Check. Using it for assessments? Check. Using it to see how things might work out? Read on.

The benefits for your SLT, middle managers, and class teachers in using SISRA Analytics are well known. Hopefully they are well trained on using it to investigate and highlight under-performing groups on a regular basis. But I’d like to talk about another use for SISRA Analytics: the ‘what if’ scenarios.

In SISRA EAP, you can create up to five regular data sets in addition to ‘Exams’ and ‘Assessment’. And within ‘Assessment’, you can create as many data sets as you like (publishing up to six in a particular year). So at our schools, we often use one of these spare data sets to try things out, either with grades for all subjects, or for selected subjects. Some possible scenarios include:

  • A member of SLT is interested in how the FFT Benchmarks compare to the targets your students have.
  • The head of maths wants to analyse the results of a recent test to see if the Pupil Premium interventions have worked.
  • A set of ‘aspirational targets’ is suggested by a deputy head, who wants to see how the headlines would look before committing to using them.
  • A target review has taken place, and class teachers have made some suggested changes. Before they accept the changes, SLT want to see how things would be affected.
  • A set of nationally recognised assessments are undertaken – SLT want to see how a cohort has performed using CATS predictors, MidYIS assessments etc.
  • It can be done in the KS5 Legacy area of SISRA too, to see how ALPS/Alis/Level 3 transition estimates would stack up in the sixth form.

Y11 Reports Homepage with minimum and aspirational target datasets, and also FFT Estimates

Sometimes these data sets are analysed and compared for a short time, and so you can publish them as ‘Locked’ data sets to avoid confusion for other SISRA users. You can set up a ‘Locked’ authority group in the ‘USERS’ section of SISRA Analytics to give selected users access to these locked reports.

Aspirational targets published with ‘Locked’ status

Remember to review the use of these temporary ‘what if’ data sets regularly, as keeping the vast majority of your data sets available to all users is important for data transparency.

So next time you find yourself creating complicated SIMS marksheets or Excel spreadsheets to figure something out about a set of grades, consider popping them into SISRA Analytics, which can do all the heavy lifting for you. SISRA is not just for the regular data cycle; it is there for those one-off occasions too.

Matthew Begg, SIMS and Data Lead Development Manager, Education and Leadership Trust

Hiding behind your Data? A Governor’s perspective

Are you giving your governing body accurate data? Does it allow them to see the full picture? As a governor, I am often presented with a snapshot of the latest assessment, but many of us have full time jobs, and for most these are not within education. So, does a small snapshot of your current assessment give them enough information to challenge? That is, after all, the role of the governing body, isn’t it?

It’s increasingly frustrating to hear of schools using old measures to communicate their headline data, and of others bandying Progress 8 figures around as though they are accurate. It is difficult enough for school staff to understand that P8 is not going to be accurate until the validated results of the national cohort are released, so can you imagine how hard it is for a governor with no education background? I do believe that Progress 8 can have a place, though – it is just not suitable for whole school accountability. For example, it can be used to rank students and investigate areas of weakness for intervention, and also to look at the gaps between groups of students across time.

So, if we can’t share old measures or the new Progress 8 measure, what information can school staff give their governing body? I’ve been thinking about this a lot recently and thought I’d share a few of my ideas. The examples below are using data that is based on ‘Working At’ grades, but it would still be as powerful if you are using ‘Predictions’.

Threshold in English and maths

These charts offer a very visual picture of the school performance at this point. In this case, I’ve compared them to the school targets to give a better indication of how far they are from expectations.

The more data savvy governors will probably be champing at the bit for more detailed information – this example shows the key groups within a cohort and how each of these groups is performing against the basic measure. This would allow governors an opportunity to see the strengths and weaknesses within your school in this key headline measure.

Is this still enough data though? From this you would deduce that the one student from ‘White and any other background’ is a key concern and intervention is required. However, if you add the targets in you can see that this student has not been targeted to achieve the threshold measures – another conversation altogether!

Below are some other charts that give a good indication of the cohort performance:

You’ll notice that I included the Progress 8 chart, despite saying that it shouldn’t be used as a figure. Provided that the targets and assessments are both using the same A8 Estimates, this can be used to measure the gap from this assessment to the target, as long as little notice is taken of the actual number that is being produced! But that does lead me nicely on to how Progress 8 can be presented to the governing body.

This report allows the governors to see that there have been no dips in overall performance, but they can still see that there is some way to go from the target as they have the charts to reference.

Another question that I’ve been asking myself is whether it is appropriate to give qualification level data to the governing body. My conclusion is ‘yes’, as it allows them to question the data in more detail. If you are using pathways in your analysis and tracking, then being able to combine that with your gap analysis is going to give a clear indication of where subjects are having success with a particular group of students. This information can then be used to inform teaching & learning across other subject areas.

Governors will be able to see that English have the greatest percentage of pupil premium students who are below where they should be at this point in time. You could take it one step further and look at the bottom 5 qualifications against targets. This is an example of the English Language Spring assessments vs the KS4 Targets – by presenting this, it is easy to see the numbers of students and how far away they are.

As a governor myself, this is the type of information that I would hope to be presented with at meetings, but it’s still very bitty. I would suggest putting together a governor return so that the information is consistent at each meeting and the comparable data is readily available for questioning!

If your analysis tool allows you to set up bespoke permissions, you could consider giving your governors access (and an assessment timetable!). You will need to train them fully on how to use the system, and this will take more than one session. But allowing governors to interrogate the data for themselves, and come to the meetings with questions at the ready, will lead to a more productive meeting and keep them to time.

For SISRA Analytics customers, keep an eye out for our Governor Training Pack that is coming soon!

by Ali Platt, Data Consultant

Life After Levels

My school first started using SISRA Online in March 2013 before moving to SISRA Analytics in June 2014. We started using the Analytics EAP mode, which was specifically designed for Life after Levels, in July 2016.

I found EAP mode straightforward to set up, and made use of the materials available in the HELP area, such as the videos and help guides.
My school developed its own KS3 competency grading system which was easy to set up, as EAP mode allows you to create your own bespoke grade methods. Ideal for our grades of Exceptional / Extending / Secure / Developing / Beginning. Previously we had to use Excel, which was time consuming and unwieldy.

We were also able to use flat lines as flightpaths, which was my school’s preference. The EAP area is very similar to Legacy mode in Analytics with regard to uploading data, so this has helped with the transition too and prevented me having to create additional reports and marksheets in SIMS.

We also use EAP mode to analyse ‘Attitude to Learning’ grades which is excellent and has been really useful for my school.

Due to the many recent changes implemented by the DfE, greater understanding is required by data managers, but using SISRA Analytics saves time processing and publishing the analysis compared to other methods. Thank goodness for SISRA help guides, which digest the DfE docs into palatable ‘what to do’ guidance!

I am also thankful to Jon Williams, SISRA Technical Director, for his webinars as we went through the phased releases.

Ruth Williams, Data Manager, The Minster School


Prior to joining SISRA, John had a varied and interesting career path. Following a practical apprenticeship in plumbing after leaving school, his love of both computers and music led him to study music recording technology. Whilst still a personal passion, his career path deviated towards data analysis, first for a utilities maintenance organisation, and then for a leading firm of solicitors.

John started his journey into the world of education at a Manchester school in 2013, working in a small data and exams department with a focus on using SISRA Analytics for the school’s data analysis. During his time there he perfected the school’s use of SISRA, as well as becoming an adept user of SIMS for data and exams purposes.

John has developed and delivered many in-house training programmes on the use of SISRA Analytics. This experience and knowledge of the system opened the door for John to join the consultancy team at SISRA in April 2017, covering schools across North West England, Yorkshire, and Derbyshire.

Changing Data Management

We began using SISRA Analytics in September 2016. It was brought in by our new Head and we needed to be up and running with it almost immediately. We were, to be honest, slightly reluctant to change our ways with data management, as we had very effective Excel spreadsheets in place with which we were able to accurately produce all our headline figures and student analysis. However, after some SISRA training and using the guide sheets, we soon had some data in the Legacy area and could immediately see how the speed with which SISRA produces all of our analysis was going to help our efficiency immensely.

We have found SISRA to be very user friendly in terms of data management. The help guides are very clear and by simply working through these we have been able to set up a complete system and get into an efficient routine of using it. The support from SISRA has been great, being very quick to help with any issues, and the DataMeets have also been very useful for gaining know-how. We are extremely impressed with how SISRA keeps up to date with all national developments and delivers exactly to time on its upgrades.

We have been running both EAP and Legacy areas in tandem. This has enabled us to have different Progress 8 scenarios in place for Year 11, based on 2016 or 2017 points. It is also very quick to see other ‘What If’ situations by reconfiguring and republishing. EAP is more logical to set up, being split out into Students, KS2 and Grades, and much quicker to publish! We have up to five data collection points per year group, and we are now able to have unchecked data in SISRA within a couple of hours of the collection deadline. This then gives teachers and Heads of Department immediate access to their data for checking purposes, easily spotting any errors, omissions and trends, before we republish checked data.

The possibilities for analysing and viewing data in the EAP area are vast, especially now with the added dashboards and charts – we are definitely SISRA converts!

Sue Lefley, Data Administrator, The Deepings School

How to be a great Data Manager

I stumbled into the role of a data manager. I was coming to the end of a fixed-term contract and saw a position advertised for exams officer and data manager which I thought sounded interesting. It seemed to bring together all of the skills I’d acquired in my previous roles: data, Excel, information systems, training, data quality, and data accuracy. I had no knowledge of education or the education sector and certainly didn’t realise at the time that the data manager position existed as a vital role within all schools, which meant Google quickly became my best friend!

The role of the data manager varies from school to school; some are involved in exams, supervising cover, timetabling, or MIS management. Some do all of these and more! Even the job title varies across the country: data officer, data lead, assessment officer, to name a few. In my three years as a SISRA Consultant, I’ve learnt that the pathway that leads people to becoming data managers within schools also varies. Some come from a data and IT background in another sector, others develop into the role having worked in a school in an administrative position, and others “just end up doing it”.

However you end up doing this role and whatever your experience is, I strongly believe that training, having a good level of communication with key staff at school, and being supported are vital in helping you to do your job well. There are lots of things that you can do yourself to make your life easier, so I’ve listed my top tips below:

Understanding Assessment Data

Make sure you understand the assessment cycle within your school:

  • What data is your school collecting?
  • Why are you collecting this data?
  • What happens with the data that is collected?

Understanding Education

  • What type of school are you working for?
  • What Key Stages does your school have?

If you have come into the role from another industry, you will come across plenty of educational acronyms and terminology. You may find it useful to keep a list of these to refer to:


Be Organised

  • Do you know when assessment data is due?
  • Do you know when reports are due to be sent out to parents?

Being organised is key to good data management. I used to keep a chart on my office wall of when each year group had reports due, similar to the two examples below. Collate this information and update it for each academic year. If you are also involved in other things such as exams or the census, include these key dates as well.


Implement Processes

I love flow charts or diagrams of steps that need to be taken when following a specific process or procedure.
When you first become involved in assessment data and reports, you may find it useful to put the school procedure together as below:

Understanding Headline data and DfE Performance tables

If you are new to education, how do you find out what figures your school is measured on?
The DfE School Performance website is a good place to start. Search for your school and view the latest headline figures. Make sure that you have an understanding of how these figures are calculated by accessing the relevant DfE guides and sign up to receive email notifications about new releases and updates.


If you’ve taken up this role without previous experience of the education system, you must be trained or be able to spend sufficient time with key staff in school who can explain what you need to know. If you are struggling, try to think of the things that you feel would help you to do your job more effectively and, if possible, ask for additional training.

Stay up-to-date

Things in the education sector are always changing. Stay up to date by:

  • Joining forums and networks
  • Building relationships with data managers at other local schools
  • Attending SISRA DataMeets (even if you don’t use SISRA Analytics you are very welcome!)
  • Joining the Data Managers Facebook group
  • Signing up to receive DfE GOV.UK email alerts


And remember…

  • Take your time
  • Be accurate
  • Be inquisitive
  • Don’t be afraid to ask questions!

We’d love to hear any best practice, hints, tips or ideas from your experience of data management, so feel free to comment below!

by Kate Moon, Data Consultant

Getting the right results (PT.3)

Welcome to my third and final blog post for this academic year. Hopefully you will have found the tips in my first post Getting Results Day Ready and second post Comparing Against Targets useful.

If you find yourself asking ‘Why don’t our headline figures match the DfE’s?’ don’t panic – here are 4 key areas to check:

1) Cohort total and Key Stage 2 data

Always check the number of pupils on roll in the DfE June Tables Checking Exercise (your headteacher will have received a letter in May). You should also check pupils who are classed as SEN and FSM Ever 6. Ensuring eligible pupils and their characteristics are correct can make significant differences to performance tables, as this data effectively determines the calculations.

Finally, check that the KS2 data you have matches what is supplied by the DfE. It has been known for the DfE to have KS2 data for some students which you have never been able to track down. Ensure the data matches exactly to 2 decimal places – the odd student with a difference of 0.01 can affect your Attainment 8 and Progress 8 figures.
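If you would like to automate that check, here is a minimal sketch – the file and column names (UPN, KS2_FineLevel) are illustrative assumptions, so adjust them to match your own exports:

```python
import csv

def load_ks2(path, id_col="UPN", ks2_col="KS2_FineLevel"):
    """Read a CSV export into a {student_id: KS2 score} dictionary."""
    with open(path, newline="") as f:
        return {row[id_col]: round(float(row[ks2_col]), 2)
                for row in csv.DictReader(f) if row[ks2_col]}

ours = load_ks2("school_ks2.csv")   # your own export
dfes = load_ks2("dfe_ks2.csv")      # the DfE's data

# Flag any student missing from one side, or differing by even 0.01.
for upn in sorted(set(ours) | set(dfes)):
    if ours.get(upn) != dfes.get(upn):
        print(upn, "school:", ours.get(upn), "DfE:", dfes.get(upn))
```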

Remember, exam results are based on the cohort at the time of the January Census (19th January 2017) – you can ask that certain pupils are removed if they meet specific criteria, e.g. ‘admitted from abroad with English not first language’ or ‘permanently left England’. Guidance can be found in the documentation on the Tables Checking website:

2) Non-curricular subjects

It’s good practice to always check with your exams officer whether any students are sitting exams for non-curricular subjects. At the school in which I worked, we often entered pupils for examinations in their home language. These can make a significant difference to your school’s headline figures.

Below we can see the Attainment 8, Progress 8 and EBacc data for our school, compared against targets:

The above data excludes the 6 home language grades which were all B grades.

The above data includes the 6 home language grades which were all B grades.

I also found that the odd student took Grade 6, 7, or 8 music qualifications. Whilst these exams were sat externally, they are still included in the performance measures. A Grade 6 Pass is the equivalent of an A in the open basket. This has the benefit of increasing the Average Attainment 8 score and grade for the student and your school. It might be slight but every little helps! For some students, it may even fill an empty slot in the open basket.

A GCSE Music grade could also count in the open basket. I’d recommend you always check the discount codes here though:

I used to ask form tutors to enquire whether their tutees had external music awards and tried to obtain a copy of any Certificates.

Beware of reformed and unreformed qualifications, particularly with home languages – if a student takes an unreformed exam prior to Year 11 and it is reformed in Year 11, it will count for student performance; however, it will NOT count for school performance! These qualifications can still be analysed within Analytics, but should always be set to ‘Unapproved’ so they do not affect the headline figures.
A timeline of reformed qualifications can be found here:

You may wish to consider AS-levels for some students who have already taken an unreformed GCSE. These will count in the appropriate Progress 8 ‘basket’ for their subject. If a GCSE in the same subject has been taken, the AS will always discount the GCSE – grades A and B will score higher points than an A* at GCSE!

Finally, in the October Tables Checking Exercise you will be able to access a csv file which contains details of the grades awarded. There’s also a handy Progress 8 column – an easy way of checking your data is to compare this to the Progress 8 figure in your own internal software, whether it’s SISRA Analytics, Excel, or otherwise.

3) The grades data

Have any students been previously examined in any qualifications? The DfE’s early entry/first entry rules came into effect on 29th September 2013, and from that point ‘only a student’s first entry to a GCSE examination will count in their school’s performance tables’. Any subsequent entries after this point (in the same or any related qualifications) are ineligible to count towards school performance measures, although they will still count for the student. Guidance can be found here:
During the autumn term, you may have some grades which need to be updated due to remarks too.

4) Incorrectly set-up qualifications

Finally, always check that qualifications are set up correctly. Start by checking the measure is correct – this affects how the qualification counts towards your KS4 measure and headlines. Are GCSEs correctly set up as either A*-G or 9-1, and BTECs and Cambridge Nationals as Non GCSEs?

As mentioned above, AS qualifications discount GCSEs so ensure discount codes are used (you do not need to use the official ones in Analytics e.g. for Polish GCSE you could use POL, and for Polish AS *POL – the asterisk ensures the AS takes priority). Also check that similar qualifications do not discount each other too!

Are your grade methods using the correct points? These can be found here:

Now it’s time to check which subjects count towards the EBacc (this is known as the ‘special’ column in Analytics). It is really important to ensure qualifications are correctly set up, as it can make a significant difference to your figures if not. For example, home languages such as Polish, Urdu etc. should be set to ‘Language’ in the ‘special’ column. Computer Science is another subject which is often omitted from the EBacc (it should be set to ‘Comp Sci’). Conversely, Religious Studies is often incorrectly included as a subject which counts towards the EBacc as a ‘Humanity’. It doesn’t count towards the EBacc, but can count in the open basket. A list of the qualifications which count towards the EBacc can be found here:

Last but not least, are you using the most recent publishing options? As at June 2017 in SISRA Analytics, these are 2017 DfE rules, 2016 (2017 points) Attainment 8 estimates and 2016 Value Added.

At this point your headlines should be matching ☺

by Emma Maltby, Data Consultant

Comparing against targets (PT.2)

With Results Day fast approaching we will soon be making comparisons between our exams and targets data. As discussed in my earlier blog ‘Getting Results Day Ready’, preparation is key! With this in mind, it’s a good time to ensure that the number of targets tally with the number of results you are expecting. Let’s take a look at some data to see how important this is.

Below we can see the Attainment 8, Progress 8 and EBacc data taken from the Headlines > Charts in SISRA Analytics. I have compared the Y11 spring data against school targets. All of the school’s timetabled qualifications have been included in both datasets.


Imagine we have just found out that 6 students will take a GCSE in Polish. The Head of MFL expects they will all achieve a grade B. I have added these grades to our Y11 Spring collection. As a result, our Attainment 8 figure has increased slightly and our Progress 8 figure is now positive *big cheer!*


Is there anything else we need to consider? Yes; for complete accuracy we should also ensure that any datasets we compare against have the same number of grades uploaded. For the 6 exam grades I have entered, I should also enter 6 target grades to enable me to make accurate comparisons.
See how this has affected the Attainment and Progress 8 target figures.


This is extremely important at qualification and class level, particularly if the data forms part of a teacher’s performance management. Here we are looking at the cumulative pass for the MFL faculty without the Polish grades:


Once the Polish grades have been added to the spring collection this affects some of the summary figures for the department, most notably the average points and residual.


When we factor the targets in too, see how the figures change again.


As Heads of Department and Class Teachers can be judged on A*-C performance, in this example ensuring the Polish results and targets are added has a positive effect on the data. The average points have seen an increase too, as has the residual figure, which shows how well the faculty is performing overall across qualifications with the same point scale. This could be the difference between a pay increase or not!

This also applies to any other dataset you compare with – whether it’s FFT estimates, performance management targets, CATs, MIDYIS, or YELLIS. Always ensure the figures tally!

Many schools use Analytics to model targets or for forecasts; again, a complete set of grades is essential for accuracy.
Another common mistake is that qualifications are not correctly nominated as EBacc subjects. Here we can see the effect of Computer Science on some of the key headlines when it is incorrectly set up, compared with when it is correctly set up as a special.


Another subject often incorrectly nominated is RE as a humanity. This has the opposite effect to Computer Science on the EBacc basket.

A simple check can be made in Analytics to see if your datasets tally; simply compare an assessment collection against your targets and check the ‘Total Grades’ column. Your colleagues may just thank you for it!
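If you would rather check outside Analytics, a quick sketch along these lines will show whether each qualification has the same number of grades in both files – the file and column names are illustrative assumptions, so adapt them to your own exports:

```python
import csv
from collections import Counter

def grade_counts(path, subject_col="Subject"):
    """Count the rows (grades) per qualification in a CSV export."""
    with open(path, newline="") as f:
        return Counter(row[subject_col] for row in csv.DictReader(f))

exams = grade_counts("y11_spring.csv")
targets = grade_counts("ks4_targets.csv")

# Any qualification where the totals don't tally needs investigating.
for subject in sorted(set(exams) | set(targets)):
    if exams[subject] != targets[subject]:
        print(f"{subject}: {exams[subject]} grades vs {targets[subject]} targets")
```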

Hopefully by reading my earlier blog as well as this one, you should now be feeling more confident about results days and the accuracy of your data. Within the next few days, you will be able to read a further blog which will help with troubleshooting if there are discrepancies between your figures and the DfE’s.

by Emma Maltby, Data Consultant

Getting Results Day Ready (Pt.1)

It’s almost that time of year when we are filled with a mix of dread and nervous anticipation.
Whilst chatting with some Data Managers the other day, someone new to the role asked for some survival tips. Our top tip was preparation, preparation, preparation! Even for those who have been in the role for years, this still applies. Ensuring you are well prepared does take time, but it will make embargo days less fraught.

I think Abraham Lincoln had a very good point when he said:

“Give me six hours to chop down a tree and I will spend the first four sharpening the axe.”

Of course, we cannot always prepare for all eventualities but it is good to have a back-up plan in case something does go wrong, such as staff illness or no internet connection.

As we all love a good spreadsheet, it is handy to prepare a schedule similar to the one shown below. It ensures all staff are aware of their responsibilities and that the exam period runs as smoothly as possible. I used to break mine down into pre-results days, A-level embargo and results day, and GCSE embargo and results day.

If you are new to the role, you could visit another local school to discuss their procedures, or attend any local data staff meetings. Some Local Authorities host these or there are the SISRA DataMeets too (you don’t need to be a customer to attend) and you can find out more about these here.


 …so what kind of preparation can you do?


  • Check A2C is working and you can connect to the exam boards. Install it on a second machine too just in case!
  • Refresh exams base data.
  • Set up embargos in your MIS – don’t forget to include yourself and be mindful of JCQ regulations!
  • Upload any banked exams (with the appropriate date for first entry rules) to SISRA Analytics.
  • Ensure your student data reflects the January Census (students, KS2, SEN & FSM Ever 6 information).
  • Partially complete any LA or Trust returns with as much information as you can, e.g. cohort numbers, PP, SEN, to cut down on paperwork on the day.
  • Ensure all upgrades are applied to your MIS.
  • Speak to site staff and ensure you will be able to get into school (not fun waiting outside the school gates at 5am – yes it happened to me!).

Padlock / barricade the office door – I say this in jest, but politely reiterate to overly keen members of staff that producing accurate headlines is easier when left to work undisturbed and not rushed.



  • Check the IT staff have nothing planned which will disrupt your internet connection or access to the server.
  • Have a list of entries ready so you can check all the results are in. Make sure your number of targets matches this too, so that your analysis is accurate.
  • If you are expecting any results for home languages, external music qualifications etc., create these manually in Analytics in readiness.
  • If you use student admission numbers in Analytics as the student ID, prepare a look up if you want to use the certification broadsheet in SIMS Exams Organiser (as it only contains either the UPN or exam number).
  • Remind all staff to have their logins ready (there will still be one that asks for a password reminder!).
  • Fill your supplies drawer. If you are going to have a long day, have some lunch and snacks ready.
  • Stock up on paper and any other stationery you will need.

There could of course be things that happen which are outside our control.

However, we should treat these and any mistakes as something we can learn from. This blog is the first in a series of three – look out for the others which discuss how to ensure your results day analysis is accurate!

SISRA Analytics administrators can also find results day resources within the HELP section of SISRA Analytics.

Throughout this term, I will be tweeting top tips on getting results day ready – follow me on Twitter @EmmaSISRA to find out more.
Good luck!

by Emma Maltby, Data Consultant

Shadow Data

The DfE recently released shadow data to schools, which applied the 2017 changes in point scores to 2016 results.

There were some interesting differences and implications generated by the shadow data. If schools continue to use old point scores to calculate or predict attainment scores, they will be inaccurate and probably over-inflated. If your Governors and/or Leadership Team are ‘exposed’ to Attainment 8 scores, I would recommend the shadow data be explained, to prepare them for a drop in attainment and progress scores – although in reality it would not be a drop, just a different way the data has been calculated.
It would be interesting to find out how many schools have dropped as a result of the new point scores. It would also be very interesting to see how many (if any?) schools have increased their Attainment 8 or Progress 8 based on this shadow data. National data suggests not many!
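To make the effect concrete, here is a small sketch comparing the two point scales for the same set of grades – the values shown are the commonly quoted DfE scales, so do verify them against the official documentation before relying on them:

```python
# Point scores per unreformed GCSE grade: the 2016 scale vs the 2017
# 'shadow' scale (check the DfE's official documents before using in anger).
POINTS_2016 = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}
POINTS_2017 = {"A*": 8.5, "A": 7, "B": 5.5, "C": 4, "D": 3, "E": 2, "F": 1.5, "G": 1}

# One student's eight counting subjects (ignoring the double weighting
# of English and maths for simplicity).
grades = ["A", "B", "B", "C", "C", "C", "D", "E"]
print(sum(POINTS_2016[g] for g in grades))  # 41
print(sum(POINTS_2017[g] for g in grades))  # 35.0 - same grades, lower score
```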

For my school we took a hit of -0.2 in Progress 8 and just over 5 points in Attainment 8, going from 53.07 to 47.87. Make sure your governors are aware! If you are unsure where to get this information, the DfE sent it to all schools on the 4th April and it is available on the Tables Checking website.

by Nigel Sheppard, Deputy Head Teacher, Horndean Technology College

No Grades? No Problem.

During almost ten years in education, my overall use of data is at an all-time low. I know many people have a need to quantify; whether it’s the number of steps taken with a pedometer, followers on social media, or mentally totting up the number of cups of coffee consumed before midday, if we can add it up, average it, or worse, we will! Sometimes people don’t even mind if the resulting figure is made up as long as it gives us some information to paste into our latest pack or document.

Money? Love? No… DATA makes the world go round!

I too enjoy a graph, a table of data or an investigative hour with a spreadsheet. I can’t lie about that, but if you need your fix, the Office for National Statistics website* has plenty of data to play with, so perhaps it might be best to use that as your sandbox and not stuff which might actually affect someone’s life. Such horror stories include (but are not limited to) appraisal ‘targets’ for teachers based on Progress 8 predictions for Year 7. If you fancy a laugh at data’s expense, there are some incredible examples which tie in with this sentiment on Tyler Vigen’s popular site*, Spurious Correlations. *See links at the bottom of the page.

I frequently devote time to thinking of ways to monitor the quality of teaching and learning without applying the traditional ‘outstanding’ to ‘inadequate’ grades. In fact, how to evaluate teaching and learning without grades is the topic I am asked about most whenever I’m working with a school or showing Observe to a school that’s never seen it before.

For schools who have stopped grading altogether, there are still plenty of ways to analyse what’s observed while steering clear of whole grades for lessons.

All the analyses below have been taken from real schools using SISRA Observe for their observations (identifying data removed).

Question Statements
Lots of schools have opted for a set of questions based on what they expect to see in a lesson. They then go on to say ‘Yes, I’ve seen it’ or ‘No, I didn’t see it’.

Quantifying observation data then becomes straightforward, if a little limited. Over time, this approach can certainly give a good overall picture of what’s being done consistently, and what needs to be embedded further. It’s also reasonable to apply targets to this type of information, as it is objective in nature and works well at whole school level.

                                                      % Yes    % No
Teacher in class to greet pupils on arrival?            75      25
Objectives displayed?                                   60      40
Are folders set up for each lesson within the unit?     66      34
Differentiated outcomes?                                93       7
Resources ready so pupils begin to learn on entry?     100       0

Further analysis could then be performed at faculty or subject level to provide more detail and allow targeted support. Analysing the data in this way takes the focus off what’s not being done and can show where standards are consistently high.

Areas of Strength and Development
Another effective approach is to decide which areas to focus on taken from a set of whole school priorities. This works so well when those areas are well thought out and relate specifically to a range of objectives for the year. This could provide a direct link back to the whole school improvement plan and individual faculty improvement plans as well as the teacher standards if desired.

Observers choose one or more strengths of the lesson from a finite list and then choose one or more areas of development from the same list. Accompanied by written and verbal feedback, over time this can provide a wealth of information including matching up staff who have complementary strengths and areas for development. Schools using Observe tell us that they use this information to pair up teachers for joint observing and lesson study.

The school in the example above can immediately see what they do well (climate, relationships) and in which areas staff require more support (challenge, assessment).

Student Based Questions
‘Were students focused on learning?’

‘Did students have the right equipment?’

Some schools prefer to use a scale such as: All Students, Most Students, Some Students, No Students. Taking this approach with multiple drop-ins or learning walks can allow a leadership team to see who is consistently meeting the expectations and who may benefit from some support. Once a school is satisfied that an area is being met consistently, they may decide that observing it regularly is not necessary, and this could allow them to concentrate their efforts elsewhere.

In my experience, it’s certainly possible to use data in a reasonable and sensible way in relation to observations of all types. If you’ve moved away from grading whole lessons, these approaches can allow you to measure impact over time and also shout about what’s going well.

by Charlotte Harling, Head of Observe



Use of data in the classroom

When I was thinking about topics for this post, I was reading some of the other blogs that my colleagues and I have written recently and realised that the focus has been on whole school or year 11 data. None of us have actually talked about the data that is used in the classroom. My strong belief is that data should start at classroom level and work up. Data staff in schools should ensure that the data they provide to teachers will help them deliver a better learning experience to the young people in front of them.

Before I joined SISRA, I worked as a Data Manager in a large Outstanding school where all teachers would have what we referred to as “Standards Folders”. These contained a profile of all of their classes showing contextual information such as gender, pupil premium, special educational needs & disabilities (need and type), EAL, % attendance etc. I also provided class photos as well as templates for seating plans. There wasn’t any expectation on how teachers used their Standards Folders, but they were expected to know the contents and ensure that it was kept up to date following any class changes or assessment points.

This class profile shows all students within a class together with contextual information i.e. gender, latest attendance, ethnicity, SEND etc. together with KS2 fine level, CAT test results and target grade.

An excerpt from my school’s Ofsted inspection when they were judged Outstanding states:

“Systems used to ensure the rigorous collection, analysis and use of student performance data are exceptional. They allow teachers to plan effectively for all students’ individual needs. This key feature is an essential component of the excellent teaching which the students experience and the rapid progress they make.”

There are so many different software systems available for class profiles and seating plans out there (many are free to download) so I’m not going to comment on them here. A simple Word document, Excel spreadsheet or a mark sheet set up from your MIS would be enough, so long as it contains relevant information. Don’t overload staff with anything that is not relevant or helpful to them (most schools have some staff that are data shy and you don’t want to scare them).
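If you do want to automate a simple profile like this, the gist is small. Here is a minimal sketch in Python, assuming a hypothetical CSV export from your MIS ('mis_export.csv') with one row per student; every file and column name is illustrative rather than a real MIS field.

  # Build one class-profile sheet per class from a hypothetical MIS export
  import pandas as pd

  students = pd.read_csv("mis_export.csv")  # illustrative file name

  profile_columns = [
      "Name", "Gender", "PupilPremium", "SEND", "EAL",
      "AttendancePct", "KS2Fine", "TargetGrade",
  ]

  # Keep only the relevant columns - don't overload staff with the rest
  for class_name, group in students.groupby("Class"):
      group[profile_columns].to_csv(f"profile_{class_name}.csv", index=False)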

Have a think about what a class teacher may need to be able to find out from the data they are given. If we take a look at some students in the class profile shown above:

Damon A

  • Attendance is 91%, and therefore a concern
  • He has an identified special need of Behavioural, Emotional and Social Difficulty
  • Middle ability
  • Armed forces

What could be done to help Damon? The Service Pupil Premium is designed so that schools can offer mainly pastoral support during challenging times and to help mitigate the negative impact on service children. Could a mentor be appointed to ensure that Damon’s attendance improves? Is he accumulating behaviour points because of his special needs? Is there a 6 week targeted programme that he could be involved in? Or any parental workshops and engagement?

Jennifer A

  • 100% attendance
  • More Able
  • High Prior Attainment

Is Jennifer being challenged in class? Are additional tasks being provided to her to ensure that she doesn’t get bored?

Jane A

  • Attendance good
  • Pupil Premium
  • FSM

Jane A has a high CAT Verbal score, whereas her CAT Quantitative score is below average. This means that her ability to use numerical skills to solve problems is weaker than her ability to comprehend words. A class teacher should be aware of this when setting work.

We have looked at three individual students here but this kind of information should be sought for all students in every class.

Once an assessment is done, the pastoral information should be used alongside the assessment data to identify:

  • Who is furthest away from expectations?
  • Who is and who is not improving as expected?
  • Does this relate to the topics taught?
  • Who is making less progress than expected or than their peers?

Once a teacher has identified this, they can then move on and see if there are any patterns forming (a short sketch of these checks follows the list below). For example:

  • What are the trends for key groups: are girls outperforming boys?
  • Are pupil premium students doing worse than non-pupil premium students – if so, is there any PP money available for intervention?
  • Is work differentiated for more able/SEND students?
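As a rough illustration of the checks above, the sketch below assumes a hypothetical assessment export ('class_assessment.csv') with one row per student; the column names are invented for the example.

  import pandas as pd

  df = pd.read_csv("class_assessment.csv")  # illustrative file name

  # Negative gap = behind target; positive = ahead of target
  df["Gap"] = df["CurrentPoints"] - df["TargetPoints"]

  # Who is furthest away from expectations?
  print(df.sort_values("Gap").head(5)[["Name", "Gap"]])

  # Are girls outperforming boys? Are PP students doing worse than non-PP?
  print(df.groupby("Gender")["Gap"].mean())
  print(df.groupby("PupilPremium")["Gap"].mean())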

So in conclusion, have a think about what data you are giving to staff to ensure that the children or young people get the education that they deserve.

My colleague Becky’s blog ‘How to Train and Engage Teaching Staff in the Use of In-School Data’ covers the next steps.

by Claire Spencer, Data Consultant

Measuring Staff Performance


Taaleem is a company that owns and runs schools in the United Arab Emirates. The group consists of seven K-12 schools and four early years centres. Our schools offer a range of curricula, including American, British and International Baccalaureate (IB) programmes. Schools in Dubai and Abu Dhabi are inspected by the regulatory authorities, and the quality of teaching and learning is a major focus of the inspection process. Our schools also undertake lesson observations as part of a teacher’s probation or performance review.


Lesson observations have always been a critical process for Taaleem schools. Different schools had different systems in place and in some cases there were differences between the secondary and primary stages in the same school. As a group, we wanted to be able to measure staff performance and compare one school with another. We also wanted to be able to see which areas of the inspection framework were strong and which needed more work. Getting Senior and Middle Leaders on board with the lesson observation process was also a priority, but because of the variations in the system, this wasn’t happening at the pace we wanted.


  • It was quick and easy to use
  • Setting up the system for our school was straightforward
  • The templates provided were relevant to our observation process
  • We can extract data really quickly and easily, and for different levels – from faculty or department to individual teachers


Yes! Absolutely. It’s helped us manage lesson observations all in one place. SISRA Observe stores information that can be readily retrieved and used for probation reviews or performance appraisal meetings. It is also useful for strategic planning and for targeting professional development courses. We wouldn’t be without it.

Richard Drew, Education Manager, Taaleem

Driving School Improvement


SISRA use the slogan ‘Empowering Improvement’ and since we began working with them over a year ago, that is exactly what they have enabled us to do. Using SISRA Observe has allowed us to become a more efficient organisation and, as a result, has created more time for us to focus on our core purpose: improving the quality of teaching and learning. Gone are the days of collating endless paperwork, reminding staff to send their documents and filing any number of different pro formas as if my life depended on it. All the time that has been saved has now been invested in our key improvement area: improving the quality of our classroom provision.


Using SISRA Observe has created greater ownership of teaching and learning at all levels of our organisation. Teachers use it to reflect on their own practice, ensuring they have a clear understanding of the strengths and development areas in their own teaching. They use this information to engage with a range of personalised CPD opportunities within the school, either sharing their best practice or further developing their own teaching. SISRA Observe is at the heart of creating a self-improvement culture within our school, where teachers take ownership of their own professional development. SISRA Observe allows teachers to make informed choices about their own professional learning, thus making a significant contribution to our culture of ‘continually learning, continually improving’.


Middle leadership has been strengthened significantly by the fact that Heads of Department have a clear understanding of the strengths and priority areas for their teams. Middle leaders are often stretched to capacity and presenting information in a ‘ready-to-go’ format is crucial in enabling them to act on teaching and learning data in a timely and meaningful way. The accessibility of the reports available in SISRA Observe means that they do not have to spend time analysing teaching and learning data because it is done for them, so they can focus on developing the quality of provision instead. It allows them to identify and systematically share best practice easily across their teams, and to act swiftly to strengthen practice if and when required. Heads of Department can develop CPD at department level to ensure that pupils get the best possible learning experiences when they visit their curriculum areas. As a result, SISRA Observe empowers our middle leaders to be the driving force for school improvement at Holy Trinity.


Senior Leaders have total clarity on the quality of teaching across our school. Each and every member of the Senior Leadership Team clearly understands our strengths and priority improvement areas. SISRA Observe also creates a transparency within teaching and learning that has served to strengthen the line management process across the school, resulting in ‘real-time’ conversations with Middle Leaders about the quality of provision in the areas they oversee. As a result, SISRA Observe allows underperformance to be identified and challenged efficiently, whilst (more importantly) best practice can be recognised, championed and celebrated.


This is my favourite analogy with regard to teaching and learning. In a nutshell, it means that we do not improve the quality of teaching and learning by simply monitoring it. However, through SISRA Observe, we carefully and systematically monitor teaching and learning to identify our major strengths and key priority areas. It is what we do with the information that SISRA Observe provides that is crucial to our school on its journey of sustainable improvement.
If you strip away all the intervention strategies, revision sessions and one-to-one support, you are left with the need for the very best classroom practice. Therefore, how we develop our teachers at Holy Trinity is vital. Through analysing the reports from SISRA Observe, I can create a whole-school CPD programme to address our priority improvement areas. More importantly, it allows me to create personalised pathways for our teachers, tailored to their individual needs to maximise their professional learning. Increasing collaboration between colleagues is a key component for improving the quality of teaching and learning at Holy Trinity. We are embedding peer coaching across the school as a mechanism to create more opportunities for reflection and collaboration. SISRA Observe allows me to successfully match teachers to the right coach to ensure that they support each other’s practice and maximise learning opportunities for both colleagues.

In summary, SISRA Observe has helped create something that is so rare in schools…time. Time that can be invested in ensuring the quality of our teaching is the very best it can be. Time that can be given to teachers and leaders to ensure that, above all else, high-quality teaching and learning is the cornerstone for driving school improvement. Put simply, it allows us to ‘keep the main thing the main thing’.

Stuart Voyce, Assistant Head Teacher, Holy Trinity

Our Journey to Outstanding

Kelvin Hall is a 1,400-student comprehensive school in Hull, East Yorkshire. It was graded outstanding in all areas under the new Ofsted framework in February 2015, yet just 18 months previously had been graded ‘requires improvement’.


Before using SISRA Observe, the monitoring of teaching and learning was conducted by one person using a single MS Excel spreadsheet. The information was not shared with any other member of staff, not even the Senior Leadership Team. When I took over this role, I knew I had to make changes as this was not the way I worked. My philosophy was that we had to work together to improve teaching and learning and not in isolation. It had to be a shared vision.
It was at this point that the flyer for SISRA Observe appeared on my desk. I was immediately drawn to the idea of no paperwork and a way of collecting, analysing and sharing useful data centrally. I could see the potential for everyone to be involved in their own professional development and the positive impact this could have on staff and students’ progress.


  • Easy to personalise and set up the observation forms
  • Accessible for everyone at some level (you can decide who sees what)
  • Staff have instant access to any judgements made on them
  • Staff can use data to inform their own CPD needs and appraisal targets
  • Leaders can analyse data to inform target-setting, improvement plans and coaching
  • School can analyse data to inform CPD sessions
  • Once it is set up, there is little to do
  • Can develop ways of involving non-teaching staff, e.g. Year leaders conducting drop-ins


For all of the above reasons, and because it makes it so easy to demonstrate where you stand in terms of teaching and learning.
When I met with the Ofsted inspector to talk about teaching and learning, it was easy to show him where we were with regards to standards. I could show him book scrutiny, learning walk and formal observation data for each staff member, department, the school as a whole and even look at the trend over the past year(s). He was impressed with how we used it and could see the impact it has had on our journey to outstanding. It had been used as a tool to focus our priorities and drive the standards of teaching and learning to an outstanding level.

I cannot wait to get to grips with the new version and its anticipated flexibility!

Claire Turnbull, Assistant Head Teacher, Kelvin Hall School

Detailed Tracking for New 9-1 Grades


SISRA Analytics enables us to effectively identify those students who would benefit from intervention in any given subject and easily identify those who have made good progress, who could be challenged to push on and make more than expected progress.

The filters within SISRA Analytics allow us to monitor the progress of specific cohorts, such as students who have joined the school late, those with low attendance and even those with summer birthdays.


  • Form Tutors are able to monitor the progress of students in their form group, recognising any issues early and providing extra support where necessary.
  • Class Teachers can monitor the performance of their students over a longer period of time and put systems in place to enhance progress.
  • Curriculum Leaders can see at a glance which classes are making good progress and identify those where support may be needed.
  • Achievement Coordinators can view students individually or in selected focus groups, which allows them to monitor progress and enables early intervention if required.


As we move through this academic year and future years, building our student history, we will be able to set benchmark comparators. It will be interesting to see, as students move through the grades, how this information informs our curriculum as we adapt our systems and intervention methods to ensure that all students make the best possible progress.

By using SISRA Analytics to ensure that all students are being challenged and making the best possible progress, the impact on the school performance as a whole will be unmistakable.

Gill Edwards, Academy Data Manager, Sandbach High School & Sixth Form College

An Accurate Match with DfE Publications


Highfields School started using SISRA Online as an exams analysis tool because we felt our management information system was failing to provide us with a complete picture. When SISRA Analytics was launched, we soon realised the wealth of additional features available, such as the opportunity to upload our termly assessment data to track, analyse and inform intervention throughout the year, as well as to analyse summer results.

Following each data capture, I upload the grades from SIMS into SISRA Analytics and usually within 10 minutes I have matched qualifications, checked for errors and published the reports for the staff to view. Data is timely and we can act on it immediately. The reports are split sensibly into three views: Headlines, Qualifications and Students.

Following data capture, we use the student reports in yearly achievement group meetings. The Deputy Head, Head of Year, Data Manager and Faculty Leaders meet to discuss the progress of individual students. These conversations revolve around reports produced in SISRA Analytics to highlight underperforming students.


Qualification reports are an integral part of line management meetings. Subject attainment and progress data from SISRA Analytics can be analysed and compared with previous data points, and projected summer outcomes are discussed. Vulnerable groups are easily identified, or can be customised through ‘focus groups’. One feature particularly liked by Heads of Department is the Progress Matrix tables (just like the DfE Transition Matrices), which can be produced for subject areas or even individual class groups.

Headline Reports are used by the Senior Leadership Team and the Governing Body, and shared with Ofsted, the school improvement partner and the local authority. We are required to project summer outcomes and are held to account for the accuracy of these projections. Without SISRA Analytics I don’t think we would be able to produce these figures with such confidence. SISRA Analytics deals with accredited and non-accredited qualifications, discounting rules, first and best results, Attainment 8 and Progress 8, to name but a few.


Results Days are now ‘easy’. SISRA Analytics will analyse first and best results separately and allows us to make direct comparisons with projections and to analyse trends. I am proud to say that the results in SISRA Analytics have matched the data on the Forvus tables checking website for a number of years now.
The next challenge for us will be ‘life beyond levels’ and I am confident that whatever system we move to, SISRA Analytics will be right alongside me!

Becky Hill, Data and Curriculum Manager, Highfields School

Instant Trend Analysis


The Burgate is an 11–18 comprehensive school in the New Forest with 12% of students identified as disadvantaged. Our year groups average 155 in the main school and 160 in the sixth form. Before using SISRA Analytics we used SISRA Online, and have been SISRA users for over six years. SISRA Analytics has made a huge difference to the accessibility and readiness of data at our fingertips, alongside giving the school the ability to analyse data in a very in-depth way that was not previously possible without a large amount of extra time and effort.


The speed of access on Results Day is excellent, and the support offered by SISRA in the run-up to Results Day has been comprehensive and worthwhile. This has enabled us to share results quickly, highlight the positives and focus on the areas for improvement with ease. Teachers and governors enjoy being able to access results quickly and accurately soon after release, even though they may be 1,000 miles away.

SISRA Analytics allows us to compare datasets and trends in an instant, and has been instrumental in allowing us to find our gaps and act on them. My favourite feature has to be the new graphical representation of the headline report, which was warmly received by governors for Results Day last summer. However, SISRA Analytics is for life, not just results! I warmly recommend this product to other schools, and frequently do so.

Dr Karen Riding, Assistant Head Teacher, The Burgate School

Improved Ofsted Outcomes


We first started using SISRA Online in 2013 following an Ofsted inspection that had placed us as ‘Requiring Improvement’. One of the actions that came out of this inspection was to improve the tracking and monitoring of assessment data across the school. Previously we had been using the assessment programme within our management information system, and although this generated useful data, it was not being used by staff with any degree of success due to the fact that it was so difficult for them to access.

SISRA Online was a good step forward in that it enabled staff to look at their data far more easily, although this tended to be at a fairly basic level, and predominantly just by the Senior Leadership Team, Subject Leaders and Heads of House.


SISRA Analytics has enabled another significant step forward in the effectiveness of the tracking and analysis of our data. We now collect Key Stage 4 data monthly. Subject Leaders and Heads of House meet with the Head Teacher each month to discuss changes since the previous month. Target groups are identified and monitored through focus groups, and form tutors have had training in SISRA Analytics to facilitate regular mentoring discussions. We have uploaded a set of FFT estimates, which are used for comparisons when talking to individual students.

In addition to our work at KS4, we have now been working for a year with GCSE Flightpaths for KS3 in place of the National Curriculum levels. These are easily reported and analysed in SISRA Analytics on a termly basis.

Last summer we were re-inspected and found to be ‘Good’ with some outstanding features. This was ahead of our summer results but our tracking and analysis of the data in SISRA Analytics allowed us to clearly demonstrate that we understood where our strengths and weaknesses were. Inspectors were satisfied that we had a good grasp of how well our students are doing and that we could clearly demonstrate progress. There is no doubt that SISRA Analytics has been a significant element in securing our school improvement, and has now become an integral and embedded part of our school culture.

Simon Tong, Assistant Head Teacher, Tiverton High School

An Invaluable Tool for Intervention


We used SISRA right back in the old ‘SISRA Online’ days. It helped us evaluate our data drops and annual exam results forensically. But not until the introduction of SISRA Analytics and the added features did we refine our retrospective analysis and more importantly get an invaluable tool for intervention, identifying underachievers so that appropriate action could be taken.


We’ve used the Grade Search facility extensively to create KS4 focus groups, based on children who are predicted English and not maths, or maths and not English. We then group these children into Gold, Silver or Bronze categories based on their KS2 entries. Bronze pupils have a higher entry level and Gold pupils a lower one. The idea is that Gold pupils are harder to convert – hence the gold medal for a more demanding intervention. Silver pupils are less hard to convert and Bronze pupils should achieve most easily.

We also use SISRA Analytics to identify the foundation subjects in which these pupils are underachieving. With the new Progress 8 measure, we recognise that ‘every grade matters’, and so this type of intervention is essential. And in KS3 we use SISRA Analytics to create focus groups for intervention by creating a league table of points differences between current achievement and targets. SISRA Analytics works this out straightaway, identifying current KS3 underachievers at any data drop.

All staff in our school have engaged with SISRA Analytics and find the interface intuitive and easy to use. Years 7 to 10 have three data drops a year and Year 11 has four. All staff contribute to the drop and also use SISRA Analytics to check their predictions against datasets and benchmarks.


On Results Day, SISRA Analytics comes into its own. Data can be input easily and quickly, giving us instant headline stats. The system has also given us instant accurate value-added scores – so no more waiting months for RAISEonline to appear. Levels of progress and transition matrices are instantly accessed and we now have Progress 8 and Attainment 8 results readily available. As in many schools nationally, ‘closing the gap’ has been an issue for us. SISRA Analytics identifies our disadvantaged children and provides a gap analysis, enabling us to pinpoint areas to develop.

So SISRA Analytics has become part of everyday life here at Eggar’s School, changing how we look at achievement data and how we use it to enhance attainment and progress. It’s now hard to imagine life without it.

Neil Waite, Assistant Head Teacher, Eggar’s School

Streamlined Assessment Analysis


SISRA Analytics has been an invaluable tool for Fullhurst Community College since we joined two years ago. It has enabled us to streamline our analysis of data for all assessment points, increasing our efficiency in informing staff, students and parents of achievement and improving the effectiveness of our interventions, which are now targeted at the right areas at the right times.


The speed with which data is analysed on GCSE Results Day has been vital. Previously I manually calculated headline, attainment and progress figures using spreadsheets, which took a considerable amount of time. I am not sure this would have been possible after the introduction of the new Progress 8 measures. SISRA Analytics now allows our Senior Leadership Team instant access to headline figures and lets Heads of Faculty see data on attainment and progress at whole-subject, class and student-by-student level weeks earlier than under my previous manual system.

The reports are endless and have saved me hours of work. You can add as many filters as you need and this has enabled staff to drill down on key groups such as Pupil Premium, English as an Additional Language and Special Needs. The ‘Tracker’ options allow staff to see how the cohort has performed in comparison with previous assessment points and ‘Compare With’ allows analysis of results against target grades – this is something we track closely.


SISRA regularly updates Analytics in line with changes from the DfE, which ensures that the information we are using is as accurate as possible. The announcements on the Home Page are always worth reading to keep abreast of any changes in the data world. The Help section is particularly well stocked with guidance both in paper and video format. These have been so helpful, but if you can’t find the answer to a question, there is the friendly Live Support team, who are always on hand to give guidance.

Sam Gray, Data Manager, Fullhurst Community College

Examination Analysis in Record Time


We started using SISRA Analytics the day it was released in March 2014. The difference was felt straightaway. Not only did it free up my own time to develop other data tools for staff to use within school, but the speed and accuracy of its analysis meant we were able to catch students who needed additional support within hours of the assessment data being captured. Prior to that, it may have been at least a week before we really got to grips with what the data was telling us.


GCSE exam results day demonstrated SISRA Analytics’ capabilities perfectly: by 11am, the grades were analysed, SLT had met and some were able to get back to their holidays! The figures even married up perfectly with the Department for Education’s own figures.


The aim here at Alsager has been to build staff’s understanding of data gradually, so that it can eventually be used to support teaching and learning. We have eased our staff into SISRA Analytics by providing bespoke training, and their enthusiasm for it has taken even me by surprise, with everyone now using it regularly.
Data is no longer the mysterious, threatening thing some staff felt it was; rather, staff can finally see that it is a vital tool to support their work in the classroom. SISRA Analytics has been the key to this!

David Meller, SIMS and Data Manager, Alsager School


After a decline in her school’s results, Emma’s eye for data led the school to create a new post of Curriculum and Data Manager for her, which included setting up and managing data and assessment systems for Years 7 to 13. She quickly recognised that SISRA Analytics would be an invaluable tool for staff, became an expert in the use of SIMS, and successfully completed the SSAT National Data Managers Award.

Emma encouraged and trained staff in the effective use of both SIMS and SISRA Analytics, not just at her own school, but within the large multi academy trust when her school’s model was rolled out to other schools. She promoted the use of data and also wrote her own school’s target setting policies. Even Ofsted recognised that ‘The availability and use of school information has improved dramatically’.

Before joining SISRA, Emma was thoroughly involved with the SISRA community, hosting DataMeets and acting as a guru at training events. Her love of helping others in the use of systems and data led to her joining the Consultancy Team at SISRA in July 2016.

“Emma was very patient with me and answered any questions I had – though there weren’t many as she had pre-empted a lot of what I was going to ask. Emma is a credit to SISRA – as are all the consultants!”
Data Manager, Cambridgeshire School

“Emma was absolutely fantastic. I can’t fault her in any way. Her depth of knowledge and ability to go above and beyond are a real credit to your company.”
Assistant Head Teacher, Nottingham School

“Emma was very knowledgeable about all aspects of SISRA as well as SIMS. The training was pitched at just the right level for all 3 of the staff involved.”
Administration Team Leader, Derby School

“It was a very useful day indeed, and Emma coped admirably with the difficulties presented by our MIS malfunctioning.”
Deputy Head Teacher, Hackney School

“Emma was very professional and was able to provide information and training on the systems. I learned a lot on the day and with her guidance was able to apply that knowledge on the day.”
Head of ICT, Nottingham School


Originally from Scotland, Sandra left her operations manager role with the Halifax when she and her family moved to Essex in 2007. With sixteen years of data management in banking behind her, Sandra put this to use when she joined a local secondary school as their Data Manager in 2010. The school became SISRA Online customers in 2011, and Sandra was quickly impressed by both the efficiency and effectiveness of the system. A firm believer that accurate assessment data and challenging targets provide the basis for sound intervention, she encouraged and trained staff in the best use of Facility, Go 4 Schools and SISRA as a powerful means to maximise student and school results. Sandra was delighted to join SISRA as a Data Consultant in 2018, expanding the consultant presence in the south east of England.



Kate’s previous roles have given her seven years’ experience of working in data management. Before working in education, Kate supported and trained clinical and non-clinical healthcare staff using information systems across Liverpool Community Services. She would travel between different healthcare centres, hospitals and prisons ensuring the accuracy and consistency of the data being recorded. In 2010, Kate moved into a role in education as an Exams Officer and Data Manager where she gained further experience of managing data and multiple spreadsheets before becoming a SISRA user herself. She was solely responsible for streamlining the tracking processes within her school. She improved the quality of the data recorded in school by facilitating behavioural change and encouraging and training staff to use both SIMS and SISRA. Kate has experience of working closely with Senior Leadership Teams and Heads of Department and understands the effectiveness of data within education.
Kate has recently completed and passed the Ordinary Certificate in Statistics with the Royal Statistical Society.

“We are very grateful for the speedy response to our request which was at short notice. Kate quickly understood our needs and spent the day with my colleague completing the required data analysis and training. We are delighted with the analysis she was able to produce at the end of the day.”
Data Manager, Wirral School

“Kate was fantastic when helping me set up SISRA from scratch. She was a great help in areas where our School structure on SIMS did not match up with what needed to be uploaded to SISRA. Kate gave me a great “foot up” to starting to use SISRA.”
Data Manager, Bradford School

“Having the training in house allowed us to use live data and made the process real. Kate was very clear in her delivery. We were able to ask questions throughout the day and Kate was able to answer any queries we had. The planning before and follow-up support meant that the course met our unique requirements.”
Senior Leader, Sefton School

“Kate’s background in school data clearly helped her to be able to see potential problems from our point of view and add context to everything delivered, and have a good background knowledge of school data issues, which made the training more appropriate for us and extremely valuable.”
Data Manager, Sheffield School

“I found Kate easy to work with and she made me feel comfortable during the training.”
Data Manager, Stockport School

“Kate was fantastic. Great sense of humour, very knowledgeable and we made great progress by setting up and creating all the assessment for Years 10 & 11. What is even better is that afterwards, without help, I was able to complete Years 7-9.”
Assistant Head Teacher, Cumbria School


Helen joined SISRA in 2014, having worked in schools for over ten years with students aged 11 to 19, most recently in her role as an Examinations Officer and Data Manager. She is an expert in SIMS, with a wide knowledge of all modules ranging from attendance and behaviour management to assessment and examinations. Helen works closely with Senior and Middle Leaders using SISRA to gauge the individual needs of the school and its students and to raise achievement. Helen has received high praise from Ofsted for the systems she has put in place. Outside work, Helen is a qualified football referee and regularly officiates for contributory leagues, such as the Evo-Stik Premier Division and the Conference Premier, the fifth tier of English football.

“Helen was prompt, efficient and very personable, with an effective style of delivery. Communication before and after the session was excellent.”
Assistant Head Teacher, East Riding of Yorkshire School

“Our Assistant Head and I had different agendas to meet on the day but Helen expertly switched between both. She was extremely knowledgeable and supportive on the day and has continued to support us since. She was open to discussions and debates and she happily took any of our queries or ideas back to SISRA for further discussion. I would highly recommend her.”
Data Manager, Northumberland School

“The training session was great, there was no need for improvement. Helen was great, very knowledgeable and patient. She answered all our questions!”
Assessment Officer, Bradford School

“Helen was incredibly knowledgeable, pleasant and delivered the training with considerable expertise.”
Executive Head Teacher, South Tyneside School

“Helen gave clear guidance in answer to all queries raised. The information she had prepared and delivered was exactly in tune with the needs of the staff. We found the session very useful and now feel more confident in using SISRA going forwards.”
Data Manager, Cumbria School

“I had no knowledge of SISRA. Helen explained very clearly in an email (in advance) what was needed for the day, and I had everything in hand during the training. She was an excellent trainer, and with only that training I have been able to set up the whole system on my own.”
Information System Manager, Stockton-on-Tees School


Matt’s SISRA adventure started in 2008 when his school signed up to SISRA Online. In a dual role as a Data Manager and Assistant Exams Officer, Matt used SISRA Online for nearly 4 years alongside SIMS before moving to join SISRA in January 2012. Utilising his SISRA Online knowledge, Matt joined as the first of a team of consultants responsible for training users both within schools and also at SISRA training days. Experienced in various modules of SIMS, including Assessment, Exams Organiser and Course Manager, Matt can assist schools with both their SIMS setup and SISRA Analytics.

“Superb as always and very flexible. Matt always goes the extra mile and his support is very much appreciated by everyone. A real pleasure to work with.”
Assistant Principal, Manchester School

“Matt’s level of expertise and knowledge was fantastic. What really stood out beyond even that was his commitment towards us for the day and addressing the work that we needed to do. This commitment went far above and beyond what we could have expected. All this combined with his good humour and approachable manner made for an absolutely fantastic service.”
Data Manager, Cheshire West and Chester School

“Matt is a pleasure to have supporting the school. Both knowledgeable and able to adapt quickly to changes on the day! Thank you for all your help and support.”
Assistant Head Teacher, Trafford School

“Matt was excellent. With his extensive knowledge within SIMS he was an excellent help! We are now fully up and running with SISRA Analytics which wouldn’t have been possible without him. Thank you!”
Deputy Head Teacher, Cumbria School

“The service provided by Matt was superb. He was extremely knowledgeable and patient. Nothing was too much trouble and he gave everyone involved with data the confidence to utilise new systems.”
Assistant Principal, Manchester School

“Matt was very patient and explained some very technical and complex matters in a very simple and accessible way. As a team, we were really happy with the quality of the training and thought the day was very worthwhile.”
Assistant Head Teacher, Liverpool School


Claire began working as a Data Manager in a large, failing secondary school in 2005. During her 10 years with the school, she was responsible for setting up and managing all data systems, and quickly became an expert in many modules of SIMS. She introduced SISRA to the school in 2011.

She was responsible for training staff at all levels in the effective use of data, SIMS and SISRA Online and Analytics. The school was judged Outstanding in all five categories by Ofsted in November 2013. The inspection report states that the school’s systems to ensure the rigorous collection, analysis and use of student performance data were ‘exceptional’.

In October 2011, Claire was part of the first cohort nationally to gain the SSAT National Data Managers Award.

Claire joined the Consultancy Team at SISRA in 2015, supporting different schools across the Midlands and has recently become the Safeguarding Governor of a local secondary school.

“Claire as always was fantastic. Spotted areas where our knowledge of results day was lacking and ensured all the holes were plugged. SISRA support is brilliant and her offer to look over our setup after a few weeks was typical of an organisation that really care for its clients.”
Senior Leader, Ealing School


“To have Claire come in to help us set up was invaluable. Claire is incredibly knowledgeable with SIMS and our existing systems as well as some of the more ‘school specific’ problems we encounter with timetabling/codes etc. Claire has great interpersonal skills and is a real credit to your company. Very impressed and even more grateful for her support.”
Assistant Head Teacher, Essex School


“Claire’s help has been invaluable in ensuring the school’s data needs are met. Her knowledge, flexibility and passion for data have enabled our school to cover significant distance in a short space of time, ensuring all key stakeholders are engaged in the school’s data management strategy.”
Assistant Principal, Nottinghamshire School


“We were over the moon with the training Claire provided, she was very friendly and nothing but helpful. Since the training Claire has continued to support us by checking our system to ensure we are doing it correctly and thanks to Claire we are now feeling very confident in using SISRA on this coming results day.”
Data Administrator, Lewisham School


“Claire’s help was outstanding and thoroughly appreciated by the admin, support, teaching and leadership bodies. Her presentation and engagement with the staff was excellent and consistent. It can be said that she ensured staff at all levels – using SISRA in all the different ways – understood the power of SISRA and how it really benefits them in their role, and that they understood how to use it as well as seeing the value of it. A thoroughly valuable day for our school.”
Data Manager, Nottinghamshire School


“Claire is very knowledgeable in SIMS which really helped us get through the day. She knew how to tackle every discrepancy involved in set up and no issue was unsolvable. She was informative and really cared about ensuring we made massive progress throughout the day.”
Senior Leader, Ealing School

SISRA welcomes Tom Furlong

SISRA would like to welcome a new member to the Support Team, Tom Furlong! Tom joins the team following his experience in customer service and administration, and in his spare time is studying towards a degree in Spanish!
Tom is enjoying helping and getting to know our SISRA schools, as well as helping to prepare new and helpful resources.

Too much of a good thing – when data stops being useful

I am very aware that, in our blogs, we are busy giving you lots of ideas about how you can analyse, evaluate and use your data in school. However, possibly the most important point is that we are not suggesting that you do it all, and we are not suggesting you do it all the time.

In March, GL Assessment published a report called ‘Smart Data’, which, amongst other things, looked at the attitudes of teachers to data.

Greg Watson, Chief Executive of GL Assessment, said the findings showed that teachers accepted that data was essential in the classroom but that far too much of it was superfluous or poorly applied. “Too much data is about control not improvement, too much of it is misused and far too much of it is pointless. As a consequence, an awful lot of the benefits of assessment are lost,” he said.

One of the key points of the report is: don’t over assess. Too much assessment adds to the teacher’s workload and takes time away from teaching but won’t produce any added insight.

It is very easy, in school, to assume that loads of data is a good idea. We recently worked with a school that updates their assessment grades in Analytics on a weekly basis. Admittedly this does mean that the data is always up to date, but what is it actually then used for? The Heads of Department don’t have time to analyse the data this frequently, or to use it to inform intervention or adjust activity in the classroom. Just having up-to-date data doesn’t benefit anybody; it simply adds to the workload of the teachers without any benefit to the students.

The recent study of workload commissioned by the government, ‘Eliminating unnecessary workload associated with data management’, backs up the GL Assessment findings. The report voices concern that ‘too often, the collection of data becomes an end in itself, divorced from the core purpose of improving outcomes for pupils, increasing the workload of teachers and school leaders for little discernible benefit.’

It follows up its concerns with the advice that ‘government, school leaders, and teachers, rather than starting with what is possible in collecting data, should challenge themselves on what data will be useful and for what purpose, and then collect the minimum amount of data required to help them evaluate how they are doing.’

They finish with five overarching principles that they feel should be applied to data activities in school:

  • Be streamlined: eliminate duplication – ‘collect once, use many times’.
  • Be ruthless: only collect what is needed to support outcomes for children.
  • Be proportionate: the amount of data collected should be proportionate to its usefulness. Always ask why the data is needed.
  • Be prepared to stop activity: do not assume that collection or analysis must continue just because it always has.
  • Be aware of workload issues: consider not just how long it will take, but whether that time could be better spent on other tasks.

The point is really just this: only collect as much data as can actively be used to drive action and intervention, and to benefit the students. Never collect data just for the sake of it; on the whole, bear in mind that less is more.

There is one particular section in the report in which it recommends the use of electronic tracking and analysis packages – with a caveat that I wholeheartedly support: the electronic package should be used to support the process of data management, not define it. This is why our new Life After Levels system in SISRA Analytics has been developed to be so flexible, and can be used by any school with any assessment system and grade type. As the Workload Review Group quite rightly say, the tail should not wag the dog.

Read the original reports by clicking on the links below:

Smart Data – Study by GL Assessment

Eliminating unnecessary workload associated with data management – Report of the Independent Teacher Workload Review Group

by Becky St.John, Principal Consultant

The dangers of calculating a ‘Subject Progress 8’ score

Here at SISRA, we are currently receiving a few enquiries about how to calculate a ‘Subject Progress 8’ score. I have some serious doubts about the validity of such a score, and if your school is using one, I would urge that it is treated with caution. We could calculate one, but does that mean that we should? It can be done for maths, since maths populates its own individual slots, but for the other baskets I’m not so sure.

For a start, let’s have a look at the ways that a subject-specific Attainment 8 estimate for the EBacc and Open baskets could be calculated:

  • Overall A8 divided by 10 (2 x English, 2 x Maths, 3 x EBacc and 3 x Open)
  • Element A8 divided by 3
  • Element A8 divided by national average number of slots filled
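To make the options concrete, here they are in a few lines of Python; every number is invented purely for illustration.

  overall_a8_estimate = 48.0     # hypothetical overall A8 estimate for one student
  ebacc_element_estimate = 14.4  # hypothetical EBacc element estimate
  national_avg_slots = 2.7       # hypothetical national average EBacc slots filled

  option_1 = overall_a8_estimate / 10    # 2 English + 2 maths + 3 EBacc + 3 Open
  option_2 = ebacc_element_estimate / 3  # element divided by its three slots
  option_3 = ebacc_element_estimate / national_avg_slots

  print(option_1, option_2, option_3)  # 4.8, 4.8, 5.33...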

And now let’s have a look at an example. If we calculated the 2015 EBacc subject estimate by dividing the EBacc estimate by 3, this means that students with a 4b would be expected to attain a high D. According to the national transition matrices in the RAISE Online library, in 2015 95% of Physics students nationally actually attained a D, while only 68% of Computer Science students did.

We have the same issue with the Open basket. For simplicity I have again looked at 4b students gaining a D. The variance between the attainment of students with the same starting point in different qualifications means that the expectation of the English Language or Art teacher that their 4b students will attain a D is relatively realistic, but to expect the same of the Business or ICT teacher is not in any way fair. Cutting the cloth differently by calculating the estimate via either of the other options mentioned above doesn’t help either – the whole idea is inherently flawed.

This is compounded by the fact that we don’t actually know, on a national basis, how each qualification has contributed to the calculation of the overall or element ‘estimate’.

Imagine a student has a B in each of four EBacc subjects. According to the DfE, the three of those subjects that contribute to the basket are selected by ‘result number’, which is allocated at the time. Basically, it is arbitrary, so we can never know how any qualification contributes to the EBacc basket at a national level, again making it less than fair to apply equal expectations to the students and teachers of each one. In my capacity as a Data Manager in school, I was advised by my contact in the DfE School Performance Data Unit that, due to the way subjects are selected for each basket, it is not advisable to calculate subject-specific A8 scores.

Lastly, note how volatile the Attainment 8 estimates are. They currently change from year to year, to such an extent that any projections for the following year are highly unreliable.

The DfE has issued no guidance on how to calculate a subject Progress 8 score, because it is basically incalculable. If you wish to try, please ask the DfE to specify how it should be done. If they can provide a fair and realistic way of calculating it, we’ll do it!

by Becky St.John, Principal Consultant

Why your Progress 8 score has gone down

The DfE released the 2016 Attainment 8 estimates on Monday (27/9), and schools across the country recalculated their Progress 8 scores, previously calculated using the estimates from summer 2015. Many of them found that this made their Progress 8 scores, on the whole, go down, which was obviously disappointing.

What is even more disappointing, however, is the stories I am hearing from Data Managers who are being challenged about the accuracy of their initial Progress 8 calculations, and even being blamed for having ‘got it wrong all year’.

Actually, when you think about it, nobody’s Progress 8 score has ‘gone down’. The score could only be calculated once that year’s provisional “estimates” were released, so in fact on 27/9, when the 2016 provisional estimates were released, your school’s Progress 8 score was calculated for the first time. However, since all schools had been projecting the score using the 2015 estimates, the actual score was lower than some people were expecting.

Before 27/9 your Progress 8 figure calculation was most likely: 2016 exam results (or assessments if looking back through the year) compared with 2015 Attainment 8 estimates, so basically your 2016 cohort’s attainment compared with the attainment of the 2015 national cohort.


Today your Progress 8 figure calculation is: 2016 exam results compared with 2016 Attainment 8 estimates, so basically your 2016 cohort’s attainment compared with the attainment of the same, 2016, national cohort.
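The arithmetic is identical in both cases; only the estimates change. Per student, and sketched here with invented figures:

  def progress8(actual_a8: float, estimated_a8: float) -> float:
      # A student's P8 score: the gap between their Attainment 8 and the
      # national estimate for their prior-attainment group, over 10 slots
      return (actual_a8 - estimated_a8) / 10

  # The same 2016 results against the two estimate vintages (invented numbers)
  print(progress8(52.0, 48.5))  # vs 2015 estimates: +0.35
  print(progress8(52.0, 50.1))  # vs 2016 estimates: +0.19 - the apparent 'drop'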

Various educational bodies, including the DfE and ourselves, have been warning school leaders against relying on projected Progress 8 scores as a measure of expected student or cohort success, due to the expectation that the Attainment 8 estimates would change. Due to curriculum changes across the country, it was expected that at least the EBacc estimates would go up between 2015 and 2016, so the drop in P8 figure should not be a surprise to anyone.
Below are some charts that show the difference between the estimates based on 2015 national performance, and those recently released based on 2016 national performance:

From these charts we can clearly see that it is the EBacc element attainment that has gone up considerably, and therefore affected the overall estimates. This is due to schools changing their curricula to encourage more students, especially lower ability students, to take the EBacc, as is shown by the comparison between number of slots filled in 2015 and 2016.

One last word of warning. The 2016 estimates released yesterday were provisional. This means there may yet be changes to the national picture, and therefore the estimates, once the September checking exercise has been completed, and remarks have been taken into account.

by Becky St.John, Principal Consultant

Effectively tracking and evaluating your new Y11 (pt.2)

As discussed in part 1 of this article, ‘Data to avoid for your new Y11’, calculating useful headline figures for our new Y11 students seems to be a problem.
However, there are many ways we can analyse the data to ensure that we know our students are making good progress without the need to project headlines. If we look after the students and departments, the overall performance measures will look after themselves. We need to focus on the small data, rather than the big data.

Let’s start by looking at useful data we have for the non-reformed subjects that will give us a good indication of how our new Y11 students are, in fact, doing. Note that the illustrations are taken from SISRA Analytics, but you can do all these calculations using any analysis system, SIMS or Excel.

Grade banding and point scores

The percentage of students gaining A*-A, A*-C and A*-G for each unreformed qualification, and the overall average point score for each subject, are really useful figures. With the exception of English and maths, our Y11s are doing unreformed GCSEs, so for these subjects the figures can be compared to previous cohorts’ achievement. Even if you didn’t opt into Progress 8 last year, you know from your shadow data in RAISE how your school did on the new performance measures, so you know whether you need to hold firm or raise the game for 2016.

We can, and should, also break these figures down into our groups. Check how your Pupil Premium students are faring, or your students on the SEN register, and also break them down into high, middle and low ability students. From this we can get a good picture of where there might be issues.

Our department leaders can do the same thing for their classes, to really get to the nub of where there might be problems.
We can also look at how they compare to national achievement from last year, for example 90.7% of our Art students are assessed as attaining a C or above compared to 75% nationally in 2015.

We can then compare them to our in-school targets. The red box here shows that despite the fact that the % achieving C+ is above national, our Art department is still below target, because we are expecting 100%.
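If you want to replicate these figures outside an analysis package, the calculation is straightforward. A sketch with invented grades follows; the point values shown are the old-style unreformed GCSE scores, so substitute your own scheme’s points as needed.

  import pandas as pd

  grades = pd.Series(["A*", "A", "B", "C", "C", "D", "E", "B", "C", "A"])

  # Old-style GCSE point values, for illustration only
  points = {"A*": 58, "A": 52, "B": 46, "C": 40,
            "D": 34, "E": 28, "F": 22, "G": 16, "U": 0}

  pct_aa = grades.isin(["A*", "A"]).mean() * 100
  pct_ac = grades.isin(["A*", "A", "B", "C"]).mean() * 100
  pct_ag = (grades != "U").mean() * 100
  aps = grades.map(points).mean()

  print(f"A*-A: {pct_aa:.0f}%  A*-C: {pct_ac:.0f}%  A*-G: {pct_ag:.0f}%  APS: {aps:.1f}")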

Transition matrices

The transition matrices that can be found on the RAISE website can also really help us here. If we compare our own in-school matrices to the national ones from last year, we can get a very detailed idea of how the progress of our current year group, broken down into KS2 sublevels, compares to that of last year’s national cohort. Again, for new Y11, for everything apart from English and maths, we are looking at unreformed GCSEs, so the comparison is valid.

Also, this is an easy way to check who the KS2 level 4 students are who are not making 3 levels of progress, or the KS2 level 5 students who are not making 4 or 5 levels, or look for the rogue students who are not achieving as well as the rest of their department, class or group.
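An in-school matrix of this kind is easy to produce from a student list; a minimal sketch with made-up data:

  import pandas as pd

  df = pd.DataFrame({
      "KS2":       ["4b", "4a", "5c", "4b", "4a", "5b"],
      "Predicted": ["C",  "B",  "B",  "D",  "C",  "A"],
  })

  # Rows: KS2 sublevel; columns: predicted grade; values: % of that sublevel
  tm = pd.crosstab(df["KS2"], df["Predicted"], normalize="index") * 100
  print(tm.round(1))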

Progress 8 – EBacc and Open baskets

While the maths, English and overall P8 scores cannot really be calculated with any accuracy (see the previous article), we can look at the baskets other than English and maths, since for these we can compare like with like, as long as we look at in-school scores using 2016 points compared with national estimates using 2016 points. We need to be careful, because changing entry patterns may have some effect on the EBacc basket estimates, though not so much on the Open. This is also not an exercise just for the performance tables – if a student has a negative score for the Open basket, they are not achieving the median score for this basket achieved last year by students with the same KS2 fine level, and they have the potential to do that.

Here we are looking at the latest predicted grades for current Year 10, Open basket scores. Overall, we are looking at a slightly negative progress score for the open basket. We need to investigate to find out where the problem is.

By looking at the student list, we can identify which students have negative scores, and investigate further. Remember that it might be worth comparing with 2014 estimates rather than 2015 for students with a KS2 of 5.5, due to the boycott effect. I would also suggest looking at who has an only slightly positive score as well, in case national achievement is inflated this year.

By looking at the grades in the Open basket for a student (basket 3), we can easily check in which subject the student needs support. Adam Ant is a 4a student but is only getting a D, which is 2 LOP, in English Language and Literature. He is assessed at D+ though, so he has a chance of pulling it up to a C with help. He also has a D+ for Spanish. The B at the bottom is not being included in the open basket, because it is a short course. Perhaps Adam should have gone for full course RE!

We can also do this for the EBacc basket, but as I have said, we should really avoid projecting P8 scores for the English/Maths basket for the 2017 cohort. So what should we be looking at for English and Maths for this year group that will mean something to us?

Threshold in English and Maths

(previously known as the Basics)

This is not points related, so we can very easily establish who is and who isn’t projected to attain it. We can’t compare the overall figure to previous year groups’ achievement, but we can use it to identify which students need help to get it this year. Achieving a ‘good pass’ in English and Maths is not only of benefit to the school performance tables, but also to the student. And if they don’t achieve it and they want to carry on to sixth form they’re going to have to retake it, possibly more than once.

Here we can see that only 56.9% of my students are attaining it. Who are the rest? We can create a list of which students are on track to get a 5 in English but not maths, and give it to the Head of Maths, and vice versa for English. One thing to remember is that students only need one entry in either Language or Literature for their grade to be counted, unlike the 5 A*-C including EM measure.
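Producing those two lists is a one-line filter each; a sketch with invented students and 9-1 grades:

  import pandas as pd

  df = pd.DataFrame({
      "Name":    ["Ann", "Ben", "Cal", "Dee"],
      "English": [6, 4, 3, 7],  # best of Language/Literature
      "Maths":   [4, 6, 3, 5],
  })

  eng_not_maths = df[(df["English"] >= 5) & (df["Maths"] < 5)]
  maths_not_eng = df[(df["Maths"] >= 5) & (df["English"] < 5)]

  print(eng_not_maths["Name"].tolist())  # for the Head of Maths: ['Ann']
  print(maths_not_eng["Name"].tolist())  # for the Head of English: ['Ben']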


Let’s find out who is getting a good pass in the five EBacc subject areas, and who is not. Which subjects do they need in order to do so? In this case we can include English and maths, because the EBacc is grade based, not points based.

Here we can see that Chris Cornell is getting a ‘good pass’ (C or 5) in all subjects except English. So it is easy enough to pass that information on to his English teacher. Since he is currently assessed at a 4 in that subject, there is a good chance of moving it up to a 5. Likewise Mariah Carey’s D+ in History or Geography.

Let’s get started!

So, as you can see there are plenty of ways that you can ensure that your Y11 students are making good progress. Just focus on the small data. Look after those pennies (students) and the pounds (headline figures) will look after themselves!

by Becky St.John, Principal Consultant

Data to avoid for your new Y11 (pt.1)

To project or not to project, that is the question… Are you worried about how best to evaluate your 2017 cohort? You may be finding that your headline measures don’t seem comparable to previous years, and that Progress 8 scores and EBacc pass percentages look awfully low.

What is the problem? The issue, as you are no doubt aware, lies in the mismatch between the 9-1 and A*-G grades, and between the 2016 and 2017 points, which makes the calculation and use of some (though not all) of the new performance measures quite problematic.

Let’s look at the measures one by one. I’m going to start with the EBacc and work up.

EBacc

At a headline level for Year 10, any projected % achieving the EBacc is going to be accurate, since it is easy to calculate who has a 5 in English and maths and a C in the other relevant subjects. However, it’s not going to be that useful. It isn’t comparable to previous years: to achieve it, your students need to attain a 5 in English and maths, whereas previously a C would have been enough.

As we can see from the Ofqual postcard, only one third of those who would have attained a C will get the new 5, so you can expect your EBacc attainment to drop.

Threshold in English & maths

The next measure, the % achieving the threshold in English and maths, is also easy to project. Again, though, it isn’t comparable to previous year groups, due to the difference between a C and a 5, so the jury is still out on how to judge whether your Year 10 percentage is a good one!

Attainment 8

Your Attainment 8 figures can also be calculated quite easily, by allocating the new 2017 points to the A*-G grades for your unreformed qualifications. Once again, the resulting scores are not comparable to previous years, so be careful how you use them.

Of course the Attainment 8 scores are also used to calculate Progress 8, and it’s here that things get really tricky.

Progress 8

This is because the only national estimates we have are based on 2016 exam results, and are therefore calculated using 2016 points. Comparing in-school figures using 2017 points with national figures using 2016 points is not going to produce Progress 8 figures that mean anything. I have heard data managers and members of SLT say, ‘but even if they are not accurate, aren’t they better than nothing?’ I personally think it’s the other way round: ‘nothing’ is better than data that is at best misleading and at worst downright wrong. The change in the points, as you can see from the graphic, is not even linear, so there is no simple conversion from one scale to the other.
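
To make the non-linearity concrete, here is a sketch of the two scales side by side, using the 2016 and 2017 point allocations for unreformed A*-G grades as we understand them from DfE guidance (do check the graphic and the current DfE documentation before relying on these values):

```python
# Illustrative only: 2016 point scores vs 2017 allocations for legacy A*-G
# grades, as we understand them from DfE guidance - verify before use.
points_2016 = {"A*": 8.0, "A": 7.0, "B": 6.0, "C": 5.0, "D": 4.0, "E": 3.0, "F": 2.0, "G": 1.0}
points_2017 = {"A*": 8.5, "A": 7.0, "B": 5.5, "C": 4.0, "D": 3.0, "E": 2.0, "F": 1.5, "G": 1.0}

for grade, p16 in points_2016.items():
    print(f"{grade}: 2016 = {p16}, 2017 = {points_2017[grade]}, gap = {p16 - points_2017[grade]:+.1f}")
# The gap swings from -0.5 at A* through 0 at A to +1.0 across C-E, so no
# single offset or multiplier maps one scale onto the other.
```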

An alternative option would seem to be to stick with 2016 points across the board. Surely we could compare in-school Attainment 8 based on 2016 points with national estimates based on 2016 points, and it will at least give us an idea of where we stand?

Sadly not. We can’t actually choose to stick with 2016 points in school, because of the 9-1 grades in English and maths. The 8 points a student will be allocated for a grade 8, the 7 points for a grade 7 and so on are 2017 points, and therefore don’t equate directly to the available English/maths estimates, which are based on 2016 points. The easiest way to demonstrate this is with an example. Let’s look at the English basket, and how the change from A*-G to 9-1 affects Attainment 8 and Progress 8 figures for similar students.

One of last year’s Y11, Bobby Smith, arrived in Year 7 with a KS2 English/maths average of 4.4, giving him an English basket estimate of 9.88. He was predicted a low-to-middling grade C in English Language or Literature (it doesn’t matter which, as they are now equally weighted), giving him an English basket score of 10, i.e. 5 doubled. This gives him an English Progress 8 score of +0.06 – just slightly positive.

However, a Y11 student this year with the same KS2 level and the same projected English achievement – a low-to-middling C, which will actually be awarded as a 4 in 2017 – will end up with an English basket score of 8. If we compare this to the 2015 estimate, he gets a pretty hefty negative English basket progress score of -0.94, which is clearly wrong. This will, in turn, distort the overall P8 calculation.
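
The arithmetic behind both examples can be captured in a few lines. This sketch assumes the standard per-element calculation: the gap between the double-weighted basket score and the estimate, divided by the basket's two slots:

```python
# A minimal sketch reproducing the two worked examples above.
def english_p8(grade_points: float, estimate: float) -> float:
    basket_score = grade_points * 2        # English basket is double weighted
    return (basket_score - estimate) / 2   # progress per slot

ESTIMATE = 9.88  # English basket estimate for a KS2 average of 4.4 (from the example)

print(round(english_p8(5, ESTIMATE), 2))  # Bobby's C at 2016 points: +0.06
print(round(english_p8(4, ESTIMATE), 2))  # the same C, awarded as a 4 at 2017 points: -0.94
```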

What would make all our lives a lot easier would be a set of ‘shadow’ estimates based on the 2016 national achievement, but calculated using the 2017 points, to enable projection of P8 scores for current Y11. However, the DfE have said that they are not going to produce any ‘shadow’ estimates, because they actually don’t WANT us to project our scores for the new measures.


Now, if I am completely honest, there are a couple of ways you could get around this. Firstly, you could create your own set of national estimates based on 2017 points, using a formula to adjust each estimate; you could then compare your in-school 2017 Attainment 8 scores to these fake 2017 national estimates to get Progress 8 scores based on 2017 points. Secondly, you could take your 9-1 grades and adjust the points allocated to them, to give them equivalency to the 2016 points allocated to the A*-G grades; you would then have a set of in-school 2016-style Attainment 8 scores to compare to the actual 2016-style national estimates. Due to enormous pressure from our customers, SISRA are looking into the least misleading way of doing this.
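
To be clear about what option 2 involves, here is a sketch. The mapping itself is pure guesswork beyond Ofqual's rough anchor points (7 ≈ A, 4 ≈ bottom of C, 1 ≈ bottom of G), so treat every value as illustrative, not official:

```python
# Option 2, sketched: map 9-1 English/maths grades back onto 2016-style points
# so in-school Attainment 8 can sit alongside the 2016-based national
# estimates. The in-between values are entirely hypothetical guesswork.
NINE_TO_ONE_AS_2016_POINTS = {
    9: 8.0, 8: 7.5, 7: 7.0,   # 7 anchored to an A (7 points on the 2016 scale)
    6: 6.0, 5: 5.5, 4: 5.0,   # 4 anchored to a C (5 points on the 2016 scale)
    3: 3.5, 2: 2.0, 1: 1.0,   # 1 anchored to a G (1 point on the 2016 scale)
}

def as_2016_points(grade: int) -> float:
    """Convert a 9-1 grade to 2016-style points for the comparison."""
    return NINE_TO_ONE_AS_2016_POINTS[grade]
```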

However, I have concerns about using any kind of workaround. Both these options, and any others that may be available, risk encouraging you, as school leaders, to base decisions on unreliable data, because it is calculated using figures that are not real. Just because you can do it doesn’t mean you should, and doing so could lead to some major surprises come summer 2017. If you really have to use a workaround, I would use option 2, since it is almost impossible to know by how much to adjust the national estimates.

So what can you do?

For our new Y11 students, calculating useful headline figures seems to be a problem. In some ways, I can’t help feeling that this should be liberating for school leaders. Instead of constantly worrying about the headline figures, the focus can shift to departments, classes and individual students. There are many ways we can analyse the data to make sure our students are making good progress. In the same way that if we look after the pennies the pounds will look after themselves, if we look after the students and departments, the overall performance measures will look after themselves. We need to focus on the small data, rather than the big data.

For details of how, see part two of this blog, ‘Effectively tracking and evaluating your new Y11’.

by Becky St.John, Principal Consultant

Using data effectively to prepare for an Ofsted inspection

The word ‘Ofsted’ can strike fear into most and leave even the strongest of people a quivering wreck. If this is you, then you’re not alone! Between them, the consultants at SISRA have been involved in nearly 20 Ofsted inspections, including HMI monitoring visits, section 8 inspections and no-notice inspections. We understand how terrifying a visit from Ofsted can be, and with this blog we aim to help you prepare and use your in-school data effectively, so that an inspection doesn’t loom over you like a dark cloud.

What are Ofsted expecting?

Before any inspection, always make sure that you have read the latest Ofsted Inspection Handbook, as well as the Ofsted myths available from the same link (surprisingly, there are quite a few!). Don’t let an inspector push you into providing data and figures that you believe could be inaccurate; stick with the facts that you know are accurate. Ofsted do not expect performance or pupil-tracking information to be presented in a particular format: the information should simply be provided in whatever format the school ordinarily uses to monitor the progress of its pupils. Neither do Ofsted require schools to undertake additional work specifically for the inspection, which is why they normally contact the school by telephone during the afternoon of the working day before the inspection. On some occasions Ofsted may inspect without notice, although when this happens the lead inspector will normally telephone the school 15 minutes before arriving on site.

There are so many paragraphs within the Inspection Handbook that we can’t possibly comment on all of them, but it is important to note that inspectors will evaluate evidence relating to the achievement of specific groups of pupils and individuals, including disadvantaged pupils, the most able pupils, and pupils who have special educational needs and/or disabilities.

Disadvantaged Students

With regard to disadvantaged students, the handbook states that inspectors will gather evidence about the use of the pupil premium in relation to:

‘Any differences made to the learning and progress of disadvantaged pupils as shown by outcomes data and inspection evidence’

I will be using SISRA Analytics to demonstrate how we can prepare for this, but it can of course be done using Excel or within your MIS. For example, within SIMS Assessment Manager an extra student information column can be added to a marksheet by right-clicking on the student name column and ticking Pupil Premium Indicator. Alternatively, the marksheet can be filtered using the Group Filter icon to show just Pupil Premium or non-Pupil Premium students.

In SISRA Analytics, if we take a look at our Y11 Spring assessment (fig 01), we can see at a glance what KS2 baseline our disadvantaged students entered our school with, compared to non-disadvantaged students and national figures.

fig 01

If we now look at the Attainment 8 score in the Year 11 Spring term (fig 02), our disadvantaged students are falling well below national, whereas our non-disadvantaged students are achieving well.

fig 02

But if we look at our SISRA Analytics tracker report (fig 03) and export it to Excel (fig 04), we can apply a simple formula to show progress between each assessment point, and also progress from the Y10 Autumn assessment. We can see that progress for non-disadvantaged students is 6.18 and for disadvantaged students it is 6.17, so progress is actually very similar. So we need to ask: what happened in Years 7-9? If we had a flightpath or Expected Attainment Pathway model tracking progress from Year 7 through to Year 11, we would be able to closely monitor any dips in progress. A minimal sketch of that calculation follows the figures below.

fig 03

fig 04
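
For anyone reproducing this outside Analytics, here is a minimal sketch of that ‘simple formula’: average A8 points at each assessment point, differenced by group. The column names and values are hypothetical; substitute your own tracker export:

```python
# A minimal sketch of progress between assessment points, split by the
# disadvantaged flag. All data here is illustrative.
import pandas as pd

tracker = pd.DataFrame({
    "disadvantaged": [True, True, False, False],
    "a8_y10_autumn": [38.0, 41.0, 47.0, 52.0],
    "a8_y11_spring": [44.5, 46.9, 53.1, 58.3],
})

by_group = tracker.groupby("disadvantaged")[["a8_y10_autumn", "a8_y11_spring"]].mean()
by_group["progress"] = by_group["a8_y11_spring"] - by_group["a8_y10_autumn"]
print(by_group)  # near-identical 'progress' values mirror the 6.18 vs 6.17 finding
```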

Most Able

The same applies to the most able. In fig 05 below, we are looking at the Y11 Spring assessments for our Upper/High ability students: 32.9% are achieving an A/A*, compared with 17.1% nationally, and our average points are significantly higher than the national average.

fig 05

Pupils with Special Educational Needs and/or Disabilities

It is important to be able to determine how SEND pupils are doing academically. Again, this can be done via Excel or within your MIS, but it can also be filtered very easily within SISRA Analytics. In fig 06 below, we can see at a glance how students with different special educational needs are doing compared with national figures.

Fig 06 shows how our SEND pupils are doing in relation to the EBacc.

fig 06

Fig 07 shows the same pupils in relation to the Basics measure.

fig 07

What NOT to provide to an inspector

Finally, it is worth noting that on no occasion should you be asked to predict a Progress 8 score for your cohort of students.

In fact, RAISEOnline state within their FAQ that

‘it is not possible in advance to estimate what the actual Progress 8 scores will be as they are based on the national average results of the same cohort.’

They go on to say

‘trying to predict Progress 8 scores based on previous years’ Attainment 8 estimates is likely to be time consuming and inaccurate and is not something that the Department wants to incentivise’

In March 2016, Charlotte Harling, our Principal Consultant, helped out at a school during their inspection. The school was previously judged ‘Requires Improvement’ and is now judged ‘Outstanding’. A quote from their report says:

‘procedures to monitor pupils’ progress are first class. Leaders accurately identify pupils who are not performing at their best and swiftly intervene to secure improvements’

During the inspection, Charlotte was asked what the school was predicting for 5A*-C (EM) for the current Year 10. Charlotte explained that it was not a measure they were looking at, and that she would not be able to give a figure for it. The inspector accepted this answer.

She was also asked to predict a Progress 8 score for Year 10; again, she explained that this was not possible, but she did give them a threshold figure for English and maths.

A confident approach

These are just a few things to consider when preparing for an Ofsted inspection. An inspection should not be scary if you’re prepared. Of course nerves are natural, but are those butterflies in your stomach really just adrenalin?

If you have any comments or have recently been through an Ofsted inspection and have anything that would be useful to share with other schools, we’d love to hear your thoughts.

Finally, if you’re expecting Ofsted any time soon, good luck!

by Claire Spencer, Data Consultant

Results Day Workshops

The SISRA team have released workshops specifically designed to ensure that you are ready for Results Day! The practical, hands-on workshops were designed by our team of consultants, each of whom has first-hand experience of working in a school.

The agenda covers the full results day process for Key Stage 4, including preparation during the summer term, uploading results on Embargo Day, and analysis of the resulting headline, qualification and student-level reports.

To find out more about the results day workshops including fees, dates and locations please click here.

SISRA Support Team Welcomes Charlotte Meadows

SISRA would like to welcome Charlotte Meadows to the Support Team! Charlotte graduated from university with a BSc in IT & Multimedia Computing, and joins us with previous experience in administration.

Charlotte is looking forward to getting to know and supporting schools using the SISRA services, helping to troubleshoot their data queries via Live Chat & Email.

SISRA welcomes Sophie Rawson!

Sophie moved to Liverpool to study in 2011. After graduating with a degree in English Language, she spent a year travelling and working in a ski resort before joining the SISRA Team.

Spring term DataMeets

We have released the dates and locations for Spring term’s DataMeets.

Data Managers, Exams Officers and other support staff are all invited to our DataMeets. If you haven’t yet heard about them, DataMeets are free regional events hosted at SISRA schools all over the country. The purpose is simply to give support staff the opportunity to get together each term to discuss important changes, seek advice and network with their peers.

If you would like to attend one of our DataMeets, please visit our Eventbrite page to register (places are limited to 2 attendees per school), or click here for further details.

Important note: Please be aware that these DataMeets are for support staff only. If you are a member of the SLT and would be keen to run a similar meeting for your peers in your local area, please contact us using the details below.

If you’d be interested in hosting or chairing a DataMeet, or would like any further information, we would love to hear from you. Please email 

SISRA welcomes Claire Spencer

Claire began working as a Data Manager in a large, failing secondary school in 2005. During her 10 years with the school, she was responsible for setting up and managing all data systems, and quickly became an expert in using SIMS. She introduced SISRA to the school in 2011.

She was responsible for training staff in the effective use of data and SISRA Analytics, and is proud that staff became proficient enough to enable the school to hold paper-free Results Days. The school is now judged outstanding by Ofsted – at a recent inspection, the school’s systems for ensuring the rigorous collection, analysis and use of student performance data were described as ‘exceptional’.

Claire was part of the first cohort nationally to gain the SSAT National Data Managers Award. Claire joined the Consultancy Team at SISRA in 2015 and is looking forward to supporting different schools across the Midlands.

SISRA welcomes Jasmin Masih

SISRA would like to welcome Jasmin Masih to the Data Consultant team.

Jasmin began working in a large rural secondary school in 2008 and quickly became an expert in both SIMS and SISRA Analytics. During her time at the school she streamlined data processes and worked hard to ease the burden on school staff, as well as gaining significant experience in using Exam Organiser, Assessment Manager and Course Manager. Jasmin has previously worked in FE as a data manager and has also worked for an awarding body overseeing international examinations. She was until recently a board member of the Exam Officers Association, and as part of this role sat on the HM Government Data Advisory Group.

Jasmin joined SISRA as a Data Consultant in 2015, expanding SISRA’s consultant presence in the south of England.

SISRA welcomes Jessica and Jasmin

SISRA Limited are pleased to welcome Jessica and Jasmin to the Consultancy team!
Jessica Maddock has joined the team as an Assistant Team Coordinator. She was previously employed as an attendance officer in a local secondary school.

Jasmin Masih has joined the consultancy team as a Data Consultant. She has many years of experience in data management and is also a board member at the EOA. She is located in Bedford and was most recently a Data and Cover Manager.

Support team welcomes Graeme Radcliffe

The Support team has expanded to provide additional help through Live Chat for our SISRA Analytics and SISRA Observe schools.

Graeme Radcliffe has been with us for a month, after graduating with a BSc (Hons) in Computer Studies.

Exam Results Workshops – summer 2015

Throughout the summer term, SISRA held over 25 exam results workshops across the country. The workshops were designed to help prepare administrators for Results Day, with hands-on practice in uploading data, and to help prepare Senior Leaders for the analysis of exam results.
With 97% of attendees rating our workshops as ‘good’ or ‘excellent’ and 98% agreeing that the training will help them use SISRA Analytics more effectively in school, we think you’ll agree that the workshops have been a huge success!

Attendees seemed to particularly enjoy the layout of the workshops and the use of practical exercises to mimic real-life school scenarios. The workshops have given users the confidence and knowledge to ensure a successful Results Day!

Autumn term Training Days and Workshops

We have released the autumn term training days and workshops, which will be taking place across the UK. Our sessions have been based on feedback from our customers to make sure we meet their needs.
Due to high demand, we have also released a brand new workshop specifically designed for Middle Leaders in English and maths. Please see below for a list of available training days and workshops.