Author: Sophie Rawson

Supporting CPD and QA during lockdown

I had been looking for a programme that could bring together CPD, Performance Management and Quality Assurance for some time. These areas are intrinsically linked and storing everything in different spreadsheets was both onerous to complete and difficult to analyse. I’d looked at various platforms and when I received details of SISRA Observe through the post it sounded like it did everything I wanted to do. We began with the free trial and the support we received from the SISRA team was excellent.

When it came to lockdown, CPD became a huge priority. We wanted to see the time as an opportunity for colleagues to attend the webinars and do the reading that they ordinarily wouldn’t have had time to do. We also wanted to make sure that all the CPD that was undertaken was evaluated and this is where SISRA Observe came into its own. It combined effectively with the new school CPD website and by using online meetings and screen recordings, we were able to train all colleagues on how to log and evaluate each piece of CPD that they undertook, including whether or not it met their expectations.

Using SISRA Observe, 924 CPD activities undertaken by colleagues at Queen Katharine were evaluated. It complemented the online delivery of CPD and allowed Subject Leaders to monitor exactly what their teams were learning about whilst they were off. It also allowed them to point people towards CPD that had been positively evaluated and from which everyone could benefit.

We are well aware that we have not yet been using SISRA Observe to its full potential. Moving forward, all of our Quality Assurance and Performance Management work will move online. We are very excited about this and the impact it will have on our students.

 

Jo Hammond, Assistant Principal, T&L, CPD, QA, Queen Katharine Academy, Peterborough

Online teaching … a beginner’s guide

Online learning

In a time when the safest place is often in our own homes, the teachers of our country have been going above and beyond to help our students continue to learn and grow. With schools closed and face-to-face lessons cancelled, the world has turned to online and remote teaching and learning. Whilst some have embraced this opportunity and adapted well, some of you may still have this on your to-do list – if this is you, here are some tips and information that may be of use and help you turn your session into an internet sensation for all the right reasons.

Choose where your session will be

First, you need to choose your platform – if your organisation already has one, you’ll probably have to use that. If not, there are lots to choose from: alongside the original online meeting platform, Skype, there’s Zoom, GoToMeeting, Google Classroom, Adobe Connect, VoiceThread, Blackboard Collaborate and many more. If you have a choice, check out a few and see what works for you.

Sort your set-up and get familiar

  • Test run
  • How does the screen look to your students?
  • Mute microphones
  • Clear, neutral background
  • Good lighting and camera angle
  • Split your screen

Once you’ve decided which platform you are using, you’ll want to get familiar with it. Have a test run with a friend or colleague to help you understand how the features work: are buttons located in the same place on the screen for you and your students? How do you share your screen so the students can see your PowerPoint and other resources? How do you mute and unmute the students’ microphones?

You can also take this opportunity to figure out where you are going to set up – somewhere neutral without distractions behind you, if possible. Good lighting and a sensible camera angle are useful too; nobody is expecting a production fit for the BBC, but you want it to look professional, to be at eye level with the webcam and for your students to be able to see you clearly. How will the screen look to your students? Will they be able to see that picture of you and your family in the background, or that pile of washing you’ve not had chance to put away yet?

Some teachers and tutors recommend splitting your screen: have the session or ‘meeting’ open in one window, taking up half of your screen, and the resource you’re using open in a separate window on the other half. This way you can see the thumbnails of the students and keep your PowerPoint running on your screen share, whilst checking things on a worksheet, for example. Again, practise this and have a play around with what works best for you.

The session – planning and preparation

  • Adapt for remote delivery – do you really need that group discussion?
  • How will students view the session?
  • Email resources to students along with the meeting link
  • Check links you are planning on using

Next up, your session plan – we’re not going to pretend you can do everything online that you can in a classroom; we’re going to have to adapt our delivery. For example, group discussion is out of the window with any more than three or four students – the sound delays and the practicalities of people talking over each other just don’t work in an online setting the way they do in a classroom. One of the platforms mentioned above, Blackboard Collaborate, allows you to split students into smaller groups and put them in a ‘breakout room’ where they can chat and work together on smaller tasks and then rejoin the main session. If you have large class sizes, or discussion is a pivotal part of your delivery, this might be the answer for you.

The other part of your session you may have to adjust is how you assess that learning has taken place – remember, you can’t rely on your amazing body-language reading in this setting, and we’ve already mentioned the pitfalls of group discussion, so you need to assess differently. You could utilise the features of the platform and use the reactions or polling/voting functions; you could shift more of the assessment to the activities, whether they are completed during the session or as a follow-up; you could even try something like a Kahoot! quiz, where the students log in on their phones with a pin that you give them and complete a multiple-choice quiz that you have prepared before the session.

Other than that, think about how your students will be viewing the online session. They might not all have laptops or computers, they might even be joining you on their phones – try to make sure your materials and resources are clear and easy to use in this context. Emailing presentations and worksheets to the students beforehand along with the meeting link will mean those that can will be able to print them out so they have a physical copy to work from. Make sure you send the email with enough time for your group to do this if they want to. Don’t make it a requirement though, as not everyone may have access to a printer.

Don’t forget to check that all the links you’re going to use are working too – finding a great YouTube clip that perfectly demonstrates your point and then it not working during your session is somewhat frustrating!

During the session

  • Last-minute preparation and login
  • Introduction and ground rules
  • Clear speech and questioning
  • Be vigilant towards the students’ reactions
  • Resources

Just before the session starts, make sure you have been to the bathroom and got yourself a drink, just as you would for a classroom; you don’t want to have to dash off mid-sentence if you have a tickly cough from all the talking. Then, get yourself comfy and log in to the ‘meeting’ so you’re there before your students – you can always prepare a slide or sheet saying ‘Session starting soon …’ if you don’t want to make small talk.

If you have students that haven’t used the platform before, then introduce them to it and set your ground rules just as you would in the classroom – how do they ‘put their hand up’? How do you want them to answer questions? If you have cameras on, make sure they know that everybody in the session can see them (we’ve all seen the amusing videos of online meetings doing the rounds on social media). Let the group know if you are muting everyone’s microphones to avoid background noise and feedback, and direct them to signal to you if they need to tell you something. There are a few ways that you can manage this – you can go for a simple wave of the hand in front of the camera, or use one of the software functions such as a ‘thumbs up’. Some platforms have a ‘chat’ box which you could direct students to use for any questions – be mindful about setting boundaries for this, though; you don’t want the students having a chat in there while you try to read it and deliver the session at the same time.

Speak a little more slowly and clearly than you usually would to allow for sound delays and internet issues. This goes hand in hand with your questioning methods – asking an open question to the whole class will be met either by silence or by everyone speaking over each other to answer, depending on the confidence of your group; ask targeted questions of specific participants, and remember to unmute their microphones to allow them to answer, if you have muted them!

Keep an eye on the thumbnails of the participants – do the students appear engaged? Does anyone have any questions? Does anyone appear to be struggling? We’ve already mentioned the lack of opportunity to read body language, so try to be extra vigilant to what you can see.

You may want to try to incorporate some additional resources into your session. This will vary depending on your group’s age range and ability, but it is especially useful and engaging for younger students. These could be photos or images you hold up, flashcards, videos or even a song to help liven things up. You could also have relaxing music playing while the students are completing activities by themselves.

When closing the session, make sure you’re crystal clear about what you’re going to do to follow up the session and what you expect the students to do. Are you going to email them any ‘homework’ activities? When do they need to complete them by? When is the next session? How should they contact you with any questions? How should they submit any work to you? When should they submit it by? Of course, you can put all of this in the follow-up email too, but if you have mentioned it during the session, they are more likely to look out for it.

Make sure the ‘meeting’ is fully closed down before you do or say anything you don’t want your students to see or hear.

After the session

  • Well done!
  • Evaluate
  • Send that email

Firstly, if that was your first online teaching session, take a deep breath and give yourself a well-deserved pat on the back – it’s not always easy trying new things and embracing new technology, but you did it!

Just as you would with a classroom-based session, let’s do a bit of evaluation… did everything work the way you thought it would? What went well? Would it have been even better if…? What didn’t work? What will you change? You may be slightly drained from delivering the session but if you do this now, the session will be fresh in your memory and you may come across things you want to point out in your follow-up email to the students.

Lastly, send your follow-up email, making sure you have confirmed in writing any follow-up information or actions you want the students to take, and that you have attached everything required.

Generating ‘Centre Assessment Grades’: the challenges


In the guidance published by Ofqual on 3rd April, centres have been asked to provide a ‘centre assessment grade’ for each student in each subject, and to rank order the students within each grade. [Screenshot of the Ofqual guidance.] School leaders across the country will be determining how best to manage this process but also questioning the likely outcome for their cohort.

To help think my way through all of this, I have carried out some analysis on last year’s cohort in GCSE English Literature. The graph below shows the percentage of students achieving each grade in the summer of 2019, alongside the predictions (made at Easter) and the outcomes of the mock exams, which in my school are taken immediately after Christmas (Paper 1 in the E-Bacc subjects) and then immediately after February half-term (Paper 2 in the E-Bacc subjects and all other papers).

[Graph: percentage of students achieving each grade in GCSE English Literature in summer 2019, alongside Easter predictions and mock exam outcomes.]

Some thoughts about our actual GCSE results

There has been much debate on social media about students in turnaround or rapidly improving schools losing out if the previous year’s results (or those in recent years) are taken into account when turning the centre assessment grade into actual results. In truth, all schools have significant in-school variations between subjects and across years. In my school, Progress 8 in 2017 was -0.4. The improvements we put in place have meant national average progress (-0.1) for the last two years (2018 and 2019). But that improvement hasn’t been seen uniformly across all subjects.

[Graph: school results, 2017–2019.]

In English Literature, 18% of students achieved a 7+ in 2017, with 28% in 2018 and 29% in 2019, and that pattern (a big improvement in 2018, largely maintained in 2019) could be seen in the percentage of students achieving grades 4+ too. The prior attainment of the cohort over these three years is similar although cohort sizes fell as the impact of the school’s troubles over the previous 5 or 6 years was felt.

But English was not typical of all departments. Some subjects have not improved in line with the school average.  Others have seen fluctuations which buck the trend. Cohort sizes and makeup in some subjects will have had an impact, as will subject leadership, the stability and experience of the teaching teams, and a variety of other factors.

It may be too simplistic to say that students in improving schools will lose out, and those in schools going in the other direction will gain. Only those at Ofqual and the exam boards, far more able than me, will know how they will treat the centre assessed grades when they receive them.

All we can do is control those things we can control. Which brings us to predicted grades.

Some thoughts about our predictions

I have spent a large part of my career in four schools discussing what we mean by a ‘predicted grade’. I don’t think my experience will be hugely different to those elsewhere and I would be surprised if every teacher within any given school (let alone between schools) gave the same definition of a ‘predicted grade’.  I have seen it confused with both target grades and ‘working at grades’. We also know that asking some teachers to ‘predict a grade’, even at the end of Y11, without the crutch of the target grade next to them in their mark book, causes anxiety!

Predicting grades with any degree of accuracy can be surprisingly difficult but that should not deter us on this occasion. Our predictions for English Literature in 2019 were, seemingly, relatively accurate.  They followed the reassuring bell curve we see with the final outcomes. At the ‘top end’, the English department predicted 25% of students would achieve 7+ and the actual outcomes were slightly better (29%). At 5+ the department was spot on (75% predicted and achieved).

It is also worth saying that the purpose of these predicted grades (and any predicted grades) differs significantly from the ones we have been asked to produce. The predictions we have given in the past may (like it or not) have been used by some teachers as a kick up the backside for their students. They will almost certainly not have undergone any intensive standardisation or scrutiny in the way that mock examination grades might have done, and they will have suffered from the inherent assumption that they don’t really matter!

I am old enough to remember the endless sheets of paper which had to be completed (HB pencil only) with predictions for the exam boards for their use, should there be queries or special consideration required. Whatever happened to them?

To return to our predictions, and taking all of the above into consideration, it is still worth noting that however accurate these predictions are, we haven’t considered them at a student level. The accountability measures have been waived, and the stress on senior leaders about whether a department is able to provide some reassurance about headline figures to tide us over the summer is now irrelevant. What matters now is all at a student level.

If our English department were able to accurately predict that 75% of students would achieve a 5+, did they get it right with every student? Nearly. Of the 102 students they predicted would achieve 5+, 8 failed to reach that threshold (a success rate of 92%), and all bar one of those eight achieved a 4 (the other student achieved a 3).

Do thresholds matter?

I think they do. We know that this system is imperfect. We should probably try to stop thinking about whether all students will achieve the grade they would have got had they taken exams. They won’t (more of that below) so we should hold onto what we are trying to achieve.

Whilst we hope to make this process both as robust and as fair as possible, we are relying on university admissions officers and Heads of Sixth Forms to provide any fairness that this process is unable to deliver.

My own A Level grades weren’t the best, but I met the requirement (just) for my first-choice course at Newcastle University, and I moved on. I had a second chance to prove my academic credentials there. I believe in comprehensive and inclusive education but have sat in many a (difficult) meeting with students and parents where sixth form entry requirements have not been met.  None of the Sixth Forms I have worked in have been entirely open. You couldn’t pitch up with a 3 in GCSE maths and take A Level Physics.  You won’t be able to again this year, but which Head of Sixth Form is going to sit in front of a parent and ‘hold the line’ with a child who might have needed a 5 in English to take History but has been ‘awarded’ a 4?

So, keep that in mind as we delve more deeply into the accuracy of individual predictions. Of the 18 students the English department predicted in 2019 would achieve a 7 (not a 7+ but that specific grade), only 7 did so. 3 achieved a grade 8, 7 achieved a grade 6, and 1 achieved a grade 5. That is a success rate of just 39%. Clearly, the 3 students achieving the 8 would have been happy to have done better than they were predicted. They would have been (hypothetically) less happy under this process, as these are the students who would potentially ‘lose out’ by being awarded a 7. The rest would have ‘gained’, as they were predicted a higher grade than they achieved.

You don’t know what you don’t know of course, so most of our current students will only be disappointed if they are awarded grades which don’t reflect the work they believe they put in. We all know students who believe they can pull it out of the bag in the final exams despite all evidence to the contrary. They might be disappointed this summer. Our least accurate prediction last year was a girl who was predicted a 5 and achieved an 8. Sadly, those success stories won’t happen this year. But being positive, those students who, despite all their effort and hard work, go to pieces in the exam, won’t be penalised either. These are the extremes; they are few and far between every year.

The curse of overpredicting

The example I have given here is of a department which usually predicts cautiously. Last year 60 of the 133 students achieved the grade they were predicted, and a further 33 students achieved a grade (or occasionally 2) more. Those departments which have typically given students the benefit of the doubt and overpredicted throughout their GCSE courses could well have disappointed students on their hands. In the past, students (and their parents) will have attributed this discrepancy to their performance in the examination (although in my last – very middle class – school, this was still often seen as our fault anyway).  This time, if we don’t anticipate this and manage expectations and communications accordingly, we’ll have some parents saying, ‘you predicted him a 6 all the way through and he achieved a 3’. Of course, these discrepancies may not be due to overprediction, but because of the improving school or department syndrome outlined earlier.

We need to keep returning to the things we can control.  Remember, the predicted grades of 2019, outlined above, were made in very different circumstances. It may be that where teachers “over predicted” (there were 40 such predictions in English Lit) it was because they were expecting a bigger improvement from the mock exam results than materialised.  Incidentally, looking at our mock exam results in 2019 (in the first graph), I am relieved that these aren’t being called for or used to work out the final grade. That was never a realistic possibility anyway.

Generating a centre assessed grade this year

I have read a number of good suggestions on social media already and will incorporate these into our system.  This has still to be finalised but is likely to involve some of the following (in each subject):

  • Finding a starting point – it could be the mock examinations (see above) or the last set of teacher predictions which were given to the students. Using the mock exams has the benefit of us being able to rank order the raw scores (it is a starting and not an end point)
  • Overlay any other data the department might have (we don’t use grades much and give raw scores out to students for assessments they do) which can be ranked too
  • The Curriculum Leader will need to pull this together first, but they will need a set of predictions and, if possible, a starting rank order for each grade (a rough sketch of how the final list fits together follows this list)
  • Fix up a series of meetings. This is a huge challenge, as it will involve all teachers of the cohort “meeting” together remotely. If that doesn’t happen, the logistics of sending round lists don’t bear thinking about.
  • The meetings should allow a case to be made for every single child. It is too easy to ignore the quiet students so that they find themselves lower down the ranking, or to promote the case of the lovely hard-working students at the expense of someone who has been challenging.
  • When each student is considered, the class teacher should be able to say, “they are a solid 5”, or “they are a very strong 5 and should be near the top of that group”, or “they are a borderline 4/5”.
  • It is impossible for us to get a ranking like this 100% right, but we are trying to ensure that if boundaries are set so that 5 of our grade 6 students drop to a grade 5, it is those we consider the weaker ones and not those who could have been pushing for a grade 7.
  • There will need to be a few of these meetings, and plenty of reflection. Teachers will know 30 (or even 60) of the students on the list. The class ranking can at least be accurate (or as accurate as this process allows).
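As a rough illustration only (the student names, grades and ranks below are all invented), the end product of those meetings amounts to a simple sort: each student carries a centre assessment grade and a rank within that grade, and the submitted list orders students by grade and then by within-grade rank.

    # Minimal sketch with hypothetical data: order students by centre
    # assessment grade (highest first), then by rank within that grade
    # (1 = strongest), as the Ofqual guidance asks.
    students = [
        ("Student A", 5, 2),
        ("Student B", 6, 1),
        ("Student C", 5, 1),
        ("Student D", 4, 1),
    ]

    # Sort by grade descending, then by within-grade rank ascending.
    submission = sorted(students, key=lambda s: (-s[1], s[2]))

    for name, grade, rank in submission:
        print(f"{name}: grade {grade}, rank {rank} within that grade")

The hard part, of course, is not the sort but the professional judgement that produces the grade and the rank for each student in the first place.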

Once these meetings have taken place, and a predicted grade and ranking have been agreed, they need to be considered by senior leaders. We have to be realistic. Are we predicting 20 students to get a 9, having never had any before? We owe it to the system not to submit unrealistic grades. As Duncan Baldwin from ASCL has said, much as we may be tempted, this is not the time to right the wrongs of the system and inflate the grades of those in vulnerable groups, unless their work and prediction demand that.

If this section is still a little muddled, my only excuse is that I am still thinking about it.  When we ‘return’ to school after Easter, I will have a plan.

Finally

I want to return to an earlier point. This is the least-worst option. Far from believing the future will not involve exams, I suspect we may find after this that we need them more. Teachers aren’t enjoying this process at all! But remember, we should be devoting our energies to the things we can control or influence, and that is to ensure that students move on to their chosen course or career. Sure, they will need these grades in years to come, and will never shake off being the 2020 cohort, but for now, we want them to join our sixth forms, colleges and universities, and we don’t need an exact grade for that to happen.

There are more students than we care to mention (I live with one) for whom achieving the exact (high) grade they have worked for matters, but we aren’t dealing with exact anymore. Of all the uncertainties in the world right now, achieving an A rather than an A* or a 4 rather than a 5, is not our most challenging.

This post first appeared on https://framheadteacher.com/ 

A Data Demise? Part 2 – Intent, Implementation and Impact – the new Ofsted framework

In this two-part series, education consultant and former head teacher Daniel Taylor discusses the changing data landscape in UK schools.

Read part 1 here

So, can a school now be judged good or outstanding off the back of consistently poor progress and attainment? Evidence from Ofsted’s own curriculum research, conducted when trialling the new framework, certainly suggests the potential for this to be the case, with approximately half of previously outstanding schools scoring moderately or poorly on curriculum during the trial – the natural caveat for this being that some such schools had not been inspected for a considerable number of years. ‘Intent, Implementation and Impact’ are the new buzzwords and it remains to be seen for how long the latter might end up sustaining poor outcomes. The rationale, after all, surely is that a high-quality education and curriculum will deliver high-quality outcomes for the learners?

Like so many others in education, I am aware of many staff having left the profession over the last decade, ultimately citing obsessions with targets and progress and a lack of realism as key factors in their decisions. For many, perhaps, the new framework does offer some hope. Having implemented a full curriculum review in my last post as head, giving staff time to look in detail at what was being delivered, in what order, by whom and to whom proved valuable even to some of our more experienced staff. Critical as ever, however, was ‘time’. A thoughtful, self-critical analysis can only be achieved properly by allowing staff to look collaboratively and independently at present practice and schemes of work. For many this led to logical and common-sense improvements. Perhaps given my chance again, I might also insist on incorporating more feedback from students. Some of the best practitioners do this intuitively, and for a long time it has been incorporated into quality assurance reviews of many vocational subjects, but perhaps not so much for GCSE and A Level qualifications. The same students, after all, play a key role when schools are visited but, more importantly, should be the key beneficiaries of any review process within education.

Has the importance of data, however, really diminished? Although no longer at the forefront of inspections, the need for accurate internal data and its subsequent analysis naturally remains. I am not one for turning everything into numbers; indeed, if schools are truly to be judged on the quality of what is being delivered and how it is being delivered, then this new focus is to be welcomed. Nevertheless, measures of internal progress and attainment, used properly, allow staff and leaders to identify genuine strengths within their schools and draw upon them to tackle their weaknesses. Furthermore, such data facilitates comparison internally, locally and at national level, enabling leaders to benchmark their own schools and take appropriate action. It allows for genuine review and evaluation. Most importantly, it should be used as a powerful tool to inform both parents and students but, conversely, should not be used as a stick with which to beat – be that staff or students. Where there is an average, there are always those both above and below, but should being below average consign students, staff and schools to negative labelling? Students need to know how to improve and staff need to be able to identify and address these shortcomings, and data has a role to play here. Progress 8 is not a perfect measure. For many it does not take into account sufficient external and contextual factors. Curriculum and teaching should be shaped around the needs of the students and not based on a perverted obsession with outcomes and progress.

So there remains a role for progress and achievement tracking, but perhaps no longer is it, nor should it be, the predetermining factor in a school’s Ofsted label. If deep dives can truly gauge the quality of a school’s education then, for now at least, our spreadsheets and graphs can take more of a back seat.

A Data Demise? Part 1 – From Progress measures to curriculum

In this two-part series, education consultant and former head teacher Daniel Taylor discusses the changing data landscape in UK schools.

Since the advent of league tables over 20 years ago, data for school leaders has always been foremost in their minds and integral to decision-making. The need to identify how well your school is performing in terms of attainment and progress, in addition to any exclusion and attendance data, has provided a core of evidence for a school’s accountability measures. It has so often been the cornerstone of decisions around personnel, funding and budgets and, as such, has played a fundamental role in schools – whilst causing many a sleepless night for school leaders. But has the elevated role of data now started to diminish?

For many years, I was the ‘go-to’ person for staff unsure about any aspect of performance or progress-related data. As a former head teacher now working as an education consultant, I frequently speak to school leadership teams. Since the inception of the new inspection framework, a significant majority of head teachers appear to be in agreement about two key issues: firstly, their apparent isolation from the process beyond their initial briefing conversation and, secondly, as I will briefly focus on here, an unwillingness from inspection teams to even contemplate the inclusion of internal data in conversations about attainment and progress.

The move to Progress-based key performance indicators had been viewed by many in education as a big step in the right direction – a fairer measure, allowing particularly those schools with less able cohorts to demonstrate their effectiveness in delivering a high-quality education to their youngsters, even if achievement outcomes weren’t exceeding national thresholds. As a consequence, schools consistently strived to demonstrate firstly ‘expected’ levels of progress and then positive Progress 8 scores. Of course, an inability to predict this accurately (or impossibility, in the case of Progress 8) meant that the validity of such data was oft called into question. Is it any wonder, however? The muddied waters of ‘life after levels’ gave schools the ‘freedom’ to devise their own methods of gauging progress. Although welcomed by some, it provided little scope for comparison, was time-consuming for staff and required a re-education for staff, governors and students alike. Although the cynics amongst us would have little sympathy for inspection teams, it doesn’t take a genius to appreciate the difficulties they encountered when moving from one school to the next in trying to interpret what was put before them. Furthermore, what incentive was there for any school, on the cusp of inspection, to present a poor set of figures during inspection visits? For those who spent many hours, weeks and even months sweating over new comprehensible tracking systems, this move away from internal data might come with considerable frustration (or perhaps in some cases relief).
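For readers less familiar with the measure, a sketch of the arithmetic shows why a Progress 8 score cannot be calculated in advance (this summarises the DfE’s published method; the notation here is mine):

    \text{P8}_{\text{pupil}} = \frac{\text{A8}_{\text{pupil}} - \overline{\text{A8}}_{\text{national, same KS2 prior attainment}}}{10}
    \qquad
    \text{P8}_{\text{school}} = \frac{1}{n}\sum_{i=1}^{n}\text{P8}_{\text{pupil}_i}

The benchmark being subtracted – the national average Attainment 8 score of pupils with the same Key Stage 2 starting point – is only known once the whole national cohort’s results are in, which is why a school can estimate, but never truly predict, its own Progress 8 score.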

But why the move? Put simply, one key reason, I believe, was that the validity of the data was too often called into question. One wonders on how many occasions school inspectors have waited as nervously as head teachers for August results day, hoping that outcomes have supported judgements. (I remember one of my previous head teachers who, upon receipt of the school’s very positive summer GCSE results, immediately emailed the lead inspector from the school’s spring section 5 Ofsted inspection!) For thousands of teachers, the drive to performance-related (data-related) pay had the potential to cause considerable hardship; perhaps the shift to curriculum and teaching will provide for fairer reward?

Click here to read part 2

Using SISRA Observe for Teaching and Support Staff

What were the challenges you faced with your previous processes and what prompted you to explore SISRA Observe?

The biggest challenge was collating information from a variety of different records stored in different places and different ways in a timely and efficient way. There were also multiple versions of the same documents, including paper copies, so individual staff had to access information from a variety of different places too.

The key prompt to explore SISRA Observe was that we wanted a more efficient system that would allow us to store and access all necessary information about teaching and learning, staff development and CPD in one place.

Was it an easy process to recreate your previous templates? What support did you receive when you started the set-up process?

Creating the templates was relatively easy once I had received training from the SISRA team. The key was to ensure that I was clear about what information I wanted to analyse once records were completed.

I received fantastic support from the team throughout this process. The initial webinar training from Liam was excellent, and a further webinar session that I signed up for was also extremely valuable.

I also attended an excellent afternoon training session in Leeds that provided lots of food for thought about how we could use SISRA Observe but also about teaching and learning and staff development more generally.

During the process of getting started, the support from the team has been incredible. I have made lots of use of the help section on the website and I have always received a quick solution to any query or issue that I encountered. At all times, I have found the staff from SISRA to be responsive, friendly and extremely helpful.

What are the main benefits you’ve seen now SISRA Observe is up and running?

There are a number of key benefits we’ve seen now we are using SISRA Observe. These are:

  • Staff workload has decreased significantly, particularly now we are using the system for staff appraisal documents. It has proven to be an extremely accessible and efficient system for all staff.
  • Staff appreciate the fact that their own records are stored in one easy-to-access place.
  • Feedback from learning walks and lesson visits has been consistently delivered in a timely manner and the facility for staff to add their own reflections has encouraged some excellent dialogue about teaching and learning.
  • Analysing the whole school picture for teaching and learning has been much easier as the system allows you to quickly gain an overview of key strengths and areas for development. The facility to quickly click on more detailed records also enables a thorough evaluation of the whole school.
  • It has empowered all members of SLT and middle leaders to maintain an overview of quality assurance processes and quickly respond to any issues, give praise and share best practice.

 

You’re using SISRA Observe for your non-teachers. How have they responded to a new way of doing things?

Support staff have responded brilliantly to SISRA Observe. As with teaching staff, this new way of keeping and updating appraisal records has been well received because it saves significant amounts of time. Meetings between support staff and their line managers have been more focussed, as both parties have access to the records prior to the meeting, and the facility to add reflections following the meeting is something staff have noted as a real positive. One line manager said she particularly likes it as a way of delivering praise to staff who don’t like receiving praise face-to-face.

How have teachers and support staff responded to SISRA Observe?

Across the whole school, teachers and support staff have responded positively to SISRA Observe. A number of staff have asked if they can create their own records when they do their own lesson visits as they find it such a straightforward way of recording information and delivering feedback. The accessibility of records to all appropriate staff enables communication to always be efficient and timely and avoids multiple versions of records/documents being created.

Would you recommend SISRA Observe to other schools?

I would highly recommend SISRA Observe to other schools. It has enhanced teaching and learning, appraisal and CPD at our school by ensuring our systems are clear, accessible and efficient.

Vicky Mahmoud, Assistant Head Teacher, Eaton Bank Academy

SISRA Observe is in the EdTech50 2020 Yearbook!


We are honoured to be included in the EdTech50 2020 yearbook.

The full list, released 28th February 2020, celebrates the people, projects and products making a positive impact on EdTech in the UK.

Nominated by individuals across the UK education sector, including FE and HE, the list is arranged by the Education Foundation with winners selected by an expert panel of judges, including Mark Anderson (@ICTEvangelist), Andrew Dowell (@andredowell) and James Donaldson (@Mr_ALNCo).

To view the full EdTech50 list, click here. To find out more about SISRA Observe and how it can help streamline your professional development processes, please click here.

I had an idea in the 1980s and to my surprise, it changed education around the world

Explicit guidance and feedback from teachers is more effective in teaching students new content and skills than letting them discover these for themselves.

This is a premise of cognitive load theory, which is based on our knowledge of evolutionary psychology and human cognition, including short- and long-term memory.

I started working on cognitive load theory in the early 1980s. Since then, “ownership” of the theory shifted to my research group at UNSW and then to a large group of international researchers.

The theory holds that most children will acquire “natural” skills – such as learning to listen to and speak a native language – without schools or instruction. We have specifically evolved to acquire such knowledge automatically. It is called “biologically primary knowledge”.

Read the full article on The Conversation: https://theconversation.com/i-had-an-idea-in-the-1980s-and-to-my-surprise-it-changed-education-around-the-world-126519

A little better all the time


“The man who moves a mountain begins by carrying away small stones.” Confucius

This autumn, I’ve been thinking a lot about how I improve my practice as a teacher. Literacy has been on my mind for a number of years, first as a member of a school literacy group and more recently after hearing Amada Fleck (@AJTF71) present about the reading demand of the new Science GCSEs. But over that time, I have done little to improve my practice in this area. Finally, this year, I’m starting to make some improvements and that has got me thinking. What has changed? What have I done differently to prioritise this?

To continue reading, please click here: https://catalysinglearning.wordpress.com/2019/10/30/a-little-better-all-the-time/

An Objective Perspective


Teacher appraisal has been on my mind a lot recently. This is because in my school we have had a series of discussions about it at senior leadership level, which have led to a decision to move away from objectives based on pupil attainment/progress data. I am well aware that this is far from original and that many schools did so a long time ago, if they ever used pupil data objectives at all. Others will no doubt be reconsidering them in the light of the Teacher Workload Advisory Group’s welcome warnings against their thoughtless use (2018). Even so, it is a big and bold move for a highly successful school with plenty of reasons to be risk-averse, and I am very proud that we have agreed to make the change. Before going any further, I should make it clear that my experience has been in secondary schools and my points apply to that setting. I am not sure whether or not they are relevant to teacher appraisal in primary schools, because I know next to nothing about it.

Click here to continue reading: https://occamshairdryer.wordpress.com/2019/06/08/an-objective-perspective/

Read part 2: https://occamshairdryer.wordpress.com/2019/06/19/an-objective-perspective-2/

School Data – more, more, more? Part 2

In this series of blogs, Chris Beeden from School Data Managed Ltd, discusses the use of data in schools, from humble beginnings to today’s measures and systems. Click here to read part 1.

Ofsted – Progress of current Pupils

In 2015 this bomb was dropped on pupil data:

  • ‘Throughout each year group and across the curriculum, including in English and mathematics, current pupils make substantial and sustained progress, developing excellent knowledge, understanding and skills, considering their different starting points.’
  • ‘The progress across the curriculum of disadvantaged pupils and pupils who have special educational needs and/or disabilities currently on roll matches or is improving towards that of other pupils with the same starting points.’

 

When Mr Wilshaw announced he would measure schools’ performance by the progress of pupils at the school rather than leavers, it felt fair. From 2015 schools had to record and show the progress of pupils in every year group and every subject, each split into lots of subgroups, however small. This was at a time when subjects were being reformed and ‘no levels’ was being forced on schools.

Assessments should be simple: a test of knowledge with an outcome of a score which you can, if needed, convert to a grade. But these comments started to be used:

‘You can’t record a D because last term they were a D and they need to have made expected progress in the term’

‘This pupil’s grade cannot be less than their FFT target’

‘This pupil was a 3 at Key Stage 1 so cannot be developing in year 5’

‘These pupils are not making expected national progress’

‘What is expected national progress?’

‘2.32 points per year, and they made 2.24 points progress’.

Teachers were/are being asked to make an assessment and then to record an outcome based not on that assessment but on an assessment done 3 years earlier. (If you want to know about Key Stage 1 issues, look at Bedford Borough’s Key Stage 1 results in 2014 compared to its Key Stage 2 results in 2018 – but that is for another blog!)

I understand the need to amend the inspection framework, but it should have been amended, not reformulated into a new one. The current framework is not that bad, and the change has cost – and will cost – the sector a lot of money that it does not have. Saying to schools “don’t prepare or change practice for inspections” is like telling pupils not to revise, or to study the wrong text, for the exam. I have been involved in about 20 inspections in the last 8 years (they are tough, but a million times better since “educational people” have been leading them).

Just a note, though: Ofsted has said it won’t look at internal data, and that frees up data to be used as it should be, but outcomes have moved from the 4th section to the 1st section of the new framework. Outcomes from September will be more important than ever before. Schools will need to analyse their outcomes thoroughly and be careful to complete the performance tables checking in June.

Reformed Assessments – No levels

With systems now in place, 4G internet, educational experts advising lots of assessments and Ofsted wanting to see progress data, schools were then asked to remove levels. Systems were developed to record individual units, and this took pupil data collection to another level. In the confusion, schools created their own ‘no levels’ systems, all while stakeholders continued to demand plenty of data. The removal of levels seemed a good idea but really it is madness. Look at the performance tables: primary schools are judged by the % meeting the expected standard. At GCSE it’s the ‘Basics’ performance measure – that is, the % above a set grade. You do not get onto a sixth form course with a Progress 8 score of 0.3; you get on the course with a grade 6. Schools were advised not to use levels, but at the same time they are judged by levels. When you get feedback on a pupil in year 4 you need something everyone (the parent, CEO, etc.) will understand.

 

Progress 8

Regarding the move to grouping subjects’ progress: subjects are different and differ in difficulty, and it is stupid to try to treat them all as the same. Pupils need courses that are more accessible – just don’t then judge them all together! ECDL is a great course but cannot be compared to GCSE Music. The decline in ICT courses is a crazy result of such a system. GCSE French is different from Combined Science. The move by SISRA to show subject progress was fantastic, but it is mad that the large DfE data team are churning out all the ASP and IDSR analysis yet you cannot find any subject progress apart from maths. Thank you, SISRA, for creating the Subject Progress Index in 2018. When doing the summer analysis there are three questions to answer: how did we do overall; how did each subject do; and how did our vulnerable groups do? The ASP only answers one of these.

The way forward

Maybe we need to go back to where we were in the early 2000s, but use the new technology better and let schools do what they think best for the pupil and for the other stakeholders – i.e. regular low-stakes assessments, with an assessment score converted, at most twice a year, into an overall level for parents, HODs, HOYs, SLT, governors, MAT boards, CEOs, Ofsted and RSCs. The overall level can be a ‘working at’ level or an end-of-course estimate. Moderation is important for these, and/or the use of standardised assessments such as GL assessments, which help check the validity of any data.

The overall level is then analysed against benchmarks. These can be set using the prior attainment of pupils and subgroup information. Let’s use attainment benchmarks and move away from progress numberwangs. When setting targets, prior attainment can be used, but I would avoid other subgroup information – I do not like the fact that FFT estimates are gender-specific. Us boys are way behind but need high expectations. These benchmarks should be set with the pupil. When judging schools, compare contextualised outcomes against actual outcomes.
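To make the suggestion concrete, here is a minimal sketch (the grade boundaries and level names are invented for illustration) of the kind of twice-yearly conversion described above: a raw assessment score is turned into an overall level, which can then be reported against a benchmark derived from the pupil’s prior attainment.

    # Minimal sketch with invented boundaries: convert a raw assessment
    # score into an overall level, then report it against a benchmark
    # level set from the pupil's prior attainment.
    BOUNDARIES = [(80, "Exceeding"), (60, "Expected"), (40, "Developing"), (0, "Emerging")]

    def score_to_level(score):
        # Return the first level whose threshold the score meets.
        for threshold, level in BOUNDARIES:
            if score >= threshold:
                return level
        return "Emerging"

    def against_benchmark(score, benchmark):
        level = score_to_level(score)
        return f"working at '{level}' against a benchmark of '{benchmark}'"

    print(against_benchmark(67, "Expected"))  # working at 'Expected' against a benchmark of 'Expected'

The point of keeping the conversion this simple is that everyone – parent, governor or CEO – can see exactly how a score becomes a level.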

Chris Beeden

Chris runs School Data Managed Limited, established three years ago. Previously, he was a school data manager for over a decade after working for Capita SIMS in a local support unit. He is a timetabler, has a working knowledge of assessments from Early Years to Post 16, supports school census completion and is a qualified GDPR Practitioner and practising DPO.

Chris tweets from @ChrisBeeden

School Data – More, more, more? Part 1

In this series of blogs, Chris Beeden from School Data Managed Ltd discusses the use of data in schools, from humble beginnings to today’s measures and systems.

Old School

Mark the work with ticks and crosses and the odd comment. At the bottom of the work, write in a grade based on the standard of the work produced.

MIS History

In the 80s, in the depths of Central Bedfordshire, a small group of teachers decided a computer could help them write school reports. They created a DOS-based program to enter comment banks and, via key presses, you could very quickly create a report. This developed into Student Teacher Academic Records, and an Attendance and Exams package was added. BBBD enabled an exam entry to be made in under 2 seconds. Form 7 could be completed from the database for funding purposes. Windows was invented and SIMS was developed in the new format. When Form 7 changed to the School Census, it gave the supplier a unique chance to corner the market, as the more complicated return needed more development time and was (and still is) a barrier to competition. At this point the MIS was in the office and data entry was done by office staff, with an OMR sheet printed for teachers to do some of the data entry for reports, attendance and assessment. The next step was to join the curriculum and administration networks, and suddenly a school had 100 data entry clerks to complete the data. SIMS brought out Assessment Manager 7 and school teachers became data entry clerks overnight.

Fairly Old School – What we did in the early 2000s

With the new MIS system and Excel improving weekly, the data analysis world was progressing well as we survived the millennium bug. A test led to a score, which was converted to a grade (a current grade or an end-of-course forecast). Some ranked pupils to show position in the year group. We were able to collect these and used CATS, Yellis, FFT, ALIS, ALPS, 2 LOP, 3 LOP or 4 LOP to benchmark outcomes. Some great analysis of progress was produced and we started to look closely at subgroups’ progress. Targets were calculated using one of the above and then agreed with the pupils. The systems were developing, but data had its place.

More, more, more…

Once teaching staff became competent in completing registers and mark sheets, data collection began to grow fairly quickly. “We expect 6 collections a year for an RI school” is something I have heard many times over the last decade (most recently from the DfE in the past 6 months).

Whether it was a DfE advisor, Ofsted, an LA advisor or a MAT CEO, there was a perception that if a school was good at collecting and analysing data then it was improving. Data could also be used as a ‘comfort blanket’: ‘I know the pupils are progressing and here is my folder to prove it – look at this graph’. The World Wide Web was developing rapidly at the same time, and the new anywhere, anytime access to data further led to its growth. The recording of each individual task became possible, leading to hundreds of thousands of data items for one pupil – entries which can be made at school, on the bus, at home or at the pub.

Chris Beeden

Chris runs School Data Managed Limited, established three years ago. Previously, he was a school data manager for over a decade after working for Capita SIMS in a local support unit. He is a timetabler, has a working knowledge of assessments from Early Years to Post 16, supports school census completion and is a qualified GDPR Practitioner and practising DPO.

Chris tweets from @ChrisBeeden

Click here to read part 2 of Chris’ blog

Do you really need an app for that?

As we approach the end of another academic year, thoughts turn to September and the fresh challenges which await. Perhaps you’re moving schools, or maybe taking on a new role, but even if you’re staying put, you might start making resolutions for the year ahead.

Work less, drink more water, fewer cups of coffee, network, COLLABORATE… <insert buzz word here>.

Most of us can’t work longer hours than we already do, but we all want to work smarter.

What are your time-drains? Do you really need all the systems and tools you’re currently using? How do you plan to share the best practice examples of what’s seen in classrooms?

“We don’t need a tool for that,” you may think. “We can do all this on a spreadsheet or write things down on paper. Our teachers don’t like change, so it’d be a waste of money!”

This is completely true. You don’t need a tool – not really. Do we truly need many of the conveniences technology affords us? Probably not. Theoretically, you don’t need a tool to do anything, but when you find a good fit, it really does help.

Spreadsheets and paper systems regularly become a tangled mess of out-of-date documents and broken lever arch files. Maybe it started off simple and neat, but by now it likely feels like a battle to even keep it accurate – and with helpful input from colleagues who just wanted to make that column ‘stand out’ or tweak the headings to make them ‘clearer’, it can only really get worse from here on out. Spreadsheets offer a familiar and flexible friend and, of course, they are (sort of) free, but isn’t this information important enough to want to share and get right?

Teachers deserve their professional development to be taken seriously and invested in – simply put, spreadsheets and paper don’t scale. In my experience, so many schools have nearly everything they need already within their four walls, but many can’t harness it efficiently enough to share the talent they already have. I believe this is a great shame, but on a positive note it is one of the more surmountable challenges of current times. My resolution is to help more school leaders to embrace the power of technology, which will help them to lead their schools effectively.

The best tool isn’t the one with the most graphs, the snazziest and most expensive, or the one you’ve used for the past 5 years – it’s the one which is right for you, your teachers and your school.

Community Events

Our community events are open to SISRA users and are free! They are held at different locations around the country and include:

Community Days

These are on a drop-in basis and allow you to chat to the SISRA team about all things data; from trouble-shooting your SISRA Analytics set-up, to checking that you are using SISRA Observe to its full potential.

With members of the Consultant, Support, and Quality Assurance teams in attendance, our Community Days provide you with a chance to discuss feature requests and analysis ideas. It is also a great opportunity to network and share good practice with colleagues from other local schools, as well as picking up a few hints and tips.

You can choose to attend either the morning or afternoon event and pop in at any point during your preferred session.

Coffee Mornings

Our Coffee Mornings are informal events where Senior Leaders or staff with data responsibilities can network and meet their local SISRA Analytics Consultant or SISRA Observe Product Specialist. Pertinent topics and useful reports and features can be discussed. There may also be time to discuss individual school set-ups (please bring your login details).

If you are interested in hosting a Coffee Morning, please let us know!

 

Creating A Developmental Quality Assurance Culture In School

At Horizon we have a very developmental approach to quality assurance. All staff have regular unannounced learning walks which are recorded on SISRA Observe, but discussed 1:1 with staff. We have encouraged staff to use SISRA Observe as a developmental tool for thinking about their own next steps and driving their professional development.

From a leadership perspective, we use Observe hand in hand with Analytics. If there is an issue with a cohort of students not making progress, does our learning walk data on Observe help us identify a CPD need, so we can ensure the issue does not lie with the quality of teaching and learning? When learning walks take place, observers will cross-reference Observe and Analytics to build a picture of that member of staff’s practice: there is no point stating that challenge is outstanding if high-ability students are actually reported to be underachieving.

Our biggest change has been shifting the culture of QA from judgemental to developmental; with this change has come buy-in from staff and they log on to SISRA Observe because they are keen to think about their feedback and use it to improve.

Anna Jones, Associate Vice Principal: Teaching and Learning, Horizon Community College

Identifying Strengths And Areas Of Development

In terms of successful roll out of Observe, we introduced it in September at the start of the academic year as the sole platform for all quality assurance activities. This meant that all staff had to access the system to be able to carry out learning walks, lesson observations and book look activities.

Observe has been really successful in providing an online platform (previously we had used a combination of paper and Google forms) which allows all information to be collated in one place. It gives middle and senior leaders oversight and analysis at the click of a button, something that previously took significant time, thus reducing workload and allowing a better understanding of strengths and areas for development in staff, in faculties and across the whole school. For an Academy working across 3 different campuses with over 120 teachers, the system is really useful for sharing information and allowing for real in-depth analysis across large teams of staff.

David Lovell, Vice Principal – Teaching & Learning, The de Ferrers Academy

A Complete Picture Of Teaching And Learning

SISRA Observe is used across the college as a constant monitoring tool for all teachers and leaders to assess teaching.  Using the reporting tools we can break down lesson observations into individual departments and year groups, which enables us to focus on improvement areas.  The information we retrieve over the year is then directly linked to the CPD/INSET sessions led by the teaching and learning team.

Data collected from SISRA Observe is used to analyse trends over time, helping us continually monitor what is going on in the classrooms.  We use number scales to grade lessons 1-4, loosely tied to Ofsted grades. This allows us to really focus on specific parts or content of lessons, which ultimately enables us to constantly improve practice at the college.

Another vital part of SISRA Observe is the ability to monitor who has completed observations and the grades that have subsequently been given, making it easy to quality-control the observations that are taking place. Finally, a really useful aspect is the observer’s ability to give the teacher actions to respond to.  This facility can really improve practice and ensure that, when an issue has been noted, ‘something’ is actually done about it.

Andrew Gray, Assistant Head Teacher, Horndean Technology College

Why coach?

The promotion of coaching as a leadership style is gaining prominence, with lots of people advocating it as an approach. So, what’s in it for you?

The development of your staff is a significant responsibility for every leader. By taking a genuine interest in each member of staff and helping them to improve, you will make them feel more valued, they will perform better, and your organisation will benefit. With no cost other than a little of your time, it’s also great value for money.

The beauty of coaching is that solutions to problems and new ways of working are all developed from within your organisation’s system; staff understand the context in which your organisation operates in a way that an outside party wouldn’t, so the solutions fit and complement your existing ways of working.

Coaching isn’t telling someone how something should be done. Those who do the job are the ones who best understand what works well and what gets in the way of working better. Coaching is helping the staff member to think more deeply, to fully understand how things are, and to look beyond the current situation to an alternative and better future. Once you get started, the potential you release will amaze you.

So, what do you need to do?

Listen. Listen to understand your colleagues. Allow them time to think without interruption.  Ask open questions to prompt their thinking – for example, ‘Why is that?’, ‘When does it occur?’, ‘When doesn’t it occur?’, ‘How can it be changed?’, ‘Is there anything else?’. The temptation to give your answer will be immense, but resist it. Your solutions won’t be theirs, so they won’t own them and will be less enthusiastic about implementing them than their own ideas.  If they run dry of ideas, perhaps make an offer: ‘Would it help if I made a suggestion?’. If they are willing to hear your idea, present it as something they can consider: ‘If you tried this, how would it be different?’.

Coaching doesn’t need to be formal. Try the approach in your next 1-2-1 or maybe when a member of staff comes to you with a problem. Fight the urge to give them your answer and try a short informal coaching session – you both might be surprised at the innovative improvements that emerge!

Effective And Efficient Data Analysis For Schools And Trusts

Advance Learning Partnership have chosen SISRA Analytics as the data analysis solution within our trust because we believe it is the most effective and efficient data analysis system currently available for secondary schools. It is used within each of our secondary schools at data collection points for every year group and also for our results data analysis in summer. Using SISRA Analytics enables trust staff to access the attainment and progress data of any of our schools in a consistent format.

Although the SISRA Analytics view for trust staff is very similar across our schools, each school’s setup is customised depending on the school’s needs or to support analysis of a key area. For example, we have the ability to input KS3 data to align with the assessment system of the school, or we can create filters which reflect the individual school’s focus groups.

The data management of SISRA Analytics is very simple; data collection uploads can be completed within minutes, which gives more time to focus on the analysis of the data and the strategies and interventions which follow.

SISRA Analytics allows staff to track projections for the headline measures, including P8 estimates (based on the most up-to-date estimates). Staff can identify trends and track subject/qualification attainment and progress to investigate areas of strength or weakness, and use the filters and breakdowns to review key groups and analyse gaps. As a multi-academy trust, we analyse this data across all of our secondary schools to suggest trust-wide solutions or identify areas where schools would benefit from collaborative working.

The introduction of the SISRA Subject Progress Index (SPI) has been very well received in our schools, enabling staff to compare their results data as well as their projection data to the SISRA collaboration data.

Another recent update, the Attitude to Learning (AtL) information, is used across schools in our trust and this enables staff to compare AtL against current or projection grades. This provides a comprehensive picture of a subject/qualification or class, and enables staff to gain precise information on individual students.

The excellent support offered by SISRA – live chat, DataMeets, consultant support and training packs – along with the timely software developments, complements this invaluable data analysis package.

Laura Mellis, Data Analyst, Advance Learning Partnership

Contingency planning for Data Managers – A Guide

So, contingency planning – that’s a bit boring, isn’t it?  Well, yes, it is, but it’s also really important.

You may be asking why we need a contingency plan for the role of a school Data Manager.  The same reason we need a contingency plan for any organisation: to ensure everything can run as smoothly as possible in the event of an emergency or something unexpected happening.

Those of you with responsibility for Exams have had to have a contingency plan in place since June 2016 to minimise ‘risk to examination administration and any adverse impact on students, should the examinations officer be absent at a critical stage of the examination cycle’ (JCQ Notice to Centres), so why should it be any different for the other parts of a Data Manager’s role?  I would argue that it shouldn’t be.  Having a contingency plan makes it more likely that your school will bounce back from an emergency situation. An additional benefit is that it prevents people from panicking or responding erratically in times of stress; making decisions in a high-stress situation often leads to oversights. Whilst the big stress days for Data Managers are around the Results Days in the summer, it is equally worth planning for the general academic year.

With that in mind, here is a four-step process you can use to prepare a contingency plan for your role.

Step 1: Identify the Key Risks

You’ll need to identify which areas in your school could cause problems, but here are some possibilities to consider:

  • Only one member of staff being trained to carry out key tasks
  • An unplanned long-term absence
  • Systems going down on Results Day
  • Problems with the school building

 

Step 2: Prioritise the Risks

You’ll know the top priority in your setting but in my experience the highest risk is when the only member of staff trained to carry out a role is unexpectedly absent for an extended period.

 

Step 3: Create a Contingency Plan

A few things to consider:

  • What are the key parts of your role?
  • Is it just you that knows how to use the Assessment/Exams/Analysis/Cover/Census/Timetable part of your MIS or other school software?
  • Who would be suitable emergency cover?
  • Do staff have the correct level of access to the systems (or know how to request it for each system)?
  • Are there guides readily available and easy to access? (For SISRA Analytics, the answer is yes!)
  • Is there a calendar of key dates in the academic year? Census Dates/Assessment Cycle/Tables Checking/Options etc.
  • Is there an accessible list of useful contact names and numbers?
  • Are you storing information on a personal drive that should be in a shared (but secure) area?
  • Do you have an easy-to-follow file structure?
  • Who holds the spare keys to the office?

 

Step 4: Maintain the Plan

Don’t just have this as a plan that never gets looked at.

  • Revisit it at the start of the academic year and make any necessary adjustments.
  • A couple of times a year get your emergency cover to sit alongside you and observe/carry out some of the tasks.

 

Why do I feel so strongly about contingency planning?  Well, it happened to me – the only member of staff with the training was hit by a car on A Level Results Day 2016. A contingency plan is just like an insurance policy: you hope you’re never going to need it, but it’s in place for that emergency situation… so I advise you all to get that plan in place!

 

Informing teacher appraisals using student performance data

Under The Education (School Teachers’ Appraisal) (England) Regulations 2012, schools must appraise teachers’ performance on an annual basis. The regulations apply to school teachers in all maintained schools and unattached teachers employed by a local authority. However, academies, free schools and other independent schools can determine their own appraisal arrangements.

I think most schools set at least 3 objectives or targets that relate to student outcomes, the quality of teaching and learning, and an individual’s professional development, enveloped by the Teachers’ Standards. We also set an additional objective for all mainstream teachers to indicate their contribution to the one-year College Improvement Plan. For subject leaders we set a further objective on department performance/outcomes, and for those on the Upper Pay Range we also set objectives related to their sharing of good practice, mentoring of disadvantaged children and INSET contribution. For this blog, though, I will outline our objective(s) related to outcomes. This is the one that often causes the most potential conflict, and one that can be challenging if not set in context or without some common sense!

It is interesting (especially in the light of the recent report of the Teacher Workload Advisory Group, November 2018 – ‘Making Data Work’) to consider how the performance of someone who does not have an exam class can be compared with someone who does. On top of that, the challenge of measuring the influence any one individual has on a single class is fraught with difficulties and variance. Indeed, if you consider ‘confidence intervals’ for a school, let alone for a class of perhaps 20 or 30 children (and in small option groups this can be considerably lower), can we really be sure of the impact any one of us truly has? Nevertheless, in secondary schools we tend to look at performance and measure outcomes in terms of exam performance.

At my school we label this as:

Objective 1: Student Attainment & Progress

(A Teacher has met or made substantial progress taking into account their status, length of service and any circumstances reasonably beyond their control)

As you can see below, we set benchmarks and award a score for what is achieved. We do this for all objectives so that an overall score is calculated. At every stage, though, we have the caveat ‘(A Teacher has met or made substantial progress taking into account their status, length of service and any circumstances reasonably beyond their control)’, so we try to be fair, but we also state that:

The points system is a ‘steer/guide’. It does not automatically determine the final judgement. The Performance Manager makes a recommendation, which the Moderation Team either endorses or rejects/amends, and the Governing Body Committee makes the final decision. As a general rule, however, a member of staff who achieves minimum expectations or below for Objective 1 – Student Progress and Attainment – is unlikely (although it is numerically possible) to gain Exceptional Performance even if they achieve highly in other areas, although each case will be treated on its merits.

Furthermore, in making their final decisions the Moderation Team refers to the manual of Personnel Practice, which states: ‘Teachers’ performance should be assessed in totality in the context of the wide range of information that is available, including, but not limited to, lesson observation information and pupil progress and attainment data.  Individual performance may be rated at different levels in relation to each key area, but schools will be required to provide an overall judgement of performance in order for this to be linked to a pay outcome.  The Moderation Team refers to Managing and developing staff / Pay Policy / Guidance on performance management ratings: descriptors of rating and expectations of teachers in making final recommendations to the Governing Body (see Pay Policy).’

In essence, we set these objectives relative to students’ Key Stage 2 starting points and use FFT data to help establish them. Ideally we will be in the top 20% of schools, so this is where we set targets, but in reality this is not always the case, and so, after a few years of ‘trial’ (and some error!), we set ours alongside SISRA Subject Progress Index data as follows:

There is not always a direct correlation between SISRA’s SPI and FFT estimates, but you can see from the above that, in essence, if a student or class is at the top of the Subject Progress Index then they are also likely to be at or above their FFT20 estimate, which we feel is indeed ‘exceptional performance’.

I was therefore particularly keen to ensure the Subject Progress Index included the right subjects. I am pleased to say that there appears to be a better correlation between subjects linked to the Collaboration groups in SISRA than there was previously with FFT comparison data. Indeed, previously some BTEC courses in FFT were grouped against GCSE subjects, which did give a distorted figure, although I think this year it is becoming more refined.

We now get very few, if any, complaints about judgements related to outcomes. Staff generally accept our methodology, which has evolved over time, with SISRA’s Subject Progress Index really helping to consolidate it. I am happy to share or discuss any of this, so please feel free to contact me if and as you wish.

Data Collaboration and the magic Subject Progress Index

Results days have changed dramatically in the last few years. I remember the days when the results would come in on that Wednesday in August and, with a few buttons pressed, you could see not only how well the students had done but where you stood as a school. How many of them had got five at C and above? How many of them had got five at C and above including English and Maths? And, if you care about such things, how many of them had got the English Baccalaureate?

I’m glad to see the back of the 5A*CEM measure, with all its perverse incentives and focus on arbitrary borderlines. For all its many, many, many flaws, Progress 8’s heart is in the right place – the progress of every individual young person from their own individual starting point “counts” towards the school’s performance. But, because the Progress 8 score is benchmarked against every other young person in the country, the school’s performance is separated from the individual’s performance by a month. It’s not until the Tables Checking opens that you can see whether the results that you thought were good were actually good.

There are often big surprises in the new estimates – big swings in one bucket or another, unexpected variations – which mean that you can’t accurately rely on the summer 2017 estimates when making assessments about student progress towards the 2018 examinations. And, because Progress 8 is organised by buckets, you can’t disentangle an individual subject’s contribution to the overall score with any real clarity.

As a school leader, I wanted a simple answer to a simple question: is this student making sufficient progress in this subject? Thankfully, this year, SISRA have a solution. It’s called the Subject Progress Index (SPI).

The SPI relies on the data collaboration which sprang up out of the results day blackout. Because so many schools use SISRA Analytics, it was possible to anonymously “pool” students’ data (with schools’ consent, of course), to create an approximation of the 2018 Attainment 8 estimates. It wasn’t perfect – SISRA schools are not the same as “all schools” nationally, and only those schools who consented were included in the collaboration. But, the more schools opted in to the collaboration, the more robust the estimates became, and they ended up being a fairly reliable approximation of the actual DfE national estimates.

And then, the clever bit. Within the collaboration data, it was now possible to disentangle what the outcomes were from each KS2 starting point in each KS4 subject. Clearly more students took some subjects – English, Maths etc. – than others, which made their dataset more statistically reliable. But the Subject Progress Index gives school leaders a fair approximation of whether a student’s performance in an individual subject is similar to, better than, or not as good as students nationally who have taken that subject with similar starting points. This can be broken down by class, subgroup and individual students, so that it is possible to compare whether one group has made better progress than another.

And the SPI can also be applied to internal data collections too. That means that, for this year’s Year 11 mock exams, we can get a sense of whether students are performing in line with, above, or below their peers nationally with similar starting points in similar subjects. Admittedly, this is based on the 2018 exams and only uses SISRA Data Collaboration results, but I am comfortable that this gives us a close-enough approximation of progress to allow us to target interventions and differentiation where it is needed. And this isn’t just at the arbitrary strong pass or standard pass borderline, but at every student at every attainment level who could be making more progress, or doing that little bit better.

Progress 8 was designed to compare schools, to rank them, to show which schools were better than others, and which were worse. What makes the SPI special, I think, is that this measure, designed to promote competition between schools, has instead spawned a spirit of collaboration and cooperation which works in the best interests of students and their teachers and leaders.


Do your staff think privacy first?

It’s easy to approach the EU’s GDPR (General Data Protection Regulation) as a box-ticking exercise. Appointing a DPO (data protection officer), updating policies and procedures, training staff, mapping data and reviewing suppliers are all necessities for schools, but does just ticking them off go far enough to protect the vast amounts of personal data in your care? Human error is the main cause of data breaches, so understanding your school’s data handling culture and targeting areas for change will help reduce the risk.

The Regulation requires organisations to record all data breaches and report serious breaches to the ICO (Information Commissioner’s Office). Reporting must happen within 72 hours of becoming aware of the breach and, in some cases, all affected data subjects must be informed. There are no exceptions. Hitting the headlines for a data breach is not something any school wants.

A starting point in understanding the data handling culture is to see what is happening through a data protection walk. We’ve compiled a checklist to take with you as you walk around your school.

Start with the outside of your school and walk through reception as if you are a visitor. What information is available? Can you overhear conversations or view the receptionist’s computer screen? Has consent been obtained for the photographs on display? Is the staff room open access? What’s left lying about in there? Are staff pigeonholes open? Can you pick up other people’s work from the printer or out of the recycling bin? Do staff secure their devices and passwords? Are staff and pupils following the school’s BYOD (bring-your-own-device) policy? And continue around the school.

Completing the form will help identify and prioritise areas for change, where staff need support and which school processes need developing.

Click here to download the data protection walk checklist

ANDREA DOBSON

Prior to joining SISRA, Andrea had a varied career path, including 10 years working for the Ministry of Defence and 4 years as an Early Years SENCo. She went on to become a Data Manager in a mixed 11-19 school in rural Wiltshire, where she became a confident user of various SIMS modules, including Assessment Manager, Course Manager, Nova-T6 and Exams Organiser. A passionate believer that, when data is used smartly, it has the power to change student outcomes, she has spent much of her time promoting systems that save teacher time, improve parent/school interaction and improve student outcomes. With this goal in mind, Andrea started using SISRA Online in 2010 and quickly fell in love with the accessibility and flexibility of the system, as did the teaching staff in her school. Andrea joined SISRA in 2018 and is based in the south west of England.

“It was the best and most enjoyable training I’ve had. Andrea helped us set up things that we hadn’t quite mastered and we worked on some great tweaks that I am very excited to have. It was totally interactive and not constrained. Andrea is a brilliant trainer and we really enjoyed having her here for the day.”
Examinations Officer, North Somerset

“Andrea was extremely helpful, supportive and encouraging.”
Head of Maths, Devon

“Absolutely fantastic day, really helpful.”
Business Manager, Cornwall

“Andrea was fantastic and she followed up with a few emails to see how I was getting on.”
Data Manager, Herefordshire

“Andrea was really friendly and super helpful. I really liked the one to one approach.”
Deputy Head Teacher, Cornwall

“All parties came away from the session with better knowledge and next stages to help advance the use of SISRA Analytics in school.”
Data Manager, Bristol

Data Collaboration – Breaking New Ground Part 2

You may have read my previous blog in which I discussed our Data Collaboration and the importance of accurate data. This obviously doesn’t just apply to the collaboration and is fundamental to good data management anywhere. In this blog I’m going to share some of my top tips and preparation which can be used independently to make the life of a data administrator a little easier overall.

Organisation is key!

In her blog ‘How to be a great Data Manager’, my colleague Kate Moon spoke about the need to be organised, and the importance of ‘understanding not only your assessment data but understanding education’.

With that in mind, the first thing I would recommend doing if you want to be organised is put together an “Exam Bible”. This bible will allow you to view important information at a glance.

In the school where I worked, this was put together by our Exams Officer in conjunction with our faculty heads.  Each member of the exams and data staff would have a copy printed out and spiral-bound on their desk to refer to throughout the year. An example of a couple of pages is shown below:

Along with the awarding body and level of the qualification, you can see roughly how many students are being entered. Including the QN code of the qualification makes it simple to check the official guidance from gov.uk for discount codes and performance points.  This really comes in handy when setting up SIMS for assessment points, target setting etc., as you can see exactly what qualification students are working towards. For more SIMS best practice ideas, have a look at my colleague Matt’s SIMS Toolkit:

Part 1 – https://www.sisraanalytics.com/downloads/sa/links/SIMS_BestPractice_Toolkit.pdf

Part 2 – https://www.sisraanalytics.com/downloads/sa/links/SIMS_Tips_Toolkit.pdf#

When you search on the DfE’s Key Stage 4 qualifications and discount codes guidance, you will see that both Film Studies and Media Studies in the example above have a discount code of KA2, so this is definitely worth bearing in mind when helping students make informed decisions about the courses they will study.
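To show how a bible like this can be queried once it’s held as simple records rather than on paper, here is a minimal sketch in Python. The KA2 code for Film and Media Studies and the Computer Science QN code come from the examples in this post; the awarding bodies shown for Film Studies, the remaining QN codes, the RB1 code and the entry numbers are illustrative placeholders, not real values.

```python
# A minimal sketch of an "exam bible" held as simple records, used to flag
# qualifications that share a DfE discount code. KA2 (Film/Media Studies)
# and the Computer Science QN code come from this post; every other value
# below is an illustrative placeholder.
from collections import defaultdict

exam_bible = [
    # (qualification, awarding body, QN code, discount code, approx. entries)
    ("GCSE Film Studies", "WJEC", "00000000", "KA2", 25),      # placeholder QN
    ("GCSE Media Studies", "AQA", "00000000", "KA2", 50),      # placeholder QN
    ("GCSE Computer Science", "AQA", "60183019", "RB1", 60),   # placeholder discount code
]

# Group qualification names by discount code.
by_discount = defaultdict(list)
for name, board, qn, discount, entries in exam_bible:
    by_discount[discount].append(name)

# Where two or more qualifications share a code, a student taking both will
# typically only have one counted in the performance measures.
for code, quals in sorted(by_discount.items()):
    if len(quals) > 1:
        print(f"Discount code {code} shared by: {', '.join(quals)}")
```

Running this prints the Film Studies/Media Studies clash, which is exactly the kind of thing worth surfacing before students choose their options.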

Create your own digital data library

We know the education landscape is ever changing so I would recommend bookmarking useful links, such as KS4 Performance Points, to ensure that you are always up-to-date.

Some other sources you may find useful are:

Secondary Accountability Measures

Qualifications Counting in the English Baccalaureate

16 to 18 accountability measures guide

Performance tables: technical and vocational qualifications

I’d also recommend signing up to DfE notifications:

  • Go to www.gov.uk/government/publications
  • Select the dropdown list of ‘Departments’ and choose the ‘Department for Education’. This will ensure that you only receive notifications about publications relating to the DfE and not every government department.
  • Click on the ‘email’ icon
  • Click on ‘Create Subscription’
  • Enter your email address when prompted and select ‘Submit’

SISRA also have a Facebook group called School Data Management where school data staff can network and share ideas.  You can request to join the Facebook group here.

Obviously, we’re right at the beginning of the new academic year, so it’s a great time to start planning your “exam bible” or bookmarking useful guides (most of these are available within the HELP area of SISRA Analytics). How about spending an hour or so over the coming weeks putting together your list of qualifications, with the relevant QN codes and discount codes (if any)?  This shouldn’t take long at all, as most of the data will already be held within your MIS. It’s worth taking the time to get useful information ready so you are prepared for anything!

Hopefully, these steps will help you throughout the academic year and make sure that your data can be used in future data collaborations. If you do have any questions or if you need any help, please don’t hesitate to get in touch with the consultant team by email on consultants@sisra.com.


Data Collaboration – Breaking New Ground Part 1

When Duncan Baldwin contacted me over a year ago with a random message – “Claire, it would be useful to have a conversation with your developers/boss” – we at SISRA Limited were really intrigued.  The result of that conversation? An exciting exercise which culminated in over 1100 schools agreeing to let us use their anonymised data to create Attainment 8 estimates long before the release of official figures.

The feedback from schools was outstanding. One Headteacher sent me a text when the very first set of collaboration A8 estimates became available. He was feeling “a little bit emotional” as his school’s P8 had risen by 0.29 from the previous year.  It was hearing stories like this that confirmed to me that collaboration was the right direction for education data and for our schools.

Collaboration Continued

Moving on a year, and following more conversations with Duncan and with hundreds of schools about further collaboration ideas, the 2018 collaboration is bigger and better.

We’ve been told time and time again that schools have been frustrated at not being able to predict accurately due to new specifications, and that even on results day they wouldn’t actually know how they had done compared to others around the country. Wouldn’t it be great if schools were able to compare how a qualification taught at their school did against other schools? As we (SISRA) discussed this more, the juices flowed and the idea of subject transition matrices, as well as a Subject Progress Index (SPI), evolved. (SPI – a SISRA-exclusive measure showing how each of your pupils has performed in each subject compared with all students with the same KS2 prior in our Data Collaboration. Think Subject P8, but better!)  This new feature can also be used for reflection on the previous year when schools are doing exam analysis and planning the year ahead.

Of course, to do something like this, data staff would need to do a small amount of administration.  SISRA Analytics is extremely clever, but not clever enough to know that a subject called Computer Science is actually AQA GCSE 9-1 Computer Science (QN code 60183019). But by completing some simple steps, and ensuring that all data is correct, it’s possible to take part in future collaborations and access estimates and features close to the exams period.

Collaboration relies on schools being willing to take part in such an exercise, but it’s also imperative that the data used is suitable and accurate. The more schools that opt in and follow the collaboration steps, the more accurate the estimates become and the more insight schools will have into progress. The scope for expanding the data collaboration is incredibly exciting, but the basic principle of good, accurate data applies.

In my next blog, I’m going to share some tips and preparation for collaboration. It’s worth noting that these pointers aren’t collaboration or SISRA specific – they can be used by any data administrator to make life a little easier!

If you have any questions regarding the Data Collaboration, or are looking for support with your Analytics set-up, please email us on consultants@sisra.com.

Sharing your Data to Help Leaders Everywhere

We’ve seen an extraordinary change in school accountability this year. Normally when I say things like that I’m talking about radical change in performance measures imposed on schools by the DfE. This time I’m delighted that the change has been led by schools. This was the year that collaboration became a ‘thing’.

With such high numbers of schools electing to share their results via SISRA, so that around 185,000 pupils are included, we can be pretty confident that early estimates for Attainment 8 averages are going to be close to the final results. This all means schools will know their Progress 8 scores accurately enough for the start of term.

Being able to look at the emerging national picture based on your shared results is incredibly helpful. It looks like the Ebacc component of A8 averages has gone up, which is all down to the reforms in GCSEs and more points being available for those subjects in 2018. But the open element averages have gone down considerably; I believe this is mainly due to the removal of the ECDL qualification. These two changes have acted to cancel each other out in overall terms, but individual schools might find changes in their P8 based on their curriculum mix.

But the really exciting part of this collaboration is nothing to do with headline measures. Real accountability is about pupils. How well did they do in each subject, and how do we know? To answer these key questions we need subject level tools which compare results with entries in the same subject. We need to compare like with like.

Hence I am delighted that SISRA have developed two new tools to do just that, using transition matrices and subject value-added (which will be known as the Subject Progress Index, or SPI for short). These tools show us how grades are distributed by prior attainment, so we can be clear what good progress looks like.

 

These graphs show the value added in the reformed science GCSEs based on the collaboration, and can be used to see which grades are above or below average by prior attainment.

You will see that, on average, pupils with higher ability get noticeably higher grades in the separate sciences than their peers who entered combined science. Historically, many schools have entered their more able pupils for biology, physics and chemistry rather than core and additional science or, this year, the new double science, in the belief that the separate sciences are more challenging, but this graph casts doubt on that. It’s worth reflecting, therefore, on whether more pupils might benefit from taking separate sciences.

This is the sort of information which school leaders need to make important decisions about performance and curriculum planning early in the school year. By sharing your data, you help leaders everywhere. Thank you for doing it. Thank you for helping us all to move towards a school-led, self-improving system.

Introducing the SISRA Subject Progress Index (SPI)

The Subject Progress Index (SPI) has been introduced into SISRA Analytics Reports for schools involved in KS4 Data Collaborations (from 2018 onwards).

The SPI provides a direct indication of how a pupil, class, subject or any group of pupils has performed compared with pupils in the same subject with the same KS2 prior. A true subject value-added measure. Think Subject P8, but RELEVANT (and far more FAIR and ACCURATE than dividing A8 by 10 or slicing up baskets!)

With over 1150 schools opted in (185,000 pupils) as of Window 3, we are confident we have enough data across all subjects for our SPI to be both accurate and really quite revolutionary, finally giving schools an accurate subject VA-style measure. Pupil SPIs calculated using low cohort figures (e.g. below 25) will be marked as such with an icon.

A visual explanation of the calculation is available at the foot of this post.

 

How will this be shown in the Reports?

Reports have been updated to show any Data Collaboration related data in “BLUE” columns.

The following shows the updated Grade List Report. Here we can see not only how our pupils performed in the recent exams (this happens to be English Language), but also how their progress from KS2 compares with all pupils on the same starting point across all opted-in SISRA schools for this subject.

 

The above shows that William Adams (KS2 prior of 4.3) achieved a Grade 5 in his English Language exam. The SPI of 1.34 tells us that William achieved more than a grade higher in English Language than the average English Language grade for pupils who also had a KS2 prior of 4.3 (that average must therefore have been 5 − 1.34 = 3.66 points). Notice that the boldest colours denote positive/negative differences of more than 1 whole grade.

SPI can also be calculated at a higher level, to give an indication of class, subject and perhaps even faculty performance compared with the SISRA “national” picture. The Grades Totals report shows the average SPI for each qualification and, if “percentage view” is enabled, you’ll also see the percentage of pupils with a positive SPI, as in the sketch below.
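As a rough illustration of that roll-up, the sketch below computes an average SPI and the percentage of pupils with a positive SPI for one class. The numbers are invented, and the unweighted mean is an assumption about the report’s arithmetic, not a statement of how SISRA calculates it.

```python
# Sketch: rolling pupil SPIs up into a class-level summary - an average SPI
# and the share of pupils with a positive SPI. Values are invented and the
# unweighted mean is an assumption, not SISRA's documented calculation.
pupil_spis = [0.7, -0.3, 1.2, 0.0, -1.1]  # one SPI per pupil in the class

average_spi = sum(pupil_spis) / len(pupil_spis)
pct_positive = 100 * sum(1 for s in pupil_spis if s > 0) / len(pupil_spis)

print(f"Average SPI: {average_spi:+.2f}")     # Average SPI: +0.10
print(f"Positive SPIs: {pct_positive:.0f}%")  # Positive SPIs: 40%
```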

 

Calculation Example

Joe Bloggs came in on an (average) KS2 prior of 4. He then achieved a Grade 4 in English Language, shown by the green square below. The average English Language exam points achieved by all KS2 level 4 pupils was 3.3 (identified in blue below).

Joe Bloggs’ SPI for English Language is 4 – 3.3 = 0.7.

Joe achieved 0.7 of a grade higher than the average KS2 prior 4 pupil in English Language.

Another way of visualising how SPI is calculated is to simply imagine our Data Collaboration A8 estimates as below, but with columns added for each subject group. Art would have its own estimates, as would Biology etc.

Each pupil’s Subject Progress Index is their performance minus ‘estimate’ in that subject.
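To make that arithmetic concrete, here is a minimal sketch of the calculation as described: build an average (“estimate”) per KS2 prior from pooled subject results, then subtract it from each pupil’s achieved points. The data and function names are illustrative, not SISRA’s actual implementation.

```python
# Minimal sketch of the SPI arithmetic described above: a pupil's SPI is
# their achieved grade points minus the pooled average for pupils with the
# same KS2 prior in the same subject. All data here is illustrative.
from collections import defaultdict

def build_estimates(results):
    """Average grade points per KS2 prior band for one subject."""
    totals = defaultdict(lambda: [0.0, 0])  # prior -> [sum of points, count]
    for ks2_prior, points in results:
        totals[ks2_prior][0] += points
        totals[ks2_prior][1] += 1
    return {prior: total / count for prior, (total, count) in totals.items()}

def spi(achieved_points, ks2_prior, estimates):
    """Pupil's points minus the pooled average for that starting point."""
    return achieved_points - estimates[ks2_prior]

# Pooled English Language results as (KS2 prior, grade points) pairs, chosen
# so the KS2 level 4 average comes out at 3.3, as in the Joe Bloggs example.
english = [(4.0, 3.0), (4.0, 4.0), (4.0, 3.0), (4.0, 3.2)]
estimates = build_estimates(english)

# Joe Bloggs: KS2 prior 4, achieved a Grade 4 -> SPI = 4 - 3.3 = 0.7
print(round(spi(4.0, 4.0, estimates), 2))  # 0.7
```

The same per-prior averages, computed per subject across the whole collaboration, are what the post describes as each subject’s column of “estimates”.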

 

The future?

We believe the SPI will become a vital tool for schools, not only to see how pupils have performed compared with their peers, but also, as we move further into the new academic year, to compare current cohort assessments and even targets against last year’s cohort, allowing you to ensure you are doing all you can to add value on a subject-by-subject, pupil-by-pupil basis.

Bespoke Observation and Appraisal Analysis

Gordano School in Bristol have used SISRA Observe since its inception in 2014. Headteacher Tom Inman and Associate Assistant Headteacher Charlotte Thomas discuss the use and impact of Observe at their school.

WHY OBSERVE?

Tom Inman: Before Observe, we were using a spreadsheet solution for collating grade outcomes for lesson observations, with paper copies of observation records. We wanted to find a way of collating feedback from learning walks and other self-evaluation activities, and enable line managers to have easy access to lesson observations.

We chose Observe because we liked that you could ‘build’ your own style of observation and learning walk ‘forms’ from scratch using a range of qualitative and quantitative data, and analyse this in a quick and simple manner.

Also, the structure was similar in some respects to SISRA Analytics, which we had become used to.

DID IT TAKE LONG FOR YOU TO SET UP AND IMPLEMENT OBSERVE?

TI: We spent a couple of months developing our forms, and ran a trial with senior team members in the summer term prior to the full launch in September. A few members of staff took a little time to get used to the format, but most found it intuitive and easy to use.

Charlotte Thomas: I find Observe really simple to use, but I guess at the end of the day it’s as simple as you want to make it. My first encounter with the system after I joined the school was using it for learning walks. I found the learning walk template we set up was so straightforward that I could go into the classroom, concentrate on the lesson and quickly log in to Observe to type it up afterwards, which is great. I know Observe works really well with a tablet, but I chose not to use one.

If I do formal lesson observations with my staff, which are obviously a lot lengthier than a learning walk, I’ll take my laptop into the lesson with me to type up the record as I go. It just saves so much time doing it that way instead of printing off a form, hand-writing notes throughout the lesson and then typing them up afterwards.

WHAT ARE THE MAIN BENEFITS FOR YOU?

CT: Having everything in one place, especially since I moved up to an SLT role, as I can get more of an overview, which is great. Speaking in my head of faculty role, it’s nice that everything relating to my faculty is there, and I can get hold of it without it being too cluttered.

As an SLT member, you just don’t have the time to go through every department’s individual observations, whereas with Observe, even though it’s not something I have done a lot of yet, I can see the facility is there for me to analyse the results of our observations. At my previous school, observations were stored in Word documents in different areas, and you would have to go through each one to spot patterns. With Observe, I can use the Reports tool to quickly see the data I need.

TI: We’ve also saved time as we no longer have to photocopy and collate our observation and self-evaluation records.

IS THERE ANYTHING YOU’RE HOPING TO DEVELOP IN OBSERVE IN TERMS OF YOUR USAGE?

CT: We are looking at bringing appraisals onto Observe for the next cycle, as the appraisal system we use at the moment is paper-based. I want to move away from a system where people can quickly tick a box or complete it as soon as the target has been set, or which causes panic at the end of the year if a staff member hasn’t completed it. Instead of focusing on one tiny thing, it should be about improving your practice generally. And rather than quickly putting together one piece of evidence for a specific target, you should be able to attach evidence throughout the year to work towards the end goal of becoming a better teacher! I’m really interested in, and looking forward to, making use of the Resources and Actions features within Observe as we move forward with appraisals.

I’ve recently set up another learning walk template for the pastoral side of things, i.e. tutors. I found that really easy after my online training session (which was actually for appraisals rather than the learning walk). It was easy to transfer that training and set up this pastoral learning walk template.

WOULD YOU RECOMMEND OBSERVE TO OTHER SCHOOLS?

CT: Yes. The great thing about Observe is that it keeps everything in one place. I love that we can tailor our system to how we want it, and it’s not just a one size fits all. I’ve found the support from the whole team to be really helpful, and they’re always keen to hear ideas from schools about possible new features and developments.

SANDRA BARCLAY

Originally from Scotland, Sandra left her operations manager role with the Halifax when she and her family moved to Essex in 2007. With sixteen years of data management in banking behind her, Sandra put that experience to use when she joined a local secondary school as their Data Manager in 2010. The school became a SISRA Online customer in 2011, and Sandra was quickly impressed by both the efficiency and effectiveness of the system. A firm believer that accurate assessment data and challenging targets provide the basis for sound intervention, she encouraged and trained staff in the best use of Facility, Go 4 Schools and SISRA Analytics as powerful means to maximise student and school results. Sandra was delighted to join SISRA as a Data Consultant in 2018, expanding the consultant presence in the South East of England.

“From the pre-planning to on the day, Sandra exceeded our already high expectations of what we could get out of such a session. In addition it was a pleasure to spend the time with her – her enthusiasm and work rate is phenomenal!”
Data Manager, Hertfordshire

“Sandra was a fantastic help to us – excellent communication before the sessions to help us get the most out of it. Very supportive and knowledgeable and we all thoroughly enjoyed our days training. Thank you very much.”
Assistant Headteacher, Essex

“Having spoken to some of the members of staff that attended the training, they have all said how useful the training was and that they will now be able to use SISRA Analytics more effectively. The health check that was provided before the training was very effective and allowed me to touch up our system and make the improvements, thank you. We understand that because of the way that we needed the sessions to be run it would be tight to do so in the session time, so thank you for being able to make adaptations. We are going to look at the possibility of running more in-depth sessions with certain groups of staff, so if we do, we will be in touch. Thank you again for the fantastic training consultancy.”
Data Manager, Southwark

“Sandra was brilliant. So helpful and patient. Sandra introduced things that we were not using which has been met with much enthusiasm. Thank you Sandra for your patience and a great session.”
Data Manager, Luton

“We had a fantastic training session. Sandra explained everything really well and the session was personalised for our needs. We came away feeling very confident.”
Exams & Data Analyst, West Sussex

“After 4.5 years of using a different provider to do the analysis it is a breath of fresh air to find a company that is so professional to work with. I feel excited about my role again as I feel I will be able to deliver data to SLT and HODs with ease and accuracy. I am really looking forward to using this tool – I wish we’d picked SISRA Analytics years ago!”
Data and Assessment Manager, Harrow

“Sandra had a really calm, warm and professional manner and she was very patient.  She certainly opened my eyes to much more functionality within SISRA Analytics. She is a credit to your company and I would fully recommend her to other schools.”
Data Manager, Lewisham

Observations without grading

LIFE BEFORE OBSERVE

It was all paper and pen – we did use a different system, but only for a very short time.

When everything is on paper, you can’t really get a good overview of the school; everything sits in departments and faculties, which makes it difficult to judge across the school. With Observe, you can collate information and then use it to plan CPD and progress.

OBSERVATIONS WITHOUT GRADING

We’re making a big change this year in terms of how we want the templates and QA to look, so it’s taken a little time to work out what we wanted to do, and then how to create that in Observe. We’re moving away from the old Ofsted grading to ungraded observations; we will use a judgement instead of the old 1-4 grading.

We started using Observe for our SEF this year, for performance management, so we’re hoping to develop that further once we’ve created the new templates that move away from the Ofsted grading. I would also like to see us getting more use out of the Actions feature, and more Resources being uploaded and attached to records.

THE IMPACT

The reports information is really useful and quick to access, so I use it all the time to see what type of support the school needs, and what QA and CPD we need to put in place.

In our school, Observe is primarily used by middle and senior leaders. The middle leaders are able to view the data for their departments and staff, but everyone has their own account where they can log in and view their records.

We have a policy that verbal feedback must be given, so we’ll go through the record in person with the member of staff. Staff can view their records at any time, but we will always have a meeting to go through it after an observation.

I would 100% recommend Observe to other schools, absolutely. I already have actually!

Dawn Ashbolt, Director of Teaching and Learning, Wrenn School