During almost ten years in education, my overall use of data is at an all-time low. I know many people have a need to quantify: whether it's the number of steps taken with a pedometer, followers on social media, or mentally totting up the number of cups of coffee consumed before midday, if we can add it up, average it, or worse, we will! Sometimes people don't even mind if the resulting figure is made up, as long as it gives us some information to paste into our latest pack or document.
Money? Love? No… DATA makes the world go round!
I too enjoy a graph, a table of data or an investigative hour with a spreadsheet. I can't lie about that, but if you need your fix, the Office for National Statistics website* has plenty of data to play with, so it might be best to use that as your sandbox rather than something which might actually affect someone's life. Such horror stories include (but are not limited to) appraisal 'targets' for teachers based on Progress 8 predictions for Year 7. If you fancy a laugh at data's expense, there are some incredible examples which tie in with this sentiment on Tyler Vigen's popular site*, Spurious Correlations. *See links at the bottom of the page.
I frequently devote time to thinking of ways to monitor the quality of teaching and learning without applying the traditional 'outstanding' to 'inadequate' scale. In fact, how to evaluate teaching and learning without grades is the topic I am asked about most whenever I'm working with a school or showing Observe to a school that has never seen it before.
For schools that have stopped grading altogether, there are still plenty of ways to analyse what's observed while steering clear of whole grades for lessons.
All the analyses below have been taken from real schools using SISRA Observe for their observations (identifying data removed).
Yes/No Questions
Lots of schools have opted for a set of questions based on what they expect to see in a lesson. Observers then simply record 'Yes, I saw it' or 'No, I didn't see it'.
Quantifying observation data then becomes straightforward, if a little limited. Over time, this approach can certainly give a good overall picture of what's being done consistently, and what needs to be embedded further. It's also reasonable to apply targets to this type of information as it is objective in nature and works well at whole-school level.
| Question | % Yes | % No |
|---|---|---|
| Teacher in class to greet pupils on arrival? | 75 | 25 |
| Are folders set up for each lesson within the unit? | 66 | 34 |
| Resources ready so pupils begin to learn on entry? | 100 | 0 |
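For anyone curious how figures like these come about, the tally is nothing more exotic than counting answers per question. Here is a minimal sketch, with invented questions and answers standing in for real observation records:

```python
from collections import Counter

# Hypothetical drop-in records: one Yes/No answer per observation, per question.
observations = {
    "Teacher in class to greet pupils on arrival?": ["Yes", "Yes", "Yes", "No"],
    "Are folders set up for each lesson within the unit?": ["Yes", "Yes", "No"],
    "Resources ready so pupils begin to learn on entry?": ["Yes", "Yes", "Yes"],
}

for question, answers in observations.items():
    counts = Counter(answers)
    pct_yes = 100 * counts["Yes"] / len(answers)
    print(f"{question} {pct_yes:.0f}% Yes / {100 - pct_yes:.0f}% No")
```

The point is less the arithmetic than the habit: the same handful of questions, asked the same way each time, is what makes the percentages comparable across terms.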
Further analysis could then be performed at faculty or subject level to provide more detail and allow targeted support. Analysing the data in this way takes the focus off what’s not being done and can show where standards are consistently high.
Areas of Strength and Development
Another effective approach is to decide which areas to focus on, taken from a set of whole-school priorities. This works best when those areas are well thought out and relate directly to a range of objectives for the year. This could provide a direct link back to the whole-school improvement plan and individual faculty improvement plans, as well as the Teachers' Standards if desired.
Observers choose one or more strengths of the lesson from a finite list and then choose one or more areas of development from the same list. Accompanied by written and verbal feedback, over time this can provide a wealth of information including matching up staff who have complementary strengths and areas for development. Schools using Observe tell us that they use this information to pair up teachers for joint observing and lesson study.
The school in the example above can immediately see what they do well (climate, relationships) and in which areas staff require more support (challenge, assessment).
Student Based Questions
‘Were students focused on learning?’
‘Did students have the right equipment?’
Some schools prefer to use a scale such as: All Students, Most Students, Some Students, No Students. Taking this approach with multiple drop-ins or learning walks can allow a leadership team to see who is consistently meeting the expectations and who may benefit from some support. Once a school is satisfied that an area is being met consistently, they may decide that observing it regularly is not necessary, which could allow them to concentrate their efforts elsewhere.
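Because the scale is ordered, repeated drop-ins can be summarised without ever grading a lesson. A minimal sketch, with invented teachers and ratings (and a threshold of 'Most Students' or better, which is my own assumption, not a fixed rule):

```python
# Ordered scale: position in the list stands in for how fully the expectation was met.
SCALE = ["No Students", "Some Students", "Most Students", "All Students"]

# Hypothetical ratings gathered over several drop-ins.
drop_ins = {
    "Teacher A": ["All Students", "Most Students", "All Students"],
    "Teacher B": ["Some Students", "Most Students", "Some Students"],
}

for teacher, ratings in drop_ins.items():
    # Average position on the scale: higher means expectations met more fully.
    avg = sum(SCALE.index(r) for r in ratings) / len(ratings)
    # "Consistent" here means Most or All Students on every visit.
    consistent = all(SCALE.index(r) >= 2 for r in ratings)
    print(f"{teacher}: average {avg:.1f}/3, consistently meeting expectations: {consistent}")
```

The useful output isn't a grade for any one lesson; it's the pattern over time, which is exactly what lets a leadership team decide when an area no longer needs regular checking.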
In my experience, it’s certainly possible to use data in a reasonable and sensible way in relation to observations of all types. If you’ve moved away from grading whole lessons, these approaches can allow you to measure impact over time and also shout about what’s going well.