
Reflections on the new inspection framework: 1742 days under the lottery finger
My school and I have spent 1742 days in the OFSTED window (our last inspection was Feb 2018).
1742 days where, on any idle Monday (and until recently Tuesday and Wednesday) the phone could ring and inspectors could arrive at my door.
Four years, on and off, of ‘will today be THAT day’ eats away at you.
Like the ’90s lottery finger, forever about to point down at our school.
Today, the first ‘post-OFSTED’ Friday, one member of staff quipped that, after being at the school five years, she’d never seen a ‘non-OFSTED Simon’.
And so, this September, when OFSTED asked for volunteers to face their new inspection framework, we put our hands up.
We’d had enough of waiting and just wanted it done.
For this I got some flak. Not much, but a slight undercurrent of me somehow being a ‘collaborator’ (?!)
So, before we get into the details, let me be very clear: I don’t have a problem with OFSTED as a concept.
I had a big problem with how the Ruth Perry tragedy was initially managed, but I personally believe that inspection has its place in our system.
Sorry – it’s just the way I feel.
Most importantly, before I recount what I learned from our inspection (one of the first under this new toolkit), I want to add a disclaimer:
MY SAMPLE SIZE IS ONE. THESE MAY NOT BE UNIVERSAL TRUTHS AND SHOULD BE TREATED WITH CAUTION.
I cannot count the number of LA meetings I’ve sat through as a Headteacher where one inspector’s personal flourish during inspection results in every school in the county being told to put a risk assessment above the goldfish bowl or make all the lunchbreak staff wear purple cagoules.
I will therefore try really hard not to start myths, but in case I fail, OFSTED (or Simon’s blog) haven’t told you to do anything…
I’m writing two blogs on this subject. This one, and then one on some tips which helped us through.
Part 1: Reflections and Observations
The inspection was exactly as advertised.
The inspection ran almost exactly as detailed in the OFSTED Operating Guide. The phone call came at exactly 9:30am and lasted 90 minutes covering everything their guide said it would.
The timetable for the first day wasn’t exactly the same as an inspection at another Trust school the week before, but it was pretty similar:
- The team walked through the door at exactly 8am.
- 8:15am: The inspectors met the staff and said hi (friendly and nice).
- 8:20am: The lead inspector then went through safeguarding documents and the SCR with the School Business Partner. Again, this was slightly different in what was asked for from the previous week’s inspection, but nothing that couldn’t be predicted.
- 9am: A meeting with team inspectors where they picked 6 children for case sampling (all SEND, PP, PP+, CP, LAC, or Service).
- Then Day 1 was a mix of the following:
- Learning walks – between 3 and 4 of these pretty much throughout the day. The first was with the lead inspector and myself walking the whole school, looking at leadership decisions relating to behaviour, inclusion, curriculum and teaching at a strategic level. The others focused on:
  - EYFS (but also ventured into KS1)
  - Inclusion
  - KS1 behaviour, inclusion, curriculum and teaching
  - KS2 behaviour, inclusion, curriculum and teaching
- Focused leadership meetings (Inclusion, Behaviour & Attendance, Curriculum).
The second day timetable wasn’t decided on until the end of day 1, and was framed around areas which hadn’t yet been explored (for us, Personal Development, and pinning down some details about attendance). This gave us the opportunity to show the inspection team things we thought they’d missed, which was a welcome change.
Most of the paperwork we’d made wasn’t needed
OFSTED have always told schools not to make paperwork just for them but school leaders like me always have. We slaved over a SEF and sent it to them before the inspection (along with our SDP) but it wasn’t really mentioned again. Some bits we’d made were useful (more on this in the next blog) but generally my beautifully labelled file sat in my office unopened.
The IDSR did drive some trails, but data wasn’t a massive part of the inspection
The IDSR was a ‘jumping off’ point for the inspection and did drive some inspection activity (for us around FSM attendance) and inspectors did grill us on our understanding of data (including how we’d used EYFS baseline analysis to re-shape provision for the current cohort). It provided the mood music for the inspection but wasn’t a massive part of it. This might have been because we had no red flags on our IDSR (and a number of greens), so, again, please don’t take this as a universal truth.
But once inspectors understood our understanding of statutory data, they stopped asking about it.
The tone of the inspectors was completely different to the previous four inspections I’d been involved in.
From the 90-minute phone-call onwards, there was a palpable shift in tone from the whole inspection team. During the initial call (now on Teams so you can see the person), the lead inspector checked how we were doing and offered simple phrases of encouragement such as ‘thanks – that’s really useful’. This lowered stress from the outset.
This was the pattern for the rest of the inspection. The team were… well… more human. They talked about their own experiences as school leaders and showed empathy for the difficult decisions which leaders must juggle. I even got to know a bit about them personally, which was a BIG no-no in all previous interactions.
Again, this greatly improved the professional dialogue.
The learning walks were a collaborative conversation
This again was a BIG difference. I was used to learning walks where an inspector would ask loaded questions like ‘tell me about the quality of learning here’, which actually meant ‘guess what’s inside my head’. Inspectors seemed genuinely curious and didn’t pre-judge.
The conversations seemed to flow easily and there was never any pressure to come up with a particular answer. I never felt the need to bullshit. That said, we knew our classrooms and provision VERY well. We had visited every class most weeks for about three years so knew exactly what we were going to see. We, as leaders, also learned the needs and provision of every disadvantaged pupil (130 of them) so were very well prepared (more on this in the tips blog to follow).
Please don’t mistake this comment for the walks being easy and requiring no prep – they were rigorous! But if you’re well prepared, they were fine.
The learning walks had a very broad scope (which was tricky)
One difficulty with the learning walks was that they didn’t have a specific focus (apart from the inclusion walk). This meant you needed to narrate subject pedagogies, how the needs of specific disadvantaged learners were being met and how behaviour expectations were maintained – all at the same time. This did lead to a bit of anxiety as it was a lot to remember (and we are a school with a singular pedagogy and very codified approaches to almost everything).
The expectation that leaders knew what provision was in place for disadvantaged children, not just generally, but specifically, was a lot. This resulted in our prep pre-inspection focusing on ensuring we had a clear understanding of this. Many, many hours in our case (we have 130 children on the SEND register alone). Now, who knows whether we over-killed to make ourselves feel more prepared, but this was probably the biggest time commitment leading up to the inspection – and we already knew our school and our teaching and our children VERY well.
There were a LOT of inspectors
In a 420-pupil primary like Blackhorse, we had four inspectors (and a Senior HMI QA’ing the process). We were encouraged to join every learning walk, which was great and added to a sense of transparency.
But that required four confident leaders for most of the inspection.
In my school, part of The Leaf Trust, we had myself (Executive Head), a very experienced Head of School, a very experienced Deputy Head and our Trust’s Director of Education, who knew the school and children really well. We also had our CEO, an inspector himself, on site the whole time. So, for us, it was fine – we could man-mark every inspector throughout with confident, articulate and experienced leaders. But in a smaller school? Or a maintained school without a wider network? This could be really challenging.
Focused leadership meetings looked to get the best from leaders
In the past, meetings with OFSTED involved a grilling from a poker-faced inspector who (and I know this from my own Tribal training) had been told not to smile.
In this new framework the questioning, although searching, was phrased slightly differently. It was more open, and inspectors listened more actively. They gave more encouragement and were clearer in what evidence they were looking for. They encouraged (especially more nervous middle leaders). They also seemed to be more on the school’s side: helping us find the evidence they needed. Again, don’t get me wrong – there was no room for bullshit, which they smelled in a heartbeat – but lots was done to put people at ease.
Case Sampling was fair – but is still a ‘developing art’
I’d built case sampling up in my head to be bigger than it was. Sure, they heard these children read; they looked through their books and focused on them in pupil conferencing; they talked to them on every learning walk. But this was about it. The team chose the six children but allowed us to suggest an alternative to a child who wouldn’t cope with that level of attention. I suspect this part of the inspection will develop over time.
Pupil conferencing disproportionately focused on disadvantaged children
Again, I get this. It’s something we talk about at a school level: does this school work for most or all?
They spoke to a LOT of children; at least five separate pupil conferences, almost all chosen by the inspection team: Y2-4 disadvantaged, Y5-6 disadvantaged, single-sex boys’ and girls’ groups (around personal development and protected characteristics) and our school council (which we were allowed to pick).
But when most evidence on teaching and curriculum is gathered from this group it can result in assumptions quickly becoming lines of enquiry which are doggedly followed.
So, the methodology is a big improvement
All the above points to a much better inspection methodology. I can’t fault the inspectors or the efforts they made to make sure we felt at ease (they even checked we’d eaten and had a cup of tea – a big change from the last inspection).
We all felt like we’d established a personal connection with the whole team by the end – which is the first time in twenty years I’d felt this.
But then we came to the toolkit…
We met at lunchtime on the first day, and then at the end of the first day (and the same on day 2) to look at the toolkit and its ‘secure-fit’ judgements. Nothing was therefore a surprise. Indeed, the Lead showed great courtesy in coming and seeing leaders when they felt a particular judgement could be controversial. Again, 10/10 for inspector behaviour.
But the toolkit…
The toolkit was followed to the letter
There is zero wiggle room in this framework. If you can’t prove every word of every statement, then that whole judgement is pushed down to the one below.
For ‘Expected’ this is easier as words like ‘broadly’ have been (quietly) added over the last month, which allows more scope for inspectors to flex. But for ‘Strong’, phrases like ‘all children’ and ‘all subjects’ appear. These leave zero room for manoeuvre.
My school has data above national in almost every measure. In old money we’d be knocking at the door of Outstanding in many areas. People visit us from all over to find out about our Personal Development, Inclusion and teaching approaches.
But this new toolkit raises the bar significantly.
I asked the Senior HMI if this seemed fair. He gave a good answer, namely that the toolkit sought to remove the biases and personal likes and dislikes of individual inspectors. I understand this, but the wording of the document as a whole needs refinement – in my opinion.
There are no ‘limiting judgements’ but Teaching & Curriculum, Achievement and Leadership all appear to be linked…
We pressed on this, as we thought this framework was advertised as being more nuanced – with no one judgement causing a domino effect on others. However, these three areas seem to need to align under the new toolkit. Indeed, clarification was sought from the lead who rang the helpdesk – which seemed to confirm this.
People need to get ready for a complete re-set in grade expectations
Whilst inspectors are not supposed to draw any parallels between the old ‘Good’, ‘Outstanding’ etc, I know of a number of inspectors who all say the same: ‘Expected feels like a high good’, ‘Strong feels like a high outstanding’. Exemplary? Who knows! Somewhere inhabited by unicorns and fairies, I guess…
I’m told the ‘new normal’ will be lots of ‘Expected’ judgements, with most schools getting one or more ‘needs attention’ and some getting some ‘strongs’. A flattening of the bell-curve.
My understanding is that all the pilot school inspections will be released on the same day so that OFSTED can ‘manage the press’…
And this is my main beef.
Having crafted a great methodology, attaching ‘Expected’ as a label to a much higher bar seems out of kilter with the spirit of this new framework. The previous ‘Secure’ (which was mooted as the main judgement grade) was much better. ‘Expected’ as a term seems quite, well, … ill-considered.
I worry it will erode morale unnecessarily. That great school leaders will feel like their additional efforts… the time away from their own families to create great schools, will be described as ’expected’… ‘meeting statutory obligations’… ‘mediocre’…
And so, I finish with a poem. A poem which I have read at every open morning for the last 15 years as I pulled Blackhorse from the edge of special measures to one of the most popular schools in the county. Please don’t read this as any critique of our own inspection, which is yet to be signed-off – it isn’t. But it is a worry about this new approach.
The poem is about why we don’t do ‘average’ at Blackhorse. Why we want every child to leave believing that they are a champion…
Because, we all must ‘do good as we go’…
The Average child – by Mike Buscemi
I don’t cause teachers’ trouble;
My grades have been okay.
I listen in my classes.
I’m in school every day.
My teachers think I’m average;
My parents think so too.
I wish I didn’t know that, though;
There’s lots I’d like to do.
’Cause, since I found I’m average,
I’m smart enough you see
To know there’s nothing special
I should expect of me.
I’m part of that majority,
That hump part of the bell,
Who spends his life unnoticed
In an average kind of hell.