I find nothing more irritating than to have my intelligence insulted, particularly by people who think they are being helpful but haven’t actually got a clue.
If you have ever come across the ESFA's 'Efficiency Neighbours' data you'll know what I mean. This blog is for any school leaders who are told they should use this data to help them identify financial efficiencies. To save you some time, I publish below my submission to the ESFA, in which I try to explain why their data is rubbish.
Now I don’t like to bite the hand that feeds, so I will repeat that I think this data has been put together in good faith. Financial efficiency is important and it is right that schools should be expected to benchmark costs against similar schools and identify ways they could operate with better value for money. Thanks for trying to help. It’s just… well, perhaps some vague understanding of the education system to complement your accountancy qualification might be in order to avoid embarrassing yourself.
You may point out (not unreasonably) that I might be in a better position if I had an accountancy qualification. You are probably right. However, in the five years I have been a headteacher I have had to make more than £1m of cost savings in my school. I am no accountant, but I do know a thing or two about efficiency. I know how to suck those eggs.
For the blissfully unaware, the 'Efficiency Neighbours' data places your school alongside 49 other 'similar' schools and ranks these according to how financially efficient they are. The intention is that you can look for more efficient schools and go and find out how they are doing it. On the surface, this appears sane. However, the data is deeply flawed, both in how 'similar' schools are selected and in the gross misuse of Progress 8 data as a measure of outcomes. The result for our school is a league table that ranks us 43rd for efficiency against a cohort which includes 37 grammar schools (we are not selective).
‘Efficiency’, according to this model, can be measured by comparing ‘input’ (per pupil funding) to ‘output’ (Progress 8 score). Therefore, a school with higher per pupil funding and a low Progress 8 score is inefficient, and vice versa. To an accountant, this no doubt makes absolute sense, but anyone who understands schools will know what a gross over-simplification of school effectiveness this is.
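To make the over-simplification concrete, here is a minimal sketch in Python of what the model appears to compute. Every number and field name is invented for illustration, and the ESFA's actual methodology may differ in its details; the point is only the shape of the calculation: one 'output' divided by one 'input', then a ranking.

```python
# A minimal sketch of the ranking logic as I understand it. All figures and
# field names are invented for illustration; the ESFA's actual methodology
# may differ in its details.

schools = [
    {"name": "Grammar A", "p8": 0.45, "funding_per_pupil": 6200},
    {"name": "XXX School", "p8": 0.05, "funding_per_pupil": 6800},
    {"name": "School B", "p8": -0.10, "funding_per_pupil": 6500},
]

# 'Efficiency' as output over input: P8 score per pound of per-pupil funding.
for s in schools:
    s["efficiency"] = s["p8"] / s["funding_per_pupil"]

# Rank most 'efficient' first. Nothing here accounts for intake profile,
# curriculum difficulty, or the statistical uncertainty in a P8 score.
ranked = sorted(schools, key=lambda s: s["efficiency"], reverse=True)
for rank, s in enumerate(ranked, start=1):
    print(f"{rank}. {s['name']} (P8 {s['p8']:+.2f}, £{s['funding_per_pupil']} per pupil)")
```

On these invented numbers the selective school tops the table automatically, because nothing in the calculation asks why its P8 score is high.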
Anyway, here is my response…
Why the ‘Efficiency Neighbours’ data is invalid for our use
The ranking of XXX School against 49 other ‘comparable’ schools in terms of ‘efficiency’ lacks validity, and therefore cannot be used to draw any useful conclusions. What undermines the validity of the data-set is the use of Progress 8 scores as a measure of comparable outcomes. This methodology is flawed for three reasons:
- Progress 8 is not a reliable measure of school effectiveness
- Comparing schools by Progress 8 scores is statistically invalid where no significance testing is applied
- Progress 8 scores are inflated for grammar schools and therefore comparing XXX School (non-selective) against a data-set which includes 37 grammar schools creates a misleading impression of relative performance
There are many reasons why Progress 8 is not a reliable measure of school effectiveness; however, the key flaws are:
- The measure is dependent on the curriculum choices made by students. Schools can make students take combinations of subjects which guarantee all the 'buckets' are filled, thus inflating their Progress 8 score. Equally, schools can enter students for 'soft' qualifications (as we have seen with the ECDL qualification), which significantly inflate the Progress 8 score (schools entering over 90% of students for ECDL in 2017 inflated their P8 score by 0.2). We give an open choice of GCSE subjects to students and do not bulk-enter students for soft qualifications. This has a negative effect on our P8.
- The measure does not account for the 'difficulty' of the subjects taken. FFT analysis shows that our P8 score was 0.25 lower than it would have been had we entered students for qualifications in line with the national picture. In other words, because our students choose subjects in which it is harder to achieve high grades (such as foreign languages), our P8 score is reduced.
Due to the above, attempting to compare school effectiveness by using Progress 8 scores alone will lead to invalid conclusions.
Furthermore, ranking schools by Progress 8 scores, as is done in the 'Efficiency Neighbours' tool, is also invalid for statistical reasons. It is only possible to say that one school had a 'better' P8 score than another if significance testing is applied, and even then this can only be stated with the relevant statistical degree of confidence. A school with a P8 score of -0.2 did not necessarily achieve 'worse' than a school with a score of +0.2. Both may fall within the average range, and no more can be said than that both achieved scores in line with the national average. It is statistically nonsensical to state that any schools got a 'better' P8 score than XXX School unless their P8 score was significantly above average following significance testing.
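To illustrate, here is a rough sketch in Python of the kind of check that would be needed before declaring one P8 score 'better' than another: build a confidence interval around each score and treat any interval that straddles zero as 'in line with the national average'. The cohort size and the pupil-level spread of scores here are assumptions chosen for illustration, not official figures.

```python
import math

# Illustrative only: the cohort size and the national pupil-level spread of
# P8 scores (SIGMA) are assumptions chosen to make the point, not DfE figures.
SIGMA = 1.2  # assumed pupil-level standard deviation of P8 scores

def p8_confidence_interval(score, cohort_size, z=1.96):
    """Approximate 95% confidence interval around a school-level P8 score."""
    margin = z * SIGMA / math.sqrt(cohort_size)
    return score - margin, score + margin

for name, score in [("School X", -0.2), ("School Y", +0.2)]:
    lo, hi = p8_confidence_interval(score, cohort_size=120)
    verdict = ("in line with the national average" if lo < 0 < hi
               else "significantly different from average")
    print(f"{name}: P8 {score:+.2f}, 95% CI [{lo:+.2f}, {hi:+.2f}] -> {verdict}")
```

On these assumed cohort sizes, both the -0.2 and the +0.2 school come out 'in line with the national average': exactly the distinction the ranking erases.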
Finally, comparison with grammar schools is problematic. Although there may be similar levels of SEN and deprivation (which is the basis for the selection of comparator schools in the tool), the cohorts of the schools are not comparable in other, equally important, ways. XXX School is not selective and therefore we attract the full range of abilities. This has significant implications for how 'efficient' the school can be with its resources. Grammar schools are able to teach in large, able groups. They do not need to direct resources towards students who have not met national expectations at primary school. Grammar schools are also almost universally over-subscribed and therefore can admit an economically viable number of students (i.e. multiples of 30), which means pupil-teacher ratios can be maintained at an efficient level. At XXX School, we must fund smaller intervention groups and support for students with low levels of prior attainment. Our intake numbers fluctuate and therefore class sizes are not always optimal year-on-year. The breadth of the abilities of the students also means that a more diverse curriculum is required. In short, we have to make our resources stretch further.
The Progress 8 score for grammar schools is also not comparable to non-selective schools. A school's Progress 8 score is proportional to the average prior attainment level of its students. So, for example, schools in the top 10% of prior attainment achieve, on average, +0.3 P8. Schools with an intake profile similar to ours achieve, on average, between 0 and +0.1. This is widely believed not to be due to the effectiveness of the school, but to a variety of factors which skew the data. Analysis undertaken by FFT Education Datalab points to the effect of the grammar school selection process in attracting students who have underperformed in SATs tests but who are actually very able and therefore pass the 11+. This results in the Progress 8 score for these pupils being artificially high. Research by Tom Perry at the University of Birmingham shows that, after adjusting for such 'compositional errors', the actual Progress 8 score for grammar schools may be over-estimated by up to 0.5 points.
In summary, the conclusion that our school is ranked 43rd of 50 'similar' schools in terms of efficiency is false and unsubstantiated. To compare our outcomes with these other schools, one would need to take the following steps (sketched in code after this list):
- Compensate for the selection effect in P8 scores (adjusting grammar schools downwards by 0.5)
- Perform significance testing to identify those schools significantly above average.
- Account for contextual demographic factors, curriculum models and subject difficulty to establish the underlying ‘value added’ as an indicator of school effectiveness.
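As promised above, here is a rough sketch of what such a corrected comparison might look like, with invented school data. The 0.5-point downward adjustment for selective schools follows Perry's estimate cited earlier, and the significance check reuses the confidence-interval logic sketched above; neither constant is an official figure.

```python
import math

SIGMA = 1.2               # assumed pupil-level SD of P8 scores (illustrative)
GRAMMAR_ADJUSTMENT = 0.5  # downward correction, per Perry's estimate cited above

# Invented school data for illustration only.
schools = [
    {"name": "Grammar A", "p8": 0.45, "cohort": 150, "selective": True},
    {"name": "XXX School", "p8": 0.05, "cohort": 120, "selective": False},
    {"name": "School B", "p8": 0.15, "cohort": 200, "selective": False},
]

def adjusted_band(school):
    """Band a school's P8 after correcting for selective-intake inflation."""
    score = school["p8"] - (GRAMMAR_ADJUSTMENT if school["selective"] else 0.0)
    margin = 1.96 * SIGMA / math.sqrt(school["cohort"])
    if score - margin > 0:
        return score, "significantly above average"
    if score + margin < 0:
        return score, "significantly below average"
    return score, "average band"

for s in schools:
    score, band = adjusted_band(s)
    print(f"{s['name']}: adjusted P8 {score:+.2f} -> {band}")
```

Run on these illustrative numbers, all three schools, the grammar school included, land in the average band, which is precisely the point.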
It is likely that the 'ranked' Progress 8 list resulting from the above would place the majority of schools in the average band and a very small number (fewer than 5, by my estimation) in the 'significantly above average' band.
Following this analysis, one would need to account for the school contextual factors in terms of the potential for an efficient business model.
In conclusion, the 'Efficiency Neighbours' data-set fails to provide any useful indicator of school efficiency or to point us towards useful sources of information. I hope you understand that we will therefore depend on more reliable and nuanced indicators of financial efficiency.
References
https://teacherhead.com/2015/05/02/progress-8-looks-like-data-garbage-to-me/
https://schoolsweek.co.uk/how-progress-8-disguises-grammar-school-pupils-true-attainment/
If we are to deal with the serious shortage of funding in our school system, we need those in positions of authority to understand education. We need support and advice from people who know what they are doing. This kind of thing is just not good enough. My job is hard enough without me having to explain the basics to people who should know better.