Henry Stewart has exposed the serious shortcomings of the latest Department for Education (DfE) proposals in his post on Local Schools Network. The gist of his argument is as follows.
Schools with ‘high attaining intakes’ are very unlikely to be defined as coasting.
The first measure of “coasting” catches a school in which fewer than 60% of pupils achieve five GCSEs at grades A*–C including English and Maths. How likely a school is to pass this benchmark correlates closely with the average KS2 point score of its intake. Of schools whose children have an average KS2 point score of 30 or more (where a 4a is equal to 29 points and a 5c to 31 points), precisely none would be caught by this measure. However, of schools with an average point score below 26 (where a 4b is equal to 27 points), 97% fail to reach the benchmark.
The second measure of coasting is the percentage of pupils making “expected progress” in English and in Maths. Of secondary schools with an average pupil KS2 point score of less than 26 (where a 4c is 25 points and a 4b is 27), 70% fall below the threshold in English and 82% in Maths. In contrast, for schools with an average point score of 30 or more (where a 4a is 29 points and a 5c is 31), only 2% fall below in English and 1% in Maths. The 163 English grammar schools are all in that top category (along with 35 comprehensives) and are therefore certain to avoid being classed as “coasting”.
The expected three levels of progress mean a stretching grade C for a pupil entering with a 4c at age 11, but only a grade B for a pupil entering with a 5a. As Michael Wilshaw has argued, level 5 pupils (especially those on 5b or 5a) should be achieving As.
Take an example. If a grammar school had 100% of its intake entering with level 5s (as some grammars do), then it should be expected to get them to As or A*s. However, “3 levels of progress” only requires them to achieve a B, and in 2014 being above the median meant getting 74% of pupils to a B or better in English, or 67% in Maths. If this grammar got no As or A*s and got only 70% of its students to a B in Maths, it would still not be deemed “coasting”.
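The arithmetic of that example can be sketched in a few lines of Python. The mapping from KS2 entry level to the GCSE grade three levels higher follows the standard progression ladder described above (level 4 → C, level 5 → B); the pupil distribution is the hypothetical grammar school from the example, not real data.

```python
# "3 levels of progress" sketch: entry level -> GCSE grade three levels on.
# Mapping assumed from the standard DfE progression ladder (4 -> C, 5 -> B).
EXPECTED_GRADE = {4: "C", 5: "B", 6: "A"}

GRADE_LADDER = ["G", "F", "E", "D", "C", "B", "A", "A*"]  # low to high

def meets_expected_progress(entry_level: int, gcse_grade: str) -> bool:
    """True if the GCSE grade is at or above three levels of progress."""
    target = EXPECTED_GRADE[entry_level]
    return GRADE_LADDER.index(gcse_grade) >= GRADE_LADDER.index(target)

# Hypothetical grammar school: all 100 pupils enter on level 5,
# 70 leave with a B, 30 with a C, none with an A or A*.
results = ["B"] * 70 + ["C"] * 30
share_meeting = sum(meets_expected_progress(5, g) for g in results) / len(results)
print(share_meeting)  # 0.7 -- above the 2014 Maths median of 67%, so not "coasting"
```

The point the code makes is the one in the text: a B counts as full expected progress for a level 5 entrant, so a grammar school producing no As at all can still sit comfortably above the median.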
Does the government intend never to tackle ‘underperforming schools’ in leafy areas, or does it really not understand the measures that it is choosing to use? The latter seems unlikely: the DfE employs highly qualified statisticians. Given that Ofsted behaves as if it were an executive arm of the DfE rather than a truly independent regulator of education standards, the result will be that huge numbers of secondary schools serving less affluent communities, which produce lower intake attainment scores, are lined up for privatisation through the compulsory academisation mechanism.
In November 2006, the NFER journal ‘Practical Research for Education’ published my article, ‘Cognitive Ability and School Improvement’. This argued for the use of standardised school intake ability scores in assessing the effectiveness of secondary schools, through an LEA-managed approach based on producing regression data as in the following example for Cumbria schools in 2002.
The named school with by far the lowest average GCSE points and average intake Cognitive Ability Test (CAT) score was The Alfred Barrow School, where I was headteacher at the time.
The chart shows all the Cumbria secondary schools along the horizontal axis according to the average CAT score of the children admitted in 1997. The vertical axis gives each school’s GCSE performance in 2002 for the same year group of children. The former DfES calculation of GCSE point score is used, but the chart has a very similar form if the percentage of pupils with 5 or more A*–C grades (5+A*–C) is substituted. The line shows the relationship between average intake CAT score and subsequent GCSE performance. This regression gives a correlation of 0.84, which means that school GCSE performance is very strongly predicted by the CAT. The significant features of the chart are as follows.
- There is a vast difference between the average intake cognitive abilities of the lowest and highest performing schools. The lowest performing school, Alfred Barrow, had an average intake pupil CAT score at the 16th percentile. This means that only 16 per cent of 11 year-olds in England had a lower score than the average for pupils admitted to this school in 1997. The highest performing school, Cumbria’s only selective grammar, admitted pupils with an average cognitive ability score much higher than the national average of 100.
- The chart, with its very high correlation, takes no account of social deprivation or any other factor that might constrain or promote pupil and hence school performance. This suggests that individual cognitive ability is a much stronger factor in school achievement than social deprivation, or else that cognitive ability and social deprivation are related. Both possibilities challenge the DfE model of school achievement and effectiveness.
- The regression line shows the average performance of Cumbria schools in relation to the intake cognitive abilities of pupils. Schools below the line had a worse than average performance and schools above the line had a better than average performance.
- When schools were ranked in order of distance from the regression line, the resulting rank order bore no relation to the published GCSE league table for Cumbria.
- Alfred Barrow was bottom of the Cumbria GCSE league table in 2002 but in the top half of the Cumbria rank order of effectiveness in relation to intake cognitive ability, without making any allowance for the acute social deprivation suffered in its catchment area.
- This means that when its 2002 leavers were placed in rank order of CAT score as measured on entry to the school in 1997, the GCSE results of each pupil compared favourably, on average, with pupils of the same CAT score in the school with the highest GCSE score, which was top of the DfES GCSE league table for Cumbria in 2002. This was a statistical necessity resulting from the positions of the two schools on the chart with respect to the regression line. This may well have been true throughout the ability range, the chief difference being that Alfred Barrow had far fewer (but still some) high CAT score pupils, and far more low CAT score pupils.
- There is nothing in these data to suggest that the minority of Alfred Barrow pupils with high intake CAT scores underperformed compared with pupils of similarly high intake CAT scores in other schools.
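The regression-and-residuals approach described above is simple to reproduce. The sketch below fits a least-squares line of GCSE points against average intake CAT score and ranks schools by their distance above or below that line. All the school names and figures here are invented for illustration; they are not the actual Cumbria data, though the synthetic figures are chosen to give a similarly strong correlation.

```python
# Sketch of the regression model: fit GCSE outcome against average intake
# CAT score, then rank schools by residual (distance above/below the line).
# Data below are synthetic illustrations, not the real Cumbria figures.
import numpy as np

# (school, average intake CAT score, average GCSE point score) -- invented
schools = [
    ("School A", 88, 30), ("School B", 95, 36), ("School C", 100, 40),
    ("School D", 104, 46), ("School E", 110, 52), ("School F", 97, 42),
]
cat = np.array([s[1] for s in schools], dtype=float)
gcse = np.array([s[2] for s in schools], dtype=float)

slope, intercept = np.polyfit(cat, gcse, 1)   # least-squares regression line
predicted = slope * cat + intercept
residuals = gcse - predicted                  # above the line = positive

# Effectiveness ranking: largest positive residual first. A "coasting"
# definition on this model would be a persistently negative residual.
ranking = sorted(zip((s[0] for s in schools), residuals), key=lambda t: -t[1])
for name, r in ranking:
    print(f"{name}: {r:+.1f} points relative to the regression line")

correlation = np.corrcoef(cat, gcse)[0, 1]    # the real Cumbria figure was 0.84
```

Note that a league-table ranking (sorting on raw `gcse`) and an effectiveness ranking (sorting on `residuals`) can disagree completely, which is exactly the Alfred Barrow point: bottom of the league table, yet above the line.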
Is this a model for a much better definition of ‘coasting schools’? I believe it is. At that time the Cumbria Education Service was beginning to use such regression-based data to evaluate the performance of Cumbria secondary schools. A local Cumbria definition of a ‘coasting’ school could well have been falling below the regression line. It should be noted that some of the highest performing schools at GCSE fell into this category.
However, at that time in Cumbria as elsewhere, LEAs were being replaced through the ‘Children’s Services’ reorganisation, and the political climate anticipated the eventual dismantling of the LEA-based national education system created by the 1944 Education Act. In Part 5 of my book, ‘Learning Matters’, I argue for the restoration of the equivalent of LEAs and democratic local control of the school system. While such ideas will not appeal to the present Conservative government, they are very much on the agenda of the Labour opposition led by Jeremy Corbyn.
There are two problems with my model in relation to a national policy for identifying and challenging ‘coasting schools’.
The first is that it relies on universal CAT testing. In ‘Learning Matters’ I argue that KS2 SATs should be abolished in favour of universal Y6 CAT screening of all pupils, as now happens in the Hackney LA. However, because cognitive ability drives the outcomes of all tests that measure the development of cognition, there is a good correlation between CAT scores and KS2 point scores. Therefore GCSE/KS2 regression charts could perform the same function.
The second is that the DfE is seeking to establish a basis for judging school performance based on national rather than local norms. The national equivalent chart would therefore contain the data for all the secondary schools in England. Those appearing below this national regression line could fairly be judged to be producing less academic progress for their pupils than the schools above the line.
LEA specific regression charts would still have a useful function as they could be a sound basis for comparing the effectiveness of LAs in bringing about school improvement.
Nothing is ever perfect, and even on my regression-based model there is still room for factors like the mix of GCSE subjects offered to, and taken up by, pupils in different schools to distort the comparisons.
And none of this invalidates the greatest statistical health warning of all. There is much more to quality education and a good school than just maximising GCSE performance per pupil.