NAHT - For Leaders, For Learners

Warwick Mansell

The former TES journalist writes for NAHT on current education issues. The views expressed do not necessarily reflect those of NAHT

Nearly one in five primary schools to be below KS2 floor targets from 2016?

This, I know, is a headline to warm the hearts of head teacher readers of this blog, many of you no doubt now recovering from the recent stress of Sats week.

But the question has to be asked: how many schools stand to be below the government’s new key stage 2 floor targets once they are introduced from 2016?

I’ve been looking at the likely impact of the new accountability system. This was first announced, after much delay, in March, and I blogged about the DfE paper as a whole here: http://bit.ly/1ohjTRL.

In this blog, I am concentrating on the calculation of the new floor target measures, announced in that paper http://bit.ly/P5Roqj .

The headlines are that, from 2016, schools will be below the government’s baseline targets – officially called “floor standards” (Note 1) – if both of two things happen.

- The school has fewer than 85 per cent of its pupils achieving the expected level in all of the reading and maths tests and the writing teacher assessment. This is called the attainment target.

- It fails to do well on a measure which assesses pupil progress between the end of KS1 and the end of KS2. This is called the progress target. (Note 2)
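The two conditions combine with an AND: a school escapes the floor by clearing either one. A minimal sketch of that rule, with my own function name, and with a default VA threshold of 100 that is only an assumption (the DfE has not confirmed the exact figure):

```python
def below_floor(pct_at_expected: float, value_added: float,
                va_threshold: float = 100.0) -> bool:
    """True only if a school misses BOTH elements of the proposed
    floor standard: attainment (fewer than 85 per cent of pupils at
    the expected level in reading, writing and maths) AND progress
    (a KS1-2 value-added score below the threshold, where 100 is
    the national average)."""
    missed_attainment = pct_at_expected < 85.0
    missed_progress = value_added < va_threshold
    return missed_attainment and missed_progress
```

On this sketch, a school with only 70 per cent attainment but a value-added score of 101 would still be above the floor, because it clears the progress element.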

Readers of this blog will need no reminding that falling below the new floor standards could be the trigger for something quite dramatic, from the school’s point of view. The government’s document says that “A school will come under additional scrutiny through inspection if it falls below these minimum [my italics] standards.” It goes on: “In some cases, intervention may be required and could result in the school becoming a sponsored academy.”

So, to repeat, how many schools could be below target?

Well, it is fairly easy to check what the statistical position would be, assuming that the current level of performance in English and maths tests and teacher assessments continues, and also assuming that schools with below-average value-added data would fail to achieve the progress target.

So I looked at the key stage 2 league tables for 2013 to find the data.

Ok, so let’s look at these indicators again, in detail. The new attainment element is actually based on a higher “expectation” than under the current system, in that for any pupil to count as being at expectation, they have to achieve at level 4b, rather than at the slightly lower threshold of “only” level 4.

In other words, from 2016, schools will need to have at least 85 per cent of their pupils achieving at least level 4b in all of reading, writing and maths to achieve the attainment part of the floor standard.

In the ever-so-detailed 2013 league tables, there is a column which says what percentage of pupils in each school achieved level 4b in both the reading and the maths tests, and at least level 4 in the writing teacher assessment. (Note 3)

So, crunching the numbers quite quickly on Excel (Note 4), I make it that only 11 per cent of schools would be above the attainment element of the new “floor standard” as things stand, ie based on the most recent data, which is for 2013.

In other words, only 1,584 of the 14,493 schools for which there was useable data score at least 85 per cent on this measure, meaning they would be above this element of the floor target, with 89 per cent below it.
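The check behind these figures amounts to a simple count over the league-table rows. A rough sketch of it in Python, using a handful of invented records in place of the real 2013 data (the names and figures are illustrative, not the DfE’s):

```python
# Each record: (school name, % of pupils at level 4b in the reading
# and maths tests and level 4 in the writing TA). Invented figures.
schools = [
    ("School A", 91.0),
    ("School B", 86.0),
    ("School C", 72.0),
    ("School D", 55.0),
]

# Schools clearing the attainment element of the floor standard.
above = [name for name, pct in schools if pct >= 85.0]
pct_above = 100 * len(above) / len(schools)
```

On the real 2013 data, the equivalent count gives 1,584 schools above the 85 per cent line out of 14,493 with useable data.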

These, of course, are fairly remarkable, if not surprising, figures. It is extraordinary to be presented with a measure for a “minimum” acceptable level, below which dramatic things happen, which nearly nine out of 10 schools seem on current performance to be in line to miss.

It does, of course, invite questions as to what the purpose of these standards is. It seems clear, both from internal DfE documents and from general political discussion, that maximising the number of sponsored academies is a key DfE priority. And triggering sponsored academy status is included as a possible outcome of falling below this floor. So perhaps you don’t need to be a conspiracy theorist to wonder if there is a connection between the two.

The statistics are all the more remarkable when one considers that level 4 – that’s the bottom of level 4, rather than the middle, as in level 4b – was originally defined as the performance to be expected of the average (note 5) child in England. This was when the levels system was first devised, in the late 1980s, and performance standards were set down.

We now have a situation where schools will be in trouble, and defined as performing below minimum expectations, if approaching 90 per cent of their pupils cannot achieve above what used to be the average level of performance in each of three different subject domains. My, how far we have come. I wonder if politicians, if they are treating these numbers as seemingly the final word on school quality, don’t at least have a duty to acknowledge the changing picture, the dramatic improvement, if taken at face value, that the data suggest has taken place over the past 20 years.

But hold your horses, I can hear the DfE and its supporters retorting. For we haven’t yet got to the second part of the floor standard equation.

So, we know that, on 2013 data, 11 per cent of schools would be above the floor target on attainment and thus cannot fall below it overall.

But of the remaining 89 per cent, many will be “saved” from being below the floor overall if they can demonstrate good progress with their pupils. But how many?

This, it turns out, depends on exactly where the threshold for success or failure in the progress element of the floor targets is set.

The latest DfE paper on how the new system is going to work does not actually set out what value-added score will count as “failing to do well on progress”.

So, initially, I assumed that any school coming just below the national average on overall value-added would fail on that aspect of the floor standard. In other words, any school scoring below 100 on that measure would fall into that category.

On my calculations, the bad news for heads would be that fewer than half of the schools falling below the 85 per cent target on attainment would be “saved” by doing well on progress, if we assume that they would need a value-added figure of at least 100 to escape the floor.

In terms of the measure used for progress, again I looked at the 2013 league tables and in particular at each school’s value added score, which compares pupil scores at KS2 with those at KS1, taking into account all of the reading and maths test results and the writing teacher assessment. (Note 6)

So, I make it that, on the 2013 data, 5,438 of the 12,714 schools which scored below 85 per cent on our attainment measure were above 100 on KS1-2 value added, and thus it seems would have escaped being below the floor standard overall.

But this still leaves 6,801 schools which, on 2013 data, were both below the 85 per cent attainment measure and also below average (ie below a score of 100) on value added. (Note 7)

That, then, is 6,801 out of the 14,298 schools overall for which there is full data (Note 8) seemingly falling below the floor targets.

In other words, by my very rough calculations, 48 per cent of schools would be below the government’s new “floor standards” if they were introduced on 2013 data and if the VA score needed were 100 or more.

That, however, may be unduly pessimistic for schools. For, in last July’s consultation by the DfE, which first set out proposals as to how the new system would work, a proposed figure for the VA score needed to escape the floor was specified. And the good news for schools was that it was quite a bit lower than 100.

The July consultation paper http://bit.ly/1o4zXX1 said: “We expect the value-added score required to be above the floor to be between 98.5 and 99 (a value-added score of 100 represents average progress)”.

Applying VA thresholds of 99 or 98.5 to the current data would, as you might expect, lift many more schools above the floor.

So, I make it that, on 2013 data, if a VA score of 99 were required to lift a school above the floor, then 2,291 schools would remain below it once the progress element of the floor targets was calculated, or 18 per cent of the total. If only 98.5 on VA were needed, 1,037 schools would in the end be below the floor, or 8 per cent.
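The sensitivity of the headline figure to where the VA threshold is set can be sketched like this, again with invented data (the real calculation runs over roughly 14,300 schools):

```python
# (attainment %, value-added score) for a handful of invented schools.
# The fourth school clears the 85% attainment element, so it can
# never fall below the floor regardless of its VA score.
schools = [(70.0, 99.6), (60.0, 98.8), (80.0, 98.2), (90.0, 97.0)]

def below_floor_count(va_threshold):
    # Below the floor only if BOTH the attainment element (< 85%)
    # and the progress element (VA below the threshold) are missed.
    return sum(1 for pct, va in schools
               if pct < 85.0 and va < va_threshold)

for threshold in (100.0, 99.0, 98.5):
    print(threshold, below_floor_count(threshold))
```

Lowering the threshold from 100 to 98.5 halves and then halves again the count even in this toy example, which is broadly the pattern the real data show.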

I am interested as to why the specific pledge of a VA figure of 98.5 to 99 in last July’s document had been removed by the time the latest proposals were published in March. The March document says that, for 2016, the new system’s first year of operation, the DfE will set the precise value-added score required to be above the floor “once key stage 2 tests have been sat for the first time”. This suggests, perhaps, that the new test system may be creating some uncertainty, and that the exact VA threshold will not be known until 2016.

In any case, a figure of up to 18 per cent of schools being below target, based on the suggested VA thresholds published last July, still represents a substantial proportion of the total. By comparison, by my rough calculations around 860 schools, or 6 per cent of the total, were below the government’s actual floor targets for 2013.

Unless pupil performance changes, this, then, will be the position when and if these new standards are introduced, with many schools which have pupils sitting KS2 tests seemingly being at risk of “intervention”.(Note 9)

I wondered, also, to what extent the “progress” element of the new target will soften it for schools with challenging intakes, as certainly seems to be the spin from the Liberal Democrat element of the coalition, which seems to have led on this policy.

It is clear that any “raw” attainment measure is likely to favour schools with fewer disadvantaged pupils. But I wondered whether this would also be true of the “progress” measure, even though it assesses not “raw” attainment of pupils at the end of key stage 2, but just how far they had moved, given their starting points at the end of key stage 1.

Would it still be more difficult even to demonstrate progress for pupils, in a school with many disadvantaged pupils, I wondered?

Well, the good news for those with perhaps more challenging intakes seems to be that schools with high numbers of free school meals pupils don’t appear, on average, to struggle statistically to demonstrate success on the progress measure, as far as I can see: the number of schools which currently score above average on key stage 1-2 value added and have relatively high proportions of FSM pupils seems to be about the same as the number with above-average FSM proportions but below-average value added.

This means that, by my very quick calculations, the numbers escaping the floor targets overall will not be very strongly skewed towards those schools which have few disadvantaged pupils, I think. (Note 10)

But, of course, overall the implications of a substantial proportion of schools being poised to fall below these new “minimum” targets are manifold.

One view, which I tend to find persuasive, is that the height of this floor suggests this is all about politics. Setting such a target, which hundreds if not thousands of schools are going to find extremely difficult to achieve, is a kind of win-win for both the Department for Education itself and its ministerial masters.

For the department, it is a mechanism by which the government’s intervening reach can be extended dramatically, against the background that still only a small minority of primary schools have so far chosen to jump towards academy status.

For ministers, it helps them achieve their goal of pursuing the academies scheme as a political project. On this argument, this dominant aspect of education policy-making of recent years hasn’t really been about improving provision for pupils, but about pursuing purely political ends, such as the sidelining of local education authorities as a goal in itself.

Ministers and political strategists could counter, I guess, that there is a general political dividend to at least being seen to be setting high standards for schools and pupils. That may be true, but of course the only important question, from the point of view of actually doing the right thing by pupils, is whether it will work.

And here, for me there must be reason for serious doubt. Back in 2003, an official evaluation for the Labour government of its literacy and numeracy strategies – and associated accountability regime – warned that the long-term target of the day, requiring 85 per cent (that figure again) of 11-year-olds to achieve level 4 in English and maths, might backfire.

It said: “We caution that setting ever-higher national targets may no longer serve to mobilise and motivate, particularly if schools and local education authorities see the targets as unrealistic.” (See http://bit.ly/1pqTcd1 ) (Note 11)

If these targets serve to make teaching, and particularly school leadership, more stress-inducing and therefore less attractive for many more professionals, will they really make schools better places for their pupils? I think our target-driven, school-by-school accountability system has done more harm than good to English education over the past 20 years, so you would expect me to be sceptical about the merits of this particular ratcheting-up of the policy.

Setting targets high may seem both an attractive sell to the electorate for politicians and, on this occasion, may also suit their policy purposes. But the long-term implications of this project, if indeed it is implemented from 2016, seem questionable.

 

Note 1: …because calling them “targets” no doubt seemed too redolent of Gordon Brown for this government, even though in concept they are the same as those introduced under Labour.

Note 2: I am concentrating, in this blog, on the system which the DfE proposes to introduce from 2016. Longer term, of course, it wants to assess pupil progress, at least for all-through primary schools, against new “baseline” assessments to be taken by reception children. But as the first of these assessments won’t actually be used as accountability measures, if at all, until 2023, it seems safe not to worry too much about them in this particular analysis.

By the way, in terms of a definition of the proposed new progress measure, the DfE publication says: “The proposed progress measure will be based on value-added in each of reading, writing and mathematics. Each pupil’s scaled scores in each area at key stage 2 will be compared with the scores of pupils who had the same results in their assessments at key stage 1.” The VA average is 100.

Note 3: It’s just level 4 in the writing TA, I think because that’s as fine-grained as the system currently goes. So the post-2016 floor standard might actually be a little tougher than my statistics would suggest, if they do manage to set this level precisely enough to show that the new standard is equivalent to “4b”. In other words, if that does happen, more schools may actually be below target, given current levels of performance, than my statistics in this piece would suggest.

Note 4: I must admit that my work looking at schools by the proportion they have of disadvantaged pupils has been particularly rough-and-ready. I’ve simply taken all the primary schools which have published data in the 2013 league tables, ie a published figure, above zero per cent, on the “percentage of pupils achieving level 4b in reading and maths tests and level 4 in writing TA”, and which also have an overall VA score published on the indicator I mention above. I look at FSM figures for these schools using the DfE measure giving the percentage of “disadvantaged pupils” in each school, with disadvantaged pupils defined as “those who were eligible for free school meals in the last 6 years or are looked after by the LA.”

Note 5: Technically, the median.

Note 6: In other words, schools are assessed on this measure by looking at the starting points of their pupils at the end of KS1 and then asking whether their results are better or worse than schools with pupils at similar starting points. A school with a value added score of above 100 does better than average, given the starting points of its pupils; one with less than 100 does worse than average.

Note 7: In between, I make it there are 475 schools scoring below 85 per cent on the attainment measure but exactly 100, ie exactly average, for value added.

Note 8: The total number of 14,298 I use as the denominator in this calculation is lower than the 14,493 I started with in assessing whether or not a school achieved the attainment target, as I have taken out a small number of schools which did not have useable value added data from the calculations.

Note 9: 2016, of course, is after the next general election so a new government would need to decide whether to persist with them.

Note 10: This is partly, I think, because of the way the maths works, with the progress measure, which particularly helps high-FSM schools, moving many more schools above the overall floor standard than does the raw attainment measure, even on the assumption that the progress threshold is set at 100, but it needs much more analysis.

Note 11: This 2008 National Audit Office report, which could find no quantitative beneficial effect of closure threats on schools, would also seem to make interesting reading: http://bbc.in/1ktjHvB .

 

Page published: 04 June 2014