On 19 January 2017 the Daily Telegraph published the following article online. The article has since been largely rewritten; I have commented on the original version. My commentary is interleaved with the article text below (in the original post it appeared in red).
There are several things that worry me here. There are so many errors and misunderstandings, and that isn’t good. The biggest worry is that I tend to trust the press. My default position (and I suspect that of many other people) is to believe what newspapers say (not just this newspaper but others too). If there are this many misrepresentations and errors in an article about a subject I know a little about, what are the chances there are errors in other coverage too? Can I believe anything I read … ?
Secondary school league
tables 2016: Over 1,500 schools are falling behind, figures show
19 JANUARY 2017 • 12:33PM
More than 1,500 schools are falling behind,
according to the Government's new progress measure, official league tables
released today by the Department for Education (DfE) show.
Almost a quarter of a million pupils monitored
under the Government’s new GCSE ranking system – called Progress 8
– are at schools which were given a negative rating, meaning they are
performing below the national average.
Progress 8 has been created so that the total Progress 8 score for all
pupils across the country is zero. The
average score per pupil is therefore also zero.
When this is transferred to school level things are not entirely
straightforward. We shouldn’t expect
there to be exactly the same number of schools above and below the average, but
this is roughly the case. [The discrepancy is down to the different sizes of
schools, the fact that special schools tend to be smaller and that Independent
schools are not included within the figures.
To give a simplified example: suppose all of the pupils in the country are in two schools, one of which has a positive Progress 8 score and the other a negative one. Half the schools are below average. If you then split the pupils from the school with the negative score into two separate schools, you could have one big school with a positive score and two small schools with negative scores. Then there would be twice as many schools below average as above average.]
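To make that concrete, here is a minimal sketch in Python (with invented scores, not real data) showing how the pupil-level average can be exactly zero while two of the three schools sit below it:

    # Invented pupil-level P8 scores for the simplified example above.
    pupil_scores = {
        "Big School":     [0.5] * 100,   # 100 pupils, +0.5 each
        "Small School A": [-0.5] * 50,   # 50 pupils, -0.5 each
        "Small School B": [-0.5] * 50,   # 50 pupils, -0.5 each
    }

    # The national (pupil-level) average is zero by construction.
    all_pupils = [s for scores in pupil_scores.values() for s in scores]
    print(sum(all_pupils) / len(all_pupils))   # 0.0

    # But two of the three schools have a below-average score.
    for school, scores in pupil_scores.items():
        print(school, sum(scores) / len(scores))
    # Big School 0.5, Small School A -0.5, Small School B -0.5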
My reading of the numbers is that in 1616 schools the pupils averaged a Progress 8 score above zero, in 48 schools the pupils averaged a score of exactly zero, and in 1994 schools they averaged a score below zero.
There are three issues so far in the article. One is that it is bonkers to complain that
lots of schools are “below the national average”. The way averages work is that some are above
and some are inevitably below. The only
way to avoid having anyone below average is for every single school to be
exactly average.
The second is that I can’t get “1,500 schools” from the figures! 1994 schools were below zero (but it would seem odd to use “more than 1,500 schools” to describe 1994 schools). Maybe they are referring to the 1598 schools scoring -0.1 or below. But this is not a sensible cut-off, because the government has said Ofsted will investigate schools whose figures are lower than an arbitrarily chosen -0.5, of which there are 705.
The third issue is that it is nonsensical to make a big deal about “Almost
a quarter of a million pupils” attending schools where the average Progress 8
score is negative. The key thing is the
_individual pupils_ whose P8 score is negative.
If a child gets a negative score it doesn’t matter whether they are at a
school where lots of other people did the same or not.
Free Schools, which were introduced by former Education Secretary Michael Gove in 2010, were proportionately the worst performing schools in the state sector.
In total, 84 per cent of free schools were given a
negative rating for progress. Academies performed comparatively well,
with the majority (57 per cent) rated as above the national average.
If you look at the figures for every Free School, there are 13 with a positive P8 score and 71 with a negative score: 71 out of 84, which is 84.5% negative.
There are some important nuances that have been missed here though. First of all, this is only 84 schools out of
227 Free Schools. The rest have not yet
got to the stage where pupils are in Year 11, so only 37% of Free Schools are
included in the figures.
Then let’s look at the different types of school. Free Schools include special schools, Studio Schools and UTCs (University Technical Colleges). It doesn’t seem fair to put all of these together and then to compare them to Academies. For example, if we look at all special schools (whether Free Schools or not) we find that 406 of them had negative scores and 2 of them didn’t. This is for sensible and understandable reasons, which I won’t go into here. The point is, though, that the mix of school types that makes up ‘Free Schools’ is not the same as the mix that makes up ‘Academies’.
In fact, special schools were only a small fraction of the Free Schools, but UTCs are very different from mainstream schools and seem to have their own particular challenges. Again, it seems unfair to compare them to Academies. I didn’t know anything about Studio Schools so I looked them up. Their website states that the “Studio Schools curriculum moves away from traditional methods of subject delivery with the curriculum delivered principally through multidisciplinary Enterprise Projects in the school and surrounding community”. It also previously mentioned that students “work towards GCSEs in English, Maths and dual award Science as a minimum”. If a school has some students who only work towards GCSEs in English, Maths and double Science, then it is unsurprising that its P8 scores are low. (I pass no judgement on Studio Schools – I am only pointing out that their curriculum requirements are not closely aligned to Progress 8.)
The figures also revealed a clear north-south divide, with more than half of the ten worst local education authorities (LEAs) in terms of student progress situated in the north-west of England.
The league tables showed that every single school
in Knowsley, Merseyside, was failing, as were 90 per cent of schools in Redcar
and Cleveland, North Yorkshire.
It might well be reasonable to look at exam results in different areas of the country, but again there are other confounding issues. For example, if it happens that there are more Pupil Premium (PP) pupils in certain parts of the country then those areas’ P8 scores may well be lower.
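To illustrate with invented numbers: two areas in which each group of pupils does equally well can still end up with different area averages, purely because the proportion of PP pupils differs.

    # Toy example: identical within-group P8 scores, different pupil mix.
    # The score values and pupil counts here are invented for illustration.
    def area_average(n_pp, n_other, p8_pp=-0.4, p8_other=0.1):
        total = n_pp * p8_pp + n_other * p8_other
        return total / (n_pp + n_other)

    print(area_average(200, 800))   #  0.0   (20% PP pupils)
    print(area_average(500, 500))   # -0.15  (50% PP pupils)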
It isn’t right to describe schools that are below average as “failing”.
Meanwhile, the five best performing
areas for pupils achieving 5 A*-C including English and maths were
all in London, with Hackney, Kingston upon Thames, Kensington and Chelsea,
Barnet and Westminster at the top of the league table.
Nick Gibb, the schools standard minister, said the
figures “confirm the hard work of teachers and pupils across the country is
leading to higher standards”.
The implication in this article is that Nick Gibb’s quote refers to the Progress 8 figures. Either this is wrong and Mr Gibb was actually talking about other figures, or Mr Gibb is wrong. If you have a measure like P8 which has zero as its average then it is, definitionally, a ‘zero-sum game’: because the scores are centred on the national mean, any across-the-board improvement (or decline) is invisible. You have no way of telling from P8 whether standards are going up, down or staying the same.
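Here is a deliberately simplified sketch of the problem (the real P8 calculation benchmarks each pupil against others with similar Key Stage 2 starting points, but the re-centring effect is the same):

    # Everyone's raw attainment improves between the two years, but after
    # re-centring on the national mean the published scores are identical.
    def recentre(scores):
        mean = sum(scores) / len(scores)
        return [s - mean for s in scores]

    year_1 = [40, 50, 60]   # raw attainment in year 1
    year_2 = [50, 60, 70]   # every pupil improves by 10 points in year 2

    print(recentre(year_1))   # [-10.0, 0.0, 10.0]
    print(recentre(year_2))   # [-10.0, 0.0, 10.0] -- no visible change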
This year, nine of the top 10 schools for GCSE results were state schools, with the country's best three state schools all situated in north London. Henrietta Barnett School, a girls' grammar school, scored 100 per cent for students obtaining 5 A*-Cs.
Queen Elizabeth's School – a boys' grammar
school – came second, with 100 per cent of students gaining 5 A*-Cs, and St
Michael Catholic School, a grammar school for girls, came third.
Independent schools performed poorly in the tables, with 62 per cent of them coming in the bottom third of schools for 5 or more grades A*-C including English and maths.
This is due to the fact that many independent
school students sit IGCSEs, which are not officially recognised by the
Government.
This isn’t getting better. The school that came first scored 100% (we are now talking about pupils gaining 5 A*-C grades rather than Progress 8). The school that came second also scored 100%. It is difficult to see how they have been separated.
This year is the first time that the Government’s
new measure for attainment in GCSEs, called Progress 8, has been used to rank
schools. It measures students’ progress in eight subjects from primary school
through to secondary school.
The eight “core” subjects measured by Progress 8
are English, maths, history or geography, the sciences, and a language.
Um, no they aren’t. This confuses the EBacc subjects with Progress 8. While English and Maths are part of P8, pupils need three more subjects from the EBacc list and then any three further subjects (which can include the other English qualification that many pupils take, and can include additional EBacc subjects).
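As a rough sketch of the slot structure as I understand it (simplified: English and Maths are double-weighted, three slots must be filled from EBacc subjects and three are open; the subject list below is indicative rather than exhaustive, and the real rules about qualifying qualifications are more detailed):

    # Indicative (not exhaustive) list of EBacc subjects for this sketch.
    EBACC = {"biology", "chemistry", "physics", "combined science",
             "history", "geography", "french", "german", "spanish"}

    def attainment_8(grades):
        """Fill the 8 slots: English and Maths (double-weighted),
        the best 3 EBacc subjects, then the best 3 remaining subjects."""
        g = dict(grades)
        core = [2 * g.pop(s, 0) for s in ("english", "maths")]
        ebacc_best = sorted((s for s in g if s in EBACC),
                            key=g.get, reverse=True)[:3]
        ebacc = [g.pop(s) for s in ebacc_best]
        open_slots = sorted(g.values(), reverse=True)[:3]
        return sum(core) + sum(ebacc) + sum(open_slots)

    pupil = {"english": 6, "maths": 7, "history": 5, "biology": 6,
             "french": 4, "art": 7, "music": 5}
    print(attainment_8(pupil))   # 53 = (12 + 14) + (6 + 5 + 4) + (7 + 5)

A pupil’s Progress 8 score is then, roughly, this Attainment 8 total compared with the average total achieved nationally by pupils with similar Key Stage 2 results.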
Mr Gibb said: “As well as confirming that the
number of young people taking GCSEs in core academic subjects is rising,
today’s figures show the attainment gap between disadvantaged and all other
pupils has now narrowed by 7 per cent since 2011.
“Under our reforms there are almost 1.8 million
more young people in good or outstanding schools than in 2010, and through our
new, fairer Progress 8 measure we will ensure that even more children are
supported to achieve their full potential.”
Here is the URL for the article. As mentioned earlier, it has changed significantly since I copied and pasted the original version. There is no official acknowledgement that the article has been rewritten (although the timestamp given on the article is different).