Wednesday, November 22, 2017

£600 for post-16 mathematicians

The government initially seemed keen to make a success of Core Maths.  They put some very good people onto promoting and supporting it (David Burghes, Mick Blaylock, Paul Glaister, the CMSP group) and appeared to have nurtured this new course well.

Then it became clear that many sixth forms would not be able to run Core Maths because it isn't funded in the same way as A-levels and they would not be able to afford to teach it.

In the budget today (22 Nov 2017) the Chancellor announced the following (taken from gov website):


Let's take the case of Core Maths.  One period per week across a whole year costs a school about £2000.  If a Core Maths class has 3 lessons per week over two years then that will be 3 x 2 x £2000 = £12,000.

Now let's divide that by the £600 per extra pupil taking the qualification:
£12,000 / £600 = 20.

So, a class of 20 extra students doing Core Maths will fund itself.
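
For anyone who wants to play with the figures, here is a minimal sketch of that break-even calculation (the £2,000 per period per year and £600 per extra pupil are the figures assumed above; everything else follows from them):

```python
# Break-even calculation for running a Core Maths class, using the
# figures assumed above: ~£2,000 per weekly period per year, 3 lessons
# per week for 2 years, and £600 of extra funding per additional pupil.

cost_per_period_per_year = 2000    # £ for one lesson per week across a year
lessons_per_week = 3
years = 2
extra_funding_per_pupil = 600      # £ announced in the Budget

total_cost = lessons_per_week * years * cost_per_period_per_year    # £12,000
break_even_pupils = total_cost / extra_funding_per_pupil            # 20.0

print(f"Total cost of the course: £{total_cost:,}")
print(f"Extra pupils needed to break even: {break_even_pupils:.0f}")
```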

In the case of Core Maths then, this essentially sorts out the funding required to run the qualification.

And working this out could form the basis of a Core Maths lesson!

Friday, October 20, 2017

A Geographer Does Core Maths

A couple of years ago I made a presentation to geography teachers about Core Maths.  (One motivation for doing Core Maths is to support the statistical work that A-level geography students need to do.)

My colleague Scott, head of the geography department, has admitted that he now “does core maths” in real life!  When watching Scotland fail to qualify for the World Cup finals he heard the crowd singing “I’d walk a million miles for one of your goals”.  As any self-respecting Core Mathematician should, he wondered how long it would take to walk a million miles.

Scott reckoned it would be sensible to walk 30 miles a day.  He then told me it would take 92 years to walk a million miles.  What’s nice is that I can use this with a core maths class.  I could give them these figures and ask whether they are reasonable.  Or I could play them the chant and ask for their thoughts: they might create a different question.  The students could also decide on their own ‘daily-distance’ to walk.  They might decide that it’s reasonable for people to walk long distances from the age of, say, 15 up to the age of 65 and could work out how far you would need to walk each day to cover the million miles.  (About 55 miles per day – that feels like a lot!)
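
For anyone who wants to check Scott’s figures, here is a minimal sketch of the arithmetic (assuming, as above, that the walking happens every single day of the year):

```python
# Checking the 'million miles' figures from the post (30 miles a day,
# walking every day of the year).

MILES_TO_WALK = 1_000_000

miles_per_day = 30
days_needed = MILES_TO_WALK / miles_per_day       # about 33,333 days
years_needed = days_needed / 365.25               # about 91.3 years
print(f"At {miles_per_day} miles a day: roughly {years_needed:.0f} years")

# The reverse question: walking every day from age 15 to 65 (50 years),
# how far would you need to go each day?
walking_years = 65 - 15
daily_distance = MILES_TO_WALK / (walking_years * 365.25)   # about 55 miles
print(f"Over {walking_years} years: roughly {daily_distance:.0f} miles a day")
```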

Over at the Quibans website there are lots of ‘Questions Inspired By A News Story’.  They work in a similar way to this, with some sort of stimulus (usually from a newspaper or news website) and then some related questions to answer.  This is ideal for Core Maths and can be used to work on problem-solving as well as to practise mathematical skills (such as percentages) in different contexts. 

The tasks on the Quibans site can usually be projected in class.  This question has been set up as a Quibans and can be found here.  Do explore the rest of the site too.


Monday, August 14, 2017

How I learned some Spanish (or ‘what I did in my summer holiday’)

I recently spent a fortnight in Valencia attending a language school for 4 hours each morning.  This gave me afternoons to spend on the beach, opportunities to meet other language learners from around the world and the chance to work hard on learning some Spanish. 

This is the first time in a long time that I have been a classroom learner; here are some reflections on things that helped me to learn.

Positive mindset

I had really enjoyed languages when I was at school and about 20 years ago I picked up some smatterings of Croatian and Albanian during summers spent in those countries, so I was fairly confident I would be able to make a successful start on learning some Spanish.  (Let’s be clear: I am still very much a beginner, but I have learned lots in two weeks.)  I therefore had some self-belief that while I didn’t know Spanish I would be able to learn some.

I was also very keen to have a go.  During the lessons I took every opportunity to talk and to try things in Spanish (to the extent that when, as we left Valencia, an Englishman at the airport was having trouble understanding the security personnel, I wanted to explain to him in simplified Spanish what they wanted him to do rather than just using English!).  As I walked to the language school each morning I rehearsed what I wanted to say, how to introduce myself to new members of the class, etc.  Essentially: I worked hard.

I was not afraid to make errors.  I tried things out, recycled language and structures we had used in previous lessons, listened to the corrections from the teacher but didn’t worry about making mistakes.  I did make lots of mistakes!  Near the end of each lesson the teacher would pick out three or four errors that had been made during the lesson, write them up on the board and ask us to correct them.  The vast majority of them were errors I had made.

Making links

One of the things I really enjoyed was making links.  Some of these links were within the language of Spanish.  For example, knowing that the suffix ‘-ito’ or ‘-ita’ often means a small version of something was useful.  When we met ‘mesita’ (a bed-side table) this was a small version of a ‘mesa’ (a table) and helped me to remember the word.  I know what a ‘mosquito’ is (and the word is the same in Spanish), so it was rather neat that a fly is a ‘mosca’, meaning that ‘mosquito’ is really just ‘small fly’. 

Other links were across different languages.  There were lots of connections with French (which I took for GCSE years ago).  My favourite link was comparing hair and horse.  In French hair is ‘cheveux’ and horses are ‘chevaux’ (change an ‘e’ to an ‘a’).  In Spanish hair is ‘cabello’ and horse is ‘caballo’ (make the same change!).

Using experts

A particular advantage was the number of experts I had available to me.  During the lessons I had my teacher and the invaluable apps ‘Word Reference’ and ‘Spanish Verbs’.

During the rest of the day I could ask questions of my wife and son.  My wife is fluent in Spanish and my son is approaching GCSE, and both were happy to correct me as I haltingly spoke to them in Spanish.

Revisiting the content

I tried to force myself to reuse as much previous language as possible during the lessons.  This was very helpful.

I made notes during the lessons and found these very helpful.  Each evening I revisited what we had done that morning and also flipped back to previous lessons to keep things current in my mind.  As we walked around town I tried to recall and use some Spanish too.

Some of the grammar I wrote out in full and explicitly set out to learn (such as irregular verbs).

I realised the importance of revisiting the language partly through the mistakes I was making.  When the teacher wrote up the ‘errors of the day’, I could always correct them immediately, but would still make the same errors when I was talking.  I put this down to focusing on (or panicking about!) the word or idea I would need at the end of the sentence I was saying, which meant I made some silly mistakes earlier in the sentence.  For example, when I wanted to talk about the three walruses we had seen at the Aquarium I was trying to recall the word for a walrus, so I used the wrong expression for ‘there are’ because my focus was elsewhere.  (It should be ‘hay tres morsas’.)

Using the language

One of the lovely things about the lessons was having the opportunity to work with so many talented, interested and interesting other people.  This gave rise to lots of banter and the opportunity for creativity.  For example, after the verb ‘to milk a cow’ cropped up (‘ordeñar’, since you ask) we used it at every opportunity. 

Outside lessons I looked for real Spanish wherever I could find it.  Shopping in the supermarket was an opportunity to revise vocab for fruit and veg, adverts were often easy to understand, and athletics commentary on TV included snippets I could recognise.  Even though football commentators spoke at a blisteringly fast speed they repeated a few phrases frequently, so I could even get some of what they were saying.

In the Aquarium (which is excellent – during August it is open until midnight) there were lots of boards with explanatory text.  This was in Spanish and in English.  I spent as much time reading the Spanish, translating it and then checking my understanding against the English version, as I did looking at the marine animals. 

Recognising gaps

I was able to recognise the gaps in my knowledge and treated these in one of two ways.  Some of them I wanted to deal with.  This was particularly true for vocabulary that I wanted to use.  Knowing the basic word for a particular thing in Spanish is often straightforward.  The word for a shark (tiburón) is no more difficult than the word for a table, so why not learn it if and when I want to use it?

With some aspects of grammar it was a different story and I deliberately left big new grammar issues, knowing that they would probably be too much for me at that time.  For example, I can use the present tense of verbs confidently and can express what I am going to do (a sort of version of the future, but one which uses the verb ‘ir’ (to go) in the same way we use it in English – such as “this afternoon I am going to go to the beach”).  I can’t use any version of the past tense yet.  Knowing that there are at least four versions and that they all work differently and have new things to learn meant I thought I should focus on other areas first.  So, when there was something I didn’t know how to say but wanted to use, it was useful for me to know whether it was just a case of a missing word, or whether it would take a couple of hours of learning some new grammar.

To be continued …

I am determined to continue to learn Spanish even after term has started.  I will need to think hard about how to do this effectively.


One reason for writing this blog is so I can reflect, in a week or so, about the implications for teaching and learning mathematics.  (I might write about that at some point too.)

Sunday, July 09, 2017

When is a semi-colon not a semi-colon?

In 2007 I was a marker for the KS3 mathematics tests.  One of the questions asked pupils to work out three angles and to give reasons for each one (such as ‘angles in a triangle add up to 180 degrees’).  There were three marks for the question, with one mark for getting the correct angle and the correct reason each time.  The marking guidance clearly stated that three correct angles without any reasons would result in zero marks. 

I thought this harsh, but I tell my classes about it every year when doing anything angle-related.  It doesn’t seem right to me that a pupil should get no marks at all despite knowing the angle facts (to get all three angles correct they must have known them, even if they didn’t write them down).  It seems to me that this approach is justifiable, though.  If the question is really testing whether pupils can write down angle reasons, with the numbers there as support to enable them to do this, then it is justifiable that they wouldn’t get marks for merely writing correct numbers.  (I disagree with this approach, but it is justifiable!)

The following year (the last one before the KS3 tests were scrapped) I looked through the mark scheme.  A very similar question did award some marks for correctly calculating angles, with full marks for those who also wrote reasons. 

I recalled this because of the recent issues with the marking of KS2 SATS, where a correctly placed semi-colon has been marked as wrong.

[From TES article.]

First of all, does this matter?  It’s only one mark on a test, so surely that isn’t a big deal? Unfortunately it does matter.  These test results form part of schools’ accountability measures and if the data for these accountability measures is wrong then that is manifestly unfair.  Pupils’ results are carried forward to secondary schools and this can affect their start to Year 7.  From a secondary school teacher point of view, our major accountability measure is now Progress 8 and this is based on KS2 test results (currently given to 0.1 of a level under the old system).  If the KS2 results are inaccurate then it gives the wrong baseline for some pupils and an inaccurate Progress 8 score, which again is unfair.  (I am not interested in ‘gaming the system’, whereby artificially depressed KS2 scores make it easier for a secondary school to show value-added progress.  I want a fair system.)

Various government ministers have been very clear that they are using the new SATS as a way to force teachers to teach pupils things that haven’t been taught in the past.

It is therefore absolutely right for KS2 teachers to use the marked scripts of their Y6 pupils, along with marking guidance, to influence their future teaching.  This is partly so their pupils can do better in the tests (for the sake of the individual pupil and for the sake of the school’s results) and partly because the government wants pupils to know this stuff and to be able to answer these questions correctly. 

Today some further guidance has been doing the rounds.  I find it hard to believe this is genuine in this form.  Surely official marking guidance doesn’t include clip-art?  This does, however, explain why these answers have been marked as wrong.

I often find Hanlon’s Razor useful: "Never attribute to malice that which is adequately explained by stupidity".
Maybe there are just some rogue markers out there and maybe the checking system isn’t robust enough to pick up the errors they are making. 

Alternatively, like the KS3 tests I marked ten years ago, maybe the intention is for the marks to be given only for certain things.

The semi-colon question could be testing whether
  1. pupils know what a semi-colon looks like
  2. pupils know where to use a semi-colon
  3. pupils can write a semi-colon that is exactly the right height and slope, in a space between two words that isn’t big enough to hold it

The third of these seems to be what is causing the problem (if you don’t go with the Hanlon’s Razor idea).  It might be possible to argue that children mustn’t write punctuation that is too big for the sentence.  If your semi-colon dot is above the level of the letters around it then it is the wrong size.

But this is unfair;with the correctly placed semi-colon there is no longer a gap after the punctuation.  And that is wrong too.

The final issue here is how teachers will guide their pupils in future.  Will KS2 teachers now have to correct the height of semi-colons in pupils’ work?  Perhaps that is reasonable, in the same way they might correct malformed letters.  It does mean that the pupils will either need to be told explicitly that punctuation should be the correct size when completing the KS2 tests or will need to be given work that involves them placing semi-colons so this can be marked as incorrect if the semi-colons are too big.  Neither of these things seems sensible!

I am sure that KS2 teachers will find a way to do this well, but it seems so wrong that it is deemed to be necessary. 

Friday, January 20, 2017

Progress 8 - newspaper nonsense

On 19 January 2017 the Daily Telegraph published the following article online.  The article has since been largely rewritten.  I have commented on the original version.  The text in red is mine.

There are several things that worry me here.  There are so many errors and misunderstandings and that isn’t good.  The biggest worry is that I tend to trust the press.  My default position (and I suspect that of many other people) is to believe what they say (not just this newspaper but others too).  If there are this many misrepresentations and errors in an article about a subject I know a little about, what are the chances there are errors in other coverage too?  Can I believe anything I read … ?

Secondary school league tables 2016: Over 1,500 schools are falling behind, figures show
19 JANUARY 2017 • 12:33PM
More than 1,500 schools are falling behind, according to the Government's new progress measure, official league tables released today by the Department for Education (DfE) show.
Almost a quarter of a million pupils monitored under the Government’s new GCSE ranking system – called Progress 8 – are at schools which were given a negative rating, meaning they are performing below the national average.
Progress 8 has been created so that the total Progress 8 score for all pupils across the country is zero.  The average score per pupil is therefore also zero.  When this is transferred to school level things are not entirely straightforward.  We shouldn’t expect there to be exactly the same number of schools above and below the average, but this is roughly the case. [The discrepancy is down to the different sizes of schools, the fact that special schools tend to be smaller and that Independent schools are not included within the figures.  To give a simplified example: suppose all of the pupils in the country are in two schools, one of which has a positive Progress 8 score and the other a negative one.  Half the schools are below average.  If you then split the pupils from the school with the negative score into two different schools you could have one big school with a positive score and two small schools that have a negative score.  Then there would be twice as many schools below average as there are above average.]
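To make the bracketed example concrete, here is a tiny illustrative sketch with made-up scores (chosen so that the pupil-level total is zero, as Progress 8 requires):

```python
# Illustrative only: splitting a below-average school into two smaller
# schools changes the count of below-average schools without changing
# any individual pupil's score.

school_A = [0.5] * 100      # 100 pupils with positive P8 scores
school_B = [-0.25] * 200    # 200 pupils with negative P8 scores
assert sum(school_A) + sum(school_B) == 0   # national total is zero

def average(pupils):
    return sum(pupils) / len(pupils)

# Two schools: one above zero, one below, so half are "below average".
print([round(average(s), 2) for s in (school_A, school_B)])   # [0.5, -0.25]

# Split school B into two smaller schools: now two thirds are below zero.
B1, B2 = school_B[:100], school_B[100:]
averages = [average(s) for s in (school_A, B1, B2)]
below = sum(1 for a in averages if a < 0)
print(f"{below} of {len(averages)} schools are below zero")   # 2 of 3
```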
My reading of the numbers is that in 1616 schools the pupils averaged a Progress 8 score above zero, in 48 schools the pupils averaged a score of zero and in 1994 schools they averaged a score below zero. 
There are three issues so far in the article.  One is that it is bonkers to complain that lots of schools are “below the national average”.  The way averages work is that some are above and some are inevitably below.  The only way to avoid having anyone below average is for every single school to be exactly average. 
The second is that I can’t get “1500 schools” from the figures!  1994 schools were below zero (but it would seem odd to use “more than 1500 schools” to describe 1994 schools).  Maybe they are referring to the 1598 schools that scored -0.1 or below.  But this is not sensible, because the government has said Ofsted will investigate schools whose figures are lower than an arbitrarily chosen -0.5, of which there are 705.
The third issue is that it is nonsensical to make a big deal about “Almost a quarter of a million pupils” attending schools where the average Progress 8 score is negative.  The key thing is the _individual pupils_ whose P8 score is negative.  If a child gets a negative score it doesn’t matter whether they are at a school where lots of other people did the same or not.
Free Schools, which were introduced by former Education Secretary Michael Gove in 2010, were proportionately the worst performing schools in the state sector.
In total, 84 per cent of free schools were given a negative rating for progress.  Academies performed comparatively well, with the majority (57 per cent) rated as above the national average.
If you look at the figures for every school that is a Free School then there are 13 with a positive P8 score and 71 with a negative score, which is 84.5% negative. 
There are some important nuances that have been missed here though.  First of all, this is only 84 schools out of 227 Free Schools.  The rest have not yet got to the stage where pupils are in Year 11, so only 37% of Free Schools are included in the figures.
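(As a quick check, the percentages quoted in these two paragraphs follow directly from the published counts:)

```python
# Quick check of the Free School figures quoted above.
positive, negative = 13, 71
in_tables = positive + negative        # 84 Free Schools appear in the tables
all_free_schools = 227

print(f"{negative / in_tables:.1%} of those in the tables had a negative score")    # 84.5%
print(f"{in_tables / all_free_schools:.0%} of all Free Schools are in the figures") # 37%
```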
Then let’s look at the different types of school.  Free Schools include special schools, Studio Schools and UTCs.  It doesn’t seem fair to put all of these together and then to compare them to Academies.  For example, if we look at all special schools (whether Free Schools or not) we find that 406 of them had negative scores and 2 of them didn’t.  This is for sensible and understandable reasons, which I won’t go into here.  The point is, though, that the mix of school types that makes up ‘Free Schools’ is not the same as the mix that makes up ‘Academies’. 
In fact, Special schools were only a small fraction of the Free Schools, but UTCs are very different from mainstream schools and seem to have their own particular challenges.  Again, it seems unfair to compare them to academies.  I didn’t know anything about Studio Schools so I looked them up.  Their website states that the “Studio Schools curriculum moves away from traditional methods of subject delivery with the curriculum delivered principally through multidisciplinary Enterprise Projects in the school and surrounding community”.  It also previously mentioned that students “work towards GCSEs in English, Maths and dual award Science as a minimum”.  If as a school you have some students who only work towards GCSEs in English, Maths and double Science then it is unsurprising your P8 scores are low.  (I pass no judgement on Studio schools – I am only pointing out that their curriculum requirements are not closely aligned to Progress 8.)
The figures also revealed a clear north-south divide, with more than half of the ten worst local education authorities (LEAs) in terms of student progress situated in the north-west of England.
The league tables showed that every single school in Knowsley, Merseyside, was failing, as were 90 per cent of schools in Redcar and Cleveland, North Yorkshire.  
It might well be reasonable to look at exam results in different areas of the country, but again there are other confounding issues.  For example, if it happens that there are more Pupil Premium (PP) pupils in certain parts of the country then those areas’ P8 scores may well be lower. 
It isn’t right to describe schools that are below average as “failing”.
Meanwhile, the five best performing areas for pupils achieving 5 A*-C including English and maths were all in London, with Hackney, Kingston upon Thames, Kensington and Chelsea, Barnet and Westminster at the top of the league table.
Nick Gibb, the schools standard minister, said the figures “confirm the hard work of teachers and pupils across the country is leading to higher standards”.  
The implication in this article is that Nick Gibb’s quote refers to the Progress 8 figures.  Either this is wrong and Mr Gibb was actually talking about other figures, or Mr Gibb is wrong. 
If you have a measure like P8 which has zero as its average then this is, definitionally, a ‘zero-sum game’.  You have no way of telling whether standards are going up, down or staying the same.
This year, nine of the top 10 schools for GCSE results were state schools, with the country's best three state schools all situated in north London. Henrietta Barnett School, a girls' grammar school, scored 100 per cent for students obtaining both 5 A*-Cs.
Queen Elizabeth's School – a boys' grammar school – came second, with 100 per cent of students gaining 5 A*-Cs, and St Michael Catholic School, a grammar school for girls, came third. 
Independent schools performed poorly in the tables, with 62 per cent of them coming in the bottom third of schools for 5 or more grade A*- C including English and maths.
This is due to the fact that many independent school students sit IGCSEs, which are not officially recognised by the Government. 
This isn’t getting better.  The school that came first scored 100% (we are now talking about pupils gaining 5A*-C grades rather than Progress 8).  The school that came second also scored 100%.  It is difficult to see how they have been separated.
This year is the first time that the Government’s new measure for attainment in GCSEs, called Progress 8, has been used to rank schools. It measures students’ progress in eight subjects from primary school through to secondary school.
The eight “core” subjects measured by Progress 8 are English, maths, history or geography, the sciences, and a language.
Um, no they aren’t.  This confuses the EBacc subjects with Progress 8.  While English and Maths are part of P8, pupils need only three of the other subjects mentioned and then have any three further subjects (which can include the other English exam that many pupils take and can include other EBacc subjects).
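To make the distinction clearer, here is a very simplified sketch of how the Attainment 8 slots (on which Progress 8 is based) are filled; the grades and subject list are invented for illustration, and the real rules contain further details (such as exactly when English counts double) that are ignored here:

```python
# Simplified illustration of the Attainment 8 slots: English and maths are
# double-weighted, three slots must come from EBacc subjects, and three
# 'open' slots can be filled by anything else.  Grades and the EBacc list
# below are illustrative, not exhaustive.

grades = {"English": 6, "Maths": 7, "Biology": 6, "History": 5,
          "French": 4, "Art": 7, "Drama": 5, "Music": 6}

EBACC = {"Biology", "Chemistry", "Physics", "Computer Science",
         "History", "Geography", "French", "Spanish", "German"}

english_maths = 2 * grades["English"] + 2 * grades["Maths"]        # double-weighted

ebacc_subjects = sorted((s for s in grades if s in EBACC),
                        key=grades.get, reverse=True)[:3]           # best three EBacc
remaining = [s for s in grades
             if s not in ebacc_subjects and s not in ("English", "Maths")]
open_subjects = sorted(remaining, key=grades.get, reverse=True)[:3] # best three others

attainment_8 = (english_maths
                + sum(grades[s] for s in ebacc_subjects)
                + sum(grades[s] for s in open_subjects))
print(ebacc_subjects, open_subjects, attainment_8)

# Progress 8 then compares this Attainment 8 figure with an estimate
# based on the pupil's KS2 results.
```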
Mr Gibb said: “As well as confirming that the number of young people taking GCSEs in core academic subjects is rising, today’s figures show the attainment gap between disadvantaged and all other pupils has now narrowed by 7 per cent since 2011.
“Under our reforms there are almost 1.8 million more young people in good or outstanding schools than in 2010, and through our new, fairer Progress 8 measure we will ensure that even more children are supported to achieve their full potential.” 


Here is the URL for the article.  As mentioned earlier, it has changed significantly since I copied and pasted the original version.  There is no official statement that the article has been rewritten (although the time given on the article is different).