The Education Insider: Did I write this blog independently enough?

In her eleventh post for us, education expert Jodie Lopez looks at the problem of inconsistency in assessment – why it’s happening, and how we can work towards fixing it.

There has already been a fair amount of criticism in the press and in teacher circles around the Interim Teacher Assessment Frameworks for 2016 (and now for 2017 too), and this is especially evident in the writing results: 74% reached the expected standard in writing, quite a bit higher than the 66% who reached the expected standard in reading. It is not far off the 72% who achieved the expected standard in Grammar, Punctuation and Spelling (GPaS), however, which may indicate that the focus on those skills over the creative aspects of writing has had an impact.

Something that has been of particular concern (causing the Department for Education to introduce a test for moderators) is inconsistency. Some Local Authorities (LAs) saw huge drops in their writing results, while others suffered what could be described as minor blips. The figures for the GPaS tests showed much more consistent changes countrywide against the new ‘raised bar’ curriculum than the writing results have thrown up. So what is causing such discrepancies? Are some Local Authorities giving better advice or supporting their schools more? Or could it be that the frameworks’ ‘Pupil can’ statements and the guidance for schools and LAs from the DfE are somewhat…how do I put this…open to interpretation? “SURELY NOT?!” I hear you cry in unison. I know, I know, surely a series of quite simple statements could not be misconstrued or confuse anyone?

Well, let’s look at some examples, shall we?

I will start with some of the ‘Pupil can’ statements for Key Stage 2 (KS2).

The pupil can write for a range of purposes and audiences (including writing a short story):

• spelling most words correctly* (years 5 and 6) – Expected Standard

• using different verb forms mostly accurately – Working towards the Expected Standard

Already we can, I think, all see the rogue word in these statements, the one that may cause a few furrowed brows – “most(ly)” – hmmmm…

How many is most? How often is most? 51% of the time? 70% of the time? 85% of the time? This is not a new problem in assessment. The old levels system had the same lack of consistency across the country for exactly the same reasons – indeed it was one of the triggers for scrapping levels – and yet here we are again with exactly the same dilemmas.

If your Local Authority or Multi-Academy Trust (MAT) has decided on some more specific guidance – which would be helpful, so you know what to expect at the time of possible moderation – then you could pin this down. But what if your LA has decided “most” is anything above 60% while the LA on the other side of the tracks has decided on 90%? Well, good news for you at least! Your results are amazing…not so many happy teachers over the road though!

It is also up to the moderator to decide, without being in the lessons all year, whether or not the writing being moderated was in fact “independent.” There are huge debates going on around what is classed as independent and Michael Tidd even did some investigating around how consistent teachers are in their interpretations of independent. See his blog for his excellent analysis of the results.

As well as, obviously, having an impact on the results for a school, there is the knock-on effect of these inconsistencies for pupils going up to secondary school, which also causes problems for secondary teachers (who have the added stress of the new Progress 8 measures). I was talking to one secondary English teacher last night who apparently had some Year 7s join in September with the equivalent of old level 7Bs – she has yet to figure out who they are based on the writing standard of the class!

So does this mean the LAs and MATs are to blame for their guidance or moderation techniques? Well no, not in my view, because it should never have been their responsibility to decide at such short notice. They waited as long as the schools did for these ITAF documents to be released in the first place. The documents arrived suddenly mid-year, just as they did for schools. Most schools then immediately looked to them for advice when they had had little time to prepare their own responses and decisions.

The LA advice may possibly have exacerbated the problem in some areas, but that doesn’t excuse the problem – and it actually does us a favour in a way by highlighting it. The problem sits squarely with the DfE still. And it is still a problem. Such woolly guidance alongside such strict and thorough expectations – the “secure fit” model – creates mixed messages and will keep teachers and schools inadvertently tripping over their assessment shoelaces. It was the DfE who untied them and didn’t give teachers a chance to bend down and tie them back up before pushing them forward. I hope there is a simpler way in the future. Maybe it will involve externally marked writing tests again, but given the current economic climate I won’t be holding my breath just yet!


Jodie is an award-winning ex-primary teacher who now works as a freelance edtech consultant. Her interest in using technology in education has led to her working with a number of educational technology businesses since leaving the classroom. Most recently she has been Head of Education for an assessment system provider and has specialised in helping schools to transition to the new curriculum and leave levels behind. Follow her on Twitter here.
