The Education Insider: Can summative assessment become formative?
In her latest post for us, education expert Jodie Lopez discusses looking at testing in a positive way, using children’s results as a basis for delivering targeted support.
With the SATs in Year 6 barely over, the media (and indeed social media channels) are awash with articles, as they are every year, debating the usefulness of such tests and the problems they cause. A lot of the concern tends to stem from high-stakes accountability. Other worries in recent years lead back to curriculum content since the changes made in 2014, and the associated issue of children taking tests designed around a curriculum they have only been studying for two years.
If we put the SATs themselves to one side, however, I think it is important to discuss tests as useful within ongoing assessment for learning and the impact they can have on teaching. One thing the SATs currently do not give schools, in time to be useful at least, is feedback and analysis which aids learning and next steps. They are not designed to do this. They are very much an example of summative assessment.
However, in many schools I have visited, and through conversations with teachers and school leaders, it is apparent that many believe all tests always equal summative assessment. A line seems to have been drawn which puts classroom observations and book marking into the “formative assessment” category and tests into the “summative assessment” category. There is, however, most definitely a use for tests within formative assessment too.
Those of you who have been teaching for a few years will remember that not so very long ago it was standard practice for every year group to take a test at this point of the year, usually a QCA levelled test. And in most schools we did this three times a year. These were mostly used summatively, however. Very rarely did I see people making formative use of the big gap analysis grids which came with those tests. I did use these. Not because I love big spreadsheets…ok that’s a lie, I do love spreadsheets…so not ONLY because I love spreadsheets, but because without filling in that gap analysis grid I didn’t feel comfortable about administering the tests. If the only purpose they served was to make it look like every child in every year group had made exactly 2 sub-levels of progress, then what did the children really gain from the week of tests?
So I used the grids to ensure I could pluck out some common misconceptions across the whole class, and some key weaknesses for groups of children. This was especially useful in my first couple of years of teaching, as it really helped to focus my planning after the last data drop of the year. It was important at this stage to focus on the achievable. There is not a lot of time left before the summer holidays, and realistically we also need to get reports written, practise plays and concerts, have a day trip to the beach and organise summer fayres and parents’ evenings. So I am not suggesting you try between now and summer to fill every single gap on the sheet; instead, pick a few key areas which it highlights.
Now you may be wondering at this point which actual tests and grids I mean, as QCA tests are now from an (albeit recent) bygone era. Well, it really is up to you which test you use. It can be as formal or informal as you like, as long or as short as you feel appropriate. You can buy tests in or create your own using resources you already have, so long as they give you the information which will help you to plan.
As an example, use a reading assessment which covers a range of skills. Then, using the gap analysis grid (you can easily create your own in any format you wish if none comes with the test), group together children who need extra help with any particular skill – inference, for example. If around six children show they are struggling to answer questions in the inference category, then a week or two of an intervention or additional support group, or planned teacher focus time during guided reading, could make a huge difference and give them a big step up ready for next year.
You may be thinking now “Well I do this, who doesn’t?” and of course I am sure many teachers do this already or have done in the past. However, the change of curriculum and the heightened pressure of recent years surrounding the SATs and the “raised bar” curriculum have definitely caused many schools and/or teachers to shy away from what is seen as more formal testing, as they do not want to add further pressure to the students. Or the focus on Years 2 and 6, coupled with the lack of a nationally approved test, has caused schools to push regular testing to the back burner for now. Yet, when used well, a test can be a huge support for a child if they receive no summative result but do get some targeted support, and they see the effect of this as their confidence grows in their weaker areas. This will actually help them to have positive attitudes towards testing throughout their schooling.
Jodie is an award-winning ex-primary teacher who now works as a freelance edtech consultant. Her interest in using technology in education has led to her working with a number of educational technology businesses since leaving the classroom. Most recently she has been Head of Education for an assessment system provider and has specialised in helping schools to transition to the new curriculum and leave levels behind. Follow her on Twitter here.