Monthly Archives: September 2015


One of the challenges facing our school at the moment due to assessment without levels is our baselining procedure. Our pupils can come to us at any point from year 3 to year 9 (it’s rare that we take new pupils in KS4), with varying experiences of education. Some come straight to us from a mainstream school, some have been at the PRU and others have been allocated a number of hours of home tutoring.

We baseline new pupils upon admission in order to assess their levels, needs and eventually provide us with a measure of just how (hopefully) awesome our teaching has been. We use Hodder Oral Reading Tests and Graded Word Spelling Tests for reading and spelling (which we use across the school twice a year). Until last year we used the GOAL online assessments for English, Maths and Science, but with the removal of NC levels, this product was removed from the market and, much to the pain of SLT, nothing has replaced it.

We have reverted to the paper version of GOAL formative assessment that we used before the online tests. This is a series of multiple choice questions with a simple 'number correct = an NC sub-level' conversion. There is information in the depths of the teacher's guide to help analyse the results, but we just report the end level for our records.

An issue we find regularly with our pupils is that they come to us with huge gaps in their knowledge. Understandable if you’ve had a series of fixed-term or permanent exclusions. They’ve often missed whole topics and we find, for instance, that they’re brilliant at working out lines of symmetry but give them a 3D shape to identify and they haven’t got a clue.

Something that struck me as I baselined a new pupil the day after reading the ‘Commission on Assessment Without Levels: final report’ is how, as well as changing the way we assess generally, there must be a better way of assessing our pupils as they enter the school. We can’t rely on KS tests to give us a picture of what they can do and as the report says, ‘There is no intrinsic value in recording formative assessment; what matters is that it is acted on’. As we develop our school assessment systems we need to look at how we build baselines into this so we can identify exactly what the pupils coming to us can do and where the gaps are that we need to fill in. At the moment I give teachers an old NC sub-level that could be based on a pupil getting all the easy questions at the start of the test or all the harder ones at the end. To actually find out what the child can do, the teachers have to work it out themselves.

I was spurred further into thought as, aside from the Assessment Without Levels report, two other windows open on my computer were Michael Tidd’s resources of curriculum key objectives and Daisy Christodoulou’s slides from researchED with her focus on multiple choice questions. I’m wondering if we could use these bits of information to create our own baseline tests?


What do we baseline for?

  1. Identify what the pupil knows and can do/ may need help with.
  2. Set a starting point for us to gather data and measure progress.

For me, the first reason is the most important. I get frustrated when class teachers ask me how a new pupil has done and I have to report a vague ‘sub-level with caveats’. I want to be able to give them specifics, but specifics they will actually use. Even when we did online testing, I’m not sure how much of the data we printed off was actually used to assess pupil needs.

The second point is more closely linked to the development of the whole-school assessment policy, and the debate around the worth of measuring individual pupil progress would probably add another 1000 words. However, we do need to follow the Assessment Without Levels report and ensure our 'curriculum and approach to assessment are aligned', and I'm very aware that whatever system we come up with, we don't 'reinvent levels, or inappropriately jump to summary descriptions of pupils' attainments'. The report also specifically states that 'for pupils working below national expected levels of attainment assessment arrangements must consider progress relative to starting points and take this into account'. Baselining is our 'starting point' and needs to fit into this.

How do we create our own, useful, baseline procedure that identifies where our pupils are?

Well, we need to think about what our pupils need – no point coming up with something that's great for some kids but that our lot flounder with. I've done a lot of baseline assessments of SEMH pupils. I've had everything from pupils who hide under tables brandishing a weapon, to ones who fly through at genius level. In my experience, they are often quite de-schooled and not used to sitting and working for any length of time; they have lower levels of literacy and big gaps in their knowledge; they're apprehensive about coming to a new school and scared they'll 'fail' the test. We need something to put them at ease and keep them engaged. My, not necessarily complete, list of requirements so far includes:

  • Easy to read/can be read to them
  • Doesn’t have to be done in one sitting
  • Adaptive to very different levels
  • Questions that aren’t too lengthy (lack of stamina/ easily distracted/ simply don’t engage if a question even looks too long.)

The baseline tests we use now have multiple choice questions. Daisy Christodoulou’s work has prompted me to think about the use of these more closely and I’m wondering if we should use our curriculum to create our own questions. Is it worth going through the questions on the existing tests and evaluating where they fit our curriculum? Daisy shows the impact of different types of question, for example, using multiple correct answers to get them to really read the options, and thinking carefully about how our choice of the incorrect answers can inform us just as much.


Michael Tidd’s key objectives tables break down expectations for KS1 and 2 – do I assume that it’s worth measuring what they know of these before we move them on to our KS2 and 3 curriculum or KS4 courses? Does it work like that? Could we use Michael’s work to help us create our multiple choice questions? Certainly it would be easier to do for some objectives than others. How do we do that and avoid something like the old criteria based grids? (By recognising that it’s particular questions about that objective that they can/can’t do rather than securing that objective as a whole based on a couple of questions, I suppose.) It’s the gaps we need to find and fill in if they’re to access all the work we need/want to cover, so whatever we end up with needs to be both useful and used.

Is an administered test the answer? Might teacher assessment be more useful for teachers/ more specific? (Should probably mention we run a primary model of teacher to class for most subjects throughout 7-16, with some specialist subjects). We wouldn’t get the data for tracking progress in the same way but the data we start with at the moment is inaccurate anyway and whilst we used to, we don’t do the paper GOAL tests with them again to compare. But we need to bear in mind of course that we ‘should be careful to avoid any unnecessary addition to teacher workload’.

I’ve looked around for alternatives that are on the market since we found out GOAL Online was being removed and I’ve not found a lot. The Assessment Without Levels report warns against buying in products and this probably goes for baselining too. The most promising thing I saw a while back was Alfie as it covered English, Maths and Science (we want all three, most do the first two) and you can piece together your own assessment from existing questions. We tried the GL Assessment NGRT with the Closing the Gap trial we were part of and the kids couldn’t cope with that at all – too long and in one sitting. Most gave up and guessed so the results, as beautifully presented as they were, were useless. One of our teachers has been looking at what CEM has to offer and I think it’s worth investigating, but I don’t think it’s what we’re after as a ‘useful’ baseline, certainly with the criteria I’ve thought of.

Thinking about it properly is pretty daunting. It's a lot of work to set up – the whole process for the whole school is – but surely, as we get to grips with how we approach assessment without levels, it's worth investing time and effort in the part that'll start them all off? Do we wait for an online 'bank' of questions and go from there? Finding a balance between putting together a robust test that fits our pupils and our curriculum, avoids excess data management for teachers, and ensures they don't have to test pupils again after I've done it, is something I'm sure is possible. It's on the tip of my brain but I'm not sure how we start.

I wrote some ideas about CPD in school last year and I rather suspect I overstepped the mark with that, so I'm cautious about making suggestions around this topic. Does anyone have any answers? What do other people use? Do I just carry on with what I'm doing 'til told otherwise, or do I rock the boat a bit?


*The picture at the top isn't an actual question from our tests. That's just some stuff from our mantelpiece at home. Still, it's quite similar to some of the questions so you can see where our problems lie.*

The recent news features about state school students outperforming private students at university have raised a few questions and theories about how to explain the ‘unexplained gap’ from all sides of the educational domain. Understandably, if you’ve forked out thousands for years of education you’d rather not rely on any connections to get ahead in place of academic success, and equally, if you’re paying for extra tuition you want to know it’s having an impact.

Anecdotally I know of people who have been coached, by either parents or professional tutors, throughout their school life, achieving high grades until the point where they no longer had that support and found themselves unable to study on their own. Whether that’s due to constant nagging to get work done, parental ‘input’ in coursework or being given revision plans instead of working it out for themselves, it’s not an unusual occurrence.

When we talk about exams and qualifications with our pupils we place so much pressure on that particular moment, those particular results; yet for most of us they’re just a stepping stone to the next phase. There’s no way we’re going to downplay what they’re working towards of course, and I think for some children (certainly with our lot) focusing too far ahead is overwhelming, but I do think it’s something for us (the grown ups) to keep in mind.

Thinking about where my peers have ended up is an interesting (and perhaps scary) exercise. People I was in a class with from 7-16 are in jobs as wide ranging as pub landlord, shop assistant and deputy headteacher. People I was in a class with from 11-18 include a web designer, an MP, a psychiatrist and a chiropodist. We all had the same education (state FWIW). We all had the same access to the same subjects, with the same teachers, and we've all gone down different routes. Some went to university, some travelled, some had decided what they wanted to do when they were in infant school and some took a while to choose. Obviously parts of this can be put down to different life experiences and we all know, try as they might, schools can't force outcomes from children. Even within my own family, my brothers and I have gone down very diverse pathways. Between us there's a teaching assistant, an architect, and Andy kills Mufasa every night* in the West End. *not every night. The other day he mended the Sun.

My point is that… I’m not sure what my point is.

Education prepares you and takes you to the next step. For most people, once they’ve got the GCSEs to get into their post-16 experience of choice, no one asks for their GCSE results (maybe a requirement for a grade in certain subjects). Once they have the required A level grades for university, no one needs to know those. Once they have a degree it’s about experience and fighting for jobs with everyone else. None of that’s new information of course, but it shouldn’t be surprising that the same education leads to different futures with different needs and values.

We like to measure things though. Especially if we're doing well. This article, highlighted today by Carl Hendrick, explains quite clearly our need to be one-up, how we see some things as intrinsically better or more valuable. It's the same in education; we're held accountable for everything we do, whether by the school, parents, government or the press – no one wants to be the one seen to break the chain of progress.

The thing I think is most important that we take from the performance at university story is that we have a responsibility to think about preparing pupils for the next phase so they don't flounder. Yes, our job is to arm them with the knowledge and skills to pass the next set of tests, but the way we do that should ensure that once they've moved to the next step they can keep going, so if working without the supports that got them there turns out to be a factor in success, they've got a chance. Equally of course, we need to keep the curriculum as wide as we can to allow them to refine their choices as they go and find out what they really love.

The annual celebration of the release of Taylor Swift's now-classic album 'Red' took place on Saturday at South Hampstead High School in London.


There were some noticeable absences, which was a shame (although there were people I didn't see at all and only know were there because of Twitter, and I didn't see one Bennett point-and-wink ALL DAY), but it was another jam-packed triumph of a day as the researchED juggernaut hurtled through London on its way back round the globe. I've already written a bit about my own session so this one's more of a mulling over of the themes and ideas that I've taken away from the day (not at all in the order of viewing).

2013 was the year of ‘no lunch’, 2014 the year of jealously staring towards the single box of air conditioning, and 2015 the year of a steep hill and many, many stairs. The overarching message of researchED though has remained and that is all about taking control, but now with increasing support. There still seems to be a determination to keep accountability away from this precious seed of control and I’m really glad this is the case. We’re still ‘working out what works’ and surely the whole point is that we’re open to shifting ideas – making schools accountable increases the need to find answers right now. I can sense it in the TSA requirement for research and development and I think this is great (certainly providing me with opportunities) but it’s important that there’s still the chance to find our own way.

Becky Allen showed clearly how accessible a role involving research can be – no special equipment needed. Yes, she gave a mention to journal clubs and perhaps I'm biased towards that, but it's a great example of how you don't need a dedicated research lead or super access to research to get involved. She had a lot of advice about how to get going and I think the next steps for me need to be around developing areas for research in my setting. I know it's not the direction for everyone, but I think I've got to try. I've got the added difficulty of working in a small school so numbers aren't ideal for any sort of pilot, but maybe there's room to use the TSA.

I was really interested to hear about Nick Rose's research lead role and how he is coaching colleagues in teaching enquiry and using teaching logs as a scaffold. Allowing teachers to engage in enquiry and explore ideas in a low stakes environment, before using the outcomes to help inform professional targets, is the best way I can see to encourage teachers to engage with research and use the information that comes out of it without people feeling like they might get it wrong. I'm coming to the end of a few projects this year so I need to keep the momentum up in school. Nick has sparked me to look at the SIP and see where I might be able to suggest ways research can inform our response to that. I don't know why I hadn't thought of it before, but I feel I've tested the water with the support of other organisations and maybe now's a good time to push it a bit further.

I’ve managed to gather a few ideas for my next issue(s) of Relay. Particularly drawing on the sessions from Daisy Christodoulou, Tom Sherrington and David Didau. Daisy and Tom both focussed on how we can use rdaisycesearch to inform the decisions schools are having to make. Daisy’s was first and discussed using research to develop assessment without levels. I’ve read most of her blog posts on these issues but she managed to put it all into an easily digestible capsule and I’m going to go back and read her blog again. Particularly the parts on multiple choice questions (had a course that used these as the exam at university, hated it but now have a greater appreciation of their worth), and comparative assessment (this is how we do it in art and it’s good to have some back up for our methods). I thought back to one of Daisy’s points on writing multiple choice answers during David Didau’s talk when he said ‘when we’re certain, we stop looking’ and I definitely think it’s something we should consider.

In a more personal way, Tom Sherrington took us through the process he had used to make decisions about literacy provision at Highbury Grove School. It is fairly easy for any headteacher to search for an answer on Google and run with the results, but not everyone would question what they found and email the researcher to ask. It's actions like this that will change the way we use research in schools. Not by taking part in massive RCTs (although brilliant), not by sitting on government committees (I know that's good too), but by understanding how to read the research we find, questioning it and finding out how it really can inform our practice.

David Didau posed perhaps the most important question of the day: 'Are you a fox or a hedgehog?'. Despite the slight hint of Barnum statement, I am deffo a hedgehog. Mostly because spikes and the grumpy face. I don't know if this was the answer he was after, but then again, we don't know what we know we don't know, if we know we know what we don't know. Y'know? #teamhedgehog

My final session was in Sam Freedman’s Room of Despair. He took us through the top five issues facing the government including classics such as funding, capacity and infrastructure. Despite the lack of lols, this is a man who knows his stuff. I’ve already paraphrased him several times in school and whilst most of us left feeling a little deflated, Howard left with the new found ambition to become a Regional School Commissioner. It takes all sorts.

So that was my day. I suspect more elements of it will filter through over the coming weeks. Special mention needs to go to David Weston's posture. It's a beautiful, beautiful thing and if you don't give a fig for educational research, go to one of these gigs just to see him glide.



At the moment I think my #rED15 blog posts will stretch across about 3. This is just a quick one about my session including a couple of bits I forgot to say.

Having bitten the bullet and volunteered to do a journal club session at the Cambridge Research Leads event, I decided there was nothing to lose and offered to do a similar one at the national conference. Turns out it’s a bit of a different affair. I was nervous before Cambridge and dealt with that through extreme preparation. This time I went into denial. On the day I felt so out-of-place I got told off by Tom Bennett for not going into the speakers room.

Anyway. I did it.

Despite the fact that there wasn't a huge amount of time to squeeze in both the 'about journal clubs' bit and the actual experience of a journal club, and that the clock on the wall was one of those hilarious ones with backwards numbers, the response I've had in person, over Twitter and email has been lovely. I think there are quite a few school journal clubs that are going to pop up now and I'm really excited to hear how everyone gets on, so let me know!

The session was very similar to Cambridge so I won’t repeat all the information when I can direct you to that post:

Cambridge Journal Club

The slides from my #rED15 presentation are here:

Journal Club #rED15 Presentation (pdf)

The paper I picked was 'The Relationship Between Student Engagement and Academic Performance: Is It a Myth or Reality?' (Jung-Sook Lee, 2014), which I accessed through the Education Arena collections. A few people asked why I'd gone for that specifically and how I go about picking papers for our clubs. This one ticked a few boxes for me. I wanted something with a general topic so more people would find it relevant in some way, not too long (10 pages I think), and something with some statistics for people who like that. I think the stats threw a few people, and I completely understand. There are plenty of papers without pages of numbers so I urge everyone to have a little look around. Once you've gone through the process a few times you'll start looking forward to the next collection and gauging what will work for your group.

The things I forgot to say…

Education Arena are offering 30 days' free access to their education journals – tweet, re-tweet, share or mention their hashtag #SharingEducation on Facebook and Twitter and they'll private-message you the access token allowing you access to all Education content from 2013-2015 for 30 days from activation. See here.

Also, it’s worth following SAGE too as for the past few years they’ve given free access to all their publications during October. I think both access events are to do with National Teachers Day.

So. Not my most coherent post but I wanted to get something out before everyone forgot about it and their awesome journal club plans. Please let me know how you get on and I’m happy to point anyone in the right direction. Finally, Howard did an excellent job of shamelessly plugging the biscuit element, so props go to him for boosting the numbers 🙂