This was a juicy one. Our PE teacher Joe came to see me the other day to ask if I took requests for looking at research – Research Lead 101 says yes I do, so I did. Apparently PE Edutwitter has been talking about cooperative learning for teaching PE. He wanted to know if there’s anything specifically that supports cooperative learning for SEMH pupils.
Certainly in regard to more academic subjects (the cooperative learning literature seems to use ‘mainstream classroom’ a lot, but I’m adding SEND to the mix so don’t want to confuse things), direct instruction with occasional support from a bit of group work is something I’m happy with. In PE there’s obviously a lot of group work going on, so it seemed like something worth a look, particularly the opportunity to approach it from an SEMH/SEBD (the literature hasn’t caught up with SEMH yet) point of view.
The other thing that intrigued me was that a few times now (I’ve no references, just vague memories) I’ve heard sports instruction – direct instruction, drilling, practice of individual skills rather than the whole game – held up as an example of what we should be doing in other subjects. Here is PE looking at the alternatives to doing that.
I thought it was a good opportunity to try out something I’d been mulling over and create a single-subject add-on to Relay. I’ve ended up creating ‘Relay FOCUS’ which in this instance looks at the research surrounding PE and SEBD/cooperative learning more broadly and then explores how they might work together. I’m not sure whether Joe was quite after what I’ve ended up with but I’m pleased with how it’s turned out and hopefully there’s more individual requests that I can work on.
I know it’s not perfect and it doesn’t come close to covering the whole topic, but it’s not intended as a formal piece of literature research and hopefully it’s enough to help Joe decide whether he wants to try the approach out or whether it’s something he wants to look into more.
If you fancy having a glance, the pdf’s here: http://westburyschool.co.uk/wp-content/uploads/2017/10/Relay-FOCUS-Cooperative-Learning-and-PE.pdf
I’m going to say it. rED17 was the best one yet. There have been researchED conferences that rival it, of course, and the light-up pens have reached legendary status, but the atmosphere at this one was something different. Whether it was the venue, Chobham Academy, with its circular building that forced delegates to cross paths and talk as they found their next session, or whether on a more personal level I felt like I knew more people there, there was a buzz I’ve not sensed in the same way before.
Despite recent naysaying (and outright attacks) around researchED there appeared to be lots of people who put their hand up to say it was their first time so the great conspiracy doesn’t seem to have put them off. A noticeable feature was the conversations going on. In previous years dining room chatter was filled with overhearing people talk about the speakers they’d seen, in awe at who they were, and this year people seemed to be discussing the session and the ideas. Instead of the queue in the loos being all “I saw X, I love him. I’ve read all his books.” there was a definite vibe of “I went to see X talk about Y. That really fits in with what we’re trying to do with year 9”.
That’s not to say there weren’t ‘big hitters’ – 2017 attracted speakers such as the Minister of State for School Standards, Nick Gibb, and Ofsted Chief Inspector, Amanda Spielman; but I wonder if one of the achievements, and I really mean achievements, of researchED over the past few years is that it has made it normal to see and hear these people in person and taken away some of that awe – allowing the debate to take hold. Nick Gibb tried (I think. I’ll be generous for a moment) to stick up for researchED and its contributors by talking about those academics who hadn’t engaged as being stuck in their ‘ivory towers’. I have to say, I think a considerable amount of ivory-tower-placing comes from us, not them, and the more we engage and interact the further down the tower they come. It works both ways.
I did go and see Nick Gibb’s session. To be honest I wasn’t impressed. I wasn’t outraged either though and used it as a good opportunity to catch up on Twitter. I can be a crazy note-taker during these things but my word-for-word notes from this one read:
- Telling us about rED
- Embrace challenge and debate
- Answering the h8ters
- Ebacc, reading
- Reporting G4+
Tell me what’s new with that? The most impassioned bit was the first bit and that was pretty much reading Tom’s blog out loud (we’re at rED, and whether we agree or not, we’ve probably read Tom’s blog haven’t we?). The Schools Week article made it all sound punchy – you can view it here and decide for yourself.
So. To the good stuff!
My second session was Sam Sims with Katie Magee and Dhana Gorasia talking about journal clubs. Now, I do like a journal club and this was brilliant to go to, as it was Sam’s 2014 session that boosted my explorations into journal clubs to what I’m doing now. The session explored a pilot study of journal clubs as a way to break behavioural habits around teaching and the theory behind this. The second half featured examples of how it had worked at Canons High School. I always make a point of stressing that journal clubs aren’t a policy meeting and of avoiding getting bogged down in how one paper could be used in school; Katie and Dhana showed how the two can work together, with requests for papers on specific strategies and discussions that lead to implementation in practice. I liked the advice that journal clubs can be used to spot positive strategies and behaviours that are already happening in schools and enable these to be shared more widely. There seem to be a few exciting journal-clubby things on the horizon, particularly with the Chartered College of Teaching, so I’m looking forward to seeing what happens and getting involved where I can.
Session 3 found me in the Britney-infused sardine can* that was David Weston’s session on Toxic schools. David explored how the school environment impacts on outcomes, looking at a lot of things that Kev Bartle had talked about in York last summer. Coming from a school that has historically had a small number of staff with close relationships and that is now expanding hugely, I’ve thought about this idea of trust and leadership a lot recently as the dynamics are being forced to shift. David took us through various biases to recognise and avoid (aided by occasional Britney) with some pointers to take back. His presentation’s here if you fancy it and I’m looking forward to any follow-up sessions involving late-nineties/early-noughties classics with a dance routine.
It was me after lunch – good turn out, Journal Club info here – www.edujournalclub.com
I always find that once my own session is done I can sort of float through the rest of the day without the weight of it (literally – lugging biscuits is HARD) but it does make my notes a bit more relaxed. I went to see Martin Robinson give a warning about Growth Mindset. It does concern me that this is a bit of a bandwagon that looks fancy and evidence-informed, so it was good to pick up some ammunition for the time it descends on us here (and Martin was lovely, obvs). I went to an Institute of Ideas conversation about mental health in schools that was interesting. I agree with the points that if we are over-cautious we risk medicalising ‘normal’ responses from pupils, and the idea that we have a ‘cultural script of fragility’ hit home too. However, I’m in a school where our pupils have (by definition) a range of mental health issues and I can see the problems of under-diagnosis and lack of intervention too. A teacher’s job is to teach, but pupils are with us for a considerable amount of time – quite often the only stable time they have – and we have a role in safeguarding them too.
As has become normal at these things one of the best sessions was hosted by a local boozer (and a proper boozer it was too) where I had some great conversations with lovely people. Some about education, some about shoes, some about the Midsomer Murders Tour… I’ve stepped in to defend researchED a couple of times recently and I think we all know there are people who aren’t going to be swayed either way. It was, as always, a diverse day with rushing round behind the scenes that looked ripple-free from the outside. The familiar company was great, the new faces were also great, and I was left feeling overwhelmingly positive (even after the karaoke) about all the opportunities we have to do great things. If people choose not to engage with researchED then I really hope they find something they do like because it’s a shame to miss out.
I’m not too enthusiastic about ‘teachers doing research’; I am more enthusiastic about the opportunities for schools to take part in larger, more formal research trials and partnerships with higher education. My position set out, I think that what school staff can do is question things.
The staggered start to the new academic year for schools has caused my timeline to be peppered with INSET tweets throughout the week. A few have caught my eye and one particularly seemed to connect with last night’s #UKEdResChat, which asked “Are we in a research bubble? If so, how can we pop it?”. The tweet’s from a locked account so I won’t embed it but it read:
“Really pleased how positively staff took on board our drive to embed a #GrowthMindset across school! #ThePowerOfYet”
This tweet was from a primary school colleague in our TSA Innovation Hub – a follow-on from the Evidence Based Teaching Group we had – and shows engagement with, and dissemination of, research we have touched on within the group. Whilst it’s not bursting the bubble, it is perhaps stretching it a bit.
It strikes me though that if research-informed staff, including research leads, are striving to build research literacy in schools, is the ultimate goal not to have everyone on board, but to have them questioning? I appreciate that this single tweet doesn’t have a lot of information in it and there may have been a discussion around the approach – it’s just 140 characters. However, if we are to break out of the cycle of the same people driving the research-informed agenda in the same schools, I think we need to be looking to encourage the critical eye rather than introducing top-down initiatives – which are almost a cliché in themselves.
Taking the example of Growth Mindset, I know from my own reading that there has been a lot of debate around implementation in the classroom, and my basic understanding is that it’s questionable whether there’s an impact, and that to be implemented properly staff need formal training. I don’t know how this school is approaching it (hopefully it’ll crop up at our next Innovation Hub meeting) and I feel uncomfortable using them as an example when I don’t know any detail about their method. Putting that to one side, I think introducing something like this is an opportunity for the type of enquiry that should be encouraged in schools.
Questioning things isn’t the same as resisting change; it’s about exploring initiatives and measuring their impact. Introducing school-wide initiatives should involve reading around the subject, both for those driving the programme and those taking it on. When new ideas are put to staff it should be a positive thing for them to be met with questions, and leaders should be able to answer those questions – either having read around the topic and predicted them, or by offering the opportunity for staff to answer the questions within an implementation and review process. We don’t need to be doing big research projects, but we should at least be exploring the evidence for and against and looking for change if we do go ahead. And leaders shouldn’t be afraid to say that something hasn’t worked.
We need the rhetoric to move from ‘we’re going to do this’ to ‘we’re going to find out if this can work for us’. Doing with, not to. This isn’t something that should threaten leaders but that they should embrace. It can be difficult to accept if you have spent a lot of time in preparation but when changes are met with constructive questions that are taken on board and incorporated into the way we work, I think it will be an indication that the research bubble is at least expanding and we are truly embedding research in everyday practice.
#UKEdResChat asked what aspect of teaching, learning, school systems, government policy and so on we would commission research into, should we have the opportunity. It took me a while to think of anything as it’s quite an overwhelming brief, but I jumped in with the suggestion that I would like to see some proper investigation into unqualified teachers.
Replies to my choice varied from “I don’t get the fuss” to “it undermines the professional aspect of teaching” – which mirrors wider Twitter discussions I’ve seen over the past few years, but what I haven’t seen is anything beyond perhaps stats on how many people are being paid as unqualified teachers. Last night’s brief discussion offered solutions to the problem and mooted reasons for individuals choosing to qualify or not, but the general vibe was that (even if the idea of unqualified teachers didn’t offend) qualifying was best. I found the points on professionalism interesting, as I wondered whether professionalism was more important to people who have already completed training than it is as a reason to qualify – does it devalue the qualification in the eyes of some? The point of my original question, though, was to work this out. Where is the evidence that qualified status is best?
A quick search on Schools Week (a good fount of knowledge) throws up a couple of recent stories that offer some figures and concerns. In July this year they had a piece on a Labour Party report stating that:
The number of teachers in state-funded English schools without QTS rose to 24,000 in 2016, a figure that has grown by 62 per cent since the rules around unqualified teachers were relaxed in 2012.
I won’t regurgitate all the stats but they’re in there so have a look. The points made echo other comments I’ve seen, though, which are pretty much: standards will decline vs schools can hire on the basis of ‘skills and experience’. Another story from July reports on a school rehiring (not quite all of) its teaching assistants as ‘Teaching Fellows’ to work towards a teaching qualification, and the points raised by this story surround exploitation – the teacher-on-the-cheap argument, stretching budgets further. It’s only a little look, but I think the two main worries about unqualified teachers – ‘standards’ and ‘exploitation’ – are pretty much covered by these. Thing is, though, there’s a lot of opinion on how it ‘might’ affect pupils or the ‘potential to…’. Do we not need to look at whether it does or not?
My personal opinion is mixed. On the surface it seems reasonable to want all teachers to be trained, but my experience tells me there are some awesome unqualified teachers that get great results – both in terms of qualifications and on a more general level with pupils. Off the top of my head, reasons for me working with unqualified teachers include: staff cover where TAs have taken a class for an extended period and been able to have an uplift in pay to do it; vocational teacher with years of college and industry experience; art teacher with 20+ years experience as SEND TA, a fine art degree and work as an artist. Reasons for them not wanting to qualify include it being a temporary role; nearing retirement and not thinking the extra work’s worth it; not wanting a whole-school responsibility for the subject. There’ll be other reasons but you’d have to ask them. What I do know is that they were/are all brilliant at their jobs and it works for our setting really well. Different situations will appear in other schools where it does and doesn’t work as well but even though gut-instinct is useful, I think there’s an opportunity to explore it more formally.
So what do I think needs to be looked at? I don’t have an idea for a specific research question yet – more ideas of the data that needs to be gathered in order to prompt research questions. As a starting point, things I think we need to know include:
Where are they teaching and how many?
- In independent schools, state schools, and separate figures for special schools/alternative provision settings. This needs to be gathered for each key stage.
- Data on the social demographics of schools and Ofsted grades.
What are they teaching?
- Which subjects are unqualified teachers working in?
- External test/exam data
- Primary – likely to be teaching a range of subjects; Secondary – more subject specific?
Who are they?
- What is their experience – as TAs/other school roles; college/higher education; teaching abroad.
- Qualifications – subject specific; ‘teaching’; training in specific programmes.
- Other responsibilities held in organisation.
Reasons for not qualifying
- Personal – happy in role; lack of entry qualifications; financial costs of training etc
- Institutional – school not willing to enable; not able to fund
Perceived gains of qualifying
- From both unqualified teachers and the wider teaching community. Issues such as pay and professionalism seem to be top of the list.
Some of the data for these points will already exist and may already have been collated, but other bits require going a bit deeper and finding out what’s happening at school level. I’m happy to offer my opinions whenever this comes up as a topic and there are some interesting ideas for how to qualify the unqualified, but we can’t do that well without looking at the current picture and working out what schools, pupils, and staff, need.
Enquiry isn’t a specialist activity. It is something we all do regularly – making a mental note of something that went well or how we could change things for next time. As easy as it seems to reflect on what we do day-to-day, the starting point for deliberate enquiry can be difficult. Key to engaging with research in a genuine, long-term way as a practitioner is to start with reflection – reflection on practice and reflection of practice.

Reflective practice is sometimes presented as the opposite of evidence-based practice; the qualitative vs the quantitative. Quantitative research is held up as the best research can be, whether that’s the EEF toolkit and trials or the What Works Clearinghouse measures. One argument for evidence-based practice over reflective practice is that the latter risks pathologising the practitioner and finding fault with the teacher or student rather than the wider environment.
Action research (or at least the term) is gaining popularity, and the close link between action research and reflective practice leads to arguments that it lacks value and relevance between settings. Cautious voices remind us that quantitative studies don’t necessarily provide the answers; as Dylan Wiliam says, “[In education] everything works somewhere and nothing works everywhere. The interesting question is ‘under what conditions does this work?’”
It’s not realistic for everyone to be part of large-scale RCTs, particularly in settings that regularly don’t fit selection criteria like small or special schools. This can feel isolating and make research engagement seem irrelevant. Reflective practice is one way to get going and can take many forms, from keeping a diary to working in pairs or triads or developing cyclical action research projects, so where do we start and how can we incorporate research into what we do?
It’s frequently repeated that teachers don’t have time to trawl through and decipher research so I decided to have a bit of an experiment with taking a single research paper and setting it out in a way that can be used in the classroom. This is a paper I have used myself on several occasions to provide a framework to identify an area of focus and use as a starting point for enquiry.
The suggestions of how to use the document I’ve created are just that, suggestions. It doesn’t give instructions or solutions for practice and it doesn’t use multiple sources of evidence – the intention is that once a focus is identified, more refined research can take place if necessary around that area.
Whether it’s used for enquiry in the closed classroom or over a wider group; as a starting point for a whole school focus or simply to monitor the classroom over time, I hope it shows how research can be used in the classroom and it provides encouragement for more people to bring evidence into their practice.
So here it is in glorious pdf form. Let me know what you think.
A guide to ‘Evaluating the Learning Environment’ adapted from Ysseldyke, J. E. and Christenson, S. L. (1987) ‘Evaluating students’ instructional environments’, Remedial and Special Education, 8(3), pp. 17–24.
I’ve had a couple of interesting conversations recently about the reliability of ‘old’ research and whether it has some sort of ‘use by date’.
It is of course reasonable to be wary of over-relying on research that was published decades ago, and taking note of age when reviewing evidence is important, but it shouldn’t be a case of dismissing something because it’s been around for a long time if the points are still relevant. Fads come and go but that’s perhaps even more reason to look back at older research – new ideas quite often aren’t new at all. People use examples like ‘Would you trust your doctor if they prescribed using leeches?’ – maybe not for everything, but there are quite a few situations where leeches are still used in medicine today.
Should ‘good’ research be repeated to keep it fresh? Even if nothing new is being done? When this does happen, a quick look at the references shows all the previous papers by the author/s are usually still there – of course there are likely to be a few changes, but the general take-away messages remain the same, and then the work is criticised for re-hashing the old stuff for the sake of it. We hear arguments that research needs to be repeated and ideas challenged as we learn more about how we can improve teaching, but as soon as someone writes about a ‘debunked’ idea there are criticisms in the opposite direction.
For example, I’ve seen enough evidence from people I respect to believe that there is no mileage in the concept of individual learning styles but if those same people present evidence that has changed their minds (as solid as that would have to be) I would of course have to reconsider my own position. That’s a provocative example of course but my point is there – we can’t criticise research simply because it’s testing something we think is long-disproved – we need to criticise the research itself.
As research increasingly finds its place in schools, with different staff at different levels of engagement, it’s important to stress the need to develop critical evaluation skills. The role of research lead includes helping people come at research from all angles – treading carefully round popular ideas of the moment, being critical but not dismissive in the face of personal bias. We need to be careful with new research that simply repeats itself rather than challenging ideas, and be aware that not everyone has heard all the evidence around each theory – however much we think it’s been discussed to death.
Those of us who have heard all the arguments to the point of fatigue need to make sure we use and develop our own critical eyes too, and remember how easy it is to run about in the echo chamber. As we focus on how we help our colleagues we can’t forget to challenge ourselves. As long as we are aware that the age of a piece of research may limit its value to our work then we’re a step ahead, but maybe it’s more ‘best before’ than ‘use by’. When it comes to it, we don’t have access to everything we need to make a fully informed decision and we need to trust what experts say. If I’m honest, whilst it doesn’t sound overly pleasant to be treated with leeches, I would have to trust that the doctor knew what they were on about, and that some older ideas have a place.
The role for technology in education, and the impact technology has on children generally, is a thoroughly embedded topic for debate. I’m sure if twitter had been around at its inception, the Casio Databank would’ve been the topic for a whole half-term’s Edutwitter ‘civilised’ discussion but there is an understandable increase in these sorts of conversations as we try to keep up.
The latest story to hit the tech-debate radar is this one in the Toronto Star reporting that grade 7 and 8 students at Earl Grey Senior Public School are to have restricted access to their mobile phones during lessons. Now, I work in a school where the pupils have always handed everything in when they get to school – even before mobile phones were commonplace – so I’ve not really noticed the rise in personal tech use in classroom in perhaps the same way as other schools, but it still seems odd that this sort of ban (and not even for all year groups) would be newsworthy.
Screen-use in the classroom is becoming increasingly ubiquitous, so what concerns should we have with this? Carl Hendrick recently blogged about why the Internet should be kept out of the classroom, citing a 2016 study (Ravizza et al.) that looked at how university students use laptops in class and reported the relationship between classroom performance and internet usage. They found that ‘nonacademic Internet use was common among students who brought laptops to class and was inversely related to class performance’. A recently published Japanese study (Kawahara and Ito, 2017) looked at the ‘Effect of the Presence of a Mobile Phone during a Spatial Visual Search’ and found that, even without it being used, the mere presence of a mobile phone can adversely affect cognitive performance. This offers an opportunity for us to look at the impact of classroom technology and how schools can use it in a balanced way.
In response to an open letter published in December 2016 over concerns about children’s ‘screen-based’ lifestyles, a second letter quickly responded, calling for ‘quality research and evidence to support these claims and inform any policy discussion’. Whilst worries over increasingly sedentary lifestyles and mental health issues are understandable, the letter argues that there is little evidence to support the concerns in the initial letter and encourages the government and research bodies to invest in well-founded guidelines.
The evidence around the benefits and disadvantages of technology for children is ever-changing. In 2015 the American Academy of Pediatrics reviewed their guidelines for early childhood screen time – mostly based on old research into television time – which previously recommended that children under two should stay away from screen media. They have now provided more evidence-based guidance as to how children should use screens, including for unstructured play and the positives of video-chatting with distant relatives.
At our school a decision was reached several years ago to provide each pupil with their own laptop to use in school. As we started to expand we found our ICT suite with 6 laptops wasn’t enough for 30 pupils and they were getting damaged etc so we started to roll out laptops and now we have 1:1 from KS2-4. Pupils use these within all lessons – we run KS2/3 on a mostly primary model of class teacher teaching most subjects with some specialist teachers/swapping (KS4 is more specialist). Laptops move with the pupil throughout day/years – it’s easier to track use and damage etc. Obviously (perhaps) laptops aren’t used in every lesson but they are used a lot. They are also used during some reward times and some break times (probably why online games are still accessible).
Certain websites are blocked from use like social media/YouTube/keywords and as websites appear that we want to block (YTPak as a YouTube substitute for instance) we can inform our blocking people (although I did find recently that I wasn’t able to access websites using the word ‘edge’ in the URL. This was an issue as I was trying to look at the knowledge organiser blogs and ‘knowledge’ was banned). We also use software for managing and monitoring what the pupils are using live. Teachers can view (and control) pupil laptops which is useful for both instances of inappropriate pupil activity and in-lesson sharing of work on the IWB. If pupils are using the computers inappropriately then we have reward/sanction systems that are used.
Clearly this is different to other types of screen use in the classroom, but I do have concerns that we, staff and pupils, can be over-reliant on the laptops. Whether that means a reduction in the amount of handwriting pupils do, ‘lazy’ internet research (we’ve all heard amusing tales of Wikipedia regurgitation), or the slightly more concerning impact on processing information described by Mueller and Oppenheimer (2014), looking at laptop vs written note-taking by university students. Our pupils don’t take a huge amount of notes in lessons, but if we over-rely on the laptops when they do, the chances are we’re denying them the opportunity to process the information in a meaningful way.
Even if we can’t do much about what they do at home, we have a lot of control over how much technology pupils use in the classroom. There are some great resources out there, and the deeper debate over this is perhaps for another day, but how much of school tech is driven by what staff quite fancy having a play around with over the genuine benefits in the classroom? It’s almost becoming a cliché to ask whether the 1:1 iPads are essential or whether you could do it another way and save thousands of pounds (seems old but I had this conversation a fortnight ago). In a desire for an easy ride, ‘doing something different’, squeezing in some of those elusive ’21st Century Skills’, is it actually more revolutionary to go without?
More robust research will hopefully lead to better guidelines, but we need to use our professional common sense as well. We’ll never be completely on top of it but we do have some control over our classrooms and probably just as well because with last month’s speculation that Apple are set to introduce a ‘cinema mode’ for iPhones, it seems like it soon won’t be single screen-use we’ll be discussing, but perhaps multi-screen use as well.
The other day I was going through some possible reading materials for upcoming journal clubs and came across this in this (pdf):
The next day I was asked why I’d even consider looking at a paper from a computer science conference as a journal club text. I didn’t see this as a negative thing and I replied saying it was a case study of iPad use in primary (which I assume satisfied, as it was followed by the customary ‘like’), but it does throw up interesting questions about the types of reading we should be looking at in education journal clubs. My own stance is that the reading/s are used as a stimulus for conversation – this can be everything from discussing the ins and outs of current research in detail, to debating a wider topic (in this case I was thinking it might be interesting to compare how iPad technology was first introduced with how it is used now), but I wonder how many people think we should only be looking at ‘perfect’, purely educational research?
The recent opening of Chartered College of Teaching membership, particularly with its free access to 2,000+ journals, has excited many on my timeline. I’ve got my own jealousy that I can’t join in with that part, but it seems to have worked to change a few people’s minds and soon after the announcement I saw tweets suggesting people are willing to join just for the access. I do have some misgivings about how useful journal access in itself will be, but I think (presume) there will be different benefits of membership for those who aren’t interested in journals, so I know it’s not all about that.
The way people use research in education is a recurring topic for debate and recently renewed. If teachers are thinking they’ll be able to search for papers that tell them ‘x is good, y is bad’, the chances are they won’t (and if they do then I think they should be cautious). I still believe that most people won’t have time to look for information in detail and if they do have time, wading through what’s out there can be hard and end up with cherry picking and sweeping assumptions. My choice of papers for journal clubs won’t always be a shining beacon of quality educational research or perfectly relatable to what we’re looking at in school (with or without access) but that’s an important part of the discussions we need to have.
I think the role of Research Lead now has an even greater chance to be pivotal in helping to translate research and point colleagues in the right direction. Journal club discussions can help with this of course and allow people to dip their toes in; but even for more rigorous investigation, knowledge translation is going to be important. I’ve delved into the world of Knowledge Mobilisation for various things recently and I’m convinced that there are exciting directions this can go in, whether that’s research summaries, brokering or bespoke investigations.
Increased access to research will be great for sharing original sources and following up ideas. It will be used by some for deep academic study and inevitably by others to try and find a quick fix, tick-the-research-informed-box activity. It’s a brilliant opportunity for teachers, but it’s also an opportunity to put guidance in place so that everyone can really make the most of it. I think it’s important we remember that just because something isn’t presented as ‘education research’, it doesn’t mean we can’t call it out for saying touching a screen with more than one finger is a ‘natural means of input’ and that this will motivate students – and recognising that something isn’t perfect is good for us too. In fact, I’d argue that’s exactly what we should be doing with our widening engagement with research.
For me, as I don’t suspect I’m going to have access to thousands of journals any time soon, I’ll just have to continue using the wealth of open and free access articles for starting discussions and helping focus ideas.
So one minute you’re planning which cheap package holiday to book for half term and the next you’re flippantly replying to a tweet about an educational research conference in Washington DC. ‘Do they want to know about Journal Clubs?’ you ask. ‘Keep going…’ comes the reply.
So Portugal turned into Washington DC, and a tour of a local church turned into a tour of the White House. There are a lot of people who have done a lot of unexpected things because of researchED, but walking round the White House is pretty epic even by rED standards. It came at the end of a fairly intense week as we decided we’d go to New York for a few days before DC and it essentially turned into 7 days straight of open-topped bus tours which is hard-core even for us. To be honest, one of the reasons it’s taken me so long to write about it is that I’ve not quite come down from the whirlwind. The downside of this is that now I’ve had more time to process it, I’ve got so much more to say.
Other people have written about some of the sessions I went to, like Kate Walsh’s and Ben Riley’s, and they’ve covered Dylan Wiliam’s brilliant keynote in varying amounts of detail, so I won’t repeat that. I did love that some of his themes were picked up throughout the day, though, and this made for a unifying thread among some challenging ideas.
My favourite session by far was Ruth Nield’s session on ‘Researcher-Practitioner Partnerships’. One of the reasons it’s taken me so long to write about #rEDWash is the amount of time I’ve spent looking through and getting sucked into her links – there’s some brilliant stuff out there! I went to this session partly because of a project that I’ve been working on with the School of Education at the University of Nottingham around collaborations between schools and researchers (‘What Matters’ – I’ll probably write about it another time) and I’m always up for new ideas, and also to hear more about the wider picture of research use in schools in America. The day before, we had been hosted for lunch by The Center for Transformative Teaching and Learning at St Andrew’s Episcopal School, and one of the questions they were asking us was about how to widen the scope of education research in American settings. I got the impression that there are lots of pockets of activity and it’s a question of how these can come together – perhaps easier with the smaller scale of the UK, but also harder given the more limited use of social media among practitioners in the US.
Turns out there’s quite a bit of stuff going on with the Institute of Education Sciences (IES) (apologies if I get muddled with this, I wrote my notes furiously and interpreting my special shorthand a week later is proving awkward). Ruth set out the different strands of work they are doing and it sounds amazing – if the College of Teaching wants research to be at its core then they could do worse than looking at what’s going on here; the US may be behind in regards to practitioner involvement with research, but it’s all there for the taking. Their work is independent and covers a range of practical approaches, much more than just RCTs.
She started by discussing the value of education research, the disconnect between schools and researchers (something we’re all familiar with) and how their researcher-practitioner partnerships (RPPs) are aiming to address this. Researchers and practitioners work together over time to co-construct agendas of work for mutual benefit. This allows them to work on research that is more relevant and hopefully more likely to inform practice; they are able to form long-term working relationships and both sides can develop professionally. To support these partnerships (which can be in cities, states, cross-state, cross-district) they have Regional Educational Laboratories (RELs) working with “Research Alliances” of education practitioners and policymakers. They identify areas of need and work together to analyse data and conduct research to develop and test strategies. The IES provides seed grants of around $400,000 to develop projects, for which they can go on to apply for further funding if required. Projects mentioned included one that created software to track progress and now has a national audience.
In addition to the RELs, the IES has the What Works Clearinghouse (how had I not seen this before?), which ‘reviews the existing research on different programs, products, practices, and policies in education’ and hosts intervention reports, reviews of individual studies and a series of practice guides that they are now seeing schools adapt and use for their setting. Tom Bennett came into the session towards the end and drew similarities to the EEF in the UK; this is so much more than that. It’s sort of like a cross between the EEF Toolkit and the literacy intervention review Tom Sherrington talked about at rED15, but it’s nicer to look at, with simple infographics (including a lovely representation/summary of the setting for each intervention). You can see effectiveness, improvement, and – my favourite bit – you can compare interventions. Something that really impressed me was that all their reports go through peer review before they are released (see picture for questions). Ruth was clear to point out that they aren’t just interested in the ‘gold standard’ of research – they publish a cross-section of work, summaries and of course RCTs.
On top of all this they have ERIC – the Education Resources Information Center. I did know about this one, and they’ve recently changed the website to be a bit easier to use. ERIC is a digital library of education research and other information – in their words, ‘ERIC’s mission is to provide a comprehensive, easy-to-use, searchable Internet-based bibliographic and full-text database of education research and information for educators, researchers, and the general public.’ How can we not be excited about this? Perfect for journal clubs too…
One of the issues the IES, and research engagement in US schools generally, seems to have is with ‘reach’, and I asked how they get their message out to schools. They use social media and professional associations, and each REL has a governing board with regional commissioning officers that work in their localities. I’m sure there could be more. I was already following one of the IES Twitter accounts and have since followed more, but when I look at how many followers these accounts have, or how many RTs/likes the posts get, they are nowhere near the numbers that similar UK accounts have. This is a bigger issue than the IES, of course; one of the things that stood out to me and others was the low number of classroom teachers at rEDWash in comparison to the UK events. I’m certain that if more teachers engaged with this work, the impact could be massive. I don’t know that I have any answers to the questions this throws up, but I’m sure researchED has a part to play.
There’s so much I’ve not written about yet, just from Ruth Nield’s session, and I’m at a ludicrous number of words already. I’ve not talked about my session, the people we met, the pub, or what’s next for me and research. This has been an amazing, crazy week and I’ve thought about researchED a lot – as always, it’s about keeping up the momentum and sharing ideas. I’ll have a think and write more about it soon.