
What if there is no reading research on an issue?


How do we 'follow the research' if there isn't any on a particular topic or practice? Tim Shanahan shares his thoughts and some practical advice for monitoring and using research evidence.

By Tim Shanahan


Teacher question:

I agree with you about the need for basing what we do on research. But what do you do for the things for which there is no or limited research? For example, what about Orton-Gillingham instruction, what is the best way to sequence phonemes for teaching, or how specifically should background knowledge be taught? What about research that is evolving so that we do things a certain way and then refine these (say with Ehri and Gonzalez-Frey’s recent work in SSR) – what about all the time that we did the practice the other way? There are some topics with so much research that we can’t digest all of it, and other topics with no research or with ambiguous results. How do we follow the research?

Shanahan responds:

Yes, I’m a proponent of using research to make instructional decisions. Let’s start with that.

First, I want to make good decisions for kids. I seek practices that have unambiguously helped them to learn to read better. I can put more trust in an instructional practice found to be effective again and again under close analysis. If those other educators could make that work, I could too. That’s better than buying what the district next door bought!

Second, I want to be able to act without everything being a big megillah. Reading is a contentious field, and our crazy arguments rightfully cause parents to worry about whether we are making the best choices for their kids. Physicians and engineers don’t always get it right, but they have methods for determining acceptable practice. In reading, the serve often goes to the loudest, kids’ literacy learning be damned. Consistent standards of evidence make educational decision-making more professional – fostering confidence rather than disgust and despair.

An argument against a research-based approach is that it supposedly undermines teacher authority. Yep, there are some who believe teachers should make all classroom decisions (e.g., Diane Ravitch). That includes the idea that the best education comes from teachers who shrug off the curriculum and author all their own lessons. Think of Robin Williams (Dead Poets Society) encouraging kids to tear up the school’s poetry anthology; now that’s inspired pedagogy.

Your question lets the air out of that teacher-as-inspired-genius complaint.

The fact is, as with physicians, no matter how explicit or thorough any research-based standards of practice might be, there will always be plenty of consequential decisions that teachers must base on their own judgment and experience. As standards of practice in medicine have become more certain through empirical study, physician decision-making has actually increased in significance.

I have no problem with those who improvise when there is no sound research to go on; what else can we do? But I rage at states, districts and schools that mandate an improvisation as if guessing at scale ensures success.

Variations in practices can help us to determine which choices are best – as long as we’re aware that we’re improvising and pay attention. What kills me is that so often authorities in their fervour to advance an approach (or to defend a wobbly decision) claim it to be research-based, when it was really more a child of logic, a hunch, or susceptibility to a really great sales pitch.

I lose patience with those ‘thought leaders’ who proffer their darling approach under the guise of research. These days that happens a lot. There is a ton of research showing the benefits of explicit phonics instruction. When someone is arguing that phonics is beneficial, and they cite research studies and government reports, I’m on board. But once they’ve made that argument and have convinced an audience that systematic daily instruction in decoding in grades K–2 is the way to go, they don’t know when to stop. They keep going without any acknowledgement that the claims that follow lack the same evidential pedigree … with assertions in which they may sincerely believe, but about which they should be confessing a lack of certainty: the value of tracing in the teaching of decoding skills, advanced phonemic awareness instruction, decodable text, the most effective sequencing of skills, sound walls and so on.

The same nonsense accompanies nostrums for reading comprehension or fluency – substantial research evidence supporting a basic premise gets allied with specific practical recommendations that have a decided lack of convincing or relevant research support (e.g., extensive comprehension strategy teaching, front-loading of background information about a text prior to reading, thematic units, weekly fluency tests, individual conferencing and so on). Discerning readers may look at that parenthetical list and protest, “Isn’t there research on reading strategies or background knowledge?” There is, of course, but not research that shows how much strategy teaching is beneficial or whether providing background knowledge has anything but transitory effects. It certainly improves comprehension of a specific text, but we have no idea what that means for students’ reading ability in the long run.

There is nothing wrong with making any of these claims – as long as they are proposed along with an open admission that there is no proof that they work. Lack of evidence doesn’t mean something doesn’t work, only that we don’t know. That admission is important because we can only respond professionally if we know when something has worked consistently in the past and when it is just somebody’s hunch.

Too often I hear from teachers and principals distraught over the local ineffectiveness of an approach that they’d been led to believe was research-based. They are often told that the failure is due to their shoddy implementation. That happens, of course, but I’m more likely to buy that charge if the practice has consistently worked elsewhere in the past. If there is no rigorous evidence that the practice has ever worked then maybe the fault is neither in us nor in the stars.

Basically, if there is no research on a particular practice – feel free to adopt it but keep a close eye on it and be ready to adjust accordingly.

As for keeping up with the research? No one can read the 1000+ relevant research studies published each year. Even if we could, it would not be a good idea to adopt those results into practice immediately. Most studies in education tend to be small, and single studies are rarely determinative. It is wisest to limit data-based decision-making to topics on which sufficient data have accumulated to justify pedagogical action – responding to every new study as it is published would have you changing your policies every 27 minutes. We use research to increase the certainty we can invest in our actions, not for the sake of novelty.

Practical advice on how to monitor and use research evidence?

  1. Monitor some of the better research journals just to see what topics they are addressing. Some of the best journals to watch for reading research include Journal of Educational Psychology; Reading Research Quarterly; Reading & Writing Quarterly; Review of Educational Research; Scientific Studies of Reading. These aren’t the only journals that publish high-quality reading research, but they’re among the most rigorously reviewed and widely cited by scholars in the field.

  2. Pay particular attention to research reviews and meta-analyses that synthesise bodies of research. The benefit of that approach is that you get the combined power of an entire collection of studies rather than one particular study; that should reveal both the average outcome and the variation in results that has been obtained. Effective approaches may vary in how often they pay off.

  3. When you read research, make sure you understand what the researchers were studying (and what they weren’t). As noted earlier, a lot of comprehension research examines how we can facilitate comprehension of a particular text. That is not unimportant theoretically. However, it isn’t the same thing as finding that an approach helps kids to read better independently.

  4. There are many kinds of research, all of them potentially valuable. If your goal is to determine what to teach or how to teach something, then you need to depend upon evidence that shows whether a practice can benefit learners. Focus on instructional research – studies that consider the impact of teaching. Of course, there are other kinds of research that may be provocative (that study with the cool multicolour fMRI pictures, for instance), but as interesting as such research may be, it usually has little value for prescribing effective teaching practice.

  5. When there is no research? Get professionals together and think it through. Whatever courses of action you agree upon, make sure folks understand the reasoning (rather than the evidence) behind the choice. That makes it easier to change course up the road if things don’t pan out. If you can’t agree on a course of action, perhaps set up your own local study to see if it even matters. If it doesn’t, let teachers and principals improvise.

  6. Finally, just because research says something is advantageous doesn’t mean it will work for you. If you rely on meta-analyses to set a policy or practice direction, I’d suggest going back and reading some of the individual studies included in the meta-analysis. I do that to determine whether the approach worked in situations like mine and to get clues about proper implementation (“Gee, the successful programs provided 18 hours of training for each teacher, and I didn’t budget for any of that, yikes”). Knowing those specific articles can have another pay-off as well. Sometimes the researchers may publish a practice-oriented version in a journal like The Reading Teacher: the research article showing that the approach works and the practice article detailing what it actually involved.

This article appeared in the Dec 2021 edition of Nomanis.

This article originally appeared on the author’s blog, Shanahan on Literacy.

Timothy Shanahan (@ReadingShanahan on Twitter) is Distinguished Professor Emeritus at the University of Illinois at Chicago and was formerly Director of Reading for the Chicago Public Schools, and president of the International Literacy Association. He is a former first-grade teacher and is a parent and grandparent. His website www.shanahanonliteracy.com is popular with parents and teachers.
