Q&A: Chalkbeat’s Matt Barnum Discusses What He Looks for When Reporting on Education Research

November 2017

The following Q&A is one in an occasional series of conversations with policy and opinion leaders with an interest in and commitment to education research.

Matt Barnum is Chalkbeat’s national reporter, covering education policy, politics, and research. Previously he was a staff writer at The 74 and a middle school language arts teacher in Colorado. He can be reached at mbarnum@chalkbeat.org.

Q. What factors do you consider when determining whether a new research study is worth covering? 

A. I consider a number of factors, but I would say the two main questions are: (1) whether the research appears to be of high quality — that is, do the methods support the findings and the claims? and (2) whether it's on a topic that would be of interest to readers.

On the second point, I'm certainly interested in issues that are often in the news, and that might be of pressing interest to policymakers. At the same time, I sometimes find a study interesting because it's a topic that few people are discussing at the moment — I see my role as trying to convince readers that the issue is worth reading about.

I also try to look for studies that might not get covered otherwise. I get a lot of research sent through public relations people and think tanks — which is great and always appreciated — but I also keep my eyes peeled for studies published in academic journals that the public hasn't heard of. Those studies are often on a topic of wide interest, but may not have anyone to help get the word out. A good example is a recent study I wrote up on the effects of unionization on charter school performance — I only saw it because I subscribed to email alerts from the journal in which it appeared.

Q. How do you decide if research is of high enough quality to cover?

A. I am certainly no expert in research methods and statistics. But I try to stay reasonably well informed on different research methodologies and their strengths and weaknesses. I also try to be vigilant for when a study makes claims beyond what its data can support; when that happens, it's a red flag.

At the same time, I know my judgment is far from perfect. When I'm considering covering a study, I often talk to the researcher at length about its methodology and findings — that's really helpful. (And I especially appreciate the many researchers who are generous and patient in explaining complex findings!) I also try to talk to external researchers who are experts in the field being studied, and get their take on the study I'm writing about.

Q. What areas of current education research are ripe for expanded coverage? 

A. The issue of charter schools has certainly gotten a lot of coverage, including by me. But a lot of the research has focused on a narrow question: how charter schools perform on standardized tests relative to other schools. That's important, but I think there are other issues related to charter schools that don't always get enough attention — from either journalists or researchers.

I also would love to see more coverage on using research to inform the implementation of ESSA. We reporters (understandably) often focus on problems after they happen; but I'd also like to see more coverage on what we know from research about designing school ratings systems and how states might avoid some of the widely acknowledged pitfalls of No Child Left Behind. Journalism shouldn't always be a policy post-mortem.

Finally, I think the issue of teacher evaluation could use more coverage right now. There's a lot less policy energy on this issue than there was during the Obama administration, but the impacts of evaluation are still felt in schools, many of which remain under evaluation laws passed due to Race to the Top. There is also continued high-quality research on the issue.

Q. What education issues are especially newsworthy right now? Which ones do you see emerging over the next six months or year? What questions related to those issues lend themselves to education research? 

A. I think the issue of charters and private school vouchers will continue to be newsworthy at all levels — federal, state, and local. ESSA implementation is crucial and it will be really interesting to see states move from planning to implementing. I think school curriculum is having a moment, with many people starting to say that it has not gotten enough attention among policymakers, and the Gates Foundation planning to invest more in this issue.

Education researchers can help inform policies on each of these issues. The points I raised earlier on charters and ESSA apply. On ESSA, I am sure many researchers are excited to look into variation in how states implement the law, and what that means for students. On curriculum, I think there's an emerging body of research showing that curriculum matters, but less sense of what makes a good curriculum and how school districts can make informed choices.

Generally, I would love to see more work that combines quantitative and qualitative research — which sometimes feel siloed. 

Q. What advice do you have for researchers and other science communicators to make research findings more accessible to reporters and other non-researchers?

A. Traditional research studies can be really dense and confusing to read, and the findings can be opaque. I really appreciate the growing breed of university-based institutes that are prioritizing making research accessible to the public and reporters — without sacrificing quality or academic rigor. The Education Research Alliance at Tulane and the University of Chicago Consortium on School Research jump to mind in this regard (and I know there are others). One terrific practice that some already follow is releasing easy-to-understand, public-facing briefs alongside full technical reports. The briefs can summarize the main findings and include accessible graphs and charts.

One other thing researchers and reporters can improve on is explaining effect sizes to readers. In most studies I read, effect sizes are based on standard deviations — but that means nothing to most of my readers. I know some have made efforts to translate standard deviations into days of learning, though some scholars are skeptical of this approach (and so am I). Figuring out a good way to explain the size of measured impacts to a lay audience is really important.

Q. What are some best practices researchers and science communicators should consider when pitching research?

A. The best pitches highlight the main points in new or forthcoming education research, and explain why it might be relevant for policy. Having a good lead time before a study is publicly released is also nice. It's pretty straightforward, and I always appreciate when researchers share their studies with me (so long as they understand that I can only cover a fraction of the interesting studies out there!).

The worst type of pitch I get is one where the person obviously does not even know what topic I cover. This happens remarkably often!
