July 2022
Julie Marsh, a professor of education policy at the Rossier School of Education at the University of Southern California, has served as co-editor of Educational Evaluation and Policy Analysis (EEPA) since 2019. Her term ends at the end of this year. The following is adapted from a recent Twitter thread posted by Marsh.
As I begin to transition off the editorial team for EEPA, I’ve been reflecting on the reviews I’ve seen and the decisions I’ve made over the past four years. (My focus was on qualitative and mixed methods research, but I think these reflections apply to quantitative research as well.) Based on my experience, I offer 10 tips.
By far the most common and consequential reviewer comments focus on these first two:
#1 Clearly articulate your purpose and the main argument/problem that you are addressing.
#2 Clearly articulate the contributions/significance of your paper—the “so what?”
It’s not enough to fill a gap. Tell us why filling it is important for policy, practice, or the field and what motivated the paper and research questions. Why should we care? Why does it matter? Build these arguments into the introduction, literature review, and conclusion—as well as the abstract! Clarifying #1 and #2 can also help with focus. Sometimes you may be taking on too much in one paper.
#3 Have someone outside of the research team review the abstract and introduction.
This might help you with #1 and #2. (Credit for this idea goes to Randy Reback at Barnard College, who suggested it on an editor panel I recently participated in.)
#4 Align your literature review with the phenomena of study.
This sounds easy, but sometimes people review literature that is useful for setting up the paper yet does little to frame what was actually studied. For example, if you are studying the implementation of something, the review should not focus on outcomes. Also, remember the purpose of the lit review: to summarize what is known about the phenomena of study, what is not known, and how this literature informs your analysis or provides a foundation on which you build.
#5 Align the piece to the journal and its audience.
For EEPA this means ensuring that findings have policy relevance and that you end with policy implications (along with implications for future research, perhaps practice, and/or theory). And please don’t forget the implications!
#6 Add more details on methods.
For a survey, don’t just tell us the N. Tell us the sampling procedure, response rate, weighting, and how the sample compares to the population. If you coded data, tell us who did the coding, how codes were determined, and the interrater reliability. Justify your design and choices.
#7 Include the evidence.
Don’t tell us, show us. When making assertions, ask yourself if you’ve provided the empirical evidence to back them up.
#8 Define and integrate the theory and concepts.
If you are using a framework/theory/set of concepts, clearly define them at the start, and be sure you tie them to the original sources (not to the folks who applied the foundational scholars’ ideas). Then return to these ideas throughout.
#9 For long findings sections, consider using figures/tables to summarize the big ideas at the start or end.
#10 Proofread before submission.
Multiple typos throughout a manuscript can annoy reviewers, who may feel their time is being wasted on a less-than-final draft.
This list is in no way complete and is not endorsed by the journal I co-edit. It’s based on my own experience and reflections. I welcome others to share their ideas and suggestions on my original Twitter post.