International Parkinson and Movement Disorder Society

What are journal editors looking for in a good review?

April 04, 2022
Episode: 58
Series: Peer Reviewing
This is the third episode of the special series focused on the peer review process. In this episode, Dr. Alana Kirby speaks with Dr. Jon Stoessl about what journal editors are looking for in a good review. He provides practical tips for writing an effective peer review.

[00:00:00] Alana Kirby:
Hello and welcome to the MDS podcast, the podcast channel of the International Parkinson and Movement Disorder Society. I am Dr. Alana Kirby. I'm an assistant professor at Rush University Medical Center in Chicago, Illinois. I'm also the co-chair of the Peer Review Mentoring and Educational Program, along with my colleagues, Dr. Di Luca and Dr. Persad.

Today, we'll continue our series focused on the peer review process, in which we explore common questions faced by young MDS members as they embark on reviewing papers. For the third episode in our series, it is my pleasure to introduce Dr. Jon Stoessl.

Dr. Stoessl is a professor and head of the department of neurology at the University of British Columbia in Vancouver, Canada. He currently serves as the editor in chief of the Movement Disorders journal. He has been on the editorial board of seven prestigious journals and serves as a reviewer for more than 30 journals. He has more than 18,000 citations, has authored 275 journal articles, and more than 60 book chapters and invited reviews.

Dr. Stoessl, thank you for joining me.
 


[00:01:10] Jon Stoessl:
Hi, there!
 

[00:01:11] Alana Kirby:
What we want to talk about today is what journal editors are looking for in a good review. If you were to list three points that you think of as being a good review, what would they be?
 

[00:01:23] Jon Stoessl:
Well, I think the main three points would likely be common between most editors. The first is the significance to the field in general— the perceived significance to the field. Second would be novelty. And third would be the robustness, validity, methodological soundness, et cetera.

And then I guess as a fourth point, whether it's appropriate for the journal. Or perhaps more importantly, if the reviewer thinks this is great science, but it's not really appropriate for this journal.
 

[00:01:57] Alana Kirby:
How much would you expect reviewers to back up their assertions in their written review?
 

[00:02:04] Jon Stoessl:
Well the more, the better. So in terms of laying out the review, personally, what I find helpful is a very brief capsule summary of what the authors actually did, bearing in mind that the editors have already read the paper. And then followed by an overall statement on significance. And that, I think can be very helpful, particularly if the editors themselves are people who don't necessarily work in that subfield.

And then a statement about, "This is a really significant problem because ..." Or, "It's an old problem, we've heard about it a hundred times, but this paper actually helps resolve an unresolved issue." Or it's just adding one more brick to the wall. It's helpful to have some sense of that.

Now that perception may obviously differ from one reviewer to another, and we take the final responsibility for deciding on that. But I think it is really useful to have that perception.

Personally, I try to start my own reviews that way, with why I think the paper is important or not. Some people go through the paper in order: abstract, introduction, methods, results. That's okay, but I find it less helpful because it's, by definition, not going to be as prioritized. I'd like to know the big picture right away, then go on to a discussion of major strengths, in the hopes that there are some. And then fatal flaws, if there are any.

You can always get into the nitty gritty of more minor technical issues. And that can still be very helpful. But the truth is what you really want to know up front is: Is this important? Why is it important, or you know, it would have been important except there's something so flagrant that you really can't draw these conclusions. Anything beyond that is detail— sometimes can be important details.

I think we do want to see an honest and constructive listing of weaknesses. But I would also differentiate between weaknesses that are important in an effort to improve the paper. In an ideal world, whether we take the paper or we don't, we would like to see it improved. And obviously if we take it, we want it to be in the best possible shape. But even if it ends up going to another journal, sometimes reviewer comments can be very helpful in helping the authors make it a better paper. So I would differentiate between things that absolutely have to be done to improve the paper or to make it a valid observation, versus those things that would be nice.

And then at the end, I would put minor issues. That's where it's appropriate to say, well, you didn't cite this paper, or there's a spelling mistake, or the figure legends are hard to read. Those can be important things for the end product, but they're really not the critical issue for deciding whether or not a paper should be published.

There are some things that probably should not go into a review. Obviously just destructive comments, willfully destructive comments, have no place in a review. We will actually edit those out if we see them. And we may flag the reviewer as somebody who won't be invited again. But that's a relatively uncommon thing.

What is common, and really shouldn't appear in the body of the review, is a recommendation on whether or not to publish. We obviously listen to the reviewers' recommendations, but that should go in our confidential section, because it could end up being confusing to authors.

We may get reviews that cover the entire span of possible recommendations. That obviously makes the job a little bit more difficult. But if one reviewer says, "This is brilliant and must be published," and then we send back a decision based on other considerations that says, "Sorry, you didn't make the cut," that's going to be really hard for authors to digest.

I would also say it's important to use the confidential comments. I think it gives you a little bit more freedom as a reviewer to state what you really think. So if you're working hard to create a constructive review in the comments that will be returned to the author, it's still important to be honest and balanced in that review so that the author understands why the decision was made. But you know, if you really think this paper should not see the light of day, in the confidential comments, you can say, "I've reviewed a hundred papers on this topic. This is the worst one I've ever seen." Or, "I think it's sloppy work. I don't think they could have actually done this the way they say." That kind of thing. Fortunately, those types of comments are pretty uncommon, but it can be very helpful to know them.
 

[00:07:22] Alana Kirby:
It sounds like what the reviewer brings to the table, ideally, would be: a knowledge of the technical aspects that allow them to judge the quality of the research; and knowledge of the overall state of the field that allows them to summarize this finding's place in the greater scientific literature; and a sense of general perspective, which I think we all want to have, about what is important and what is not.

And focusing on those things really does help the editors to understand how this paper should be treated. But it also helps make the work better, to make the writing better.
 

[00:08:03] Jon Stoessl:
I agree with all of that, just with the caveat that, if you have somebody who's a real content expert and methodology expert, they may by definition think that any paper in that field is important because they've devoted their life to it. But nonetheless, their methodological knowledge may be incredibly helpful. Particularly if they see a fatal flaw, or have particular recommendations.

Somebody else who's got a broader perspective on the field and may not necessarily know the methodology — that may be valuable as well. And then of course, ideally you'd like a little bit of everything.


[00:08:44] Alana Kirby:
This is where the role of the editors is very important in selecting reviewers who may play complementary roles in evaluating the paper overall. As an individual reviewer, you may not necessarily know where you fit in that greater scheme. So the safest thing is just to try and cover all of the points.
 

[00:09:03] Jon Stoessl:
If it's a difficult topic, I may write to a reviewer and say, "I know that this aspect of the paper is not in your area, but I'm really asking you to look at such and such, and I've already requested reviews from somebody who's got the content expertise on some other aspect of the paper."
 

[00:09:22] Alana Kirby:
For listeners who are still early in their scientific careers and may have only reviewed a few articles, do you have any advice for how they can make their reviews as useful to you as possible?
 

[00:09:37] Jon Stoessl:
Well, first of all, I would say it's not only useful to me, but useful to them. I do think they should get something out of the review process. Review the papers that you find interesting. If you find the topic completely uninteresting, it may tell us that nobody likes this.

But also to think about, what would be helpful to you in a review if it's your paper, and what you would want back. We all get used to rejection and while it's always disappointing, you can learn to move forward from the rejection. Particularly if you have some constructive comments. It makes you rethink being very critical of your own work, or sometimes to move on to something else.

You learn a lot by reviewing and by handling papers. Not only do you learn about the field in general, so you pick up things that you may not have known, but also, I think it helps you create better papers and better grants when you're doing that all the time, and also when you see what other reviewers have to say.

If I'm a reviewer, I read what the other reviewers have to say quite carefully, because I'd like to know if I've missed something myself.

This is, by definition, an imperfect process. And so we have to have some humility in appreciating that. If there's anybody who has not had a paper rejected, it's because they haven't written one! Or they haven't written enough. So it happens to absolutely everybody. No matter how senior they are, they will still get papers rejected.
 

[00:11:23] Alana Kirby:
Well, thank you so much for talking with me today.
 

[00:11:27] Jon Stoessl:
Thanks for the opportunity.
 

[00:11:28] Alana Kirby:
That completes episode three of the peer review podcast. In this episode, we learned what journal editors are looking for in a peer review and got some tips from Dr. Stoessl on how to write an effective review. In our next few episodes, we'll be focusing on how to review specific types of articles.

Next week, we will discuss how to review a clinical case report or case series. Thank you so much for listening.

Special thank you to: Dr. Jon Stoessl
Host(s):
Dr. Alana Kirby 
