
Does Peer Review Promote Mediocrity?

Jack Hays

Does peer review promote mediocrity? A new PNAS paper suggests that may be the case.

Tracking retractions as a window into the scientific process

Peer review isn’t good at “dealing with exceptional or unconventional submissions,” says study
One of the complaints about peer review — a widely used but poorly studied process — is that it tends to reward papers that push science forward incrementally, but isn’t very good at identifying paradigm-shifting work. Put another way, peer review rewards mediocrity at the expense of breakthroughs.

A new paper in the Proceedings of the National Academy of Sciences (PNAS) by Kyle Siler, Kirby Lee, and Lisa Bero provides some support for that idea.
Here’s the abstract:
 
So, where is the exceptional work we missed out on? What wasn't published and was significant.

Exceptional work will publish itself, PR or otherwise.
 
So, where is the exceptional work we missed out on? What wasn't published and was significant.

Exceptional work will publish itself, PR or otherwise.


From the linked abstract:

Of the 808 eventually published articles in our dataset, our three focal journals rejected many highly cited manuscripts, including the 14 most popular; roughly the top 2 percent. Of those 14 articles, 12 were desk-rejected. This finding raises concerns regarding whether peer review is ill-suited to recognize and gestate the most impactful ideas and research.
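(A quick check on that figure: 14 / 808 ≈ 0.017, i.e. about 1.7 percent, which squares with the abstract's "roughly the top 2 percent.")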
 
Take it up with the National Academy of Sciences. They published the paper. They don't say peer review is a scam; they say it may encourage small thinking.

we've already thoroughly discussed peer review, Jack. reread the thread.
 
we've already thoroughly discussed peer review, Jack. reread the thread.

The discussion in this thread is an entirely different perspective, and if PNAS thinks it's worth bringing to people's attention then why shouldn't we discuss it?
 
From the linked abstract:

Of the 808 eventually published articles in our dataset, our three focal journals rejected many highly cited manuscripts, including the 14 most popular; roughly the top 2 percent. Of those 14 articles, 12 were desk-rejected. This finding raises concerns regarding whether peer review is ill-suited to recognize and gestate the most impactful ideas and research.

If the articles are cited, they were PR published by someone.
 
The discussion in this thread is an entirely different perspective, and if PNAS thinks it's worth bringing to people's attention then why shouldn't we discuss it?

You're a fast reader, Jack. Did you read the part where we talked about my multi-decade career doing peer reviewed research and that your experience with peer review mostly just involves posting politically charged threads on internet message boards?
 
If the articles are cited, they were PR published by someone.

I think that's a fair assumption, but beside the point. If what the study found in its limited sample is representative of better journals as a whole then it is more difficult for the most impactful work to be published. Not impossible, perhaps, but more difficult, and that raises the uncomfortable possibility that some might not be published at all.
 
You're a fast reader, Jack. Did you read the part where we talked about my multi-decade career doing peer reviewed research and that your experience with peer review mostly just involves posting politically charged threads on internet message boards?

And how is a PNAS article linked in a site supported by the MacArthur Foundation in any way politically charged? I would have thought you of all people would see the value in this discussion.
 
Does peer review promote mediocrity? A new PNAS paper suggests that may be the case.


I would argue it is all about a counterbalance. Peer review is important to the process, but in concept it is not designed to diminish theory or that process. I can see where it could in the right conditions, but the absence of peer review could lead to even larger complications than the articles suggest.
 
I would argue it is all about a counterbalance. Peer review is important to the process, but in concept it is not designed to diminish theory or that process. I can see where it could in the right conditions, but the absence of peer review could lead to even larger complications than the articles suggest.


No one is advocating the abandonment of peer review.
 
And how is a PNAS article linked in a site supported by the MacArthur Foundation in any way politically charged? I would have thought you of all people would see the value in this discussion.

because your main interest in peer review is that you want to push the right wing view of global warming, an issue that i don't care much about, as both sides treat it with zealotry. my interest in the peer review process is because i make my living in this field, and i understand the process in ways that you don't and won't. i have experienced how difficult it is to get a peer reviewed study published in a legitimate journal, and have spent many hours doing suggested experiments to get papers published. the process is thorough, and it works.
 
because your main interest in peer review is that you want to push the right wing view of global warming, an issue that i don't care much about, as both sides treat it with zealotry. my interest in the peer review process is because i make my living in this field, and i understand the process in ways that you don't and won't. i have experienced how difficult it is to get a peer reviewed study published in a legitimate journal, and have spent many hours doing suggested experiments to get papers published. the process is thorough, and it works.

This thread has nothing to do with global warming. Nor does it have anything to do with your personal scientific background. I found it interesting and posted it because of my interest, going back to graduate school 40 years ago, in the history of science. Unfortunately, your assumption of bad faith and your wholly unwarranted personal attack on me are spoiling a thread that I hoped (perhaps naively) would be another step in the bridge-building I thought we had begun lately. Oh well. Live and learn.
 
No one is advocating the abandonment of peer review.

Not quite what I was saying. The point is the process has to exist or else we will have complications; the counterbalance works as a source of process integrity. If the process is stifling some idea, I would question the idea before going after the process.
 
Not quite what I was saying. The point is the process has to exist or else we will have complications; the counterbalance works as a source of process integrity. If the process is stifling some idea, I would question the idea before going after the process.

That's fair, but I think the authors' point is that your approach will protect research quality in general but at the expense of stifling the rare sudden advance.
 
This thread has nothing to do with global warming. Nor does it have anything to do with your personal scientific background. I found it interesting and posted it because of my interest, going back to graduate school 40 years ago, in the history of science. Unfortunately, your assumption of bad faith and your wholly unwarranted personal attack on me are spoiling a thread that I hoped (perhaps naively) would be another step in the bridge-building I thought we had begun lately. Oh well. Live and learn.

please, Jack. you know exactly why you post these threads, and my experience working in peer reviewed research is absolutely relevant. i generally don't call you on it, but when you attack peer review and demonstrate your incomplete knowledge about the rigor of the process, i'm going to point that out.
 
please, Jack. you know exactly why you post these threads, and my experience working in peer reviewed research is absolutely relevant. i generally don't call you on it, but when you attack peer review and demonstrate your incomplete knowledge about the rigor of the process, i'm going to point that out.


I am not attacking peer review. Do you think PNAS is attacking peer review? The only impediment to a good discussion is your prejudice.
 
That's fair, but I think the authors' point is that your approach will protect research quality in general but at the expense of stifling the rare sudden advance.

I would offer that we consider the idea of peer review risking mediocrity as applied to the various sciences.

Take your basic sciences (the Biology and Chemistry areas, perhaps a few others) that lead to discoveries in medicine. I am selecting these areas to make a point. I would argue that profit motive and purposed direction do more to cause mediocrity than the peer review process does. Without peer review, on the other hand, we may see more problems with results from new medicines, even more than we see today because of profit or purposed motives. Take that and apply it to other areas advanced by scientific discovery and we have more reason to conclude that other factors may be causing the mediocrity that the article suggests we risk with peer review.

Those rare but sudden advances would then have an even greater need for peer review to confirm the process and conclusion of that discovery, even if it was an accidental discovery outside the intention or direction of the project. Very generally speaking, that is how the greatest advancements in science tend to be found, and it begs for someone to verify or reproduce the results.

On the complete other side of the fence, take Theoretical Physics and the Quantum Sciences, especially when the ideas being looked at involve more mental process or exercise: something that pushes the limits of the scientific process to verify a theory and, as a result, puts a strain on peer review to verify process and result. There I can see the idea that peer review would cause an element of mediocrity, as in these fields we tend to push science into areas of belief systems. Think about this in terms of the Materialism debate and how peer review would be looked at in that context.
 
I would offer that we consider the idea of peer review risking mediocrity as applied to the various sciences.

I can't speak for the authors but I suspect they would regard "profit motive and purposed direction" as factors external to their discussion. Nor do they claim that peer review is the only possible promoter of mediocrity.
 
please, Jack. you know exactly why you post these threads, and my experience working in peer reviewed research is absolutely relevant. i generally don't call you on it, but when you attack peer review and demonstrate your incomplete knowledge about the rigor of the process, i'm going to point that out.

Nature has now joined the discussion. Are they attacking peer review too?

Nature admits peer review filters out controversial “champion” papers
How to separate creative genius from creative mistakes? Not with peer-review. It is a consensus filter.
Classical peer review is a form of scientific gatekeeping (it’s good to see that term recognized in official literature). Unpaid anonymous peer review is useful at filtering out some low quality papers, but it is also effective at blocking the controversial ones which later go on to be accepted elsewhere and become cited many times, the paradigm changers.
And the more controversial the topic, presumably, the worse the bias is. What chance would anyone have of getting published if, hypothetically, they found a consequential mathematical error underlying the theory of man-made global warming? Which editors would be brave enough to even send it out for review and risk being called a “denier”? Humans are gregarious social beings, and being in with the herd affects your financial rewards, as well as your social standing. Even high ranking science journal editors are afraid of being called names.
Mark Peplow discusses a new PNAS paper in Nature:
Using subsequent citations as a proxy for quality, the team found that the journals were good at weeding out dross and publishing solid research. But they failed — quite spectacularly — to pick up the papers that went on to garner the most citations.
“The shocking thing to me was that the top 14 papers had all been rejected, one of them twice,” says Kyle Siler, a sociologist at the University of Toronto in Canada, who led the study. The work was published on 22 December in the Proceedings of the National Academy of Sciences.
There is no formalized sure-fire system to find and reward the creative genius needed for the big leaps in science. Their work must be impeccably logical, but it is an art to cut through human biases to recognise that genius. And art cannot be mandated or controlled. We should never place much confidence in a formalized process, especially one that’s unpaid and anonymous, to spot the papers that will be the most cited 50 years from now.
But the team also found that 772 of the manuscripts were ‘desk rejected’ by at least one of the journals — meaning they were not even sent out for peer review — and that 12 out of the 15 most-cited papers suffered this fate. “This raises the question: are they scared of unconventional research?” says Siler. Given the time and resources involved in peer review, he suggests, top journals that accept just a small percentage of the papers they receive can afford to be risk averse.
For the record:
Siler and his team tapped into a database of manuscripts and reviewer reports held by the University of California, San Francisco, that had been used in previous studies of the peer-review process.
Anyone who thinks “peer review” is somehow part of the scientific method does not know what science is.
h/t to the brilliant Matthew.
REFERENCES
Peplow, M. (2014) Peer review — reviewed: top medical journals filter out poor papers but often reject future citation champions. Nature, doi:10.1038/nature.2014.16629. [Discussion of Siler et al.]
Siler, K., Lee, K. & Bero, L. (2014) Measuring the effectiveness of scientific gatekeeping. Proc. Natl Acad. Sci. USA.
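For anyone who wants to see concretely the kind of question the study asks, here is a minimal sketch in Python. The field names and the toy data are invented for illustration; this is not the authors' code, only the shape of the check described above (of the most-cited manuscripts, how many were desk-rejected?).

[code]
# Illustrative sketch only: invented field names and toy data, not the
# Siler et al. analysis. It mirrors the question described above:
# of the most-cited manuscripts, how many were desk-rejected?

def desk_rejected_among_top(manuscripts, top_n):
    """Return the desk-rejected entries among the top_n most-cited manuscripts."""
    ranked = sorted(manuscripts, key=lambda m: m["citations"], reverse=True)
    return [m for m in ranked[:top_n] if m["desk_rejected"]]

if __name__ == "__main__":
    # Toy data standing in for a real manuscript/citation database.
    sample = [
        {"title": "Paper A", "citations": 520, "desk_rejected": True},
        {"title": "Paper B", "citations": 310, "desk_rejected": True},
        {"title": "Paper C", "citations": 150, "desk_rejected": False},
        {"title": "Paper D", "citations": 45, "desk_rejected": False},
    ]
    hits = desk_rejected_among_top(sample, top_n=3)
    print(f"{len(hits)} of the top 3 most-cited papers were desk-rejected")
[/code]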
 
Nature has now joined the discussion. Are they attacking peer review too?


no, Nature is trying to make peer review better. you're attacking peer review, because you think that it helps to prop up your climate change position. and as someone who understands the process firsthand, i intend to keep calling you on it.
 
no, Nature is trying to make peer review better. you're attacking peer review, because you think that it helps to prop up your climate change position. and as someone who understands the process firsthand, i intend to keep calling you on it.

I lack the standing to claim a contribution to improving peer review, so I content myself with facilitating discussion among those who, perhaps, can. The rest is just prejudice that you and you alone bring to the forum. You're not "calling" me on anything; you're just parading blind prejudice.
 
So, where is the exceptional work we missed out on? What wasn't published and was significant.

Exceptional work will publish itself, PR or otherwise.

There is some truth to it. The point being string theory vs. M-theory.
The man who actually discovered M-theory well beforehand was simply dismissed as nuts by the peer review process.

However, years and years later they came to replace string theory with M-theory, and they added another dimension to make M-theory.
 