snopes.com  

snopes.com > Urban Legends > NFBSK
  #1  
Old 08 September 2018, 12:59 AM
Steve
 
Join Date: 19 October 2002
Location: Charleston, SC
Posts: 4,879
Victorian-Era Orgasms and the Crisis of Peer Review

It’s among the most delectably scandalous stories in the history of medicine: At the height of the Victorian era, doctors regularly treated their female patients by stimulating them to orgasm. This mass treatment—a cure for the now-defunct medical condition of “hysteria”—was made possible by a new technology: the vibrator.

There is absolutely no evidence that Victorian doctors used vibrators to stimulate orgasm in women as a medical technique, asserts the paper, written by two historians at Georgia Tech. “Manual massage of female genitals,” they write, “was never a routine medical treatment for hysteria.”


https://www.theatlantic.com/health/a...octors/569446/
  #2  
Old 08 September 2018, 03:52 PM
Richard W
 
Join Date: 19 February 2000
Location: High Wycombe, UK
Posts: 26,334

Quote:
“As most senior scholars know, university presses peer-review their books by relying on other senior scholars to comment on the quality of the work,” said Greg Britton, its editorial director. “Before it was accepted for publication two decades ago, this book would have been selected by the editor, undergone a rigorous round of single-blind peer review, and then approved by a faculty editorial board.”

He added: “Presses do not, however, fact-check their books as Lieberman and Schatzberg acknowledge. More to the point, Professor Maines has always maintained that her assertions were a hypothesis open to further exploration.”

...

“One of the big takeaways for me is that the peer-reviewed process is flawed. Peer review is no substitute for fact-checking,” she added. “We need to fix this, and we need to start checking other people’s work, especially in history.”
What I don't get is what peer review (in the humanities?) is, if it doesn't involve fact-checking. Do they just check the spelling?
  #3  
Old 08 September 2018, 06:46 PM
Ellestar
 
Join Date: 31 July 2008
Location: Michigan
Posts: 1,817

Fact-checking would be an onerous process for someone just brought in to peer-review, though. I assume it would involve going to the same sources cited in the book, reading all the same resources, and coming to the same conclusions. Basically, fact-checking would mean redoing all the work of the original authors.

I think peer review in the humanities would be the same as or similar to peer review in the sciences. You get someone in the same or a similar field to read the presented research and determine whether the authors are citing the correct people, using accepted methodologies, and getting what appear to be valid results. In the sciences, we don't expect peer reviewers to replicate the experiments themselves to get the same results; they use their own expertise to judge whether it basically makes sense to run the experiment as the authors did and whether the findings fall in line with what is already known.

It seems with this, as the article says, the peer-review process failed. Probably there were no true "peers" (Victorian-era historians familiar with medical cures for hysteria?) to faithfully review the text. Further, it sounds as though there was some pressure to publish because this looked like a book that would sell, which I'm sure is rare for academic publishing.
  #4  
Old 09 September 2018, 03:03 PM
UEL
 
Join Date: 01 August 2004
Location: Fredericton, Canada
Posts: 9,311

Quote:
Originally Posted by Richard W View Post
What I don't get is what peer review (in the humanities?) is, if it doesn't involve fact-checking. Do they just check the spelling?
Quote:
Originally Posted by Ellestar View Post
I think peer review in the humanities would be the same or similar to peer review in the sciences. You get someone in the same or similar field to read the presented research to determine if they are citing the correct people, using accepted methodologies, and getting what appears to be valid results.
I have had two peer-reviewed journal articles and one peer-reviewed scientific paper published.** And Ellestar's description is almost exactly what happened with all of my articles.

In essence, my reviewers ensured that I had looked at a broad range of information, that my deductions were in line with the evidence, and that my conclusions were fairly supported by my deductions.

Specifically, with my scientific paper, I was particularly challenged because my look into the crystal ball was somewhat different from that of others looking through the same crystal ball. I had to rewrite a section of it to better explain my position, including my justification for deviating from the near-consensus of the group.

In the end, all three were well received, and my scientific paper is still being referenced today in other people's follow-on work.

That was my experience with peer review. It was not editing or fact checking, but challenging my rigour, and ensuring that my positions were defensible.



**The journal articles are not truly "humanities" but historical analyses. One article was an analytical "how we got here from there" look at capability development with our military. The second was a critical look at implementation processes for new capabilities. The scientific paper was an analysis of future developments with military capability.
  #5  
Old 09 September 2018, 06:37 PM
Onyx_TKD
 
Join Date: 17 December 2007
Location: Los Angeles, CA
Posts: 391

Quote:
Originally Posted by Ellestar View Post
Fact-checking would be an onerous process for someone just brought in to peer-review, though. I assume it would involve going to the same sources cited in the book, reading all the same resources, and coming to the same conclusions. Basically, fact-checking would mean redoing all the work of the original authors.

I think peer review in the humanities would be the same as or similar to peer review in the sciences. You get someone in the same or a similar field to read the presented research and determine whether the authors are citing the correct people, using accepted methodologies, and getting what appear to be valid results. In the sciences, we don't expect peer reviewers to replicate the experiments themselves to get the same results; they use their own expertise to judge whether it basically makes sense to run the experiment as the authors did and whether the findings fall in line with what is already known.

It seems with this, as the article says, the peer-review process failed. It's probably that there were no true "peers" (Victorian-era historians familiar with medical cures for hysteria?) to faithfully review the text. Further, it sounds as though there was some pressure to publish because this looked like a book that would sell, which I'm sure is kind of rare for academic publishing.
Exactly. Coming from the science/engineering side: peer review isn't about fact-checking*; it's a logic check and sanity check that the methodology, reported results, and conclusions appear suitable, plausible, and self-consistent to someone with expertise in the field, and that the overall paper is worth publishing (i.e., it adds value of some sort to the body of knowledge).

For example, in my areas of research, a paper generally ought to:
  • Show that the authors have reviewed the literature for relevant prior results, generally by summarizing relevant information from the literature and/or noting that they were unable to find existing literature on [topic]. Usually this includes identifying a niche where information is lacking that the paper intends to fill. As a reviewer, I'm not going to hunt down and read all of their references myself, but if I see something hinky, e.g., I know there is published literature when they claimed they couldn't find any, or they cite/summarize a paper I've read in a way that doesn't match the actual content, or they have failed to acknowledge a paper that is highly relevant, I'm going to flag it. If I see something that just looks really off (e.g., a claim that "Dewey, Cheatum, and Howe [14] observed that dropping ice into water causes it to boil"), then I'll probably pull up the reference to check. (Overall, a "good-practice" check)
  • Describe the methodology clearly and thoroughly, state assumptions involved, and justify why the methodology was chosen. As a reviewer, I'm not going to replicate the work for the review, but I'm going to be checking that they've listed enough information that I think I could replicate it (completeness check) and that what they say they did makes sense for what they're trying to accomplish (logic check).
  • Describe the results that they are using to draw conclusions in terms of the facts of what was observed, rather than the conclusions drawn from those results ("good-practice" check). Again, as a reviewer, I'm not going to "fact-check" by replicating, and I'm going to assume they're not lying about their results, but if something seems off-the-wall compared with what would be expected given current knowledge and the authors haven't adequately addressed that oddity, I'll flag it ("sanity" and logic check). (E.g., if the authors claim their experimental result is that they added ice to water and the water boiled, then they'd better have thoroughly explained how they controlled for other factors that could have caused the water to heat up, and that they replicated their experiment to ensure it is reproducible.)
  • Present conclusions that are consistent with the data presented in the paper (even if that conclusion is "Our results are inconsistent with prior data, and further investigation is required to determine why"). As a reviewer, I'm checking for logic, not facts. If they say, e.g., "The water in our experiment boiled when ice was added. We attribute this to aliens with heat rays being attracted to ice cubes," then that's not going to fly (at least not unless they've also presented results that provide evidence for the involvement of aliens with heat rays). If they say, "The water in our experiment boiled [...], therefore we conclude that boiling is triggered by decreasing temperature," then that's not going to fly either, because they're jumping to conclusions that ignore other possible factors and the extensive existing body of work outside their experiment. OTOH, if they say, "Our results appear to contradict prior work [3-12] other than that of Dewey, Cheatum, and Howe [14]. There is some indication that the vacuum chamber used can malfunction at low temperature [20], so we hypothesize that a pressure decrease could have lowered the boiling point of water. As future work, we will retest using [additional experimental controls, e.g., a less crappy vacuum chamber]," then that might be OK.

Now, a lot of those checks lean on the presence of a decent pre-existing body of knowledge in the topic area, both in terms of literature to compare against and the reviewer's own knowledge of the field. So a paper with flawed methodology and bogus conclusions might get past peer review in a novel topic area through no fault of the reviewer. That's where replication comes in: further researchers shouldn't be treating a single paper as if its results and conclusions are gospel, so if the topic is important enough to inspire further research, the collective findings of additional researchers should eventually overwhelm the original flawed conclusions.

*Keep in mind that (at least in the sciences) peer-reviewers are usually performing the review for free, on top of their normal jobs, as a service to the research field/research community and because it benefits them as researchers to make sure research being published in their field is of good quality. They are not professional editors paid to go through the research with a fine-tooth comb.