So Science…Might Have Gotten It Wrong. Now What? (Original Post) bluedeathray Jul 2013 OP
If you expect people to click on a link, show a brief excerpt. n/t eridani Jul 2013 #1
hyperbolic article title. Notafraidtoo Jul 2013 #2
No, just catchy. Igel Jul 2013 #9
Here dipsydoodle Jul 2013 #3
Consider that anti-vaccine fervor still exists years after Wakefield was discredited Thor_MN Jul 2013 #5
I only posted that dipsydoodle Jul 2013 #6
Just because it has been cited does not mean others believe it ... eppur_se_muova Jul 2013 #11
there you go. greenman3610 Jul 2013 #4
"Scientists" publishing bogus studies is so utterly fucked-up siligut Jul 2013 #7
Strikes me as a problem that nobody's offered a solution to yet. Igel Jul 2013 #10
Nothing surprising here. Orsino Jul 2013 #8
This is a problem for information technology. napoleon_in_rags Jul 2013 #12

Notafraidtoo

(402 posts)
2. hyperbolic article title.
Mon Jul 8, 2013, 06:09 AM
Jul 2013

The title seems to have no purpose other than to get you to read nonsense. Someone below in the comments said this:

"Person does a study. Publishes it. Others can’t replicate it. They publish that. Looks like the system’s working. What’s your point? By the way, where’s the bit in your article where you address the “now what” referred to in the title?"

I agree with this person's assessment. The person who wrote the article doesn't seem to understand the scientific method.

Igel

(35,350 posts)
9. No, just catchy.
Mon Jul 8, 2013, 06:15 PM
Jul 2013

But he gets it.

It's just that it's not enough. He has two points. The first is the mRNA study, in which a paper gets published and then, fairly quickly, while not exactly debunked, gets knocked on the head. Maybe there's something there, but that paper didn't show it. Perhaps others will sort it out. Scientific method at work.

The bigger problem is how he starts out, and it's a problem in every field (not just biology or even "science"). A paper gets published. It gets cited.

Later--perhaps months, perhaps decades--somebody checks up on it. Oops. The paper was bogus. Or had bad stats. Or was somehow methodologically flawed. The failure to replicate might not be publicized very well, so the original, flawed paper can still be cited. Or perhaps the researchers a few years later don't bother to read *all* the relevant literature. Like we're all really careful about that. So a flaw continues in the science, and research is built on that flawed basis. That's part of the "now what"? How to prevent that?

It works the other way, too. Had one acquaintance who found some minor discrepancy using the Bark equation--how sensitive the human ear is at various frequencies to pitch differences. He rummaged and rummaged tracing back the equation he was using from textbook to published paper to handbook to reference manual. Finally he went all the way back to Bark's original paper. He found that shortly after Bark published his equation somebody Really Famous cited it in a Standard Work that was very widespread. And he claimed that that Really Famous person had introduced a typo into the equation. A small, little typo. But it had been cited in that form for decades in all kinds of places. Bark's paper was cited, as well, but nobody ever looked at it.

Again, another error. Except what kind of a publication would that have been? I mean, is it really "research"? "Uh, hi there. I'd like to point out that this equation as originally published was _________ but that in later works, from xxxx to yyyyy have it in a mistaken form. The earliest citation I can find is in Famous Guy's paper ______. My methodology consisted of looking at the original paper everybody cited but nobody ever bothered to read, even though it was assigned to graduate students regularly for the last 25 years. Now, please don't be embarrassed and deny me tenure."

dipsydoodle

(42,239 posts)
3. Here
Mon Jul 8, 2013, 06:27 AM
Jul 2013

Last week, I wrote about a scientific paper that was published in the elite journal Nature in 1995. Within a couple of years, the findings of said paper were called into question by several other papers in different journals. As of today, nearly two decades since the original came out, nobody has replicated it. And yet, it’s still sitting there in the literature, still influencing others. It’s been cited nearly 1,000 times.

Some readers were angry with my post, arguing, for example, that “science’s self-correcting paradigm works over decades”. Indeed, that was my point. Science’s self-correction is generally very slow — perhaps, as many argue, too slow.

This week I learned about an unfolding scientific debate that’s got me thinking again about the challenge — the impossibility? — of swift and sure scientific correction. What does it mean when one group of researchers, or even two or three groups, can’t replicate a particular scientific finding? Does that necessarily mean it’s wrong? At what point should a scientist give up on a new idea for lack of supporting evidence?

That unfolding debate started in late 2011, when Chen-Yu Zhang’s team from Nanjing University in China found something pretty wild: bits of rice RNA floating in the bloodstreams of Chinese men and women. That might not seem so strange; rice was a primary ingredient of their diets, after all. But RNA molecules are pretty fragile. So the discovery shocked and intrigued many biologists.

http://phenomena.nationalgeographic.com/2013/07/04/so-science-might-have-gotten-it-wrong-now-what/

 

Thor_MN

(11,843 posts)
5. Consider that anti-vaccine fervor still exists years after Wakefield was discredited
Mon Jul 8, 2013, 09:18 AM
Jul 2013

It's been years since his medical license was revoked and his articles retracted. Yet there are still groups that revile vaccines. If anything, the internet has lengthened the time that bad science sticks around, as search engines make it easier to find.

dipsydoodle

(42,239 posts)
6. I only posted that
Mon Jul 8, 2013, 09:23 AM
Jul 2013

because reply #1 expressed a reluctance to click on what is clearly a National Geographic link.

eppur_se_muova

(36,281 posts)
11. Just because it has been cited does not mean others believe it ...
Mon Jul 8, 2013, 09:24 PM
Jul 2013

If you are publishing a paper in any field, you are expected to summarize previous work in the area and cite precedents. If the citation is "Smith claimed X, but subsequent studies do not support this", that still counts as a citation, but not one that adds credence to the study cited. One cannot judge the merit of a paper simply by counting its citations (something everyone knows, though everyone still wants to see their paper listed among the most cited).

It is *assumed* that anyone wishing to do research in an area will search the literature themselves and recognize the bad papers as well as the good ones. Obviously, this is not always true. But in a case like the error in the Bark equation, the obvious thing to do is to include a short paragraph or footnote pointing out the persistent error as objectively as possible, without snark, mockery, or attempts to assign blame. It doesn't matter one whit whether the mistake was due to Some Famous Person or Some Totally Unknown Drudge; the correction is not a personal attack, and only the most daft would take it as such.

Catching a real boner in the literature is very unlikely to cost you tenure; it is likely to get you talked about in a positive way, including by people outside your specialty.

siligut

(12,272 posts)
7. "Scientists" publishing bogus studies is so utterly fucked-up
Mon Jul 8, 2013, 11:12 AM
Jul 2013

We have traditionally relied on peer review and replication for verification and acceptance of the work into the scientific knowledge base. With science, the work of others is used as building blocks to further research and add to that base.

I can't help but think this article is a fucked-up attempt to cast doubt into what we know and accept to be fact.

Igel

(35,350 posts)
10. Strikes me as a problem that nobody's offered a solution to yet.
Mon Jul 8, 2013, 06:48 PM
Jul 2013

A lot of what we know and accept to be fact is all wet. It's a serious problem in teaching high school science--a lot of what is taught is so simplified as to be wrong. Or teachers got it wrong and the error keeps on keeping on.

Mistakes happen. Analyses and methodologies go astray. Group think keeps people from seeing what's wrong. Reputations keep people from questioning results. Accepted facts get destroyed and shown to be accepted falsehoods.

A lot of research is never, ever duplicated because it's not sexy--it gobbles up lab time and resources but may get you nothing useful for your research or career. A lot of that research is used as a building block by others. Then, when somebody finally does try to duplicate it a decade later and can't, you have to go back and sort out what, exactly, used it as a necessary building block and what research can still stand on the building blocks that *are* valid. There's no mechanism for that.

Or you find it cited even though it's been falsified, because the original paper is cited widely while the rebuttal sits in an obscure journal, from an obscure working group usually doing something Completely Different.

Orsino

(37,428 posts)
8. Nothing surprising here.
Mon Jul 8, 2013, 02:23 PM
Jul 2013

I guess the author was just outlining the scientific process for the benefit of the magazine's readers?

napoleon_in_rags

(3,991 posts)
12. This is a problem for information technology.
Mon Jul 8, 2013, 10:38 PM
Jul 2013

Of course people cited the work, and of course the work is still being validated or invalidated. The only problem is when people don't know the certainty level of the work, which in this case is low. In the information age, there is no reason citations should be just numbers on a page. Work should be published in a way where experiments that validate it increase a probability metric that the original result is correct. A researcher should be able to cite an uncertain work, write their paper around that uncertainty (or lack of experimental corroboration), and have their own assertions become more or less certain as the cited work does.

This is no big deal in this age, and it should be done.
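The probability-metric idea above could be sketched in code. This is a minimal, hypothetical illustration, not an existing system: the class names, the Laplace-smoothed replication rate, and the rule that a paper inherits the confidence of the weakest work it cites are all assumptions chosen for the sketch.

```python
# Hypothetical sketch: each paper carries a confidence score that moves
# as replications succeed or fail, and papers citing it inherit a derived
# confidence. All names and update rules here are illustrative only.

class Paper:
    def __init__(self, title, cites=()):
        self.title = title
        self.cites = list(cites)   # works this paper builds on
        self.successes = 0         # independent replications
        self.failures = 0          # failed replication attempts

    def record_replication(self, succeeded):
        if succeeded:
            self.successes += 1
        else:
            self.failures += 1

    def own_confidence(self):
        # Laplace-smoothed replication rate: 0.5 with no evidence either way.
        return (self.successes + 1) / (self.successes + self.failures + 2)

    def confidence(self):
        # A paper is only as solid as the weakest result it builds on.
        c = self.own_confidence()
        for cited in self.cites:
            c = min(c, cited.confidence())
        return c

original = Paper("1995 Nature mRNA result")
followup = Paper("Work building on the 1995 result", cites=[original])

print(followup.confidence())   # 0.5: no replication evidence yet
original.record_replication(False)
original.record_replication(False)
print(followup.confidence())   # 0.25: two failed replications propagate down
```

The point of the sketch is the last line: the follow-up paper's stated confidence drops automatically when the work it cites fails to replicate, which is exactly the propagation that plain citation counts never provide.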
