So, this morning, I suggested that consideration of popular taste has its place in religious practice. Somehow, an article titled “Most scientists ‘can’t replicate studies by their peers’,” on the BBC, by Tom Feilden, strikes me as related:
The problem[, when he couldn’t replicate a textbook study], was not with Marcus Munafo’s science, but with the way the scientific literature had been “tidied up” to present a much clearer, more robust outcome.
“What we see in the published literature is a highly curated version of what’s actually happened,” he says.
“The trouble is that gives you a rose-tinted view of the evidence because the results that get published tend to be the most interesting, the most exciting, novel, eye-catching, unexpected results.”
The article goes on to quote University of Cambridge Sainsbury Laboratory director Dame Ottoline Leyser as suggesting that it’s not deliberate fraud, per se, but a push for “impact over substance, flashy findings over the dull, confirmatory work.” One could read that two ways: scientists chase interesting discoveries so that they’ll be read, but findings that confirm the political or ideological biases of their peers and audiences will also get more notice.
Thus, a flashy finding will make its way around the world and become accepted as truth even though other scientists might be able to replicate the results no more reliably than a roll of the dice.
I can’t seem to find the link, but one episode certainly opened my eyes, way back when I began reading and responding to items in the news: a study proclaiming the value of contraception over abstinence went around the world, and nobody but your humble blogger pointed out that the percentages in the underlying equation were such that improvements in abstinence actually made it look as if contraceptives had improved even more.
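To see how that sort of thing can happen, here’s a minimal sketch with invented numbers (not the actual study’s figures, which I no longer have): if the “contraceptive improvement” is measured as a drop in the average failure risk among the sexually active, then teens who stop being sexually active — especially those who had been using no method — shrink that average without anyone’s contraceptive behavior changing at all.

```python
# Hypothetical illustration (invented numbers, not from any study) of how
# rising abstinence can register as an apparent contraceptive improvement.

def risk_index(method_shares):
    """Average failure risk among the sexually active:
    sum over methods of (share using method) * (failure rate)."""
    return sum(share * fail for share, fail in method_shares)

# Assumed failure rates: no method vs. contraception.
FAIL_NONE, FAIL_CONTRA = 0.85, 0.15

# Baseline: of the sexually active, 20% use no method, 80% contracept.
before = risk_index([(0.20, FAIL_NONE), (0.80, FAIL_CONTRA)])  # 0.29

# Later: every no-method teen becomes abstinent; nobody switches methods.
after = risk_index([(0.00, FAIL_NONE), (1.00, FAIL_CONTRA)])   # 0.15

# The risk index falls ~48%, which a naive decomposition credits to
# "better contraceptive use" -- though only abstinence changed.
apparent_gain = (before - after) / before
print(f"risk index {before:.2f} -> {after:.2f}: "
      f"{apparent_gain:.0%} apparent contraceptive improvement")
```

The design choice doing the work is the denominator: the index averages only over teens who remain sexually active, so removing the riskiest teens from that pool improves the index by composition alone.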
Back then, as I recall, the Bush Administration’s emphasis on abstinence education was the anti-science that the smart set was dying to expose as superstition. That incentive made their whole vaunted system for generating and promulgating information susceptible to accepting erroneous science.