The spread of ignorance – and the guilt of science


4 April 2016 – BBC Future has just published an interesting article on Robert Proctor, a science historian at Stanford University who studies how people and companies with vested interests spread ignorance and obfuscate knowledge. The spread of ignorance follows certain patterns, whether the subject is tobacco or climate change. Proctor found that ignorance spreads when, first, many people do not understand a concept or fact and, second, special interest groups – such as a commercial firm or a political group – work hard to create confusion about the issue. In both cases, a scientifically illiterate society will probably be more susceptible to the tactics of those wishing to confuse and cloud the truth.

This hit me because of a recent survey of scientists. Nearly 500 eminent astronomers, biologists, chemists, physicists and earth scientists were surveyed to identify the “core traits of exemplary scientists”. Their answer? Honesty is critical, second only to curiosity, and we ought to do more to instill it in those eyeing science careers.

Ironically, they are deceiving themselves. Researchers have never been whiter than white. Here are a couple of revealing numbers: about 2 per cent of scientists admit to at least one act of research misconduct, yet researchers estimate that around 14 per cent of their colleagues are involved in such behaviour. Someone’s not being straight.

Those figures come from a 2009 meta-analysis that also found one-third of scientists confessed to “questionable research practices” such as cooking data, mining it for a significant result that is then presented as the original target of the study, selective publication or concealing conflicts of interest.

We may never know for sure how widespread such behaviour is. According to another meta-analysis published in October, scientists are becoming less likely to admit to fabrication, falsification or plagiarism. That study also found that researchers see plagiarism as more heinous than making results up: they are more likely to report a colleague they catch plagiarising than one fabricating or falsifying data.

How can this be so, when honesty is supposedly such an essential attribute? Because a little rule bending helps get the job done. Raymond De Vries at the University of Michigan and colleagues have argued that data manipulation based on intuition about what a result should look like is “normal misbehavior”. They see such common misbehaviours as having “a useful and irreplaceable role” in science. Why? Because of “the ambiguities and everyday demands of scientific research”.

In other words, data often isn’t as clean as you would like. Frederick Grinnell, an ethicist at the University of Texas, says intuition is “an important and perhaps in the end a researcher’s best guide to distinguishing between data and noise”. Sometimes you just know that a data point was an anomaly to be ignored.

Should we do something to make science more virtuous? Probably not. Those eminent academics questioned for the survey by Michigan State University are hopelessly optimistic when it comes to improving ethical standards: 94 per cent of them said students can learn scientific values and virtues from “exemplary scientists”.

Clearly, they haven’t read the 1996 study that found teaching research ethics made students more likely, not less, to misbehave. Scientists, eh? It’s almost like they’re human.
