Raising Concerns vs. Denialism

Christie Wilcox, writing for Discover Magazine, on the debate surrounding the use of the herbicide glyphosate (the active ingredient in Roundup). Here, Wilcox responds to an article by Kara Moses that questions the necessity of peer-reviewed scientific confirmation. I'm linking not because of the specific case, but for the broader points made by Wilcox:

The trouble is, it’s one thing to notice a potential danger and raise a few alarm bells to get scientists to investigate an issue — it’s a whole other to publicize and propagandize an unsubstantiated fear despite evidence against it. The former is important, as Kara suggests, and should occur. I have no problem with non-scientists raising honest concerns, if their goal is to have the concerns considered — so long as they’re actually willing to hear what the evidence has to say. The latter, on the other hand, is denialism. You see, once scientists have weighed in, you have to be willing to listen to them.

and to drive it home (emphasis mine):

To reply to Kara’s original question: no, you don’t need a body of scientific evidence to raise concerns, if that’s really the goal of what you’re doing. But you do need at least a shred that suggests such concerns are valid before you shout them as facts from the rooftops. You should support independent scientists that study what you’re concerned about instead of trying to tie every one (usually in some ludicrous way) to biased funding. And if those scientists weigh in with well-designed studies that don’t agree with your initial concerns, you should feel relieved, not betrayed. If scientists are in consensus on a topic, it’s because the evidence is strong. It’s because they’ve investigated and rigorously tested the possible hypotheses using different methods, and the same conclusions keep stubbornly arising. Scientists don’t come to consensus easily, so when they do, you should listen to them.

There is value in advocacy groups creating small, non-scientific studies in order to investigate whether a particular issue merits concern. The key is in what happens next. If that group believes they have findings that warrant further study, then they should certainly seek rigorous scientific investigation. If peer-reviewed research backs up their initial concerns, the group should be commended. If, however, the results don’t confirm their initial concerns, that should come as welcome news.

Unfortunately, when peer-reviewed science doesn’t back up the claims of a particular lobby, that lobby all too often goes public with its initial findings and cries foul. The media are not adept at discerning which studies are scientifically valid, and they particularly love using false equivalence to create the illusion of debate.1 After all, people surely find it much more compelling to envision rogue scientists conspiring with governments and evil corporations than to admit that their unfounded concerns were wrong.

One need only look to climate change science, or the debunked link between autism and vaccines, to witness the embodiment of Wilcox’s main point. People, even when presented with sound and overwhelming scientific evidence, find it easier to subscribe to unreasonable conspiracy theories than to relinquish their emotional beliefs. Perhaps this speaks more to human behavior than to a specific distrust of science. Still, as a scientist, I find the trend disturbing and depressing.