As the U.S. recoils from the divisions of recent years and the scientific community tries to rebuild trust in science, scientists may be tempted to reaffirm their neutrality. If people are to trust us again, as I have frequently heard colleagues argue, we have to be scrupulous about not allowing our values to intrude into our science. This presupposes that value neutrality is necessary for public trust and that it is possible. But available evidence suggests that neither presumption is correct.
Recent research in communications has shown that people are most likely to accept a message when it is delivered by trusted messengers—teachers, for example, or religious or business leaders, or local doctors and nurses. One strategy to build trust, therefore, is for scientists to build links from their laboratories, institutes and academic departments into the communities where they live and work. One way to do this is by partnering with organizations such as the National Center for Science Education, founded to fight creationism in the classroom but now working broadly with teachers to increase understanding of the nature of science itself. To do this, scientists do not need to throw off their personal values; they merely need to share with teachers a belief in the value of education. This is important because research suggests that, even if we try, we can’t throw off our values.
It is well known that people are more likely to accept evidence that accords with what they already believe. Psychologists call this “motivated reasoning,” and although the term is relatively recent, the insight is not. Four hundred years ago Francis Bacon put it this way: “Human understanding is not composed of dry light, but is subject to influence from the will and the emotions … man prefers to believe what he wants to be true.”*
Scientists may assume that motivated reasoning explains erroneous positions, such as the refusal to wear a mask to limit the spread of COVID-19, but plays little role in science itself. Alas, there is little evidence to support such confidence. Some research suggests that even with financial incentives, most people are incapable of escaping their biases; the problem appears to be one not of will but of capacity. Great scientists may think that, because they are trained to be objective, they can avoid the pitfalls into which ordinary people fall. But that isn’t necessarily the case.
Does this mean that science cannot be objective? No. What makes it so is not scientists patrolling their own biases but rather the mechanisms used to ensure that bias is minimized. Peer review is the best known of these, though equally if not more important is diversity. As I contend in the new edition of my book Why Trust Science?, diversity in science is crucial not just to ensure that every person has a chance to develop their talent but also to ensure that science is as unbiased as possible.
Some will argue that value neutrality is an ideal toward which we should strive, even if we know it can’t be achieved entirely. In the practice of science, this argument may hold. But what is useful in scientific research may be counterproductive in public communication, because the idea of a trusted messenger implies shared values. Studies show that U.S. scientists want (among other things) to use their knowledge to improve health, make life easier, strengthen the economy through innovation and discovery, and protect people from losses associated with disruptive climate change.
Opinion polls suggest that most Americans want many of these things, too; 73 percent of us believe that science has a mostly positive impact on society. If scientists decline to discuss their values for fear that they conflict with the values of their audiences, they may miss the opportunity to discover significant points of overlap and agreement. If, on the other hand, scientists insist on their value neutrality, they will likely come across as inauthentic, if not dishonest. A person who truly had no values—or refused to allow values to influence their decision-making—would be a sociopath!
Value neutrality is a tinfoil shield. Rather than trying to hide behind it, scientists should admit that they have values and be proud that these values motivate research aiming to make the world a better place for all.
*Editor’s Note (3/31/21): This sentence was edited after posting to correct the quote.