People who trust science are more likely to be duped into believing and disseminating pseudoscience, finds a new paper in the Journal of Experimental Social Psychology.
Pseudoscience is false information that invokes science broadly or references scientific terms, research or phenomena. Across four experiments, researchers asked U.S. adults to read news articles, written for the study, that intentionally made false claims about one of two topics: a fictional virus created as a bioweapon, or the health effects of genetically modified organisms, or GMOs.
The experiments reveal that study participants who indicated they had higher levels of trust in science were most likely to believe the fake account if it contained scientific references. Those individuals also were more likely to agree that stories spotlighting pseudoscience should be shared with others.
On the other hand, people who demonstrated a stronger understanding of scientific methods were less likely to believe what they read and less likely to say it should be shared, regardless of whether the information was attributed to science.
The researchers note their findings conflict with ongoing campaigns to promote trust in science as a way to fight misinformation about the COVID-19 pandemic, mask-wearing and COVID-19 vaccines. Trust in science alone is insufficient, says the lead author of the paper, Thomas C. O’Brien, a social psychologist who studies conflict resolution and trust in institutions, most recently at the University of Illinois Urbana-Champaign.
“Importantly, the conclusion of our research is not that trust of science is risky but rather that, applied broadly, trust in science can leave people vulnerable to believing in pseudoscience,” write O’Brien and co-authors Dolores Albarracín, a psychologist recently named director of the Science of Science Communication Division at the University of Pennsylvania’s Annenberg Public Policy Center, and Ryan Palmer, a former graduate researcher at the University of Illinois Urbana-Champaign.
The researchers note that it’s tough for lay audiences to fully understand complex topics such as the origins of a virus or how GMOs might affect public health. They suggest a more sustainable solution for curbing misinformation is helping the public develop a type of scientific literacy known as methodological literacy. People who understand scientific methods and research designs can better evaluate claims about science and research, they explain.
In an email interview, Albarracín pointed out that blanket trust in science can lead people who would otherwise brush off conspiracy theories to believe them if they’re presented alongside science-related content such as quotes from scientists and references to academic studies.
She added that skepticism is a healthy, essential part of science.
“The solution to climate change denial, irrational fears of GMOs or vaccination hesitancy is not to preach trust in science,” wrote Albarracín, recently named a Penn Integrates Knowledge university professor.
“Trust in science has a critical role to play with respect to increasing public support for science funding, enhancing science education and separating trustworthy from untrustworthy sources,” she continued. “However, trust in science does not fix all evils and can create susceptibility to pseudoscience if trusting means not being critical.”
How they conducted the study
To test whether trust in science makes people more susceptible to pseudoscience, researchers conducted four experiments with nearly 2,000 U.S. adults in total. They recruited volunteers for two experiments via Amazon Mechanical Turk, an online crowdsourcing platform. Survey research company Dynata provided samples for the other two experiments.
Several hundred people participated in each experiment. All four were conducted in 2020, each lasting from two days to a little over a week.
For each experiment, researchers randomly assigned study participants to read a news article and complete an online questionnaire asking, among other things, if they believed the article and whether it should be shared with others.
In one of the articles, scientists from prominent universities claim the fictional “Valza Virus” was made in a lab and that the U.S. government concealed its role in creating it as a bioweapon. Another story mentions a real study supporting the idea that mice develop tumors after eating GMOs, but does not note the paper was retracted in 2013. For comparison, researchers assigned some people to read versions of the news stories that featured activists as sources of information, not scientists or research.
To gauge participants’ level of trust in science, researchers asked them to indicate whether they agreed with statements such as “Scientists usually act in a truthful manner and rarely forge results” and “The Bible provides a stronger basis for understanding the world than science does.”
People also answered multiple-choice questions designed to measure how well they understood scientific methodology.
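To make that design concrete, here is a minimal sketch in Python of how random assignment to article conditions and Likert-scale scoring of this kind typically work. The condition labels, item indices and scoring details are hypothetical illustrations, not the study's published materials.

```python
import random
import statistics

# Hypothetical condition labels (the paper's own labels are not published here)
CONDITIONS = [
    "virus_scientists",   # bioweapon story attributed to scientists
    "virus_activists",    # same story attributed to activists
    "gmo_scientists",     # GMO story citing the since-retracted study
    "gmo_activists",
]

def assign_condition() -> str:
    """Randomly assign a participant to one article condition."""
    return random.choice(CONDITIONS)

def trust_score(responses: list[int], reverse_coded: set[int]) -> float:
    """Average a set of 7-point Likert items into a trust-in-science score.

    Items such as the Bible statement are reverse-coded: agreeing with
    them signals lower trust in science, so their scores are flipped.
    """
    flipped = [8 - r if i in reverse_coded else r for i, r in enumerate(responses)]
    return statistics.mean(flipped)

# A participant who agrees (6/7) that scientists are truthful and
# disagrees (2/7) with the Bible item scores high on trust: (6 + 6) / 2
print(assign_condition())
print(trust_score([6, 2], reverse_coded={1}))  # -> 6.0
```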
In the final experiment, participants responded to a writing prompt meant to put them in a certain mindset before reading their assigned article.
One of the writing prompts was meant to put people in a “trust in science” mindset. It directed them to give three examples of how science had saved lives or otherwise benefited humanity.
Another prompt, aimed at inducing a “critical evaluation mindset,” directed participants to give examples of people “needing to think for themselves and not blindly trust what media or other sources tell them.”
The remaining prompt, created solely for comparison purposes, focused on unusual or interesting landscapes.
Results suggest respondents who considered the benefits of science before reading their article were more likely to believe pseudoscientific claims than respondents who offered examples of individuals who need to think for themselves.
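The paper's own statistics are not reproduced in this article, but a between-condition comparison of this kind is commonly tested with something like the sketch below. The belief ratings here are made up purely for illustration.

```python
from scipy import stats

# Made-up 7-point belief ratings; NOT the study's data
trust_mindset_beliefs    = [5, 6, 4, 5, 6, 5, 4, 6]
critical_mindset_beliefs = [3, 2, 4, 3, 2, 3, 4, 2]

# An independent-samples t-test asks whether mean belief in the
# pseudoscientific article differs between the two mindset groups
t_stat, p_value = stats.ttest_ind(trust_mindset_beliefs, critical_mindset_beliefs)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```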
An important limitation
O’Brien and Albarracín noted that study participants did not have a chance to check the article they read against other sources. In real life, some participants might have tried to verify claims by, for example, comparing their news story to coverage from other news outlets.
Albarracín wrote in her email that good source checkers would have discovered that the GMO study mentioned in one of the stories had been retracted by the journal that published it. According to the journal's retraction statement, a closer examination of the study's details revealed that definitive conclusions could not be reached because of the small sample size.
Suggestions for journalists
The paper’s findings have important implications for newsrooms.
Albarracín encouraged journalists covering research to describe their process for assessing research quality. Reporters can also explain how the design of a research study and the scientific methods used might have influenced findings, she wrote.
Doing these things can help the public learn how to evaluate scientific claims. Journalists “could routinely report on doubts and uncertainty and the strengths and weaknesses of the method, to model this thought process for their audiences,” Albarracín wrote.
It would be helpful, she added, if journalists wrote more about the distinction between science and pseudoscience.
O’Brien urges journalists to learn terms they do not understand but frequently encounter in academic literature. That will help them better understand research and explain it to their audiences.
“Like, what does ‘randomization’ mean?” he asks. “What’s statistical power and what does it mean to have converging evidence? And what is peer review and [what are] the limits of peer review? These are definitely things that should be of interest to journalists.”
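For statistical power in particular, a short worked example may help. This sketch uses the statsmodels library to run the standard power calculation for a two-group comparison; the effect size and thresholds are generic textbook defaults, not numbers from the study.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# How many participants per group does a two-sample t-test need to
# detect a medium effect (d = 0.5) at alpha = 0.05 with 80% power?
n = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"~{n:.0f} participants per group")  # ~64

# Conversely, with only 10 per group (as in an underpowered study),
# the chance of detecting that same effect drops sharply.
power = analysis.solve_power(effect_size=0.5, alpha=0.05, nobs1=10)
print(f"power with n=10 per group: {power:.2f}")  # roughly 0.19
```

A low-powered study can easily produce striking but unreliable findings, which is one reason the small sample size figured in the GMO study's retraction.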