ITHACA, N.Y. – Cornell University researchers have found that in a public health emergency, most people pick out and click on accurate information during internet searches.
Though higher-ranked results are clicked more often, they are not more trusted. And the presence of misinformation does not damage trust in accurate results that appear on the same page. However, banners at the top of the search results page warning about misinformation decrease trust in accurate information, according to the research published in Scientific Reports.
The relationship between search result rank and misinformation is particularly important during a global pandemic, because medical misinformation could be fatal, said Sterling Williams-Ceci, a doctoral student and the paper’s first author.
“Misinformation has been found to be highly ranked in audit studies of health searches, meaning accurate information inevitably gets pushed below it. So, we tested whether exposure to highly ranked misinformation taints people’s trust in accurate information on the page, and especially in accurate results when they are ranked below the misinformation,” Williams-Ceci said.
Experiments showed that misinformation was highly distrusted in comparison with accurate information, even when shown at or near the top of the results list. In fact, contrary to assumptions in prior work, there was no general relationship between search results’ ranking on the page and how trustworthy people considered them to be.
“Misinformation was rarely clicked and highly distrusted: Only 2.6% of participants who were exposed to inaccurate results clicked on these results,” the researchers wrote.
Further, the presence of misinformation, even when it showed up near the top of the results, did not cause people to distrust the accurate information presented below it.
Another experiment introduced warning banners on the search pages. These banners appeared at the top of the page for some participants and warned that unreliable information may be present in the results without identifying what this information said.
Google currently uses banners like these, but few studies have explored how they affect decisions about what information to trust in online searches, Williams-Ceci said.
The researchers found that one of these banners had an unanticipated backfire effect: It significantly decreased people’s trust in accurate results, while failing to decrease their trust in misinformation results to the same degree.
“The backfire effect of warning labels is very alarming, and further research is needed to learn more about why the labels backfire and how misinformation can be more effectively combatted, not only on Google but on other platforms as well,” said co-author Michael Macy, professor of sociology.
For additional information, see this Cornell Chronicle story.
-30-