People’s views are becoming increasingly polarized, with “echo chambers” (social bubbles that reinforce existing beliefs) exacerbating differences of opinion. This divergence isn’t limited to political opinions; it also touches on factual topics, from climate change to vaccination.
And social media is not the sole culprit, according to a recent study published in the Proceedings of the National Academy of Sciences USA. It turns out that people use search engines in ways that confirm their existing beliefs, potentially amplifying polarization. A simple tweak to search algorithms, the researchers propose, could help deliver a broader range of perspectives.
Online participants were asked to rate their beliefs on six topics, including nuclear energy and caffeine’s health effects. They then chose search terms to learn more about each topic. The researchers rated the terms’ scope and found that between 9 and 34 percent (depending on the topic) were "narrow." For example, when researching the health effects of caffeine, one participant used "caffeine negative effects," whereas others used "benefits of caffeine."
These narrow terms tended to align with participants’ existing beliefs, yet fewer than 10 percent of participants reported choosing them deliberately. "People often pick search terms that reflect what they believe, without realizing it," says Eugina Leung of Tulane University’s business school, who led the study. "Search algorithms are designed to give the most relevant answers for whatever we type, which ends up reinforcing what we already thought." The same pattern held when participants ran AI-assisted searches with ChatGPT and Bing.
When the researchers randomly assigned participants to view different results, they saw those results affect people’s opinions and even behavior. For instance, participants who saw search results using "nuclear energy is good" felt better about nuclear energy afterward than those using "nuclear energy is bad." People who saw results using "caffeine health benefits," rather than "risks," were more likely to choose a caffeinated drink afterward.
Pointing out biases in the search terms had only a small effect on people’s final opinions. But changing the search algorithm either to always provide broad results or to alternate between results obtained with broad and user-provided terms mitigated the effects of narrow searches.
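The alternating strategy can be pictured as merging two ranked result lists, taking one item at a time from each. This is a minimal sketch of that idea, not the researchers' actual implementation; the function name and result lists are hypothetical.

```python
def interleave_results(user_results, broad_results, limit=10):
    """Alternate between results retrieved for the user's own (possibly
    narrow) query and results for a broader reformulation, skipping
    duplicates, up to `limit` items. A sketch, not the study's code."""
    merged, seen = [], set()
    for pair in zip(user_results, broad_results):
        for item in pair:
            if item not in seen:  # avoid showing the same result twice
                seen.add(item)
                merged.append(item)
            if len(merged) == limit:
                return merged
    return merged

# Hypothetical result titles for "caffeine negative effects" vs. a
# broader query such as "caffeine health effects":
narrow = ["Caffeine and anxiety", "Caffeine withdrawal symptoms"]
broad = ["Caffeine: an overview", "Benefits and risks of coffee"]
print(interleave_results(narrow, broad))
```

The interleaving guarantees that belief-consistent results still appear, but every other slot is filled from the broader query, which matches the paper's finding that users rated such mixed results as no less relevant.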
The researchers "have thought through how these technologies could be optimized for the benefit of users," says Kathleen Hall Jamieson of the University of Pennsylvania, who studies political and science communication. For search technology to do what we need it to do, "this kind of research is very important."
Participants rated the broader results as just as useful and relevant as standard searches. "People are able to bring in different perspectives when they’re exposed to them, which is encouraging," Leung says. "At least for the topics we tested." The researchers recommend implementing such strategies, possibly as “search broadly” buttons. "This would be really helpful," Leung says, but whether it will ever happen "is hard to predict."