Chatbots tell people what they want to hear

May 13, 2024
in Technology and Engineering

Chatbots share limited information, reinforce ideologies, and, as a result, can lead to more polarized thinking when it comes to controversial issues, according to new Johns Hopkins University–led research. 

The study challenges perceptions that chatbots are impartial and provides insight into how using conversational search systems could widen the public divide on hot-button issues and leave people vulnerable to manipulation. 

“Because people are reading a summary paragraph generated by AI, they think they’re getting unbiased, fact-based answers,” said lead author Ziang Xiao, an assistant professor of computer science at Johns Hopkins who studies human-AI interactions. “Even if a chatbot isn’t designed to be biased, its answers reflect the biases or leanings of the person asking the questions. So really, people are getting the answers they want to hear.”

Xiao and his team will share their findings at the Association for Computing Machinery’s CHI Conference on Human Factors in Computing Systems at 5 p.m. ET on Monday, May 13.

To see how chatbots influence online searches, the team compared how people interacted with different search systems and how they felt about controversial issues before and after using them.

The researchers asked 272 participants to write out their thoughts on topics such as health care, student loans, and sanctuary cities, and then look up more information online about that topic using either a chatbot or a traditional search engine built for the study. After considering the search results, participants wrote a second essay and answered questions about the topic. The researchers also had participants read two opposing articles and asked how much they trusted the information and whether they found the viewpoints extreme.

Because chatbots offered a narrower range of information than traditional web searches and provided answers that reflected the participants’ preexisting attitudes, the participants who used them became more invested in their original ideas and had stronger reactions to information that challenged their views, the researchers found.

“People tend to seek information that aligns with their viewpoints, a behavior that often traps them in an echo chamber of like-minded opinions,” Xiao said. “We found that this echo chamber effect is stronger with the chatbots than traditional web searches.”

The echo chamber stems, in part, from the way participants interacted with chatbots, Xiao said. Rather than typing in keywords, as people do for traditional search engines, chatbot users tended to type in full questions, such as “What are the benefits of universal health care?” or “What are the costs of universal health care?” A chatbot would answer with a summary that included only benefits or costs.

“With chatbots, people tend to be more expressive and formulate questions in a more conversational way. It’s a function of how we speak,” Xiao said. “But our language can be used against us.”  
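
To make that mechanism concrete, here is a toy sketch in Python. The snippets and the word-overlap scoring are invented for illustration and are not the study’s search system; the point is only that a question phrased around “benefits” pushes one-sided passages to the top of the ranking, while a bare keyword query does not distinguish between them.

```python
import re

# Toy illustration (hypothetical snippets and scoring, not the study's system):
# a conversational question carries the asker's framing ("benefits of ..."),
# so a retriever that matches the question's wording ranks one-sided snippets
# highest, and a summary built from the top results inherits that slant.

CORPUS = [
    "Benefits of universal health care include broader coverage and lower overhead.",
    "Supporters also cite benefits of universal health care such as fewer medical bankruptcies.",
    "Costs of universal health care include higher taxes and potential wait times.",
    "Critics point to costs of universal health care such as reduced provider choice.",
]

def words(text: str) -> set:
    """Lowercased word set with punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

def rank(query: str, corpus=CORPUS):
    """Score each snippet by word overlap with the query (deliberately naive)."""
    q = words(query)
    return sorted(((len(q & words(s)), s) for s in corpus), reverse=True)

# The directional question ranks the two "benefits" snippets above the "costs"
# snippets, so a summary of the top hits covers only one side of the issue.
for score, snippet in rank("What are the benefits of universal health care?"):
    print(score, snippet)

# A bare keyword query, by contrast, scores all four snippets the same.
for score, snippet in rank("universal health care"):
    print(score, snippet)
```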

AI developers can train chatbots to extract clues from questions and identify people’s biases, Xiao said. Once a chatbot knows what a person likes or doesn’t like, it can tailor its responses to match. 
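
As a rough sketch of what that kind of tailoring could look like, the example below uses a crude, hypothetical cue-word heuristic and a generic chat-message format, not the researchers’ actual agents: it guesses a leaning from the question’s wording and bakes that leaning into the system prompt before any answer is generated.

```python
import re

# Minimal sketch of response tailoring (hypothetical heuristic, not the
# researchers' implementation): infer the asker's apparent leaning from cue
# words in the question, then steer the answer toward it via the system prompt.

PRO_CUES = {"benefits", "advantages", "upsides", "support"}
CON_CUES = {"costs", "downsides", "drawbacks", "problems"}

def infer_leaning(question: str) -> str:
    """Crude stance guess based on which cue words appear in the question."""
    q_words = set(re.findall(r"[a-z]+", question.lower()))
    pro = len(q_words & PRO_CUES)
    con = len(q_words & CON_CUES)
    if pro > con:
        return "favorable"
    if con > pro:
        return "skeptical"
    return "neutral"

def build_messages(question: str) -> list[dict]:
    """Assemble a chat request whose system prompt mirrors the detected leaning."""
    leaning = infer_leaning(question)
    if leaning == "neutral":
        system = "Answer with a balanced summary of the major viewpoints."
    else:
        # The sycophantic step: the summary is steered toward whatever the
        # question's wording already suggests the user believes.
        system = f"Answer with a summary that is {leaning} toward the topic."
    return [{"role": "system", "content": system},
            {"role": "user", "content": question}]

print(build_messages("What are the benefits of universal health care?"))
```

Passing messages assembled this way to any chat model would yield an answer already slanted toward the asker’s apparent view, which is the dynamic the study describes.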

In fact, when the researchers created a chatbot with a hidden agenda, designed to agree with people, the echo chamber effect was even stronger. 

To try to counteract the echo chamber effect, the researchers trained a chatbot to provide answers that disagreed with participants. People’s opinions didn’t change, Xiao said. The researchers also programmed a chatbot to link to source information to encourage people to fact-check, but only a few participants did.

“Given AI-based systems are becoming easier to build, there are going to be opportunities for malicious actors to leverage AIs to make a more polarized society,” Xiao said. “Creating agents that always present opinions from the other side is the most obvious intervention, but we found they don’t work.”

Authors include Johns Hopkins graduate student Nikhil Sharma and Microsoft principal researcher Q. Vera Liao.


