Study: Algorithms used by universities to predict student success may be racially biased

July 11, 2024
in Science Education

Washington, July 11, 2024—Predictive algorithms commonly used by colleges and universities to determine whether students will be successful may be racially biased against Black and Hispanic students, according to new research published today in AERA Open, a peer-reviewed journal of the American Educational Research Association. The study—conducted by Denisa Gándara (University of Texas at Austin), Hadis Anahideh (University of Illinois Chicago), Matthew Ison (Northern Illinois University), and Lorenzo Picchiarini (University of Illinois Chicago)—found that predictive models also tend to overestimate the potential success of White and Asian students. 

Video: Co-authors Denisa Gándara and Hadis Anahideh discuss findings and implications of the study

“Our results show that predictive models yield less accurate results for Black and Hispanic students, systematically making more errors,” said study co-author Denisa Gándara, an assistant professor in the College of Education at the University of Texas at Austin.

These models incorrectly predict failure for Black and Hispanic students 19 percent and 21 percent of the time, respectively, compared to false negative rates for White and Asian groups of 12 percent and 6 percent. At the same time, the models incorrectly predict success for White and Asian students 65 percent and 73 percent of the time, respectively, compared to false positive rates for Black and Hispanic students of 33 percent and 28 percent.
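To make the figures above concrete, the following minimal sketch (not the study's code) shows how group-level false negative and false positive rates like these would be computed from a model's binary predictions. The column names ("race", "succeeded", "predicted_success") are hypothetical placeholders, not fields from the study's dataset.

```python
import pandas as pd

def group_error_rates(df: pd.DataFrame) -> pd.DataFrame:
    """Compute false negative and false positive rates per racial group.

    Assumes hypothetical columns:
      race              -- group label
      succeeded         -- observed outcome (1 = student succeeded)
      predicted_success -- model prediction (1 = predicted to succeed)
    """
    rows = []
    for group, g in df.groupby("race"):
        # False negative rate: students who succeeded but were predicted to fail.
        succeeded = g[g["succeeded"] == 1]
        fnr = (succeeded["predicted_success"] == 0).mean()
        # False positive rate: students who did not succeed but were predicted to succeed.
        not_succeeded = g[g["succeeded"] == 0]
        fpr = (not_succeeded["predicted_success"] == 1).mean()
        rows.append({"group": group,
                     "false_negative_rate": fnr,
                     "false_positive_rate": fpr})
    return pd.DataFrame(rows)
```

In a table produced this way, the pattern the study reports would appear as higher false negative rates for Black and Hispanic students and higher false positive rates for White and Asian students.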

“Our findings reveal a troubling pattern—models that incorporate commonly used features to predict success for college students end up forecasting worse outcomes for racially minoritized groups and are often inaccurate,” said co-author Hadis Anahideh, an assistant professor of industrial engineering at the University of Illinois Chicago. “This underscores the necessity of addressing inherent biases in predictive analytics in education settings.”

The study used nationally representative data spanning 10 years from the U.S. Department of Education’s National Center for Education Statistics, including 15,244 students.

Findings from the study also point to the potential value of using statistical techniques to mitigate bias, although there are still limitations.

“While our research tested various bias-mitigation techniques, we found that no single approach fully eliminates disparities in prediction outcomes or accuracy across different fairness notions,” said Anahideh.  
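The study itself evaluated several mitigation techniques; as a generic illustration only (not the authors' method), the sketch below shows one common post-processing approach: choosing group-specific decision thresholds to bring one error rate, here the false negative rate, toward a shared target. As the quote above notes, satisfying one fairness notion this way does not guarantee others, such as equal false positive rates or calibration.

```python
import numpy as np

def pick_threshold_for_target_fnr(scores, outcomes, target_fnr):
    """Pick the decision threshold whose false negative rate is closest to target_fnr.

    scores   -- model-predicted probabilities of success for one group
    outcomes -- observed success (1) / non-success (0) for the same students
    """
    best_t, best_gap = 0.5, float("inf")
    for t in np.linspace(0.05, 0.95, 19):
        predicted_success = scores >= t
        succeeded = outcomes == 1
        # Share of actually successful students predicted to fail at this threshold.
        fnr = np.mean(~predicted_success[succeeded]) if succeeded.any() else 0.0
        gap = abs(fnr - target_fnr)
        if gap < best_gap:
            best_t, best_gap = t, gap
    return best_t
```

Applying a different threshold per group can equalize one metric (a notion such as equal opportunity) while leaving disparities in other metrics untouched, which is the kind of trade-off across fairness notions the researchers describe.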

Higher education institutions are increasingly turning to machine learning and artificial intelligence algorithms that predict student success to inform various decisions, including those related to admissions, budgeting, and student-success interventions. In recent years, concerns have been raised that these predictive models may perpetuate social disparities.

“As colleges and universities become more data-informed, it is imperative that predictive models are designed with attention to their biases and potential consequences,” said Gándara. “It is critical for institutional users to be aware of the historical discrimination reflected in the data and to not penalize groups that have been subjected to racialized social disadvantages.”

The study’s authors noted that the practical implications of the findings are significant but depend on how the predicted outcomes are used. If models are used to make college admissions decisions, admission may be denied to racially minoritized students if the models show that previous students of the same racial categories had lower success. Higher education observers have also warned that predictions could lead to educational tracking, encouraging Black and Hispanic students to pursue courses or majors that are perceived as less challenging.

On the other hand, biased models may lead to greater support for disadvantaged students. By falsely predicting failure for racially minoritized students who succeed, the model may direct greater resources to those students. Even then, Gándara noted, practitioners must be careful not to produce deficit narratives about minoritized students, treating them as though they had a lower probability of success.

“Our findings point to the importance of institutions training end users on the potential for algorithmic bias,” said Gándara. “Awareness can help users contextualize predictions for individual students and make more informed decisions.”

She noted that policymakers might consider policies to monitor or evaluate the use of predictive analytics, including their design, bias in predicted outcomes, and applications.

Funding note: This research was supported by the Institute of Education Sciences at the U.S. Department of Education.

Study citation: Gándara, D., Anahideh, H., Ison, M., & Picchiarini, L. (2024). Inside the black box: Detecting and mitigating algorithmic bias across racialized groups in college student-success prediction. AERA Open, 10(1), 1–15. 

###

About AERA
The American Educational Research Association (AERA) is the largest national interdisciplinary research association devoted to the scientific study of education and learning. Founded in 1916, AERA advances knowledge about education, encourages scholarly inquiry related to education, and promotes the use of research to improve education and serve the public good. Find AERA on Facebook, X, LinkedIn, Instagram, Threads, and Bluesky.



Journal: AERA Open

DOI: 10.1177/23328584241258741

Article Title: Inside the Black Box: Detecting and Mitigating Algorithmic Bias Across Racialized Groups in College Student-Success Prediction

Article Publication Date: 11-Jul-2024
