Scienmag
Study: Algorithms used by universities to predict student success may be racially biased

July 11, 2024
in Science Education

Washington, July 11, 2024—Predictive algorithms commonly used by colleges and universities to determine whether students will be successful may be racially biased against Black and Hispanic students, according to new research published today in AERA Open, a peer-reviewed journal of the American Educational Research Association. The study—conducted by Denisa Gándara (University of Texas at Austin), Hadis Anahideh (University of Illinois Chicago), Matthew Ison (Northern Illinois University), and Lorenzo Picchiarini (University of Illinois Chicago)—found that predictive models also tend to overestimate the potential success of White and Asian students. 

Video: Co-authors Denisa Gándara and Hadis Anahideh discuss findings and implications of the study

“Our results show that predictive models yield less accurate results for Black and Hispanic students, systematically making more errors,” said study co-author Denisa Gándara, an assistant professor in the College of Education at the University of Texas at Austin.

These models incorrectly predict failure for Black and Hispanic students 19 percent and 21 percent of the time, respectively, compared to false negative rates for White and Asian groups of 12 percent and 6 percent. At the same time, the models incorrectly predict success for White and Asian students 65 percent and 73 percent of the time, respectively, compared to false positive rates for Black and Hispanic students of 33 percent and 28 percent.
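The figures above are standard group-wise error rates: the false negative rate (true successes predicted to fail) and the false positive rate (true failures predicted to succeed), computed separately for each racial group. As a minimal illustrative sketch (not the authors' code, and with hypothetical function and variable names), these rates can be computed from binary predictions like this:

```python
from collections import defaultdict

def group_error_rates(y_true, y_pred, groups):
    """Compute per-group false negative and false positive rates.

    y_true: 1 = student actually succeeded, 0 = did not.
    y_pred: model's binary prediction of success.
    groups: group label for each student.
    """
    counts = defaultdict(lambda: {"fn": 0, "pos": 0, "fp": 0, "neg": 0})
    for t, p, g in zip(y_true, y_pred, groups):
        c = counts[g]
        if t == 1:
            c["pos"] += 1          # actual success
            if p == 0:
                c["fn"] += 1       # predicted failure for a success
        else:
            c["neg"] += 1          # actual non-success
            if p == 1:
                c["fp"] += 1       # predicted success for a non-success
    return {
        g: {
            "fnr": c["fn"] / c["pos"] if c["pos"] else float("nan"),
            "fpr": c["fp"] / c["neg"] if c["neg"] else float("nan"),
        }
        for g, c in counts.items()
    }
```

Comparing these rates across groups, as the study does, is what reveals the asymmetry: a higher FNR for one group means its successful students are more often flagged as likely failures.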

“Our findings reveal a troubling pattern—models that incorporate commonly used features to predict success for college students end up forecasting worse outcomes for racially minoritized groups and are often inaccurate,” said co-author Hadis Anahideh, an assistant professor of industrial engineering at the University of Illinois Chicago. “This underscores the necessity of addressing inherent biases in predictive analytics in education settings.”

The study used nationally representative data spanning 10 years from the U.S. Department of Education’s National Center for Education Statistics, including 15,244 students.

Findings from the study also point to the potential value of using statistical techniques to mitigate bias, although there are still limitations.

“While our research tested various bias-mitigation techniques, we found that no single approach fully eliminates disparities in prediction outcomes or accuracy across different fairness notions,” said Anahideh.  
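The study tested several mitigation techniques; one common family it points toward is post-processing. As an illustrative sketch only (not necessarily a method the authors used, with a hypothetical `target_fnr` parameter), group-specific decision thresholds can be chosen so that no group's false negative rate exceeds a target, rather than applying one global cutoff:

```python
def fit_group_thresholds(scores, y_true, groups, target_fnr=0.15):
    """For each group, pick the largest decision threshold whose
    false negative rate does not exceed target_fnr.

    Students with score >= their group's threshold are predicted
    to succeed. Per-group thresholds roughly equalize FNRs.
    """
    thresholds = {}
    for g in set(groups):
        # Scores of this group's actual successes, ascending.
        pos_scores = sorted(
            s for s, t, gg in zip(scores, y_true, groups)
            if gg == g and t == 1
        )
        n = len(pos_scores)
        if n == 0:
            thresholds[g] = 0.5  # fallback when no positives observed
            continue
        # At most k true successes may fall below the threshold.
        k = int(target_fnr * n)
        thresholds[g] = pos_scores[k]
    return thresholds
```

This kind of adjustment equalizes one fairness notion (error rates) at the cost of others (a single uniform threshold), which is exactly the tension the quote describes: no single approach satisfies every fairness criterion at once.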

Higher education institutions are increasingly turning to machine learning and artificial intelligence algorithms that predict student success to inform various decisions, including those related to admissions, budgeting, and student-success interventions. In recent years, concerns have been raised that these predictive models may perpetuate social disparities.

“As colleges and universities become more data-informed, it is imperative that predictive models are designed with attention to their biases and potential consequences,” said Gándara. “It is critical for institutional users to be aware of the historical discrimination reflected in the data and to not penalize groups that have been subjected to racialized social disadvantages.”

The study’s authors noted that the practical implications of the findings are significant but depend on how the predicted outcomes are used. If models are used to make college admissions decisions, admission may be denied to racially minoritized students if the models show that previous students of the same racial categories had lower success. Higher education observers have also warned that predictions could lead to educational tracking, encouraging Black and Hispanic students to pursue courses or majors that are perceived as less challenging.

On the other hand, biased models may lead to greater support for disadvantaged students. By falsely predicting failure for racially minoritized students who succeed, the model may direct greater resources to those students. Even then, Gándara noted, practitioners must be careful not to produce deficit narratives about minoritized students, treating them as though they had a lower probability of success.

“Our findings point to the importance of institutions training end users on the potential for algorithmic bias,” said Gándara. “Awareness can help users contextualize predictions for individual students and make more informed decisions.”

She noted that policymakers might consider policies to monitor or evaluate the use of predictive analytics, including their design, bias in predicted outcomes, and applications.

Funding note: This research was supported by the Institute of Education Sciences at the U.S. Department of Education.

Study citation: Gándara, D., Anahideh, H., Ison, M., & Picchiarini, L. (2024). Inside the black box: Detecting and mitigating algorithmic bias across racialized groups in college student-success prediction. AERA Open, 10(1), 1–15. 

###

About AERA
The American Educational Research Association (AERA) is the largest national interdisciplinary research association devoted to the scientific study of education and learning. Founded in 1916, AERA advances knowledge about education, encourages scholarly inquiry related to education, and promotes the use of research to improve education and serve the public good.



Journal: AERA Open
DOI: 10.1177/23328584241258741
Article Title: Inside the Black Box: Detecting and Mitigating Algorithmic Bias Across Racialized Groups in College Student-Success Prediction
Article Publication Date: 11-Jul-2024
