
Artificial intelligence to uncover human biases


Credit: Insilico Medicine

Since 2015, artificially intelligent systems have surpassed human accuracy in many tasks, including image recognition, voice recognition, autonomous driving, and strategy games such as Go. Early experiments such as the Beauty.AI beauty contest developed by Youth Laboratories and the Aging.AI predictor of chronological age developed by Insilico Medicine uncovered various biases in AI systems, as well as many opportunities for using AI to detect and report human biases. Advances in artificial intelligence, specifically in deep learning and reinforcement learning, present both threats and opportunities.

"Bias, be it race, sex, age or any other type, is a hugely contributing factor that shapes science and society. This paper is important not only because it demonstrates the apparent and pervasive prejudice that exists in executive boards around the globe, but also because it shows how AI and deep learning can visualize bias in complex systems," said Morten Scheibye-Knudsen, MD, PhD, Center for Healthy Aging, University of Copenhagen.

Scientists with backgrounds ranging from computer science to genetics, from Youth Laboratories, the University of Copenhagen, the University of Virginia, Insilico Medicine and the University of Oxford, have started a series of studies to understand the opportunities that deep learning systems present for uncovering human bias.

The first research project of Diversity.AI focuses on using deep neural networks to evaluate race and sex diversity in the world's largest companies. The work-in-progress paper, titled "Evaluating race and sex diversity in the world's largest companies using deep neural networks," was deposited to the arXiv preprint server on July 10th. New Scientist was the first news outlet to announce the research project. Scientists of diverse backgrounds are invited to contribute to improving the datasets and methods presented in the study.

"Recently a number of publications have reported significant racial and gender bias in artificially intelligent systems. It has been suggested that the detected bias is not intentionally introduced but is an artifact of human-generated training data that contains such biases. In this paper we have demonstrated AI designed to detect human biases, and it is our hope that AI itself can be used to mitigate the problem of contaminated datasets in the training of future AIs," said Roman Yampolskiy, PhD, Computer Science and Engineering.

Diversity is one of the fundamental properties for the survival of species, populations, and organizations. Recent advances in deep learning allow for the rapid and automatic assessment of organizational diversity and of possible discrimination by race, sex, age and other parameters. Automating the assessment of organizational diversity using deep neural networks, and thereby eliminating the human factor, may provide a set of real-time unbiased reports to all stakeholders.

In this pilot project, deep-learned predictors were used to detect race and sex in the executive-management and board-member profiles of the 500 largest companies from the 2016 Forbes Global 2000 list; the predicted ratios were compared to the ratios within each company's country of origin, and the companies were ranked by a sex-, age- and race-diversity index (DI). While the study has many limitations, and no claims are being made concerning individual companies, it demonstrates a method for the rapid and impartial assessment of organizational diversity using deep neural networks.
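The release does not give the paper's exact diversity-index formula, so the ranking step above can only be illustrated with a hypothetical sketch: here the company's predicted female share (as a deep-learned classifier might report it) is compared against the baseline share in its country of origin, and companies are ranked by the resulting score. The function name, index definition, and toy figures are all assumptions for illustration, not the authors' method.

```python
# Hypothetical diversity-index sketch (not the paper's actual DI formula).
# Assumption: a face classifier has already produced, per company, the
# predicted share of a group on its board, and we know the country baseline.

def diversity_index(company_ratio: float, country_ratio: float) -> float:
    """Company's predicted share relative to its country baseline,
    capped at 1.0 so parity (or better) scores 1.0."""
    if country_ratio == 0:
        # Degenerate baseline: score 1.0 only if the company also has 0.
        return 1.0 if company_ratio == 0 else 0.0
    return min(company_ratio / country_ratio, 1.0)

# Toy example: a board predicted to be 20% female in a country whose
# population is 50% female scores 0.4; a 45% board scores 0.9.
companies = {
    "CompanyA": (0.20, 0.50),
    "CompanyB": (0.45, 0.50),
}
ranked = sorted(companies,
                key=lambda c: diversity_index(*companies[c]),
                reverse=True)
print(ranked)  # CompanyB ranks above CompanyA
```

Capping at 1.0 is one of many possible design choices; an index could equally penalize over- and under-representation symmetrically, which is why the real DI definition should be taken from the arXiv paper itself.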

"Systems trained on biased data sets or in biased environments may discriminate against specific population groups based on race, religious attributes and clothing, sex, age and other features. On the other hand, systems trained on diverse data sets to rapidly and accurately detect these features may assist in uncovering human bias in various organizations, communities and even governments. In the future it may be possible to apply AI to detect and prevent discrimination, harassment and other faulty human behaviors," said Konstantin Chekanov, PhD, Diversity AI Lead at Youth Laboratories, Ltd.

"A picture is worth a thousand genes and provides a vast amount of information about a person's health status. While studying organizational diversity is important, using AI to accurately identify various features in pictures and videos may help us diagnose and prevent diseases and personalize interventions," said Alex Zhavoronkov, PhD, CEO of Insilico Medicine and advisor to Youth Laboratories.

###

About Insilico Medicine, Inc.

Insilico Medicine, Inc. is an artificial intelligence company located at the Emerging Technology Centers at the Johns Hopkins University Eastern campus in Baltimore, with R&D resources in Belgium, Russia, and the UK, hiring talent through hackathons and competitions. It utilizes advances in genomics, big data analysis and deep learning for in silico drug discovery and drug repurposing for aging and age-related diseases. The company pursues internal drug discovery programs in cancer, Parkinson's, Alzheimer's, ALS, diabetes, sarcopenia and geroprotector discovery. Through its Pharma.AI division, the company provides advanced machine learning services to biotechnology, pharmaceutical, and skin care companies. In 2017, NVIDIA named Insilico one of the top 5 AI companies for social impact. Brief company video: https://www.youtube.com/watch?v=l62jlwgL3v8

http://www.insilicomedicine.com

Media Contact

Qingsong Zhu
[email protected]
443-451-7212
@InSilicoMeds

http://www.insilicomedicine.com
