Researchers to tackle the mysteries of the AI 'black box' problem

December 11, 2019
in Technology and Engineering

Researchers are aiming to shed light into one of the most significant problems in artificial intelligence

AI computing systems are moving into many parts of our lives and offer great potential, from self-driving vehicles and assisting doctors with diagnosing health conditions to autonomous search and rescue robots.

However, one major unresolved issue, particularly with the branch of AI known as neural networks, is that when things go wrong, scientists are often at a loss to explain why, because the decision-making inside these systems is poorly understood. This is known as the 'black box' problem.

A new 15-month research project led by Lancaster University, and involving the University of Liverpool, will seek to dispel the mysteries of the black box problem and to discover a new way of creating deep-learning AI computing models whose decisions are transparent and explainable.

The funding is awarded through the Offshore Robotics for the Certification of Assets (ORCA) Hub, which is managed by the Edinburgh Centre for Robotics (Heriot-Watt University and the University of Edinburgh). The Hub develops Robotics, Artificial Intelligence and Autonomous Systems for the offshore sector. The Partnership Resource Funding (PRF) awards fund research activities relating to the existing ORCA strategic themes through an identified white space, innovative complementary research or a collaborative project. The funding awards will work to advance the Hub's research and technology through clearly defined impact acceleration activities. The PRF enables the Hub to expand its current work into new areas with a clearly identified industrial need.

The ‘Towards the Accountable and Explainable Learning-enabled Autonomous Robotic Systems’ project will develop a series of safety verification and testing techniques for the development of AI algorithms. These will help to guarantee robust and explainable decisions taken by the systems.
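
The project's specific verification and testing methods are not detailed in this announcement, but as a rough illustration of the kind of property such techniques examine, the sketch below (in Python with PyTorch, an assumption about tooling; `model`, `x` and `epsilon` are hypothetical placeholders) empirically checks whether a model's decision stays stable under small input perturbations. A formal verification tool would prove this property exhaustively rather than by sampling.

```python
# Hedged sketch only: an empirical robustness check, not the project's
# actual verification tooling. It samples points in a small L-infinity
# ball around an input and tests whether the model's decision changes.
import torch

def decision_is_stable(model, x, epsilon=0.01, n_samples=100):
    """Return True if the predicted class is unchanged for sampled
    perturbations of x with magnitude at most epsilon."""
    model.eval()
    with torch.no_grad():
        reference = model(x).argmax(dim=-1)
        for _ in range(n_samples):
            noise = (torch.rand_like(x) * 2 - 1) * epsilon  # uniform in [-eps, eps]
            perturbed = (x + noise).clamp(0.0, 1.0)         # keep inputs in a valid range
            if not torch.equal(model(perturbed).argmax(dim=-1), reference):
                return False  # found a perturbation that flips the decision
    return True
```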

The researchers will use a technique called ‘adversarial training’. This involves presenting the system with a given situation, where it learns how to perform an action – such as identifying and picking up an object. The researchers then change various elements in the scenario, such as colour, shape, environment, and observe how the system learns through trial and error. The researchers believe these observations could lead to greater understanding of how the system learns and provide insights into its decision-making.
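
As a loose illustration of this perturb-and-observe idea (not the project's code; `agent`, `render_scene` and the attribute values are hypothetical placeholders), one might sweep over scene attributes such as colour, shape and background and log the system's chosen action each time, to see which factors its decisions actually depend on.

```python
# Hedged sketch of systematically varying scenario elements and recording
# the system's behaviour, to map which attributes change its decisions.
import itertools

COLOURS = ["red", "green", "blue"]
SHAPES = ["cube", "cylinder", "sphere"]
BACKGROUNDS = ["lab", "warehouse", "outdoor"]

def probe_decision_making(agent, render_scene):
    """Sweep over scene variations and log the agent's chosen action."""
    observations = []
    for colour, shape, background in itertools.product(COLOURS, SHAPES, BACKGROUNDS):
        scene = render_scene(colour=colour, shape=shape, background=background)
        action = agent.act(scene)  # the decision under study
        observations.append({"colour": colour, "shape": shape,
                             "background": background, "action": action})
    return observations
```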

By developing ways to create neural network systems whose decision-making can be understood and predicted, the research will be key to unleashing autonomous systems in areas where safety is critical, such as vehicles and robots in industry.

Dr Wenjie Ruan, Lecturer at Lancaster University’s School of Computing and Communications and Lead Investigator on the project, said: “Although deep learning, as one of the most remarkable AI techniques, has achieved great success in many applications, it has its own problems when applied to safety-critical systems, including opaque decision-making mechanisms and vulnerability to adversarial attacks.

“This project provides a great opportunity for us to bridge the research gap between deep learning techniques and safety-critical systems.

“In collaboration with researchers from the University of Liverpool, we will develop an accountable and explainable deep learning model by drawing on recent advances in safety verification and testing techniques.

“This research will ultimately enable end-users to understand and trust the decisions made by deep learning models in various safety-critical systems, including self-driving cars, rescue robots and applications in the healthcare domain.”

Dr Xiaowei Huang, Lecturer at the University of Liverpool and Co-investigator on this project, said: “This project has great potential to lead to significant impacts on both artificial intelligence and robotics and autonomous systems, where AI techniques have been used extensively. In addition to the theoretical advancement, the project is expected to produce practical methodology, tools, and demonstrations, in order to provide guidance to the industrial development of safe and accountable robotic systems.”

###

Media Contact
Ian Boydon
[email protected]

Tags: Computer Science, Software Engineering, Technology/Engineering/Computer Science, Theory/Design