New machine learning algorithm promises advances in computing

May 9, 2024
in Mathematics

COLUMBUS, Ohio – Systems controlled by next-generation computing algorithms could give rise to better and more efficient machine learning products, a new study suggests. 

Using machine learning tools to create a digital twin, or virtual copy, of an electronic circuit that exhibits chaotic behavior, the researchers found they could accurately predict how the circuit would behave and use those predictions to control it.

Many everyday devices, such as thermostats and cruise control systems, rely on linear controllers, which apply simple rules to steer a system toward a desired value. A thermostat, for example, uses such a rule to decide how much to heat or cool a space based on the difference between the current and desired temperatures.
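
In its simplest form, such a rule is just proportional feedback: the control action scales with the error between the desired and measured values. A minimal sketch in Python (the function name and gain value are illustrative, not from the study):

```python
# Proportional feedback: the thermostat-style rule described above.
# The command scales with the error between setpoint and measurement.
# Names and the gain value are illustrative, not from the study.

def proportional_control(setpoint: float, measurement: float, gain: float = 0.5) -> float:
    """Return a control command proportional to the error."""
    error = setpoint - measurement
    return gain * error

# Example: room at 18 C with a 21 C target -> a positive "heat" command.
print(proportional_control(setpoint=21.0, measurement=18.0))  # 1.5
```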

Yet because of how straightforward these algorithms are, they struggle to control systems that display complex behavior, like chaos.

As a result, advanced devices like self-driving cars and aircraft often rely on machine learning-based controllers, which use intricate networks to learn the control algorithm best suited to operating the system. These algorithms have significant drawbacks, however, the most serious being that they can be extremely challenging and computationally expensive to implement.

Now, having access to an efficient digital twin is likely to have a sweeping impact on how scientists develop future autonomous technologies, said Robert Kent, lead author of the study and a graduate student in physics at The Ohio State University. 

“The problem with most machine learning-based controllers is that they use a lot of energy or power and they take a long time to evaluate,” said Kent. “Developing traditional controllers for them has also been difficult because chaotic systems are extremely sensitive to small changes.”

These issues, he said, are critical in situations where milliseconds can make a difference between life and death, such as when self-driving vehicles must decide to brake to prevent an accident.

The study was published recently in Nature Communications.

Compact enough to fit on an inexpensive computer chip that could balance on a fingertip, and able to run without an internet connection, the team’s digital twin was built to optimize a controller’s efficiency and performance, which the researchers found reduced power consumption. It achieves this largely because it was trained with a machine learning approach called reservoir computing.

“The great thing about the machine learning architecture we used is that it’s very good at learning the behavior of systems that evolve in time,” Kent said. “It’s inspired by how connections spark in the human brain.”
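
The paper's own implementation is not reproduced here, but a generic echo state network, one common form of reservoir computing, shows why training is so cheap: the recurrent "reservoir" is fixed and random, and only a linear readout is fitted. A minimal sketch under those assumptions:

```python
# Minimal echo state network (reservoir computing) sketch.
# A fixed random recurrent reservoir is driven by the input; only the
# linear readout is trained, here with ridge regression. This is a
# generic illustration, not the authors' implementation.
import numpy as np

rng = np.random.default_rng(seed=0)
n_in, n_res = 1, 200

W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))   # fixed input weights
W = rng.normal(0.0, 1.0, size=(n_res, n_res))       # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # scale spectral radius < 1

def run_reservoir(u):
    """Return reservoir states for an input sequence u of shape [T, n_in]."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)[:, None]
X = run_reservoir(u[:-1])   # reservoir states, shape [T-1, n_res]
Y = u[1:]                   # next-step targets, shape [T-1, 1]

# Training is a single linear solve (ridge regression), the cheap part.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
print("train MSE:", np.mean((X @ W_out - Y) ** 2))
```

Because the only trained parameters live in `W_out`, fitting reduces to a single least-squares solve rather than many gradient-descent passes, which is what makes on-chip, on-the-fly learning plausible.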

According to the study, similarly sized computer chips have been used in devices like smart fridges, but this capacity to learn time-evolving behavior makes the new model especially well equipped to handle dynamic systems such as self-driving vehicles, as well as heart monitors, which must adapt quickly to a patient’s heartbeat.

“Big machine learning models have to consume lots of power to crunch data and come out with the right parameters, whereas our model and training is so extremely simple that you could have systems learning on the fly,” he said. 

To test this theory, the researchers directed their model to complete complex control tasks and compared its results with those of previous control techniques. Their approach achieved higher accuracy on the tasks than its linear counterpart and was significantly less computationally complex than a previous machine learning-based controller.

“The increase in accuracy was pretty significant in some cases,” said Kent. Although the results showed that their algorithm requires more energy than a linear controller to operate, the tradeoff means that, once powered up, the team’s model lasts longer and is considerably more efficient than current machine learning-based controllers on the market.

“People will find good use out of it just based on how efficient it is,” Kent said. “You can implement it on pretty much any platform and it’s very simple to understand.” The algorithm was recently made available to scientists. 

Beyond inspiring potential advances in engineering, there is an equally important economic and environmental incentive to create more power-efficient algorithms, Kent said.

As society becomes more dependent on computers and AI for nearly all aspects of daily life, demand for data centers is soaring, leading many experts to worry over digital systems’ enormous power appetite and what future industries will need to do to keep up with it. 

And because building these data centers and running large-scale computing experiments can generate a large carbon footprint, scientists are looking for ways to curb the technology’s carbon emissions.

To build on these results, future work will likely focus on training the model for other applications, such as quantum information processing, Kent said. In the meantime, he expects these techniques to spread widely through the scientific community.

“Not enough people know about these types of algorithms in the industry and engineering, and one of the big goals of this project is to get more people to learn about them,” said Kent. “This work is a great first step toward reaching that potential.”

This study was supported by the U.S. Air Force’s Office of Scientific Research. Other Ohio State co-authors include Wendson A.S. Barbosa and Daniel J. Gauthier. 

Contact: Robert Kent, Kent.321@osu.edu

Written by: Tatyana Woodall, Woodall.52@osu.edu



Journal: Nature Communications

DOI: 10.1038/s41467-024-48133-3

Method of Research: Experimental study

Subject of Research: Not applicable

Article Title: Controlling chaos using edge computing hardware

Article Publication Date: 8-May-2024
