
A first: physical system learns nonlinear tasks without a traditional computer processor

July 8, 2024

Scientists run into a lot of tradeoffs trying to build and scale up brain-like systems that can perform machine learning. For instance, artificial neural networks are capable of learning complex language and vision tasks, but the process of training computers to perform these tasks is slow and requires a lot of power.


Training machines to learn digitally but perform tasks in analog—meaning the input varies with a physical quantity, such as voltage—can reduce time and power, but small errors can rapidly compound. An electrical network that physics and engineering researchers from the University of Pennsylvania previously designed is more scalable because errors don’t compound in the same way as the size of the system grows, but it is severely limited as it can only learn linear tasks, ones with a simple relationship between the input and output.

Now, the researchers have created an analog system that is fast, low-power, scalable, and able to learn more complex tasks, including "exclusive or" (XOR) relationships and nonlinear regression. They call it a contrastive local learning network: its components evolve on their own according to local rules, with no knowledge of the larger structure. Physics professor Douglas J. Durian compares it to how neurons in the human brain don't know what other neurons are doing, and yet learning emerges.
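XOR is the canonical task that no linear input-output map can capture, which is why it marks the jump beyond the earlier linear-only network. A quick numpy check (an illustrative sketch, not from the paper) shows that the best possible linear fit to the XOR truth table misses every point by 0.5, while adding a single nonlinear feature solves it exactly:

```python
import numpy as np

# XOR truth table: two binary inputs, one target output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

# Best purely linear fit (with a bias term): y ~ a*x0 + b*x1 + c.
A = np.hstack([X, np.ones((4, 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
linear_err = np.abs(A @ coef - y).max()   # every prediction ends up at 0.5

# One nonlinear feature (the product x0*x1) makes the task exactly solvable:
# XOR = x0 + x1 - 2*x0*x1.
A_nl = np.hstack([A, (X[:, 0] * X[:, 1])[:, None]])
coef_nl, *_ = np.linalg.lstsq(A_nl, y, rcond=None)
nonlinear_err = np.abs(A_nl @ coef_nl - y).max()

print(round(linear_err, 3), round(nonlinear_err, 3))  # prints 0.5 0.0
```

The same gap is why a network built only from linear resistive elements cannot learn XOR, no matter how it is trained, and why the nonlinear transistor-based elements matter.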

“It can learn, in a machine learning sense, to perform useful tasks, similar to a computational neural network, but it is a physical object,” says physicist Sam Dillavou, a postdoc in the Durian Research Group and first author on a paper about the system published in Proceedings of the National Academy of Sciences.

“One of the things we’re really excited about is that, because it has no knowledge of the structure of the network, it’s very tolerant to errors, it’s very robust to being made in different ways, and we think that opens up a lot of opportunities to scale these things up,” engineering professor Marc Z. Miskin says.

“I think it is an ideal model system that we can study to get insight into all kinds of problems, including biological problems,” physics professor Andrea J. Liu says. She also says it could be helpful in interfacing with devices that collect data that require processing, such as cameras and microphones.

In the paper, the authors say their self-learning system “provides a unique opportunity for studying emergent learning. In comparison to biological systems, including the brain, our system relies on simpler, well-understood dynamics, is precisely trainable, and uses simple modular components.”

This research is based on the Coupled Learning framework that Liu and postdoc Menachem (Nachi) Stern devised, publishing their findings in 2021. In this paradigm, a physical system that is not designed to accomplish a particular task adapts to applied inputs in order to learn it, using local learning rules and no centralized processor.

Dillavou says he came to Penn specifically for this project, and he worked on translating the framework from working in simulation to working in its current physical design, which can be made using standard circuitry components. “One of the craziest parts about this is the thing really is learning on its own; we’re just kind of setting it up to go,” Dillavou says. Researchers only feed in voltages as the input, and then the transistors that connect the nodes update their properties based on the Coupled Learning rule.
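The Coupled Learning rule compares two electrical states of the same circuit: a "free" state, where only the inputs are applied, and a "clamped" state, where the output is additionally nudged toward its target. Each edge then adjusts itself using only its own voltage drops in the two states. The following is a minimal numpy sketch of that rule on a small network of ideal adjustable conductances (as in the original 2021 simulations; the hardware uses transistors instead, and the network layout, node roles, and learning-rate values below are illustrative assumptions, not the paper's circuit):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny resistor network: nodes 0 and 1 are inputs, node 2 is hidden, node 3 is the output.
edges = [(0, 2), (1, 2), (0, 3), (1, 3), (2, 3)]
G = rng.uniform(0.5, 1.5, len(edges))  # adjustable edge conductances (the learned parameters)

def solve(clamped):
    """Steady-state node voltages with the nodes in `clamped` held at fixed voltages."""
    L = np.zeros((4, 4))               # graph Laplacian weighted by conductances
    for (i, j), g in zip(edges, G):
        L[i, i] += g; L[j, j] += g
        L[i, j] -= g; L[j, i] -= g
    fixed = list(clamped)
    free = [k for k in range(4) if k not in clamped]
    v = np.zeros(4)
    v[fixed] = list(clamped.values())
    rhs = -L[np.ix_(free, fixed)] @ v[fixed]
    v[free] = np.linalg.solve(L[np.ix_(free, free)], rhs)  # Kirchhoff's current law
    return v

def drops(v):
    """Voltage drop across each edge."""
    return np.array([v[i] - v[j] for i, j in edges])

eta, alpha = 0.2, 0.05                        # nudge amplitude and learning rate (illustrative)
for _ in range(3000):
    x0, x1 = rng.uniform(0.0, 1.0, 2)
    target = 0.7 * x0 + 0.3 * x1              # task: a fixed linear map of the inputs
    v_free = solve({0: x0, 1: x1})            # free state: only inputs clamped
    nudged = v_free[3] + eta * (target - v_free[3])
    v_clamp = solve({0: x0, 1: x1, 3: nudged})  # clamped state: output nudged toward target
    # Local contrastive rule: each edge updates from its own two voltage drops only.
    G += (alpha / eta) * (drops(v_free) ** 2 - drops(v_clamp) ** 2)
    G = np.clip(G, 1e-3, None)                # conductances must stay positive

print(solve({0: 1.0, 1: 0.0})[3])             # should approach 0.7 after training
```

No element ever sees the global error or the network structure: the update on each edge depends only on that edge's own voltage drops, which is what makes the scheme local and, per the researchers, tolerant to fabrication differences. In the physical device, the voltage drops arise from the circuit's own physics rather than from a solver.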

“Because the way that it both calculates and learns is based on physics, it’s way more interpretable,” Miskin says. “You can actually figure out what it’s trying to do because you have a good handle on the underlying mechanism. That’s kind of unique because a lot of other learning systems are black boxes where it’s much harder to know why the network did what it did.”

Durian says he hopes this “is the beginning of an enormous field,” noting that another postdoc in his lab, Lauren Altman, is building mechanical versions of contrastive local learning networks.

The researchers are currently working on scaling up the design, and Liu says there are a lot of questions about the duration of memory storage, effects of noise, the best architecture for the network, and whether there are better forms of nonlinearity.

“It’s not really clear what changes as we scale up a learning system,” Miskin says. “If you think of a brain, there’s a huge gap between a worm with 300 neurons and a human being, and it’s not obvious where those capabilities emerge, how things change as you scale up. Having a physical system which you can make bigger and bigger and bigger and bigger is an opportunity to actually study that.”

Sam Dillavou is a postdoc in the Durian Research Group.

Douglas J. Durian is the Mary Amanda Wood Professor of Physics and Astronomy in the School of Arts & Sciences.

Marc Z. Miskin is an assistant professor of electrical and systems engineering in the School of Engineering & Applied Science.

Andrea J. Liu is the Hepburn Professor of Physics in the School of Arts & Sciences.

Other authors are Benjamin D. Beyer and Menachem Stern of the Department of Physics and Astronomy in the School of Arts & Sciences.

This research was supported by the National Science Foundation (MRSEC/DMR-1720530, MRSEC/DMR-2309043, and DMR-2005749), the Simons Foundation (327939), and the U.S. Department of Energy (DE-SC0020963).



Journal: Proceedings of the National Academy of Sciences

DOI: 10.1073/pnas.2319718121

Method of Research: Experimental study

Subject of Research: Not applicable

Article Title: Machine learning without a processor: Emergent learning in a nonlinear analog network

Article Publication Date: 2-Jul-2024

COI Statement: The authors declare no competing interest.

© 2025 Scienmag - Science Magazine
