Berkeley Lab researchers advance AI-driven plant root analysis

June 23, 2024
in Climate
RhizoNet harnesses the power of AI to transform how we study plant roots

Credit: Thor Swift, Lawrence Berkeley National Laboratory

In a world striving for sustainability, understanding the hidden half of a living plant – the roots – is crucial. Roots are not just an anchor; they are a dynamic interface between the plant and soil, critical for water uptake, nutrient absorption, and, ultimately, the survival of the plant. In an investigation to boost agricultural yields and develop crops resilient to climate change, scientists from Lawrence Berkeley National Laboratory’s (Berkeley Lab’s) Applied Mathematics and Computational Research (AMCR) and Environmental Genomics and Systems Biology (EGSB) Divisions have made a significant leap. Their latest innovation, RhizoNet, harnesses the power of artificial intelligence (AI) to transform how we study plant roots, offering new insights into root behavior under various environmental conditions.

This pioneering tool, detailed in a study published on June 5 in Scientific Reports, revolutionizes root image analysis by automating the process with exceptional accuracy. Traditional methods, which are labor-intensive and prone to errors, fall short when faced with the complex and tangled nature of root systems. RhizoNet steps in with a state-of-the-art deep learning approach, enabling researchers to track root growth and biomass with precision. Built on a convolutional neural network backbone, the new computational tool semantically segments plant roots for comprehensive biomass and growth assessment, changing how laboratories analyze plant roots and propelling efforts toward self-driving labs.

As Berkeley Lab’s Daniela Ushizima, lead investigator of the AI-driven software, explained, “The capability of RhizoNet to standardize root segmentation and phenotyping represents a substantial advancement in the systematic and accelerated analysis of thousands of images. This innovation is instrumental in our ongoing efforts to enhance the precision in capturing root growth dynamics under diverse plant conditions.” 

Getting to the Roots

Root analysis has traditionally relied on flatbed scanners and manual segmentation methods, which are not only time-consuming but also susceptible to errors, particularly in extensive multi-plant studies. Root image segmentation also presents significant challenges due to natural phenomena like bubbles, droplets, reflections, and shadows. The intricate nature of root structures and the presence of noisy backgrounds further complicate the automated analysis process. These complications are particularly acute at smaller spatial scales, where fine structures are sometimes only as wide as a pixel, making manual annotation extremely challenging even for expert human annotators.

EGSB recently introduced the latest version (2.0) of EcoFAB, a novel hydroponic device that facilitates in-situ plant imaging by offering a detailed view of plant root systems. EcoFAB – developed via a collaboration between EGSB, the DOE Joint Genome Institute (JGI), and the Climate & Ecosystem Sciences division at Berkeley Lab – is part of an automated experimental system designed to perform fabricated ecosystem experiments that enhance data reproducibility. RhizoNet, which processes color scans of plants grown in EcoFAB under specific nutritional treatments, addresses the scientific challenges of plant root analysis. It employs a Residual U-Net architecture – a semantic segmentation architecture that improves on the original U-Net by adding residual connections between the input and output of blocks at the same level (i.e., the same resolution) in both the encoder and decoder pathways – to deliver root segmentation specifically adapted to EcoFAB conditions, significantly enhancing prediction accuracy. The system also integrates a convexification procedure that encapsulates the roots identified across a time series and helps quickly delineate the primary root components from complex backgrounds. This integration is key for accurately monitoring root biomass and growth over time, especially in plants grown under varied nutritional treatments in EcoFABs.
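
For readers who want a concrete picture of the residual connections described above, here is a minimal PyTorch sketch of a single encoder-level block with a skip connection from its input to its output. The class name, layer sizes, and use of batch normalization are illustrative assumptions rather than the published RhizoNet implementation.

```python
import torch
import torch.nn as nn

class ResidualConvBlock(nn.Module):
    """One encoder/decoder level: two 3x3 convolutions plus a residual
    connection from the block's input to its output (illustrative sketch,
    not the published RhizoNet code)."""
    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels),
        )
        # 1x1 convolution so the residual matches the output channel count
        self.shortcut = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.body(x) + self.shortcut(x))

# Example: a first-level block applied to one RGB image patch
block = ResidualConvBlock(in_channels=3, out_channels=64)
patch = torch.randn(1, 3, 256, 256)
features = block(patch)  # shape: (1, 64, 256, 256)
```

In a full Residual U-Net, blocks like this are stacked with downsampling in the encoder and upsampling in the decoder, while skip connections carry features between encoder and decoder levels of the same resolution.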

To illustrate this, the new Scientific Reports paper details how the researchers used EcoFAB and RhizoNet to process root scans of Brachypodium distachyon (a small grass species) plants subjected to different nutrient deprivation conditions over approximately five weeks. These images, taken every three to seven days, provide vital data that help scientists understand how roots adapt to varying environments. The high-throughput nature of EcoBOT, the new image acquisition system for EcoFABs, offers research teams the potential for systematic experimental monitoring – as long as data is analyzed promptly. 

“We’ve made a lot of progress in reducing the manual work involved in plant cultivation experiments with the EcoBOT, and now RhizoNet is reducing the manual work involved in analyzing the data generated,” noted Peter Andeer, a research scientist in EGSB and a lead developer of EcoBOT, who collaborated with Ushizima on this work. “This increases our throughput and moves us toward the goal of self-driving labs.” Resources at the National Energy Research Scientific Computing Center (NERSC) – a U.S. Department of Energy (DOE) user facility located at Berkeley Lab – were used to train RhizoNet and perform inference, bringing this capability of computer vision to the EcoBOT, Ushizima noted.

“EcoBOT is capable of collecting images automatically, but it was unable to determine how the plant responds to different environmental changes – alive or not, growing or not,” Ushizima explained. “By measuring the roots with RhizoNet, we capture detailed data on root biomass and growth not solely to determine plant vitality but to provide comprehensive, quantitative insights that are not readily observable through conventional means. After training the model, it can be reused for multiple experiments (unseen plants).”

“In order to analyze the complex plant images from the EcoBOT, we created a new convolutional neural network for semantic segmentation,” added Zineb Sordo, a computer systems engineer in AMCR working as a data scientist on the project. “Our goal was to design an optimized pipeline that uses prior information about the time series to improve the model’s accuracy beyond manual annotations done on a single frame. RhizoNet handles noisy images, detecting plant roots from images so biomass and growth can be calculated.”
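
To illustrate how a convexification step like the one described earlier might isolate the primary root region and turn a noisy segmentation mask into a simple biomass proxy, here is a rough sketch using scikit-image and NumPy. The function name, the small-object threshold, and the pixel-to-area calibration are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np
from skimage.morphology import convex_hull_image, remove_small_objects

def root_area_from_mask(mask: np.ndarray, pixel_size_mm: float = 0.1,
                        min_object_px: int = 64) -> float:
    """Keep root pixels inside the convex hull of the larger structures and
    return their total area in mm^2 (an illustrative biomass proxy; the
    pixel size is a made-up calibration constant)."""
    cleaned = remove_small_objects(mask.astype(bool), min_size=min_object_px)
    hull = convex_hull_image(cleaned)              # convex envelope of the roots
    root_pixels = np.logical_and(mask.astype(bool), hull)
    return float(root_pixels.sum()) * pixel_size_mm ** 2

# Toy example: a thin vertical "root" in an otherwise empty mask
toy_mask = np.zeros((256, 256), dtype=np.uint8)
toy_mask[40:200, 120:124] = 1
print(root_area_from_mask(toy_mask))               # area of the retained root pixels
```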

One Patch at a Time

During model tuning, the findings indicated that using smaller image patches significantly enhances the model’s performance. In these patches, each neuron in the early layers of the artificial neural network has a smaller receptive field. This allows the model to capture fine details more effectively, enriching the latent space with diverse feature vectors. This approach not only improves the model’s ability to generalize to unseen EcoFAB images but also increases its robustness, enabling it to focus on thin objects and capture intricate patterns despite various visual artifacts.

Smaller patches also help prevent class imbalance by excluding sparsely labeled patches – those with less than 20% of annotated pixels, predominantly background. The team’s results show high accuracy, precision, recall, and Intersection over Union (IoU) for smaller patch sizes, demonstrating the model’s improved ability to distinguish roots from other objects or artifacts.
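
A minimal sketch of the two ideas above – tiling images into small patches while discarding sparsely labeled ones (those with less than 20% annotated pixels) and scoring predictions with Intersection over Union – might look as follows. The patch size and helper names are illustrative assumptions, not the paper's code.

```python
import numpy as np

def extract_patches(image: np.ndarray, mask: np.ndarray,
                    patch: int = 64, min_annotated: float = 0.20):
    """Tile an image and its label mask into non-overlapping patches,
    keeping only those where at least `min_annotated` of the pixels
    carry a root label (mirrors the class-imbalance filter in spirit)."""
    kept = []
    h, w = mask.shape
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            m = mask[y:y + patch, x:x + patch]
            if (m > 0).mean() >= min_annotated:
                kept.append((image[y:y + patch, x:x + patch], m))
    return kept

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection over Union for a pair of binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    return float(np.logical_and(pred, target).sum() / union) if union else 1.0
```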

To validate the performance of root predictions, the paper compares predicted root biomass to actual measurements. Linear regression analysis revealed a significant correlation, underscoring the precision of automated segmentation over manual annotations, which often struggle to distinguish thin root pixels from similar-looking noise. This comparison highlights the challenge human annotators face and showcases the advanced capabilities of the RhizoNet models, particularly when trained on smaller patch sizes.
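
In outline, the biomass validation described above is a least-squares fit between predicted and measured values; a sketch with placeholder numbers (not the paper's data) could look like this.

```python
import numpy as np
from scipy import stats

# Placeholder per-plant biomass values, standing in for real measurements
predicted_biomass = np.array([1.2, 2.4, 3.1, 4.0, 5.3])   # e.g., derived from RhizoNet masks
measured_biomass = np.array([1.1, 2.6, 3.0, 4.2, 5.1])    # e.g., harvested root weights

fit = stats.linregress(predicted_biomass, measured_biomass)
print(f"slope={fit.slope:.2f}, r^2={fit.rvalue ** 2:.3f}, p={fit.pvalue:.3g}")
```

A strong, statistically significant correlation in such a fit is what supports using the automated segmentation as a stand-in for manual biomass measurement.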

This study demonstrates the practical applications of RhizoNet in current research settings, the authors noted, and lays the groundwork for future innovations in sustainable energy solutions as well as carbon-sequestration technology using plants and microbes. The research team is optimistic about the implications of their findings. 

“Our next steps involve refining RhizoNet’s capabilities to further improve the detection and branching patterns of plant roots,” said Ushizima. “We also see potential in adapting and applying these deep-learning algorithms for roots in soil as well as new materials science investigations. We’re exploring iterative training protocols, hyperparameter optimization, and leveraging multiple GPUs. These computational tools are designed to assist science teams in analyzing diverse experiments captured as images, and have applicability in multiple areas.” 

Further research work in plant root growth dynamics is described in a pioneering book on autonomous experimentation edited by Ushizima and Berkeley Lab colleague Marcus Noack that was released in 2023. Other team members from Berkeley Lab include Peter Andeer, Trent Northen, Camille Catoulos, and James Sethian. This multidisciplinary group of scientists is part of Twin Ecosystems, a DOE Office of Science Genomic Science Program project that integrates computer vision software and autonomous experimental design software developed at Berkeley Lab (gpCAM) with an automated experimental system (EcoFAB and EcoBOT) to perform fabricated ecosystem experiments and enhance data reproducibility. The work of analyzing plant roots under different kinds of nutrition and environmental conditions is also part of the DOE’s Carbon Negative Earthshot initiative.



Journal: Scientific Reports

DOI: 10.1038/s41598-024-63497-8

Method of Research: Computational simulation/modeling

Subject of Research: Not applicable

Article Title: Berkeley Lab Researchers Advance AI-Driven Plant Root Analysis

Article Publication Date: 20-Jun-2024
