Fully-automated root image analysis (faRIA)

Publication: Contribution to journal › Article › Research › Peer-reviewed

Authors

  • Narendra Narisetti
  • Michael Henke
  • Christiane Seiler
  • Astrid Junker
  • Jörn Ostermann
  • Thomas Altmann
  • Evgeny Gladilin

External organisations

  • Leibniz-Institut für Pflanzengenetik und Kulturpflanzenforschung (IPK)
  • Masaryk University

Details

Original language: English
Article number: 16047
Journal: Scientific Reports
Volume: 11
Issue number: 1
Early online date: 6 Aug 2021
Publication status: Published - Dec 2021

Abstract

High-throughput root phenotyping in soil has become an indispensable quantitative tool for assessing the effects of climatic factors and molecular perturbations on plant root morphology, development and function. To efficiently analyse large amounts of structurally complex soil-root images, advanced methods for automated image segmentation are required. Due to the often unavoidable overlap between the intensities of foreground and background regions, simple thresholding methods are generally not suitable for the segmentation of root regions. Higher-level cognitive models such as convolutional neural networks (CNNs) can segment roots from heterogeneous and noisy background structures; however, they require a representative set of manually segmented (ground truth) images. Here, we present a GUI-based tool for fully automated quantitative analysis of root images using a pre-trained CNN model that relies on an extension of the U-Net architecture. The CNN framework was designed to efficiently segment root structures of different size, shape and optical contrast on low-budget hardware. The model was trained on a set of 6465 masks derived from 182 manually segmented near-infrared (NIR) maize root images. Our experimental results show that the proposed approach achieves a Dice coefficient of 0.87, outperforming existing tools (e.g., SegRoot, with a Dice coefficient of 0.67), and that it applies not only to NIR images but also to other imaging modalities and plant species, such as barley and arabidopsis soil-root images from LED-rhizotron and UV imaging systems, respectively. In summary, the developed software framework enables users to analyse soil-root images in a fully automated manner (i.e., without manual interaction with data or parameter tuning), providing quantitative plant scientists with a powerful analytical tool.
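The Dice coefficient reported in the abstract (0.87 for faRIA vs. 0.67 for SegRoot) measures the overlap between a predicted segmentation mask and the manually segmented ground truth. A minimal sketch of how this metric is computed on binary masks (this is an illustration of the standard definition, not the authors' implementation; the function name and array shapes are chosen for the example):

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Sørensen-Dice coefficient between two binary masks:
    2 * |pred ∩ truth| / (|pred| + |truth|), in [0, 1]."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        # Both masks empty: conventionally treated as perfect agreement.
        return 1.0
    return 2.0 * intersection / total

# Example: two 2x2 masks of 2 foreground pixels each, overlapping in 1 pixel.
pred = [[1, 1], [0, 0]]
truth = [[1, 0], [1, 0]]
print(dice_coefficient(pred, truth))  # 2*1 / (2+2) = 0.5
```

A value of 1.0 means the predicted root pixels coincide exactly with the ground-truth mask; 0.0 means no overlap at all.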


Cite

Fully-automated root image analysis (faRIA). / Narisetti, Narendra; Henke, Michael; Seiler, Christiane et al.
In: Scientific Reports, Vol. 11, No. 1, 16047, 12.2021.


Narisetti, N, Henke, M, Seiler, C, Junker, A, Ostermann, J, Altmann, T & Gladilin, E 2021, 'Fully-automated root image analysis (faRIA)', Scientific Reports, vol. 11, no. 1, 16047. https://doi.org/10.1038/s41598-021-95480-y
Narisetti, N., Henke, M., Seiler, C., Junker, A., Ostermann, J., Altmann, T., & Gladilin, E. (2021). Fully-automated root image analysis (faRIA). Scientific Reports, 11(1), Article 16047. https://doi.org/10.1038/s41598-021-95480-y
Narisetti N, Henke M, Seiler C, Junker A, Ostermann J, Altmann T et al. Fully-automated root image analysis (faRIA). Scientific Reports. 2021 Dec;11(1):16047. Epub 2021 Aug 6. doi: 10.1038/s41598-021-95480-y
Narisetti, Narendra ; Henke, Michael ; Seiler, Christiane et al. / Fully-automated root image analysis (faRIA). In: Scientific Reports. 2021 ; Vol. 11, No. 1.
BibTeX
@article{74fa2aa3c67c43e8b53194e87bc468a7,
title = "Fully-automated root image analysis (faRIA)",
abstract = "High-throughput root phenotyping in the soil became an indispensable quantitative tool for the assessment of effects of climatic factors and molecular perturbation on plant root morphology, development and function. To efficiently analyse a large amount of structurally complex soil-root images advanced methods for automated image segmentation are required. Due to often unavoidable overlap between the intensity of fore- and background regions simple thresholding methods are, generally, not suitable for the segmentation of root regions. Higher-level cognitive models such as convolutional neural networks (CNN) provide capabilities for segmenting roots from heterogeneous and noisy background structures, however, they require a representative set of manually segmented (ground truth) images. Here, we present a GUI-based tool for fully automated quantitative analysis of root images using a pre-trained CNN model, which relies on an extension of the U-Net architecture. The developed CNN framework was designed to efficiently segment root structures of different size, shape and optical contrast using low budget hardware systems. The CNN model was trained on a set of 6465 masks derived from 182 manually segmented near-infrared (NIR) maize root images. Our experimental results show that the proposed approach achieves a Dice coefficient of 0.87 and outperforms existing tools (e.g., SegRoot) with Dice coefficient of 0.67 by application not only to NIR but also to other imaging modalities and plant species such as barley and arabidopsis soil-root images from LED-rhizotron and UV imaging systems, respectively. In summary, the developed software framework enables users to efficiently analyse soil-root images in an automated manner (i.e. without manual interaction with data and/or parameter tuning) providing quantitative plant scientists with a powerful analytical tool.",
author = "Narendra Narisetti and Michael Henke and Christiane Seiler and Astrid Junker and J{\"o}rn Ostermann and Thomas Altmann and Evgeny Gladilin",
note = "Funding Information: This work was performed within the German Plant-Phenotyping Network (DPPN) which is funded by the German Federal Ministry of Education and Research (BMBF) (project identification number: 031A053). M.H. was supported from European Regional Development Fund-Project “SINGING PLANT” (No. CZ.02.1.01/0.0/0.0/ 16_026/0008446). Open Access funding enabled and organized by Projekt DEAL. ",
year = "2021",
month = dec,
doi = "10.1038/s41598-021-95480-y",
language = "English",
volume = "11",
journal = "Scientific reports",
issn = "2045-2322",
publisher = "Nature Publishing Group",
number = "1",

}

RIS

TY - JOUR

T1 - Fully-automated root image analysis (faRIA)

AU - Narisetti, Narendra

AU - Henke, Michael

AU - Seiler, Christiane

AU - Junker, Astrid

AU - Ostermann, Jörn

AU - Altmann, Thomas

AU - Gladilin, Evgeny

N1 - Funding Information: This work was performed within the German Plant-Phenotyping Network (DPPN) which is funded by the German Federal Ministry of Education and Research (BMBF) (project identification number: 031A053). M.H. was supported from European Regional Development Fund-Project “SINGING PLANT” (No. CZ.02.1.01/0.0/0.0/ 16_026/0008446). Open Access funding enabled and organized by Projekt DEAL.

PY - 2021/12

Y1 - 2021/12

N2 - High-throughput root phenotyping in the soil became an indispensable quantitative tool for the assessment of effects of climatic factors and molecular perturbation on plant root morphology, development and function. To efficiently analyse a large amount of structurally complex soil-root images advanced methods for automated image segmentation are required. Due to often unavoidable overlap between the intensity of fore- and background regions simple thresholding methods are, generally, not suitable for the segmentation of root regions. Higher-level cognitive models such as convolutional neural networks (CNN) provide capabilities for segmenting roots from heterogeneous and noisy background structures, however, they require a representative set of manually segmented (ground truth) images. Here, we present a GUI-based tool for fully automated quantitative analysis of root images using a pre-trained CNN model, which relies on an extension of the U-Net architecture. The developed CNN framework was designed to efficiently segment root structures of different size, shape and optical contrast using low budget hardware systems. The CNN model was trained on a set of 6465 masks derived from 182 manually segmented near-infrared (NIR) maize root images. Our experimental results show that the proposed approach achieves a Dice coefficient of 0.87 and outperforms existing tools (e.g., SegRoot) with Dice coefficient of 0.67 by application not only to NIR but also to other imaging modalities and plant species such as barley and arabidopsis soil-root images from LED-rhizotron and UV imaging systems, respectively. In summary, the developed software framework enables users to efficiently analyse soil-root images in an automated manner (i.e. without manual interaction with data and/or parameter tuning) providing quantitative plant scientists with a powerful analytical tool.

AB - High-throughput root phenotyping in the soil became an indispensable quantitative tool for the assessment of effects of climatic factors and molecular perturbation on plant root morphology, development and function. To efficiently analyse a large amount of structurally complex soil-root images advanced methods for automated image segmentation are required. Due to often unavoidable overlap between the intensity of fore- and background regions simple thresholding methods are, generally, not suitable for the segmentation of root regions. Higher-level cognitive models such as convolutional neural networks (CNN) provide capabilities for segmenting roots from heterogeneous and noisy background structures, however, they require a representative set of manually segmented (ground truth) images. Here, we present a GUI-based tool for fully automated quantitative analysis of root images using a pre-trained CNN model, which relies on an extension of the U-Net architecture. The developed CNN framework was designed to efficiently segment root structures of different size, shape and optical contrast using low budget hardware systems. The CNN model was trained on a set of 6465 masks derived from 182 manually segmented near-infrared (NIR) maize root images. Our experimental results show that the proposed approach achieves a Dice coefficient of 0.87 and outperforms existing tools (e.g., SegRoot) with Dice coefficient of 0.67 by application not only to NIR but also to other imaging modalities and plant species such as barley and arabidopsis soil-root images from LED-rhizotron and UV imaging systems, respectively. In summary, the developed software framework enables users to efficiently analyse soil-root images in an automated manner (i.e. without manual interaction with data and/or parameter tuning) providing quantitative plant scientists with a powerful analytical tool.

UR - http://www.scopus.com/inward/record.url?scp=85112629309&partnerID=8YFLogxK

U2 - 10.1038/s41598-021-95480-y

DO - 10.1038/s41598-021-95480-y

M3 - Article

C2 - 34362967

AN - SCOPUS:85112629309

VL - 11

JO - Scientific reports

JF - Scientific reports

SN - 2045-2322

IS - 1

M1 - 16047

ER -
