Automated image registration of RGB, hyperspectral and chlorophyll fluorescence imaging data

Publication: Contribution to journal › Article › Research › Peer-reviewed


Details

Original language: English
Article number: 175
Journal: Plant Methods
Volume: 20
Issue number: 1
Publication status: Published - 17 Nov 2024

Abstract

Background: The early and specific detection of abiotic and biotic stresses, particularly their combinations, is a major challenge for maintaining and increasing plant productivity in sustainable agriculture under changing environmental conditions. Optical imaging techniques enable cost-efficient and non-destructive quantification of plant stress states. Monomodal detection of certain stressors is usually based on non-specific/indirect features and is therefore commonly limited in its cross-specificity to other stressors. The fusion of multi-domain sensor systems can provide more potentially discriminative features for machine learning models and synergistic information to increase cross-specificity in plant disease detection when image data are fused at the pixel level. Results: In this study, we demonstrate successful multi-modal image registration of RGB, hyperspectral (HSI) and chlorophyll fluorescence (ChlF) kinetics data at the pixel level for high-throughput phenotyping of A. thaliana grown in multi-well plates and an assay with detached leaf discs of Rosa × hybrida inoculated with the black spot disease-inducing fungus Diplocarpon rosae. Here, we showcase the effects of (i) reference image selection, (ii) different registration methods and (iii) frame selection on the performance of image registration via affine transform. In addition, we developed a combined approach that selects the registration method per file based on normalized cross-correlation (NCC), resulting in a robust and accurate approach at the cost of additional computational time. Since the image data encompass multiple objects, the initial coarse image registration using a global transformation matrix exhibited heterogeneity across different image regions. By employing an additional fine registration on the object-separated image data, we achieved a high overlap ratio. Specifically, for the A. thaliana test set, the overlap ratios (OR Convex) were 98.0 ± 2.3% for RGB-to-ChlF and 96.6 ± 4.2% for HSI-to-ChlF. For the Rosa × hybrida test set, the values were 98.9 ± 0.5% for RGB-to-ChlF and 98.3 ± 1.3% for HSI-to-ChlF. Conclusion: The presented multi-modal imaging pipeline enables high-throughput, high-dimensional phenotyping of different plant species with respect to various biotic or abiotic stressors. This paves the way for in-depth studies investigating the correlative relationships of the multi-domain data or the performance enhancement of machine learning models via multi-modal image fusion.
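
The abstract outlines the core idea of the pipeline: for each file, candidate affine registrations are computed and the best one is chosen via NCC, first globally (coarse) and then per segmented object (fine). As an illustration of that idea only, the following Python/OpenCV sketch selects between two candidate affine registrations, an intensity-based ECC estimate and a feature-based ORB estimate, scored by normalized cross-correlation against a ChlF reference frame. This is not the authors' released code; the function names, the choice of ECC and ORB as the two candidates, and the OpenCV (>= 4.1) usage are assumptions made for illustration.

```python
# Illustrative sketch only (hypothetical helper names, not the authors' pipeline).
# Assumes OpenCV >= 4.1 and single-channel uint8 images of equal size, e.g. a
# grayscale RGB image or one HSI band as `moving` and a ChlF frame as `reference`.
import cv2
import numpy as np


def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two equally sized images (higher = better)."""
    a = (a.astype(np.float32) - a.mean()) / (a.std() + 1e-8)
    b = (b.astype(np.float32) - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())


def affine_ecc(reference: np.ndarray, moving: np.ndarray) -> np.ndarray:
    """Intensity-based candidate: 2x3 affine warp (moving -> reference) via ECC maximization."""
    warp = np.eye(2, 3, dtype=np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 500, 1e-6)
    # findTransformECC returns a warp that maps reference coordinates into the moving
    # image; invert it so both candidate functions use the same (forward) convention.
    _, warp = cv2.findTransformECC(reference, moving, warp, cv2.MOTION_AFFINE,
                                   criteria, None, 5)
    return cv2.invertAffineTransform(warp)


def affine_orb(reference: np.ndarray, moving: np.ndarray) -> np.ndarray:
    """Feature-based candidate: affine warp estimated from ORB keypoint matches + RANSAC."""
    orb = cv2.ORB_create(2000)
    kp_r, des_r = orb.detectAndCompute(reference, None)
    kp_m, des_m = orb.detectAndCompute(moving, None)
    if des_r is None or des_m is None:
        raise cv2.error("no ORB features detected")
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_m, des_r)
    matches = sorted(matches, key=lambda m: m.distance)[:200]
    src = np.float32([kp_m[m.queryIdx].pt for m in matches])
    dst = np.float32([kp_r[m.trainIdx].pt for m in matches])
    warp, _ = cv2.estimateAffine2D(src, dst)  # RANSAC by default
    if warp is None:
        raise cv2.error("affine estimation failed")
    return warp.astype(np.float32)


def register_best(reference: np.ndarray, moving: np.ndarray):
    """Try each candidate method and keep the warped result with the highest NCC."""
    h, w = reference.shape
    candidates = []
    for method in (affine_ecc, affine_orb):
        try:
            warp = method(reference, moving)
            warped = cv2.warpAffine(moving, warp, (w, h), flags=cv2.INTER_LINEAR)
            candidates.append((ncc(reference, warped), warp, warped))
        except cv2.error:
            continue  # a method may fail to converge on individual files
    if not candidates:
        raise RuntimeError("no registration candidate succeeded")
    return max(candidates, key=lambda c: c[0])  # (score, warp, registered image)
```

In the two-stage scheme described in the abstract, such a selection would be run once on the full image to obtain the coarse, global transform and then again on each object-separated crop (well or leaf disc) for the fine alignment, after which metrics such as the convex-hull overlap ratio (OR Convex) quantify how well the registered plant masks coincide.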


Cite

Automated image registration of RGB, hyperspectral and chlorophyll fluorescence imaging data. / Bethge, Hans Lukas; Weisheit, Inga; Dortmund, Mauritz Sandro et al.
In: Plant Methods, Vol. 20, No. 1, 175, 17.11.2024.


Bethge HL, Weisheit I, Dortmund MS, Landes T, Zabic M, Linde M et al. Automated image registration of RGB, hyperspectral and chlorophyll fluorescence imaging data. Plant Methods. 2024 Nov 17;20(1):175. doi: 10.1186/s13007-024-01296-y
Bethge, Hans Lukas; Weisheit, Inga; Dortmund, Mauritz Sandro et al. / Automated image registration of RGB, hyperspectral and chlorophyll fluorescence imaging data. In: Plant Methods. 2024; Vol. 20, No. 1.
BibTeX
@article{b1ad58dae5f14b988e3a01f2c3d47021,
title = "Automated image registration of RGB, hyperspectral and chlorophyll fluorescence imaging data",
abstract = "Background: The early and specific detection of abiotic and biotic stresses, particularly their combinations, is a major challenge for maintaining and increasing plant productivity in sustainable agriculture under changing environmental conditions. Optical imaging techniques enable cost-efficient and non-destructive quantification of plant stress states. Monomodal detection of certain stressors is usually based on non-specific/indirect features and therefore is commonly limited in their cross-specificity to other stressors. The fusion of multi-domain sensor systems can provide more potentially discriminative features for machine learning models and potentially provide synergistic information to increase cross-specificity in plant disease detection when image data are fused at the pixel level. Results: In this study, we demonstrate successful multi-modal image registration of RGB, hyperspectral (HSI) and chlorophyll fluorescence (ChlF) kinetics data at the pixel level for high-throughput phenotyping of A. thaliana grown in Multi-well plates and an assay with detached leaf discs of Rosa × hybrida inoculated with the black spot disease-inducing fungus Diplocarpon rosae. Here, we showcase the effects of (i) selection of reference image selection, (ii) different registrations methods and (iii) frame selection on the performance of image registration via affine transform. In addition, we developed a combined approach for registration methods through NCC-based selection for each file, resulting in a robust and accurate approach that sacrifices computational time. Since image data encompass multiple objects, the initial coarse image registration using a global transformation matrix exhibited heterogeneity across different image regions. By employing an additional fine registration on the object-separated image data, we achieved a high overlap ratio. Specifically, for the A. thaliana test set, the overlap ratios (OR Convex) were 98.0 ± 2.3% for RGB-to-ChlF and 96.6 ± 4.2% for HSI-to-ChlF. For the Rosa × hybrida test set, the values were 98.9 ± 0.5% for RGB-to-ChlF and 98.3 ± 1.3% for HSI-to-ChlF. Conclusion: The presented multi-modal imaging pipeline enables high-throughput, high-dimensional phenotyping of different plant species with respect to various biotic or abiotic stressors. This paves the way for in-depth studies investigating the correlative relationships of the multi-domain data or the performance enhancement of machine learning models via multi modal image fusion.",
keywords = "Affine transform, Chlorophyll fluorescence, High-throughput phenotyping, Hyperspectral imaging, Multi-modal image registration, RGB imaging, Sensor fusion",
author = "Bethge, {Hans Lukas} and Inga Weisheit and Dortmund, {Mauritz Sandro} and Timm Landes and Miroslav Zabic and Marcus Linde and Thomas Debener and Dag Heinemann",
note = "Publisher Copyright: {\textcopyright} The Author(s) 2024.",
year = "2024",
month = nov,
day = "17",
doi = "10.1186/s13007-024-01296-y",
language = "English",
volume = "20",
journal = "Plant Methods",
issn = "1746-4811",
publisher = "BioMed Central Ltd.",
number = "1",

}

RIS

TY - JOUR

T1 - Automated image registration of RGB, hyperspectral and chlorophyll fluorescence imaging data

AU - Bethge, Hans Lukas

AU - Weisheit, Inga

AU - Dortmund, Mauritz Sandro

AU - Landes, Timm

AU - Zabic, Miroslav

AU - Linde, Marcus

AU - Debener, Thomas

AU - Heinemann, Dag

N1 - Publisher Copyright: © The Author(s) 2024.

PY - 2024/11/17

Y1 - 2024/11/17

N2 - Background: The early and specific detection of abiotic and biotic stresses, particularly their combinations, is a major challenge for maintaining and increasing plant productivity in sustainable agriculture under changing environmental conditions. Optical imaging techniques enable cost-efficient and non-destructive quantification of plant stress states. Monomodal detection of certain stressors is usually based on non-specific/indirect features and is therefore commonly limited in its cross-specificity to other stressors. The fusion of multi-domain sensor systems can provide more potentially discriminative features for machine learning models and synergistic information to increase cross-specificity in plant disease detection when image data are fused at the pixel level. Results: In this study, we demonstrate successful multi-modal image registration of RGB, hyperspectral (HSI) and chlorophyll fluorescence (ChlF) kinetics data at the pixel level for high-throughput phenotyping of A. thaliana grown in multi-well plates and an assay with detached leaf discs of Rosa × hybrida inoculated with the black spot disease-inducing fungus Diplocarpon rosae. Here, we showcase the effects of (i) reference image selection, (ii) different registration methods and (iii) frame selection on the performance of image registration via affine transform. In addition, we developed a combined approach that selects the registration method per file based on normalized cross-correlation (NCC), resulting in a robust and accurate approach at the cost of additional computational time. Since the image data encompass multiple objects, the initial coarse image registration using a global transformation matrix exhibited heterogeneity across different image regions. By employing an additional fine registration on the object-separated image data, we achieved a high overlap ratio. Specifically, for the A. thaliana test set, the overlap ratios (OR Convex) were 98.0 ± 2.3% for RGB-to-ChlF and 96.6 ± 4.2% for HSI-to-ChlF. For the Rosa × hybrida test set, the values were 98.9 ± 0.5% for RGB-to-ChlF and 98.3 ± 1.3% for HSI-to-ChlF. Conclusion: The presented multi-modal imaging pipeline enables high-throughput, high-dimensional phenotyping of different plant species with respect to various biotic or abiotic stressors. This paves the way for in-depth studies investigating the correlative relationships of the multi-domain data or the performance enhancement of machine learning models via multi-modal image fusion.

AB - Background: The early and specific detection of abiotic and biotic stresses, particularly their combinations, is a major challenge for maintaining and increasing plant productivity in sustainable agriculture under changing environmental conditions. Optical imaging techniques enable cost-efficient and non-destructive quantification of plant stress states. Monomodal detection of certain stressors is usually based on non-specific/indirect features and is therefore commonly limited in its cross-specificity to other stressors. The fusion of multi-domain sensor systems can provide more potentially discriminative features for machine learning models and synergistic information to increase cross-specificity in plant disease detection when image data are fused at the pixel level. Results: In this study, we demonstrate successful multi-modal image registration of RGB, hyperspectral (HSI) and chlorophyll fluorescence (ChlF) kinetics data at the pixel level for high-throughput phenotyping of A. thaliana grown in multi-well plates and an assay with detached leaf discs of Rosa × hybrida inoculated with the black spot disease-inducing fungus Diplocarpon rosae. Here, we showcase the effects of (i) reference image selection, (ii) different registration methods and (iii) frame selection on the performance of image registration via affine transform. In addition, we developed a combined approach that selects the registration method per file based on normalized cross-correlation (NCC), resulting in a robust and accurate approach at the cost of additional computational time. Since the image data encompass multiple objects, the initial coarse image registration using a global transformation matrix exhibited heterogeneity across different image regions. By employing an additional fine registration on the object-separated image data, we achieved a high overlap ratio. Specifically, for the A. thaliana test set, the overlap ratios (OR Convex) were 98.0 ± 2.3% for RGB-to-ChlF and 96.6 ± 4.2% for HSI-to-ChlF. For the Rosa × hybrida test set, the values were 98.9 ± 0.5% for RGB-to-ChlF and 98.3 ± 1.3% for HSI-to-ChlF. Conclusion: The presented multi-modal imaging pipeline enables high-throughput, high-dimensional phenotyping of different plant species with respect to various biotic or abiotic stressors. This paves the way for in-depth studies investigating the correlative relationships of the multi-domain data or the performance enhancement of machine learning models via multi-modal image fusion.

KW - Affine transform

KW - Chlorophyll fluorescence

KW - High-throughput phenotyping

KW - Hyperspectral imaging

KW - Multi-modal image registration

KW - RGB imaging

KW - Sensor fusion

UR - http://www.scopus.com/inward/record.url?scp=85209544973&partnerID=8YFLogxK

U2 - 10.1186/s13007-024-01296-y

DO - 10.1186/s13007-024-01296-y

M3 - Article

VL - 20

JO - Plant Methods

JF - Plant Methods

SN - 1746-4811

IS - 1

M1 - 175

ER -
