Fusion of time-series optical and SAR images using 3D convolutional neural networks for crop classification

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Maryam Teimouri
  • Mehdi Mokhtarzade
  • Nicolas Baghdadi
  • Christian Heipke

External Research Organisations

  • K.N. Toosi University of Technology
  • Université Montpellier

Details

Original language: English
Pages (from-to): 15143-15160
Number of pages: 18
Journal: Geocarto international
Volume: 37
Issue number: 27
Early online date: 14 Jul 2022
Publication status: Published - 19 Jul 2022

Abstract

Remote sensing is one of the most promising techniques for providing crop maps, thanks to the development of satellite images at various temporal and spatial resolutions. Three-dimensional (3D) convolutional neural networks (CNNs) have the potential to provide rich features that represent the spatial and temporal patterns of crops when applied to time series. This study presents a novel 3D-CNN framework for classifying crops that is based on the fusion of radar and optical time series and fully exploits 3D spatial-temporal information. To extract deep convolutional feature maps, the proposed technique uses one separate sequence of 3D convolutional layers for each time series dataset. To determine the label of each pixel, the extracted feature maps are passed to a concatenation layer and subsequently forwarded to sequential fully connected layers. The proposed approach not only takes advantage of CNNs, i.e. automatic feature extraction, but also discovers discriminative feature maps in both the spatial and temporal dimensions and preserves the growth dynamics of crop cycles. An overall accuracy of 91.3% and a kappa coefficient of 89.9% confirm the proposed method's potential. It is also shown that the suggested approach outperforms other methods.
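
The two-stream fusion idea described in the abstract (one 3D-CNN branch per time series, concatenation of the branch features, then fully connected layers for per-pixel labelling) can be sketched roughly as below. This is a minimal illustration in PyTorch, not the authors' published configuration: the layer counts, kernel sizes, channel widths, patch size, number of dates and number of classes are assumptions chosen for demonstration only.

# Hypothetical sketch of a two-stream 3D-CNN fusion classifier.
# All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

class Stream3D(nn.Module):
    """One 3D-CNN branch for a single time series (optical or SAR)."""
    def __init__(self, in_channels: int):
        super().__init__()
        self.features = nn.Sequential(
            # 3D convolutions act jointly on the spatial and temporal axes
            nn.Conv3d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),  # collapse time and space into one descriptor
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, bands, time_steps, height, width)
        return self.features(x).flatten(1)  # (batch, 64)

class FusionCrop3DCNN(nn.Module):
    """Concatenate the two branch descriptors and classify the centre pixel."""
    def __init__(self, optical_bands: int, sar_bands: int, n_classes: int):
        super().__init__()
        self.optical = Stream3D(optical_bands)
        self.sar = Stream3D(sar_bands)
        self.classifier = nn.Sequential(
            nn.Linear(128, 64),
            nn.ReLU(inplace=True),
            nn.Linear(64, n_classes),
        )

    def forward(self, opt: torch.Tensor, sar: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.optical(opt), self.sar(sar)], dim=1)
        return self.classifier(fused)

# Example: 10-band optical and 2-band SAR patches, 12 acquisition dates, 9x9 pixels.
model = FusionCrop3DCNN(optical_bands=10, sar_bands=2, n_classes=8)
logits = model(torch.randn(4, 10, 12, 9, 9), torch.randn(4, 2, 12, 9, 9))
print(logits.shape)  # torch.Size([4, 8])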

Keywords

    3D-CNN, Crop classification, fusion, time-series optical images, time-series radar images

Cite this

Fusion of time-series optical and SAR images using 3D convolutional neural networks for crop classification. / Teimouri, Maryam; Mokhtarzade, Mehdi; Baghdadi, Nicolas et al.
In: Geocarto international, Vol. 37, No. 27, 19.07.2022, p. 15143-15160.

Research output: Contribution to journal › Article › Research › peer review

Teimouri M, Mokhtarzade M, Baghdadi N, Heipke C. Fusion of time-series optical and SAR images using 3D convolutional neural networks for crop classification. Geocarto international. 2022 Jul 19;37(27):15143-15160. Epub 2022 Jul 14. doi: 10.1080/10106049.2022.2095446
Teimouri, Maryam ; Mokhtarzade, Mehdi ; Baghdadi, Nicolas et al. / Fusion of time-series optical and SAR images using 3D convolutional neural networks for crop classification. In: Geocarto international. 2022 ; Vol. 37, No. 27. pp. 15143-15160.
BibTeX
@article{cd4d42fce6e74c6ea0b93c7caeebdd28,
title = "Fusion of time-series optical and SAR images using 3D convolutional neural networks for crop classification",
abstract = "Remote sensing is one of the most promising techniques for providing crop maps, thanks to the development of satellite images at various temporal and spatial resolutions. Three-dimensional (3D) convolutional neural networks (CNNs) have the potential to provide rich features that represent the spatial and temporal patterns of crops when applied to time series. This study presents a novel 3D-CNN framework for classifying crops that is based on the fusion of radar and optical time series and fully exploits 3D spatial-temporal information. To extract deep convolutional feature maps, the proposed technique uses one separate sequence of 3D convolutional layers for each time series dataset. To determine the label of each pixel, the extracted feature maps are passed to a concatenation layer and subsequently forwarded to sequential fully connected layers. The proposed approach not only takes advantage of CNNs, i.e. automatic feature extraction, but also discovers discriminative feature maps in both the spatial and temporal dimensions and preserves the growth dynamics of crop cycles. An overall accuracy of 91.3% and a kappa coefficient of 89.9% confirm the proposed method's potential. It is also shown that the suggested approach outperforms other methods.",
keywords = "3D-CNN, Crop classification, fusion, time-series optical images, time-series radar images",
author = "Maryam Teimouri and Mehdi Mokhtarzade and Nicolas Baghdadi and Christian Heipke",
note = "Funding Information: The authors would like to acknowledge ESA for providing the S1 and S2 data, as well as the Department of Agriculture, Livestock, Fishing and Food of the Generalitat of Catalonia for providing the field data.",
year = "2022",
month = jul,
day = "19",
doi = "10.1080/10106049.2022.2095446",
language = "English",
volume = "37",
pages = "15143--15160",
journal = "Geocarto international",
issn = "1010-6049",
publisher = "Taylor and Francis Ltd.",
number = "27",

}

RIS

TY - JOUR

T1 - Fusion of time-series optical and SAR images using 3D convolutional neural networks for crop classification

AU - Teimouri, Maryam

AU - Mokhtarzade, Mehdi

AU - Baghdadi, Nicolas

AU - Heipke, Christian

N1 - Funding Information: The authors would like to acknowledge ESA for providing the S1 and S2 data, as well as the Department of Agriculture, Livestock, Fishing and Food of the Generalitat of Catalonia for providing the field data.

PY - 2022/7/19

Y1 - 2022/7/19

N2 - Remote sensing is one of the most promising techniques for providing crop maps, thanks to the development of satellite images at various temporal and spatial resolutions. Three-dimensional (3D) convolutional neural networks (CNNs) have the potential to provide rich features that represent the spatial and temporal patterns of crops when applied to time series. This study presents a novel 3D-CNN framework for classifying crops that is based on the fusion of radar and optical time series and fully exploits 3D spatial-temporal information. To extract deep convolutional feature maps, the proposed technique uses one separate sequence of 3D convolutional layers for each time series dataset. To determine the label of each pixel, the extracted feature maps are passed to a concatenation layer and subsequently forwarded to sequential fully connected layers. The proposed approach not only takes advantage of CNNs, i.e. automatic feature extraction, but also discovers discriminative feature maps in both the spatial and temporal dimensions and preserves the growth dynamics of crop cycles. An overall accuracy of 91.3% and a kappa coefficient of 89.9% confirm the proposed method's potential. It is also shown that the suggested approach outperforms other methods.

AB - Remote sensing is one of the most promising techniques for providing crop maps, thanks to the development of satellite images at various temporal and spatial resolutions. Three-dimensional (3D) convolutional neural networks (CNNs) have the potential to provide rich features that represent the spatial and temporal patterns of crops when applied to time series. This study presents a novel 3D-CNN framework for classifying crops that is based on the fusion of radar and optical time series and fully exploits 3D spatial-temporal information. To extract deep convolutional feature maps, the proposed technique uses one separate sequence of 3D convolutional layers for each time series dataset. To determine the label of each pixel, the extracted feature maps are passed to a concatenation layer and subsequently forwarded to sequential fully connected layers. The proposed approach not only takes advantage of CNNs, i.e. automatic feature extraction, but also discovers discriminative feature maps in both the spatial and temporal dimensions and preserves the growth dynamics of crop cycles. An overall accuracy of 91.3% and a kappa coefficient of 89.9% confirm the proposed method's potential. It is also shown that the suggested approach outperforms other methods.

KW - 3D-CNN

KW - Crop classification

KW - fusion

KW - time-series optical images

KW - time-series radar images

UR - http://www.scopus.com/inward/record.url?scp=85136349763&partnerID=8YFLogxK

U2 - 10.1080/10106049.2022.2095446

DO - 10.1080/10106049.2022.2095446

M3 - Article

AN - SCOPUS:85136349763

VL - 37

SP - 15143

EP - 15160

JO - Geocarto international

JF - Geocarto international

SN - 1010-6049

IS - 27

ER -