Details
Original language | English |
---|---|
Pages (from - to) | 15143-15160 |
Number of pages | 18 |
Journal | Geocarto international |
Volume | 37 |
Issue number | 27 |
Early online date | 14 July 2022 |
Publication status | Published - 19 July 2022 |
Abstract
Remote sensing is one of the most promising techniques for producing crop maps, thanks to the availability of satellite images at various temporal and spatial resolutions. Three-dimensional (3D) convolutional neural networks (CNNs) have the potential to provide rich features that represent the spatial and temporal patterns of crops when applied to time series. This study presents a novel 3D-CNN framework for crop classification that is based on the fusion of radar and optical time series and fully exploits 3D spatio-temporal information. To extract deep convolutional feature maps, the proposed technique uses a separate stream for each time-series dataset. To determine the label of each pixel, the extracted feature maps are passed to a concatenation layer and subsequently to sequential fully connected layers. The proposed approach not only takes advantage of CNNs, i.e. automatic feature extraction, but also discovers discriminative feature maps in both the spatial and temporal dimensions and preserves the growth dynamics of crop cycles. An overall accuracy of 91.3% and a kappa coefficient of 89.9% confirm the proposed method's potential. It is also shown that the suggested approach outperforms other methods.
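The architecture described in the abstract - one 3D-CNN stream per sensor, a concatenation layer, then fully connected layers - can be sketched as follows. This is a minimal illustrative sketch only, not the authors' implementation: the layer sizes, band counts, class count, and the `TwoStream3DCNN` name are all assumptions.

```python
import torch
import torch.nn as nn

class TwoStream3DCNN(nn.Module):
    """Sketch of a two-stream 3D-CNN: separate SAR and optical streams,
    fused by concatenation and classified by fully connected layers."""

    def __init__(self, n_classes=10, sar_bands=2, opt_bands=4):
        super().__init__()

        def stream(in_ch):
            # One 3D convolution over (time, height, width), then global pooling.
            return nn.Sequential(
                nn.Conv3d(in_ch, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.AdaptiveAvgPool3d(1),
                nn.Flatten(),
            )

        self.sar = stream(sar_bands)   # radar time-series stream
        self.opt = stream(opt_bands)   # optical time-series stream
        self.head = nn.Sequential(     # fully connected layers after fusion
            nn.Linear(32, 64),
            nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, sar_x, opt_x):
        # Inputs have shape (batch, bands, time, height, width).
        fused = torch.cat([self.sar(sar_x), self.opt(opt_x)], dim=1)
        return self.head(fused)

model = TwoStream3DCNN()
sar = torch.randn(1, 2, 8, 9, 9)   # e.g. 8 acquisition dates, 9x9 pixel patch
opt = torch.randn(1, 4, 8, 9, 9)
logits = model(sar, opt)           # shape (1, 10): one score per crop class
```

The 3D kernels convolve jointly over the temporal and spatial axes, which is what lets the network learn spatio-temporal crop-growth patterns rather than treating each date independently.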
ASJC Scopus subject areas
- Social Sciences (all)
- Geography, Planning and Development
- Environmental Science (all)
- Water Science and Technology
Cite
In: Geocarto international, Vol. 37, No. 27, 19.07.2022, pp. 15143-15160.
Publication: Contribution to journal › Article › Research › Peer review
TY - JOUR
T1 - Fusion of time-series optical and SAR images using 3D convolutional neural networks for crop classification
AU - Teimouri, Maryam
AU - Mokhtarzade, Mehdi
AU - Baghdadi, Nicolas
AU - Heipke, Christian
N1 - Funding Information: The authors would like to acknowledge ESA for providing the S1 and S2 data, as well as, the Department of Agriculture, Livestock, Fishing and Food of the Generalitat of Catalonia for providing the field data.
PY - 2022/7/19
Y1 - 2022/7/19
N2 - Remote sensing is one of the most promising techniques for producing crop maps, thanks to the availability of satellite images at various temporal and spatial resolutions. Three-dimensional (3D) convolutional neural networks (CNNs) have the potential to provide rich features that represent the spatial and temporal patterns of crops when applied to time series. This study presents a novel 3D-CNN framework for crop classification that is based on the fusion of radar and optical time series and fully exploits 3D spatio-temporal information. To extract deep convolutional feature maps, the proposed technique uses a separate stream for each time-series dataset. To determine the label of each pixel, the extracted feature maps are passed to a concatenation layer and subsequently to sequential fully connected layers. The proposed approach not only takes advantage of CNNs, i.e. automatic feature extraction, but also discovers discriminative feature maps in both the spatial and temporal dimensions and preserves the growth dynamics of crop cycles. An overall accuracy of 91.3% and a kappa coefficient of 89.9% confirm the proposed method's potential. It is also shown that the suggested approach outperforms other methods.
AB - Remote sensing is one of the most promising techniques for producing crop maps, thanks to the availability of satellite images at various temporal and spatial resolutions. Three-dimensional (3D) convolutional neural networks (CNNs) have the potential to provide rich features that represent the spatial and temporal patterns of crops when applied to time series. This study presents a novel 3D-CNN framework for crop classification that is based on the fusion of radar and optical time series and fully exploits 3D spatio-temporal information. To extract deep convolutional feature maps, the proposed technique uses a separate stream for each time-series dataset. To determine the label of each pixel, the extracted feature maps are passed to a concatenation layer and subsequently to sequential fully connected layers. The proposed approach not only takes advantage of CNNs, i.e. automatic feature extraction, but also discovers discriminative feature maps in both the spatial and temporal dimensions and preserves the growth dynamics of crop cycles. An overall accuracy of 91.3% and a kappa coefficient of 89.9% confirm the proposed method's potential. It is also shown that the suggested approach outperforms other methods.
KW - 3D-CNN
KW - Crop classification
KW - fusion
KW - time-series optical images
KW - time-series radar images
UR - http://www.scopus.com/inward/record.url?scp=85136349763&partnerID=8YFLogxK
U2 - 10.1080/10106049.2022.2095446
DO - 10.1080/10106049.2022.2095446
M3 - Article
AN - SCOPUS:85136349763
VL - 37
SP - 15143
EP - 15160
JO - Geocarto international
JF - Geocarto international
SN - 1010-6049
IS - 27
ER -