Details
Original language | English |
---|---|
Pages (from - to) | 181-189 |
Number of pages | 9 |
Journal | ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences |
Volume | 5 |
Issue number | 3 |
Publication status | Published - 17 June 2021 |
Event | 24th ISPRS Congress on Imaging today, foreseeing tomorrow, Commission III - Nice, France. Duration: 5 July 2021 → 9 July 2021 |
Abstract
Fully convolutional neural networks (FCNs) are successfully used for pixel-wise land cover classification - the task of identifying the physical material of the Earth's surface for every pixel in an image. The acquisition of large training datasets is challenging, especially in remote sensing, but necessary for an FCN to perform well. One way to circumvent manual labelling is the use of existing databases, which usually contain a certain amount of label noise when combined with another data source. In the first part of this work, we investigate the impact of training data on an FCN. We experiment with different amounts of training data, varying with respect to the covered area, the available acquisition dates and the amount of label noise. We conclude that the more data is used for training, the better the generalization performance of the model, and that the FCN is able to mitigate the effect of label noise to a high degree. Another challenge is the imbalanced class distribution in most real-world datasets, which can cause the classifier to focus on the majority classes, leading to poor classification performance for minority classes. To tackle this problem, we use the cosine similarity loss to force feature vectors of the same class to be close to each other in feature space. Our experiments show that the cosine loss helps to obtain more similar feature vectors, but the similarity of the cluster centers also increases.
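As an illustration of the loss mentioned in the abstract, the core idea - penalizing feature vectors that point away from a reference direction for their class - can be sketched as follows. This is a minimal reading of the concept, not the paper's exact formulation: the per-class reference vectors (`centers`), the function name, and the averaging over samples are all assumptions here.

```python
import numpy as np

def cosine_similarity_loss(features, labels, centers):
    """Sketch of a cosine similarity loss.

    features : (N, D) array of feature vectors (one per sample/pixel)
    labels   : (N,) integer class labels
    centers  : (C, D) array of per-class reference vectors

    Returns the mean of (1 - cosine similarity) between each feature
    vector and the reference vector of its true class, so the loss is
    0 when features align perfectly with their class direction and
    grows as they point away from it.
    """
    # L2-normalize features and class centers so the dot product
    # equals the cosine of the angle between them.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    c = centers / np.linalg.norm(centers, axis=1, keepdims=True)
    # Per-sample cosine similarity to the true-class reference vector.
    cos_sim = np.sum(f * c[labels], axis=1)
    return float(np.mean(1.0 - cos_sim))
```

Because the loss depends only on the angle, not the magnitude, of the feature vectors, samples of a minority class contribute on the same scale as majority-class samples, which is the motivation the abstract gives for using it on imbalanced data.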
ASJC Scopus subject areas
- Physics and Astronomy (all)
- Instrumentation
- Environmental Science (all)
- Environmental Science (miscellaneous)
- Earth and Planetary Sciences (all)
- Earth and Planetary Sciences (miscellaneous)
Cite
In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 5, No. 3, 17.06.2021, pp. 181-189.
Publication: Contribution to journal › Conference article in journal › Research › Peer review
TY - JOUR
T1 - Investigations on feature similarity and the impact of training data for land cover classification
AU - Voelsen, M.
AU - Torres, D. Lobo
AU - Feitosa, R. Q.
AU - Rottensteiner, F.
AU - Heipke, C.
PY - 2021/6/17
Y1 - 2021/6/17
AB - Fully convolutional neural networks (FCNs) are successfully used for pixel-wise land cover classification - the task of identifying the physical material of the Earth's surface for every pixel in an image. The acquisition of large training datasets is challenging, especially in remote sensing, but necessary for an FCN to perform well. One way to circumvent manual labelling is the use of existing databases, which usually contain a certain amount of label noise when combined with another data source. In the first part of this work, we investigate the impact of training data on an FCN. We experiment with different amounts of training data, varying with respect to the covered area, the available acquisition dates and the amount of label noise. We conclude that the more data is used for training, the better the generalization performance of the model, and that the FCN is able to mitigate the effect of label noise to a high degree. Another challenge is the imbalanced class distribution in most real-world datasets, which can cause the classifier to focus on the majority classes, leading to poor classification performance for minority classes. To tackle this problem, we use the cosine similarity loss to force feature vectors of the same class to be close to each other in feature space. Our experiments show that the cosine loss helps to obtain more similar feature vectors, but the similarity of the cluster centers also increases.
KW - Cosine similarity loss
KW - FCN
KW - Land cover classification
KW - Remote sensing
UR - http://www.scopus.com/inward/record.url?scp=85113134867&partnerID=8YFLogxK
U2 - 10.5194/isprs-annals-V-3-2021-181-2021
DO - 10.5194/isprs-annals-V-3-2021-181-2021
M3 - Conference article
AN - SCOPUS:85113134867
VL - 5
SP - 181
EP - 189
JO - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
JF - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
SN - 2194-9042
IS - 3
T2 - 24th ISPRS Congress on Imaging today, foreseeing tomorrow, Commission III
Y2 - 5 July 2021 through 9 July 2021
ER -