Details
| Original language | English |
| --- | --- |
| Pages (from-to) | 249-261 |
| Number of pages | 13 |
| Journal | Photogrammetric Engineering and Remote Sensing |
| Volume | 84 |
| Issue number | 5 |
| Publication status | Published - May 2018 |
Abstract
The creation of training sets for supervised machine learning often incurs unsustainable manual costs. Transfer learning (TL) techniques have been proposed as a way to solve this issue by adapting training data from different, but related (source) datasets to the test (target) dataset. A key problem in TL is how to quantify the relatedness of a source quickly and robustly. In this work, we present a fast domain similarity measure that captures the relatedness between datasets purely based on unlabeled data. Our method transfers knowledge from multiple sources by generating a weighted combination of domains. We show for multiple datasets that learning on such sources achieves an average overall accuracy within 2.5 percent of that of a classifier trained on the target domain for semantic segmentation tasks. We further apply our method to the task of choosing informative patches from unlabeled datasets. Labeling only these patches reduces manual work by up to 85 percent.
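The core idea of weighting source domains by an unsupervised similarity to the target can be illustrated with a minimal sketch. The similarity measure below (histogram intersection of raw feature values) is a stand-in assumption for exposition only and is not the measure proposed in the paper; likewise, the function names are hypothetical.

```python
import numpy as np

def domain_similarity(source_feats, target_feats, bins=32):
    """Crude unsupervised domain similarity: histogram intersection of
    feature distributions, computed from unlabeled data only.
    Returns a value in [0, 1]; 1 means identical empirical distributions.
    (Illustrative stand-in, not the paper's measure.)"""
    lo = min(source_feats.min(), target_feats.min())
    hi = max(source_feats.max(), target_feats.max())
    hs, _ = np.histogram(source_feats, bins=bins, range=(lo, hi), density=True)
    ht, _ = np.histogram(target_feats, bins=bins, range=(lo, hi), density=True)
    bin_width = (hi - lo) / bins
    # Sum of bin-wise minima of the two normalized histograms.
    return float(np.minimum(hs, ht).sum() * bin_width)

def source_weights(sources, target):
    """Turn per-source similarities into a convex combination of sources:
    more related sources contribute more training data."""
    sims = np.array([domain_similarity(s, target) for s in sources])
    return sims / sims.sum()
```

A source whose feature distribution closely matches the target receives a larger weight, so a classifier trained on the weighted combination leans on the most related data.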
ASJC Scopus subject areas
- Earth and Planetary Sciences (all)
- Computers in Earth Sciences
Cite this
In: Photogrammetric Engineering and Remote Sensing, Vol. 84, No. 5, 05.2018, p. 249-261.
Research output: Contribution to journal › Article › Research › peer review
TY - JOUR
T1 - Unsupervised source selection for domain adaptation
AU - Vogt, Karsten
AU - Paul, Andreas
AU - Ostermann, Jörn
AU - Rottensteiner, Franz
AU - Heipke, Christian
N1 - © 2018 American Society for Photogrammetry and Remote Sensing
PY - 2018/5
Y1 - 2018/5
N2 - The creation of training sets for supervised machine learning often incurs unsustainable manual costs. Transfer learning (TL) techniques have been proposed as a way to solve this issue by adapting training data from different, but related (source) datasets to the test (target) dataset. A problem in TL is how to quantify the relatedness of a source quickly and robustly. In this work, we present a fast domain similarity measure that captures the relatedness between datasets purely based on unlabeled data. Our method transfers knowledge from multiple sources by generating a weighted combination of domains. We show for multiple datasets that learning on such sources achieves an average overall accuracy closer than 2.5 percent to the results of the target classifier for semantic segmentation tasks. We further apply our method to the task of choosing informative patches from unlabeled datasets. Only labeling these patches enables a reduction in manual work of up to 85 percent.
AB - The creation of training sets for supervised machine learning often incurs unsustainable manual costs. Transfer learning (TL) techniques have been proposed as a way to solve this issue by adapting training data from different, but related (source) datasets to the test (target) dataset. A problem in TL is how to quantify the relatedness of a source quickly and robustly. In this work, we present a fast domain similarity measure that captures the relatedness between datasets purely based on unlabeled data. Our method transfers knowledge from multiple sources by generating a weighted combination of domains. We show for multiple datasets that learning on such sources achieves an average overall accuracy closer than 2.5 percent to the results of the target classifier for semantic segmentation tasks. We further apply our method to the task of choosing informative patches from unlabeled datasets. Only labeling these patches enables a reduction in manual work of up to 85 percent.
UR - http://www.scopus.com/inward/record.url?scp=85047387296&partnerID=8YFLogxK
U2 - 10.14358/PERS.84.5.249
DO - 10.14358/PERS.84.5.249
M3 - Article
AN - SCOPUS:85047387296
VL - 84
SP - 249
EP - 261
JO - Photogrammetric Engineering and Remote Sensing
JF - Photogrammetric Engineering and Remote Sensing
SN - 0099-1112
IS - 5
ER -