Open set task augmentation facilitates generalization of deep neural networks trained on small data sets

Publication: Contribution to journal › Article › Research › Peer-reviewed

Authorship

External organisations

  • Hochschule Bielefeld (HSBI)
  • Miele & Cie. KG

Details

Original language: English
Pages (from - to): 6067-6083
Number of pages: 17
Journal: Neural Computing and Applications
Volume: 34
Issue number: 8
Early online date: 9 Dec 2021
Publication status: Published - Apr 2022
Published externally: Yes

Abstract

Many application scenarios for image recognition require learning of deep networks from small sample sizes in the order of a few hundred samples per class. Then, avoiding overfitting is critical. Common techniques to address overfitting are transfer learning, reduction of model complexity and artificial enrichment of the available data by, e.g., data augmentation. A key idea proposed in this paper is to incorporate additional samples into the training that do not belong to the classes of the target task. This can be accomplished by formulating the original classification task as an open set classification task. While the original closed set classification task is not altered at inference time, the recast as open set classification task enables the inclusion of additional data during training. Hence, the original closed set classification task is augmented with an open set task during training. We therefore call the proposed approach open set task augmentation. In order to integrate additional task-unrelated samples into the training, we employ the entropic open set loss originally proposed for open set classification tasks and also show that similar results can be obtained with a modified sum of squared errors loss function. Learning with the proposed approach benefits from the integration of additional “unknown” samples, which are often available, e.g., from open data sets, and can then be easily integrated into the learning process. We show that this open set task augmentation can improve model performance even when these additional samples are rather few or far from the domain of the target task. The proposed approach is demonstrated on two exemplary scenarios based on subsets of the ImageNet and Food-101 data sets as well as with several network architectures and two loss functions. We further shed light on the impact of the entropic open set loss on the internal representations formed by the networks. Open set task augmentation is particularly valuable when no additional data from the target classes are available—a scenario often faced in practice.
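
As a brief illustration of the approach described in the abstract: the entropic open set loss keeps the usual cross-entropy for samples from the target (known) classes and, for the additional task-unrelated ("unknown") samples, pushes the softmax output over the known classes towards the uniform distribution, i.e. it minimises -(1/C) * sum_c log S_c(x) for an unknown sample x and C known classes. The following PyTorch sketch shows one possible way to implement this training signal; the function name, the convention of marking unknown samples with the label -1, and all other details are illustrative assumptions, not the authors' reference implementation.

import torch
import torch.nn.functional as F

UNKNOWN_LABEL = -1  # illustrative convention: task-unrelated samples carry label -1

def entropic_open_set_loss(logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    # logits:  (batch, C) raw outputs over the C known target classes
    # targets: (batch,) indices in [0, C) for known samples, UNKNOWN_LABEL otherwise
    log_probs = F.log_softmax(logits, dim=1)
    known = targets != UNKNOWN_LABEL
    loss = logits.new_zeros(logits.size(0))
    if known.any():
        # known samples: standard cross-entropy (negative log-likelihood of the true class)
        loss[known] = F.nll_loss(log_probs[known], targets[known], reduction="none")
    if (~known).any():
        # unknown samples: -(1/C) * sum_c log p_c, driving the softmax towards uniform
        loss[~known] = -log_probs[~known].mean(dim=1)
    return loss.mean()

During training, a batch would then simply mix images from the target classes with the additional "unknown" images; at inference time the network is used as an ordinary closed set classifier, exactly as stated in the abstract.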


Cite

Open set task augmentation facilitates generalization of deep neural networks trained on small data sets. / Zai El Amri, Wadhah; Reinhart, Felix; Schenck, Wolfram.
In: Neural Computing and Applications, Vol. 34, No. 8, 04.2022, p. 6067-6083.

Publication: Contribution to journal › Article › Research › Peer-reviewed

Zai El Amri W, Reinhart F, Schenck W. Open set task augmentation facilitates generalization of deep neural networks trained on small data sets. Neural Computing and Applications. 2022 Apr;34(8):6067-6083. Epub 2021 Dec 9. doi: 10.1007/s00521-021-06753-6
BibTeX
@article{f95f80b87658439b8461be9667f13493,
title = "Open set task augmentation facilitates generalization of deep neural networks trained on small data sets",
abstract = "Many application scenarios for image recognition require learning of deep networks from small sample sizes in the order of a few hundred samples per class. Then, avoiding overfitting is critical. Common techniques to address overfitting are transfer learning, reduction of model complexity and artificial enrichment of the available data by, e.g., data augmentation. A key idea proposed in this paper is to incorporate additional samples into the training that do not belong to the classes of the target task. This can be accomplished by formulating the original classification task as an open set classification task. While the original closed set classification task is not altered at inference time, the recast as open set classification task enables the inclusion of additional data during training. Hence, the original closed set classification task is augmented with an open set task during training. We therefore call the proposed approach open set task augmentation. In order to integrate additional task-unrelated samples into the training, we employ the entropic open set loss originally proposed for open set classification tasks and also show that similar results can be obtained with a modified sum of squared errors loss function. Learning with the proposed approach benefits from the integration of additional “unknown” samples, which are often available, e.g., from open data sets, and can then be easily integrated into the learning process. We show that this open set task augmentation can improve model performance even when these additional samples are rather few or far from the domain of the target task. The proposed approach is demonstrated on two exemplary scenarios based on subsets of the ImageNet and Food-101 data sets as well as with several network architectures and two loss functions. We further shed light on the impact of the entropic open set loss on the internal representations formed by the networks. Open set task augmentation is particularly valuable when no additional data from the target classes are available—a scenario often faced in practice.",
keywords = "Convolutional neural networks, Image recognition, Open set classification, Transfer learning",
author = "{Zai El Amri}, Wadhah and Felix Reinhart and Wolfram Schenck",
note = "Publisher Copyright: {\textcopyright} 2021, The Author(s).",
year = "2022",
month = apr,
doi = "10.1007/s00521-021-06753-6",
language = "English",
volume = "34",
pages = "6067--6083",
journal = "Neural Computing and Applications",
issn = "0941-0643",
publisher = "Springer London",
number = "8",

}

RIS

TY - JOUR

T1 - Open set task augmentation facilitates generalization of deep neural networks trained on small data sets

AU - Zai El Amri, Wadhah

AU - Reinhart, Felix

AU - Schenck, Wolfram

N1 - Publisher Copyright: © 2021, The Author(s).

PY - 2022/4

Y1 - 2022/4

N2 - Many application scenarios for image recognition require learning of deep networks from small sample sizes in the order of a few hundred samples per class. Then, avoiding overfitting is critical. Common techniques to address overfitting are transfer learning, reduction of model complexity and artificial enrichment of the available data by, e.g., data augmentation. A key idea proposed in this paper is to incorporate additional samples into the training that do not belong to the classes of the target task. This can be accomplished by formulating the original classification task as an open set classification task. While the original closed set classification task is not altered at inference time, the recast as open set classification task enables the inclusion of additional data during training. Hence, the original closed set classification task is augmented with an open set task during training. We therefore call the proposed approach open set task augmentation. In order to integrate additional task-unrelated samples into the training, we employ the entropic open set loss originally proposed for open set classification tasks and also show that similar results can be obtained with a modified sum of squared errors loss function. Learning with the proposed approach benefits from the integration of additional “unknown” samples, which are often available, e.g., from open data sets, and can then be easily integrated into the learning process. We show that this open set task augmentation can improve model performance even when these additional samples are rather few or far from the domain of the target task. The proposed approach is demonstrated on two exemplary scenarios based on subsets of the ImageNet and Food-101 data sets as well as with several network architectures and two loss functions. We further shed light on the impact of the entropic open set loss on the internal representations formed by the networks. Open set task augmentation is particularly valuable when no additional data from the target classes are available—a scenario often faced in practice.

AB - Many application scenarios for image recognition require learning of deep networks from small sample sizes in the order of a few hundred samples per class. Then, avoiding overfitting is critical. Common techniques to address overfitting are transfer learning, reduction of model complexity and artificial enrichment of the available data by, e.g., data augmentation. A key idea proposed in this paper is to incorporate additional samples into the training that do not belong to the classes of the target task. This can be accomplished by formulating the original classification task as an open set classification task. While the original closed set classification task is not altered at inference time, the recast as open set classification task enables the inclusion of additional data during training. Hence, the original closed set classification task is augmented with an open set task during training. We therefore call the proposed approach open set task augmentation. In order to integrate additional task-unrelated samples into the training, we employ the entropic open set loss originally proposed for open set classification tasks and also show that similar results can be obtained with a modified sum of squared errors loss function. Learning with the proposed approach benefits from the integration of additional “unknown” samples, which are often available, e.g., from open data sets, and can then be easily integrated into the learning process. We show that this open set task augmentation can improve model performance even when these additional samples are rather few or far from the domain of the target task. The proposed approach is demonstrated on two exemplary scenarios based on subsets of the ImageNet and Food-101 data sets as well as with several network architectures and two loss functions. We further shed light on the impact of the entropic open set loss on the internal representations formed by the networks. Open set task augmentation is particularly valuable when no additional data from the target classes are available—a scenario often faced in practice.

KW - Convolutional neural networks

KW - Image recognition

KW - Open set classification

KW - Transfer learning

UR - http://www.scopus.com/inward/record.url?scp=85120918189&partnerID=8YFLogxK

U2 - 10.1007/s00521-021-06753-6

DO - 10.1007/s00521-021-06753-6

M3 - Article

VL - 34

SP - 6067

EP - 6083

JO - Neural Computing and Applications

JF - Neural Computing and Applications

SN - 0941-0643

IS - 8

ER -
