Reduction stumps for multi-class classification

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Felix Mohr
  • Marcel Wever
  • Eyke Hüllermeier
External Research Organisations

  • Paderborn University
  • Heinz Nixdorf Institute

Details

Original language: English
Title of host publication: Advances in Intelligent Data Analysis XVII - 17th International Symposium, IDA 2018, Proceedings
Editors: Arno Siebes, Wouter Duivesteijn, Antti Ukkonen
Publisher: Springer Verlag
Pages: 225-237
Number of pages: 13
ISBN (print): 9783030017675
Publication status: Published - 2018
Externally published: Yes
Event: 17th International Symposium on Intelligent Data Analysis, IDA 2018 - ‘s-Hertogenbosch, Netherlands
Duration: 24 Oct 2018 – 26 Oct 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11191 LNCS
ISSN (print): 0302-9743
ISSN (electronic): 1611-3349

Abstract

Multi-class classification problems are often solved via reduction, i.e., by breaking the original problem into a set of presumably simpler subproblems (and aggregating the solutions of these problems later on). Typical examples of this approach include decomposition schemes such as one-vs-rest, all-pairs, and nested dichotomies. While all these techniques produce reductions to purely binary subproblems, which is reasonable when only binary classifiers ought to be used, we argue that reductions to other multi-class problems can be interesting, too. In this paper, we examine a new type of (meta-)classifier called reduction stump. A reduction stump creates a binary split among the given classes, thereby creating two subproblems, each of which is solved by a multi-class classifier in turn. On top, the two groups of classes are separated by a binary (or multi-class) classifier. In addition to simple reduction stumps, we consider ensembles of such models. Empirically, we show that this kind of reduction, in spite of its simplicity, can often lead to significant performance gains.
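The scheme described in the abstract can be sketched in a few lines: split the given classes into two groups, train a top-level classifier to decide the group, and train one sub-classifier per group to resolve the final label. The following is a minimal, self-contained illustration of that idea, not the authors' implementation; the names `ReductionStump`, `NearestCentroid`, and `left_classes`, and the use of a nearest-centroid base learner, are assumptions made for the sketch.

```python
class NearestCentroid:
    """Tiny stand-in base learner: predict the class with the closest centroid."""

    def fit(self, X, y):
        sums, counts = {}, {}
        for x, label in zip(X, y):
            s = sums.setdefault(label, [0.0] * len(x))
            for i, v in enumerate(x):
                s[i] += v
            counts[label] = counts.get(label, 0) + 1
        self.centroids = {c: [v / counts[c] for v in s] for c, s in sums.items()}
        return self

    def predict(self, X):
        def sqdist(a, b):
            return sum((u - v) ** 2 for u, v in zip(a, b))
        return [min(self.centroids, key=lambda c: sqdist(x, self.centroids[c]))
                for x in X]


class ReductionStump:
    """Binary split among the classes; one multi-class sub-classifier per side."""

    def __init__(self, make_classifier, left_classes):
        self.make = make_classifier
        self.left = set(left_classes)

    def fit(self, X, y):
        # Top-level binary problem: which side of the class split does x belong to?
        side = [0 if label in self.left else 1 for label in y]
        self.splitter = self.make().fit(X, side)
        # One sub-classifier per group, trained only on that group's examples.
        self.models = []
        for s in (0, 1):
            Xs = [x for x, t in zip(X, side) if t == s]
            ys = [label for label, t in zip(y, side) if t == s]
            self.models.append(self.make().fit(Xs, ys))
        return self

    def predict(self, X):
        sides = self.splitter.predict(X)
        return [self.models[s].predict([x])[0] for x, s in zip(X, sides)]
```

For example, with classes {a} on the left and {b, c} on the right, the splitter first decides left vs. right and the right-hand sub-classifier then distinguishes b from c:

```python
X = [[0, 0], [0, 1], [5, 5], [5, 6], [9, 0], [9, 1]]
y = ["a", "a", "b", "b", "c", "c"]
stump = ReductionStump(NearestCentroid, left_classes={"a"}).fit(X, y)
stump.predict([[0, 0.5], [5, 5.5], [9, 0.5]])  # -> ["a", "b", "c"]
```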

Keywords

    Automated machine learning, Ensembles, Multi-class classification, Reduction


Cite this

Mohr, F., Wever, M., & Hüllermeier, E. (2018). Reduction stumps for multi-class classification. In A. Siebes, W. Duivesteijn, & A. Ukkonen (Eds.), Advances in Intelligent Data Analysis XVII - 17th International Symposium, IDA 2018, Proceedings (pp. 225-237). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11191 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-030-01768-2_19
BibTeX
@inproceedings{a520717d98d641549e4a065fa289a552,
title = "Reduction stumps for multi-class classification",
abstract = "Multi-class classification problems are often solved via reduction, i.e., by breaking the original problem into a set of presumably simpler subproblems (and aggregating the solutions of these problems later on). Typical examples of this approach include decomposition schemes such as one-vs-rest, all-pairs, and nested dichotomies. While all these techniques produce reductions to purely binary subproblems, which is reasonable when only binary classifiers ought to be used, we argue that reductions to other multi-class problems can be interesting, too. In this paper, we examine a new type of (meta-)classifier called reduction stump. A reduction stump creates a binary split among the given classes, thereby creating two subproblems, each of which is solved by a multi-class classifier in turn. On top, the two groups of classes are separated by a binary (or multi-class) classifier. In addition to simple reduction stumps, we consider ensembles of such models. Empirically, we show that this kind of reduction, in spite of its simplicity, can often lead to significant performance gains.",
keywords = "Automated machine learning, Ensembles, Multi-class classification, Reduction",
author = "Felix Mohr and Marcel Wever and Eyke H{\"u}llermeier",
note = "Publisher Copyright: {\textcopyright} Springer Nature Switzerland AG 2018.; 17th International Symposium on Intelligent Data Analysis, IDA 2018 ; Conference date: 24-10-2018 Through 26-10-2018",
year = "2018",
doi = "10.1007/978-3-030-01768-2_19",
language = "English",
isbn = "9783030017675",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Verlag",
pages = "225--237",
editor = "Arno Siebes and Wouter Duivesteijn and Antti Ukkonen",
booktitle = "Advances in Intelligent Data Analysis XVII - 17th International Symposium, IDA 2018, Proceedings",
address = "Germany",

}

RIS

TY - GEN

T1 - Reduction stumps for multi-class classification

AU - Mohr, Felix

AU - Wever, Marcel

AU - Hüllermeier, Eyke

N1 - Publisher Copyright: © Springer Nature Switzerland AG 2018.

PY - 2018

Y1 - 2018

N2 - Multi-class classification problems are often solved via reduction, i.e., by breaking the original problem into a set of presumably simpler subproblems (and aggregating the solutions of these problems later on). Typical examples of this approach include decomposition schemes such as one-vs-rest, all-pairs, and nested dichotomies. While all these techniques produce reductions to purely binary subproblems, which is reasonable when only binary classifiers ought to be used, we argue that reductions to other multi-class problems can be interesting, too. In this paper, we examine a new type of (meta-)classifier called reduction stump. A reduction stump creates a binary split among the given classes, thereby creating two subproblems, each of which is solved by a multi-class classifier in turn. On top, the two groups of classes are separated by a binary (or multi-class) classifier. In addition to simple reduction stumps, we consider ensembles of such models. Empirically, we show that this kind of reduction, in spite of its simplicity, can often lead to significant performance gains.

AB - Multi-class classification problems are often solved via reduction, i.e., by breaking the original problem into a set of presumably simpler subproblems (and aggregating the solutions of these problems later on). Typical examples of this approach include decomposition schemes such as one-vs-rest, all-pairs, and nested dichotomies. While all these techniques produce reductions to purely binary subproblems, which is reasonable when only binary classifiers ought to be used, we argue that reductions to other multi-class problems can be interesting, too. In this paper, we examine a new type of (meta-)classifier called reduction stump. A reduction stump creates a binary split among the given classes, thereby creating two subproblems, each of which is solved by a multi-class classifier in turn. On top, the two groups of classes are separated by a binary (or multi-class) classifier. In addition to simple reduction stumps, we consider ensembles of such models. Empirically, we show that this kind of reduction, in spite of its simplicity, can often lead to significant performance gains.

KW - Automated machine learning

KW - Ensembles

KW - Multi-class classification

KW - Reduction

UR - http://www.scopus.com/inward/record.url?scp=85055693021&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-01768-2_19

DO - 10.1007/978-3-030-01768-2_19

M3 - Conference contribution

AN - SCOPUS:85055693021

SN - 9783030017675

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 225

EP - 237

BT - Advances in Intelligent Data Analysis XVII - 17th International Symposium, IDA 2018, Proceedings

A2 - Siebes, Arno

A2 - Duivesteijn, Wouter

A2 - Ukkonen, Antti

PB - Springer Verlag

T2 - 17th International Symposium on Intelligent Data Analysis, IDA 2018

Y2 - 24 October 2018 through 26 October 2018

ER -
