Details
| Original language | English |
| --- | --- |
| Title of host publication | Advances in Intelligent Data Analysis XVII - 17th International Symposium, IDA 2018, Proceedings |
| Editors | Arno Siebes, Wouter Duivesteijn, Antti Ukkonen |
| Publisher | Springer Verlag |
| Pages | 225-237 |
| Number of pages | 13 |
| ISBN (print) | 9783030017675 |
| Publication status | Published - 2018 |
| Externally published | Yes |
| Event | 17th International Symposium on Intelligent Data Analysis, IDA 2018, 's-Hertogenbosch, Netherlands, 24-26 Oct 2018 |
Publication series
| Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
| --- | --- |
| Volume | 11191 LNCS |
| ISSN (print) | 0302-9743 |
| ISSN (electronic) | 1611-3349 |
Abstract
Multi-class classification problems are often solved via reduction, i.e., by breaking the original problem into a set of presumably simpler subproblems (and aggregating the solutions of these problems later on). Typical examples of this approach include decomposition schemes such as one-vs-rest, all-pairs, and nested dichotomies. While all these techniques produce reductions to purely binary subproblems, which is reasonable when only binary classifiers ought to be used, we argue that reductions to other multi-class problems can be interesting, too. In this paper, we examine a new type of (meta-)classifier called reduction stump. A reduction stump creates a binary split among the given classes, thereby creating two subproblems, each of which is solved by a multi-class classifier in turn. On top, the two groups of classes are separated by a binary (or multi-class) classifier. In addition to simple reduction stumps, we consider ensembles of such models. Empirically, we show that this kind of reduction, in spite of its simplicity, can often lead to significant performance gains.
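The architecture described above can be sketched in plain Python. This is an illustrative toy, not the authors' implementation: the class names `ReductionStump` and `NearestCentroid` are invented for this sketch, the binary split of the label set is supplied by hand rather than searched for as in the paper, and a trivial nearest-centroid learner on 2-D points stands in for both the outer router and the inner multi-class classifiers.

```python
from collections import defaultdict

class NearestCentroid:
    """Toy base learner for 2-D points: predicts the label of the nearest class centroid."""
    def fit(self, X, y):
        sums, counts = defaultdict(lambda: [0.0, 0.0]), defaultdict(int)
        for (a, b), label in zip(X, y):
            sums[label][0] += a
            sums[label][1] += b
            counts[label] += 1
        self.centroids = {c: (s[0] / counts[c], s[1] / counts[c]) for c, s in sums.items()}
        return self

    def predict(self, X):
        return [min(self.centroids,
                    key=lambda c: (x[0] - self.centroids[c][0]) ** 2
                                + (x[1] - self.centroids[c][1]) ** 2)
                for x in X]

class ReductionStump:
    """Split the label set into two groups; an outer binary classifier routes each
    instance to one group, and one inner multi-class classifier per group decides
    the final label."""
    def __init__(self, group_a, make_clf=NearestCentroid):
        self.group_a = set(group_a)   # the hand-chosen binary split of the classes
        self.make_clf = make_clf

    def fit(self, X, y):
        # Outer binary problem: does the true label belong to group A?
        flags = [label in self.group_a for label in y]
        self.router = self.make_clf().fit(X, flags)
        # Inner multi-class problems, one per group of classes.
        a_idx = [i for i, f in enumerate(flags) if f]
        b_idx = [i for i, f in enumerate(flags) if not f]
        self.clf_a = self.make_clf().fit([X[i] for i in a_idx], [y[i] for i in a_idx])
        self.clf_b = self.make_clf().fit([X[i] for i in b_idx], [y[i] for i in b_idx])
        return self

    def predict(self, X):
        routes = self.router.predict(X)
        return [(self.clf_a if r else self.clf_b).predict([x])[0]
                for x, r in zip(X, routes)]
```

In the paper the split (and the classifiers used at each position) are chosen automatically, and ensembles of such stumps are considered; here the split is fixed to keep the routing mechanism visible.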
Keywords
- Automated machine learning
- Ensembles
- Multi-class classification
- Reduction
ASJC Scopus subject areas
- Mathematics (all)
- Theoretical Computer Science
- Computer Science (all)
Cite this
Mohr, F., Wever, M., & Hüllermeier, E. (2018). Reduction stumps for multi-class classification. In A. Siebes, W. Duivesteijn, & A. Ukkonen (Eds.), Advances in Intelligent Data Analysis XVII - 17th International Symposium, IDA 2018, Proceedings (pp. 225-237). (Lecture Notes in Computer Science; Vol. 11191 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-030-01768-2_19
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - Reduction stumps for multi-class classification
AU - Mohr, Felix
AU - Wever, Marcel
AU - Hüllermeier, Eyke
N1 - Publisher Copyright: © Springer Nature Switzerland AG 2018.
PY - 2018
Y1 - 2018
AB - Multi-class classification problems are often solved via reduction, i.e., by breaking the original problem into a set of presumably simpler subproblems (and aggregating the solutions of these problems later on). Typical examples of this approach include decomposition schemes such as one-vs-rest, all-pairs, and nested dichotomies. While all these techniques produce reductions to purely binary subproblems, which is reasonable when only binary classifiers ought to be used, we argue that reductions to other multi-class problems can be interesting, too. In this paper, we examine a new type of (meta-)classifier called reduction stump. A reduction stump creates a binary split among the given classes, thereby creating two subproblems, each of which is solved by a multi-class classifier in turn. On top, the two groups of classes are separated by a binary (or multi-class) classifier. In addition to simple reduction stumps, we consider ensembles of such models. Empirically, we show that this kind of reduction, in spite of its simplicity, can often lead to significant performance gains.
KW - Automated machine learning
KW - Ensembles
KW - Multi-class classification
KW - Reduction
UR - http://www.scopus.com/inward/record.url?scp=85055693021&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-01768-2_19
DO - 10.1007/978-3-030-01768-2_19
M3 - Conference contribution
AN - SCOPUS:85055693021
SN - 9783030017675
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 225
EP - 237
BT - Advances in Intelligent Data Analysis XVII - 17th International Symposium, IDA 2018, Proceedings
A2 - Siebes, Arno
A2 - Duivesteijn, Wouter
A2 - Ukkonen, Antti
PB - Springer Verlag
T2 - 17th International Symposium on Intelligent Data Analysis, IDA 2018
Y2 - 24 October 2018 through 26 October 2018
ER -