Thresholding a Random Forest classifier

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

External organisations

  • Purdue University

Details

Original language: English
Title of host publication: Advances in Visual Computing
Subtitle: 10th International Symposium, ISVC 2014, Proceedings
Publisher: Springer Verlag
Pages: 95-106
Number of pages: 12
ISBN (electronic): 9783319143637
Publication status: Published - 2014
Event: 10th International Symposium on Visual Computing, ISVC 2014 - Las Vegas, United States
Duration: 8 Dec 2014 - 10 Dec 2014

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 8888
ISSN (print): 0302-9743
ISSN (electronic): 1611-3349

Abstract

The original Random Forest derives its final result from the number of leaf nodes that voted for the corresponding class. Each leaf node is treated equally and the class with the most votes wins. However, certain leaf nodes in the topology have better classification accuracies while others often lead to a wrong decision. In addition, the forest's performance differs between classes due to uneven class proportions. In this work, a novel voting mechanism is introduced: each leaf node is assigned an individual weight. The final decision is not determined by majority voting but by a linear combination of individual weights, leading to a better and more robust decision. This method is inspired by the construction of a strong classifier from a linear combination of small rules of thumb (AdaBoost). Small fluctuations caused by the use of binary decision trees are thus better balanced. Experimental results on several datasets for object recognition and action recognition demonstrate that our method successfully improves the classification accuracy of the original Random Forest algorithm.
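For illustration, the following Python sketch shows one way to realize weighted leaf-node voting on top of scikit-learn's RandomForestClassifier. It is a simplified version under stated assumptions, not the authors' implementation: here the leaf weights are simply estimated as per-leaf accuracy on a held-out validation split, whereas the paper derives its own weighting/thresholding scheme, and leaves not seen during validation fall back to a weight of 1.

# A minimal sketch of weighted leaf-node voting, NOT the paper's exact algorithm.
# Assumption: each leaf's weight is its accuracy on a held-out validation split;
# test samples are then classified by a linear combination of the weights of the
# leaves they reach, instead of an unweighted majority vote.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=0)
X_fit, X_val, y_fit, y_val = train_test_split(X_train, y_train, test_size=0.5, random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_fit, y_fit)
class_index = {c: i for i, c in enumerate(forest.classes_)}

# Estimate one weight per (tree, leaf): the fraction of validation samples
# reaching that leaf for which the tree's prediction is correct.
val_leaves = forest.apply(X_val)                       # shape: (n_val, n_trees)
leaf_weights = []                                      # per tree: {leaf_id: (weight, class)}
for t, tree in enumerate(forest.estimators_):
    tree_pred = tree.predict(X_val)
    table = {}
    for leaf in np.unique(val_leaves[:, t]):
        mask = val_leaves[:, t] == leaf
        weight = np.mean(tree_pred[mask] == y_val[mask])
        table[leaf] = (weight, tree_pred[mask][0])
    leaf_weights.append(table)

def weighted_leaf_vote(X_new):
    """Classify by summing leaf weights per class (linear combination of weights)."""
    leaves = forest.apply(X_new)
    scores = np.zeros((len(X_new), len(forest.classes_)))
    for t, tree in enumerate(forest.estimators_):
        fallback_pred = tree.predict(X_new)            # used for leaves unseen in validation
        for i in range(len(X_new)):
            weight, cls = leaf_weights[t].get(leaves[i, t], (1.0, fallback_pred[i]))
            scores[i, class_index[cls]] += weight
    return forest.classes_[np.argmax(scores, axis=1)]

print("weighted leaf voting accuracy:", np.mean(weighted_leaf_vote(X_test) == y_test))
print("plain majority voting accuracy:", forest.score(X_test, y_test))

On a small dataset the two voting schemes may score similarly; the sketch is only meant to show how per-leaf weights replace the one-leaf-one-vote rule described in the abstract.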

ASJC Scopus subject areas

Cite this

Thresholding a Random Forest classifier. / Baumann, Florian; Li, Fangda; Ehlers, Arne et al.
Advances in Visual Computing: 10th International Symposium, ISVC 2014, Proceedings. Springer Verlag, 2014. pp. 95-106 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 8888).


Baumann, F, Li, F, Ehlers, A & Rosenhahn, B 2014, Thresholding a Random Forest classifier. in Advances in Visual Computing: 10th International Symposium, ISVC 2014, Proceedings. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 8888, Springer Verlag, pp. 95-106, 10th International Symposium on Visual Computing, ISVC 2014, Las Vegas, United States, 8 Dec. 2014. https://doi.org/10.1007/978-3-319-14364-4_10
Baumann, F., Li, F., Ehlers, A., & Rosenhahn, B. (2014). Thresholding a Random Forest classifier. In Advances in Visual Computing: 10th International Symposium, ISVC 2014, Proceedings (pp. 95-106). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 8888). Springer Verlag. https://doi.org/10.1007/978-3-319-14364-4_10
Baumann F, Li F, Ehlers A, Rosenhahn B. Thresholding a Random Forest classifier. In: Advances in Visual Computing: 10th International Symposium, ISVC 2014, Proceedings. Springer Verlag. 2014. p. 95-106. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). doi: 10.1007/978-3-319-14364-4_10
Baumann, Florian ; Li, Fangda ; Ehlers, Arne et al. / Thresholding a Random Forest classifier. Advances in Visual Computing: 10th International Symposium, ISVC 2014, Proceedings. Springer Verlag, 2014. pp. 95-106 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).
BibTeX
@inproceedings{d42f948b79b94f20a32834853757c215,
title = "Thresholding a Random Forest classifier",
abstract = "The original Random Forest derives the final result with respect to the number of leaf nodes voted for the corresponding class. Each leaf node is treated equally and the class with the most number of votes wins. Certain leaf nodes in the topology have better classification accuracies and others often lead to a wrong decision. Also the performance of the forest for different classes differs due to uneven class proportions. In this work, a novel voting mechanism is introduced: each leaf node has an individual weight. The final decision is not determined by majority voting but rather by a linear combination of individual weights leading to a better and more robust decision. This method is inspired by the construction of a strong classifier using a linear combination of small rules of thumb (AdaBoost). Small fluctuations which are caused by the use of binary decision trees are better balanced. Experimental results on several datasets for object recognition and action recognition demonstrate that our method successfully improves the classification accuracy of the original Random Forest algorithm.",
author = "Florian Baumann and Fangda Li and Arne Ehlers and Bodo Rosenhahn",
note = "Funding information: This work has been partially funded by the ERC within the starting grant Dynamic MinVIP.; 10th International Symposium on Visual Computing, ISVC 2014 ; Conference date: 08-12-2014 Through 10-12-2014",
year = "2014",
doi = "10.1007/978-3-319-14364-4_10",
language = "English",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Verlag",
pages = "95--106",
booktitle = "Advances in Visual Computing",
address = "Germany",

}

RIS

TY - GEN

T1 - Thresholding a Random Forest classifier

AU - Baumann, Florian

AU - Li, Fangda

AU - Ehlers, Arne

AU - Rosenhahn, Bodo

N1 - Funding information: This work has been partially funded by the ERC within the starting grant Dynamic MinVIP.

PY - 2014

Y1 - 2014

N2 - The original Random Forest derives the final result with respect to the number of leaf nodes voted for the corresponding class. Each leaf node is treated equally and the class with the most number of votes wins. Certain leaf nodes in the topology have better classification accuracies and others often lead to a wrong decision. Also the performance of the forest for different classes differs due to uneven class proportions. In this work, a novel voting mechanism is introduced: each leaf node has an individual weight. The final decision is not determined by majority voting but rather by a linear combination of individual weights leading to a better and more robust decision. This method is inspired by the construction of a strong classifier using a linear combination of small rules of thumb (AdaBoost). Small fluctuations which are caused by the use of binary decision trees are better balanced. Experimental results on several datasets for object recognition and action recognition demonstrate that our method successfully improves the classification accuracy of the original Random Forest algorithm.

AB - The original Random Forest derives the final result with respect to the number of leaf nodes voted for the corresponding class. Each leaf node is treated equally and the class with the most number of votes wins. Certain leaf nodes in the topology have better classification accuracies and others often lead to a wrong decision. Also the performance of the forest for different classes differs due to uneven class proportions. In this work, a novel voting mechanism is introduced: each leaf node has an individual weight. The final decision is not determined by majority voting but rather by a linear combination of individual weights leading to a better and more robust decision. This method is inspired by the construction of a strong classifier using a linear combination of small rules of thumb (AdaBoost). Small fluctuations which are caused by the use of binary decision trees are better balanced. Experimental results on several datasets for object recognition and action recognition demonstrate that our method successfully improves the classification accuracy of the original Random Forest algorithm.

UR - http://www.scopus.com/inward/record.url?scp=84916597623&partnerID=8YFLogxK

U2 - 10.1007/978-3-319-14364-4_10

DO - 10.1007/978-3-319-14364-4_10

M3 - Conference contribution

AN - SCOPUS:84916597623

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 95

EP - 106

BT - Advances in Visual Computing

PB - Springer Verlag

T2 - 10th International Symposium on Visual Computing, ISVC 2014

Y2 - 8 December 2014 through 10 December 2014

ER -
