Improved Threshold Selection by Using Calibrated Probabilities for Random Forest Classifiers

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

External Research Organisations

  • University of Science and Technology of China

Details

Original language: English
Title of host publication: Proceedings
Subtitle of host publication: 2015 12th Conference on Computer and Robot Vision, CRV 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 155-160
Number of pages: 6
ISBN (electronic): 9781479919864
Publication status: Published - 14 Jul 2015
Event: 12th Conference on Computer and Robot Vision, CRV 2015 - Halifax, Canada
Duration: 3 Jun 2015 - 5 Jun 2015

Abstract

Random Forest is a well-known ensemble learning method that achieves high recognition accuracy while retaining a fast training procedure. To construct a Random Forest classifier, several decision trees are arranged in a forest, and a majority vote yields the final decision. To split a node of a decision tree into two children, several candidate variables are randomly selected and a splitting criterion is computed for each of them. From this pool of possible splits, the Random Forest algorithm selects the best variable according to the splitting criterion. Often, this split is unreliable, which reduces recognition accuracy. In this paper, we propose an additional condition for selecting the best variable that improves recognition accuracy, especially for smaller numbers of trees. We enhance the standard threshold selection with a quality estimate computed by a calibration method. The proposed method is evaluated on machine learning as well as object recognition datasets.
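The split-selection step the abstract describes — drawing a random pool of candidate variables and keeping the one that scores best under a splitting criterion — can be sketched in plain Python. This is a minimal illustration using Gini impurity as the criterion; the function names are hypothetical, and the calibration-based quality estimate the paper adds on top of this step is not shown:

```python
import random

def gini(labels):
    """Gini impurity of a label multiset: 1 - sum of squared class proportions."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(X, y, n_candidates=2, rng=random):
    """Standard Random Forest node split: sample a random pool of feature
    indices, score every threshold on each feature by its impurity decrease,
    and return the (feature, threshold, gain) triple with the largest gain."""
    n_features = len(X[0])
    pool = rng.sample(range(n_features), n_candidates)
    parent = gini(y)
    best = (None, None, -1.0)  # (feature index, threshold, impurity gain)
    for f in pool:
        for t in sorted({row[f] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            if not left or not right:  # degenerate split, skip
                continue
            gain = parent - (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if gain > best[2]:
                best = (f, t, gain)
    return best
```

The paper's point is that the gain value alone can be an unreliable basis for this choice; the proposed method additionally weighs candidates by a calibrated probability estimate before committing to a split.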

Keywords

    Machine Learning, Object Recognition, Random Forest, Splitting

Cite this

Improved Threshold Selection by Using Calibrated Probabilities for Random Forest Classifiers. / Baumann, Florian; Chen, Jinghui; Vogt, Karsten et al.
Proceedings: 2015 12th Conference on Computer and Robot Vision, CRV 2015. Institute of Electrical and Electronics Engineers Inc., 2015. p. 155-160 7158334.

Baumann, F, Chen, J, Vogt, K & Rosenhahn, B 2015, Improved Threshold Selection by Using Calibrated Probabilities for Random Forest Classifiers. in Proceedings: 2015 12th Conference on Computer and Robot Vision, CRV 2015., 7158334, Institute of Electrical and Electronics Engineers Inc., pp. 155-160, 12th Conference on Computer and Robot Vision, CRV 2015, Halifax, Canada, 3 Jun 2015. https://doi.org/10.1109/crv.2015.28
Baumann, F., Chen, J., Vogt, K., & Rosenhahn, B. (2015). Improved Threshold Selection by Using Calibrated Probabilities for Random Forest Classifiers. In Proceedings: 2015 12th Conference on Computer and Robot Vision, CRV 2015 (pp. 155-160). Article 7158334. Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/crv.2015.28
Baumann F, Chen J, Vogt K, Rosenhahn B. Improved Threshold Selection by Using Calibrated Probabilities for Random Forest Classifiers. In Proceedings: 2015 12th Conference on Computer and Robot Vision, CRV 2015. Institute of Electrical and Electronics Engineers Inc. 2015. p. 155-160. 7158334 doi: 10.1109/crv.2015.28
Baumann, Florian ; Chen, Jinghui ; Vogt, Karsten et al. / Improved Threshold Selection by Using Calibrated Probabilities for Random Forest Classifiers. Proceedings: 2015 12th Conference on Computer and Robot Vision, CRV 2015. Institute of Electrical and Electronics Engineers Inc., 2015. pp. 155-160
@inproceedings{86ab45d006eb4a5d8d2809f58d83bbdf,
title = "Improved Threshold Selection by Using Calibrated Probabilities for Random Forest Classifiers",
abstract = "Random Forest is a well-known ensemble learning method that achieves high recognition accuracies while preserving a fast training procedure. To construct a Random Forest classifier, several decision trees are arranged in a forest while a majority voting leads to the final decision. In order to split each node of a decision tree into two children, several possible variables are randomly selected while a splitting criterion is computed for each of them. Using this pool of possible splits, the Random Forest algorithm selects the best variable according to the splitting criterion. Often, this splitting is not reliable leading to a reduced recognition accuracy. In this paper, we propose to introduce an additional condition for selecting the best variable leading to an improvement of the recognition accuracy, especially for a smaller number of trees. We enhance the standard threshold selection by a quality estimation that is computed using a calibration method. The proposed method is evaluated on datasets for machine learning as well as object recognition.",
keywords = "Machine Learning, Object Recognition, Random Forest, Splitting",
author = "Florian Baumann and Jinghui Chen and Karsten Vogt and Bodo Rosenhahn",
year = "2015",
month = jul,
day = "14",
doi = "10.1109/crv.2015.28",
language = "English",
pages = "155--160",
booktitle = "Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
address = "United States",
note = "12th Conference on Computer and Robot Vision, CRV 2015 ; Conference date: 03-06-2015 Through 05-06-2015",
}


TY - GEN
T1 - Improved Threshold Selection by Using Calibrated Probabilities for Random Forest Classifiers
AU - Baumann, Florian
AU - Chen, Jinghui
AU - Vogt, Karsten
AU - Rosenhahn, Bodo
PY - 2015/7/14
Y1 - 2015/7/14
N2 - Random Forest is a well-known ensemble learning method that achieves high recognition accuracies while preserving a fast training procedure. To construct a Random Forest classifier, several decision trees are arranged in a forest while a majority voting leads to the final decision. In order to split each node of a decision tree into two children, several possible variables are randomly selected while a splitting criterion is computed for each of them. Using this pool of possible splits, the Random Forest algorithm selects the best variable according to the splitting criterion. Often, this splitting is not reliable leading to a reduced recognition accuracy. In this paper, we propose to introduce an additional condition for selecting the best variable leading to an improvement of the recognition accuracy, especially for a smaller number of trees. We enhance the standard threshold selection by a quality estimation that is computed using a calibration method. The proposed method is evaluated on datasets for machine learning as well as object recognition.
AB - Random Forest is a well-known ensemble learning method that achieves high recognition accuracies while preserving a fast training procedure. To construct a Random Forest classifier, several decision trees are arranged in a forest while a majority voting leads to the final decision. In order to split each node of a decision tree into two children, several possible variables are randomly selected while a splitting criterion is computed for each of them. Using this pool of possible splits, the Random Forest algorithm selects the best variable according to the splitting criterion. Often, this splitting is not reliable leading to a reduced recognition accuracy. In this paper, we propose to introduce an additional condition for selecting the best variable leading to an improvement of the recognition accuracy, especially for a smaller number of trees. We enhance the standard threshold selection by a quality estimation that is computed using a calibration method. The proposed method is evaluated on datasets for machine learning as well as object recognition.
KW - Machine Learning
KW - Object Recognition
KW - Random Forest
KW - Splitting
UR - http://www.scopus.com/inward/record.url?scp=84943159205&partnerID=8YFLogxK
U2 - 10.1109/crv.2015.28
DO - 10.1109/crv.2015.28
M3 - Conference contribution
AN - SCOPUS:84943159205
SP - 155
EP - 160
BT - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 12th Conference on Computer and Robot Vision, CRV 2015
Y2 - 3 June 2015 through 5 June 2015
ER -
