One Tree to Explain Them All
Johansson, Ulf (University of Borås, School of Business and IT, CSL@BS)
Sönströd, Cecilia (University of Borås, School of Business and IT, CSL@BS)
Löfström, Tuve (University of Borås, School of Business and IT, CSL@BS) ORCID iD: 0000-0003-0274-9026
2011 (English). Conference paper, published paper (refereed)
Abstract [en]

Random forest is a widely used ensemble technique, renowned for its high predictive performance. Random forest models are, however, inherently opaque due to their sheer complexity, making human interpretation and analysis impossible. This paper presents a method for approximating the random forest with just one decision tree. The approach uses oracle coaching, a recently suggested technique in which a weaker but transparent model is generated from a combination of regular training data and test data initially labeled by a strong classifier, called the oracle. In this study, the random forest plays the part of the oracle, while the transparent models are decision trees generated either by the standard tree inducer J48 or by evolving genetic programs. Evaluation on 30 data sets from the UCI repository shows that oracle coaching significantly improves both accuracy and area under the ROC curve, compared to using training data only. In fact, the resulting single-tree models are as accurate as the random forest on the specific test instances. Most importantly, this is not achieved by inducing or evolving huge trees with perfect fidelity; a large majority of the trees are instead rather compact and clearly comprehensible. The experiments also show that evolution outperformed J48 with regard to accuracy, but that this came at the expense of slightly larger trees.
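
The oracle-coaching procedure summarized in the abstract (train a strong oracle, let it label the unlabeled test instances, then induce a single transparent tree on the combined data) can be illustrated with a short sketch. Note that the paper itself uses J48 and evolved genetic programs as the transparent learners; the sketch below is only a minimal illustration under assumed tooling, with scikit-learn's RandomForestClassifier as the oracle, a CART-style DecisionTreeClassifier standing in for J48, and an arbitrary built-in data set in place of the UCI benchmarks.

    # Minimal oracle-coaching sketch (assumed scikit-learn stand-ins, not the paper's J48/GP setup).
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # 1) Train the opaque but strong model: the oracle.
    oracle = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)

    # 2) Let the oracle label the test instances (their true labels are not used here).
    oracle_labels = oracle.predict(X_test)

    # 3) Induce one transparent tree on training data plus oracle-labeled test data.
    X_coached = np.vstack([X_train, X_test])
    y_coached = np.concatenate([y_train, oracle_labels])
    coached_tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_coached, y_coached)

    # Baseline: the same tree inducer trained on the training data only.
    plain_tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_train, y_train)

    print("oracle  accuracy:", accuracy_score(y_test, oracle.predict(X_test)))
    print("coached accuracy:", accuracy_score(y_test, coached_tree.predict(X_test)))
    print("plain   accuracy:", accuracy_score(y_test, plain_tree.predict(X_test)))
    # Fidelity: how often the coached tree agrees with the oracle on the test instances.
    print("fidelity:", accuracy_score(oracle_labels, coached_tree.predict(X_test)))

Fidelity here measures agreement between the coached tree and the oracle on the test instances; as the abstract notes, the paper's coached trees reach oracle-level accuracy on those instances without requiring perfect fidelity or very large trees.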

Place, publisher, year, edition, pages
IEEE, 2011.
Keywords [en]
genetic programming, random forest, oracle coaching, decision trees, machine learning
Keywords [sv]
Data mining
National Category
Computer Sciences; Computer and Information Sciences
Research subject
Business and IT
Identifiers
URN: urn:nbn:se:hb:diva-6680
Local ID: 2320/9855
ISBN: 978-1-4244-7834-7 (print)
OAI: oai:DiVA.org:hb-6680
DiVA id: diva2:887380
Conference
IEEE Congress on Evolutionary Computation (CEC)
Note

Sponsorship: This work was supported by the INFUSIS project (www.his.se/infusis) at the University of Skövde, Sweden, in partnership with the Swedish Knowledge Foundation under grant 2008/0502.

Available from: 2015-12-22. Created: 2015-12-22. Last updated: 2020-01-29.

Open Access in DiVA

fulltext (116 kB)
File name: FULLTEXT01.pdf
File size: 116 kB
Checksum (SHA-512): 9e110b053bb404bc781ba62a064543de61f921063b10402e7b68a87cd9bdfbb0594fea1e25d87d74a0aaf19027c485c6f96fb8b2b513a3115728c9195d7e3e5a
Type: fulltext
Mimetype: application/pdf

