Using Imaginary Ensembles to Select GP Classifiers
2010 (English). In: Genetic Programming: 13th European Conference, EuroGP 2010, Istanbul, Turkey, April 7-9, 2010, Proceedings / [ed] A.I. Esparcia-Alcazar et al., Springer-Verlag Berlin Heidelberg, 2010, p. 278-288. Conference paper, Published paper (Refereed)
Abstract [en]
When predictive modeling requires comprehensible models, most data miners will use specialized techniques producing rule sets or decision trees. This study, however, shows that genetically evolved decision trees may very well outperform the more specialized techniques. The proposed approach evolves a number of decision trees and then uses one of several suggested selection strategies to pick one specific tree from that pool. The inherent inconsistency of evolution makes it possible to evolve each tree using all data and still obtain somewhat different models. The main idea is to use these quite accurate and slightly diverse trees to form an imaginary ensemble, which is then used as a guide when selecting one specific tree. Simply put, the tree classifying the largest number of instances identically to the ensemble is chosen. In the experiments, using 25 UCI data sets, two selection strategies obtained significantly higher accuracy than the standard rule inducer J48.
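The selection step described in the abstract can be illustrated with a minimal sketch: the imaginary ensemble is the per-instance majority vote of all evolved trees, and the tree agreeing with that vote on the most instances is selected. This is an illustrative reconstruction from the abstract, not the authors' code; the function name and the example predictions are hypothetical.

```python
from collections import Counter

def select_tree(predictions):
    """Pick the tree that agrees most with the imaginary ensemble.

    predictions: list of lists; predictions[t][i] is tree t's
    predicted class label for instance i.
    Returns (index of selected tree, ensemble vote per instance).
    """
    n_instances = len(predictions[0])
    # Imaginary ensemble: majority vote over all trees, per instance.
    ensemble = [
        Counter(tree[i] for tree in predictions).most_common(1)[0][0]
        for i in range(n_instances)
    ]
    # Agreement of each tree with the ensemble vote.
    agreement = [
        sum(p == e for p, e in zip(tree, ensemble))
        for tree in predictions
    ]
    best = max(range(len(predictions)), key=lambda t: agreement[t])
    return best, ensemble

# Hypothetical predictions from three evolved trees on five instances.
preds = [
    ["a", "b", "a", "a", "b"],
    ["a", "b", "b", "a", "b"],
    ["b", "b", "a", "a", "a"],
]
best, vote = select_tree(preds)  # tree 0 matches the vote on all 5 instances
```

Note that no separate validation set is needed for the selection: the ensemble vote itself serves as the yardstick, which is what makes it "imaginary".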
Place, publisher, year, edition, pages
Springer-Verlag Berlin Heidelberg, 2010, p. 278-288
Series
LNCS ; 6021
Keywords [en]
classification, decision trees, ensembles, genetic programming, Machine learning
National Category
Computer Sciences; Information Systems
Identifiers
URN: urn:nbn:se:hb:diva-6401
Local ID: 2320/6793
ISBN: 978-3-642-12147-0 (print)
OAI: oai:DiVA.org:hb-6401
DiVA, id: diva2:887089
Note
Sponsorship:
This work was supported by the INFUSIS project (www.his.se/infusis) at the University of Skövde, Sweden, in partnership with the Swedish Knowledge Foundation under grant 2008/0502.
Available from: 2015-12-22, Created: 2015-12-22, Last updated: 2020-01-29