Efficiency Comparison of Unstable Transductive and Inductive Conformal Classifiers
2014 (English) Conference paper, Published paper (Refereed)
Abstract [en]
In the conformal prediction literature, it appears axiomatic
that transductive conformal classifiers possess higher predictive efficiency
than inductive conformal classifiers; however, this depends on
whether or not the nonconformity function tends to overfit misclassified
test examples. With the conformal prediction framework's increasing
popularity, it thus becomes necessary to clarify the settings in which this
claim holds true. In this paper, the efficiency of transductive conformal
classifiers based on decision tree, random forest and support vector machine
classification models is compared to the efficiency of corresponding
inductive conformal classifiers. The results show that the efficiency of
conformal classifiers based on standard decision trees or random forests
is substantially improved when used in the inductive mode, while conformal
classifiers based on support vector machines are more efficient in
the transductive mode. In addition, an analysis is presented that discusses
the effects of calibration set size on inductive conformal classifier
efficiency.
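To make the distinction concrete, the following is a minimal sketch of an inductive conformal classifier of the kind the abstract compares against the transductive mode. It is not the paper's exact setup: the nonconformity score here is a hypothetical choice (distance to the class centroid of the proper training set, on 1-D data for brevity), standing in for the decision-tree, random-forest, or SVM-based scores studied in the paper. The key inductive ingredients are shown: the model is fit once on the proper training set, nonconformity scores are computed once on a held-out calibration set, and each test example's p-values are compared against a significance level to form a prediction set.

```python
# Sketch of an inductive conformal classifier (ICP).
# Assumption (not from the paper): nonconformity = distance to the
# centroid of the hypothesised class, learned on the proper training set.

def centroids(X, y):
    """Mean of the proper-training-set points for each class label."""
    out = {}
    for label in set(y):
        pts = [x for x, c in zip(X, y) if c == label]
        out[label] = sum(pts) / len(pts)
    return out

def nonconformity(x, label, cents):
    """How badly x conforms to `label` (1-D Euclidean distance)."""
    return abs(x - cents[label])

def icp_predict(X_train, y_train, X_cal, y_cal, x_test, epsilon):
    """Return the prediction set for x_test at significance level epsilon."""
    cents = centroids(X_train, y_train)
    # Calibration scores are computed ONCE on the held-out calibration
    # set -- this is what makes the classifier inductive rather than
    # transductive (no refitting per test example).
    cal = [nonconformity(x, c, cents) for x, c in zip(X_cal, y_cal)]
    pred = set()
    for label in cents:
        a = nonconformity(x_test, label, cents)
        # p-value: fraction of calibration scores at least as
        # nonconforming as the tentative test score.
        p = (sum(s >= a for s in cal) + 1) / (len(cal) + 1)
        if p > epsilon:
            pred.add(label)
    return pred
```

For example, with two well-separated 1-D classes, a test point near class 0 yields a singleton prediction set:

```python
X_train, y_train = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2], [0, 0, 0, 1, 1, 1]
X_cal, y_cal = [0.0, 0.3, 5.0, 5.3], [0, 0, 1, 1]
icp_predict(X_train, y_train, X_cal, y_cal, 0.05, epsilon=0.3)  # → {0}
```

Efficiency in the paper's sense is the size of these prediction sets (smaller is more efficient); the calibration-set size, analysed in the paper, controls the granularity of the p-values.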
Place, publisher, year, edition, pages Springer, 2014.
Series
IFIP Advances in Information and Communication Technology, ISSN 1868-4238 ; 437
Keywords [en]
Conformal Prediction, Machine learning, Data mining
National Category
Computer Sciences; Computer and Information Sciences
Identifiers
URN: urn:nbn:se:hb:diva-7323
DOI: 10.1007/978-3-662-44722-2_28
Local ID: 2320/14626
ISBN: 978-3-662-44721-5 (print)
ISBN: 978-3-662-44722-2 (print)
OAI: oai:DiVA.org:hb-7323
DiVA, id: diva2:888036
Conference Artificial Intelligence Applications and Innovations
Note Sponsorship:
This work was supported by the Swedish Foundation
for Strategic Research through the project High-Performance Data Mining for
Drug Effect Detection (IIS11-0053) and the Knowledge Foundation through the
project Big Data Analytics by Online Ensemble Learning (20120192).
Available from: 2015-12-22 Created: 2015-12-22 Last updated: 2020-01-29