Linnusson, Henrik
Publications (10 of 12)
Linnusson, H., Johansson, U., Boström, H. & Löfström, T. (2018). Classification With Reject Option Using Conformal Prediction. Paper presented at the Pacific-Asia Conference on Knowledge Discovery and Data Mining, Melbourne, Australia, May 15-18, 2018. Cham
Classification With Reject Option Using Conformal Prediction
2018 (English). Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, we propose a practically useful means of interpreting the predictions produced by a conformal classifier. The proposed interpretation leads to a classifier with a reject option, that allows the user to limit the number of erroneous predictions made on the test set, without any need to reveal the true labels of the test objects. The method described in this paper works by estimating the cumulative error count on a set of predictions provided by a conformal classifier, ordered by their confidence. Given a test set and a user-specified parameter k, the proposed classification procedure outputs the largest possible number of predictions containing on average at most k errors, while refusing to make predictions for test objects where it is too uncertain. We conduct an empirical evaluation using benchmark datasets, and show that we are able to provide accurate estimates for the error rate on the test set.

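The acceptance rule described in the abstract can be sketched in a few lines. The following is a minimal, hypothetical illustration rather than the authors' exact procedure: it assumes an inductive conformal classifier with 1 - p̂(y|x) as nonconformity score and uses confidence (one minus the second-largest p-value) as the per-prediction error estimate; the dataset, model and value of k are placeholders.

```python
# Hypothetical sketch: inductive conformal classifier with a reject option.
# Accepts the longest confidence-ordered prefix of test predictions whose
# estimated cumulative error count stays at or below k; rejects the rest.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# placeholder data and underlying model
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_cal, X_te, y_cal, y_te = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# nonconformity: 1 - estimated probability of the (hypothetical) label
cal_scores = 1.0 - model.predict_proba(X_cal)[np.arange(len(y_cal)), y_cal]

def p_values(probs_row):
    scores = 1.0 - probs_row
    return np.array([(np.sum(cal_scores >= s) + 1) / (len(cal_scores) + 1) for s in scores])

pvals = np.array([p_values(row) for row in model.predict_proba(X_te)])
point_pred = pvals.argmax(axis=1)                  # forced single-label prediction
confidence = 1.0 - np.sort(pvals, axis=1)[:, -2]   # 1 - second-largest p-value

# reject option: order by confidence, accept the longest prefix whose
# estimated cumulative error count stays at or below k
k = 5.0
order = np.argsort(-confidence)
expected_errors = np.cumsum(1.0 - confidence[order])
n_accept = int(np.searchsorted(expected_errors, k, side="right"))
accepted = order[:n_accept]

print(f"accepted {n_accept}/{len(X_te)} predictions; "
      f"actual errors among accepted: {(point_pred[accepted] != y_te[accepted]).sum()}")
```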
Place, publisher, year, edition, pages
Cham, 2018
National Category
Computer Sciences
Identifiers
urn:nbn:se:hb:diva-15265 (URN)
Conference
Pacific-Asia Conference on Knowledge Discovery and Data Mining, Melbourne, Australia, May 15-18, 2018
Projects
DASTARD
Funder
Knowledge Foundation, 20150185
Available from: 2018-10-30 Created: 2018-10-30 Last updated: 2018-11-16. Bibliographically approved
Johansson, U., Löfström, T., Sundell, H., Linnusson, H., Gidenstam, A. & Boström, H. (2018). Venn predictors for well-calibrated probability estimation trees. In: Alex J. Gammerman, Vladimir Vovk, Zhiyuan Luo, Evgueni N. Smirnov and Ralf L. M. Peeters (Eds.), 7th Symposium on Conformal and Probabilistic Prediction and Applications: COPA 2018, 11-13 June 2018, Maastricht, The Netherlands. Paper presented at the 7th Symposium on Conformal and Probabilistic Prediction and Applications, Maastricht, The Netherlands, June 11-13, 2018 (pp. 3-14).
Venn predictors for well-calibrated probability estimation trees
2018 (English). In: 7th Symposium on Conformal and Probabilistic Prediction and Applications: COPA 2018, 11-13 June 2018, Maastricht, The Netherlands / [ed] Alex J. Gammerman, Vladimir Vovk, Zhiyuan Luo, Evgueni N. Smirnov and Ralf L. M. Peeters, 2018, p. 3-14. Conference paper, Published paper (Refereed)
Abstract [en]

Successful use of probabilistic classification requires well-calibrated probability estimates, i.e., the predicted class probabilities must correspond to the true probabilities. The standard solution is to employ an additional step, transforming the outputs from a classifier into probability estimates. In this paper, Venn predictors are compared to Platt scaling and isotonic regression, for the purpose of producing well-calibrated probabilistic predictions from decision trees. The empirical investigation, using 22 publicly available datasets, showed that the probability estimates from the Venn predictor were extremely well-calibrated. In fact, in a direct comparison using the accepted reliability metric, the Venn predictor estimates were the most exact on every data set.

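As a rough illustration of the Venn idea described above, the sketch below builds an inductive Venn predictor on top of a decision tree, using the tree's leaves as the Venn taxonomy. The dataset, the min_samples_leaf setting and the calibration-set (inductive) variant are assumptions for the sketch, not details taken from the paper.

```python
# Hypothetical sketch: an inductive Venn predictor on top of a decision tree,
# with the tree's leaves as the Venn categories (one probability interval per label).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# placeholder data; proper training set for the tree, calibration set for the frequencies
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.5, random_state=1)
X_cal, X_te, y_cal, y_te = train_test_split(X_rest, y_rest, test_size=0.5, random_state=1)

tree = DecisionTreeClassifier(min_samples_leaf=20, random_state=1).fit(X_tr, y_tr)
cal_leaves = tree.apply(X_cal)            # leaf index = Venn category
labels = np.unique(y_tr)

def venn_interval(x):
    """Lower/upper probability estimate per label for one test object."""
    leaf = tree.apply(x.reshape(1, -1))[0]
    in_leaf = cal_leaves == leaf
    rows = []
    for tentative in labels:
        # tentatively place the test object in its category with label `tentative`
        counts = np.array([np.sum(y_cal[in_leaf] == lab) for lab in labels], dtype=float)
        counts[tentative] += 1.0
        rows.append(counts / counts.sum())
    rows = np.array(rows)                 # one row per tentative label
    return rows.min(axis=0), rows.max(axis=0)

lo, hi = venn_interval(X_te[0])
print("per-label probability intervals:", list(zip(labels, lo.round(3), hi.round(3))))
```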
Series
Proceedings of Machine Learning Research
Keywords
Venn predictors, Calibration, Decision trees, Reliability
National Category
Computer Sciences
Research subject
Business and IT
Identifiers
urn:nbn:se:hb:diva-15061 (URN)
Conference
7th Symposium on Conformal and Probabilistic Prediction and Applications, Maastricht, The Netherlands, June 11-13, 2018
Funder
Knowledge Foundation
Available from: 2018-09-04 Created: 2018-09-04 Last updated: 2020-01-29. Bibliographically approved
Linusson, H., Norinder, U., Boström, H., Johansson, U. & Löfström, T. (2017). On the Calibration of Aggregated Conformal Predictors. In: Proceedings of Machine Learning Research. Paper presented at Conformal and Probabilistic Prediction and Applications, Stockholm, Sweden, 13-16 June 2017.
On the Calibration of Aggregated Conformal Predictors
2017 (English). In: Proceedings of Machine Learning Research, 2017. Conference paper, Published paper (Refereed)
Abstract [en]

Conformal prediction is a learning framework that produces models that associate with each of their predictions a measure of statistically valid confidence. These models are typically constructed on top of traditional machine learning algorithms. An important result of conformal prediction theory is that the models produced are provably valid under relatively weak assumptions—in particular, their validity is independent of the specific underlying learning algorithm on which they are based. Since validity is automatic, much research on conformal predictors has been focused on improving their informational and computational efficiency. As part of the efforts in constructing efficient conformal predictors, aggregated conformal predictors were developed, drawing inspiration from the field of classification and regression ensembles. Unlike early definitions of conformal prediction procedures, the validity of aggregated conformal predictors is not fully understood—while it has been shown that they might attain empirical exact validity under certain circumstances, their theoretical validity is conditional on additional assumptions that require further clarification. In this paper, we show why validity is not automatic for aggregated conformal predictors, and provide a revised definition of aggregated conformal predictors that gains approximate validity conditional on properties of the underlying learning algorithm.

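A minimal sketch of the kind of aggregated conformal predictor discussed here: several inductive conformal predictors trained on random splits, with their p-values averaged. The split scheme, the underlying model, the nonconformity function and the dataset are placeholder assumptions; as the paper argues, validity of such an aggregate is not automatic, which is exactly what the empirical coverage check probes.

```python
# Hypothetical sketch of an aggregated conformal classifier: B inductive conformal
# predictors on random train/calibration splits, p-values averaged.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)                        # placeholder data
X_dev, X_te, y_dev, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=2)
labels = np.unique(y_dev)

def icp_p_values(X_tr, y_tr, X_cal, y_cal, X_test):
    """One inductive conformal predictor with nonconformity 1 - p_hat(label)."""
    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    cal = 1.0 - model.predict_proba(X_cal)[np.arange(len(y_cal)), y_cal]
    test_scores = 1.0 - model.predict_proba(X_test)      # one score per candidate label
    return np.array([[(np.sum(cal >= s) + 1) / (len(cal) + 1) for s in row]
                     for row in test_scores])

# aggregate B predictors built on random train/calibration splits by averaging p-values
B = 10
p_agg = np.zeros((len(X_te), len(labels)))
for b in range(B):
    X_tr, X_cal, y_tr, y_cal = train_test_split(X_dev, y_dev, test_size=0.3,
                                                stratify=y_dev, random_state=b)
    p_agg += icp_p_values(X_tr, y_tr, X_cal, y_cal, X_te)
p_agg /= B

eps = 0.1
regions = p_agg > eps                                    # label sets at significance 0.1
coverage = np.mean(regions[np.arange(len(y_te)), y_te])
print(f"empirical coverage at the 90% level: {coverage:.3f}")
```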
National Category
Computer Sciences
Identifiers
urn:nbn:se:hb:diva-13636 (URN)
Conference
Conformal and Probabilistic Prediction and Applications, Stockholm, Sweden, 13-16 June 2017
Available from: 2018-02-09 Created: 2018-02-09 Last updated: 2020-01-29. Bibliographically approved
Linusson, H., Johansson, U., Boström, H. & Löfström, T. (2016). Reliable Confidence Predictions Using Conformal Prediction. In: Lecture Notes in Computer Science. Paper presented at PAKDD 2016: Advances in Knowledge Discovery and Data Mining, Auckland, April 19-22, 2016 (pp. 77-88). Vol. 9651
Reliable Confidence Predictions Using Conformal Prediction
2016 (English). In: Lecture Notes in Computer Science, 2016, Vol. 9651, p. 77-88. Conference paper, Published paper (Refereed)
Abstract [en]

Conformal classifiers output confidence prediction regions, i.e., multi-valued predictions that are guaranteed to contain the true output value of each test pattern with some predefined probability. In order to fully utilize the predictions provided by a conformal classifier, it is essential that those predictions are reliable, i.e., that a user is able to assess the quality of the predictions made. Although conformal classifiers are statistically valid by default, the error probability of the prediction regions output is dependent on their size in such a way that smaller, and thus potentially more interesting, predictions are more likely to be incorrect. This paper proposes, and evaluates, a method for producing refined error probability estimates of prediction regions that takes their size into account. The end result is a binary conformal confidence predictor that is able to provide accurate error probability estimates for those prediction regions containing only a single class label.

National Category
Computer Sciences
Research subject
Business and IT
Identifiers
urn:nbn:se:hb:diva-11963 (URN)
Conference
PAKDD 2016: Advances in Knowledge Discovery and Data Mining, Auckland, April 19-22, 2016
Available from: 2017-03-01 Created: 2017-03-01 Last updated: 2020-01-29. Bibliographically approved
Löfström, T., Boström, H., Linusson, H. & Johansson, U. (2015). Bias Reduction through Conditional Conformal Prediction. Intelligent Data Analysis, 19(6), 1355-1375
Bias Reduction through Conditional Conformal Prediction
2015 (English). In: Intelligent Data Analysis, ISSN 1088-467X, E-ISSN 1571-4128, Vol. 19, no 6, p. 1355-1375. Article in journal (Refereed)
National Category
Computer Sciences
Identifiers
urn:nbn:se:hb:diva-756 (URN), 10.3233/IDA-150786 (DOI)
Projects
High-Performance Data Mining for Drug Effect Detection (DADEL)
Available from: 2015-09-11 Created: 2015-09-11 Last updated: 2020-01-29. Bibliographically approved
Löfström, T., Zhao, J., Linnusson, H. & Jansson, K. (2015). Predicting Adverse Drug Events with Confidence. In: Sławomir Nowaczyk (Ed.), Thirteenth Scandinavian Conference on Artificial Intelligence. Paper presented at the Thirteenth Scandinavian Conference on Artificial Intelligence. IOS Press
Predicting Adverse Drug Events with Confidence
2015 (English). In: Thirteenth Scandinavian Conference on Artificial Intelligence / [ed] Sławomir Nowaczyk, IOS Press, 2015. Conference paper, Published paper (Refereed)
Abstract [en]

This study introduces the conformal prediction framework to the task of predicting the presence of adverse drug events in electronic health records with an associated measure of statistically valid confidence. The imbalanced nature of the problem was addressed both by evaluating different machine learning algorithms, and by comparing different types of conformal predictors. A novel solution was also evaluated, where different underlying models, each model optimized towards one particular class, were combined into a single conformal predictor. This novel solution proved to be superior to previously existing approaches.

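For illustration, the sketch below shows one standard baseline in this setting: a class-conditional (Mondrian) conformal classifier, whose per-class calibration gives validity for the minority class as well. It is not the paper's novel combined multi-model predictor, and the synthetic imbalanced data, model and nonconformity function are assumptions made for the sketch.

```python
# Hypothetical sketch of a class-conditional (Mondrian) conformal classifier:
# calibration scores are split per class, so validity holds for each class
# separately -- a standard remedy for class imbalance in this setting.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# synthetic, heavily imbalanced placeholder data (95% / 5%)
X, y = make_classification(n_samples=4000, weights=[0.95, 0.05], random_state=3)
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.5, stratify=y, random_state=3)
X_cal, X_te, y_cal, y_te = train_test_split(X_rest, y_rest, test_size=0.5,
                                            stratify=y_rest, random_state=3)

model = RandomForestClassifier(n_estimators=200, random_state=3).fit(X_tr, y_tr)
cal_scores = 1.0 - model.predict_proba(X_cal)[np.arange(len(y_cal)), y_cal]

def mondrian_p_values(x):
    """Class-conditional p-values: each candidate label is compared only against
    calibration examples of that same class."""
    probs = model.predict_proba(x.reshape(1, -1))[0]
    p = np.empty(2)
    for label in (0, 1):
        same_class = cal_scores[y_cal == label]
        p[label] = (np.sum(same_class >= 1.0 - probs[label]) + 1) / (len(same_class) + 1)
    return p

eps = 0.1
regions = np.array([mondrian_p_values(x) > eps for x in X_te])
for label in (0, 1):
    hit = regions[y_te == label, label].mean()
    print(f"class {label}: per-class coverage at the 90% level: {hit:.3f}")
```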
Place, publisher, year, edition, pages
IOS Press, 2015
Series
Frontiers in Artificial Intelligence and Applications
Keywords
Adverse Drug Events, Class Imbalance, Conformal Prediction, Predicting with Confidence.
National Category
Computer Sciences
Research subject
Business and IT
Identifiers
urn:nbn:se:hb:diva-3807 (URN), 10.3233/978-1-61499-589-0-88 (DOI), 978-1-61499-589-0 (ISBN)
Conference
Thirteenth Scandinavian Conference on Artificial Intelligence
Projects
High-Performance Data Mining for Drug Effect Detection (DADEL)
Funder
Swedish Foundation for Strategic Research
Available from: 2015-12-08 Created: 2015-12-08 Last updated: 2020-01-29. Bibliographically approved
Löfström, T., Linnusson, H., Sönströd, C. & Johansson, U. (2015). System Health Monitoring using Conformal Anomaly Detection. Högskolan i Borås
System Health Monitoring using Conformal Anomaly Detection
2015 (English). Report (Other (popular science, discussion, etc.))
Place, publisher, year, edition, pages
Högskolan i Borås, 2015. p. 20
Keywords
System health monitoring, conformal anomaly detection
National Category
Computer Sciences
Research subject
Business and IT
Identifiers
urn:nbn:se:hb:diva-9951 (URN)
Projects
Promoting and Enhancing Reuse of Information throughout the Content Lifecycle taking account of Evolving Semantics (PERICLES)
Funder
EU, FP7, Seventh Framework Programme
Note
Technical report
Available from: 2016-05-25 Created: 2016-05-25 Last updated: 2020-01-29. Bibliographically approved
Linusson, H., Johansson, U., Boström, H. & Löfström, T. (2014). Efficiency Comparison of Unstable Transductive and Inductive Conformal Classifiers. Paper presented at Artificial Intelligence Applications and Innovations. Springer
Efficiency Comparison of Unstable Transductive and Inductive Conformal Classifiers
2014 (English). Conference paper, Published paper (Refereed)
Abstract [en]

In the conformal prediction literature, it appears axiomatic that transductive conformal classifiers possess a higher predictive efficiency than inductive conformal classifiers; however, this depends on whether or not the nonconformity function tends to overfit misclassified test examples. With the conformal prediction framework’s increasing popularity, it thus becomes necessary to clarify the settings in which this claim holds true. In this paper, the efficiency of transductive conformal classifiers based on decision tree, random forest and support vector machine classification models is compared to the efficiency of corresponding inductive conformal classifiers. The results show that the efficiency of conformal classifiers based on standard decision trees or random forests is substantially improved when used in the inductive mode, while conformal classifiers based on support vector machines are more efficient in the transductive mode. In addition, an analysis is presented that discusses the effects of calibration set size on inductive conformal classifier efficiency.

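The two modes can be contrasted with a small, hypothetical sketch using decision trees as the underlying model; the dataset, split sizes and nonconformity function are placeholders. Note that an unpruned tree fits the augmented training set perfectly, so the transductive p-values degenerate here, which is the overfitting effect discussed in the abstract.

```python
# Hypothetical sketch contrasting transductive (TCP) and inductive (ICP) conformal
# classification with a decision tree as the underlying model.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)                        # small placeholder data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=20, stratify=y, random_state=4)
labels = np.unique(y_tr)

def nonconformity(model, X_, y_):
    return 1.0 - model.predict_proba(X_)[np.arange(len(y_)), y_]

def tcp_p_values(x):
    """Transductive: refit the model on the training data plus (x, candidate label)."""
    p = np.empty(len(labels))
    for i, lab in enumerate(labels):
        X_aug, y_aug = np.vstack([X_tr, x]), np.append(y_tr, lab)
        model = DecisionTreeClassifier(random_state=0).fit(X_aug, y_aug)
        scores = nonconformity(model, X_aug, y_aug)
        p[i] = np.mean(scores >= scores[-1])             # rank of the test example
    return p

def icp_p_values(X_test):
    """Inductive: one fit on a proper training set, held-out calibration set."""
    X_p, X_c, y_p, y_c = train_test_split(X_tr, y_tr, test_size=0.3,
                                          stratify=y_tr, random_state=4)
    model = DecisionTreeClassifier(random_state=0).fit(X_p, y_p)
    cal = nonconformity(model, X_c, y_c)
    test_scores = 1.0 - model.predict_proba(X_test)
    return np.array([[(np.sum(cal >= s) + 1) / (len(cal) + 1) for s in row]
                     for row in test_scores])

eps = 0.1
tcp_sizes = [np.sum(tcp_p_values(x) > eps) for x in X_te]
icp_sizes = (icp_p_values(X_te) > eps).sum(axis=1)
# an unpruned tree fits the augmented set perfectly, so the TCP regions here stay
# large -- the overfitting effect discussed in the abstract
print(f"avg region size  TCP: {np.mean(tcp_sizes):.2f}  ICP: {icp_sizes.mean():.2f}")
```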
Place, publisher, year, edition, pages
Springer, 2014
Series
IFIP Advances in Information and Communication Technology, ISSN 1868-4238 ; 437
Keywords
Conformal Prediction, Machine learning, Data mining
National Category
Computer Sciences; Computer and Information Sciences
Identifiers
urn:nbn:se:hb:diva-7323 (URN), 10.1007/978-3-662-44722-2_28 (DOI), 2320/14626 (Local ID), 978-3-662-44721-5 (ISBN), 978-3-662-44722-2 (ISBN), 2320/14626 (Archive number), 2320/14626 (OAI)
Conference
Artificial Intelligence Applications and Innovations
Note
Sponsorship: This work was supported by the Swedish Foundation for Strategic Research through the project High-Performance Data Mining for Drug Effect Detection (IIS11-0053) and the Knowledge Foundation through the project Big Data Analytics by Online Ensemble Learning (20120192).
Available from: 2015-12-22 Created: 2015-12-22 Last updated: 2020-01-29
Johansson, U., Boström, H., Löfström, T. & Linusson, H. (2014). Regression conformal prediction with random forests. Machine Learning, 97(1-2), 155-176
Regression conformal prediction with random forests
2014 (English). In: Machine Learning, ISSN 0885-6125, E-ISSN 1573-0565, Vol. 97, no 1-2, p. 155-176. Article in journal (Refereed)
Abstract [en]

Regression conformal prediction produces prediction intervals that are valid, i.e., the probability of excluding the correct target value is bounded by a predefined confidence level. The most important criterion when comparing conformal regressors is efficiency; the prediction intervals should be as tight (informative) as possible. In this study, the use of random forests as the underlying model for regression conformal prediction is investigated and compared to existing state-of-the-art techniques, which are based on neural networks and k-nearest neighbors. In addition to their robust predictive performance, random forests allow for determining the size of the prediction intervals by using out-of-bag estimates instead of requiring a separate calibration set. An extensive empirical investigation, using 33 publicly available data sets, was undertaken to compare the use of random forests to existing state-of-the-art conformal predictors. The results show that the suggested approach, on almost all confidence levels and using both standard and normalized nonconformity functions, produced significantly more efficient conformal predictors than the existing alternatives.

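A minimal sketch of the out-of-bag idea described above, assuming the standard (non-normalized) absolute-error nonconformity and omitting the finite-sample correction; the dataset and forest settings are placeholders rather than the paper's setup.

```python
# Hypothetical sketch: regression conformal prediction with a random forest,
# calibrated on out-of-bag residuals instead of a separate calibration set.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)                   # placeholder data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=5)

rf = RandomForestRegressor(n_estimators=300, oob_score=True, random_state=5).fit(X_tr, y_tr)
residuals = np.abs(y_tr - rf.oob_prediction_)           # out-of-bag nonconformity scores

eps = 0.1                                               # target 90% prediction intervals
q = np.quantile(residuals, 1.0 - eps)                   # finite-sample correction omitted
pred = rf.predict(X_te)
lower, upper = pred - q, pred + q

coverage = np.mean((y_te >= lower) & (y_te <= upper))
print(f"interval half-width {q:.1f}, empirical coverage {coverage:.3f}")
```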
Place, publisher, year, edition, pages
Springer New York LLC, 2014
Keywords
Conformal prediction, Random forests, Regression, Machine learning, Data mining
National Category
Computer Sciences; Computer and Information Sciences
Identifiers
urn:nbn:se:hb:diva-2030 (URN), 10.1007/s10994-014-5453-0 (DOI), 2320/14623 (Local ID), 2320/14623 (Archive number), 2320/14623 (OAI)
Note
Sponsorship: This work was supported by the Swedish Foundation for Strategic Research through the project High-Performance Data Mining for Drug Effect Detection (IIS11-0053) and the Knowledge Foundation through the project Big Data Analytics by Online Ensemble Learning (20120192).
Available from: 2015-11-13 Created: 2015-11-13 Last updated: 2020-01-29
Johansson, U., Sönströd, C., Linusson, H. & Boström, H. (2014). Regression Trees for Streaming Data with Local Performance Guarantees. Paper presented at the IEEE International Conference on Big Data, 27-30 October, 2014, Washington, DC, USA. IEEE
Regression Trees for Streaming Data with Local Performance Guarantees
2014 (English). Conference paper, Published paper (Refereed)
Abstract [en]

Online predictive modeling of streaming data is a key task for big data analytics. In this paper, a novel approach for efficient online learning of regression trees is proposed, which continuously updates, rather than retrains, the tree as more labeled data become available. A conformal predictor outputs prediction sets instead of point predictions, which for regression translate into prediction intervals. The key property of a conformal predictor is that it is always valid, i.e., the error rate, on novel data, is bounded by a preset significance level. Here, we suggest applying Mondrian conformal prediction on top of the resulting models, in order to obtain regression trees where not only the tree, but also each and every rule, corresponding to a path from the root node to a leaf, is valid. Using Mondrian conformal prediction, it becomes possible to analyze and explore the different rules separately, knowing that their accuracy, in the long run, will not be below the preset significance level. An empirical investigation, using 17 publicly available data sets, confirms that the resulting rules are independently valid, but also shows that the prediction intervals are smaller, on average, than when only the global model is required to be valid. All in all, the suggested method provides a data miner or a decision maker with highly informative predictive models of streaming data.

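A batch (non-streaming) sketch of the per-rule guarantee described above: each leaf of a regression tree is treated as a Mondrian category and calibrated with its own residual quantile. The dataset, tree settings and the offline split are assumptions for the sketch; the paper's method additionally updates the tree online.

```python
# Hypothetical batch sketch of Mondrian conformal prediction over the rules of a
# regression tree: one calibration quantile per leaf (i.e. per root-to-leaf rule).
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = load_diabetes(return_X_y=True)                   # placeholder data
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.5, random_state=6)
X_cal, X_te, y_cal, y_te = train_test_split(X_rest, y_rest, test_size=0.5, random_state=6)

tree = DecisionTreeRegressor(min_samples_leaf=30, random_state=6).fit(X_tr, y_tr)

eps = 0.1
cal_leaves = tree.apply(X_cal)
cal_resid = np.abs(y_cal - tree.predict(X_cal))
# one calibration quantile per leaf, i.e. per rule
leaf_q = {leaf: np.quantile(cal_resid[cal_leaves == leaf], 1.0 - eps)
          for leaf in np.unique(cal_leaves)}
global_q = np.quantile(cal_resid, 1.0 - eps)            # fallback for uncalibrated leaves

te_leaves = tree.apply(X_te)
pred = tree.predict(X_te)
half_width = np.array([leaf_q.get(leaf, global_q) for leaf in te_leaves])
covered = np.abs(y_te - pred) <= half_width
print(f"overall coverage {covered.mean():.3f}, mean interval width {2 * half_width.mean():.1f}")
```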
Place, publisher, year, edition, pages
IEEE, 2014
Keywords
Conformal Prediction, Streaming data, Regression trees, Interpretable models, Machine learning, Data mining
National Category
Computer Sciences; Computer and Information Sciences
Identifiers
urn:nbn:se:hb:diva-7324 (URN), 10.1109/BigData.2014.7004263 (DOI), 2320/14627 (Local ID), 978-1-4799-5666-1/14 (ISBN), 2320/14627 (Archive number), 2320/14627 (OAI)
Conference
IEEE International Conference on Big Data, 27-30 October, 2014, Washington, DC, USA
Note
Sponsorship: This work was supported by the Swedish Foundation for Strategic Research through the project High-Performance Data Mining for Drug Effect Detection (IIS11-0053), the Swedish Retail and Wholesale Development Council through the project Innovative Business Intelligence Tools (2013:5) and the Knowledge Foundation through the project Big Data Analytics by Online Ensemble Learning (20120192).
Available from: 2015-12-22 Created: 2015-12-22 Last updated: 2018-01-10