
Cost-Sensitive Boosting: A Unifying Perspective


If you have a question about this talk, please contact Lars Kunze.

PLEASE NOTE: This AINC seminar will take place on Friday 18 Nov

Speaker’s homepage: http://www.cs.man.ac.uk/~gbrown/

Host: Prof. Jeremy Wyatt

“Boosting” is one of the most studied and deployed algorithms in the history of Machine Learning. In safety-critical data science applications such as health informatics, cost-sensitive predictions are essential. It is therefore unsurprising that “cost-sensitive Boosting” has been widely studied. In this work we provide a unifying perspective on two decades of work on cost-sensitive Boosting algorithms. Analysing the literature from 1997 to 2016, we find 15 distinct cost-sensitive variants of the original algorithm. Each has its own motivation and claims of superiority—so who should we believe? We proceed to critique the literature using four theoretical frameworks: Bayesian decision theory, functional gradient descent, margin theory, and probabilistic modelling. We find that all the algorithms are inconsistent with at least one theory. The conclusion? All experiments and theory suggest taking the original 1997 algorithm, applying the rules of standard decision theory, and sweeping aside almost 20 years of heuristics.
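The recommendation in the abstract can be illustrated with a minimal sketch, assuming the “original 1997 algorithm” refers to discrete AdaBoost (Freund & Schapire, 1997): train an unmodified AdaBoost ensemble, calibrate its margin into a probability via the standard logistic link, and apply the Bayes decision rule with the given misclassification costs. The stump learner, the logistic calibration, and the cost names `c_fp`/`c_fn` below are illustrative choices, not details from the talk.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=50):
    """Plain discrete AdaBoost with decision stumps; labels y in {-1, +1}.
    No cost-sensitive modification is made to the training loop."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial example weights
    stumps = []
    for _ in range(n_rounds):
        best = None
        # Exhaustive search over (feature, threshold, polarity) stumps.
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] <= t, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, s)
        err, j, t, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = s * np.where(X[:, j] <= t, 1, -1)
        w *= np.exp(-alpha * y * pred)          # reweight misclassified points up
        w /= w.sum()
        stumps.append((alpha, j, t, s))
    return stumps

def margin(stumps, X):
    """Ensemble margin F(x) = sum_m alpha_m * h_m(x)."""
    return sum(a * s * np.where(X[:, j] <= t, 1, -1) for a, j, t, s in stumps)

def predict_cost_sensitive(stumps, X, c_fp, c_fn):
    """Standard decision theory applied after training: calibrate the margin
    with the logistic link p = sigmoid(2F), then predict +1 iff the expected
    cost favours it, i.e. p > c_fp / (c_fp + c_fn)."""
    p = 1.0 / (1.0 + np.exp(-2.0 * margin(stumps, X)))
    return np.where(p > c_fp / (c_fp + c_fn), 1, -1)
```

The point of the sketch is that costs enter only at the decision threshold, leaving the 1997 training algorithm untouched, in contrast to the variants that alter the weight updates themselves.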

This talk is part of the Artificial Intelligence and Natural Computation seminars series.

