Pruning Decision Trees with Misclassification Costs
We study the pruning of decision tree classifiers in two learning situations: minimizing loss and probability estimation. In addition to the two most common methods for error minimization, CART's cost-complexity pruning and C4.5's error-based pruning, we study the extension of cost-complexity pruning to loss and two pruning variants based on Laplace corrections. We perform an empirical comparison of these methods and evaluate them with respect to three criteria: loss, mean squared error (MSE), and log-loss. We provide a bias-variance decomposition of the MSE to show how pruning affects bias and variance. We found that applying the Laplace correction to estimate the probability distributions at the leaves was beneficial to all pruning methods, both for loss minimization and for estimating probabilities. Unlike in error minimization, and somewhat surprisingly, performing no pruning led to results on par with the other methods in terms of the evaluation criteria. The main advantage of pruning was the reduction in decision tree size, sometimes by a factor of 10. While no method dominated the others on all datasets, even within the same domain different pruning mechanisms are better for different loss matrices. We show this last result using Receiver Operating Characteristic (ROC) curves.
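The Laplace correction and loss-sensitive prediction mentioned in the abstract can be sketched as follows. This is a minimal illustration, not code from the report: the function names `laplace_probs` and `predict_min_loss` are hypothetical, and the loss matrix is an invented example. It shows the standard Laplace-corrected leaf estimate (n_c + 1) / (n + k) and the choice of the class minimizing expected loss under a loss matrix.

```python
# Sketch of Laplace-corrected leaf probabilities and loss-sensitive
# prediction. Function names and the example loss matrix are illustrative.

def laplace_probs(counts):
    """Laplace-corrected class probabilities: (n_c + 1) / (n + k)."""
    n = sum(counts)
    k = len(counts)
    return [(c + 1) / (n + k) for c in counts]

def predict_min_loss(probs, loss_matrix):
    """Return the class j minimizing expected loss sum_i p[i] * L[i][j]."""
    k = len(probs)
    expected = [sum(probs[i] * loss_matrix[i][j] for i in range(k))
                for j in range(k)]
    return min(range(k), key=expected.__getitem__)

# A leaf with 3 examples of class 0 and 1 of class 1: the raw estimate
# would be 0.75 / 0.25; the Laplace correction shrinks it toward uniform.
probs = laplace_probs([3, 1])   # [4/6, 2/6]

# Asymmetric losses: misclassifying a true class-1 example costs 5,
# misclassifying a true class-0 example costs 1.
loss = [[0, 1],
        [5, 0]]

# The high cost of missing class 1 overrides the majority class.
pred = predict_min_loss(probs, loss)   # predicts class 1
```

Under a symmetric (0/1) loss matrix, `predict_min_loss` reduces to ordinary majority-class prediction; the asymmetric matrix above is what makes the pruning and estimation choices studied in the report matter.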
Year of publication:
Authors: Bradford, Jeffrey P.; Kunz, Clayton; Kohavi, Ron; Brunk, Cliff; Brodley, Carla E.
Type of publication: Other
ECE Technical Reports
Persistent link: https://www.econbiz.de/10009430383
Similar items by person
Integrating E-Commerce and Data Mining : Architecture and Challenges
Ansari, Suhail, (2000)
Applications of data mining to electronic commerce
Kohavi, Ron, (2001)
The surprising power of online experiments : getting the most out of A/B and other controlled tests
Kohavi, Ron, (2017)