A Strategy for Attributes Selection in Cost-Sensitive Decision Trees Induction
Decision tree learning is one of the most widely used and practical methods for inductive inference. A fundamental issue in decision tree induction is the attribute selection measure applied at each non-terminal node of the tree. However, the existing literature has not adequately accounted for both classification ability and cost sensitivity. In this paper, we present a new strategy for attribute selection that trades off an attribute's information content against cost-sensitive learning, including misclassification costs and test costs measured in different units, when selecting splitting attributes in cost-sensitive decision tree induction. The experimental results show that our method outperforms existing methods, such as the information gain method and total-cost methods, in terms of reduced misclassification costs across different missing rates and various cost settings on UCI datasets.
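The trade-off the abstract describes can be sketched in a minimal way. The snippet below is an illustrative assumption, not the paper's actual criterion: it scores each candidate attribute by its information gain penalized by its test cost, with a hypothetical weight `w` balancing the two terms.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Reduction in entropy from splitting on the attribute at attr_index."""
    n = len(labels)
    base = entropy(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr_index], []).append(y)
    return base - sum(len(g) / n * entropy(g) for g in groups.values())

def cost_sensitive_score(rows, labels, attr_index, test_cost, w=0.5):
    """Hypothetical trade-off: information gain penalized by test cost.

    w=1 selects purely by information; w=0 purely by cheapness.
    The paper's actual combination of misclassification and test
    costs is more elaborate; this only illustrates the idea.
    """
    gain = information_gain(rows, labels, attr_index)
    return w * gain - (1 - w) * test_cost

# Toy data: attribute 0 is informative but expensive to test,
# attribute 1 is cheap but uninformative.
rows = [("sunny", "low"), ("sunny", "high"), ("rain", "low"), ("rain", "high")]
labels = ["yes", "yes", "no", "no"]
test_costs = [0.8, 0.1]

best = max(range(2),
           key=lambda i: cost_sensitive_score(rows, labels, i, test_costs[i]))
print(best)  # attribute 0: its gain outweighs its higher test cost here
```

With `w = 0.5`, attribute 0 (gain 1.0, cost 0.8) scores 0.1 while attribute 1 (gain 0.0, cost 0.1) scores -0.05, so the informative attribute is still chosen; a smaller `w` would flip the choice toward the cheaper test.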
Year of publication: 2008
Authors: Zhang Shichao; Liu Li; Zhu Xiaofeng; Shan Chen
Other persons: Xiangjian He, Qiang Wu (contributors)
Publisher: IEEE Computer Society
freely available
Similar items by person
-
Mining follow-up correlation patterns from time-related databases
Zhang, Shichao, (2008)
-
Chen, Shan, (2008)
-
The impact of stochastic convenience yield on long-term forestry investment decisions
Chen, Shan, (2011)