Acta Cybernetica, Volume 15, Number 2, 2001
# Learning decision trees in continuous space

**József Dombi and Á. Zsiros**

### Abstract

We discuss two problems of the ID3 and C4.5 decision tree building methods and suggest solutions for them. First, both methods use a $Gain$-type criterion, derived from the entropy function, to compare the applicability of the possible tests. We propose a new measure in place of the entropy function, which comes from a measure of fuzziness based on a monotone fuzzy operator. It is more natural and much simpler to compute in the case of concept learning (when elements belong to only two classes: positive and negative).
Second, the well-known extension of the ID3 method for handling continuous attributes (C4.5) is based on the discretization of attribute values, and it separates the decision space with axis-parallel hyperplanes. Our proposed new method (CDT) handles continuous attributes without discretization and separates the decision space with arbitrary geometric figures, such as hyperplanes in general position, spheres, and ellipsoids. The power of the new method is demonstrated on a few examples.
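
For reference, the classical entropy-based $Gain$ criterion that the paper proposes to replace can be sketched as follows. This is a minimal Python illustration of the standard ID3 formula for the two-class (concept learning) case, not the authors' fuzzy measure; the function names are ours.

```python
import math

def entropy(pos, neg):
    """Two-class entropy of a node with `pos` positive and `neg` negative examples."""
    total = pos + neg
    if total == 0 or pos == 0 or neg == 0:
        return 0.0  # a pure (or empty) node carries no uncertainty
    p = pos / total
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def information_gain(parent, splits):
    """Classical ID3 Gain of a test: entropy of the parent node minus the
    example-weighted average entropy of the child nodes.
    `parent` is a (pos, neg) pair; `splits` is one (pos, neg) pair per outcome."""
    total = sum(p + n for p, n in splits)
    weighted = sum((p + n) / total * entropy(p, n) for p, n in splits)
    return entropy(*parent) - weighted
```

A test with higher $Gain$ is preferred; the paper's point is that for two classes this entropy computation can be replaced by a simpler, fuzziness-based measure.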

### Full text

Available electronic editions: PDF.

### DOI

DOI is not available for this article.

### BibTeX entry
```
@article{Dombi:2001:ActaCybernetica,
  author   = {J{\'o}zsef Dombi and {\'A}. Zsiros},
  title    = {Learning decision trees in continuous space},
  journal  = {Acta Cybernetica},
  year     = {2001},
  volume   = {15},
  number   = {2},
  pages    = {213--224},
  abstract = {Two problems of the ID3 and C4.5 decision tree building methods
    will be mentioned and solutions will be suggested on them. First, in both
    methods a $Gain$-type criteria is used to compare the applicability of
    possible tests, which derives from the entropy function. We are going to
    propose a new measure instead of the entropy function, which comes from the
    measure of fuzziness using a monotone fuzzy operator. It is more natural
    and much simpler to compute in case of concept learning (when elements
    belong to only two classes: positive and negative).
    Second, the well-known extension of the ID3 method for handling continuous
    attributes (C4.5) is based on discretization of attribute values and in it
    the decision space is separated with axis-parallel hyperplanes. In our
    proposed new method (CDT) continuous attributes are handled without
    discretization, and arbitrary geometric figures are used for separation of
    decision space, like hyperplanes in general position, spheres and
    ellipsoids. The power of our new method is going to be demonstrated on a
    few examples.}
}
```