
Introduction to Decision Trees

August 11, 2017

Decision Tree: Overview

A decision tree is a data mining technique that helps make decisions using the available facts. Decision trees are useful not only for decision-making applications but also for classification and prediction tasks. Popular decision tree algorithms include ID3, C4.5, and CART; these are supervised learning algorithms, and during training the input samples are organized into a tree data structure.
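For a concrete feel of how such an algorithm is used, the sketch below trains a CART-style tree with scikit-learn's DecisionTreeClassifier; the tiny weather-style data set, its integer encoding, and the parameter choices are assumptions made purely for illustration.

```python
# A minimal sketch of training a CART-style decision tree with scikit-learn.
# The weather-style data set and its encoding are made up for illustration.
from sklearn.tree import DecisionTreeClassifier

# Attributes encoded as integers: outlook (0=sunny, 1=overcast, 2=rain),
# humidity (0=normal, 1=high), wind (0=weak, 1=strong).
X = [
    [0, 1, 0],  # sunny, high humidity, weak wind
    [0, 0, 0],  # sunny, normal humidity, weak wind
    [1, 1, 1],  # overcast, high humidity, strong wind
    [2, 0, 1],  # rain, normal humidity, strong wind
    [2, 1, 0],  # rain, high humidity, weak wind
]
y = ["no", "yes", "yes", "no", "yes"]  # class labels (the decision)

# During training the samples are organized into a tree data structure.
clf = DecisionTreeClassifier(criterion="entropy", random_state=0)
clf.fit(X, y)

# Predict the decision for an unseen sample: sunny, normal humidity, strong wind.
print(clf.predict([[0, 0, 1]]))
```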




Example

An example of a decision tree is given in Figure 1, where the nodes of the tree represent the attributes of the data set. Edges connect two nodes using the values of the available attributes, and the leaf nodes of the tree are the decision nodes.


Figure 1: Decision tree example

Figure 1 demonstrates a decision tree. Here the decision labels (yes or no) are also known as class labels; in a decision tree, the class labels are placed on the leaf nodes. The nodes outlook, humidity, and wind are attributes of the data set. Because the tree graphically represents both the attributes and the decisions in the data set, it helps in understanding the relationship between them. A decision tree can also be expressed in the form of IF-THEN-ELSE rules, for example:

IF (Outlook = sun AND Humidity = normal) THEN Decision = yes
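As an illustration only, such rules can be written directly as ordinary conditional code. The sketch below hand-codes a hypothetical rule set loosely following Figure 1; the attribute values and the rule for the rain branch are assumptions, not taken from the original figure.

```python
# A decision tree expressed as IF-THEN-ELSE rules (hypothetical rule set).
def decision(outlook: str, humidity: str, wind: str) -> str:
    if outlook == "sun":
        # Under a sunny outlook, the humidity attribute decides.
        return "yes" if humidity == "normal" else "no"
    elif outlook == "overcast":
        return "yes"
    else:  # rain
        # Under rain, the wind attribute decides (an assumed rule).
        return "yes" if wind == "weak" else "no"

print(decision("sun", "normal", "weak"))     # yes
print(decision("rain", "normal", "strong"))  # no
```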



Advantages

Decision tree algorithms have many uses and applications. The following key advantages make decision trees widely accepted:

  1. A decision tree is simple to understand and interpret; people can follow a decision tree model after a brief explanation.
  2. It requires little data preparation. Other techniques often demand data normalization, creation of dummy variables, and removal of blank values.
  3. It can handle numerical as well as categorical data. Many other techniques are specialized for data sets with only one type of variable; for example, neural networks work only with numerical values, while association rules are used only with nominal variables.
  4. It uses a white-box model. If a given condition is observable in the model, the explanation for it is easily expressed in Boolean logic (a sketch illustrating this follows the list).
  5. The model can be validated with statistical tests, which makes it possible to account for its reliability.
  6. It is robust and produces good results even when its assumptions are somewhat violated by the true model that generated the data.
  7. It is time-efficient even with large data sets, which can be analyzed using standard computing resources.
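
To illustrate the white-box point (item 4), a fitted tree can be printed as human-readable split conditions. The sketch below uses scikit-learn's export_text on a made-up two-attribute data set; the feature names, values, and encoding are assumptions for the example.

```python
# A minimal sketch: inspecting a fitted tree as readable rules (white-box model).
from sklearn.tree import DecisionTreeClassifier, export_text

# Made-up encoding: outlook (0=sunny, 1=overcast, 2=rain), humidity (0=normal, 1=high).
X = [[0, 1], [0, 0], [1, 1], [2, 0], [2, 1]]
y = ["no", "yes", "yes", "no", "yes"]

clf = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)

# export_text lists every split, so each prediction can be traced back
# to explicit Boolean tests on the attributes.
print(export_text(clf, feature_names=["outlook", "humidity"]))
```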

Disadvantages

  1. Decision-tree learning algorithms are based on heuristics, so they cannot guarantee returning the globally optimal decision tree.
  2. Decision tree learners can generate overly complex trees that do not generalize the data well. This is known as overfitting, and mechanisms such as pruning are needed to avoid it (a sketch illustrating pruning follows this list).
  3. Decision trees struggle to express some concepts, such as XOR, parity, or multiplexer problems, for which very large trees are generated. This can be addressed either by changing the representation of the problem domain or by using learning algorithms based on more expressive representations.
  4. For categorical variables with many levels, the information gain in decision trees is biased in favor of the attributes with more levels.
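
To make the overfitting point (item 2) concrete, the sketch below restrains tree growth in two common ways with scikit-learn: a depth cap (pre-pruning) and cost-complexity pruning via ccp_alpha (post-pruning). The data set is synthetic and the parameter values are assumptions chosen only for illustration.

```python
# Two common ways to curb overfitting in decision trees (scikit-learn sketch).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fully grown tree: likely to fit the training data very closely.
full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# 1) Pre-pruning: stop the tree from growing too deep in the first place.
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

# 2) Post-pruning: cost-complexity pruning removes branches that add little value.
pruned = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X_tr, y_tr)

for name, model in [("full", full), ("shallow", shallow), ("pruned", pruned)]:
    print(name, "test accuracy:", round(model.score(X_te, y_te), 3))
```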

