# Introduction to Decision Trees

August 11, 2017

## Decision Tree: Overview

Among supervised data mining techniques, the decision tree is one of the most popular methods for classification and prediction. The training samples are organized into a tree data structure: internal nodes represent attributes of the data set, edges represent the values those attributes can take, and leaf nodes hold the decisions produced by decision tree algorithms (e.g. ID3, C4.5, CART).
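As a quick sketch of this in practice, the following fits a decision tree with scikit-learn, whose `DecisionTreeClassifier` implements an optimized version of CART; the library, data set, and parameters here are illustrative assumptions, not part of the original text:

```python
# Minimal sketch: train a decision tree classifier with scikit-learn
# (its DecisionTreeClassifier is an optimized CART variant).
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)           # attribute values and class labels
clf = DecisionTreeClassifier(random_state=0)
clf.fit(X, y)                               # internal nodes test attributes; leaves hold decisions

print(clf.get_depth(), clf.get_n_leaves())  # size of the learned tree
```

The fitted object exposes the learned tree directly, which is what makes the model a "white box" in the sense discussed below.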

### Example

An example of a decision tree is given in Figure 1.

*Figure 1: Decision tree example*

Figure 1 shows a tree containing decisions: the decision labels (yes or no) are placed in the leaf nodes, while the internal nodes (outlook, humidity, and wind) are attributes from the data set. Because the tree contains both components, it helps make the relationships among the attributes understandable. Such a tree can also be converted into IF-THEN-ELSE rules. For the example above, one rule can be defined as:

IF (Outlook = sun AND Humidity = normal) THEN decision = yes
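To show how such rules behave, the rule above can be encoded directly as nested attribute tests. This is a minimal sketch: only the "sun" branch is stated in the text, so the "overcast" and "rain" branches here follow the classic play-tennis tree and are assumptions:

```python
# Hand-built tree mirroring Figure 1: internal nodes test attributes,
# branches follow attribute values, and leaves return decisions.
def decide(sample):
    """Walk the tree for one sample (a dict of attribute -> value)."""
    if sample["outlook"] == "sun":
        # the rule stated in the text: sun + normal humidity -> yes
        return "yes" if sample["humidity"] == "normal" else "no"
    elif sample["outlook"] == "overcast":
        return "yes"
    else:  # rain
        return "yes" if sample["wind"] == "weak" else "no"

print(decide({"outlook": "sun", "humidity": "normal", "wind": "weak"}))  # prints "yes"
```

Each root-to-leaf path of the tree corresponds to exactly one such IF-THEN rule.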

The following are the key advantages of decision trees:

1. Decision trees are simple to understand and construct, even after only a brief exploration of the data.
2. They require little data preparation. Other techniques often demand data normalization, creation of dummy variables, and removal of blank values.
3. They can handle both numerical and categorical data. Many other techniques are specialized to a single type of variable: neural networks typically work only with numerical values, while association rules deal only with nominal variables.
4. They use a white-box model: when a given condition appears in the model, the rationale for the resulting decision is easily explained by Boolean logic.
5. The model can be validated with statistical tests, which makes it possible to account for its reliability.
6. They are robust, producing promising results even when their assumptions are violated to some extent by the actual model that generated the data.
7. They are time-efficient even with large data sets; standard computing resources suffice to analyze large amounts of data.

The key disadvantages are:

1. Decision-tree learning is based on heuristic algorithms, which offer no guarantee of returning the globally optimal decision tree.
2. Decision-tree learners can generate overly complex trees that fail to generalize the data properly. This is known as overfitting, and mechanisms such as pruning are necessary to avoid it.
3. Decision trees struggle to describe some complex concepts, such as the XOR, parity, or multiplexer problems, for which very large trees are generated. This can be addressed either by changing the representation of the problem domain or by using learning algorithms based on more expressive representations.
4. For categorical variables with many levels, the information gain criterion is biased toward the attributes with more levels.
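Point 2 can be illustrated with a small experiment: an unrestricted tree memorizes label noise in the training data, while limiting the tree's depth (a simple form of pruning; scikit-learn also offers cost-complexity pruning via `ccp_alpha`) usually generalizes better. The data set here is synthetic and purely illustrative:

```python
# Sketch of overfitting vs. pruning on noisy synthetic data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X = rng.rand(400, 5)                     # 5 random features
y = (X[:, 0] > 0.5).astype(int)          # true concept depends on feature 0 only
y[rng.rand(400) < 0.2] ^= 1              # flip ~20% of labels (noise)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
pruned = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

# The full tree fits the training noise perfectly; the pruned tree trades
# training accuracy for a simpler model that tends to test better.
print("full  :", full.score(X_tr, y_tr), full.score(X_te, y_te))
print("pruned:", pruned.score(X_tr, y_tr), pruned.score(X_te, y_te))
```

The exact scores depend on the random seed, but the pattern (perfect training fit for the unrestricted tree, better test behavior for the shallow one) is the typical overfitting signature.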

