A decision tree is a tree in which each internal node represents a feature (attribute), each branch represents a decision (rule), and each leaf represents an outcome (a categorical or continuous value).
This is a very simple implementation of one, in Python, from scratch. It works only on discrete-valued variables.
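The repository's code is not reproduced here, but the classic way to build such a tree on discrete-valued data is ID3: at each node, split on the feature with the highest information gain (entropy reduction), then recurse. A minimal sketch of that idea follows; the function names and the tiny toy dataset are my own illustration, not the repository's actual code or data.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def best_feature(rows, labels, features):
    """Pick the feature whose split gives the highest information gain."""
    base = entropy(labels)
    def gain(f):
        remainder = 0.0
        for value in set(row[f] for row in rows):
            subset = [lab for row, lab in zip(rows, labels) if row[f] == value]
            remainder += len(subset) / len(labels) * entropy(subset)
        return base - remainder
    return max(features, key=gain)

def build_tree(rows, labels, features):
    """Recursive ID3: returns a nested dict; leaves are class labels."""
    if len(set(labels)) == 1:        # pure node: stop
        return labels[0]
    if not features:                 # no features left: majority vote
        return Counter(labels).most_common(1)[0][0]
    f = best_feature(rows, labels, features)
    tree = {f: {}}
    for value in set(row[f] for row in rows):
        idx = [i for i, row in enumerate(rows) if row[f] == value]
        tree[f][value] = build_tree([rows[i] for i in idx],
                                    [labels[i] for i in idx],
                                    [g for g in features if g != f])
    return tree

# Made-up toy rows in the spirit of the play/weather example
rows = [
    {"outlook": "sunny",    "windy": "false"},
    {"outlook": "sunny",    "windy": "true"},
    {"outlook": "rainy",    "windy": "true"},
    {"outlook": "overcast", "windy": "false"},
]
labels = ["yes", "no", "no", "yes"]
tree = build_tree(rows, labels, ["outlook", "windy"])
print(tree)  # → {'windy': {'false': 'yes', 'true': 'no'}}
```

On this toy data, splitting on `windy` separates the classes perfectly (information gain 1.0 bit), so the tree consists of a single split.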
For example, the following is a decision tree for approving or rejecting loans:
The example code takes the table below, which records whether or not people play in given weather conditions, and builds a decision tree from that data:
Here is what the result looks like:
- Support continuous variables as well, by discretizing them with k-means clustering
- Random forest implementation built on top of this
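The first TODO item could work roughly as follows: cluster each continuous feature into k groups and replace each value with its cluster's label, turning it into a discrete feature the existing tree can handle. A sketch under that assumption, using a tiny hand-rolled 1-D k-means (the data and names here are illustrative only):

```python
def kmeans_1d(values, k, iters=25):
    """Tiny 1-D k-means: returns (cluster index per value, cluster centers)."""
    lo, hi = min(values), max(values)
    # initialize centers evenly across the value range
    centers = [lo + (hi - lo) * (i + 0.5) / k for i in range(k)]
    assign = [0] * len(values)
    for _ in range(iters):
        # assignment step: nearest center
        assign = [min(range(k), key=lambda j: abs(v - centers[j]))
                  for v in values]
        # update step: center = mean of its members
        for j in range(k):
            members = [v for v, a in zip(values, assign) if a == j]
            if members:
                centers[j] = sum(members) / len(members)
    return assign, centers

# Made-up temperature readings, discretized into three named bins
temps = [64, 65, 68, 69, 70, 71, 72, 75, 80, 81, 83, 85]
assign, centers = kmeans_1d(temps, 3)
order = sorted(range(3), key=lambda j: centers[j])
names = {order[0]: "cool", order[1]: "mild", order[2]: "hot"}
discrete = [names[a] for a in assign]
print(discrete)
```

The resulting `"cool"`/`"mild"`/`"hot"` labels can then be fed to the discrete-only tree builder unchanged.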
https://medium.com/deep-math-machine-learning-ai/chapter-4-decision-trees-algorithms-b93975f7a1f1