So, as you saw, decision trees are natural classifiers. But they can be used for regression as well, simply by averaging the target values of the training records that end up in each leaf.
As an example (not an official one, and admittedly a bit silly), suppose we want to predict someone's age. We ask whether they have a walking stick, and if they don't, we ask how tall they are. Each of the resulting three leaves then predicts an average age.
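To make that concrete, here is a minimal sketch using scikit-learn's DecisionTreeRegressor; the walking-stick and height data are invented purely to mirror the toy example. The point to notice is that each leaf predicts the mean age of the training records that land in it.

```python
# Minimal sketch of a regression tree; the data below is invented
# purely to mirror the walking-stick / height example.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Features: [has_walking_stick (0/1), height_cm]; target: age in years.
X = np.array([
    [1, 165], [1, 170],   # walking stick -> older people
    [0, 120], [0, 130],   # no stick, short -> children
    [0, 175], [0, 180],   # no stick, tall -> adults
])
y = np.array([78, 82, 9, 11, 34, 38])

# Two levels (stick?, then height?) and at least two records per leaf,
# so the tree reproduces the two questions from the example.
model = DecisionTreeRegressor(max_depth=2, min_samples_leaf=2)
model.fit(X, y)

# Each leaf predicts the mean of the ages that fell into it.
print(model.predict([[1, 168], [0, 125], [0, 178]]))  # [80. 10. 36.]
```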
With more realistic examples,
regression decision trees are actually quite a reasonable tool.
But this example was probably based on maybe just a dozen records, and when we don't give them enough data, decision trees can easily overfit: they simply partition on whatever data they can see at each level of the tree.
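To illustrate that overfitting point, here is a hedged sketch on a dozen made-up noisy records: an unconstrained tree grows one leaf per record and memorises the noise, while a constraint such as min_samples_leaf forces each leaf to average over several records.

```python
# Sketch of overfitting on tiny data; the dataset is synthetic.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(12, 1))             # just a dozen records
y = np.sin(X.ravel()) + rng.normal(0, 0.3, 12)   # noisy target

full = DecisionTreeRegressor().fit(X, y)                 # no limits at all
pruned = DecisionTreeRegressor(min_samples_leaf=3).fit(X, y)

print(full.get_n_leaves())    # typically 12: one leaf per record
print(pruned.get_n_leaves())  # at most 4: each leaf averages >= 3 records
```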
Real decision trees are often much deeper than
these artificial examples I just showed you.
So, although explainability is often given as a good reason to use decision trees and the other tree algorithms we'll look at this week, in practice realistic trees tend to be large and complex, and so the tree diagrams themselves tend to be large, complex, and not that helpful for explainability.
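To get a feel for that, here is a small sketch using scikit-learn's built-in diabetes dataset (my choice of dataset, not one from the lecture): even a few hundred records grow a tree whose textual dump is far too long to read as an "explanation".

```python
# Sketch of how unwieldy a realistic, unconstrained tree becomes.
from sklearn.datasets import load_diabetes
from sklearn.tree import DecisionTreeRegressor, export_text

data = load_diabetes()                     # 442 records, 10 features
model = DecisionTreeRegressor(random_state=0).fit(data.data, data.target)

print(model.get_depth())                   # roughly 15-20 levels deep
rules = export_text(model, feature_names=list(data.feature_names),
                    max_depth=model.get_depth())  # dump the whole tree
print(len(rules.splitlines()))             # hundreds of lines of rules
```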
But anyway, let's go on to look at some enhancements of the decision tree idea.