Gain(Weather) = I(P(Result)) - Remainder(Weather)
P(Result = Loss | Weather = Rainy)
Look at all examples where Weather = Rainy;
among those, count the ones where Result = Loss.
→ 1/2
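A minimal sketch of this counting step in Python. The example table below is purely illustrative (the original data set is not reproduced here); its rows are chosen so that the ratio comes out to 1/2.

```python
# Illustrative example table; the attribute names match the notes, the rows are assumed.
examples = [
    {"Weather": "Rainy", "Opponent": "Weak",   "Result": "Win"},
    {"Weather": "Rainy", "Opponent": "Strong", "Result": "Loss"},
    {"Weather": "Sunny", "Opponent": "Weak",   "Result": "Win"},
    {"Weather": "Sunny", "Opponent": "Strong", "Result": "Loss"},
]

# All examples with Weather = Rainy ...
rainy = [e for e in examples if e["Weather"] == "Rainy"]
# ... and among those, the ones with Result = Loss.
loss_and_rainy = [e for e in rainy if e["Result"] == "Loss"]

print(len(loss_and_rainy) / len(rainy))  # 0.5 for this illustrative table
```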
How does the information-theoretic algorithm choose an attribute?
It chooses the attribute with the highest information gain.
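A minimal sketch of this selection in Python, assuming examples are dictionaries with categorical attributes and a Result column as in the table above; the helper names entropy, gain, and choose_attribute are mine, not part of the course material.

```python
import math
from collections import Counter

def entropy(labels):
    """I(P): entropy of a label distribution, in bits."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def gain(examples, attribute, target="Result"):
    """Gain(attribute) = I(P(target)) - Remainder(attribute)."""
    remainder = 0.0
    for value in {e[attribute] for e in examples}:
        subset = [e[target] for e in examples if e[attribute] == value]
        remainder += len(subset) / len(examples) * entropy(subset)
    return entropy([e[target] for e in examples]) - remainder

def choose_attribute(examples, attributes):
    """Information-theoretic choice: the attribute with the highest gain."""
    return max(attributes, key=lambda a: gain(examples, a))
```

For instance, choose_attribute(examples, ["Weather", "Opponent"]) returns whichever of the two attributes yields the larger gain on the given examples.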
How can such a minimal set A be exploited to find a decision tree?
e.g. A = {Opponent, Location}
Only the attributes in A are needed in the tree. So smaller sets yield smaller trees.
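A sketch of how such a set can be exploited, reusing Counter and choose_attribute from the sketch above: the learner is simply restricted to splitting on attributes in A, so the tree can never grow deeper than |A|. The recursive builder below is a simplified illustration, not the course's exact algorithm.

```python
def build_tree(examples, attributes, target="Result"):
    """Grow a tree that only tests attributes from the given (minimal) set A."""
    labels = [e[target] for e in examples]
    if len(set(labels)) == 1 or not attributes:
        # Pure node, or no allowed attributes left: predict the majority label.
        return Counter(labels).most_common(1)[0][0]
    best = choose_attribute(examples, attributes)   # highest information gain within A
    rest = [a for a in attributes if a != best]
    branches = {
        value: build_tree([e for e in examples if e[best] == value], rest, target)
        for value in {e[best] for e in examples}
    }
    return (best, branches)

# With a minimal set such as A = {Opponent, Location}, the learner is called with
# exactly those attribute names, so the tree tests at most |A| = 2 attributes.
```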
You want to build a decision list for the result using tests with literals of the form attribute = value. Give the shortest possible decision list.
If Opponent = Weak then Result = Win else Result = Loss.
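Written out as code, this one-rule list is an ordered literal test followed by a default. The sketch below is only an illustration; the value "Strong" for a non-weak opponent is an assumption.

```python
def decision_list(example):
    """The decision list above: one literal test, then the default outcome."""
    if example["Opponent"] == "Weak":   # first (and only) rule
        return "Win"
    return "Loss"                       # default rule at the end of the list

print(decision_list({"Opponent": "Weak"}))    # Win
print(decision_list({"Opponent": "Strong"}))  # Loss (assumed attribute value)
```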
SVM: support vector machine
Give the hypothesis space for finding a linear separator.
The set of functions h(x) = w · x + b with w = (w1, w2), for real numbers w1, w2, b.
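A minimal sketch of one hypothesis from this space for two-dimensional inputs; thresholding w · x + b at zero to obtain a class label is my addition, and the parameter values are arbitrary.

```python
def linear_separator(w1, w2, b):
    """One hypothesis: classify x = (x1, x2) by the sign of w · x + b."""
    def h(x1, x2):
        return 1 if w1 * x1 + w2 * x2 + b >= 0 else -1
    return h

h = linear_separator(1.0, -2.0, 0.5)  # arbitrary illustrative parameters
print(h(3.0, 1.0))   # 1*3 - 2*1 + 0.5 =  1.5  ->  1
print(h(0.0, 2.0))   # 1*0 - 2*2 + 0.5 = -3.5  -> -1
```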