Decision tree depth

Date: 2016-10-25 13:54:10

Tags: cart decision-tree

As part of my project I have to use decision trees, and I am using the fitctree function in MATLAB to classify features I extracted with PCA.

I want to control the number of trees and the tree depth in the fitctree function. Does anyone know how I can do this? For example, change the number of trees to 200 and the tree depth to 10. How would I go about this? Is it possible to change these values for a decision tree?

Best,

2 answers:

Answer 0 (score: 1)

fitctree only offers the following input parameters to control the depth of the resulting tree:

  • MaxNumSplits
  • MinLeafSize
  • MinParentSize

https://de.mathworks.com/help/stats/classification-trees-and-regression-trees.html#bsw6baj

You have to use these parameters to control the depth of the tree, because otherwise a decision tree keeps growing until its leaves are pure.
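As an illustration, here is a minimal MATLAB sketch of how these name-value pairs are passed to fitctree. The built-in fisheriris data set stands in for your PCA features, and the values 9, 5 and 10 are arbitrary. Note that MaxNumSplits bounds the total number of splits rather than the depth itself (a tree of depth d uses at most 2^d - 1 splits), so the depth is only controlled indirectly.

% Minimal sketch: restrict tree growth through fitctree's name-value pairs.
% fisheriris is used in place of the PCA features; the values are illustrative.
load fisheriris                     % meas: 150x4 predictors, species: class labels

Mdl = fitctree(meas, species, ...
    'MaxNumSplits', 9, ...          % at most 9 branch-node splits in the whole tree
    'MinLeafSize', 5, ...           % every leaf must contain at least 5 observations
    'MinParentSize', 10);           % only nodes with at least 10 observations are split

view(Mdl, 'Mode', 'graph')          % inspect how deep the fitted tree actually is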

Another possibility is to turn on pruning. Pruning reduces the size of the tree by removing branches that contribute little to classifying instances.
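A rough sketch of pruning a fitted tree, again using fisheriris as stand-in data. fitctree computes the optimal pruning sequence by default ('Prune','on'), and the prune method then cuts the tree back by a chosen number of levels; the level of 2 below is only an example value.

% Minimal sketch: prune a fitted classification tree back to a shallower subtree.
load fisheriris
Mdl = fitctree(meas, species);      % grown with the default pruning sequence ('Prune','on')

PrunedMdl = prune(Mdl, 'Level', 2); % remove 2 pruning levels from the full tree (level chosen arbitrarily)
view(PrunedMdl, 'Mode', 'graph')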

Answer 1 (score: 0)

Let me assume you are using the ID3 algorithm. Its pseudocode can be modified to provide a way to control the tree depth.

ID3 (Examples, Target_Attribute, Attributes, Depth)
// Check the remaining depth; if it is 0, stop growing here and
// return a leaf labelled with the most common value of the target attribute
if (Depth == 0) { Return a single-node tree with label = most common value of the target attribute in Examples }

// Else continue
Create a root node for the tree
If all examples are positive, Return the single-node tree Root, with label = +.
If all examples are negative, Return the single-node tree Root, with label = -.
If number of predicting attributes is empty, then Return the single node tree Root,
with label = most common value of the target attribute in the examples.
Otherwise Begin
    A ← The Attribute that best classifies examples.
    Decision Tree attribute for Root = A.
    For each possible value, vi, of A,
        Add a new tree branch below Root, corresponding to the test A = vi.
        Let Examples(vi) be the subset of examples that have the value vi for A
        If Examples(vi) is empty
            Then below this new branch add a leaf node with label = most common target value in the examples

        // We decrease the value of Depth by 1 so the tree stops growing when it reaches the designated depth
        Else below this new branch add the subtree ID3 (Examples(vi), Target_Attribute, Attributes – {A}, Depth - 1)
End
Return Root

Which algorithm is your fitctree function trying to implement?