
Commit 0f3a3f9

Pushing the docs to dev/ for branch: main, commit 0238d1f1076ec849c2836faf23cebafc3f2f9e7a

1 parent 27b27a4 · commit 0f3a3f9

File tree

1,322 files changed · +7,165 −7,165 lines changed


dev/.buildinfo

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: 0a7af102b32f002f14fad512c871e2cb
+config: bd9c0ef975f43dabbc9cb1bad322c12a
 tags: 645f666f9bcd5a90fca523b33c5a78b7

dev/_downloads/21a6ff17ef2837fe1cd49e63223a368d/plot_unveil_tree_structure.py

Lines changed: 1 addition & 1 deletion

@@ -68,7 +68,7 @@
 # - ``weighted_n_node_samples[i]``: the weighted number of training samples
 #   reaching node ``i``
 # - ``value[i, j, k]``: the summary of the training samples that reached node i for
-#   class j and output k.
+#   output j and class k (for regression tree, class is set to 1).
 #
 # Using the arrays, we can traverse the tree structure to compute various
 # properties. Below, we will compute the depth of each node and whether or not
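The docstring fix above swaps the index order of ``value[i, j, k]`` (output ``j``, class ``k``, not the other way around). A minimal sketch to confirm that ordering on a fitted tree, assuming scikit-learn and its bundled iris dataset are available:

```python
# Check that the second axis of tree_.value indexes outputs and the third
# indexes classes, matching the corrected ``value[i, j, k]`` docstring.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
tree = clf.tree_

# Single-output classifier with 3 classes:
# shape is (node_count, n_outputs, n_classes) = (node_count, 1, 3).
print(tree.value.shape)
```

For a single-output regressor the class axis has length 1, which is what the added parenthetical "(for regression tree, class is set to 1)" refers to.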

dev/_downloads/f7a387851c5762610f4e8197e52bbbca/plot_unveil_tree_structure.ipynb

Lines changed: 1 addition & 1 deletion

@@ -40,7 +40,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Tree structure\n\nThe decision classifier has an attribute called ``tree_`` which allows access\nto low level attributes such as ``node_count``, the total number of nodes,\nand ``max_depth``, the maximal depth of the tree. The\n``tree_.compute_node_depths()`` method computes the depth of each node in the\ntree. `tree_` also stores the entire binary tree structure, represented as a\nnumber of parallel arrays. The i-th element of each array holds information\nabout the node ``i``. Node 0 is the tree's root. Some of the arrays only\napply to either leaves or split nodes. In this case the values of the nodes\nof the other type is arbitrary. For example, the arrays ``feature`` and\n``threshold`` only apply to split nodes. The values for leaf nodes in these\narrays are therefore arbitrary.\n\nAmong these arrays, we have:\n\n - ``children_left[i]``: id of the left child of node ``i`` or -1 if leaf\n node\n - ``children_right[i]``: id of the right child of node ``i`` or -1 if leaf\n node\n - ``feature[i]``: feature used for splitting node ``i``\n - ``threshold[i]``: threshold value at node ``i``\n - ``n_node_samples[i]``: the number of training samples reaching node\n ``i``\n - ``impurity[i]``: the impurity at node ``i``\n - ``weighted_n_node_samples[i]``: the weighted number of training samples\n reaching node ``i``\n - ``value[i, j, k]``: the summary of the training samples that reached node i for\n class j and output k.\n\nUsing the arrays, we can traverse the tree structure to compute various\nproperties. Below, we will compute the depth of each node and whether or not\nit is a leaf.\n\n"
+"## Tree structure\n\nThe decision classifier has an attribute called ``tree_`` which allows access\nto low level attributes such as ``node_count``, the total number of nodes,\nand ``max_depth``, the maximal depth of the tree. The\n``tree_.compute_node_depths()`` method computes the depth of each node in the\ntree. `tree_` also stores the entire binary tree structure, represented as a\nnumber of parallel arrays. The i-th element of each array holds information\nabout the node ``i``. Node 0 is the tree's root. Some of the arrays only\napply to either leaves or split nodes. In this case the values of the nodes\nof the other type is arbitrary. For example, the arrays ``feature`` and\n``threshold`` only apply to split nodes. The values for leaf nodes in these\narrays are therefore arbitrary.\n\nAmong these arrays, we have:\n\n - ``children_left[i]``: id of the left child of node ``i`` or -1 if leaf\n node\n - ``children_right[i]``: id of the right child of node ``i`` or -1 if leaf\n node\n - ``feature[i]``: feature used for splitting node ``i``\n - ``threshold[i]``: threshold value at node ``i``\n - ``n_node_samples[i]``: the number of training samples reaching node\n ``i``\n - ``impurity[i]``: the impurity at node ``i``\n - ``weighted_n_node_samples[i]``: the weighted number of training samples\n reaching node ``i``\n - ``value[i, j, k]``: the summary of the training samples that reached node i for\n output j and class k (for regression tree, class is set to 1).\n\nUsing the arrays, we can traverse the tree structure to compute various\nproperties. Below, we will compute the depth of each node and whether or not\nit is a leaf.\n\n"
 ]
 },
 {
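The notebook text changed above ends by promising to "compute the depth of each node and whether or not it is a leaf" from the parallel arrays. A minimal sketch of that traversal, assuming scikit-learn is installed (variable names here are illustrative, not from the notebook):

```python
# Walk the tree using only children_left / children_right; a node is a leaf
# when both child ids are -1, as the changed docstring describes.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
t = clf.tree_

depth = [0] * t.node_count
is_leaf = [False] * t.node_count
stack = [(0, 0)]  # (node id, depth); node 0 is the root
while stack:
    node, d = stack.pop()
    depth[node] = d
    if t.children_left[node] == -1:  # -1 marks a leaf node
        is_leaf[node] = True
    else:
        stack.append((t.children_left[node], d + 1))
        stack.append((t.children_right[node], d + 1))

print(sum(is_leaf), "leaves out of", t.node_count, "nodes")
```

Because every split node in a scikit-learn tree has exactly two children, the leaf count always works out to ``(node_count + 1) // 2``.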

dev/_downloads/scikit-learn-docs.zip

-7.35 KB
Binary file not shown.
