
Commit d6b2210

Pushing the docs to dev/ for branch: main, commit c826fecdf85f864063fbaf4243d50ac37568f159
1 parent 5a77fcd commit d6b2210

1,365 files changed (+10136 / -7361 lines)


dev/.buildinfo

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: ba9f24c8b347e65a19fac0887eb2d7be
+config: c0516397ffc1b12b66e1028264235ddf
 tags: 645f666f9bcd5a90fca523b33c5a78b7

dev/_downloads/0785ea6d45bde062e5beedda88131215/plot_release_highlights_1_3_0.ipynb

Lines changed: 1 addition & 1 deletion
@@ -54,7 +54,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Missing values support in decision trees\nThe classes :class:`tree.DecisionTreeClassifier` and\n:class:`tree.DecisionTreeRegressor` now support missing values. For each potential\nthreshold on the non-missing data, the splitter will evaluate the split with all the\nmissing values going to the left node or the right node.\nMore details in the `User Guide <tree_missing_value_support>`.\n\n"
+"## Missing values support in decision trees\nThe classes :class:`tree.DecisionTreeClassifier` and\n:class:`tree.DecisionTreeRegressor` now support missing values. For each potential\nthreshold on the non-missing data, the splitter will evaluate the split with all the\nmissing values going to the left node or the right node.\nSee more details in the `User Guide <tree_missing_value_support>` or see\n`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py` for a usecase\nexample of this feature in :class:`~ensemble.HistGradientBoostingRegressor`.\n\n"
 ]
 },
 {
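
A minimal sketch of the splitter behavior described above, with illustrative toy data (not from the commit); requires scikit-learn >= 1.3:

import numpy as np
from sklearn.tree import DecisionTreeClassifier

# One feature with a missing entry; the splitter tries sending NaN samples
# to the left and to the right child and keeps the better split.
X = np.array([[0.0], [1.0], [6.0], [np.nan]])
y = [0, 0, 1, 1]

tree = DecisionTreeClassifier(random_state=0).fit(X, y)
print(tree.predict(X))
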
Binary file not shown.

dev/_downloads/2da78c80da33b4e0d313b0a90b923ec8/plot_adaboost_regression.py

Lines changed: 4 additions & 0 deletions
@@ -9,6 +9,10 @@
 regressor. As the number of boosts is increased the regressor can fit more
 detail.
 
+See :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py` for an
+example showcasing the benefits of using more efficient regression models such
+as :class:`~ensemble.HistGradientBoostingRegressor`.
+
 .. [1] `H. Drucker, "Improving Regressors using Boosting Techniques", 1997.
    <https://citeseerx.ist.psu.edu/doc_view/pid/8d49e2dedb817f2c3330e74b63c5fc86d2399ce3>`_
 
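
The comparison this note points to can be sketched directly; the data and settings below are illustrative assumptions, not taken from either example:

import numpy as np
from sklearn.ensemble import AdaBoostRegressor, HistGradientBoostingRegressor
from sklearn.tree import DecisionTreeRegressor

# Noisy 1D sinusoidal data, similar in spirit to the AdaBoost example's setup
rng = np.random.RandomState(0)
X = np.linspace(0, 6, 200).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# AdaBoost.R2 over shallow trees vs. a histogram-based gradient booster
ada = AdaBoostRegressor(DecisionTreeRegressor(max_depth=4),
                        n_estimators=300, random_state=0).fit(X, y)
hgbt = HistGradientBoostingRegressor(random_state=0).fit(X, y)
print(ada.score(X, y), hgbt.score(X, y))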

dev/_downloads/2f3ef774a6d7e52e1e6b7ccbb75d25f0/plot_gradient_boosting_quantile.py

Lines changed: 3 additions & 1 deletion
@@ -4,7 +4,9 @@
 =====================================================
 
 This example shows how quantile regression can be used to create prediction
-intervals.
+intervals. See :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py`
+for an example showcasing some other features of
+:class:`~ensemble.HistGradientBoostingRegressor`.
 
 """
 
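
The prediction-interval idea reads as: fit one model per quantile and use the low and high quantile predictions as the band. A minimal sketch with assumed data (not the example's exact code):

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(500, 1))
y = X.ravel() * np.sin(X.ravel()) + rng.normal(scale=0.5, size=500)

# loss="quantile" with alpha selects which conditional quantile is modeled
lower = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X, y)
upper = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, y)

# Roughly 90% of training targets should fall inside the band
inside = (y >= lower.predict(X)) & (y <= upper.predict(X))
print(inside.mean())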

dev/_downloads/3316f301d7c7651c033565a5ae51c295/plot_release_highlights_1_3_0.py

Lines changed: 3 additions & 1 deletion
@@ -88,7 +88,9 @@
 # :class:`tree.DecisionTreeRegressor` now support missing values. For each potential
 # threshold on the non-missing data, the splitter will evaluate the split with all the
 # missing values going to the left node or the right node.
-# More details in the :ref:`User Guide <tree_missing_value_support>`.
+# See more details in the :ref:`User Guide <tree_missing_value_support>` or see
+# :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py` for a usecase
+# example of this feature in :class:`~ensemble.HistGradientBoostingRegressor`.
 import numpy as np
 from sklearn.tree import DecisionTreeClassifier
 
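
The cross-referenced HGBT example relies on the same idea in :class:`~ensemble.HistGradientBoostingRegressor`, which has handled missing values natively since 0.22. A minimal sketch with illustrative data:

import numpy as np
from sklearn.ensemble import HistGradientBoostingRegressor

X = np.array([[1.0], [2.0], [3.0], [np.nan]])
y = [1.0, 2.0, 3.0, 3.0]

# Samples with NaN are assigned to a learned default child at each split
model = HistGradientBoostingRegressor(min_samples_leaf=1).fit(X, y)
print(model.predict(X))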

dev/_downloads/38e826c9e3778d7de78b2fc671fd7903/plot_adaboost_regression.ipynb

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"\n# Decision Tree Regression with AdaBoost\n\nA decision tree is boosted using the AdaBoost.R2 [1]_ algorithm on a 1D\nsinusoidal dataset with a small amount of Gaussian noise.\n299 boosts (300 decision trees) is compared with a single decision tree\nregressor. As the number of boosts is increased the regressor can fit more\ndetail.\n\n.. [1] [H. Drucker, \"Improving Regressors using Boosting Techniques\", 1997.](https://citeseerx.ist.psu.edu/doc_view/pid/8d49e2dedb817f2c3330e74b63c5fc86d2399ce3)\n"
+"\n# Decision Tree Regression with AdaBoost\n\nA decision tree is boosted using the AdaBoost.R2 [1]_ algorithm on a 1D\nsinusoidal dataset with a small amount of Gaussian noise.\n299 boosts (300 decision trees) is compared with a single decision tree\nregressor. As the number of boosts is increased the regressor can fit more\ndetail.\n\nSee `sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py` for an\nexample showcasing the benefits of using more efficient regression models such\nas :class:`~ensemble.HistGradientBoostingRegressor`.\n\n.. [1] [H. Drucker, \"Improving Regressors using Boosting Techniques\", 1997.](https://citeseerx.ist.psu.edu/doc_view/pid/8d49e2dedb817f2c3330e74b63c5fc86d2399ce3)\n"
 ]
 },
 {

dev/_downloads/4cf0456267ced0f869a458ef4776d4c5/plot_release_highlights_1_1_0.py

Lines changed: 5 additions & 0 deletions
@@ -22,6 +22,8 @@
 """
 
 # %%
+# .. _quantile_support_hgbdt:
+#
 # Quantile loss in :class:`ensemble.HistGradientBoostingRegressor`
 # ----------------------------------------------------------------
 # :class:`~ensemble.HistGradientBoostingRegressor` can model quantiles with
@@ -51,6 +53,9 @@
     ax.plot(X_1d, hist.predict(X), label=quantile)
 _ = ax.legend(loc="lower left")
 
+# %%
+# For a usecase example, see
+# :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py`
 
 # %%
 # `get_feature_names_out` Available in all Transformers
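
One hedged way to check the quantile models fitted in the snippet above is the pinball loss (sklearn.metrics.mean_pinball_loss, available since 0.24); the variable names reuse the snippet's and are otherwise assumptions:

from sklearn.metrics import mean_pinball_loss

# A lower pinball loss at its own alpha means that quantile is fitted well
for label, hist in hist_quantiles.items():
    loss = mean_pinball_loss(y, hist.predict(X), alpha=hist.quantile)
    print(f"{label}: pinball loss = {loss:.4f}")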

dev/_downloads/4f07b03421908788913e044918d8ed1e/plot_release_highlights_0_23_0.py

Lines changed: 2 additions & 1 deletion
@@ -122,7 +122,8 @@
 # specific features. In the following example, we construct a target that is
 # generally positively correlated with the first feature, with some noise.
 # Applying monotoinc constraints allows the prediction to capture the global
-# effect of the first feature, instead of fitting the noise.
+# effect of the first feature, instead of fitting the noise. For a usecase
+# example, see :ref:`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py`.
 import numpy as np
 from matplotlib import pyplot as plt
 from sklearn.model_selection import train_test_split
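
The monotonic-constraint behavior described above can be sketched as follows; data and settings are illustrative assumptions, not the example's exact code:

import numpy as np
from sklearn.ensemble import HistGradientBoostingRegressor

rng = np.random.RandomState(0)
X = rng.uniform(size=(500, 1))
y = 5 * X.ravel() + rng.normal(scale=1.0, size=500)  # positive trend plus noise

# monotonic_cst=[1] forces predictions to be non-decreasing in feature 0
constrained = HistGradientBoostingRegressor(monotonic_cst=[1]).fit(X, y)

grid = np.linspace(0, 1, 100).reshape(-1, 1)
print(np.all(np.diff(constrained.predict(grid)) >= 0))  # expected: True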

dev/_downloads/68fdea23e50d165632d4bd4e36453cd5/plot_release_highlights_1_1_0.ipynb

Lines changed: 8 additions & 1 deletion
@@ -11,7 +11,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Quantile loss in :class:`ensemble.HistGradientBoostingRegressor`\n:class:`~ensemble.HistGradientBoostingRegressor` can model quantiles with\n`loss=\"quantile\"` and the new parameter `quantile`.\n\n"
+"\n## Quantile loss in :class:`ensemble.HistGradientBoostingRegressor`\n:class:`~ensemble.HistGradientBoostingRegressor` can model quantiles with\n`loss=\"quantile\"` and the new parameter `quantile`.\n\n"
 ]
 },
 {
@@ -25,6 +25,13 @@
 "from sklearn.ensemble import HistGradientBoostingRegressor\nimport numpy as np\nimport matplotlib.pyplot as plt\n\n# Simple regression function for X * cos(X)\nrng = np.random.RandomState(42)\nX_1d = np.linspace(0, 10, num=2000)\nX = X_1d.reshape(-1, 1)\ny = X_1d * np.cos(X_1d) + rng.normal(scale=X_1d / 3)\n\nquantiles = [0.95, 0.5, 0.05]\nparameters = dict(loss=\"quantile\", max_bins=32, max_iter=50)\nhist_quantiles = {\n f\"quantile={quantile:.2f}\": HistGradientBoostingRegressor(\n **parameters, quantile=quantile\n ).fit(X, y)\n for quantile in quantiles\n}\n\nfig, ax = plt.subplots()\nax.plot(X_1d, y, \"o\", alpha=0.5, markersize=1)\nfor quantile, hist in hist_quantiles.items():\n ax.plot(X_1d, hist.predict(X), label=quantile)\n_ = ax.legend(loc=\"lower left\")"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"For a usecase example, see\n`sphx_glr_auto_examples_ensemble_plot_hgbt_regression.py`\n\n"
+]
+},
 {
 "cell_type": "markdown",
 "metadata": {},