Commit c1d455e

Pushing the docs to dev/ for branch: main, commit d1e7caf47062d2a359a48a1846126dad5a60edbd
1 parent a3aaebd commit c1d455e

File tree

1,337 files changed: +7,323 additions, −7,323 deletions


dev/.buildinfo

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: 9337fda070fef4aa74205bfebad0bc41
+config: 4a3cf23906c75b96d6a5527db492c402
 tags: 645f666f9bcd5a90fca523b33c5a78b7

(2 binary files changed; contents not shown)

dev/_downloads/cb9a8a373677fb481fe43a11d8fa0e94/plot_hgbt_regression.ipynb

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-    "\n# Use cases of advanced features in Histogram Gradient Boosting Trees\n\n`histogram_based_gradient_boosting` (HGBT) models may be one of the most\nuseful supervised learning models in scikit-learn. They are based on a modern\ngradient boosting implementation comparable to LightGBM and XGBoost. As such,\nHGBT models are more feature rich than and often outperform alternative models\nlike random forests, especially when the number of samples is larger than some\nten thousands (see\n`sphx_glr_auto_examples_ensemble_plot_forest_hist_grad_boosting_comparison.py`).\n\nThe top usability features of HGBT models are:\n\n1. Several available loss function for mean and quantile regression tasks, see\n `Quantile loss <quantile_support_hgbdt>`.\n2. `categorical_support_gbdt` (see\n `sphx_glr_auto_examples_ensemble_plot_gradient_boosting_categorical.py`).\n3. Early stopping.\n4. `nan_support_hgbt`, which avoids the need for an imputer.\n5. `monotonic_cst_gbdt`.\n6. `interaction_cst_hgbt`.\n\nThis example aims at showcasing all points except 2 and 6 in a real life\nsetting.\n"
+    "\n# Features in Histogram Gradient Boosting Trees\n\n`histogram_based_gradient_boosting` (HGBT) models may be one of the most\nuseful supervised learning models in scikit-learn. They are based on a modern\ngradient boosting implementation comparable to LightGBM and XGBoost. As such,\nHGBT models are more feature rich than and often outperform alternative models\nlike random forests, especially when the number of samples is larger than some\nten thousands (see\n`sphx_glr_auto_examples_ensemble_plot_forest_hist_grad_boosting_comparison.py`).\n\nThe top usability features of HGBT models are:\n\n1. Several available loss function for mean and quantile regression tasks, see\n `Quantile loss <quantile_support_hgbdt>`.\n2. `categorical_support_gbdt` (see\n `sphx_glr_auto_examples_ensemble_plot_gradient_boosting_categorical.py`).\n3. Early stopping.\n4. `nan_support_hgbt`, which avoids the need for an imputer.\n5. `monotonic_cst_gbdt`.\n6. `interaction_cst_hgbt`.\n\nThis example aims at showcasing all points except 2 and 6 in a real life\nsetting.\n"
   ]
 },
 {

dev/_downloads/d108f2283ac3905eb623b32d42217a2b/plot_hgbt_regression.py

Lines changed: 3 additions & 3 deletions

@@ -1,7 +1,7 @@
 """
-===================================================================
-Use cases of advanced features in Histogram Gradient Boosting Trees
-===================================================================
+==============================================
+Features in Histogram Gradient Boosting Trees
+==============================================
 
 :ref:`histogram_based_gradient_boosting` (HGBT) models may be one of the most
 useful supervised learning models in scikit-learn. They are based on a modern

dev/_downloads/scikit-learn-docs.zip

−10.9 KB (binary file not shown)

(4 more binary files changed: +32 B, −6 B, +39 B, −241 B)
