Skip to content

Commit 11b862d

Pushing the docs to dev/ for branch: main, commit c3bfe86b45577a9405a4680d9971efa9594a0657
1 parent a5fbed6 commit 11b862d

File tree

1,255 files changed

+4550
-4526
lines changed


dev/_downloads/21b82d82985712b5de6347f382c77c86/plot_partial_dependence.ipynb

Lines changed: 4 additions & 4 deletions
@@ -202,7 +202,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## 1-way partial dependence with different models\n\nIn this section, we will compute 1-way partial dependence with two different\nmachine-learning models: (i) a multi-layer perceptron and (ii) a\ngradient-boosting model. With these two models, we illustrate how to compute and\ninterpret both partial dependence plot (PDP) for both numerical and categorical\nfeatures and individual conditional expectation (ICE).\n\n#### Multi-layer perceptron\n\nLet's fit a :class:`~sklearn.neural_network.MLPRegressor` and compute\nsingle-variable partial dependence plots.\n\n"
+"## 1-way partial dependence with different models\n\nIn this section, we will compute 1-way partial dependence with two different\nmachine-learning models: (i) a multi-layer perceptron and (ii) a\ngradient-boosting model. With these two models, we illustrate how to compute and\ninterpret both partial dependence plot (PDP) for both numerical and categorical\nfeatures and individual conditional expectation (ICE).\n\n### Multi-layer perceptron\n\nLet's fit a :class:`~sklearn.neural_network.MLPRegressor` and compute\nsingle-variable partial dependence plots.\n\n"
 ]
 },
 {
@@ -238,7 +238,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"#### Gradient boosting\n\nLet's now fit a :class:`~sklearn.ensemble.HistGradientBoostingRegressor` and\ncompute the partial dependence on the same features. We also use the\nspecific preprocessor we created for this model.\n\n"
+"### Gradient boosting\n\nLet's now fit a :class:`~sklearn.ensemble.HistGradientBoostingRegressor` and\ncompute the partial dependence on the same features. We also use the\nspecific preprocessor we created for this model.\n\n"
 ]
 },
 {
@@ -274,7 +274,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"#### Analysis of the plots\n\nWe will first look at the PDPs for the numerical features. For both models, the\ngeneral trend of the PDP of the temperature is that the number of bike rentals is\nincreasing with temperature. We can make a similar analysis but with the opposite\ntrend for the humidity features. The number of bike rentals is decreasing when the\nhumidity increases. Finally, we see the same trend for the wind speed feature. The\nnumber of bike rentals is decreasing when the wind speed is increasing for both\nmodels. We also observe that :class:`~sklearn.neural_network.MLPRegressor` has much\nsmoother predictions than :class:`~sklearn.ensemble.HistGradientBoostingRegressor`.\n\nNow, we will look at the partial dependence plots for the categorical features.\n\nWe observe that the spring season is the lowest bar for the season feature. With the\nweather feature, the rain category is the lowest bar. Regarding the hour feature,\nwe see two peaks around the 7 am and 6 pm. These findings are in line with the\nthe observations we made earlier on the dataset.\n\nHowever, it is worth noting that we are creating potential meaningless\nsynthetic samples if features are correlated.\n\n#### ICE vs. PDP\nPDP is an average of the marginal effects of the features. We are averaging the\nresponse of all samples of the provided set. Thus, some effects could be hidden. In\nthis regard, it is possible to plot each individual response. This representation is\ncalled the Individual Effect Plot (ICE). In the plot below, we plot 50 randomly\nselected ICEs for the temperature and humidity features.\n\n"
+"### Analysis of the plots\n\nWe will first look at the PDPs for the numerical features. For both models, the\ngeneral trend of the PDP of the temperature is that the number of bike rentals is\nincreasing with temperature. We can make a similar analysis but with the opposite\ntrend for the humidity features. The number of bike rentals is decreasing when the\nhumidity increases. Finally, we see the same trend for the wind speed feature. The\nnumber of bike rentals is decreasing when the wind speed is increasing for both\nmodels. We also observe that :class:`~sklearn.neural_network.MLPRegressor` has much\nsmoother predictions than :class:`~sklearn.ensemble.HistGradientBoostingRegressor`.\n\nNow, we will look at the partial dependence plots for the categorical features.\n\nWe observe that the spring season is the lowest bar for the season feature. With the\nweather feature, the rain category is the lowest bar. Regarding the hour feature,\nwe see two peaks around the 7 am and 6 pm. These findings are in line with the\nthe observations we made earlier on the dataset.\n\nHowever, it is worth noting that we are creating potential meaningless\nsynthetic samples if features are correlated.\n\n### ICE vs. PDP\nPDP is an average of the marginal effects of the features. We are averaging the\nresponse of all samples of the provided set. Thus, some effects could be hidden. In\nthis regard, it is possible to plot each individual response. This representation is\ncalled the Individual Effect Plot (ICE). In the plot below, we plot 50 randomly\nselected ICEs for the temperature and humidity features.\n\n"
 ]
 },
 {
@@ -375,7 +375,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"#### 3D representation\n\nLet's make the same partial dependence plot for the 2 features interaction,\nthis time in 3 dimensions.\n\n"
+"### 3D representation\n\nLet's make the same partial dependence plot for the 2 features interaction,\nthis time in 3 dimensions.\n\n"
 ]
 },
 {

dev/_downloads/bcd609cfe29c9da1f51c848e18b89c76/plot_partial_dependence.py

Lines changed: 5 additions & 5 deletions
@@ -198,7 +198,7 @@
 # features and individual conditional expectation (ICE).
 #
 # Multi-layer perceptron
-# """"""""""""""""""""""
+# ~~~~~~~~~~~~~~~~~~~~~~
 #
 # Let's fit a :class:`~sklearn.neural_network.MLPRegressor` and compute
 # single-variable partial dependence plots.
@@ -278,7 +278,7 @@

 # %%
 # Gradient boosting
-# """""""""""""""""
+# ~~~~~~~~~~~~~~~~~
 #
 # Let's now fit a :class:`~sklearn.ensemble.HistGradientBoostingRegressor` and
 # compute the partial dependence on the same features. We also use the
@@ -330,7 +330,7 @@

 # %%
 # Analysis of the plots
-# """""""""""""""""""""
+# ~~~~~~~~~~~~~~~~~~~~~
 #
 # We will first look at the PDPs for the numerical features. For both models, the
 # general trend of the PDP of the temperature is that the number of bike rentals is
@@ -352,7 +352,7 @@
 # synthetic samples if features are correlated.
 #
 # ICE vs. PDP
-# """""""""""
+# ~~~~~~~~~~~
 # PDP is an average of the marginal effects of the features. We are averaging the
 # response of all samples of the provided set. Thus, some effects could be hidden. In
 # this regard, it is possible to plot each individual response. This representation is
@@ -521,7 +521,7 @@

 # %%
 # 3D representation
-# """""""""""""""""
+# ~~~~~~~~~~~~~~~~~
 #
 # Let's make the same partial dependence plot for the 2 features interaction,
 # this time in 3 dimensions.

dev/_downloads/scikit-learn-docs.zip

-15.9 KB
Binary file not shown.

0 commit comments
