
Commit 4235c9a

Pushing the docs to dev/ for branch: main, commit ca0862a9dbc5dadab2ccd30828de6de0c6f1f69d
1 parent 5484896 commit 4235c9a

File tree

1,221 files changed: +4365 −4320 lines changed


dev/_downloads/6d4f620ec6653356eb970c2a6ed62081/plot_calibration_curve.ipynb

Lines changed: 1 addition & 1 deletion

@@ -80,7 +80,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Uncalibrated :class:`~sklearn.naive_bayes.GaussianNB` is poorly calibrated\nbecause of\nthe redundant features which violate the assumption of feature-independence\nand result in an overly confident classifier, which is indicated by the\ntypical transposed-sigmoid curve. Calibration of the probabilities of\n:class:`~sklearn.naive_bayes.GaussianNB` with `isotonic` can fix\nthis issue as can be seen from the nearly diagonal calibration curve.\n`Sigmoid regression <sigmoid_regressor>` also improves calibration\nslightly,\nalbeit not as strongly as the non-parametric isotonic regression. This can be\nattributed to the fact that we have plenty of calibration data such that the\ngreater flexibility of the non-parametric model can be exploited.\n\nBelow we will make a quantitative analysis considering several classification\nmetrics: `brier_score_loss`, `log_loss`,\n`precision, recall, F1 score <precision_recall_f_measure_metrics>` and\n`ROC AUC <roc_metrics>`.\n\n"
+"Uncalibrated :class:`~sklearn.naive_bayes.GaussianNB` is poorly calibrated\nbecause of\nthe redundant features which violate the assumption of feature-independence\nand result in an overly confident classifier, which is indicated by the\ntypical transposed-sigmoid curve. Calibration of the probabilities of\n:class:`~sklearn.naive_bayes.GaussianNB` with `isotonic` can fix\nthis issue as can be seen from the nearly diagonal calibration curve.\n:ref:sigmoid regression `<sigmoid_regressor>` also improves calibration\nslightly,\nalbeit not as strongly as the non-parametric isotonic regression. This can be\nattributed to the fact that we have plenty of calibration data such that the\ngreater flexibility of the non-parametric model can be exploited.\n\nBelow we will make a quantitative analysis considering several classification\nmetrics: `brier_score_loss`, `log_loss`,\n`precision, recall, F1 score <precision_recall_f_measure_metrics>` and\n`ROC AUC <roc_metrics>`.\n\n"
 ]
 },
 {
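The notebook text above compares uncalibrated `GaussianNB` against isotonic and sigmoid calibration. As a hedged sketch of that comparison in code — the dataset shape, `cv=5`, and all other parameters here are illustrative assumptions, not taken from the example itself:

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.metrics import brier_score_loss
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic data with redundant features, which violate GaussianNB's
# feature-independence assumption (illustrative parameters).
X, y = make_classification(
    n_samples=2000, n_features=20, n_informative=2, n_redundant=10, random_state=42
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=42
)

# Uncalibrated baseline vs. isotonic and sigmoid (Platt) calibration.
models = {
    "uncalibrated": GaussianNB().fit(X_train, y_train),
    "isotonic": CalibratedClassifierCV(
        GaussianNB(), method="isotonic", cv=5
    ).fit(X_train, y_train),
    "sigmoid": CalibratedClassifierCV(
        GaussianNB(), method="sigmoid", cv=5
    ).fit(X_train, y_train),
}

# Lower Brier score indicates better-calibrated probabilities.
scores = {
    name: brier_score_loss(y_test, clf.predict_proba(X_test)[:, 1])
    for name, clf in models.items()
}
print(scores)
```

On data like this, with plenty of calibration samples, the non-parametric isotonic fit is expected to improve the Brier score more than the two-parameter sigmoid, matching the passage above.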

dev/_downloads/85db957603c93bd3e0a4265ea6565b13/plot_calibration_curve.py

Lines changed: 1 addition & 1 deletion

@@ -124,7 +124,7 @@
 # typical transposed-sigmoid curve. Calibration of the probabilities of
 # :class:`~sklearn.naive_bayes.GaussianNB` with :ref:`isotonic` can fix
 # this issue as can be seen from the nearly diagonal calibration curve.
-# :ref:`Sigmoid regression <sigmoid_regressor>` also improves calibration
+# :ref:sigmoid regression `<sigmoid_regressor>` also improves calibration
 # slightly,
 # albeit not as strongly as the non-parametric isotonic regression. This can be
 # attributed to the fact that we have plenty of calibration data such that the

dev/_downloads/9b5ca5a413df494778642d75caeb33d7/plot_omp.py

Lines changed: 4 additions & 4 deletions

@@ -41,7 +41,7 @@
 plt.subplot(4, 1, 1)
 plt.xlim(0, 512)
 plt.title("Sparse signal")
-plt.stem(idx, w[idx])
+plt.stem(idx, w[idx], use_line_collection=True)

 # plot the noise-free reconstruction
 omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero_coefs)
@@ -51,7 +51,7 @@
 plt.subplot(4, 1, 2)
 plt.xlim(0, 512)
 plt.title("Recovered signal from noise-free measurements")
-plt.stem(idx_r, coef[idx_r])
+plt.stem(idx_r, coef[idx_r], use_line_collection=True)

 # plot the noisy reconstruction
 omp.fit(X, y_noisy)
@@ -60,7 +60,7 @@
 plt.subplot(4, 1, 3)
 plt.xlim(0, 512)
 plt.title("Recovered signal from noisy measurements")
-plt.stem(idx_r, coef[idx_r])
+plt.stem(idx_r, coef[idx_r], use_line_collection=True)

 # plot the noisy reconstruction with number of non-zeros set by CV
 omp_cv = OrthogonalMatchingPursuitCV()
@@ -70,7 +70,7 @@
 plt.subplot(4, 1, 4)
 plt.xlim(0, 512)
 plt.title("Recovered signal from noisy measurements with CV")
-plt.stem(idx_r, coef[idx_r])
+plt.stem(idx_r, coef[idx_r], use_line_collection=True)

 plt.subplots_adjust(0.06, 0.04, 0.94, 0.90, 0.20, 0.38)
 plt.suptitle("Sparse signal recovery with Orthogonal Matching Pursuit", fontsize=16)
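The hunks above only touch the plotting calls (`use_line_collection` is a Matplotlib `stem` keyword whose availability is version-dependent, so whether these lines run depends on the installed Matplotlib). Leaving plotting aside, the recovery step those plots visualize can be sketched as follows — the dictionary size, sparsity level, and variable names mirroring `idx_r`/`coef` are hypothetical assumptions, not the example's actual setup:

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.RandomState(0)
n_samples, n_features, n_nonzero = 50, 100, 5  # hypothetical sizes

# Random dictionary with unit-norm columns.
X = rng.randn(n_samples, n_features)
X /= np.linalg.norm(X, axis=0)

# Sparse ground-truth coefficient vector.
w = np.zeros(n_features)
support = rng.choice(n_features, n_nonzero, replace=False)
w[support] = rng.randn(n_nonzero)
y = X @ w  # noise-free measurements

# OMP greedily selects atoms until n_nonzero_coefs are active.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero)
omp.fit(X, y)
coef = omp.coef_
idx_r = np.flatnonzero(coef)  # recovered support, analogous to idx_r above
```

In the noise-free case OMP typically recovers the exact support; the full example additionally refits on noisy measurements and uses `OrthogonalMatchingPursuitCV` to choose the sparsity level by cross-validation.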
