Commit ca017c8 (parent ed631b1)

Pushing the docs to dev/ for branch: main, commit a256e262efd5ff30edd0ab8985f02e921fbc937c

File tree

1,302 files changed: +5,252 additions, -5,252 deletions


dev/.buildinfo

1 addition, 1 deletion

@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: a602018b4c2c14238f7cef80bec03477
+config: f278ed75a530e9418aa5854aa4411597
 tags: 645f666f9bcd5a90fca523b33c5a78b7
(2 binary files changed; contents not shown)

dev/_downloads/95e2652922af032381167a5aa13f2b36/plot_f_test_vs_mi.ipynb

1 addition, 1 deletion

@@ -4,7 +4,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"\n# Comparison of F-test and mutual information\n\nThis example illustrates the differences between univariate F-test statistics\nand mutual information.\n\nWe consider 3 features x_1, x_2, x_3 distributed uniformly over [0, 1], the\ntarget depends on them as follows:\n\ny = x_1 + sin(6 * pi * x_2) + 0.1 * N(0, 1), that is the third features is\ncompletely irrelevant.\n\nThe code below plots the dependency of y against individual x_i and normalized\nvalues of univariate F-tests statistics and mutual information.\n\nAs F-test captures only linear dependency, it rates x_1 as the most\ndiscriminative feature. On the other hand, mutual information can capture any\nkind of dependency between variables and it rates x_2 as the most\ndiscriminative feature, which probably agrees better with our intuitive\nperception for this example. Both methods correctly marks x_3 as irrelevant.\n"
+"\n# Comparison of F-test and mutual information\n\nThis example illustrates the differences between univariate F-test statistics\nand mutual information.\n\nWe consider 3 features x_1, x_2, x_3 distributed uniformly over [0, 1], the\ntarget depends on them as follows:\n\ny = x_1 + sin(6 * pi * x_2) + 0.1 * N(0, 1), that is the third feature is\ncompletely irrelevant.\n\nThe code below plots the dependency of y against individual x_i and normalized\nvalues of univariate F-tests statistics and mutual information.\n\nAs F-test captures only linear dependency, it rates x_1 as the most\ndiscriminative feature. On the other hand, mutual information can capture any\nkind of dependency between variables and it rates x_2 as the most\ndiscriminative feature, which probably agrees better with our intuitive\nperception for this example. Both methods correctly mark x_3 as irrelevant.\n"
 ]
 },
 {

dev/_downloads/b15a0f93878767ffa709315b9ebf8a94/plot_f_test_vs_mi.py

2 additions, 2 deletions

@@ -9,7 +9,7 @@
 We consider 3 features x_1, x_2, x_3 distributed uniformly over [0, 1], the
 target depends on them as follows:
 
-y = x_1 + sin(6 * pi * x_2) + 0.1 * N(0, 1), that is the third features is
+y = x_1 + sin(6 * pi * x_2) + 0.1 * N(0, 1), that is the third feature is
 completely irrelevant.
 
 
 The code below plots the dependency of y against individual x_i and normalized
@@ -19,7 +19,7 @@
 discriminative feature. On the other hand, mutual information can capture any
 kind of dependency between variables and it rates x_2 as the most
 discriminative feature, which probably agrees better with our intuitive
-perception for this example. Both methods correctly marks x_3 as irrelevant.
+perception for this example. Both methods correctly mark x_3 as irrelevant.
 
 """
 
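The docstring being corrected above describes a concrete experiment, and it can be sketched without scikit-learn. The following pure-Python sketch is an illustrative assumption, not the example's actual code: the real example uses sklearn.feature_selection.f_regression and mutual_info_regression, whereas here the F-statistic is derived from the univariate Pearson correlation (F = r^2 / (1 - r^2) * (n - 2)) and mutual information is estimated with a crude 2-D histogram (the bin count is an arbitrary choice).

```python
import math
import random

rng = random.Random(0)
n = 5000

# Three features distributed uniformly over [0, 1]; only x_1 and x_2 matter:
# y = x_1 + sin(6 * pi * x_2) + 0.1 * N(0, 1)
X = [[rng.random() for _ in range(n)] for _ in range(3)]
y = [X[0][i] + math.sin(6 * math.pi * X[1][i]) + 0.1 * rng.gauss(0, 1)
     for i in range(n)]

def pearson(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a)
    vb = sum((v - mb) ** 2 for v in b)
    return cov / math.sqrt(va * vb)

def f_stat(a, b):
    # Univariate regression F-statistic from the squared correlation.
    r = pearson(a, b)
    return r * r / (1 - r * r) * (len(a) - 2)

def binned_mi(a, b, bins=10):
    # Rough mutual-information estimate (in nats) from a 2-D histogram;
    # a is already in [0, 1), b is rescaled to [0, 1].
    lo, hi = min(b), max(b)
    joint = {}
    for u, v in zip(a, b):
        i = min(int(u * bins), bins - 1)
        j = min(int((v - lo) / (hi - lo) * bins), bins - 1)
        joint[(i, j)] = joint.get((i, j), 0) + 1
    pa, pb = [0.0] * bins, [0.0] * bins
    for (i, j), c in joint.items():
        pa[i] += c / len(a)
        pb[j] += c / len(a)
    return sum(p * math.log(p / (pa[i] * pb[j]))
               for (i, j), c in joint.items()
               for p in [c / len(a)])

f = [f_stat(x, y) for x in X]
mi = [binned_mi(x, y) for x in X]
print("F: ", [round(v, 1) for v in f])   # x_1 gets the largest F-statistic
print("MI:", [round(v, 3) for v in mi])  # x_2 gets the largest MI estimate
```

This reproduces the qualitative claim in the corrected docstring: the F-test, seeing only linear dependence, ranks x_1 first; the mutual-information estimate ranks x_2 first; both score x_3 lowest.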
