|
15 | 15 | "cell_type": "markdown",
|
16 | 16 | "metadata": {},
|
17 | 17 | "source": [
|
18 |
| - "\n# Swiss Roll reduction with LLE\n\nAn illustration of Swiss Roll reduction\nwith locally linear embedding\n" |
| 18 | + "\n# Swiss Roll and Swiss-Hole Reduction\nThis notebook compares two popular non-linear dimensionality reduction\ntechniques, t-distributed Stochastic Neighbor Embedding (t-SNE) and\nLocally Linear Embedding (LLE), on the classic Swiss Roll dataset. Then,\nwe explore how both techniques cope with the addition of a hole in the\ndata.\n" |
| 19 | + ] |
| 20 | + }, |
| 21 | + { |
| 22 | + "cell_type": "markdown", |
| 23 | + "metadata": {}, |
| 24 | + "source": [ |
| 25 | + "## Swiss Roll\n\nWe start by generating the Swiss Roll dataset.\n\n" |
| 26 | + ] |
| 27 | + }, |
| 28 | + { |
| 29 | + "cell_type": "code", |
| 30 | + "execution_count": null, |
| 31 | + "metadata": { |
| 32 | + "collapsed": false |
| 33 | + }, |
| 34 | + "outputs": [], |
| 35 | + "source": [ |
| 36 | + "import matplotlib.pyplot as plt\nfrom sklearn import manifold, datasets\n\n\nsr_points, sr_color = datasets.make_swiss_roll(n_samples=1500, random_state=0)" |
| 37 | + ] |
| 38 | + }, |
| 39 | + { |
| 40 | + "cell_type": "markdown", |
| 41 | + "metadata": {}, |
| 42 | + "source": [ |
| 43 | + "Now, let's take a look at our data:\n\n" |
| 44 | + ] |
| 45 | + }, |
| 46 | + { |
| 47 | + "cell_type": "code", |
| 48 | + "execution_count": null, |
| 49 | + "metadata": { |
| 50 | + "collapsed": false |
| 51 | + }, |
| 52 | + "outputs": [], |
| 53 | + "source": [ |
| 54 | + "fig = plt.figure(figsize=(8, 6))\nax = fig.add_subplot(111, projection=\"3d\")\nfig.add_axes(ax)\nax.scatter(\n sr_points[:, 0], sr_points[:, 1], sr_points[:, 2], c=sr_color, s=50, alpha=0.8\n)\nax.set_title(\"Swiss Roll in Ambient Space\")\nax.view_init(azim=-66, elev=12)\n_ = ax.text2D(0.8, 0.05, s=\"n_samples=1500\", transform=ax.transAxes)" |
| 55 | + ] |
| 56 | + }, |
| 57 | + { |
| 58 | + "cell_type": "markdown", |
| 59 | + "metadata": {}, |
| 60 | + "source": [ |
| 61 | + "Computing the LLE and t-SNE embeddings, we find that LLE seems to unroll the\nSwiss Roll quite effectively. t-SNE, on the other hand, preserves the general\nstructure of the data but poorly represents the continuous nature of our\noriginal data; instead, it seems to unnecessarily clump sections of points\ntogether.\n\n" |
| 62 | + ] |
| 63 | + }, |
| 64 | + { |
| 65 | + "cell_type": "code", |
| 66 | + "execution_count": null, |
| 67 | + "metadata": { |
| 68 | + "collapsed": false |
| 69 | + }, |
| 70 | + "outputs": [], |
| 71 | + "source": [ |
| 72 | + "sr_lle, sr_err = manifold.locally_linear_embedding(\n sr_points, n_neighbors=12, n_components=2\n)\n\nsr_tsne = manifold.TSNE(\n n_components=2, learning_rate=\"auto\", perplexity=40, init=\"pca\", random_state=0\n).fit_transform(sr_points)\n\nfig, axs = plt.subplots(figsize=(8, 8), nrows=2)\naxs[0].scatter(sr_lle[:, 0], sr_lle[:, 1], c=sr_color)\naxs[0].set_title(\"LLE Embedding of Swiss Roll\")\naxs[1].scatter(sr_tsne[:, 0], sr_tsne[:, 1], c=sr_color)\n_ = axs[1].set_title(\"t-SNE Embedding of Swiss Roll\")" |
| 73 | + ] |
| 74 | + }, |
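| | + {
| | + "cell_type": "markdown",
| | + "metadata": {},
| | + "source": [
| | + "As a quick sanity check, we can also print the reconstruction error\nreturned by `locally_linear_embedding`:\n\n"
| | + ]
| | + },
| | + {
| | + "cell_type": "code",
| | + "execution_count": null,
| | + "metadata": {
| | + "collapsed": false
| | + },
| | + "outputs": [],
| | + "source": [
| | + "# locally_linear_embedding also returns the reconstruction error of the fit.\nprint(\"LLE reconstruction error on the Swiss Roll: %g\" % sr_err)"
| | + ]
| | + },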
| 75 | + { |
| 76 | + "cell_type": "markdown", |
| 77 | + "metadata": {}, |
| 78 | + "source": [ |
| 79 | + "<div class=\"alert alert-info\"><h4>Note</h4><p>LLE seems to be stretching the points from the center (purple)\n  of the Swiss Roll. However, we observe that this is simply a byproduct\n  of how the data was generated. There is a higher density of points near the\n  center of the roll, which ultimately affects how LLE reconstructs the\n  data in a lower dimension. The quick neighbor-distance check below\n  illustrates this.</p></div>\n\n" |
| 80 | + ] |
| 81 | + }, |
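| | + {
| | + "cell_type": "markdown",
| | + "metadata": {},
| | + "source": [
| | + "To see this density effect, here is a quick, minimal sketch using\n`sklearn.neighbors.NearestNeighbors` that plots each point's mean distance\nto its ten nearest neighbors against its position along the roll\n(`sr_color`). Distances should be smaller near the center:\n\n"
| | + ]
| | + },
| | + {
| | + "cell_type": "code",
| | + "execution_count": null,
| | + "metadata": {
| | + "collapsed": false
| | + },
| | + "outputs": [],
| | + "source": [
| | + "from sklearn.neighbors import NearestNeighbors\n\n# Illustrative check: mean distance from each point to its 10 nearest\n# neighbors (column 0 of the result is the point itself, so we skip it).\ndistances, _ = NearestNeighbors(n_neighbors=11).fit(sr_points).kneighbors(sr_points)\nmean_dist = distances[:, 1:].mean(axis=1)\n\nfig, ax = plt.subplots(figsize=(8, 4))\nax.scatter(sr_color, mean_dist, s=10, alpha=0.5)\nax.set_xlabel(\"position along the roll (sr_color)\")\n_ = ax.set_ylabel(\"mean distance to 10 nearest neighbors\")"
| | + ]
| | + },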
| 82 | + { |
| 83 | + "cell_type": "markdown", |
| 84 | + "metadata": {}, |
| 85 | + "source": [ |
| 86 | + "## Swiss-Hole\n\nNow let's take a look at how both algorithms deal with the addition of a\nhole to the data. First, we generate the Swiss-Hole dataset and plot it:\n\n" |
19 | 87 | ]
|
20 | 88 | },
|
21 | 89 | {
|
|
26 | 94 | },
|
27 | 95 | "outputs": [],
|
28 | 96 | "source": [
|
29 |
| - "# Author: Fabian Pedregosa -- < [email protected]>\n# License: BSD 3 clause (C) INRIA 2011\n\nimport matplotlib.pyplot as plt\n\n# This import is needed to modify the way figure behaves\nfrom mpl_toolkits.mplot3d import Axes3D\n\nAxes3D\n\n# ----------------------------------------------------------------------\n# Locally linear embedding of the swiss roll\n\nfrom sklearn import manifold, datasets\n\nX, color = datasets.make_swiss_roll(n_samples=1500)\n\nprint(\"Computing LLE embedding\")\nX_r, err = manifold.locally_linear_embedding(X, n_neighbors=12, n_components=2)\nprint(\"Done. Reconstruction error: %g\" % err)\n\n# ----------------------------------------------------------------------\n# Plot result\n\nfig = plt.figure()\n\nax = fig.add_subplot(211, projection=\"3d\")\nax.scatter(X[:, 0], X[:, 1], X[:, 2], c=color, cmap=plt.cm.Spectral)\n\nax.set_title(\"Original data\")\nax = fig.add_subplot(212)\nax.scatter(X_r[:, 0], X_r[:, 1], c=color, cmap=plt.cm.Spectral)\nplt.axis(\"tight\")\nplt.xticks([]), plt.yticks([])\nplt.title(\"Projected data\")\nplt.show()" |
| 97 | + "sh_points, sh_color = datasets.make_swiss_roll(\n n_samples=1500, hole=True, random_state=0\n)\n\nfig = plt.figure(figsize=(8, 6))\nax = fig.add_subplot(111, projection=\"3d\")\nfig.add_axes(ax)\nax.scatter(\n sh_points[:, 0], sh_points[:, 1], sh_points[:, 2], c=sh_color, s=50, alpha=0.8\n)\nax.set_title(\"Swiss-Hole in Ambient Space\")\nax.view_init(azim=-66, elev=12)\n_ = ax.text2D(0.8, 0.05, s=\"n_samples=1500\", transform=ax.transAxes)" |
| 98 | + ] |
| 99 | + }, |
| 100 | + { |
| 101 | + "cell_type": "markdown", |
| 102 | + "metadata": {}, |
| 103 | + "source": [ |
| 104 | + "Computing the LLE and t-SNE embeddings, we obtain results similar to those\nfor the Swiss Roll. LLE very capably unrolls the data and even preserves\nthe hole. t-SNE again seems to clump sections of points together, but we\nnote that it preserves the general topology of the original data.\n\n" |
| 105 | + ] |
| 106 | + }, |
| 107 | + { |
| 108 | + "cell_type": "code", |
| 109 | + "execution_count": null, |
| 110 | + "metadata": { |
| 111 | + "collapsed": false |
| 112 | + }, |
| 113 | + "outputs": [], |
| 114 | + "source": [ |
| 115 | + "sh_lle, sh_err = manifold.locally_linear_embedding(\n sh_points, n_neighbors=12, n_components=2\n)\n\nsh_tsne = manifold.TSNE(\n n_components=2, learning_rate=\"auto\", perplexity=40, init=\"random\", random_state=0\n).fit_transform(sh_points)\n\nfig, axs = plt.subplots(figsize=(8, 8), nrows=2)\naxs[0].scatter(sh_lle[:, 0], sh_lle[:, 1], c=sh_color)\naxs[0].set_title(\"LLE Embedding of Swiss-Hole\")\naxs[1].scatter(sh_tsne[:, 0], sh_tsne[:, 1], c=sh_color)\n_ = axs[1].set_title(\"t-SNE Embedding of Swiss-Hole\")" |
| 116 | + ] |
| 117 | + }, |
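| | + {
| | + "cell_type": "markdown",
| | + "metadata": {},
| | + "source": [
| | + "Since t-SNE is sensitive to its hyperparameters, the cell below sketches a\nsmall scan over perplexity on the Swiss-Hole data. The values chosen here\nare arbitrary and only meant to illustrate the effect:\n\n"
| | + ]
| | + },
| | + {
| | + "cell_type": "code",
| | + "execution_count": null,
| | + "metadata": {
| | + "collapsed": false
| | + },
| | + "outputs": [],
| | + "source": [
| | + "# Illustrative scan over a few arbitrary perplexity values.\nperplexities = [10, 30, 50]\n\nfig, axs = plt.subplots(figsize=(12, 4), ncols=len(perplexities))\nfor ax, perplexity in zip(axs, perplexities):\n    embedding = manifold.TSNE(\n        n_components=2,\n        learning_rate=\"auto\",\n        perplexity=perplexity,\n        init=\"random\",\n        random_state=0,\n    ).fit_transform(sh_points)\n    ax.scatter(embedding[:, 0], embedding[:, 1], c=sh_color, s=10)\n    _ = ax.set_title(f\"perplexity={perplexity}\")"
| | + ]
| | + },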
| 118 | + { |
| 119 | + "cell_type": "markdown", |
| 120 | + "metadata": {}, |
| 121 | + "source": [ |
| 122 | + "## Concluding remarks\n\nWe note that t-SNE is sensitive to its hyperparameters; as the perplexity\nscan above illustrates, better results could probably be obtained by tuning\nthem more carefully.\n\nWe also observe that, as seen in the \"Manifold learning on\nhandwritten digits\" example, t-SNE generally performs better than LLE on\nreal-world data.\n\n" |
30 | 123 | ]
|
31 | 124 | }
|
32 | 125 | ],
|
|