
Commit

Pushing the docs to dev/ for branch: main, commit deb2ac4c13ca0faf1b028a59f793e474b696feff
dirty-cat-ci committed Jun 7, 2024
1 parent 8832275 commit 6535818
Showing 72 changed files with 1,227 additions and 1,186 deletions.
9 changes: 5 additions & 4 deletions dev/CHANGES.html
Original file line number Diff line number Diff line change
@@ -494,10 +494,11 @@ <h3>Major changes<a class="headerlink" href="#major-changes" title="Link to this
transformations (ii) in <code class="docutils literal notranslate"><span class="pre">specific_transformers</span></code> the same column may not be
used twice (go through 2 different transformers).
<a class="reference external" href="https://github.com/skrub-data/skrub/pull/902">#902</a> by <a class="reference external" href="https://github.com/jeromedockes">Jérôme Dockès</a>.</p></li>
<li><p>The <a class="reference internal" href="generated/skrub.GapEncoder.html#skrub.GapEncoder" title="skrub.GapEncoder"><code class="xref py py-class docutils literal notranslate"><span class="pre">GapEncoder</span></code></a> is now a single-column transformer: its <code class="docutils literal notranslate"><span class="pre">fit</span></code>,
<code class="docutils literal notranslate"><span class="pre">fit_transform</span></code> and <code class="docutils literal notranslate"><span class="pre">transform</span></code> methods accept a single column (a pandas
or polars Series). Dataframes and numpy arrays are not accepted.
<a class="reference external" href="https://github.com/skrub-data/skrub/pull/920">#920</a> by <a class="reference external" href="https://github.com/jeromedockes">Jérôme Dockès</a>.</p></li>
<li><p>The <a class="reference internal" href="generated/skrub.GapEncoder.html#skrub.GapEncoder" title="skrub.GapEncoder"><code class="xref py py-class docutils literal notranslate"><span class="pre">GapEncoder</span></code></a> and <a class="reference internal" href="generated/skrub.MinHashEncoder.html#skrub.MinHashEncoder" title="skrub.MinHashEncoder"><code class="xref py py-class docutils literal notranslate"><span class="pre">MinHashEncoder</span></code></a> are now single-column
transformers: their <code class="docutils literal notranslate"><span class="pre">fit</span></code>, <code class="docutils literal notranslate"><span class="pre">fit_transform</span></code> and <code class="docutils literal notranslate"><span class="pre">transform</span></code> methods
accept a single column (a pandas or polars Series). Dataframes and numpy
arrays are not accepted.
<a class="reference external" href="https://github.com/skrub-data/skrub/pull/920">#920</a> and <a class="reference external" href="https://github.com/skrub-data/skrub/pull/923">#923</a> by <a class="reference external" href="https://github.com/jeromedockes">Jérôme Dockès</a>.</p></li>
<li><p>Added the <a class="reference internal" href="generated/skrub.MultiAggJoiner.html#skrub.MultiAggJoiner" title="skrub.MultiAggJoiner"><code class="xref py py-class docutils literal notranslate"><span class="pre">MultiAggJoiner</span></code></a> that allows augmenting a main table with
multiple auxiliary tables. <a class="reference external" href="https://github.com/skrub-data/skrub/pull/876">#876</a> by <a class="reference external" href="https://github.com/TheooJ">Théo Jolivet</a>.</p></li>
<li><p><a class="reference internal" href="generated/skrub.AggJoiner.html#skrub.AggJoiner" title="skrub.AggJoiner"><code class="xref py py-class docutils literal notranslate"><span class="pre">AggJoiner</span></code></a> now only accepts a single table as an input, and some of its
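The single-column change in the entry above hinges on the pandas (and polars) distinction between selecting one column as a Series and as a one-column dataframe. A minimal sketch with hypothetical toy data (not taken from this commit):

```python
import pandas as pd

df = pd.DataFrame({"Platform": ["Wii", "NES", "GB"], "Year": [2006, 1985, 1989]})

# df["Platform"] yields a 1-D Series -- the input that the single-column
# GapEncoder/MinHashEncoder now expect in fit/fit_transform/transform.
# df[["Platform"]] yields a 2-D one-column DataFrame -- no longer accepted.
print(type(df["Platform"]).__name__)    # Series
print(type(df[["Platform"]]).__name__)  # DataFrame
```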
@@ -188,7 +188,7 @@
encoder = make_column_transformer(
("passthrough", ["Year"]),
(ohe, ["Genre"]),
(min_hash, ["Platform"]),
(min_hash, "Platform"),
remainder="drop",
)

@@ -277,7 +277,7 @@
("passthrough", emb_columns2),
("passthrough", ["Year"]),
(ohe, ["Genre"]),
(min_hash, ["Platform"]),
(min_hash, "Platform"),
remainder="drop",
)

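The `["Platform"]` to `"Platform"` edits in the hunks above follow scikit-learn's documented column-selector semantics: in a ColumnTransformer, a list selector hands the transformer a 2-D dataframe slice, while a plain string hands it a 1-D Series, which is what the single-column MinHashEncoder now requires. An illustrative sketch with hypothetical data (FunctionTransformer is used here only to record the dimensionality each selector produces, not as a stand-in for the encoder):

```python
import numpy as np
import pandas as pd
from sklearn.compose import make_column_transformer
from sklearn.preprocessing import FunctionTransformer

df = pd.DataFrame({"Platform": ["Wii", "NES", "GB"], "Year": [2006, 1985, 1989]})

seen = []  # ndim of each input the inner transformer receives

def record(X):
    seen.append(np.asarray(X).ndim)
    # Return a 2-D array so ColumnTransformer can stack the outputs.
    return np.asarray(X).reshape(len(X), -1)

# List selector ["Platform"] -> 2-D DataFrame slice;
# string selector "Platform"  -> 1-D Series.
for selector in (["Platform"], "Platform"):
    ct = make_column_transformer(
        (FunctionTransformer(record), selector),
        remainder="drop",
    )
    ct.fit_transform(df)

print(seen)  # [2, 1]
```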
@@ -227,7 +227,7 @@
},
"outputs": [],
"source": [
"from sklearn.compose import make_column_transformer\nfrom sklearn.preprocessing import OneHotEncoder\n\nfrom skrub import MinHashEncoder\n\nmin_hash = MinHashEncoder(n_components=100)\nohe = OneHotEncoder(handle_unknown=\"ignore\", sparse_output=False)\n\nencoder = make_column_transformer(\n (\"passthrough\", [\"Year\"]),\n (ohe, [\"Genre\"]),\n (min_hash, [\"Platform\"]),\n remainder=\"drop\",\n)"
"from sklearn.compose import make_column_transformer\nfrom sklearn.preprocessing import OneHotEncoder\n\nfrom skrub import MinHashEncoder\n\nmin_hash = MinHashEncoder(n_components=100)\nohe = OneHotEncoder(handle_unknown=\"ignore\", sparse_output=False)\n\nencoder = make_column_transformer(\n (\"passthrough\", [\"Year\"]),\n (ohe, [\"Genre\"]),\n (min_hash, \"Platform\"),\n remainder=\"drop\",\n)"
]
},
{
@@ -356,7 +356,7 @@
},
"outputs": [],
"source": [
"encoder3 = make_column_transformer(\n (\"passthrough\", emb_columns),\n (\"passthrough\", emb_columns2),\n (\"passthrough\", [\"Year\"]),\n (ohe, [\"Genre\"]),\n (min_hash, [\"Platform\"]),\n remainder=\"drop\",\n)"
"encoder3 = make_column_transformer(\n (\"passthrough\", emb_columns),\n (\"passthrough\", emb_columns2),\n (\"passthrough\", [\"Year\"]),\n (ohe, [\"Genre\"]),\n (min_hash, \"Platform\"),\n remainder=\"drop\",\n)"
]
},
{
Binary file modified dev/_images/sphx_glr_01_encodings_001.png
Binary file modified dev/_images/sphx_glr_01_encodings_thumb.png
Binary file modified dev/_images/sphx_glr_08_join_aggregation_003.png
Binary file modified dev/_images/sphx_glr_09_interpolation_join_001.png
Binary file modified dev/_images/sphx_glr_09_interpolation_join_002.png
Binary file modified dev/_images/sphx_glr_09_interpolation_join_003.png
Binary file modified dev/_images/sphx_glr_09_interpolation_join_thumb.png
9 changes: 5 additions & 4 deletions dev/_sources/CHANGES.rst.txt
@@ -22,10 +22,11 @@ Major changes
used twice (go through 2 different transformers).
:pr:`902` by :user:`Jérôme Dockès <jeromedockes>`.

* The :class:`GapEncoder` is now a single-column transformer: its ``fit``,
``fit_transform`` and ``transform`` methods accept a single column (a pandas
or polars Series). Dataframes and numpy arrays are not accepted.
:pr:`920` by :user:`Jérôme Dockès <jeromedockes>`.
* The :class:`GapEncoder` and :class:`MinHashEncoder` are now single-column
transformers: their ``fit``, ``fit_transform`` and ``transform`` methods
accept a single column (a pandas or polars Series). Dataframes and numpy
arrays are not accepted.
:pr:`920` and :pr:`923` by :user:`Jérôme Dockès <jeromedockes>`.

* Added the :class:`MultiAggJoiner` that allows augmenting a main table with
multiple auxiliary tables. :pr:`876` by :user:`Théo Jolivet <TheooJ>`.
