Deploying to gh-pages from @ b76cab3 🚀
TimotheeMickus committed Sep 25, 2023
1 parent 13bc2b3 commit 4ec2f17
Showing 16 changed files with 32 additions and 2,299 deletions.
5 changes: 0 additions & 5 deletions _modules/index.html
@@ -175,14 +175,9 @@
<h1>All modules for which code is available</h1>
<ul><li><a href="mammoth/models/model.html">mammoth.models.model</a></li>
<li><a href="mammoth/modules/average_attn.html">mammoth.modules.average_attn</a></li>
<li><a href="mammoth/modules/conv_multi_step_attention.html">mammoth.modules.conv_multi_step_attention</a></li>
<li><a href="mammoth/modules/copy_generator.html">mammoth.modules.copy_generator</a></li>
<li><a href="mammoth/modules/embeddings.html">mammoth.modules.embeddings</a></li>
<li><a href="mammoth/modules/global_attention.html">mammoth.modules.global_attention</a></li>
<li><a href="mammoth/modules/multi_headed_attn.html">mammoth.modules.multi_headed_attn</a></li>
<li><a href="mammoth/modules/position_ffn.html">mammoth.modules.position_ffn</a></li>
<li><a href="mammoth/modules/structured_attention.html">mammoth.modules.structured_attention</a></li>
<li><a href="mammoth/modules/weight_norm.html">mammoth.modules.weight_norm</a></li>
<li><a href="mammoth/trainer.html">mammoth.trainer</a></li>
<li><a href="mammoth/translate/beam_search.html">mammoth.translate.beam_search</a></li>
<li><a href="mammoth/translate/decode_strategy.html">mammoth.translate.decode_strategy</a></li>
295 changes: 0 additions & 295 deletions _modules/mammoth/modules/conv_multi_step_attention.html

This file was deleted.

483 changes: 0 additions & 483 deletions _modules/mammoth/modules/copy_generator.html

This file was deleted.

6 changes: 2 additions & 4 deletions _modules/mammoth/modules/embeddings.html
@@ -186,7 +186,6 @@ <h1>Source code for mammoth.modules.embeddings</h1><div class="highlight"><pre>
<span class="c1"># from mammoth.utils.logging import logger</span>

<span class="c1"># import bitsandbytes as bnb</span>
<span class="c1"># from mammoth.modules.stable_embeddings import StableEmbedding</span>


<span class="k">class</span> <span class="nc">SequenceTooLongError</span><span class="p">(</span><span class="ne">Exception</span><span class="p">):</span>
@@ -243,7 +242,7 @@ <h1>Source code for mammoth.modules.embeddings</h1><div class="highlight"><pre>
<div class="viewcode-block" id="Embeddings"><a class="viewcode-back" href="../../../mammoth.modules.html#mammoth.modules.Embeddings">[docs]</a><span class="k">class</span> <span class="nc">Embeddings</span><span class="p">(</span><span class="n">nn</span><span class="o">.</span><span class="n">Module</span><span class="p">):</span>
<span class="w"> </span><span class="sd">&quot;&quot;&quot;Words embeddings for encoder/decoder.</span>

<span class="sd"> Additionally includes ability to add sparse input features</span>
<span class="sd"> Additionally includes ability to add input features</span>
<span class="sd"> based on &quot;Linguistic Input Features Improve Neural Machine Translation&quot;</span>
<span class="sd"> :cite:`sennrich2016linguistic`.</span>

@@ -293,7 +292,6 @@ <h1>Source code for mammoth.modules.embeddings</h1><div class="highlight"><pre>
<span class="n">feat_padding_idx</span><span class="o">=</span><span class="p">[],</span>
<span class="n">feat_vocab_sizes</span><span class="o">=</span><span class="p">[],</span>
<span class="n">dropout</span><span class="o">=</span><span class="mi">0</span><span class="p">,</span>
<span class="n">sparse</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span>
<span class="n">freeze_word_vecs</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span>
<span class="p">):</span>
<span class="bp">self</span><span class="o">.</span><span class="n">_validate_args</span><span class="p">(</span><span class="n">feat_merge</span><span class="p">,</span> <span class="n">feat_vocab_sizes</span><span class="p">,</span> <span class="n">feat_vec_exponent</span><span class="p">,</span> <span class="n">feat_vec_size</span><span class="p">,</span> <span class="n">feat_padding_idx</span><span class="p">)</span>
@@ -324,7 +322,7 @@ <h1>Source code for mammoth.modules.embeddings</h1><div class="highlight"><pre>
<span class="c1"># The embedding matrix look-up tables. The first look-up table</span>
<span class="c1"># is for words. Subsequent ones are for features, if any exist.</span>
<span class="n">emb_params</span> <span class="o">=</span> <span class="nb">zip</span><span class="p">(</span><span class="n">vocab_sizes</span><span class="p">,</span> <span class="n">emb_dims</span><span class="p">,</span> <span class="n">pad_indices</span><span class="p">)</span>
<span class="n">embeddings</span> <span class="o">=</span> <span class="p">[</span><span class="n">nn</span><span class="o">.</span><span class="n">Embedding</span><span class="p">(</span><span class="n">vocab</span><span class="p">,</span> <span class="n">dim</span><span class="p">,</span> <span class="n">padding_idx</span><span class="o">=</span><span class="n">pad</span><span class="p">,</span> <span class="n">sparse</span><span class="o">=</span><span class="n">sparse</span><span class="p">)</span> <span class="k">for</span> <span class="n">vocab</span><span class="p">,</span> <span class="n">dim</span><span class="p">,</span> <span class="n">pad</span> <span class="ow">in</span> <span class="n">emb_params</span><span class="p">]</span>
<span class="n">embeddings</span> <span class="o">=</span> <span class="p">[</span><span class="n">nn</span><span class="o">.</span><span class="n">Embedding</span><span class="p">(</span><span class="n">vocab</span><span class="p">,</span> <span class="n">dim</span><span class="p">,</span> <span class="n">padding_idx</span><span class="o">=</span><span class="n">pad</span><span class="p">)</span> <span class="k">for</span> <span class="n">vocab</span><span class="p">,</span> <span class="n">dim</span><span class="p">,</span> <span class="n">pad</span> <span class="ow">in</span> <span class="n">emb_params</span><span class="p">]</span>
<span class="n">emb_luts</span> <span class="o">=</span> <span class="n">Elementwise</span><span class="p">(</span><span class="n">feat_merge</span><span class="p">,</span> <span class="n">embeddings</span><span class="p">)</span>

<span class="c1"># The final output size of word + feature vectors. This can vary</span>
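The embeddings.html hunk above removes the `sparse` keyword from the lookup-table construction, so each `nn.Embedding` falls back to its default dense gradients. A minimal sketch of the construction after this change — the `vocab_sizes`, `emb_dims`, and `pad_indices` values are illustrative placeholders, and the surrounding `Embeddings` class is omitted:

```python
import torch.nn as nn

# Hypothetical values: word vocab first; feature vocabs would follow, if any.
vocab_sizes = [32000]
emb_dims = [512]
pad_indices = [1]

# Mirrors the diff: sparse=... is no longer passed, so nn.Embedding
# uses its default (dense) gradient behavior.
emb_params = zip(vocab_sizes, emb_dims, pad_indices)
embeddings = [nn.Embedding(vocab, dim, padding_idx=pad) for vocab, dim, pad in emb_params]
```

Dropping `sparse` simplifies the optimizer story: sparse embedding gradients are only supported by a few optimizers (e.g. `SGD`, `SparseAdam`), while dense gradients work everywhere.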