diff --git a/docs/source/CONTRIBUTING.md b/docs/source/CONTRIBUTING.md index 7ad1425b..717a3ca0 100644 --- a/docs/source/CONTRIBUTING.md +++ b/docs/source/CONTRIBUTING.md @@ -5,7 +5,7 @@ OpenNMT-py is a community developed project and we love developer contributions. ## Guidelines Before sending a PR, please do this checklist first: -- Please run `onmt/tests/pull_request_chk.sh` and fix any errors. When adding new functionality, also add tests to this script. Included checks: +- Please run `mammoth/tests/pull_request_chk.sh` and fix any errors. When adding new functionality, also add tests to this script. Included checks: 1. flake8 check for coding style; 2. unittest; 3. continuous integration tests listed in `.travis.yml`. diff --git a/docs/source/FAQ.md b/docs/source/FAQ.md index 05ae79a7..94d1d618 100644 --- a/docs/source/FAQ.md +++ b/docs/source/FAQ.md @@ -236,7 +236,7 @@ Note: all the details about every flag and options for each transform can be fou Transform name: `filtertoolong` -Class: `onmt.transforms.misc.FilterTooLongTransform` +Class: `mammoth.transforms.misc.FilterTooLongTransform` The following options can be added to the configuration : - `src_seq_length`: maximum source sequence length; @@ -246,7 +246,7 @@ The following options can be added to the configuration : Transform name: `prefix` -Class: `onmt.transforms.misc.PrefixTransform` +Class: `mammoth.transforms.misc.PrefixTransform` For each dataset that the `prefix` transform is applied to, you can set the additional `src_prefix` and `tgt_prefix` parameters in its data configuration: @@ -276,7 +276,7 @@ Common options for the tokenization transforms are the following: Transform name: `onmt_tokenize` -Class: `onmt.transforms.tokenize.ONMTTokenizerTransform` +Class: `mammoth.transforms.tokenize.ONMTTokenizerTransform` Additional options are available: - `src_subword_type`: type of subword model for source side (from `["none", "sentencepiece", "bpe"]`); @@ -288,7 +288,7 @@ Additional options are 
available: Transform name: `sentencepiece` -Class: `onmt.transforms.tokenize.SentencePieceTransform` +Class: `mammoth.transforms.tokenize.SentencePieceTransform` The `src_subword_model` and `tgt_subword_model` should be valid sentencepiece models. @@ -296,7 +296,7 @@ The `src_subword_model` and `tgt_subword_model` should be valid sentencepiece mo Transform name: `bpe` -Class: `onmt.transforms.tokenize.BPETransform` +Class: `mammoth.transforms.tokenize.BPETransform` The `src_subword_model` and `tgt_subword_model` should be valid BPE models. @@ -321,7 +321,7 @@ These different types of noise can be controlled with the following options: Transform name: `switchout` -Class: `onmt.transforms.sampling.SwitchOutTransform` +Class: `mammoth.transforms.sampling.SwitchOutTransform` Options: @@ -331,7 +331,7 @@ Options: Transform name: `tokendrop` -Class: `onmt.transforms.sampling.TokenDropTransform` +Class: `mammoth.transforms.sampling.TokenDropTransform` Options: @@ -341,7 +341,7 @@ Options: Transform name: `tokenmask` -Class: `onmt.transforms.sampling.TokenMaskTransform` +Class: `mammoth.transforms.sampling.TokenMaskTransform` Options: @@ -427,7 +427,7 @@ The `example` argument of `apply` is a `dict` of the form: } ``` -This is defined in `onmt.inputters.corpus.ParallelCorpus.load`. This class is not easily extendable for now but it can be considered for future developments. For instance, we could create some `CustomParallelCorpus` class that would handle other kind of inputs. +This is defined in `mammoth.inputters.corpus.ParallelCorpus.load`. This class is not easily extendable for now but it can be considered for future developments. For instance, we could create some `CustomParallelCorpus` class that would handle other kind of inputs. ## Can I get word alignments while translating? @@ -649,7 +649,7 @@ A server configuration file (`./available_models/conf.json`) is required. It con ### II. How to start the server without Docker ? --- ##### 0. 
Get the code -The translation server has been merged into onmt-py `master` branch. +The translation server has been merged into the mammoth `master` branch. Keep in line with master for last fix / improvements. ##### 1. Install `flask` ```bash @@ -699,7 +699,7 @@ RUN pip install --no-cache-dir -r requirements.txt COPY server.py ./ COPY tools ./tools COPY available_models ./available_models -COPY onmt ./onmt +COPY mammoth ./mammoth CMD ["python", "./server.py"] ``` diff --git a/docs/source/examples/LanguageModelGeneration.md b/docs/source/examples/LanguageModelGeneration.md index a6e9bb40..dc4ae3ec 100644 --- a/docs/source/examples/LanguageModelGeneration.md +++ b/docs/source/examples/LanguageModelGeneration.md @@ -106,7 +106,7 @@ Options contained in the loaded model will trigger language modeling specific in head data/wikitext-103-raw/wiki.valid.bpe | cut -d" " -f-15 > data/wikitext-103-raw/lm_input.txt ``` -To proceed with LM inference, sampling methods such as top-k sampling or nucleus sampling are usually applied. Details and options about inference methods can be found in [`onmt/opts.py`](https://github.com/OpenNMT/OpenNMT-py/tree/master/onmt/opts.py). +To proceed with LM inference, sampling methods such as top-k sampling or nucleus sampling are usually applied. Details and options about inference methods can be found in [`mammoth/opts.py`](https://github.com/OpenNMT/OpenNMT-py/tree/master/mammoth/opts.py). 
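For intuition about what nucleus (top-p) sampling does during LM inference: it restricts sampling to the smallest set of highest-probability tokens whose cumulative probability reaches `p`, then renormalises and samples within that set. The following is an illustrative sketch only, not the MAMMOTH implementation:

```python
import math
import random

def nucleus_sample(logprobs, p=0.9, rng=random):
    """Sample a token id with nucleus (top-p) sampling.

    `logprobs` is a list of log-probabilities, one per vocabulary item.
    Illustrative sketch only; not MAMMOTH's actual decoding code.
    """
    probs = [math.exp(lp) for lp in logprobs]
    # Sort token ids by probability, highest first.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    # Grow the nucleus until its cumulative mass reaches p.
    nucleus, cum = [], 0.0
    for i in order:
        nucleus.append(i)
        cum += probs[i]
        if cum >= p:
            break
    # Renormalise over the nucleus and draw a sample.
    total = sum(probs[i] for i in nucleus)
    r = rng.random() * total
    for i in nucleus:
        r -= probs[i]
        if r <= 0:
            return i
    return nucleus[-1]
```

With a distribution dominated by one token, the nucleus collapses to that single token, which is why small `p` behaves close to greedy decoding while larger `p` admits more diverse continuations.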
The following command will provide inference with nucleus sampling of p=0.9 and return the 3 sequences with the lowest perplexity out of the 10 generated sequences: ```bash diff --git a/docs/source/examples/Library.ipynb b/docs/source/examples/Library.ipynb index fc98cefe..115cb6d0 100644 --- a/docs/source/examples/Library.ipynb +++ b/docs/source/examples/Library.ipynb @@ -11,7 +11,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The example notebook (available [here](https://github.com/OpenNMT/OpenNMT-py/blob/master/docs/source/examples/Library.ipynb)) should be able to run as a standalone execution, provided `onmt` is in the path (installed via `pip` for instance).\n", + "The example notebook (available [here](https://github.com/OpenNMT/OpenNMT-py/blob/master/docs/source/examples/Library.ipynb)) should be able to run as a standalone execution, provided `mammoth` is in the path (installed via `pip` for instance).\n", "\n", "Some parts may not be 100% 'library-friendly' but it's mostly workable." 
] @@ -42,12 +42,12 @@ "metadata": {}, "outputs": [], "source": [ - "import onmt\n", - "from onmt.inputters.inputter import _load_vocab, _build_fields_vocab, get_fields, IterOnDevice\n", - "from onmt.inputters.corpus import ParallelCorpus\n", - "from onmt.inputters.dynamic_iterator import DynamicDatasetIter\n", - "from onmt.translate import GNMTGlobalScorer, Translator, TranslationBuilder\n", - "from onmt.utils.misc import set_random_seed" + "import mammoth\n", + "from mammoth.inputters.inputter import _load_vocab, _build_fields_vocab, get_fields, IterOnDevice\n", + "from mammoth.inputters.corpus import ParallelCorpus\n", + "from mammoth.inputters.dynamic_iterator import DynamicDatasetIter\n", + "from mammoth.translate import GNMTGlobalScorer, Translator, TranslationBuilder\n", + "from mammoth.utils.misc import set_random_seed" ] }, { @@ -75,7 +75,7 @@ ], "source": [ "# enable logging\n", - "from onmt.utils.logging import init_logger, logger\n", + "from mammoth.utils.logging import init_logger, logger\n", "init_logger()" ] }, @@ -214,7 +214,7 @@ "metadata": {}, "outputs": [], "source": [ - "from onmt.utils.parse import ArgumentParser\n", + "from mammoth.utils.parse import ArgumentParser\n", "parser = ArgumentParser(description='build_vocab.py')" ] }, @@ -224,7 +224,7 @@ "metadata": {}, "outputs": [], "source": [ - "from onmt.opts import dynamic_prepare_opts\n", + "from mammoth.opts import dynamic_prepare_opts\n", "dynamic_prepare_opts(parser, build_vocab_only=True)" ] }, @@ -279,7 +279,7 @@ } ], "source": [ - "from onmt.bin.build_vocab import build_vocab_main\n", + "from mammoth.bin.build_vocab import build_vocab_main\n", "build_vocab_main(opts)" ] }, @@ -382,8 +382,8 @@ { "data": { "text/plain": [ - "{'src': ,\n", - " 'tgt': ,\n", + "{'src': ,\n", + " 'tgt': ,\n", " 'indices': }" ] }, @@ -478,21 +478,21 @@ "rnn_size = 500\n", "# Specify the core model.\n", "\n", - "encoder_embeddings = onmt.modules.Embeddings(emb_size, len(src_vocab),\n", + "encoder_embeddings = 
mammoth.modules.Embeddings(emb_size, len(src_vocab),\n", " word_padding_idx=src_padding)\n", "\n", - "encoder = onmt.encoders.RNNEncoder(hidden_size=rnn_size, num_layers=1,\n", + "encoder = mammoth.encoders.RNNEncoder(hidden_size=rnn_size, num_layers=1,\n", " rnn_type=\"LSTM\", bidirectional=True,\n", " embeddings=encoder_embeddings)\n", "\n", - "decoder_embeddings = onmt.modules.Embeddings(emb_size, len(tgt_vocab),\n", + "decoder_embeddings = mammoth.modules.Embeddings(emb_size, len(tgt_vocab),\n", " word_padding_idx=tgt_padding)\n", - "decoder = onmt.decoders.decoder.InputFeedRNNDecoder(\n", + "decoder = mammoth.decoders.decoder.InputFeedRNNDecoder(\n", " hidden_size=rnn_size, num_layers=1, bidirectional_encoder=True, \n", " rnn_type=\"LSTM\", embeddings=decoder_embeddings)\n", "\n", "device = \"cuda\" if torch.cuda.is_available() else \"cpu\"\n", - "model = onmt.models.model.NMTModel(encoder, decoder)\n", + "model = mammoth.models.model.NMTModel(encoder, decoder)\n", "model.to(device)\n", "\n", "# Specify the tgt word generator and loss computation module\n", @@ -500,7 +500,7 @@ " nn.Linear(rnn_size, len(tgt_vocab)),\n", " nn.LogSoftmax(dim=-1)).to(device)\n", "\n", - "loss = onmt.utils.loss.NMTLossCompute(\n", + "loss = mammoth.utils.loss.NMTLossCompute(\n", " criterion=nn.NLLLoss(ignore_index=tgt_padding, reduction=\"sum\"),\n", " generator=model.generator)" ] @@ -520,7 +520,7 @@ "source": [ "lr = 1\n", "torch_optimizer = torch.optim.SGD(model.parameters(), lr=lr)\n", - "optim = onmt.utils.optimizers.Optimizer(\n", + "optim = mammoth.utils.optimizers.Optimizer(\n", " torch_optimizer, learning_rate=lr, max_grad_norm=2)" ] }, @@ -681,7 +681,7 @@ { "data": { "text/plain": [ - "" + "" ] }, "execution_count": 28, @@ -690,10 +690,10 @@ } ], "source": [ - "report_manager = onmt.utils.ReportMgr(\n", + "report_manager = mammoth.utils.ReportMgr(\n", " report_every=50, start_time=None, tensorboard_writer=None)\n", "\n", - "trainer = onmt.Trainer(model=model,\n", + 
"trainer = mammoth.Trainer(model=model,\n", " train_loss=loss,\n", " valid_loss=loss,\n", " optim=optim,\n", @@ -726,9 +726,9 @@ "metadata": {}, "outputs": [], "source": [ - "src_data = {\"reader\": onmt.inputters.str2reader[\"text\"](), \"data\": src_val}\n", - "tgt_data = {\"reader\": onmt.inputters.str2reader[\"text\"](), \"data\": tgt_val}\n", - "_readers, _data = onmt.inputters.Dataset.config(\n", + "src_data = {\"reader\": mammoth.inputters.str2reader[\"text\"](), \"data\": src_val}\n", + "tgt_data = {\"reader\": mammoth.inputters.str2reader[\"text\"](), \"data\": tgt_val}\n", + "_readers, _data = mammoth.inputters.Dataset.config(\n", " [('src', src_data), ('tgt', tgt_data)])" ] }, @@ -738,9 +738,9 @@ "metadata": {}, "outputs": [], "source": [ - "dataset = onmt.inputters.Dataset(\n", + "dataset = mammoth.inputters.Dataset(\n", " vocab_fields, readers=_readers, data=_data,\n", - " sort_key=onmt.inputters.str2sortkey[\"text\"])" + " sort_key=mammoth.inputters.str2sortkey[\"text\"])" ] }, { @@ -749,7 +749,7 @@ "metadata": {}, "outputs": [], "source": [ - "data_iter = onmt.inputters.OrderedIterator(\n", + "data_iter = mammoth.inputters.OrderedIterator(\n", " dataset=dataset,\n", " device=\"cuda\",\n", " batch_size=10,\n", @@ -766,8 +766,8 @@ "metadata": {}, "outputs": [], "source": [ - "src_reader = onmt.inputters.str2reader[\"text\"]\n", - "tgt_reader = onmt.inputters.str2reader[\"text\"]\n", + "src_reader = mammoth.inputters.str2reader[\"text\"]\n", + "tgt_reader = mammoth.inputters.str2reader[\"text\"]\n", "scorer = GNMTGlobalScorer(alpha=0.7, \n", " beta=0., \n", " length_penalty=\"avg\", \n", @@ -779,7 +779,7 @@ " tgt_reader=tgt_reader, \n", " global_scorer=scorer,\n", " gpu=gpu)\n", - "builder = onmt.translate.TranslationBuilder(data=dataset, \n", + "builder = mammoth.translate.TranslationBuilder(data=dataset, \n", " fields=vocab_fields)" ] }, diff --git a/docs/source/examples/Library.md b/docs/source/examples/Library.md index 230e06b6..12b6ce25 100644 --- 
a/docs/source/examples/Library.md +++ b/docs/source/examples/Library.md @@ -1,7 +1,7 @@ # Library -The example notebook (available [here](https://github.com/OpenNMT/OpenNMT-py/blob/master/docs/source/examples/Library.ipynb)) should be able to run as a standalone execution, provided `onmt` is in the path (installed via `pip` for instance). +The example notebook (available [here](https://github.com/OpenNMT/OpenNMT-py/blob/master/docs/source/examples/Library.ipynb)) should be able to run as a standalone execution, provided `mammoth` is in the path (installed via `pip` for instance). Some parts may not be 100% 'library-friendly' but it's mostly workable. @@ -18,12 +18,12 @@ from collections import defaultdict, Counter ```python -import onmt -from onmt.inputters.inputter import _load_vocab, _build_fields_vocab, get_fields, IterOnDevice -from onmt.inputters.corpus import ParallelCorpus -from onmt.inputters.dynamic_iterator import DynamicDatasetIter -from onmt.translate import GNMTGlobalScorer, Translator, TranslationBuilder -from onmt.utils.misc import set_random_seed +import mammoth +from mammoth.inputters.inputter import _load_vocab, _build_fields_vocab, get_fields, IterOnDevice +from mammoth.inputters.corpus import ParallelCorpus +from mammoth.inputters.dynamic_iterator import DynamicDatasetIter +from mammoth.translate import GNMTGlobalScorer, Translator, TranslationBuilder +from mammoth.utils.misc import set_random_seed ``` ### Enable logging @@ -31,7 +31,7 @@ from onmt.utils.misc import set_random_seed ```python # enable logging -from onmt.utils.logging import init_logger, logger +from mammoth.utils.logging import init_logger, logger init_logger() ``` @@ -119,13 +119,13 @@ with open("toy-ende/config.yaml", "w") as f: ```python -from onmt.utils.parse import ArgumentParser -parser = DynamicArgumentParser(description='build_vocab.py') +from mammoth.utils.parse import ArgumentParser +parser = ArgumentParser(description='build_vocab.py') ``` ```python -from onmt.opts import +from mammoth.opts import 
dynamic_prepare_opts dynamic_prepare_opts(parser, build_vocab_only=True) ``` @@ -149,7 +149,7 @@ opts ```python -from onmt.bin.build_vocab import build_vocab_main +from mammoth.bin.build_vocab import build_vocab_main build_vocab_main(opts) ``` @@ -221,8 +221,8 @@ fields - {'src': , - 'tgt': , + {'src': , + 'tgt': , 'indices': } @@ -272,21 +272,21 @@ emb_size = 100 rnn_size = 500 # Specify the core model. -encoder_embeddings = onmt.modules.Embeddings(emb_size, len(src_vocab), +encoder_embeddings = mammoth.modules.Embeddings(emb_size, len(src_vocab), word_padding_idx=src_padding) -encoder = onmt.encoders.RNNEncoder(hidden_size=rnn_size, num_layers=1, +encoder = mammoth.encoders.RNNEncoder(hidden_size=rnn_size, num_layers=1, rnn_type="LSTM", bidirectional=True, embeddings=encoder_embeddings) -decoder_embeddings = onmt.modules.Embeddings(emb_size, len(tgt_vocab), +decoder_embeddings = mammoth.modules.Embeddings(emb_size, len(tgt_vocab), word_padding_idx=tgt_padding) -decoder = onmt.decoders.decoder.InputFeedRNNDecoder( +decoder = mammoth.decoders.decoder.InputFeedRNNDecoder( hidden_size=rnn_size, num_layers=1, bidirectional_encoder=True, rnn_type="LSTM", embeddings=decoder_embeddings) device = "cuda" if torch.cuda.is_available() else "cpu" -model = onmt.models.model.NMTModel(encoder, decoder) +model = mammoth.models.model.NMTModel(encoder, decoder) model.to(device) # Specify the tgt word generator and loss computation module @@ -294,7 +294,7 @@ model.generator = nn.Sequential( nn.Linear(rnn_size, len(tgt_vocab)), nn.LogSoftmax(dim=-1)).to(device) -loss = onmt.utils.loss.NMTLossCompute( +loss = mammoth.utils.loss.NMTLossCompute( criterion=nn.NLLLoss(ignore_index=tgt_padding, reduction="sum"), generator=model.generator) ``` @@ -305,7 +305,7 @@ Now we set up the optimizer. 
This could be a core torch optim class, or our wrap ```python lr = 1 torch_optimizer = torch.optim.SGD(model.parameters(), lr=lr) -optim = onmt.utils.optimizers.Optimizer( +optim = mammoth.utils.optimizers.Optimizer( torch_optimizer, learning_rate=lr, max_grad_norm=2) ``` @@ -374,10 +374,10 @@ Finally we train. ```python -report_manager = onmt.utils.ReportMgr( +report_manager = mammoth.utils.ReportMgr( report_every=50, start_time=None, tensorboard_writer=None) -trainer = onmt.Trainer(model=model, +trainer = mammoth.Trainer(model=model, train_loss=loss, valid_loss=loss, optim=optim, @@ -435,7 +435,7 @@ trainer.train(train_iter=train_iter, - + @@ -445,22 +445,22 @@ For translation, we can build a "traditional" (as opposed to dynamic) dataset fo ```python -src_data = {"reader": onmt.inputters.str2reader["text"](), "data": src_val} -tgt_data = {"reader": onmt.inputters.str2reader["text"](), "data": tgt_val} -_readers, _data = onmt.inputters.Dataset.config( +src_data = {"reader": mammoth.inputters.str2reader["text"](), "data": src_val} +tgt_data = {"reader": mammoth.inputters.str2reader["text"](), "data": tgt_val} +_readers, _data = mammoth.inputters.Dataset.config( [('src', src_data), ('tgt', tgt_data)]) ``` ```python -dataset = onmt.inputters.Dataset( +dataset = mammoth.inputters.Dataset( vocab_fields, readers=_readers, data=_data, - sort_key=onmt.inputters.str2sortkey["text"]) + sort_key=mammoth.inputters.str2sortkey["text"]) ``` ```python -data_iter = onmt.inputters.OrderedIterator( +data_iter = mammoth.inputters.OrderedIterator( dataset=dataset, device="cuda", batch_size=10, @@ -473,8 +473,8 @@ data_iter = onmt.inputters.OrderedIterator( ```python -src_reader = onmt.inputters.str2reader["text"] -tgt_reader = onmt.inputters.str2reader["text"] +src_reader = mammoth.inputters.str2reader["text"] +tgt_reader = mammoth.inputters.str2reader["text"] scorer = GNMTGlobalScorer(alpha=0.7, beta=0., length_penalty="avg", @@ -486,7 +486,7 @@ translator = Translator(model=model, 
tgt_reader=tgt_reader, global_scorer=scorer, gpu=gpu) -builder = onmt.translate.TranslationBuilder(data=dataset, +builder = mammoth.translate.TranslationBuilder(data=dataset, fields=vocab_fields) ``` diff --git a/docs/source/mammoth.inputters.rst b/docs/source/mammoth.inputters.rst new file mode 100644 index 00000000..b95aae67 --- /dev/null +++ b/docs/source/mammoth.inputters.rst @@ -0,0 +1,20 @@ +Data Loaders +================= + +Data Readers +------------- + +.. autoexception:: mammoth.inputters.datareader_base.MissingDependencyException + +.. autoclass:: mammoth.inputters.DataReaderBase + :members: + +.. autoclass:: mammoth.inputters.TextDataReader + :members: + + +Dataset +-------- + +.. autoclass:: mammoth.inputters.Dataset + :members: diff --git a/docs/source/mammoth.modules.rst b/docs/source/mammoth.modules.rst new file mode 100644 index 00000000..de33bfd5 --- /dev/null +++ b/docs/source/mammoth.modules.rst @@ -0,0 +1,109 @@ +Modules +============= + +Core Modules +------------ + +.. autoclass:: mammoth.modules.Embeddings + :members: + + +Encoders +--------- + +.. autoclass:: mammoth.encoders.EncoderBase + :members: + +.. autoclass:: mammoth.encoders.MeanEncoder + :members: + +.. autoclass:: mammoth.encoders.RNNEncoder + :members: + + +Decoders +--------- + + +.. autoclass:: mammoth.decoders.DecoderBase + :members: + +.. autoclass:: mammoth.decoders.decoder.RNNDecoderBase + :members: + +.. autoclass:: mammoth.decoders.StdRNNDecoder + :members: + +.. autoclass:: mammoth.decoders.InputFeedRNNDecoder + :members: + +Attention +---------- + +.. autoclass:: mammoth.modules.AverageAttention + :members: + +.. autoclass:: mammoth.modules.GlobalAttention + :members: + + + +Architecture: Transformer +---------------------------- + +.. autoclass:: mammoth.modules.PositionalEncoding + :members: + +.. autoclass:: mammoth.modules.position_ffn.PositionwiseFeedForward + :members: + +.. autoclass:: mammoth.encoders.TransformerEncoder + :members: + +.. 
autoclass:: mammoth.decoders.TransformerDecoder + :members: + +.. autoclass:: mammoth.modules.MultiHeadedAttention + :members: + :undoc-members: + + +Architecture: Conv2Conv +---------------------------- + +(These methods are from a user contribution +and have not been thoroughly tested.) + + +.. autoclass:: mammoth.encoders.CNNEncoder + :members: + + +.. autoclass:: mammoth.decoders.CNNDecoder + :members: + +.. autoclass:: mammoth.modules.ConvMultiStepAttention + :members: + +.. autoclass:: mammoth.modules.WeightNormConv2d + :members: + +Architecture: SRU +---------------------------- + +.. autoclass:: mammoth.models.sru.SRU + :members: + + +Copy Attention +-------------- + +.. autoclass:: mammoth.modules.CopyGenerator + :members: + + +Structured Attention +------------------------------------------- + +.. autoclass:: mammoth.modules.structured_attention.MatrixTree + :members: diff --git a/docs/source/mammoth.rst b/docs/source/mammoth.rst new file mode 100644 index 00000000..cd3d2a8f --- /dev/null +++ b/docs/source/mammoth.rst @@ -0,0 +1,32 @@ +Framework +================= + +Model +----- + +.. autoclass:: mammoth.models.NMTModel + :members: + +Trainer +------- + +.. autoclass:: mammoth.Trainer + :members: + + +.. autoclass:: mammoth.utils.Statistics + :members: + +Loss +---- + + +.. autoclass:: mammoth.utils.loss.LossComputeBase + :members: + + +Optimizer +--------- + +.. autoclass:: mammoth.utils.Optimizer + :members: diff --git a/docs/source/mammoth.translate.translation_server.rst b/docs/source/mammoth.translate.translation_server.rst new file mode 100644 index 00000000..0bc9dad7 --- /dev/null +++ b/docs/source/mammoth.translate.translation_server.rst @@ -0,0 +1,21 @@ +Server +====== + + +Models +------------- + +.. autoclass:: mammoth.translate.translation_server.ServerModel + :members: + + +Core Server +------------ + +.. autoexception:: mammoth.translate.translation_server.ServerModelError + +.. 
autoclass:: mammoth.translate.translation_server.Timer + :members: + +.. autoclass:: mammoth.translate.translation_server.TranslationServer + :members: diff --git a/docs/source/mammoth.translation.rst b/docs/source/mammoth.translation.rst new file mode 100644 index 00000000..6b075f96 --- /dev/null +++ b/docs/source/mammoth.translation.rst @@ -0,0 +1,39 @@ +Translation +================== + +Translations +------------- + +.. autoclass:: mammoth.translate.Translation + :members: + +Translator Class +----------------- + +.. autoclass:: mammoth.translate.Translator + :members: + +.. autoclass:: mammoth.translate.TranslationBuilder + :members: + + +Decoding Strategies +-------------------- +.. autoclass:: mammoth.translate.DecodeStrategy + :members: + +.. autoclass:: mammoth.translate.BeamSearch + :members: + +.. autofunction:: mammoth.translate.greedy_search.sample_with_temperature + +.. autoclass:: mammoth.translate.GreedySearch + :members: + +Scoring +-------- +.. autoclass:: mammoth.translate.penalties.PenaltyBuilder + :members: + +.. autoclass:: mammoth.translate.GNMTGlobalScorer + :members: diff --git a/docs/source/onmt.inputters.rst b/docs/source/onmt.inputters.rst deleted file mode 100644 index 99507e29..00000000 --- a/docs/source/onmt.inputters.rst +++ /dev/null @@ -1,20 +0,0 @@ -Data Loaders -================= - -Data Readers -------------- - -.. autoexception:: onmt.inputters.datareader_base.MissingDependencyException - -.. autoclass:: onmt.inputters.DataReaderBase - :members: - -.. autoclass:: onmt.inputters.TextDataReader - :members: - - -Dataset --------- - -.. autoclass:: onmt.inputters.Dataset - :members: diff --git a/docs/source/onmt.modules.rst b/docs/source/onmt.modules.rst deleted file mode 100644 index a3ef216e..00000000 --- a/docs/source/onmt.modules.rst +++ /dev/null @@ -1,109 +0,0 @@ -Modules -============= - -Core Modules ------------- - -.. autoclass:: onmt.modules.Embeddings - :members: - - -Encoders ---------- - -.. 
autoclass:: onmt.encoders.EncoderBase - :members: - -.. autoclass:: onmt.encoders.MeanEncoder - :members: - -.. autoclass:: onmt.encoders.RNNEncoder - :members: - - -Decoders ---------- - - -.. autoclass:: onmt.decoders.DecoderBase - :members: - -.. autoclass:: onmt.decoders.decoder.RNNDecoderBase - :members: - -.. autoclass:: onmt.decoders.StdRNNDecoder - :members: - -.. autoclass:: onmt.decoders.InputFeedRNNDecoder - :members: - -Attention ----------- - -.. autoclass:: onmt.modules.AverageAttention - :members: - -.. autoclass:: onmt.modules.GlobalAttention - :members: - - - -Architecture: Transformer ----------------------------- - -.. autoclass:: onmt.modules.PositionalEncoding - :members: - -.. autoclass:: onmt.modules.position_ffn.PositionwiseFeedForward - :members: - -.. autoclass:: onmt.encoders.TransformerEncoder - :members: - -.. autoclass:: onmt.decoders.TransformerDecoder - :members: - -.. autoclass:: onmt.modules.MultiHeadedAttention - :members: - :undoc-members: - - -Architecture: Conv2Conv ----------------------------- - -(These methods are from a user contribution -and have not been thoroughly tested.) - - -.. autoclass:: onmt.encoders.CNNEncoder - :members: - - -.. autoclass:: onmt.decoders.CNNDecoder - :members: - -.. autoclass:: onmt.modules.ConvMultiStepAttention - :members: - -.. autoclass:: onmt.modules.WeightNormConv2d - :members: - -Architecture: SRU ----------------------------- - -.. autoclass:: onmt.models.sru.SRU - :members: - - -Copy Attention --------------- - -.. autoclass:: onmt.modules.CopyGenerator - :members: - - -Structured Attention -------------------------------------------- - -.. autoclass:: onmt.modules.structured_attention.MatrixTree - :members: diff --git a/docs/source/onmt.rst b/docs/source/onmt.rst deleted file mode 100644 index 5ae056ce..00000000 --- a/docs/source/onmt.rst +++ /dev/null @@ -1,32 +0,0 @@ -Framework -================= - -Model ------ - -.. 
autoclass:: onmt.models.NMTModel - :members: - -Trainer -------- - -.. autoclass:: onmt.Trainer - :members: - - -.. autoclass:: onmt.utils.Statistics - :members: - -Loss ----- - - -.. autoclass:: onmt.utils.loss.LossComputeBase - :members: - - -Optimizer ---------- - -.. autoclass:: onmt.utils.Optimizer - :members: diff --git a/docs/source/onmt.translate.translation_server.rst b/docs/source/onmt.translate.translation_server.rst deleted file mode 100644 index 3426fade..00000000 --- a/docs/source/onmt.translate.translation_server.rst +++ /dev/null @@ -1,21 +0,0 @@ -Server -====== - - -Models -------------- - -.. autoclass:: onmt.translate.translation_server.ServerModel - :members: - - -Core Server ------------- - -.. autoexception:: onmt.translate.translation_server.ServerModelError - -.. autoclass:: onmt.translate.translation_server.Timer - :members: - -.. autoclass:: onmt.translate.translation_server.TranslationServer - :members: diff --git a/docs/source/onmt.translation.rst b/docs/source/onmt.translation.rst deleted file mode 100644 index bb6f5a5d..00000000 --- a/docs/source/onmt.translation.rst +++ /dev/null @@ -1,39 +0,0 @@ -Translation -================== - -Translations -------------- - -.. autoclass:: onmt.translate.Translation - :members: - -Translator Class ------------------ - -.. autoclass:: onmt.translate.Translator - :members: - -.. autoclass:: onmt.translate.TranslationBuilder - :members: - - -Decoding Strategies --------------------- -.. autoclass:: onmt.translate.DecodeStrategy - :members: - -.. autoclass:: onmt.translate.BeamSearch - :members: - -.. autofunction:: onmt.translate.greedy_search.sample_with_temperature - -.. autoclass:: onmt.translate.GreedySearch - :members: - -Scoring --------- -.. autoclass:: onmt.translate.penalties.PenaltyBuilder - :members: - -.. 
autoclass:: onmt.translate.GNMTGlobalScorer - :members: diff --git a/docs/source/options/build_vocab.rst b/docs/source/options/build_vocab.rst index 57fda68e..95bdc79b 100644 --- a/docs/source/options/build_vocab.rst +++ b/docs/source/options/build_vocab.rst @@ -2,7 +2,7 @@ Build Vocab =========== .. argparse:: - :filename: ../onmt/bin/build_vocab.py + :filename: ../mammoth/bin/build_vocab.py :func: _get_parser :prog: build_vocab.py diff --git a/docs/source/options/server.rst b/docs/source/options/server.rst index 63b2676f..b883d4fe 100644 --- a/docs/source/options/server.rst +++ b/docs/source/options/server.rst @@ -2,6 +2,6 @@ Server ========= .. argparse:: - :filename: ../onmt/bin/server.py + :filename: ../mammoth/bin/server.py :func: _get_parser :prog: server.py \ No newline at end of file diff --git a/docs/source/options/train.rst b/docs/source/options/train.rst index 67dc1cb2..066aa160 100644 --- a/docs/source/options/train.rst +++ b/docs/source/options/train.rst @@ -2,6 +2,6 @@ Train ===== .. argparse:: - :filename: ../onmt/bin/train.py + :filename: ../mammoth/bin/train.py :func: _get_parser :prog: train.py \ No newline at end of file diff --git a/docs/source/options/translate.rst b/docs/source/options/translate.rst index db0423a4..4b6244b7 100644 --- a/docs/source/options/translate.rst +++ b/docs/source/options/translate.rst @@ -2,6 +2,6 @@ Translate ========= .. argparse:: - :filename: ../onmt/bin/translate.py + :filename: ../mammoth/bin/translate.py :func: _get_parser :prog: translate.py \ No newline at end of file
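The bulk of this diff is a mechanical rewrite of dotted `onmt.` module paths to `mammoth.`. A rename of this kind can be sketched as below; this is illustrative only, and a real run would still need hunk-by-hunk review, since names like the `onmt_tokenize` transform and bare occurrences such as `COPY onmt ./onmt` must be handled separately:

```python
import re

def rename_module_paths(text: str) -> str:
    """Rewrite dotted `onmt.` package prefixes to `mammoth.`.

    Only the dotted prefix is touched, so identifiers like
    `onmt_tokenize` (a transform *name*, not a module path) survive.
    """
    return re.sub(r"\bonmt\.", "mammoth.", text)

sample = (
    "Class: `onmt.transforms.misc.FilterTooLongTransform`\n"
    "Transform name: `onmt_tokenize`\n"
)
print(rename_module_paths(sample))
# `onmt.` becomes `mammoth.`; `onmt_tokenize` is left untouched.
```

Restricting the pattern to the dotted prefix is the key design choice: a blanket `onmt` → `mammoth` substitution would silently corrupt transform names and filesystem paths that happen to contain the old package name.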