Updated documentation for LM finetuning script
LysandreJik committed Sep 2, 2019
1 parent 3fbf301 commit 7f52243
Showing 2 changed files with 10 additions and 6 deletions.
6 changes: 5 additions & 1 deletion docs/source/examples.rst
@@ -459,14 +459,16 @@ The same options as in the original scripts are provided; please refer to the code


Causal LM fine-tuning on GPT/GPT-2, Masked LM fine-tuning on BERT/RoBERTa
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Before running the following examples, you should download the `WikiText-2 dataset <https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/>`__ and unpack it to some directory `$WIKITEXT_2_DATASET`.
The following results were obtained using the `raw` WikiText-2 (no tokens were replaced before tokenization).
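
A minimal sketch of the download-and-unpack step (the S3 URL below is the dataset's original hosting location and may have moved; the ``wikitext-2-raw`` directory name is what the archive unpacked to at the time):

.. code-block:: bash

    # Fetch and unpack the raw WikiText-2 archive (URL may have moved)
    wget https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-2-raw-v1.zip
    unzip wikitext-2-raw-v1.zip
    # Point the examples at the unpacked directory
    export WIKITEXT_2_DATASET=$PWD/wikitext-2-raw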

This example fine-tunes GPT-2 on the WikiText-2 dataset. The loss is the causal language modeling loss; evaluation reports perplexity (the exponentiated loss).

.. code-block:: bash
+export WIKITEXT_2_DATASET=/path/to/wikitext_dataset
python run_lm_finetuning.py
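
The command above is truncated in this view. A plausible full invocation is sketched below; the flag names are taken from the ``run_lm_finetuning.py`` script of this period, while the output directory and the ``wiki.train.raw``/``wiki.test.raw`` file names are assumptions:

.. code-block:: bash

    export WIKITEXT_2_DATASET=/path/to/wikitext_dataset

    # Sketch of a full causal-LM fine-tuning run; paths and output dir are assumed
    python run_lm_finetuning.py \
        --output_dir=output \
        --model_type=gpt2 \
        --model_name_or_path=gpt2 \
        --do_train \
        --train_data_file=$WIKITEXT_2_DATASET/wiki.train.raw \
        --do_eval \
        --eval_data_file=$WIKITEXT_2_DATASET/wiki.test.raw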
@@ -485,6 +487,8 @@ This example fine-tunes RoBERTa on the WikiText-2 dataset. The loss function is
The `--mlm` flag is necessary to fine-tune BERT/RoBERTa on masked language modeling.

.. code-block:: bash
+export WIKITEXT_2_DATASET=/path/to/wikitext_dataset
python run_lm_finetuning.py
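
This command is likewise truncated; a hedged sketch of the full masked-LM invocation, under the same assumptions as above and with the ``--mlm`` flag the text calls out:

.. code-block:: bash

    export WIKITEXT_2_DATASET=/path/to/wikitext_dataset

    # Sketch of a masked-LM fine-tuning run; paths and output dir are assumed
    python run_lm_finetuning.py \
        --output_dir=output \
        --model_type=roberta \
        --model_name_or_path=roberta-base \
        --do_train \
        --train_data_file=$WIKITEXT_2_DATASET/wiki.train.raw \
        --do_eval \
        --eval_data_file=$WIKITEXT_2_DATASET/wiki.test.raw \
        --mlm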
10 changes: 5 additions & 5 deletions docs/source/model_doc/distilbert.rst
@@ -2,35 +2,35 @@ DistilBERT
----------------------------------------------------

``DistilBertConfig``
-~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: pytorch_transformers.DistilBertConfig
:members:


``DistilBertTokenizer``
-~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: pytorch_transformers.DistilBertTokenizer
:members:


``DistilBertModel``
-~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: pytorch_transformers.DistilBertModel
:members:


``DistilBertForMaskedLM``
-~~~~~~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: pytorch_transformers.DistilBertForMaskedLM
:members:


``DistilBertForSequenceClassification``
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: pytorch_transformers.DistilBertForSequenceClassification
:members:
