Commit 2b2d900

Use relative links with .md extension (automatically converted to .html for github-pages) (awslabs#979)

fhieber authored Dec 1, 2021
1 parent c44f126 commit 2b2d900
Showing 10 changed files with 22 additions and 34 deletions.
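The point of the change: with relative links enabled in `_config.yml`, GitHub Pages rewrites relative `.md` link targets to the `.html` files that Jekyll actually serves, so the same links work both in the repository view on github.com and on the published site. A minimal Python sketch of that rewriting behavior (an illustration only, not the plugin's actual implementation):

```python
import re

def rewrite_relative_links(markdown: str) -> str:
    """Rewrite relative Markdown link targets ending in .md (with an
    optional #anchor) to .html; absolute http(s) URLs are left alone."""
    pattern = re.compile(r"\((?!https?://)([^)\s]+)\.md(#[^)\s]*)?\)")
    return pattern.sub(lambda m: f"({m.group(1)}.html{m.group(2) or ''})",
                       markdown)

print(rewrite_relative_links("[guidelines](development.md)"))
# → [guidelines](development.html)
```

This is why the commit can change every cross-reference from `.html` to `.md`: the `.md` form is correct in the repository, and the site build converts it back.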
4 changes: 4 additions & 0 deletions docs/_config.yml

@@ -6,3 +6,7 @@ markdown: kramdown
 highlighter: rouge
 url: https://awslabs.github.io
 base_url: /sockeye
+relative-links:
+  enabled: true
+include:
+  - tutorials
6 changes: 1 addition & 5 deletions docs/index.md

@@ -1,7 +1,3 @@
----
-layout: default
----
-
 # Sockeye
 
 [![PyPI version](https://badge.fury.io/py/sockeye.svg)](https://badge.fury.io/py/sockeye)
@@ -19,7 +15,7 @@ For a quickstart guide to training a standard NMT model on any size of data, see
 
 If you are interested in collaborating or have any questions, please submit a pull request or [issue](https://github.com/awslabs/sockeye/issues/new).
 You can also send questions to *sockeye-dev-at-amazon-dot-com*.
-Developers may be interested in [our developer guidelines](development.html).
+Developers may be interested in [our developer guidelines](development.md).
 
 ## Citation
8 changes: 2 additions & 6 deletions docs/inference.md

@@ -1,7 +1,3 @@
----
-layout: default
----
-
 # Translation
 
 Decoding (a.k.a. inference or translation) in sockeye is made available through the `sockeye.translate` module.
@@ -41,12 +37,12 @@ The PNG files will be written to files beginning with the prefix given by the `-
 
 ## Source factors
 
-If your [model was trained with source factors](training.html#source-factors), you will need to supply them at test-time, too.
+If your [model was trained with source factors](training.md#source-factors), you will need to supply them at test-time, too.
 Factors can be provided in three formats: (a) separate, token-parallel files (as in training), (b) direct annotations on words, or (c) in a JSON object.
 
 ### Parallel files
 
-You can also provide parallel files, [in the same style as training](training.html#source-factors).
+You can also provide parallel files, [in the same style as training](training.md#source-factors).
 Factor files are token-parallel to the source and are passed in to `sockeye.translate` via the `--input-factors` flag.
 (In this scenario, the source is another file, passed via `--input`).
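Factor files in format (a) must be token-parallel to the source file. A quick sanity check one can run before invoking `sockeye.translate` (a hypothetical helper for illustration, not part of Sockeye):

```python
def factors_are_token_parallel(source_lines, factor_lines):
    """True iff the factor file has one line per source line and every
    factor line has exactly as many tokens as its source line -- the
    requirement for files passed via --input-factors."""
    return (len(source_lines) == len(factor_lines)
            and all(len(s.split()) == len(f.split())
                    for s, f in zip(source_lines, factor_lines)))

# Example with hypothetical POS-tag factors for a two-token sentence:
print(factors_are_token_parallel(["the house"], ["DET NOUN"]))
# → True
```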
4 changes: 0 additions & 4 deletions docs/scoring.md

@@ -1,7 +1,3 @@
----
-layout: default
----
-
 # Scoring existing translations
 
 Sockeye provides a fast scoring module that permits the scoring of existing translations.
8 changes: 2 additions & 6 deletions docs/training.md

@@ -1,7 +1,3 @@
----
-layout: default
----
-
 # Training
 
 ## Data preparation
@@ -110,7 +106,7 @@ In this case a drop in training throughput is expected.
 
 #### Multi-GPU training
 Training can be carried out on multiple GPUs. See the
-[WMT 2014 English-German tutorial](https://awslabs.github.io/sockeye/tutorials/wmt_large.html) for more information.
+[WMT 2014 English-German tutorial](tutorials/wmt_large.md) for more information.
 
 ### Checkpoint averaging
@@ -152,7 +148,7 @@ Since these embeddings concatenated to those of the word embeddings, the total s
 You can also sum the embeddings (`--source-factors-combine sum`).
 In this case, you do not need to specify `--source-factors-num-embed`, since they are automatically all set to the size of the word embeddings (`--num-embed`).
-You then also have to apply factors for the source side [at inference time](inference.html#source-factors).
+You then also have to apply factors for the source side [at inference time](inference.md#source-factors).
 
 ## Target factors
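The two factor-combination modes in the training docs differ only in how the factor embeddings enter the encoder input. A small pure-Python sketch of the size arithmetic (illustrative sizes and dummy values, not Sockeye code):

```python
num_embed = 8                  # word embedding size (--num-embed); illustrative
word = [0.1] * num_embed       # word embedding vector (dummy values)
factor = [0.2] * num_embed     # factor embedding; with sum-combine it must match num_embed

# --source-factors-combine concat: total input size grows by each factor's size
concat = word + factor
assert len(concat) == 2 * num_embed

# --source-factors-combine sum: element-wise sum, total size stays num_embed
summed = [w + f for w, f in zip(word, factor)]
assert len(summed) == num_embed
```

This is why `--source-factors-num-embed` is only needed for concatenation: with summing, every factor's embedding size is forced to `--num-embed`.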
12 changes: 6 additions & 6 deletions docs/tutorials.md

@@ -2,16 +2,16 @@
 
 ## Setup
 
-For installing Sockeye follow the [installation instructions](setup.html) to manually install Sockeye and all dependencies.
+For installing Sockeye follow the [installation instructions](setup.md) to manually install Sockeye and all dependencies.
 The tutorials below might have additional dependencies that will be mentioned at the beginning of each tutorial.
 
 ## Tutorials
 
 Below is the full list of tutorials we provide. We recommend going through them in order as they will gradually
 introduce different concepts and parameters used for training and translation.
 
-1. [Sequence copy task](tutorials/seqcopy.html)
-1. [WMT German to English news translation](tutorials/wmt.html)
-1. [Domain adaptation of NMT models](tutorials/adapt.html)
-1. [Large data: WMT English-German 2014](tutorials/wmt_large.html)
-1. [Multilingual Zero-shot Translation IWSLT 2017](tutorials/multilingual.html)
+1. [Sequence copy task](tutorials/seqcopy_tutorial.md)
+1. [WMT German to English news translation](tutorials/wmt.md)
+1. [Domain adaptation of NMT models](tutorials/adapt.md)
+1. [Large data: WMT English-German 2014](tutorials/wmt_large.md)
+1. [Multilingual Zero-shot Translation IWSLT 2017](tutorials/multilingual.md)
2 changes: 1 addition & 1 deletion docs/tutorials/adapt.md

@@ -4,7 +4,7 @@ Although the quality of machine translation systems is nowadays remarkably good,
 These customizations may include preferring some word translation over others or adapting the style of the text, among others.
 In this tutorial, we show two methods on how to perform domain adaptation of a general translation system using Sockeye.
 
-We assume you already have a trained Sockeye model, for example the one trained from the [WMT tutorial tutorial](wmt.html).
+We assume you already have a trained Sockeye model, for example the one trained from the [WMT tutorial](wmt.md).
 We also assume that you have two training sets, one composed of general or out-of-domain (OOD) data, and one composed of in-domain (ID) data on which you want to adapt your system.
 Note that both datasets need to be pre-processed in the same way.
8 changes: 4 additions & 4 deletions docs/tutorials/multilingual.md

@@ -3,9 +3,9 @@
 In this tutorial we will train a multilingual Sockeye model that can translate between several language pairs,
 including ones that we did not have training data for (this is called _zero-shot translation_).
 
-Please note: this tutorial assumes that you are familiar with the introductory tutorials on [copying
-sequences](https://awslabs.github.io/sockeye/tutorials/seqcopy.html)
-and [training a standard WMT model](https://awslabs.github.io/sockeye/tutorials/wmt.html).
+Please note: this tutorial assumes that you are familiar with the introductory tutorials on
+[copying sequences](seqcopy_tutorial.md)
+and [training a standard WMT model](wmt.md).
 
 ## Approach
 
@@ -42,7 +42,7 @@ virtualenv -p python3 sockeye3
 source sockeye3/bin/activate
 ```
 
-Then [install the correct version of Sockeye](https://awslabs.github.io/sockeye/setup.html).
+Then [install the correct version of Sockeye](../setup.md).
 We also install several libraries for preprocessing, monitoring and evaluation:
 
 ```bash
2 changes: 1 addition & 1 deletion docs/tutorials/seqcopy_tutorial.md

@@ -6,7 +6,7 @@ The task is then to train a model that copies the sequence from the source to th
 This task is on the one hand difficult enough to be interesting and on the other and allows for quickly training a model.
 
 ## Setup
-For this tutorial we assume that you have successfully [installed](../setup.html) Sockeye.
+For this tutorial we assume that you have successfully [installed](../setup.md) Sockeye.
 We will be using scripts from the Sockeye repository, so you should either clone the repository or manually download the scripts.
 Just as a reminder: Everything is run using Python 3, so depending on your setup you may have to replace `python` with `python3` below.
 All of the commands below assume you are running on a CPU.
2 changes: 1 addition & 1 deletion docs/tutorials/wmt.md

@@ -26,7 +26,7 @@ pip install tensorboard
 
 All of the commands below assume you're running on a CPU.
 If you have a GPU available you can simply remove `--use-cpu`.
-With multiple GPUs you can use `torchrun` to spawn multiple training processes (see [WMT 2014 English-German tutorial](https://awslabs.github.io/sockeye/tutorials/wmt_large.html)).
+With multiple GPUs you can use `torchrun` to spawn multiple training processes (see [WMT 2014 English-German tutorial](wmt_large.md)).
 
 ## Data
