Fix dead links (PaddlePaddle#491)
* fix dead links

* fix dead links
yingyibiao authored Jun 4, 2021
1 parent 9f9f0b1 commit f50b3ca
Showing 3 changed files with 3 additions and 4 deletions.
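Dead relative links like the ones fixed in this commit can be caught automatically before they ship. A minimal sketch (a hypothetical helper, not part of PaddleNLP's tooling) that extracts markdown link targets and reports relative ones whose file does not exist:

```python
import re
from pathlib import Path

# Matches the destination of an inline markdown link: [text](target)
LINK_RE = re.compile(r"\[[^\]]*\]\(([^)#\s]+)\)")

def dead_links(markdown: str, base_dir: str = ".") -> list:
    """Return relative link targets in `markdown` that do not exist under base_dir."""
    missing = []
    for target in LINK_RE.findall(markdown):
        if target.startswith(("http://", "https://", "mailto:")):
            continue  # only check relative, on-disk targets
        if not (Path(base_dir) / target).exists():
            missing.append(target)
    return missing
```

Running such a check in CI over every `*.md` file would have flagged `./docs/transformers.md` as soon as the file moved to `./docs/model_zoo/transformers.rst`.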
2 changes: 1 addition & 1 deletion README.md
@@ -151,7 +151,7 @@ PaddleNLP, built on the new PaddlePaddle 2.0 API system, provides rich application-scenario
| :--------------- | ---------- |
| [SimNet](./examples/text_matching/simnet/) | A semantic matching framework proposed by Baidu that uses core networks such as BOW, CNN, and GRNN as the representation layer; suited to semantic matching scenarios such as information retrieval, news recommendation, and intelligent customer service. |
| [ERNIE](./examples/text_matching/ernie_matching/) | Chinese sentence-pair matching on the LCQMC dataset based on ERNIE, with both Pointwise and Pairwise learning modes. |
- | [Sentence-BERT](./examples/text_matching/sentence_transformer/) | A [Sentence-BERT](https://arxiv.org/abs/1908.1008) text matching model with a Siamese dual-tower structure, usable for obtaining sentence embeddings from Transformer pretrained models. |
+ | [Sentence-BERT](./examples/text_matching/sentence_transformers/) | A [Sentence-BERT](https://arxiv.org/abs/1908.1008) text matching model with a Siamese dual-tower structure, usable for obtaining sentence embeddings from Transformer pretrained models. |
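The Siamese dual-tower idea behind Sentence-BERT can be illustrated with a toy encoder (a pure-Python sketch, not the PaddleNLP implementation): the *same* encoder maps each sentence to a vector, and the match score is the similarity of the two vectors. Here a bag-of-words counter stands in for the Transformer tower:

```python
import math
from collections import Counter

def encode(sentence: str) -> Counter:
    # Toy "tower": bag-of-words counts stand in for a Transformer encoder.
    return Counter(sentence.lower().split())

def cosine(u: Counter, v: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(u[t] * v[t] for t in u)
    norm = math.sqrt(sum(c * c for c in u.values())) * math.sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

def match_score(a: str, b: str) -> float:
    # Dual tower: both inputs go through the SAME encoder; scoring is
    # just vector similarity, so sentence embeddings can be precomputed.
    return cosine(encode(a), encode(b))
```

This is why the dual-tower design suits retrieval: embeddings for a corpus can be computed once, and matching reduces to a cheap similarity lookup.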

#### Semantic Indexing

4 changes: 2 additions & 2 deletions README_en.md
@@ -90,7 +90,7 @@ electra = ElectraModel.from_pretrained('chinese-electra-small')
gpt = GPTForPretraining.from_pretrained('gpt-cpm-large-cn')
```

- For more pretrained model selection, please refer to [Transformer API](./docs/transformers.md)
+ For more pretrained model selection, please refer to [Transformer API](./docs/model_zoo/transformers.rst)

### Extract Feature Through Pre-trained Model

@@ -128,7 +128,7 @@ For model zoo introduction please refer to [PaddleNLP Model Zoo](./docs/model_zoo

## API Usage

- - [Transformer API](./docs/transformers.md)
+ - [Transformer API](./docs/model_zoo/transformers.rst)
- [Data API](./docs/data.md)
- [Dataset API](./docs/datasets.md)
- [Embedding API](./docs/embeddings.md)
1 change: 0 additions & 1 deletion docs/model_zoo.md
@@ -25,7 +25,6 @@ PaddleNLP provides a rich set of model architectures, including classic RNN-style models,
| ------ | ------ |
| [Transformer](../examples/machine_translation/transformer/) | [Attention Is All You Need](https://arxiv.org/abs/1706.03762) |
| [Transformer-XL](../examples/language_model/transformer-xl/) | [Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context](https://arxiv.org/abs/1901.02860) |
- | [ALBERT](../examples/language_model/albert/) | [ALBERT: A Lite BERT for Self-supervised Learning of Language Representations](https://arxiv.org/abs/1909.11942) |
| [BERT](../examples/language_model/bert/) | [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805) |
| [ERNIE](../examples/text_classification/pretrained_models) | [ERNIE: Enhanced Representation through Knowledge Integration](https://arxiv.org/abs/1904.09223) |
| [ERNIE-Tiny](../examples/text_classification/pretrained_models) | A compact ERNIE architecture developed in-house at Baidu: a shallow Transformer with widened hidden layers and a Chinese subword-granularity vocabulary, combined with distillation, improving over SOTA Before BERT by 8.35% with a 4.3x speedup. |
