Pretrained Language Model

This repository provides the latest pretrained language models and their related optimization techniques developed by Huawei Noah's Ark Lab.

Directory structure

  • NEZHA-TensorFlow is a pretrained Chinese language model, implemented in TensorFlow, that achieves state-of-the-art performance on several Chinese NLP tasks.
  • NEZHA-PyTorch is the PyTorch version of NEZHA.
  • TinyBERT is a compressed BERT model that is 7.5x smaller and 9.4x faster at inference.