source link: https://github.com/huawei-noah/Pretrained-Language-Model

README.md

Pretrained Language Model

This repository provides the latest pretrained language models and their related optimization techniques developed by Huawei Noah's Ark Lab.

Directory structure

  • NEZHA is a pretrained Chinese language model that achieves state-of-the-art performance on several Chinese NLP tasks.
  • TinyBERT is a compressed BERT model that is 7.5x smaller and 9.4x faster at inference.
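Compressed models like TinyBERT are typically trained with knowledge distillation, where a small student network learns to match the softened output distribution of a large teacher. The sketch below illustrates that core idea in plain Python; the function names and temperature value are illustrative assumptions, not code from this repository.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperature softens the distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened distribution (targets)
    # and the student's softened distribution: the soft-target loss used
    # in knowledge distillation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

teacher = [3.0, 1.0, 0.2]
# A student that matches the teacher incurs a lower loss than one that
# ranks the classes in the opposite order.
matched = distillation_loss(teacher, teacher)
mismatched = distillation_loss(teacher, [0.2, 1.0, 3.0])
```

In practice the soft-target loss is combined with the ordinary hard-label cross-entropy, and TinyBERT additionally distills intermediate-layer representations, but the temperature-softened matching shown here is the common foundation.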
