Stars
A Python implementation of global optimization with Gaussian processes.
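As a rough illustration of the underlying idea (not this repository's API), below is a minimal sketch of Gaussian-process-based global optimization using scikit-learn's GaussianProcessRegressor with an upper-confidence-bound acquisition rule; the objective function and search bounds are placeholders.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Placeholder black-box function to maximize.
    return -(x - 2.0) ** 2 + 3.0

rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(3, 1))   # a few random initial samples
y = np.array([objective(v[0]) for v in X])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
candidates = np.linspace(-5, 5, 200).reshape(-1, 1)

for _ in range(10):
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # Upper confidence bound: sample next where mean + 2*std is largest.
    x_next = candidates[np.argmax(mu + 2.0 * sigma)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best x:", X[np.argmax(y)][0], "best value:", y.max())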
Experimental Research Computing documentation generation using mkdocs
Quick, visual, principled introduction to PyTorch code through five Colab notebooks.
Code for NAACL 2018 paper "Multi-task Learning of Pairwise Sequence Classification Tasks Over Disparate Label Spaces" by Isabelle Augenstein, Sebastian Ruder, Anders Søgaard
Multitask Learning with Pretrained Transformers
Code for "Learning Sparse Sharing Architectures for Multiple Tasks"
word2vec, sentence2vec, machine reading comprehension, dialog system, text classification, pretrained language models (e.g., XLNet, BERT, ELMo, GPT), sequence labeling, information retrieval, inform…
glaserL / conll
Forked from acoli-repo/conll-merge
Tools for manipulating CoNLL TSV and related formats
❤️ Course resources of the School of Computer Science, University of Science and Technology of China (https://mbinary.xyz/ustc-cs/)
Pytorch-Named-Entity-Recognition-with-BERT
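For orientation only, here is a minimal sketch of BERT token classification (NER) using the Hugging Face transformers API rather than this repository's code; the model name and label set are placeholder assumptions, and the untrained classification head will not produce meaningful tags.

import torch
from transformers import BertTokenizerFast, BertForTokenClassification

labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]   # placeholder tag set
tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
model = BertForTokenClassification.from_pretrained("bert-base-cased", num_labels=len(labels))

encoding = tokenizer("Barack Obama visited Paris", return_tensors="pt")
with torch.no_grad():
    logits = model(**encoding).logits             # shape: (1, seq_len, num_labels)

predictions = logits.argmax(dim=-1).squeeze(0)
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0].tolist())
for token, pred in zip(tokens, predictions):
    # With an untrained head these tags are arbitrary; fine-tuning on labeled
    # NER data is what makes them meaningful.
    print(token, labels[pred.item()])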
Reference TensorFlow code for named entity tagging
Multiple Different Natural Language Processing Tasks in a Single Deep Model
BERT fine-tuning for POS tagging task (Keras)
Neural Network and Deep Learning (《神经网络与深度学习》), by Xipeng Qiu
This project collects the knowledge points and code implementations frequently asked about in Machine Learning, Deep Learning, and NLP interviews, which are also the theoretical fundamentals every algorithm engineer should master.
A deep learning interview handbook (covering mathematics, machine learning, deep learning, computer vision, natural language processing, SLAM, and related areas)
A short tutorial on ELMo training (pretrained models, training on new data, incremental training)
A seq2seq model whose encoder is BERT and whose decoder is a Transformer decoder; it can be used for text generation tasks in natural language processing.
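As a minimal sketch of that encoder-decoder wiring, assuming the Hugging Face transformers BertModel as the encoder and PyTorch's nn.TransformerDecoder as the decoder (model name, layer counts, and vocabulary handling are illustrative, not this repository's implementation):

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class Bert2Transformer(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", num_decoder_layers=6):
        super().__init__()
        self.encoder = BertModel.from_pretrained(bert_name)
        hidden = self.encoder.config.hidden_size
        vocab = self.encoder.config.vocab_size
        self.tgt_embed = nn.Embedding(vocab, hidden)
        layer = nn.TransformerDecoderLayer(d_model=hidden, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=num_decoder_layers)
        self.lm_head = nn.Linear(hidden, vocab)

    def forward(self, src_ids, src_mask, tgt_ids):
        # BERT encodes the source; its hidden states serve as decoder "memory".
        memory = self.encoder(input_ids=src_ids, attention_mask=src_mask).last_hidden_state
        tgt = self.tgt_embed(tgt_ids)
        # Causal mask: each target position attends only to earlier positions.
        L = tgt_ids.size(1)
        causal = torch.triu(torch.full((L, L), float("-inf")), diagonal=1)
        out = self.decoder(tgt, memory, tgt_mask=causal)
        return self.lm_head(out)                  # (batch, tgt_len, vocab_size)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = Bert2Transformer()
batch = tokenizer(["一个示例输入句子"], return_tensors="pt")   # example source sentence
tgt_ids = batch["input_ids"][:, :-1]              # teacher forcing: shifted target
logits = model(batch["input_ids"], batch["attention_mask"], tgt_ids)
print(logits.shape)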