Starred repositories
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Models and examples built with TensorFlow
scikit-learn: machine learning in Python
🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
Keras implementation of Transformers for humans
Real-time interactive streaming digital human
Implementation of BERT that can load the official pre-trained models for feature extraction and prediction
Kashgari is a production-level NLP transfer learning framework built on top of tf.keras for text labeling and text classification; it includes Word2Vec, BERT, and GPT2 language embeddings.
Chinese long-text classification, short-sentence classification, multi-label classification, and sentence-pair similarity (Chinese text classification with Keras NLP: multi-label or sentence classification, long or short), plus base classes for building word/character/sentence embedding layers (embeddings) and network layers (graph); FastText, TextCNN, CharCNN, TextRNN, …
A Keras TensorFlow 2.0 implementation of BERT, ALBERT and adapter-BERT.
Transformer-based models implemented in TensorFlow 2.x (using Keras).