Stayhome Polytechnic
Stars
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Original reference implementation of "3D Gaussian Splatting for Real-Time Radiance Field Rendering".
RWKV (pronounced RwaKuv) is an RNN with great LLM performance that can also be trained directly like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". So it's combining the best of RN…
A collection of Variational Autoencoders (VAEs) in PyTorch.
SF3D: Stable Fast 3D Mesh Reconstruction with UV-Unwrapping and Illumination Disentanglement.
A package for computing causal effects of text (as treatment).
Interpreting Language Models with Contrastive Explanations (EMNLP 2022 Best Paper Honorable Mention).
A package for defining computation graphs and performing intervention experiments.
NeurIPS 2022 paper "FR: Folded rationalization with a unified encoder".
ACL 2023 oral paper "MGR: Multi-generator based Rationalization".
KDD 2023 paper "Decoupled Rationalization with Asymmetric Learning Rates: A Flexible Lipschitz Restraint".