MOE: 2 starred repositories
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. (https://arxiv.org/abs/1701.06538); a minimal sketch of the gating idea follows this list.
Tutel MoE: An Optimized Mixture-of-Experts Implementation