liux520's Stars

MOE (2 repositories)

PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538

Python · 1,044 stars · 107 forks · Updated Apr 19, 2024
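Since this repository re-implements the paper's sparsely-gated layer, a minimal PyTorch sketch of the paper's noisy top-k gating may help situate what it provides: each token's logits are perturbed with learned noise, only the k largest are kept, and the softmax is renormalized over those k experts. This is an illustration of the technique from Shazeer et al. (2017), not the repository's actual code; the class name `NoisyTopKGate` and all hyperparameters below are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class NoisyTopKGate(nn.Module):
    """Noisy top-k gating, per the paper:
    H(x) = x @ W_g + N(0, 1) * softplus(x @ W_noise),
    G(x) = softmax over only the top-k entries of H(x)."""

    def __init__(self, d_model: int, num_experts: int, k: int = 2):
        super().__init__()
        self.w_gate = nn.Linear(d_model, num_experts, bias=False)
        self.w_noise = nn.Linear(d_model, num_experts, bias=False)
        self.k = k

    def forward(self, x: torch.Tensor):
        clean_logits = self.w_gate(x)  # [batch, num_experts]
        if self.training:
            # Learned, input-dependent noise scale encourages load balancing.
            noise_std = F.softplus(self.w_noise(x))
            logits = clean_logits + torch.randn_like(clean_logits) * noise_std
        else:
            logits = clean_logits
        # Keep only the k largest logits per token.
        topk_vals, topk_idx = logits.topk(self.k, dim=-1)
        # Renormalize over the selected experts; all others get zero weight,
        # so each token is routed to exactly k experts.
        weights = F.softmax(topk_vals, dim=-1)
        gates = torch.zeros_like(logits).scatter(-1, topk_idx, weights)
        return gates, topk_idx


# Tiny usage check: route a batch of 4 token embeddings to 2 of 8 experts.
gate = NoisyTopKGate(d_model=16, num_experts=8, k=2)
gates, expert_ids = gate(torch.randn(4, 16))
print(gates.shape, expert_ids.shape)  # torch.Size([4, 8]) torch.Size([4, 2])
```

Because the gate output is zero for all but k experts, only those experts' feed-forward networks need to run per token, which is what makes the layer's capacity sparse.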

Tutel MoE: An Optimized Mixture-of-Experts Implementation

Python · 766 stars · 96 forks · Updated Feb 20, 2025