Stars
A family of open-source Mixture-of-Experts (MoE) Large Language Models
Survey: A collection of AWESOME papers and resources on the latest research in Mixture of Experts.
A curated reading list of research in Mixture-of-Experts (MoE).
Reading list for research topics in multimodal machine learning
✨✨Latest Advances on Multimodal Large Language Models
🎓 Automatically updates multimodal and computational argumentation papers daily using GitHub Actions (refreshed every 12 hours)
Paper list about multimodal and large language models, used only to record papers I read from the daily arXiv for personal reference.
🎓 Automatically updates LLM inference systems papers daily using GitHub Actions (refreshed every 12 hours)
Code for "DAMEX: Dataset-aware Mixture-of-Experts for visual understanding of mixture-of-datasets", accepted at Neurips 2023 (Main conference).
Awesome LLM compression research papers and tools.
[NeurIPS 24] MoE Jetpack: From Dense Checkpoints to Adaptive Mixture of Experts for Vision Tasks
The official implementation of the paper "Demystifying the Compression of Mixture-of-Experts Through a Unified Framework".
Official implementation of "MMNeuron: Discovering Neuron-Level Domain-Specific Interpretation in Multimodal Large Language Model". Our codes are borrowed from Tang's language specific neurons imple…
Mitigating Modality Prior-Induced Hallucinations in Multimodal Large Language Models via Deciphering Attention Causality
obananas / MemVR
Forked from 1zhou-Wang/MemVR. Official implementation of the paper 'Look Twice Before You Answer: Memory-Space Visual Retracing for Hallucination Mitigation in Multimodal Large Language Models'.
This project aims to share the technical principles behind large language models as well as hands-on experience (LLM engineering and real-world LLM application deployment).
MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts
[Preprint] Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models
【EMNLP 2024🔥】Video-LLaVA: Learning United Visual Representation by Alignment Before Projection
[NeurIPS 2024] Code for Dual Prototype Evolving for Test-Time Generalization of Vision-Language Models
Codebase for Aria - an Open Multimodal Native MoE
Official implementation of paper 'Look Twice Before You Answer: Memory-Space Visual Retracing for Hallucination Mitigation in Multimodal Large Language Models'.
An open source implementation of LFMs from Liquid AI: Liquid Foundation Models
obananas / insightface
Forked from deepinsight/insightface. State-of-the-art 2D and 3D Face Analysis Project.
obananas / Incomplete-multi-view-clustering
Forked from Jeaninezpp/Awesome-Incomplete-multi-view-clustering. A collection of incomplete multi-view clustering papers.
obananas / GNNPapers
Forked from thunlp/GNNPapers. Must-read papers on graph neural networks (GNN).
obananas / machinelearning
Forked from ljpzzz/machinelearning. My blogs and code for machine learning. http://cnblogs.com/pinard
obananas / awesome-awesome-machine-learning
Forked from ZhiningLiu1998/awesome-machine-learning-resources. A curated list of awesome lists across all machine learning topics (learning paradigms, tasks, applications, models, ethics, interdisciplinary areas, datasets, frameworks, tutorials).
obananas / GitHubDaily
Forked from GitHubDaily/GitHubDaily. Consistently sharing high-quality, interesting, and practical open-source tutorials, developer tools, programming websites, and tech news from GitHub.