🤗 smolagents: a barebones library for agents. Agents write Python code to call tools and orchestrate other agents.
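A minimal sketch of that code-agent pattern, using the `CodeAgent` and `HfApiModel` entry points from an early release of the library (newer versions may rename the model class):

```python
# Minimal smolagents sketch; CodeAgent and HfApiModel are taken from an
# early README and may differ in current releases.
from smolagents import CodeAgent, HfApiModel

# The agent answers by writing and executing Python code,
# calling any tools it has been given along the way.
agent = CodeAgent(tools=[], model=HfApiModel(), add_base_tools=True)
agent.run("How many seconds are there in a leap year?")
```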
State-of-the-Art Text Embeddings
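The typical sentence-transformers workflow as a short sketch; the `all-MiniLM-L6-v2` checkpoint is just one commonly used example:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # example checkpoint
embeddings = model.encode([
    "A query about embeddings",
    "A passage to compare against",
])
print(util.cos_sim(embeddings[0], embeddings[1]))  # cosine similarity
```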
Large Concept Models: Language modeling in a sentence representation space
A V2Ray client for Android, supporting Xray core and v2fly core
A collection of LogitsProcessors to customize and enhance LLM behavior for specific tasks.
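For context, this is the extension point such a collection builds on: a generic, hypothetical logits processor in the Hugging Face transformers style, not code from the repo itself:

```python
import torch
from transformers import LogitsProcessor, LogitsProcessorList

class ProbabilityFloorProcessor(LogitsProcessor):
    """Hypothetical example: forbid tokens whose probability is below a floor."""
    def __init__(self, floor: float = 1e-4):
        self.floor = floor

    def __call__(self, input_ids: torch.LongTensor,
                 scores: torch.FloatTensor) -> torch.FloatTensor:
        probs = scores.softmax(dim=-1)
        return scores.masked_fill(probs < self.floor, float("-inf"))

# Used at generation time, e.g.:
# model.generate(..., logits_processor=LogitsProcessorList([ProbabilityFloorProcessor()]))
```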
PyTorch implementation of the paper: Long-tail Learning via Logit Adjustment
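The method itself is a one-line change to the loss: offset each logit by τ times the log prior of its class. A sketch with illustrative names:

```python
import torch
import torch.nn.functional as F

def logit_adjusted_loss(logits, targets, class_counts, tau=1.0):
    # Class priors estimated from training-label frequencies.
    log_priors = torch.log(class_counts.float() / class_counts.sum())
    # Shifting logits by the log priors makes the model compensate for
    # the long-tailed label distribution during training.
    return F.cross_entropy(logits + tau * log_priors, targets)
```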
Winning solution for the Kaggle Feedback Prize Challenge.
A list of awesome academic research and industrial materials about Large Language Models (LLMs) and Artificial Intelligence for IT Operations (AIOps).
A machine learning compiler for GPUs, CPUs, and ML accelerators
Simulation of spiking neural networks (SNNs) using PyTorch.
A playbook for systematically maximizing the performance of deep learning models.
MinHash, LSH, LSH Forest, Weighted MinHash, HyperLogLog, HyperLogLog++, LSH Ensemble and HNSW
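A short sketch of the core MinHash + LSH workflow in datasketch: estimate Jaccard similarity from hashed token sets, then index the sketches for sub-linear candidate lookup:

```python
from datasketch import MinHash, MinHashLSH

def minhash(tokens, num_perm=128):
    m = MinHash(num_perm=num_perm)
    for t in tokens:
        m.update(t.encode("utf8"))
    return m

m1 = minhash("estimating jaccard similarity with minhash".split())
m2 = minhash("estimating jaccard similarity using minhash".split())
print(m1.jaccard(m2))  # approximate Jaccard similarity

lsh = MinHashLSH(threshold=0.5, num_perm=128)
lsh.insert("doc1", m1)
print(lsh.query(m2))   # candidate keys above the similarity threshold
```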
Generalist and Lightweight Model for Named Entity Recognition (extracts any entity type from text) @ NAACL 2024
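A sketch of GLiNER's zero-shot usage; the `urchade/gliner_base` checkpoint is one published model, chosen here only for illustration:

```python
from gliner import GLiNER

model = GLiNER.from_pretrained("urchade/gliner_base")
text = "Ada Lovelace worked with Charles Babbage in London."
labels = ["person", "location"]  # arbitrary entity types, no retraining needed
for ent in model.predict_entities(text, labels):
    print(ent["text"], "->", ent["label"])
```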
A JavaScript / TypeScript / Python / C# / PHP cryptocurrency trading API with support for more than 100 bitcoin/altcoin exchanges
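The point of ccxt is that one unified API covers every supported exchange; a sketch using Binance purely as an example:

```python
import ccxt

exchange = ccxt.binance()  # any supported exchange exposes the same methods
ticker = exchange.fetch_ticker("BTC/USDT")
print(ticker["last"])

# Hourly candles: [timestamp, open, high, low, close, volume]
ohlcv = exchange.fetch_ohlcv("BTC/USDT", timeframe="1h", limit=24)
```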
Retrieval and Retrieval-augmented LLMs
H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://docs.h2o.ai/h2o-llmstudio/
🦜🔗 Build context-aware reasoning applications
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
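A minimal LoRA sketch with PEFT; GPT-2 and its `c_attn` projection are used only as a small, concrete example:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")
config = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                    target_modules=["c_attn"])  # GPT-2's attention projection
model = get_peft_model(base, config)
model.print_trainable_parameters()  # only the small LoRA adapters train
```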
This project shares the technical principles behind large language models along with hands-on experience (LLM engineering and real-world LLM application deployment).
A high-throughput and memory-efficient inference and serving engine for LLMs
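Offline batch inference with vLLM takes a few lines; the OPT checkpoint here is just a small model for illustration:

```python
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # small model for illustration
params = SamplingParams(temperature=0.8, max_tokens=64)
outputs = llm.generate(["The capital of France is"], params)
print(outputs[0].outputs[0].text)
```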
PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
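The core mechanism of that paper, stripped of its noisy gating and load-balancing losses: route each input through the top-k experts and mix their outputs by the renormalized gate weights. An illustrative sketch, not the repo's code:

```python
import torch
import torch.nn as nn

class TopKMoE(nn.Module):
    """Minimal top-k gated mixture of experts (no noise or balancing losses)."""
    def __init__(self, dim, num_experts=8, k=2):
        super().__init__()
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_experts))
        self.gate = nn.Linear(dim, num_experts)
        self.k = k

    def forward(self, x):                        # x: (batch, dim)
        weights, idx = self.gate(x).topk(self.k, dim=-1)
        weights = weights.softmax(dim=-1)        # renormalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out
```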
Revisiting Long-term Time Series Forecasting: An Investigation on Linear Mapping
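The paper's finding is that a single linear map from the look-back window to the forecast horizon is a surprisingly strong baseline; a channel-independent sketch with illustrative names:

```python
import torch
import torch.nn as nn

class LinearForecaster(nn.Module):
    """One linear map from look-back window to horizon, shared across channels."""
    def __init__(self, seq_len=336, pred_len=96):
        super().__init__()
        self.proj = nn.Linear(seq_len, pred_len)

    def forward(self, x):  # x: (batch, seq_len, channels)
        # Apply the map along the time axis, independently per channel.
        return self.proj(x.transpose(1, 2)).transpose(1, 2)  # (batch, pred_len, channels)
```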
A professional list of Papers, Tutorials, and Surveys on AI for Time Series in top AI conferences and journals.
A professionally curated list of awesome resources (paper, code, data, etc.) on transformers in time series.
The PyTorch implementation of BasisFormer, from the NeurIPS paper "BasisFormer: Attention-based Time Series Forecasting with Learnable and Interpretable Basis"