Wild Chicken University
Stars
🍒 Cherry Studio is a desktop client that supports multiple LLM providers, including deepseek-r1.
Make websites accessible for AI agents
Official Implementation of the ICML 2023 paper "A Generalization of ViT/MLP-Mixer to Graphs"
Fast and memory-efficient exact attention
Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series.
Fully open reproduction of DeepSeek-R1
An implementation of local windowed attention for language modeling
A concise but complete full-attention transformer with a set of promising experimental features from various papers
DSPy: The framework for programming—not prompting—language models
This repository contains the official code for Energy Transformer, an efficient energy-based Transformer variant for graph classification
The Energy Transformer block, in JAX
🎨 ML Visuals contains figures and templates which you can reuse and customize to improve your scientific writing.
Stanford NLP Python library for Representation Finetuning (ReFT)
A Unified Library for Parameter-Efficient and Modular Transfer Learning
Agent framework and applications built upon Qwen>=2.0, featuring Function Calling, Code Interpreter, RAG, and Chrome extension.
Development repository for the Triton language and compiler
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
A curated list of awesome resources on LLMs for Autonomous Driving (continually updated)
[WACV 2024 Survey Paper] Multimodal Large Language Models for Autonomous Driving
LimSim & LimSim++: Integrated traffic and autonomous driving simulators with (M)LLM support
Official Implementation for DeepHGCN: Toward Deeper Hyperbolic Graph Convolutional Network
Fine-tuning Vision Transformers on various classification datasets
Official implementation of the ICLR'24 paper "Curiosity-driven Red Teaming for Large Language Models" (https://openreview.net/pdf?id=4KqkizXgXU)