Stars
Official implementation of NeurIPS 2024 "Visual Fourier Prompt Tuning"
Official PyTorch implementation of "E2VPT: An Effective and Efficient Approach for Visual Prompt Tuning". (ICCV 2023)
The official implementation for the paper "Revisiting the Power of Prompt for Visual Tuning".
[NeurIPS'22] This is an official implementation for "Scaling & Shifting Your Features: A New Baseline for Efficient Model Tuning".
This repo includes a curated collection of Claude prompts to help you use Claude more effectively.
[ICCV 2023 oral] This is the official repository for our paper "Sensitivity-Aware Visual Parameter-Efficient Fine-Tuning".
Official PyTorch implementation of "Facing the Elephant in the Room: Visual Prompt Tuning or Full Finetuning?" (ICLR 2024)
❄️🔥 Visual Prompt Tuning [ECCV 2022] https://arxiv.org/abs/2203.12119
This is the official repository for "Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts" (EMNLP 2022).
On Transferability of Prompt Tuning for Natural Language Processing
Official implementation of ORCA proposed in the paper "Cross-Modal Fine-Tuning: Align then Refine"
Collection of AWESOME vision-language models for vision tasks
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
Must-read papers on prompt-based tuning for pre-trained language models.
Code for Language-Interfaced Fine-Tuning for Non-Language Machine Learning Tasks.
PyTorch implementation of Contrastive Adaptation Network.
Use ChatGPT to summarize arXiv papers. Accelerate the entire research workflow with ChatGPT: full-paper summarization, professional translation, polishing, peer review, and review responses.
Official codebase for Pretrained Transformers as Universal Computation Engines.
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
An annotated implementation of the Transformer paper.
Code and documentation to train Stanford's Alpaca models, and generate the data.
An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
Prompt Learning for Vision-Language Models (IJCV'22, CVPR'22)
Publication-ready NN-architecture schematics.
Code for our TPAMI 2021 paper "Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer".
A collection of AWESOME things about domain adaptation.
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, and tutorials on transfer learning.