Stars
A curated list of watermarking schemes for generative AI models
A Python/PyTorch package for invertible neural networks
Official implementation of the paper "Certifiably Robust Image Watermark"
Provable adversarial robustness at ImageNet scale
A project page template for academic papers. Demo at https://eliahuhorwitz.github.io/Academic-project-page-template/
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
Python library for invisible (blind) image watermarking
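A "blind" watermark is one that can be extracted without the original image. As a rough illustration of the idea, here is a toy least-significant-bit embed/extract sketch in plain Python; this is an assumption-laden stand-in for intuition only, not the library's actual method (invisible watermarks in practice typically use frequency-domain schemes such as DWT-DCT for robustness).

```python
# Toy blind watermark: hide message bits in the least significant bit (LSB)
# of each pixel. Illustrative only -- real invisible-watermark libraries use
# frequency-domain embedding (e.g. DWT-DCT), not fragile spatial LSBs.

def to_bits(data: bytes):
    """Bytes -> list of bits, most significant bit first."""
    return [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]

def from_bits(bits):
    """List of bits (MSB first) -> bytes."""
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

def embed(pixels, message: bytes):
    """Overwrite the LSB of the first len(message)*8 pixels with message bits."""
    bits = to_bits(message)
    assert len(bits) <= len(pixels), "image too small for message"
    marked = list(pixels)
    for i, b in enumerate(bits):
        marked[i] = (marked[i] & ~1) | b
    return marked

def extract(pixels, length: int):
    """Blind extraction: read LSBs back; no original image needed."""
    return from_bits([p & 1 for p in pixels[:8 * length]])

pixels = [(i * 37) % 256 for i in range(256)]  # fake 8-bit grayscale image
marked = embed(pixels, b"hi")
assert extract(marked, 2) == b"hi"
```

Note that changing only the LSB perturbs each pixel value by at most 1, which is why the mark is visually invisible; the tradeoff is that it does not survive compression or resizing, which is exactly what transform-domain schemes address.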
A collection of resources and papers on Diffusion Models
A beautiful, simple, clean, and responsive Jekyll theme for academics
A practical course on Large Language Models.
A course to get into Large Language Models (LLMs) with roadmaps and Colab notebooks.
Pure-Python ECDSA signature/verification and ECDH key agreement
Python library for fast elliptic curve crypto
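To make the two operations named above concrete (ECDSA sign/verify and ECDH key agreement), here is a compact stdlib-only sketch over secp256k1. The curve constants and formulas are the standard ones; the code is a didactic toy that is independent of either library's API, is not constant-time, and must not be used for real cryptography.

```python
# Toy pure-Python ECDSA + ECDH over secp256k1 (stdlib only, NOT for production:
# no constant-time arithmetic, no side-channel hardening).
import hashlib
import secrets

# secp256k1 domain parameters: field prime p, group order n, base point G.
p = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEFFFFFC2F
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def point_add(P, Q):
    """Add two points on y^2 = x^3 + 7 over F_p (None = point at infinity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
        return None                                  # P + (-P) = infinity
    if P == Q:
        lam = 3 * P[0] * P[0] * pow(2 * P[1], -1, p) % p   # tangent slope
    else:
        lam = (Q[1] - P[1]) * pow(Q[0] - P[0], -1, p) % p  # chord slope
    x = (lam * lam - P[0] - Q[0]) % p
    return (x, (lam * (P[0] - x) - P[1]) % p)

def scalar_mult(k, P):
    """Double-and-add computation of k*P."""
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

def hash_to_int(msg):
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(d, msg):
    """ECDSA: (r, s) with r = x(k*G) mod n, s = k^-1 (z + r*d) mod n."""
    z = hash_to_int(msg)
    while True:
        k = secrets.randbelow(n - 1) + 1   # fresh per-signature nonce
        r = scalar_mult(k, G)[0] % n
        if r == 0:
            continue
        s = pow(k, -1, n) * (z + r * d) % n
        if s != 0:
            return (r, s)

def verify(Q, msg, sig):
    """Accept iff x(u1*G + u2*Q) mod n == r, with u1 = z/s, u2 = r/s."""
    r, s = sig
    if not (1 <= r < n and 1 <= s < n):
        return False
    z, w = hash_to_int(msg), pow(s, -1, n)
    P = point_add(scalar_mult(z * w % n, G), scalar_mult(r * w % n, Q))
    return P is not None and P[0] % n == r

# Key generation, sign/verify round-trip, and ECDH shared-secret agreement.
d1 = secrets.randbelow(n - 1) + 1; Q1 = scalar_mult(d1, G)
d2 = secrets.randbelow(n - 1) + 1; Q2 = scalar_mult(d2, G)
sig = sign(d1, b"hello")
assert verify(Q1, b"hello", sig) and not verify(Q1, b"tampered", sig)
assert scalar_mult(d1, Q2) == scalar_mult(d2, Q1)  # ECDH: both sides agree
```

The last line is the whole of textbook ECDH: each party multiplies the other's public point by its own private scalar, and commutativity of scalar multiplication makes the results coincide.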
DomainBed is a suite to test domain generalization algorithms
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc.: papers, code, datasets, applications, and tutorials.
Code accompanying the paper Pretraining Language Models with Human Preferences
A full pipeline to fine-tune the Vicuna LLM with LoRA and RLHF on consumer hardware. Implementation of RLHF (Reinforcement Learning from Human Feedback) on top of the Vicuna architecture. Basically Chat…
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.
A curated list of academic events on AI Security & Privacy
Prediction Poisoning: Towards Defenses Against DNN Model Stealing Attacks (ICLR '20)
A curated list of foundation models for vision and language tasks
PyTorch code and models for the DINOv2 self-supervised learning method.
A curated list of awesome papers on dataset distillation and related applications.
[AAAI '23] PINAT: A Permutation INvariance Augmented Transformer for NAS Predictor
[NeurIPS 2021] "Stronger NAS with Weaker Predictors", Junru Wu, Xiyang Dai, Dongdong Chen, Yinpeng Chen, Mengchen Liu, Ye Yu, Zhangyang Wang, Zicheng Liu, Mei Chen and Lu Yuan
NASLib is a Neural Architecture Search (NAS) library that facilitates NAS research by providing interfaces to several state-of-the-art NAS search spaces and optimizers.