Stars
易采集: a visual, no-code web crawler/spider. A browser automation, testing, and data-collection tool that lets you design and execute crawler tasks graphically, without writing code. Also known as ServiceWrapper, an intelligent service-wrapping system for web applications.
Real-time face swap and one-click video deepfake with only a single image.
Quantize YOLOv8 with pytorch_quantization.
Effortless data labeling with AI support from Segment Anything and other awesome models.
🕶 A curated list of Tiny Object Detection papers and related resources.
Code for the ICML 2023 paper "SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot".
[ICCV2023 Best Paper Finalist] PyTorch implementation of DiffusionDet (https://arxiv.org/abs/2211.09788)
YOLOv3、YOLOv4、YOLOv5、YOLOv5-Lite、YOLOv6-v1、YOLOv6-v2、YOLOv7、YOLOX、YOLOX-Lite、PP-YOLOE、PP-PicoDet-Plus、YOLO-Fastest v2、FastestDet、YOLOv5-SPD、TensorRT、NCNN、Tengine、OpenVINO
Aim 💫 — An easy-to-use & supercharged open-source experiment tracker.
An implementation of Microsoft's "FastSpeech 2: Fast and High-Quality End-to-End Text to Speech"
😝 TensorFlowTTS: Real-Time State-of-the-art Speech Synthesis for TensorFlow 2 (supports English, French, Korean, Chinese, and German; easy to adapt to other languages)
Tools for multi-label classification problems.
PyTorch tutorials, examples, and some books I found. [Updated irregularly] A curated collection of tutorials, examples, and books for the latest version of PyTorch.
A comprehensive list of PyTorch-related content on GitHub, such as different models, implementations, helper libraries, tutorials, etc.
[CVPR 2022] This repository includes the official project for the paper: TransMix: Attend to Mix for Vision Transformers.
A PyTorch-based library for semi-supervised learning (NeurIPS'21)
Masked Autoencoders Are Scalable Vision Learners
Unofficial PyTorch implementation of Masked Autoencoders Are Scalable Vision Learners
The RM operation can equivalently convert a ResNet into a VGG-style network, which is better for pruning, and can help RepVGG perform better at large depths.
[NeurIPS 2021] Galerkin Transformer: linear attention without softmax for partial differential equations
Chinese translation of 《Designing Data-Intensive Applications》 (DDIA).
[NeurIPS 2021 Spotlight] & [IJCV 2024] SOFT: Softmax-free Transformer with Linear Complexity
Official Code for "Non-deep Networks"