Tsinghua University - Beijing
Stars
Tools and tutorials for bypassing the Great Firewall (fanqiang / "scientific internet access"): free circumvention methods, VPNs, one-click circumvention browsers, one-click VPS scripts/tutorials for setting up your own circumvention server, free shadowsocks/ss/ssr/v2ray/goflyway accounts and nodes, YouTube video download, and YouTube mirrors, for Windows, Mac, Linux, iOS, Android, and routers…
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
End-to-End Object Detection with Transformers
Machine Learning Engineering Open Book
Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm model series)
A Lite BERT for Self-Supervised Learning of Language Representations (ALBERT); large-scale Chinese pre-trained ALBERT models
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
Jittor is a high-performance deep learning framework based on JIT compiling and meta-operators.
Simple, minimal implementation of the Mamba SSM in one file of PyTorch.
Large-scale, Informative, and Diverse Multi-round Chat Data (and Models)
An unofficial styleguide and best practices summary for PyTorch
CogDL: A Comprehensive Library for Graph Deep Learning (WWW 2023)
A simple and efficient Mamba implementation in pure PyTorch and MLX.
A plug-and-play library for parameter-efficient-tuning (Delta Tuning)
Efficient Training (including pre-training and fine-tuning) for Big Models
Python package for performing Entity and Text Matching using Deep Learning.
Assignment 1: automatic differentiation
Source code and dataset for ACL 2019 paper "Cognitive Graph for Multi-Hop Reading Comprehension at Scale"
A comprehensive, unified and modular event extraction toolkit.
NAT traversal (intranet penetration) tool based on Python/WebSocket. Expose your local services to the internet.
Pytorch implementation of baseline models of KQA Pro, a large-scale dataset of complex question answering over knowledge base.
A large-scale knowledge repository for adaptive learning, learning analytics, and knowledge discovery in MOOCs, hosted by THU KEG.
Do Pre-trained Models Benefit Knowledge Graph Completion? A Reliable Evaluation and a Reasonable Approach
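One of the starred repositories above is an assignment on automatic differentiation. As a rough illustration of the idea behind it, here is a minimal reverse-mode autodiff sketch for scalars; the `Var` class and its methods are hypothetical names for this example, not code from the assignment repo.

```python
class Var:
    """A scalar node in a computation graph (illustrative sketch only)."""

    def __init__(self, value, parents=()):
        self.value = value
        # parents: (parent_node, local_gradient) pairs from the op that made us
        self.parents = parents
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Chain rule: accumulate the upstream gradient into this node,
        # then propagate it to each parent scaled by the local gradient.
        self.grad += seed
        for parent, local_grad in self.parents:
            parent.backward(seed * local_grad)


x = Var(2.0)
y = Var(3.0)
z = x * y + x   # z = x*y + x
z.backward()    # dz/dx = y + 1 = 4.0, dz/dy = x = 2.0
```

This naive recursion re-walks shared subgraphs, so real frameworks instead traverse nodes once in reverse topological order, but the gradient accumulation logic is the same.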