Stars
[ACL 2024 Demo] SeaLLMs - Large Language Models for Southeast Asia
This repository collects and organizes some classic algorithm models in the recommender systems field.
The source code for our paper "Scenario-Adaptive Feature Interaction for Click-Through Rate Prediction" (accepted by KDD2023 Applied Science Track), which proposes a model for Multi-Scenario/Multi-…
A collection of industry practice articles on search, recommendation, advertising, user growth, and related topics (sources: Zhihu, DataFunTalk, and tech WeChat public accounts).
ChatGLM-6B: An Open Bilingual Dialogue Language Model
SentiX: A Sentiment-Aware Pretrained Model for Cross-domain Sentiment Analysis
Integrating external knowledge into machine reading comprehension.
A Multi-Aspect Multi-Sentiment Dataset for aspect-based sentiment analysis.
Enriching BERT with Knowledge Graph Embedding for Document Classification (PyTorch)
code for our NAACL 2019 paper: "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis"
Implementation of some baseline models on top of BERT for text classification.
KG-BERT: BERT for Knowledge Graph Completion
Implementation of R-GCNs for Relational Link Prediction
Domain Adaptation using External Knowledge for Sentiment Analysis
Code and data for the paper, "Automatically Neutralizing Subjective Bias in Text"
A Human-like Semantic Cognition Network for Aspect-level Sentiment Classification
Projecting Embeddings for Domain Adaptation: Joint Modeling of Sentiment in Diverse Domains
Circumventing the Great Firewall: free internet access tools and guides, including YouTube access, one-click circumvention browsers, VPNs, one-click VPS server setup scripts/tutorials, free shadowsocks/ss/ssr/v2ray/goflyway accounts/nodes, and proxy ladders for Windows, Mac, Linux, iOS, Android, and routers; also YouTube video downloads and shared US-region Apple IDs.
The code for 2019 Tencent College Algorithm Contest, and the online result ranks 1st in the preliminary.
My continuously updated Machine Learning, Probabilistic Models and Deep Learning notes and demos (2000+ slides), with video links.
Cheetsheet (cheat sheet or quick reference) generator. Use it for guides, instructions or study. Made in Python 3
Sentiment-Polarized Word Embedding for Multi-Label Sentiment Classification
A simple attention weights visualizer for text classification.
One has no future if one cannot teach oneself.
Multiple Different Natural Language Processing Tasks in a Single Deep Model