This repository contains the code to train Dynamic Contextualized Word Embeddings (DCWEs), proposed in the ACL 2021 paper *Dynamic Contextualized Word Embeddings*. DCWEs represent the meaning of words as a function of both linguistic and extralinguistic (social and temporal) context.
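To make the idea concrete, below is a minimal, illustrative sketch (not the repository's actual implementation or API; all class, parameter, and feature names are hypothetical) of how such an embedding can be formed: a word's stable input embedding is shifted by an offset computed from social and temporal features, and the shifted embeddings are then contextualized by a pretrained language model.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class ToyDCWE(nn.Module):
    """Toy model: social/temporal offsets added to BERT's input embeddings."""

    def __init__(self, social_dim: int, time_dim: int, hidden_dim: int = 768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # Hypothetical offset network: maps extralinguistic features to a shift
        # applied to the non-contextualized word embeddings.
        self.offset = nn.Sequential(
            nn.Linear(social_dim + time_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, input_ids, attention_mask, social_feats, time_feats):
        # Stable (non-contextualized) embeddings from the pretrained model.
        stable = self.bert.embeddings.word_embeddings(input_ids)
        # Dynamic offset computed from social and temporal context, shared
        # across all tokens of a sentence here for simplicity.
        extra = torch.cat([social_feats, time_feats], dim=-1)
        shift = self.offset(extra).unsqueeze(1)   # (batch, 1, hidden)
        dynamic = stable + shift                  # dynamic input embeddings
        # Contextualize the shifted embeddings with the language model.
        out = self.bert(inputs_embeds=dynamic, attention_mask=attention_mask)
        return out.last_hidden_state
```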
If you use the code in this repository, please cite the following paper:
```
@inproceedings{hofmann2021dcwe,
    title = {Dynamic Contextualized Word Embeddings},
    author = {Hofmann, Valentin and Pierrehumbert, Janet and Sch{\"u}tze, Hinrich},
    booktitle = {Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics},
    year = {2021}
}
```