
About BERT and MLM #1

Closed
TianqiFu opened this issue Dec 26, 2023 · 4 comments

Comments

@TianqiFu

I'd like to ask: what is the difference between the baseline BERT and MLM in the experiments? Do BERT and MLM use different masking strategies?

@1140310118
Collaborator

Hi. MLM refers to further pre-training BERT with the masked language modeling objective; the pre-training corpus here consists of reviews from the laptop and restaurant domains.

@TianqiFu
Author

Thanks for the reply. Can I understand it this way: the baseline BERT in the experiments uses the initial weights of BERT-base-uncased, while MLM takes those weights and further trains them on laptop and restaurant domain reviews with the standard random 15% masking strategy?

@1140310118
Collaborator

Yes.
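
For reference, here is a minimal sketch of this kind of domain-adaptive MLM pre-training using Hugging Face Transformers. This is an assumption about how such a step could be run, not the exact script from the paper; the file path `reviews.txt` and the training hyperparameters are placeholders.

```python
# Sketch: continue pre-training bert-base-uncased with the MLM objective
# on unlabeled domain reviews (laptop/restaurant), masking 15% of tokens.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Placeholder: unlabeled review sentences, one per line.
dataset = load_dataset("text", data_files={"train": "reviews.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

# Random 15% masking, the standard BERT MLM setting.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="bert-mlm-domain",
        num_train_epochs=3,               # placeholder
        per_device_train_batch_size=16,   # placeholder
        learning_rate=5e-5,               # placeholder
    ),
    train_dataset=dataset,
    data_collator=collator,
)
trainer.train()
trainer.save_model("bert-mlm-domain")
```

The saved checkpoint would then be used in place of the plain bert-base-uncased weights when initializing the MLM baseline.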

@TianqiFu
Author

Thanks!
