Could you explain how the baseline BERT differs from MLM in the experiments? Do BERT and MLM use different masking strategies?
Hi. MLM refers to further pre-training BERT with the masked language modeling objective; the pre-training corpus here consists of reviews from the laptop and restaurant domains.
Thanks for the reply. So is this understanding correct: the baseline BERT in the experiments uses the initial BERT-base-uncased weights as-is, while MLM starts from those same weights and is further trained on laptop and restaurant domain reviews using the standard random 15% masking strategy?
Yes, that's correct.
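In case it helps, here is a minimal sketch of that kind of domain-adaptive MLM pre-training using the Hugging Face `transformers` library. The corpus file name, output directory, and hyperparameters below are illustrative assumptions, not the exact script used for the paper:

```python
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Start from the same initial weights as the BERT baseline.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Domain corpus: one laptop/restaurant review per line (path is illustrative).
dataset = load_dataset("text", data_files={"train": "laptop_restaurant_reviews.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Standard MLM objective: randomly mask 15% of tokens.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="bert-mlm-domain",   # illustrative output path
    per_device_train_batch_size=32,
    num_train_epochs=3,
    learning_rate=5e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()

# The resulting checkpoint corresponds to the "MLM" rows in the tables;
# the "BERT" baseline uses the bert-base-uncased weights directly.
```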
Thanks!