When fine-tuning UniLM, one option is an autoregressive objective, similar to ordinary language-model training, but that does not seem to fully exploit the decoding ability learned during pretraining. The other option is to keep the same non-autoregressive (masked) form used in pretraining, which feels less training-efficient. After actually comparing the two, I found no clear difference in results. Have the authors run a similar comparison?
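To make the contrast concrete, below is a minimal sketch (not the repo's actual code) of the two fine-tuning losses being compared. `ToyModel`, the token ids, and the 0.7 target-masking rate are illustrative assumptions; in practice the model would be the pretrained UniLM with its seq2seq self-attention mask.

```python
import torch
import torch.nn.functional as F

# Hypothetical stand-in for the fine-tuning model: maps input ids to
# per-position vocabulary logits. The real model would apply UniLM's
# seq2seq attention mask; this toy version only makes the script runnable.
class ToyModel(torch.nn.Module):
    def __init__(self, vocab_size=32, hidden=16):
        super().__init__()
        self.emb = torch.nn.Embedding(vocab_size, hidden)
        self.out = torch.nn.Linear(hidden, vocab_size)

    def forward(self, input_ids):
        return self.out(self.emb(input_ids))  # [batch, seq_len, vocab]

MASK_ID = 1                                # assumed [MASK] token id
model = ToyModel()
src = torch.tensor([[5, 6, 7]])            # source segment
tgt = torch.tensor([[8, 9, 10, 11]])       # target segment

# (a) Autoregressive fine-tuning: feed [src; tgt[:-1]] with teacher forcing
# and compute the loss on every target position.
inp = torch.cat([src, tgt[:, :-1]], dim=1)
logits = model(inp)
tgt_logits = logits[:, src.size(1) - 1:, :]    # positions that predict tgt tokens
ar_loss = F.cross_entropy(tgt_logits.reshape(-1, tgt_logits.size(-1)),
                          tgt.reshape(-1))

# (b) Pretraining-style (non-autoregressive) fine-tuning: randomly mask a
# large fraction of target tokens and compute the loss only on the masked
# positions, as in the seq2seq pretraining objective.
mask = torch.rand(tgt.shape) < 0.7             # assumed masking rate
masked_tgt = tgt.masked_fill(mask, MASK_ID)
logits = model(torch.cat([src, masked_tgt], dim=1))[:, src.size(1):, :]
labels = tgt.masked_fill(~mask, -100)          # ignore unmasked positions
mlm_loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                           labels.reshape(-1), ignore_index=-100)

print(ar_loss.item(), mlm_loss.item())
```

Objective (a) supervises every target token per step, while (b) only supervises the masked subset, which is why it can feel less sample-efficient per forward pass even though it matches the pretraining setup.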