Training question about the t5-large model #17
Comments
Oh, I see — the wording there is probably a bit ambiguous.
Hi all. The project description says: "Based on t5-large, we used several hundred GB of Chinese corpus and trained for 1 million steps, accumulating 1.5 trillion Chinese word-level tokens." I'd like to ask: does this mean t5-large was used as the pretrained model and then further trained (fine-tuned) on Chinese data?
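As a rough sanity check on the scale quoted above (1 million steps, 1.5 trillion tokens), the implied per-step token budget can be computed directly. The sequence length of 512 below is an assumption for illustration, not a figure stated in the project description:

```python
# Back-of-the-envelope check of the quoted training scale.
# Assumption: a sequence length of 512 tokens (not stated by the project).
total_tokens = 1.5e12      # 1.5 trillion tokens, as quoted
train_steps = 1_000_000    # 1 million steps, as quoted
seq_len = 512              # assumed sequence length

tokens_per_step = total_tokens / train_steps
effective_batch = tokens_per_step / seq_len

print(f"tokens per step: {tokens_per_step:,.0f}")
print(f"implied effective batch size: {effective_batch:,.0f}")
```

Under that assumption, the run would process about 1.5 million tokens per step, i.e. an effective batch on the order of a few thousand sequences, which is plausible for large-scale continued pretraining across many accelerators.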