Inquiry about Model Size and Plans for Open Sourcing Larger Models #3
Comments
Fixed the issue in this commit: b38e199, and updated the naming on ModelScope: https://www.modelscope.cn/models/M2Cognition/M2-Encoder/files
Thanks for the detailed response and for sharing your future plans. I'm excited to hear about your team's intention to open-source models at the 1-billion and 10-billion parameter scales. I appreciate your contributions to the community through open sourcing valuable AI resources.
We have released the 1B and 10B models in this PR: #14
Hello, I noticed that the model in Modelscope is m2_encoder_0.2B.ckpt. Is this the 0.4B parameter model mentioned in the paper? Will there be larger models open sourced in the future?