The Llama checkpoints (13B, 7B) on Hugging Face apparently cannot be loaded directly when training MiniLLM, since they are not partitioned for model parallelism. Is there a way to convert the weights to mp=4?
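For reference, below is a minimal sketch of how such a conversion could work, assuming MiniLLM follows Megatron-style tensor parallelism, where column-parallel layers (q/k/v, gate/up projections, embeddings, lm_head) are split along dim 0 and row-parallel layers (o_proj, down_proj) along dim 1. The script name, output filenames, and the exact set of partitioned parameters here are assumptions; the repo may ship its own conversion utility, and the actual partitioning should be checked against MiniLLM's model code before using anything like this.

```python
# Hypothetical sketch: split a Hugging Face Llama state dict into mp=4 shards.
# The partitioning scheme below (Megatron-style column/row splits) is an
# assumption, not MiniLLM's confirmed layout.

import os
import torch

MP_SIZE = 4

# Parameter name suffixes and the dimension each is assumed to be split along.
COLUMN_PARALLEL = ("q_proj.weight", "k_proj.weight", "v_proj.weight",
                   "gate_proj.weight", "up_proj.weight",
                   "embed_tokens.weight", "lm_head.weight")
ROW_PARALLEL = ("o_proj.weight", "down_proj.weight")


def split_param(name: str, tensor: torch.Tensor):
    """Return a list of MP_SIZE shards for one parameter."""
    if name.endswith(COLUMN_PARALLEL):
        return list(torch.chunk(tensor, MP_SIZE, dim=0))
    if name.endswith(ROW_PARALLEL):
        return list(torch.chunk(tensor, MP_SIZE, dim=1))
    # Norm weights and other small tensors are replicated on every rank.
    return [tensor.clone() for _ in range(MP_SIZE)]


def convert(hf_ckpt_path: str, out_dir: str):
    """Split one consolidated HF checkpoint file into per-rank shards."""
    state_dict = torch.load(hf_ckpt_path, map_location="cpu")
    shards = [dict() for _ in range(MP_SIZE)]
    for name, tensor in state_dict.items():
        for rank, piece in enumerate(split_param(name, tensor)):
            shards[rank][name] = piece
    os.makedirs(out_dir, exist_ok=True)
    for rank, shard in enumerate(shards):
        torch.save(shard, os.path.join(out_dir, f"pytorch_model_mp{rank}.bin"))


if __name__ == "__main__":
    # Paths are placeholders; a sharded HF checkpoint would first need to be
    # merged into a single state dict.
    convert("llama-13b/pytorch_model.bin", "llama-13b-mp4")
```

Note that attention head counts must be divisible by the model-parallel size for the q/k/v splits to stay head-aligned, which holds for both 7B (32 heads) and 13B (40 heads) at mp=4.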