I am trying to extend Mistral 7B. I've tried the block expansion code and it works fine, but when I try to load the model with the new weights, I get an error: the model doesn't know about the new blocks.
When I call model.load_state_dict(output), where output is the new state_dict from the block expansion script, I get this error:
RuntimeError: Error(s) in loading state_dict for MistralForCausalLM:
Unexpected key(s) in state_dict: "model.layers.32.self_attn.q_proj.weight", "model.layers.32.self_attn.k_proj.weight", etc.
Can you please tell me how you solved this, or did you not run into it when working with LLaMA?
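For reference, a minimal sketch of the failing pattern (the checkpoint name here is illustrative; `output` is the state_dict produced by the block expansion script):

```python
from transformers import MistralForCausalLM

# from_pretrained builds the base 32-layer model (layers 0-31), so the
# expanded state_dict's "model.layers.32.*" keys have no destination.
model = MistralForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
model.load_state_dict(output)  # RuntimeError: Unexpected key(s) in state_dict ...
```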
I think you may need to revise the config, especially the "num_hidden_layers" key in the config.json file. Set this key to the number of layers after expansion and then load Mistral. Also, in my experience Mistral may need a much lower learning rate than LLaMA; I ran into loss oscillation in my experiments. The meta-math repo reports the same conclusion: https://huggingface.co/meta-math/MetaMath-Mistral-7B
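A minimal sketch of that fix, assuming a 32-layer base model expanded by 8 blocks (the checkpoint name and layer count are illustrative):

```python
from transformers import MistralConfig, MistralForCausalLM

# Start from the base config, then set the depth to the
# post-expansion layer count (here: 32 original + 8 inserted blocks).
config = MistralConfig.from_pretrained("mistralai/Mistral-7B-v0.1")
config.num_hidden_layers = 40

# Build the model at the expanded depth; now every expanded
# state_dict key ("model.layers.32.*", ...) has a matching module.
model = MistralForCausalLM(config)
model.load_state_dict(output)  # `output` from the block expansion script
```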