save_split calls
model.save_pretrained(save_directory, save_function=save_split, max_shard_size='10000GB')
Since safe_serialization (safetensors serialization) now defaults to True, save_pretrained never calls the custom save_function.
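To illustrate, here is a minimal sketch of the behavior. The toy save_split below only stands in for the transformers-neuronx helper, and passing safe_serialization=False is an assumed workaround under current transformers behavior, not the confirmed fix:

```python
# Minimal sketch, assuming transformers >= 4.35.0 where save_pretrained
# defaults to safe_serialization=True.
import torch
from transformers import AutoModelForCausalLM

def save_split(state_dict, path):
    # Stand-in for the transformers-neuronx split saver; save_pretrained is
    # expected to call it with each shard's state dict and target file path.
    torch.save(state_dict, path)

model = AutoModelForCausalLM.from_pretrained("gpt2")

# With the new default (safe_serialization=True), save_pretrained writes
# model.safetensors itself and save_split above is never invoked.
model.save_pretrained("out-default", save_function=save_split, max_shard_size="10000GB")

# Explicitly disabling safetensors serialization restores the torch.save
# path, which routes through save_function.
model.save_pretrained(
    "out-split",
    save_function=save_split,
    max_shard_size="10000GB",
    safe_serialization=False,
)
```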
We were able to reproduce the issue and the fix should be available in the upcoming release. For now, can you downgrade the transformers version using the following command: pip install "transformers<4.35.0"
Thanks for the quick response. I can confirm that save_split works fine with transformers v4.34.1.
Relevant TF PR - huggingface/transformers#27064
You can reproduce this by following https://github.com/aws-neuron/aws-neuron-samples/blob/master/torch-neuronx/transformers-neuronx/inference/meta-llama-2-13b-sampling.ipynb with the latest transformers package.