
save_split seems to be broken after transformers made safetensor serialization default #55

Closed
jitto opened this issue Nov 9, 2023 · 3 comments


@jitto commented Nov 9, 2023

Relevant transformers PR: huggingface/transformers#27064

save_split calls:
model.save_pretrained(save_directory, save_function=save_split, max_shard_size='10000GB')
Since the default for safetensors serialization is now True, save_pretrained will not call the save_function.
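
For illustration, a minimal sketch of the failure and one possible workaround (`safe_serialization` is the standard `save_pretrained` parameter whose default flipped to True in transformers 4.35.0; `model`, `save_split`, and `save_directory` are taken from the snippet above):

```python
# With transformers >= 4.35.0, safe_serialization defaults to True, so
# save_pretrained writes .safetensors shards itself and never invokes
# the custom save_function -- save_split is silently skipped:
model.save_pretrained(save_directory,
                      save_function=save_split,
                      max_shard_size='10000GB')

# Possible workaround (untested sketch): opt back into the legacy
# torch.save path so the custom save_function is called again:
model.save_pretrained(save_directory,
                      save_function=save_split,
                      max_shard_size='10000GB',
                      safe_serialization=False)
```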

You can reproduce this by following https://github.com/aws-neuron/aws-neuron-samples/blob/master/torch-neuronx/transformers-neuronx/inference/meta-llama-2-13b-sampling.ipynb with the latest transformers package.

@aws-rhsoln commented

We were able to reproduce the issue, and the fix should be available in the upcoming release. For now, you can downgrade the transformers version using the following command:
pip install "transformers<4.35.0"

@jitto (Author) commented Nov 9, 2023

> We were able to reproduce the issue, and the fix should be available in the upcoming release. For now, you can downgrade the transformers version using the following command: pip install "transformers<4.35.0"

Thanks for the quick response. I can confirm that save_split works fine with transformers v4.34.1.
