Remove contiguous() in split_embedding_weights_with_scale_bias (#1808)
Summary:
Pull Request resolved: #1808

Calling `tensor.contiguous()` on a non-contiguous tensor creates a new tensor; changing that new tensor does not change the original `tensor`. To use the results of `split_embedding_weights_with_scale_bias(split_scale_bias_mode=2)` as tensors in `state_dict`, writes through those tensors must reach the original TBE weight. For that, the copy made by `contiguous()` has to be removed.

Reviewed By: jianyuh

Differential Revision: D46483112

Privacy Context Container: L1138451

fbshipit-source-id: 389162260da6ba0c9e91a685fdbd3effe5f4093b
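The aliasing problem the commit fixes can be sketched with a small example. This is not the TBE code itself; it uses NumPy's `np.ascontiguousarray` as a stand-in for PyTorch's `tensor.contiguous()` (both copy only when the input is non-contiguous), and the variable names are illustrative assumptions.

```python
import numpy as np

# Stand-in for a TBE weight buffer (assumed name, for illustration only).
weights = np.arange(12, dtype=np.float32).reshape(3, 4)

# A transposed view is non-contiguous but still shares storage with `weights`.
view = weights.T

# Analogue of tensor.contiguous(): since `view` is non-contiguous,
# this allocates NEW storage, breaking the link to `weights`.
copied = np.ascontiguousarray(view)
copied[0, 0] = 99.0
print(weights[0, 0])  # still 0.0: the write never reached the original

# Without the copy, writing through the view mutates the original buffer,
# which is what state_dict aliasing requires.
view[0, 0] = 42.0
print(weights[0, 0])  # 42.0
```

This is why returning the view directly (and dropping `contiguous()`) lets a `state_dict` entry update the underlying embedding weight in place.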