Remove contiguous() in split_embedding_weights_with_scale_bias
Summary: Calling `tensor.contiguous()` on a non-contiguous tensor creates a new tensor; modifying that copy does not change the original `tensor`. To use the results of `split_embedding_weights_with_scale_bias(split_scale_bias_mode=2)` as tensors in the state_dict, writes through those tensors must reach the original TBE weight, so the copy introduced by `contiguous()` has to be removed.

Differential Revision: D46483112

Privacy Context Container: L1138451

fbshipit-source-id: dffaad9c7e92aaaf7761958c0d190a7ed21ee00e
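A minimal sketch of the aliasing behavior this change relies on, using plain PyTorch rather than the actual TBE code (the `weight` tensor below is a hypothetical stand-in for a TBE weight):

```python
import torch

# Stand-in for an embedding weight buffer (hypothetical example).
weight = torch.arange(12, dtype=torch.float32).reshape(3, 4)

# A transposed view is non-contiguous but shares storage with `weight`;
# calling .contiguous() on it allocates new storage, detached from `weight`.
view = weight.t()
copy = view.contiguous()

# Writes through the view reach the original tensor; writes to the
# contiguous copy do not.
view[0, 0] = 100.0   # weight[0, 0] becomes 100.0
copy[1, 0] = 200.0   # weight is unaffected

print(weight[0, 0].item())  # 100.0 -- the view aliases the original
print(weight[1, 0].item())  # 4.0 -- the copy's write did not propagate
```

This is why a state_dict entry built from a `contiguous()` copy cannot be used to update the underlying TBE weight: only a view that shares storage with the original makes such updates visible.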