Remove contiguous() in split_embedding_weights_with_scale_bias (#1808)
Summary:
Pull Request resolved: #1808

Calling `tensor.contiguous()` on a non-contiguous tensor creates a new tensor; modifying that copy does not change the original `tensor`.

To use the results of `split_embedding_weights_with_scale_bias(split_scale_bias_mode=2)` as tensors in a state_dict, writes through those tensors must reach the original TBE weight.

For that, the copy introduced by `contiguous()` has to be removed.
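A minimal, self-contained sketch of the copy-vs-alias behavior described above (toy shapes, hypothetical sizes; here `scale_bias_size_in_bytes // 2` is taken to be 4 bytes, i.e. two float16 values per row):

```python
import torch

# Toy stand-in for one table's packed rows: 4 rows of 8 uint8 bytes.
weights = torch.zeros(4, 8, dtype=torch.uint8)

# Old behavior: the column slice is non-contiguous, so .contiguous()
# materializes a copy; writes through the float16 view never reach `weights`.
scale_copy = weights[:, :4].contiguous().view(torch.float16)
scale_copy.fill_(1.0)
assert (weights == 0).all()  # original buffer unchanged

# New behavior: the slice still has stride 1 in its last dimension, which is
# all .view(dtype) requires, so the float16 view aliases the original storage
# and writes propagate back to it.
scale_view = weights[:, :4].view(torch.float16)
scale_view.fill_(1.0)
assert not (weights == 0).all()  # original buffer updated in place
```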

Reviewed By: jianyuh

Differential Revision: D46483112

Privacy Context Container: L1138451

fbshipit-source-id: 389162260da6ba0c9e91a685fdbd3effe5f4093b
Ivan Kobzarev authored and facebook-github-bot committed Jun 6, 2023
1 parent 00ee2aa commit 10e93a0
Showing 1 changed file with 4 additions and 6 deletions.
```diff
@@ -1270,16 +1270,14 @@ def split_embedding_weights_with_scale_bias(
                 splits.append(
                     (
                         weights_shifts[:, self.scale_bias_size_in_bytes :],
-                        weights_shifts[:, : self.scale_bias_size_in_bytes // 2]
-                        .contiguous()
-                        .view(torch.float16),
+                        weights_shifts[
+                            :, : self.scale_bias_size_in_bytes // 2
+                        ].view(torch.float16),
                         weights_shifts[
                             :,
                             self.scale_bias_size_in_bytes
                             // 2 : self.scale_bias_size_in_bytes,
-                        ]
-                        .contiguous()
-                        .view(torch.float16),
+                        ].view(torch.float16),
                     )
                 )
             elif (
```
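For context, a hedged sketch of the consumption pattern this enables. The TBE class, tuple ordering, and helper below are inferred from the diff and are hypothetical, not part of this commit:

```python
from typing import List, Tuple

import torch


def load_scales_and_biases(
    tbe,  # assumed: an fbgemm_gpu IntNBitTableBatchedEmbeddingBagsCodegen
    checkpoint: List[Tuple[torch.Tensor, torch.Tensor]],  # (scale, bias) per table
) -> None:
    """Copy checkpointed scales/biases into a TBE in place.

    With split_scale_bias_mode=2 each table yields a (weight, scale, bias)
    tuple, matching the splits.append((...)) in the diff. After this change,
    `scale` and `bias` alias the TBE's packed buffer, so copy_() writes land
    in the embedding table itself rather than in a discarded contiguous() copy.
    """
    splits = tbe.split_embedding_weights_with_scale_bias(split_scale_bias_mode=2)
    for (_weight, scale, bias), (ckpt_scale, ckpt_bias) in zip(splits, checkpoint):
        scale.copy_(ckpt_scale)
        bias.copy_(ckpt_bias)
```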
