[misc][distributed] fix pp missing layer condition (vllm-project#6446)
youkaichao authored Jul 15, 2024
1 parent 64fdc08 commit 4cf256a
Showing 1 changed file with 4 additions and 1 deletion.
5 changes: 4 additions & 1 deletion vllm/model_executor/models/utils.py
```diff
@@ -83,7 +83,10 @@ def get_pp_missing_layer_names(model: torch.nn.Module) -> List[str]:
     missing_layer_names = []
     for name, module in model.named_modules():
         if isinstance(module, PPMissingLayer):
-            missing_layer_names.append(name)
+            # NOTE: the trailing dot is used to match the prefix of the layer.
+            # Without the dot, we could match a layer that is not missing,
+            # e.g., 'encoder.layer.1' would match 'encoder.layer.11'.
+            missing_layer_names.append(name + '.')
     _model_to_pp_missing_layer_names[model_id] = missing_layer_names

     return missing_layer_names
```
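To see why the trailing dot matters, consider a minimal sketch of the prefix check (not vLLM's actual code; the helper `is_missing` is hypothetical) that tests a parameter name against the recorded missing-layer prefixes:

```python
# Hypothetical helper illustrating the prefix-matching bug this commit fixes.
def is_missing(param_name: str, missing_layer_names: list[str]) -> bool:
    # A parameter belongs to a missing pipeline-parallel layer iff its
    # fully-qualified name starts with one of the recorded prefixes.
    return any(param_name.startswith(prefix) for prefix in missing_layer_names)

# Without the trailing dot, the prefix 'encoder.layer.1' also matches
# parameters of 'encoder.layer.11' -- a false positive.
buggy = ['encoder.layer.1']
fixed = ['encoder.layer.1.']

print(is_missing('encoder.layer.11.weight', buggy))  # True (false positive)
print(is_missing('encoder.layer.11.weight', fixed))  # False
print(is_missing('encoder.layer.1.weight', fixed))   # True
```

Appending `'.'` to each stored name makes the prefix match only exact module boundaries, since `'.'` cannot appear inside a single name component.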
