[Bugfix] Eagle: change config name for fc bias
- Changes to load the fc bias from the config were made as part of PR vllm-project#8790.
- The `bias` param in the config is also used to decide whether attention has a bias in [LlamaDecoderLayer](https://github.com/vllm-project/vllm/blob/main/vllm/model_executor/models/llama.py#L215).
- Because both uses share the same param name, the model is not loaded properly for configs that have no attention bias in the decoder layer but do have a bias in the EAGLE fc layer.
- Renamed the param to `eagle_fc_bias` to avoid the conflict.
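The name clash described above can be sketched with a simplified, hypothetical config object (`SimpleNamespace` stands in for the real `EAGLEConfig`; the attribute values are illustrative, not taken from vLLM):

```python
from types import SimpleNamespace

# Hypothetical config: `bias` controls the attention bias in
# LlamaDecoderLayer, while the EAGLE fc layer needs its own flag.
config = SimpleNamespace(bias=False, eagle_fc_bias=True)

# Before this commit: the fc layer reused the attention's `bias` key,
# so a model with no attention bias could never get an fc bias.
fc_bias_before = getattr(config, "bias", False)

# After this commit: a dedicated key decouples the two settings.
fc_bias_after = getattr(config, "eagle_fc_bias", False)
```

With the old lookup the fc bias silently follows the attention setting (`False` here); with the renamed key it is read independently (`True`), which is exactly the loading failure this commit fixes.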
gopalsarda committed Oct 22, 2024
1 parent ca30c3c commit fcea109
2 changes: 1 addition & 1 deletion vllm/model_executor/models/eagle.py
```diff
@@ -44,7 +44,7 @@ def __init__(self, config: EAGLEConfig, *args, **kwargs) -> None:
         self.model = model_cls(self.config.model, *args, **kwargs)
         self.fc = nn.Linear(config.model.hidden_size * 2,
                             config.model.hidden_size,
-                            bias=getattr(self.config, "bias", False))
+                            bias=getattr(self.config, "eagle_fc_bias", False))

         self.orig_vocab_size = config.vocab_size
         self.truncated_vocab_size = config.truncated_vocab_size
```
