Commit

[Misc] Minor fix in KVCache type (#3652)
WoosukKwon authored Mar 27, 2024
1 parent 7687934 commit e66b629
Showing 3 changed files with 4 additions and 8 deletions.
4 changes: 2 additions & 2 deletions docs/source/models/adding_model.rst
@@ -56,8 +56,8 @@ Next, you need to rewrite the :code:`forward` methods of your model by following
 -    return_dict: Optional[bool] = None,
 -) -> Union[Tuple, CausalLMOutputWithPast]:
 +    positions: torch.Tensor,
-+    kv_caches: List[KVCache],
-+    input_metadata: InputMetadata,
++    kv_caches: List[torch.Tensor],
++    attn_metadata: AttentionMetadata,
 +) -> Optional[SamplerOutput]:
 
 1. Update the code by considering that :code:`input_ids` and :code:`positions` are now flattened tensors.
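For context, the signature the updated docs now describe looks roughly like the sketch below. The model class name and the AttentionMetadata / SamplerOutput stubs are illustrative stand-ins, not the actual vLLM definitions (the real SamplerOutput comes from vllm.sequence, as the llava.py diff below shows):

from typing import List, Optional

import torch


class AttentionMetadata:  # stand-in for vLLM's attention metadata class (assumption)
    ...


class SamplerOutput:  # stand-in for vllm.sequence.SamplerOutput
    ...


class MyModelForCausalLM(torch.nn.Module):  # hypothetical model name
    def forward(
        self,
        input_ids: torch.Tensor,        # flattened token IDs across all sequences
        positions: torch.Tensor,        # flattened position IDs, same shape as input_ids
        kv_caches: List[torch.Tensor],  # one KV-cache tensor per attention layer
        attn_metadata: AttentionMetadata,
    ) -> Optional[SamplerOutput]:
        ...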
6 changes: 2 additions & 4 deletions vllm/model_executor/models/llava.py
@@ -1,4 +1,4 @@
-from typing import List, Optional, Tuple
+from typing import List, Optional
 
 import torch
 from torch import nn
@@ -19,8 +19,6 @@
   hf_model_weights_iterator)
 from vllm.sequence import SamplerOutput
 
-KVCache = Tuple[torch.Tensor, torch.Tensor]
-
 _KEYS_TO_MODIFY_MAPPING = {
     "language_model.lm_head": "lm_head",
     "language_model.model": "language_model",
@@ -102,7 +100,7 @@ def forward(
         self,
         input_ids: torch.Tensor,
         positions: torch.Tensor,
-        kv_caches: List[KVCache],
+        kv_caches: List[torch.Tensor],
         attn_metadata: AttentionMetadata,
         image_input: Optional[torch.Tensor] = None
     ) -> SamplerOutput:  # noqa: E501
2 changes: 0 additions & 2 deletions vllm/worker/neuron_model_runner.py
@@ -14,8 +14,6 @@
 
 logger = init_logger(__name__)
 
-KVCache = Tuple[torch.Tensor, torch.Tensor]
-
 
 class NeuronModelRunner:
 
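The net effect in both Python files is the same: the local KVCache alias is dropped and each layer's cache is annotated as a single tensor rather than a (key, value) tuple. A minimal before/after sketch of the annotation change (the function names are illustrative; the actual memory layout of the cache tensor is backend-dependent and not shown here):

from typing import List, Tuple

import torch

# Before this commit: a per-file alias treated each layer's cache as a (key, value) pair.
KVCache = Tuple[torch.Tensor, torch.Tensor]


def forward_before(kv_caches: List[KVCache]) -> None:
    ...


# After this commit: the alias is removed; each layer's cache is typed as one torch.Tensor.
def forward_after(kv_caches: List[torch.Tensor]) -> None:
    ...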
