
Commit

Remove Yi model definition, please use LlamaForCausalLM instead (vllm-project#2854)

Co-authored-by: Roy <[email protected]>
2 people authored and jvmncs committed Feb 14, 2024
1 parent 299b8cc commit 44b28d2
Showing 5 changed files with 2 additions and 402 deletions.
docs/source/models/supported_models.rst (7 changes: 2 additions & 5 deletions)
@@ -51,8 +51,8 @@ Alongside each architecture, we include some popular models that use it.
     - InternLM2
     - :code:`internlm/internlm2-7b`, :code:`internlm/internlm2-chat-7b`, etc.
   * - :code:`LlamaForCausalLM`
-    - LLaMA, LLaMA-2, Vicuna, Alpaca, Koala, Guanaco
-    - :code:`meta-llama/Llama-2-13b-hf`, :code:`meta-llama/Llama-2-70b-hf`, :code:`openlm-research/open_llama_13b`, :code:`lmsys/vicuna-13b-v1.3`, :code:`young-geng/koala`, etc.
+    - LLaMA, LLaMA-2, Vicuna, Alpaca, Yi
+    - :code:`meta-llama/Llama-2-13b-hf`, :code:`meta-llama/Llama-2-70b-hf`, :code:`openlm-research/open_llama_13b`, :code:`lmsys/vicuna-13b-v1.3`, :code:`01-ai/Yi-6B`, :code:`01-ai/Yi-34B`, etc.
   * - :code:`MistralForCausalLM`
     - Mistral, Mistral-Instruct
     - :code:`mistralai/Mistral-7B-v0.1`, :code:`mistralai/Mistral-7B-Instruct-v0.1`, etc.
@@ -77,9 +77,6 @@ Alongside each architecture, we include some popular models that use it.
   * - :code:`StableLMEpochForCausalLM`
     - StableLM
     - :code:`stabilityai/stablelm-3b-4e1t/` , :code:`stabilityai/stablelm-base-alpha-7b-v2`, etc.
-  * - :code:`YiForCausalLM`
-    - Yi
-    - :code:`01-ai/Yi-6B`, :code:`01-ai/Yi-34B`, etc.
 
 If your model uses one of the above model architectures, you can seamlessly run your model with vLLM.
 Otherwise, please refer to :ref:`Adding a New Model <adding_a_new_model>` for instructions on how to implement support for your model.
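The docs change above captures the intent of the commit: Yi checkpoints are served through the shared LlamaForCausalLM implementation rather than a dedicated Yi class. Below is a minimal sketch of what that looks like from the user side, assuming vLLM's offline-inference API (LLM, SamplingParams) and the 01-ai/Yi-6B checkpoint listed in the updated table:

from vllm import LLM, SamplingParams

# After this commit there is no YiForCausalLM class; vLLM resolves the
# checkpoint's Llama-compatible architecture to LlamaForCausalLM.
llm = LLM(model="01-ai/Yi-6B")  # model id taken from the updated docs table

sampling = SamplingParams(temperature=0.8, max_tokens=32)
outputs = llm.generate(["The capital of France is"], sampling)
print(outputs[0].outputs[0].text)

Nothing Yi-specific appears in user code; the change only affects which model class vLLM instantiates internally.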
vllm/model_executor/models/yi.py (330 changes: 0 additions & 330 deletions)

This file was deleted.

vllm/transformers_utils/config.py (1 change: 0 additions & 1 deletion)
@@ -12,7 +12,6 @@
     "qwen": QWenConfig,
     "RefinedWeb": RWConfig,  # For tiiuae/falcon-40b(-instruct)
     "RefinedWebModel": RWConfig,  # For tiiuae/falcon-7b(-instruct)
-    "yi": YiConfig,
 }
 
 
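Dropping the "yi" entry from this mapping means a Yi checkpoint's config is no longer overridden with a vLLM-specific YiConfig; it resolves through the standard Hugging Face path like any other Llama-family model. A rough sketch of the registry-then-fallback pattern this file implements (names below are illustrative, not the exact ones in config.py):

from transformers import AutoConfig, PretrainedConfig

# Illustrative stand-in for the registry in vllm/transformers_utils/config.py.
_CONFIG_REGISTRY: dict[str, type[PretrainedConfig]] = {
    # "yi": YiConfig,  # removed by this commit; Yi now follows the default path
}

def load_config(model: str, trust_remote_code: bool = False) -> PretrainedConfig:
    # Let transformers determine the model type first.
    config = AutoConfig.from_pretrained(model, trust_remote_code=trust_remote_code)
    # Only swap in a vLLM-specific config class if one is registered.
    config_cls = _CONFIG_REGISTRY.get(config.model_type)
    if config_cls is not None:
        config = config_cls.from_pretrained(model)
    return config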
vllm/transformers_utils/configs/__init__.py (2 changes: 0 additions & 2 deletions)
@@ -7,7 +7,6 @@
 # tiiuae/falcon-7b(-instruct) models. Newer Falcon models will use the
 # `FalconConfig` class from the official HuggingFace transformers library.
 from vllm.transformers_utils.configs.falcon import RWConfig
-from vllm.transformers_utils.configs.yi import YiConfig
 
 __all__ = [
     "AquilaConfig",
@@ -16,5 +15,4 @@
     "MPTConfig",
     "QWenConfig",
     "RWConfig",
-    "YiConfig",
 ]