forked from vllm-project/vllm
Add fake HPU mode to Habana components with dummy habana_frameworks module. #250
Merged: madamczykhabana merged 39 commits into habana_main from private/jmaksymczuk/fake_hpu_cpu on Sep 17, 2024.
Commits (39):
e52c0ec Update habana_model_runner.py (kzawora-intel)
dcc878b Merge remote-tracking branch 'origin/habana_main' into private/kzawor… (kzawora-intel)
afffe33 Add fake HPU mode (kzawora-intel)
ed414dc Merge remote-tracking branch 'origin/habana_main' into private/kzawor… (kzawora-intel)
ceca996 format.sh (kzawora-intel)
1976d75 tp fixes (kzawora-intel)
db4c30f add cpu github action job (kzawora-intel)
08c9cf3 format.sh (kzawora-intel)
ebcb4ab fix cputest job (kzawora-intel)
506e026 add better validation (kzawora-intel)
08a24b0 [WIP] Fake hpu cpu migration with dummy habana_frameworks. (jmaksymczuk)
731cab1 Add --fake_hpu to cpu-test. (jmaksymczuk)
b87d43d Trigger cpu-test on PR to private/kzawora/fake_hpu. (jmaksymczuk)
1b09033 Create dummy habana_frameworks.torch.utils.internal.is_lazy dummy met… (jmaksymczuk)
dd8ac9b Merge branch 'habana_main' into private/jmaksymczuk/fake_hpu_cpu (jmaksymczuk)
fb4ca58 Fix for model_runner and loader. (jmaksymczuk)
2cf66a2 Fix for ruff checks. (jmaksymczuk)
34d4141 Merge branch 'habana_main' into private/jmaksymczuk/fake_hpu_cpu (jmaksymczuk)
4d08172 Add dummy bridge_config module. (jmaksymczuk)
b7beb49 format (jmaksymczuk)
4e957d4 Merge branch 'habana_main' into private/jmaksymczuk/fake_hpu_cpu (jmaksymczuk)
e9c1064 Missing bracket. (jmaksymczuk)
91657ec Refactor code. (jmaksymczuk)
3f1c973 format (jmaksymczuk)
0d9dff6 Fix model runner, format. (jmaksymczuk)
e5cd53a Review changes. (jmaksymczuk)
4ab0063 Merge branch 'habana_main' into private/jmaksymczuk/fake_hpu_cpu (jmaksymczuk)
73f213a Merge remote-tracking branch 'origin/habana_main' into private/jmaksy… (jmaksymczuk)
1d9fd69 Remove --fake_hpu, is_fake_hpu and cpu migration depends on VLLM_USE_… (jmaksymczuk)
d4efdba format (jmaksymczuk)
a0f9f3c Merge habana_main into private/jmaksymczuk/fake_hpu_cpu. (jmaksymczuk)
0c79630 Dummy modules based on MagicMock - improves visibility. (jmaksymczuk)
88efc02 Remove failing prompt - text formatting. (jmaksymczuk)
5864c3a Rephrase one prompt that generated weirdly formatted output. (jmaksymczuk)
7633c4d prompts (jmaksymczuk)
1b034d7 format (jmaksymczuk)
88611af Merge branch 'habana_main' into private/jmaksymczuk/fake_hpu_cpu (jmaksymczuk)
b414ffb Create needed dummy modules automatically, add comments. (jmaksymczuk)
8d01b78 format (jmaksymczuk)
New file: GitHub Actions workflow for the CPU test
@@ -0,0 +1,34 @@
name: cpu-test

on:
  # Trigger the workflow on push or pull request,
  # but only for the habana_main branch
  push:
    branches:
      - habana_main
  pull_request:
    branches:
      - habana_main


jobs:
  cputest:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.10"]
    steps:
    - uses: actions/checkout@v2
    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v2
      with:
        python-version: ${{ matrix.python-version }}
    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
        pip install torch --extra-index-url https://download.pytorch.org/whl/cpu
        pip install -r requirements-hpu.txt
        VLLM_TARGET_DEVICE=hpu python setup.py develop
    - name: cpu-test
      run: |
        VLLM_SKIP_WARMUP=true VLLM_PROMPT_SEQ_BUCKET_MAX=128 VLLM_USE_FAKE_HPU=1 python examples/offline_inference_fakehpu.py
New file: examples/offline_inference_fakehpu.py
@@ -0,0 +1,38 @@
import os

from vllm import LLM, SamplingParams

if os.environ.get('VLLM_USE_FAKE_HPU', '0') != '0':
    from vllm.utils import migrate_to_cpu
    migrate_to_cpu()

# Sample prompts.
prompts = [
    "Berlin is the capital city of ",
    "Louvre is located in the city of ",
    "Barack Obama was the 44th president of ",
    "Warsaw is the capital city of ",
    "Gniezno is a city in ",
    "San Francisco is located in the state of ",
    "Llanfairpwllgwyngyll is located in country of ",
]
ref_answers = [
    "Germany", "Paris", "United States", "Poland", "Poland", "California",
    "Wales"
]
# Create a sampling params object.
sampling_params = SamplingParams(temperature=0, n=1, use_beam_search=False)

# Create an LLM.
llm = LLM(model="facebook/opt-125m", max_model_len=32, max_num_seqs=4)
# Generate texts from the prompts. The output is a list of RequestOutput objects
# that contain the prompt, generated text, and other information.
outputs = llm.generate(prompts, sampling_params)
# Print the outputs.
for output, answer in zip(outputs, ref_answers):
    prompt = output.prompt
    generated_text = output.outputs[0].text
    print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
    assert answer in generated_text, (
        f"The generated text does not contain the correct answer: {answer}")
print('PASSED')
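As wired into the cpu-test workflow above, this check can be run on a machine without Gaudi hardware by setting the fake-HPU switch before launching the script:

    VLLM_SKIP_WARMUP=true VLLM_PROMPT_SEQ_BUCKET_MAX=128 VLLM_USE_FAKE_HPU=1 python examples/offline_inference_fakehpu.py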
Review discussion:
This is very brittle. Any time someone adds a new module in any file, we'd need to remember to wrap it here. Couldn't we do this differently somehow?
I've done some research and asked a few people, and unfortunately I haven't found a different way of doing it. I'm open to suggestions, but so far I haven't found a more "elegant" approach.
What about using MagicMock?
https://stackoverflow.com/a/37126323
https://docs.python.org/3/library/unittest.mock.html
As far as I understand, it should automatically mock everything in the hierarchy below it. We could do it only for 'habana_frameworks'.
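For context, a MagicMock does auto-create any attribute chain you reach for, so a single mock can stand in for a whole attribute hierarchy. A small illustration (not code from this PR; the is_lazy name is taken from the commit messages above):

    from unittest.mock import MagicMock

    habana_frameworks = MagicMock()
    # Attribute chains are created on demand; the call below returns another
    # MagicMock instead of raising AttributeError.
    print(habana_frameworks.torch.utils.internal.is_lazy())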
Unfortunately MagicMock doesn't solve the submodules issue, but it greatly improves visibility and makes adding further dummy modules much simpler -> changed the original dummy module handling to MagicMock.
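To make the trade-off concrete, here is a minimal sketch of the MagicMock-based registration described above (illustrative only; the actual helper in this PR lives in vLLM and may differ). A statement like import habana_frameworks.torch.utils.internal looks each dotted name up in sys.modules rather than walking attributes, so every submodule that is imported directly still needs its own entry, even though each entry is just a MagicMock:

    import sys
    from unittest.mock import MagicMock

    # Submodule paths below are taken from the commit messages in this PR;
    # any other directly imported submodule would need its own entry too,
    # which is the remaining "submodules issue".
    for name in (
        'habana_frameworks',
        'habana_frameworks.torch',
        'habana_frameworks.torch.utils',
        'habana_frameworks.torch.utils.internal',
    ):
        sys.modules[name] = MagicMock(name=name)

    import habana_frameworks.torch.utils.internal  # now succeeds on a CPU-only machine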
Hmm... Perhaps something like this could work:
Could you please check if it works? (last thing, I promise! 😄 )
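The snippet this comment refers to is not preserved in this export. Purely as an illustration of the direction the thread ends up taking, i.e. creating the needed dummy modules automatically rather than listing them by hand (the approach named in commit b414ffb), one way to do it is a meta-path finder that serves a MagicMock for anything under the habana_frameworks prefix. The class and names below are my own sketch, not the code that landed in the PR:

    import importlib.abc
    import importlib.machinery
    import sys
    from unittest.mock import MagicMock

    class _FakeHpuFinder(importlib.abc.MetaPathFinder, importlib.abc.Loader):
        """Serve a MagicMock for habana_frameworks and any of its submodules."""

        prefix = 'habana_frameworks'

        def find_spec(self, fullname, path=None, target=None):
            if fullname == self.prefix or fullname.startswith(self.prefix + '.'):
                # Mark everything as a package so deeper submodules can be imported.
                return importlib.machinery.ModuleSpec(fullname, self, is_package=True)
            return None

        def create_module(self, spec):
            return MagicMock(name=spec.name)

        def exec_module(self, module):
            pass  # nothing to execute; the MagicMock is already usable

    sys.meta_path.insert(0, _FakeHpuFinder())

    import habana_frameworks.torch.utils.internal  # created on demand
    print(habana_frameworks.torch.utils.internal.is_lazy())  # a MagicMock, not an ImportError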