Commit

[Bugfix] Set enable_prefix_caching=True in prefix caching example (vl…
WoosukKwon authored and jimpang committed Mar 31, 2024
1 parent aea238e commit b53d361
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion in examples/offline_inference_with_prefix.py

```diff
@@ -22,7 +22,7 @@
 sampling_params = SamplingParams(temperature=0.0)

 # Create an LLM.
-llm = LLM(model="facebook/opt-125m")
+llm = LLM(model="facebook/opt-125m", enable_prefix_caching=True)

 generating_prompts = [prefix + prompt for prompt in prompts]
```
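For context, the example's prompts all share a common prefix, which is what prefix caching exploits: with `enable_prefix_caching=True`, vLLM can compute the KV-cache blocks for the shared prefix once and reuse them across requests. A minimal pure-Python sketch of the prompt construction mirrored in the diff above (the `prefix` and `prompts` values here are illustrative placeholders, not the ones defined in the actual example file):

```python
# Illustrative values; the real example file defines its own prefix and prompts.
prefix = ("You are a helpful assistant. Answer the question "
          "based on the context below.\n")
prompts = [
    "What is the capital of France?",
    "Who wrote Hamlet?",
]

# Mirrors the example: every generated prompt starts with the same prefix,
# so a prefix-caching engine only has to prefill the shared part once.
generating_prompts = [prefix + prompt for prompt in prompts]

assert all(p.startswith(prefix) for p in generating_prompts)
print(f"{len(generating_prompts)} prompts share a {len(prefix)}-character prefix")
```

Note that the fix matters because prefix caching is opt-in at engine construction: building the `LLM` without the flag silently runs the example with no caching at all, which is why the commit adds `enable_prefix_caching=True` to the constructor call.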
