correct moe OSL in inference_rules.adoc (#296)
Co-authored-by: Pablo Gonzalez <[email protected]>
viraatc and pgmpablo157321 authored Nov 26, 2024
1 parent 7f5f8a0 commit 679e15d
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion inference_rules.adoc
@@ -350,7 +350,7 @@ For each of the following benchmarks it is necessary to use the following infere
 |Summarization (GPT-J) |max_new_tokens |128 | Maximum number of new tokens to generate
 |Summarization (GPT-J) |early_stopping |True | Use the EOS token to stop generating tokens
 |Summarization (Llama2) |max_new_tokens |1024 | Maximum number of new tokens to generate
-|Text Generation (Mixtral-8x7B) |max_new_tokens |2048 | Maximum number of new tokens to generate
+|Text Generation (Mixtral-8x7B) |max_new_tokens |1024 | Maximum number of new tokens to generate
 |===
 
 == Load Generator
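For context on the corrected value, below is a minimal sketch (not part of the commit) of how a benchmark implementation might pass the table's generation parameters to the Hugging Face transformers generate() API. The checkpoint name, prompt, and surrounding setup are illustrative assumptions, not anything specified by inference_rules.adoc.

[source,python]
----
# Illustrative sketch only: shows where the corrected max_new_tokens value
# (1024 for Text Generation with Mixtral-8x7B) would be applied when using
# the Hugging Face transformers generate() API. The checkpoint name and
# prompt are assumptions, not taken from inference_rules.adoc.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Example prompt; real benchmark queries come from LoadGen, not hard-coded text.
inputs = tokenizer("Write a short story about benchmarking.",
                   return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=1024,  # corrected output-length limit from this commit (was 2048)
)

# Strip the prompt tokens and decode only the newly generated text.
generated = outputs[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
----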
