I'm having trouble reproducing the Llama-2-7B-chat model responses for the given prompts and temperatures. The responses differ not only stylistically (as expected from sampling randomness) but also in informativeness and generation length. Could you please share the generation parameters and seeds used to produce the responses for this and the other models?
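For reference, here is a minimal sketch of the kind of generation setup whose settings would need to be shared for exact reproduction. The checkpoint name, seed, prompt, and sampling parameters below are placeholders of my own, not the values used to produce the released responses.

```python
# Sketch of a reproducible sampling setup for Llama-2-7B-chat via transformers.
# All concrete values (seed, temperature, top_p, max_new_tokens) are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

model_name = "meta-llama/Llama-2-7b-chat-hf"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map="auto"
)

set_seed(0)  # placeholder seed; the actual seed is what this issue asks for

prompt = "[INST] What is the capital of France? [/INST]"  # example prompt only
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.7,     # placeholder; the reported temperature would go here
    top_p=0.9,           # placeholder nucleus-sampling value
    max_new_tokens=512,  # the cap on generation length also affects informativeness
)
# Strip the prompt tokens and print only the model's continuation.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Knowing which of these knobs (seed, top_p, max_new_tokens, repetition penalty, etc.) were set, in addition to the temperature, would make it possible to tell whether the differences I see are due to sampling variance or to a different configuration.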