
Min_tokens #688

Closed
rlasseri opened this issue Aug 7, 2023 · 4 comments
Comments

@rlasseri commented Aug 7, 2023

Hi! Thanks for your work.
A quick question: is it possible to have a min_tokens parameter?
Thanks

@zhuohan123 (Member) commented

I believe this is possible. You should be able to force the generation not to finish in this function:

def _stop_sequences(self, seq_groups: List[SequenceGroup]) -> None:

It would be great if you could contribute this feature!
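
For anyone picking this up, here is a minimal sketch of the idea, not vLLM's actual code: skip the stop checks while a sequence is still shorter than a hypothetical min_tokens value on SamplingParams (a field that did not exist at the time of this comment):

```python
from typing import List

# Minimal sketch only. The `min_tokens` field on SamplingParams is
# hypothetical here; `get_seqs()` / `get_output_len()` are assumed accessors.
def _stop_sequences(self, seq_groups: List["SequenceGroup"]) -> None:
    for seq_group in seq_groups:
        min_tokens = getattr(seq_group.sampling_params, "min_tokens", 0)
        for seq in seq_group.get_seqs():
            if seq.get_output_len() < min_tokens:
                # Too short: ignore EOS / stop strings and keep generating.
                continue
            # ... existing stop-string and EOS handling would run here ...
```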

@Lvjinhong commented

So, has it been implemented now?

@hmellor (Collaborator) commented Mar 25, 2024

You could use a negative length penalty:

    length_penalty: Float that penalizes sequences based on their length.
    Used in beam search.
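
A sketch of that workaround, assuming the vLLM Python API as it was around the time of this comment (use_beam_search on SamplingParams has since been removed; the model name is only illustrative):

```python
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # illustrative model choice

params = SamplingParams(
    use_beam_search=True,  # length_penalty only takes effect in beam search
    best_of=4,             # beam width; beam search needs best_of > 1
    temperature=0.0,       # beam search requires greedy sampling
    length_penalty=-1.0,   # negative value discourages short completions
    max_tokens=128,
)

outputs = llm.generate(["Summarize the plot of Hamlet:"], params)
print(outputs[0].outputs[0].text)
```

Note that this only biases beam scoring toward longer outputs; it does not enforce a hard minimum length.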

hmellor closed this as completed Mar 25, 2024
@njhill (Member) commented Mar 25, 2024

This has now been implemented and will be in the next version: #3124
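
For reference, once that change ships, usage should look roughly like the following: min_tokens is a SamplingParams field that suppresses EOS and stop sequences until the minimum output length is reached (model name illustrative):

```python
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # illustrative model choice

params = SamplingParams(
    min_tokens=32,   # ignore EOS / stop strings until 32 tokens are generated
    max_tokens=128,  # hard cap still applies
)

outputs = llm.generate(["Write a short story about a lighthouse:"], params)
print(outputs[0].outputs[0].text)
```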
