It seems that bipe_alibi is not working yet.
get_ape_embeddings returns a tuple, whereas embed_tokens returns a tensor, so none of the code after this point works:

if self.config.rpe_type == "bipe_alibi":
    inputs_embeds = self.get_ape_embeddings(torch.stack([input_ids, token_ids], dim=-1))
else:
    inputs_embeds = self.embed_tokens(input_ids)
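To make the failure mode concrete, here is a minimal, self-contained sketch; the class, sizes, and the body of get_ape_embeddings are placeholders for illustration and are not taken from the repository, only the buggy return statement and the call pattern come from the snippet above:

import torch
import torch.nn as nn

class TokenEmbedder(nn.Module):
    def __init__(self, vocab_size=100, hidden=8):
        super().__init__()
        self.embed_tokens = nn.Embedding(vocab_size, hidden)

    def get_ape_embeddings(self, X):
        # X packs (input_ids, token_ids) along the last dimension,
        # as in torch.stack([input_ids, token_ids], dim=-1) at the call site.
        embed = self.embed_tokens(X[:, :, 0])
        return embed, X[:, :, 0]   # buggy: returns a tuple, not a tensor

model = TokenEmbedder()
input_ids = torch.randint(0, 100, (2, 5))
token_ids = torch.zeros_like(input_ids)

inputs_embeds = model.get_ape_embeddings(torch.stack([input_ids, token_ids], dim=-1))
print(type(inputs_embeds))  # <class 'tuple'>, while model.embed_tokens(input_ids) is a Tensor
# Any tensor operation on inputs_embeds then fails, e.g. inputs_embeds.shape
# raises AttributeError: 'tuple' object has no attribute 'shape'.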
Hi, sorry about the typo in get_ape_embeddings, and thanks for pointing this out. The last line should be

return embed

rather than

return embed, X[:, :, 0]
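For reference, a sketch of get_ape_embeddings with that one-line fix applied; everything except the corrected return statement (in particular, how the second channel X[:, :, 1] is used) is an assumption for illustration, not taken from the repository:

def get_ape_embeddings(self, X):
    # X stacks (input_ids, token_ids) along the last dimension,
    # matching torch.stack([input_ids, token_ids], dim=-1) at the call site.
    embed = self.embed_tokens(X[:, :, 0])
    # ... combine with the positional component derived from X[:, :, 1] here ...
    return embed  # fixed: a single tensor, same type as self.embed_tokens(input_ids)

With this change both branches of the rpe_type check assign a plain tensor to inputs_embeds, so the code after that point runs the same way for bipe_alibi as for the embed_tokens path.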