'TextGenerationPipeline' has no attribute ALLOWED_MODELS #6
Comments
Hi, I'm running into the same problem. Have you solved it? Thanks.
It appears to work if you comment out that section (lines 11, 12, and 13). That might cause some error checking to fail later, depending on your use case, but I was able to run the text_generation.ipynb example.
@ebolam's answer did not work for me. The code fails on generator = pipeline('text-generation', model=model, tokenizer=tokenizer) with:
RuntimeError: Failed to import transformers.models.biogpt.configuration_biogpt because of the following error (look up to see its traceback):
No module named 'transformers.models.biogpt.configuration_biogpt'
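For reference, a minimal sketch of the failing call, assuming a plain GPT-2 model and tokenizer (the actual objects built in text_generation.ipynb may differ):

```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast, pipeline

# Assumed setup for illustration; the notebook constructs its model and
# tokenizer differently.
model = GPT2LMHeadModel.from_pretrained("gpt2")
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")

# This is the call that raised the RuntimeError above on a recent transformers build.
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
print(generator("Hello, world")[0]["generated_text"])
```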
I was able to fix the problem by using an older version of transformers, 4.15.0.
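A quick sanity check that the pin is active (assuming the environment was reinstalled with pip install transformers==4.15.0):

```python
# Confirm the downgraded transformers version is the one being imported.
import transformers

print(transformers.__version__)  # expected to print 4.15.0
```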
This part looks broken on the Hugging Face Transformers main build:
mkultra/mkultra/inference.py, line 12 (at commit a25c72d)
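A hypothetical reconstruction of the failing access, inferred only from the issue title rather than from the repository source; recent transformers builds no longer define this class attribute:

```python
from transformers import TextGenerationPipeline

# Hypothetical illustration (not the actual inference.py code): on recent
# transformers releases this attribute lookup raises AttributeError.
try:
    allowed = TextGenerationPipeline.ALLOWED_MODELS
except AttributeError as exc:
    print(exc)
```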