
ImportError: cannot import name 'checkpoint' from 'transformers.models.t5.modeling_t5' #47

Open
Dylan-LDR opened this issue Dec 17, 2023 · 15 comments

@Dylan-LDR

Thanks for your great work. I successfully installed vima and vima_bench. However, when I try to run the example with `python3 scripts/example.py --ckpt=../VimaBench/ckpts/200M.ckpt --partition=placement_generalization --task=follow_order`, the import of `checkpoint` fails when vima/nn/prompt_encoder/prompt_encoder.py imports modeling_t5. Did I miss some requirements, and how can I fix this?
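
For reference, here is a minimal reproduction together with a possible fallback shim; it is only a sketch, and the idea that older transformers releases re-exported `checkpoint` from torch.utils.checkpoint is my assumption, not something confirmed here.

```python
# Sketch of a compatibility shim for the failing import.
# Assumption: older transformers releases exposed `checkpoint` at module level in
# modeling_t5 (re-exported from torch.utils.checkpoint); newer releases dropped it.
try:
    from transformers.models.t5.modeling_t5 import checkpoint  # older transformers
    print("module-level checkpoint is still exported")
except ImportError:
    from torch.utils.checkpoint import checkpoint  # fallback for newer releases
    print("falling back to torch.utils.checkpoint.checkpoint")

print(checkpoint)
```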

@Dylan-LDR
Author

I did check the source code and documentation of the Transformers modeling_t5 module, but I could not find a definition of checkpoint.

@Zero-coder

same question +1 @yunfanjiang

@Zero-coder

Thanks for contributing such inspiring work. I guess this problem might also be related to the HF transformers version.

@Dylan-LDR
Author

Yes, I think the problem lies in a version conflict of transformers. Maybe `checkpoint` was removed in a later release, which is why I cannot find it in my installed transformers 4.36.1. Unfortunately, the requirements do not pin detailed package versions, and it is hard to find support from the community. I hope there can be more explanation and discussion.
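
For anyone comparing environments, a quick sketch for checking the installed version and whether the symbol is still exported:

```python
# Sketch: print the installed transformers version and check whether
# modeling_t5 still exposes a module-level `checkpoint` symbol.
import importlib

import transformers

print(transformers.__version__)  # 4.36.1 in my environment
t5_modeling = importlib.import_module("transformers.models.t5.modeling_t5")
print(hasattr(t5_modeling, "checkpoint"))  # False here, hence the ImportError
```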

@Abivishaq

Yeah, the transformers version seems to be the issue. Version 4.20.0 worked for me.

pip install transformers==4.20.0 

@Zero-coder

Zero-coder commented Dec 19, 2023

> Yes, I think the problem lies in a version conflict of transformers. Maybe `checkpoint` was removed in a later release, which is why I cannot find it in my installed transformers 4.36.1. Unfortunately, the requirements do not pin detailed package versions, and it is hard to find support from the community. I hope there can be more explanation and discussion.

Thanks for your information, it's helpful.

@Zero-coder

> Yeah, the transformers version seems to be the issue. Version 4.20.0 worked for me.
>
> pip install transformers==4.20.0

Thanks, transformers==4.20.0 helped. However, I then encountered a new problem:

[2023-12-19T02:58:39Z ERROR cached_path::cache] Max retries exceeded for https://huggingface.co/t5-base/resolve/main/tokenizer.json
Traceback (most recent call last):
  File "/home/jmw/VIMA-main/scripts/example.py", line 74, in
    tokenizer = Tokenizer.from_pretrained("t5-base")
Exception: Model "t5-base" on the Hub doesn't have a tokenizer

@Zero-coder

These look like related solutions; I am trying them (see the sketch below).

#20
#42
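
As far as I can tell, the fix in #20 boils down to loading the tokenizer from a local copy of tokenizer.json instead of pulling it from the Hub. A rough sketch of that; the local path is a placeholder for wherever you save the file:

```python
# Sketch of the local-tokenizer workaround: download tokenizer.json by hand from
# https://huggingface.co/t5-base/resolve/main/tokenizer.json (browser, or a machine
# that can reach the Hub), then load it from disk instead of calling from_pretrained.
# The path below is a placeholder; point it at wherever you saved the file.
from tokenizers import Tokenizer

tokenizer = Tokenizer.from_file("t5-base/tokenizer.json")
print(tokenizer.encode("follow the order").tokens)
```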

@Dylan-LDR
Author

#20 (comment)
This reply works for me.

@Zero-coder

Zero-coder commented Dec 19, 2023

> #20 (comment) This reply works for me.

Hello, thanks for your information.
Can you give more details about this? I tried it and ran into the error shown in this screenshot:

[screenshot of the error]

@Zero-coder

And this is my file structure: [screenshot attached]

@Zero-coder

@Dylan-LDR

@Dylan-LDR
Author

Dylan-LDR commented Dec 19, 2023

> > #20 (comment) This reply works for me.
>
> Hello, thanks for your information. Can you give more details about this? I tried it and ran into the error shown in the screenshot.

I just followed the instructions in #20 (comment) and loaded the tokenizer from a local directory; it works fine for me. I guess the assertion is caused by a missing VIMA model checkpoint. Did you download the checkpoint file 200M.ckpt from HF and place it in your code directory? I cannot see it in the file structure you posted.
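
If it helps, here is a sketch of fetching the checkpoint programmatically with huggingface_hub; the repo id is my guess, so substitute whatever location the VIMA README actually links to:

```python
# Sketch: download the 200M VIMA checkpoint via huggingface_hub.
# The repo id "VIMA/VIMA" is an assumption; replace it with the repo the
# VIMA README points to if it differs.
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(repo_id="VIMA/VIMA", filename="200M.ckpt", local_dir="ckpts")
print(ckpt_path)  # pass this path to --ckpt when running scripts/example.py
```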

@Zero-coder

> I just followed the instructions in #20 (comment) and loaded the tokenizer from a local directory; it works fine for me. I guess the assertion is caused by a missing VIMA model checkpoint. Did you download the checkpoint file 200M.ckpt from HF and place it in your code directory? I cannot see it in the file structure you posted.

Many thanks, it's working after following your suggestion. 🙂🌱

@jpbalarini

Downgrading with `pip install transformers==4.34.1` did the trick for me.
