While developing an inference task based on tools/run_text_generation_server_hf.py, I found that in the following three lines, `tokenizer` and `model` should be `self.tokenizer` and `self.model`:

Yuan-2.0/tools/run_text_generation_server_hf.py, lines 148, 150, and 151 at commit a262e40.
The same problem also exists in tools/run_text_generation_server.py.
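To make the suggested change concrete, here is a minimal, hypothetical sketch. The class name, method, and surrounding code are assumptions for illustration only, not the actual contents of lines 148-151; the point is that the bare names `tokenizer` and `model` inside the method should refer to the instance attributes set on `self`.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer


class TextGenerationServer:
    """Hypothetical server class; names are illustrative, not from the repo."""

    def __init__(self, model_path):
        # Instance attributes created during initialization.
        self.tokenizer = AutoTokenizer.from_pretrained(model_path)
        self.model = AutoModelForCausalLM.from_pretrained(model_path)

    def generate(self, prompt, max_new_tokens=128):
        # Before (as reported in the issue), the bare names were used:
        #   inputs = tokenizer(prompt, return_tensors="pt")
        #   outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
        # After: reference the instance attributes instead.
        inputs = self.tokenizer(prompt, return_tensors="pt")
        outputs = self.model.generate(**inputs, max_new_tokens=max_new_tokens)
        return self.tokenizer.decode(outputs[0], skip_special_tokens=True)
```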
@zhaoxudong01-ieisystem
@lilianlhl Thank you very much for raising this issue. We verified that the code runs correctly either way. That said, the self.tokenizer and self.model you suggest are more consistent with coding conventions, so we will submit a PR right away to improve this code. Thank you for the valuable suggestion.