
Issue with the inference example code #105

Open
Iamlovingit opened this issue Jan 30, 2024 · 3 comments

Comments

@Iamlovingit (Contributor)

While developing an inference task based on tools/run_text_generation_server_hf.py, I found that in the following three lines, `tokenizer` and `model` should be `self.tokenizer` and `self.model`:

```python
inputs = tokenizer(ques_list[0]["ques"], return_tensors="pt")["input_ids"].to(self.args.device)

response_raw = model.generate(inputs, do_sample=do_sample, top_k=top_k, temperature=temperature, num_beams=num_beams, max_length=tokens_to_generate)

response = tokenizer.decode(response_raw[0])
```

The same problem also exists in tools/run_text_generation_server.py.
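To make the reported scoping issue concrete, here is a minimal, self-contained sketch. The `StubTokenizer` and `StubModel` classes are placeholders invented for illustration, not the real Hugging Face API; the point is only that attributes set in `__init__` must be accessed through `self`, while bare names like `tokenizer` would raise a `NameError` unless module-level globals with those names happen to exist:

```python
# Stub classes standing in for the real tokenizer/model (hypothetical,
# not the Hugging Face API) to demonstrate the attribute-scoping issue.

class StubTokenizer:
    def __call__(self, text, return_tensors=None):
        # Pretend tokenization: one integer id per whitespace token.
        return {"input_ids": list(range(len(text.split())))}

    def decode(self, ids):
        return " ".join(str(i) for i in ids)

class StubModel:
    def generate(self, inputs, **kwargs):
        # Pretend generation: append one "generated" token id.
        return [inputs + [99]]

class GenerationServer:
    def __init__(self):
        # tokenizer and model are stored as *instance attributes* ...
        self.tokenizer = StubTokenizer()
        self.model = StubModel()

    def answer(self, question):
        # ... so methods must reference them via self. Writing bare
        # `tokenizer(...)` or `model.generate(...)` here would raise
        # NameError unless globals with those names exist elsewhere.
        inputs = self.tokenizer(question, return_tensors="pt")["input_ids"]
        response_raw = self.model.generate(inputs, max_length=16)
        return self.tokenizer.decode(response_raw[0])

server = GenerationServer()
print(server.answer("hello world"))  # → "0 1 99"
```

This also explains the maintainers' observation below that both spellings can "run correctly": if the script happens to define module-level `tokenizer` and `model` globals, the bare names resolve to those, but the `self.`-qualified form is the robust, conventional choice.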

@Iamlovingit changed the title from "Inference example code" to "Issue with the inference example code" on Jan 30, 2024
@Shawn-IEITSystems (Collaborator)

@zhaoxudong01-ieisystem

@zhaoxudong01 (Collaborator)

@lilianlhl

@lilianlhl (Contributor)

Thank you very much for raising this issue. We have verified that the code runs correctly either way. That said, `self.tokenizer` and `self.model`, as you suggest, better follow coding conventions, so we will submit a PR shortly to improve this code. Thank you for the valuable suggestion.
