Model loading code:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

# Load the generation config shipped with the checkpoint
config = GenerationConfig.from_pretrained(
    "/data/model/Baichuan-13B-Base"
)
# Load the model in fp16, sharded automatically across available devices
model = AutoModelForCausalLM.from_pretrained(
    "/data/model/Baichuan-13B-Base",
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True
)
model.generation_config = config
# Baichuan's tokenizer needs the slow (SentencePiece) implementation
tokenizer = AutoTokenizer.from_pretrained(
    "/data/model/Baichuan-13B-Base",
    use_fast=False,
    trust_remote_code=True
)

I have also applied the configuration suggested in other issues, but the error below is still raised. How should I use Baichuan-13B-Base for question answering?
I hit the same error. Is there a fix for this?
Same question here: how can the base model be used for Q&A?