
Restore the over-token-limit error message #468

Closed
5 tasks done
SunnyGPT opened this issue Aug 25, 2023 · 2 comments
Labels
bug Something isn't working

Comments

SunnyGPT commented Aug 25, 2023

Routine checks

  • I have confirmed that there is no similar existing issue
  • I have confirmed that I have upgraded to the latest version
  • I have read the project README in full, especially the FAQ section
  • I understand and am willing to follow up on this issue, assist with testing, and provide feedback
  • I understand and agree to the above, and I understand that the maintainers have limited time, so issues that do not follow these rules may be ignored or closed directly

Problem description
Previously, the over-token-limit error message looked like this:
{"error":{"message":"This model's maximum context length is 4097 tokens, however you requested 4179 tokens (3965 in your prompt; 214 for the completion). Please reduce your prompt; or completion length.","type":"invalid_request_error","param":"","code":null}} attempt 6
Now it looks like this:
{"error":{"message":"bad status code: 400","type":"one_api_error","param":"","code":"bad_status_code"}} attempt 6

This kind of error message leads customers to believe the problem lies with our relay rather than with their own request. To avoid unnecessary misunderstandings and having to repeatedly explain this to customers, I suggest restoring the original message.
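For illustration, here is a minimal sketch (in Go; this is not the actual one-api code, and the names relayError and forwardUpstreamError are hypothetical) of how a relay could pass the upstream error body through to the client on a non-2xx response, so that messages like the context-length error above reach the caller unchanged, while still falling back to a generic wrapper when the upstream body cannot be parsed:

```go
// Hypothetical sketch, not the actual one-api implementation.
package relay

import (
	"encoding/json"
	"io"
	"net/http"
)

// relayError mirrors the JSON error envelope shown in the examples above.
type relayError struct {
	Error struct {
		Message string `json:"message"`
		Type    string `json:"type"`
		Param   string `json:"param"`
		Code    any    `json:"code"`
	} `json:"error"`
}

// forwardUpstreamError writes the upstream error response back to the client
// when the upstream status is not successful. If the upstream body parses as
// the expected error envelope, it is forwarded verbatim; otherwise a generic
// wrapper (comparable to the current "bad status code" message) is returned.
func forwardUpstreamError(w http.ResponseWriter, resp *http.Response) {
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)

	var upstream relayError
	if err == nil && json.Unmarshal(body, &upstream) == nil && upstream.Error.Message != "" {
		w.Header().Set("Content-Type", "application/json")
		w.WriteHeader(resp.StatusCode)
		_, _ = w.Write(body) // pass the upstream error through unchanged
		return
	}

	// Fallback: generic wrapper when the upstream body is unusable.
	fallback := relayError{}
	fallback.Error.Message = "bad status code: " + resp.Status
	fallback.Error.Type = "one_api_error"
	fallback.Error.Code = "bad_status_code"
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(resp.StatusCode)
	_ = json.NewEncoder(w).Encode(fallback)
}
```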

Related screenshots
(screenshot attached in the original issue)

SunnyGPT added the bug label on Aug 25, 2023
zwluoqi commented Aug 25, 2023

Agreed, there's no detailed error message now, which is a real pain.

songquanpeng (Owner) commented

Fixed, please test and report back the results.

ChongzhengZhao pushed a commit to ChongzhengZhao/llm-one-api that referenced this issue Sep 11, 2023