Problem description
I tried to use the gpt-4o model to summarize a PDF (fpubh-12-1368933.pdf) and got the following error:
This model's maximum context length is 128000 tokens. However, your messages resulted in 180050 tokens. Please reduce the length of the messages. (type: invalid_request_error)
According to OpenAI's official pricing page ( https://openai.com/api/pricing ), which describes how multimodal models count image tokens, such a small PDF with only a few images could not possibly add up to 180,050 tokens.
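To show why the image count alone cannot explain the bill, here is a minimal sketch of gpt-4o's documented image-token cost, assuming OpenAI's published tiling scheme (85 base tokens plus 170 per 512 px tile in high-detail mode, after scaling); the function name and the example dimensions are my own:

```python
import math

def estimate_image_tokens(width: int, height: int, detail: str = "high") -> int:
    """Rough estimate of gpt-4o image token cost, assuming OpenAI's
    published tiling scheme: 85 base tokens + 170 per 512px tile."""
    if detail == "low":
        return 85  # low-detail images cost a flat 85 tokens
    # Scale down to fit within a 2048x2048 square (no upscaling).
    scale = min(1.0, 2048 / max(width, height))
    width, height = width * scale, height * scale
    # Then scale down so the shorter side is at most 768px.
    scale = min(1.0, 768 / min(width, height))
    width, height = width * scale, height * scale
    tiles = math.ceil(width / 512) * math.ceil(height / 512)
    return 85 + 170 * tiles

# A typical full-page figure, e.g. 1024x768, costs well under 1k tokens:
print(estimate_image_tokens(1024, 768))  # → 765
```

Even a dozen such figures would stay far below 180,050 tokens, which points to something other than image inputs inflating the count.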
Looking at the content parsed via https://blob.chatnio.net/ (the parsed output is at https://gist.github.com/whitewatercn/d1dd7488a158e0e0b0a29d9008ff1a77 ), there is an extremely long base64-encoded image. A decoding tool confirms it is an image from the PDF converted to base64; its dimensions are
I suspect this extremely long encoded string is being treated as plain text, consuming a huge number of tokens.
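The suspicion is easy to sanity-check with arithmetic: base64 inflates raw bytes by 4/3, and when that string is fed to the model as text, every few characters cost a token. A minimal sketch (the 300 KB image size and the ~4-characters-per-token heuristic are my assumptions; base64 often tokenizes even less efficiently):

```python
import base64

# Stand-in for a real embedded image: ~300 KB of raw bytes.
image_bytes = b"\x89PNG" + bytes(300_000)

# Base64 encoding expands the payload by 4/3.
b64 = base64.b64encode(image_bytes).decode()

# Crude heuristic: ~4 characters per token for text input.
approx_tokens = len(b64) // 4

print(len(b64), approx_tokens)  # ~400k characters, ~100k tokens
```

A single mid-sized image handled this way can therefore account for a six-figure token count on its own, which matches the 180,050-token error far better than genuine image inputs would.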
An earlier issue ( #215 ) mentioned this,
but the model I am using is gpt-4o.