feat: support the latest GPT-4 Turbo (gpt-4-1106-preview) model (Chan…
PeterDaveHello authored Nov 24, 2023
Parent a6605e8 · Commit ed1e41c
Showing 2 changed files with 6 additions and 1 deletion.
README.md: 2 changes (1 addition, 1 deletion)
@@ -217,7 +217,7 @@ services:
 # API endpoint URL, optional, available when OPENAI_API_KEY is set
 OPENAI_API_BASE_URL: xxx
 # API model, optional, available when OPENAI_API_KEY is set, https://platform.openai.com/docs/models
-# gpt-4, gpt-4-0314, gpt-4-0613, gpt-4-32k, gpt-4-32k-0314, gpt-4-32k-0613, gpt-3.5-turbo-16k, gpt-3.5-turbo-16k-0613, gpt-3.5-turbo, gpt-3.5-turbo-0301, gpt-3.5-turbo-0613, text-davinci-003, text-davinci-002, code-davinci-002
+# gpt-4, gpt-4-1106-preview, gpt-4-0314, gpt-4-0613, gpt-4-32k, gpt-4-32k-0314, gpt-4-32k-0613, gpt-3.5-turbo-16k, gpt-3.5-turbo-16k-0613, gpt-3.5-turbo, gpt-3.5-turbo-0301, gpt-3.5-turbo-0613, text-davinci-003, text-davinci-002, code-davinci-002
 OPENAI_API_MODEL: xxx
 # Reverse proxy, optional
 API_REVERSE_PROXY: xxx
service/src/chatgpt/index.ts: 5 changes (5 additions, 0 deletions)
@@ -53,6 +53,11 @@ let api: ChatGPTAPI | ChatGPTUnofficialProxyAPI
   options.maxModelTokens = 32768
   options.maxResponseTokens = 8192
 }
+// if using GPT-4 Turbo
+else if (model.toLowerCase().includes('1106-preview')) {
+  options.maxModelTokens = 128000
+  options.maxResponseTokens = 4096
+}
 else {
   options.maxModelTokens = 8192
   options.maxResponseTokens = 2048
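
For readers skimming the diff: the hunk above extends an existing if/else chain that maps the configured model name to ChatGPTAPI token limits. Below is a minimal standalone sketch of how that chain reads after this commit. The helper name resolveTokenLimits is hypothetical, the condition for the 32k branch is not visible in the hunk (only its body), and any outer logic guarding the chain is omitted; this is an illustration, not the project's actual code.

// Hypothetical sketch of the token-limit selection as it reads after this commit.
// Only the branches visible in the hunk are reproduced.
interface TokenLimits {
  maxModelTokens: number
  maxResponseTokens: number
}

function resolveTokenLimits(model: string): TokenLimits {
  const m = model.toLowerCase()
  // 32k-context GPT-4 variants: only this branch's body appears at the top of the
  // hunk, so the `includes('32k')` condition here is an assumption
  if (m.includes('32k'))
    return { maxModelTokens: 32768, maxResponseTokens: 8192 }
  // GPT-4 Turbo (gpt-4-1106-preview), added by this commit:
  // 128k context window, 4k max response
  else if (m.includes('1106-preview'))
    return { maxModelTokens: 128000, maxResponseTokens: 4096 }
  // remaining models handled by this chain: 8k context, 2k max response
  else
    return { maxModelTokens: 8192, maxResponseTokens: 2048 }
}

// Example:
// resolveTokenLimits('gpt-4-1106-preview')
//   -> { maxModelTokens: 128000, maxResponseTokens: 4096 }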
