
[Bug] Invalid URL (POST /v1/chat/completions) #2175

Closed
johnfelipe opened this issue Jun 27, 2023 · 20 comments
Labels
enhancement New feature or request

Comments

@johnfelipe
Contributor

The latest commit shows me this issue:

[screenshot SNAG-0129]

The one before that is OK:

[screenshot SNAG-0130]

@Yidadaa
Collaborator

Yidadaa commented Jun 28, 2023

Have you set the Endpoint?

[screenshot]


@johnfelipe
Contributor Author

johnfelipe commented Jun 28, 2023 via email

@Yidadaa
Collaborator

Yidadaa commented Jun 28, 2023

If you have direct access to api.openai.com, set the Endpoint to https://api.openai.com.
If you do not have direct access to api.openai.com, keep the option as /api/openai. This will use Vercel to proxy all your requests.

Yidadaa added the enhancement (New feature or request) label on Jun 28, 2023
@Yidadaa
Collaborator

Yidadaa commented Jun 28, 2023

Later, I will add default logic to the code, so if the user forgets to set this option, the default endpoint will be used automatically.

@QAQQL

QAQQL commented Jun 28, 2023

When I was using https://github.com/THUDM/ChatGLM2-6B, its official README recommends this project:
Thanks to [@hiyouga](https://github.com/THUDM/ChatGLM2-6B/blob/main) for implementing an OpenAI-format streaming API deployment, which can serve as the backend for any ChatGPT-based application, such as [ChatGPT-Next-Web](https://github.com/Yidadaa/ChatGPT-Next-Web). It can be deployed by running [openai_api.py](https://github.com/THUDM/ChatGLM2-6B/blob/main/openai_api.py) from the repository.

I set it up as follows:
[screenshot]
I found that the model dropdown list does not trigger the /v1/models endpoint, and every API request is blocked as cross-origin.
[screenshot]

#1179

@QAQQL

QAQQL commented Jun 28, 2023

[screenshot]
The models below cannot be selected.


@Yidadaa
Collaborator

Yidadaa commented Jun 28, 2023

@QAQQL With the compatible interface you do not need to select a chatglm model; just use the default gpt-3.5-turbo.
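
A minimal sketch of what that looks like in practice (not from this thread; the localhost address, port, and response handling are assumptions, adjust them to your deployment of openai_api.py): keep the model field at the default gpt-3.5-turbo and POST the standard chat-completions payload:

import requests

# Hypothetical local deployment of ChatGLM2-6B's openai_api.py; adjust host/port to yours.
resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        # Keep the web UI's default model name; the compatible backend answers with ChatGLM either way.
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])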


@QAQQL

QAQQL commented Jun 28, 2023

But the API request still fails.
[screenshot]
It sends an OPTIONS request first.


@Yidadaa
Collaborator

Yidadaa commented Jun 28, 2023

@QAQQL This is the browser's default policy. You can open a PR with them to fix it: just allow all OPTIONS requests and return 200.


@Yidadaa
Collaborator

Yidadaa commented Jun 28, 2023

@QAQQL You can refer to the FastAPI documentation to add CORS support for it: https://fastapi.tiangolo.com/tutorial/cors/#steps

You only need to add a few lines of code at:

https://github.com/THUDM/ChatGLM2-6B/blob/ba60190296e3451c0e7e6a0f26c60a8ba7720dfe/openai_api.py#L26

from fastapi.middleware.cors import CORSMiddleware

# Allow cross-origin requests from any origin so browser clients
# (e.g. ChatGPT-Next-Web) can call the API; this also makes the
# preflight OPTIONS request return 200.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
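
With that middleware in place, the preflight OPTIONS request the browser sends before its POST should come back with 200 and the matching Access-Control-Allow-* headers. A quick way to confirm this from outside the browser (a sketch; the localhost:8000 address is an assumption, point it at wherever openai_api.py is running):

import requests

# Simulate the browser's CORS preflight against the chat-completions route.
resp = requests.options(
    "http://localhost:8000/v1/chat/completions",
    headers={
        "Origin": "https://chatgpt1.nextweb.fun",
        "Access-Control-Request-Method": "POST",
        "Access-Control-Request-Headers": "authorization,content-type",
    },
)
print(resp.status_code)                                 # expect 200 once CORS is enabled
print(resp.headers.get("Access-Control-Allow-Origin"))  # expect "*" (or the requesting origin)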

@Yidadaa
Collaborator

Yidadaa commented Jun 28, 2023

Waiting for this PR to be merged: THUDM/ChatGLM2-6B#102

@QAQQL

QAQQL commented Jun 28, 2023

OK, thank you.


@KAIYI-HSU

@Yidadaa
Hello:

I also deployed the backend through ChatGLM2-6B's openai_api.py.
I only modified the model path in the file and changed the external port, as follows:
[screenshot]
I have confirmed that the port opens successfully, that I can reach the API docs at <my-fixed-IP>:19745/docs, and that a POST test succeeds as well; GLM2 can generate a response.
[screenshot]

When I use your front-end webpage https://chatgpt1.nextweb.fun/
[screenshot]
and fill in the Endpoint with my <my-fixed-IP>:19745 or https://<my-fixed-IP>:19745,
the following error appears during a conversation:
{
"error": true,
"message": "Failed to fetch"
}
[screenshot]

=====
I am hoping for some help:
with the backend deployed, why can no front-end webpage interact with my <my-fixed-IP>:19745 or https://<my-fixed-IP>:19745?
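
In a browser, "Failed to fetch" usually means the request was blocked before it ever reached the server, most often a failed CORS preflight or mixed content (an https:// page such as chatgpt1.nextweb.fun is not allowed to call a plain http:// endpoint). A sketch for separating a server problem from browser-side blocking, keeping <my-fixed-IP>:19745 as a placeholder:

import requests

# Talk to the backend directly, outside the browser, to rule out the server itself.
# Replace the placeholder; http vs. https must match your actual deployment.
base = "http://<my-fixed-IP>:19745"

# 1) Does the OpenAI-compatible route answer at all?
r = requests.post(
    base + "/v1/chat/completions",
    json={"model": "gpt-3.5-turbo",
          "messages": [{"role": "user", "content": "ping"}]},
    timeout=60,
)
print(r.status_code)

# 2) Would a browser's CORS preflight be accepted?
p = requests.options(
    base + "/v1/chat/completions",
    headers={"Origin": "https://chatgpt1.nextweb.fun",
             "Access-Control-Request-Method": "POST"},
)
print(p.status_code, p.headers.get("Access-Control-Allow-Origin"))

If both checks pass but the web page still shows the error, the remaining suspect is the page being served over HTTPS while the endpoint is plain HTTP.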


alchemist139 pushed a commit to alchemist139/ChatGPT-Next-Web that referenced this issue Sep 21, 2023
chenzeyu pushed a commit to neutronsg/ChatGPT-Next-Web that referenced this issue Nov 8, 2023
gaogao1030 pushed a commit to gaogao1030/ChatGPT-Next-Web that referenced this issue May 16, 2024