
[BUG] - OpenAI API calls returning 404 in logs #4209

Open
6 tasks done
TheGief opened this issue Sep 14, 2024 · 8 comments
Labels
bug (Something isn't working), triage

Comments

@TheGief

TheGief commented Sep 14, 2024

First Check

  • This is not a feature request.
  • I added a very descriptive title to this issue (title field is above this).
  • I used the GitHub search to find a similar issue and didn't find it.
  • I searched the Mealie documentation, with the integrated search.
  • I already read the docs and didn't find an answer.
  • This issue can be replicated on the demo site (https://demo.mealie.io/).

What is the issue you are experiencing?

While setting up the OpenAI integration I see 404s in the logs.

I can only set the base URL. Looking at the OpenAI docs, it looks like the completions URL is https://api.openai.com/v1/completions, which is different from the URL in the logs below.

https://platform.openai.com/docs/api-reference/completions/create?lang=curl
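
For reference, a minimal sketch of how the request URL gets built, assuming the openai Python client and a placeholder API key (not necessarily how Mealie wires it up internally):

# Sketch: the client builds the request URL as "<base_url>/chat/completions".
from openai import OpenAI

# With no base_url, the client defaults to https://api.openai.com/v1,
# so a chat request goes to POST https://api.openai.com/v1/chat/completions.
client = OpenAI(api_key="sk-...")

# A base URL of https://api.openai.com (without the /v1 segment) would
# instead produce POST https://api.openai.com/chat/completions -> 404,
# which matches the URL in the log below.
# client = OpenAI(api_key="sk-...", base_url="https://api.openai.com")

resp = client.chat.completions.create(
    model="gpt-4o",  # example model
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)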

Steps to Reproduce

  1. Set up OpenAI using the backend configuration
  2. Go to the Parser page and choose OpenAI
  3. Click submit

Please provide relevant logs

INFO 2024-09-14T14:26:18 - HTTP Request: POST https://api.openai.com/chat/completions "HTTP/1.1 404 Not Found"

Mealie Version

Nightly
Build
d8dbcac1964c9e485be066c6ab49feda14e31220

Deployment

Docker (Linux)

Additional Deployment Details

No response

TheGief added the bug and triage labels on Sep 14, 2024
@michael-genson
Collaborator

michael-genson commented Sep 14, 2024

If you're using OpenAI you shouldn't be setting the base URL: https://docs.mealie.io/documentation/getting-started/installation/open-ai/

Additionally, if you're on the free tier, you're going to get a 404 error since you don't have access to all models, including the default one.
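
One quick way to check which models a key can actually use, as a sketch with the openai Python client (placeholder key, not a Mealie-specific command):

from openai import OpenAI

client = OpenAI(api_key="sk-...")  # no base_url override

# Lists the models the key has access to; gpt-4o is used here as an
# example of a model the free tier may not expose.
available = {m.id for m in client.models.list()}
print("gpt-4o available:", "gpt-4o" in available)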

@TheGief
Author

TheGief commented Sep 14, 2024

Heya, thanks for your response. Confirming I'm not setting the base URL and I'm not on the free tier of OpenAI.

I think OpenAI changed the URL of the "completions" API. My guess is that the Python lib you're using to hit the completions API is using an old URL, "/chat/completions". I think the new URL is "/v1/completions". I tried curling both with my API key and only the new URL works.

It also seems that "gpt-4o" isn't supported by the /v1/completions API. However, "gpt-3.5-turbo-instruct" is.
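
For reference, the two URLs correspond to two different endpoints in the openai Python client; a rough sketch (placeholder key):

from openai import OpenAI

client = OpenAI(api_key="sk-...")

# Legacy completions endpoint -> POST <base_url>/completions
# (takes instruct-style models such as gpt-3.5-turbo-instruct)
legacy = client.completions.create(
    model="gpt-3.5-turbo-instruct",
    prompt="Say hi",
)

# Chat completions endpoint -> POST <base_url>/chat/completions
# (the one chat models such as gpt-4o use)
chat = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Say hi"}],
)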

@michael-genson
Collaborator

I just checked on my instance with all default settings and it works as expected. Are you sure you cleared the base URL? Are you sure your instance can reach OpenAI?

Also make sure you're on the latest nightly, which has the latest OpenAI Python client
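
If it helps, a quick sanity check of what base URL the client will actually use (a sketch with the openai Python client; the key is a placeholder):

from openai import OpenAI

client = OpenAI(api_key="sk-...")  # nothing overriding the base URL
# With defaults this should print something like https://api.openai.com/v1/
print(client.base_url)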

@hernil

hernil commented Oct 10, 2024

Confirming that I'm seeing this in the latest release (1.12.0), and some other issue (I think my API key is not submitted?) on the nightly release.

I'm available for debugging if desired!

@michael-genson
Collaborator

Please share your docker compose file and the logs from the failed request

@hernil

hernil commented Oct 11, 2024

Hope this helps!

docker-compose.yml
version: "3.8"
networks:
 web:
   external: true
 internal:
   external: false
services:
 mealie:
   image: ghcr.io/mealie-recipes/mealie:latest

   container_name: mealie
   restart: always
   expose:
       - 9000
   deploy:
     resources:
       limits:
         memory: 1000M
   volumes:
     - ./mealie-data:/app/data/
   environment:
     PUID: 1000
     PGID: 1000
     TZ: Europe/Someplace
     MAX_WORKERS: 1
     WEB_CONCURRENCY: 1
     BASE_URL: https://mealie.redacted.com
     OPENAI_API_KEY: ${OPENAI_API_KEY}
     # Database Settings
     DB_ENGINE: postgres
     POSTGRES_USER: mealie
     POSTGRES_PASSWORD: ${DB_PASSWORD}
     POSTGRES_SERVER: mealie_postgres
     POSTGRES_PORT: 5432
     POSTGRES_DB: mealie
   depends_on:
     postgres:
       condition: service_healthy
   labels:
     - traefik.http.routers.mealie.rule=Host(`mealie.redacted.com`)
     - traefik.http.routers.mealie.tls=true
     - traefik.http.routers.mealie.tls.certresolver=lets-encrypt
   networks:
     - web
     - internal

 postgres:
   container_name: mealie_postgres
   image: postgres:15
   restart: always
   volumes:
     - ./mealie-pgdata:/var/lib/postgresql/data
   environment:
     POSTGRES_PASSWORD: ${DB_PASSWORD}
     POSTGRES_USER: mealie
     PGUSER: mealie
   healthcheck:
     test: ["CMD", "pg_isready"]
     interval: 30s
     timeout: 20s
     retries: 3
   labels:
     - traefik.enable=false
   networks:
     - internal

log
INFO     2024-10-11T10:56:21 - HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 404 Not Found"
ERROR    2024-10-11T10:56:21 - OpenAI Request Failed
Traceback (most recent call last):
  File "/app/mealie/services/openai/openai.py", line 175, in get_response
    response = await self._get_raw_response(prompt, user_messages, temperature, force_json_response)
  File "/app/mealie/services/openai/openai.py", line 140, in _get_raw_response
    return await client.chat.completions.create(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 1339, in create
    return await self._post(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1816, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1510, in request
    return await self._request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1611, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-4o` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
ERROR    2024-10-11T10:56:21 - OpenAI Request Failed
Traceback (most recent call last):
  File "/app/mealie/services/openai/openai.py", line 175, in get_response
    response = await self._get_raw_response(prompt, user_messages, temperature, force_json_response)
  File "/app/mealie/services/openai/openai.py", line 140, in _get_raw_response
    return await client.chat.completions.create(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 1339, in create
    return await self._post(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1816, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1510, in request
    return await self._request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1611, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-4o` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
INFO     2024-10-11T10:56:21 - [10.4.0.1:0] 500 Internal Server Error "POST /api/parser/ingredient HTTP/1.1"
ERROR    2024-10-11T10:56:21 - Exception in ASGI application
Traceback (most recent call last):
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/gzip.py", line 24, in __call__
    await responder(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/gzip.py", line 44, in __call__
    await self.app(scope, receive, self.send_with_gzip)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/app/mealie/routes/parser/ingredient_parser.py", line 21, in parse_ingredient
    response = await parser.parse([ingredient.ingredient])
  File "/app/mealie/services/parser_services/openai/parser.py", line 99, in parse
    response = await self._parse(ingredients)
  File "/app/mealie/services/parser_services/openai/parser.py", line 84, in _parse
    responses = [
  File "/app/mealie/services/parser_services/openai/parser.py", line 85, in <listcomp>
    OpenAIIngredients.parse_openai_response(response_json) for response_json in responses_json if responses_json
  File "/app/mealie/schema/openai/_base.py", line 24, in parse_openai_response
    return cls.model_validate_json(response or "")
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/pydantic/main.py", line 597, in model_validate_json
    return cls.__pydantic_validator__.validate_json(json_data, strict=strict, context=context)
pydantic_core._pydantic_core.ValidationError: 1 validation error for OpenAIIngredients
  Invalid JSON: EOF while parsing a value at line 1 column 0 [type=json_invalid, input_value='', input_type=str]
    For further information visit https://errors.pydantic.dev/2.8/v/json_invalid

@michael-genson
Collaborator

The model gpt-4o does not exist or you do not have access to it

Looks like you're on the free tier. The free tier won't work with Mealie. See the configuration docs: https://docs.mealie.io/documentation/getting-started/installation/open-ai/

@hernil

hernil commented Oct 12, 2024

Looks like my initial payment didn't go through or something. I repeated it and it now works. Thanks a lot!
