Unsupported Endpoint Request for LM Studio Model #69

Open
jingrongx opened this issue May 12, 2024 · 2 comments

While attempting to use a model served through LM Studio, I found that the LM Studio server only supports the following endpoints:

  • /v1/chat/completions
  • /v1/embeddings
  • /v1/models

However, the request failed with the following error:
2024/05/12 04:35:34 failed to process input: failed to get message summary: Unexpected endpoint or method. (POST /v1/api/chat)

Please verify that the requested endpoint matches one of LM Studio's supported endpoints, so that requests are handled correctly and unexpected errors are avoided.
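For context, the failing path in the error is Ollama's native chat route (/api/chat) appended to the OpenAI-style base URL, which LM Studio does not implement. One way to confirm what the server accepts is to probe it directly. A minimal sketch, assuming the LM Studio address from the logs below and a model id taken from the /v1/models listing:

# Supported: list the models LM Studio is serving
curl http://20.13.111.165:5480/v1/models

# Supported: OpenAI-compatible chat completions
curl http://20.13.111.165:5480/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "<id from /v1/models>", "messages": [{"role": "user", "content": "Hello"}]}'

# Not supported: the Ollama-style route the app is calling,
# which returns "Unexpected endpoint or method."
curl -X POST http://20.13.111.165:5480/v1/api/chat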

jingrongx (Author) commented May 12, 2024

(base) mystic@mystic-virtual-machine:~/Downloads/codel$ docker run \
  -e OLLAMA_MODEL="LM Studio Community/Meta-Llama-3-8B-Instruct-GGUF" \
  -e OLLAMA_SERVER_URL=http://20.13.111.165:5480/v1 \
  -p 8080:8080 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  ghcr.io/semanser/codel:latest
2024/05/12 04:35:20 OK   20240325154630_initial_migration.sql (2.38ms)
2024/05/12 04:35:20 OK   20240325193843_add_logs_table.sql (1.6ms)
2024/05/12 04:35:20 OK   20240328114536_tool_call_id_field.sql (1.67ms)
2024/05/12 04:35:20 OK   20240403115154_add_model_to_each_flow.sql (1.47ms)
2024/05/12 04:35:20 OK   20240403132844_add_model_provider_to_each_flow.sql (1.7ms)
2024/05/12 04:35:20 goose: successfully migrated database to version: 20240403132844
2024/05/12 04:35:20 Migrations ran successfully
[GIN-debug] [WARNING] Creating an Engine instance with the Logger and Recovery middleware already attached.

[GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.
 - using env:   export GIN_MODE=release
 - using code:  gin.SetMode(gin.ReleaseMode)

[GIN-debug] GET    /graphql                  --> github.com/semanser/ai-coder/router.graphqlHandler.func4 (5 handlers)
[GIN-debug] POST   /graphql                  --> github.com/semanser/ai-coder/router.graphqlHandler.func4 (5 handlers)
[GIN-debug] PUT    /graphql                  --> github.com/semanser/ai-coder/router.graphqlHandler.func4 (5 handlers)
[GIN-debug] PATCH  /graphql                  --> github.com/semanser/ai-coder/router.graphqlHandler.func4 (5 handlers)
[GIN-debug] HEAD   /graphql                  --> github.com/semanser/ai-coder/router.graphqlHandler.func4 (5 handlers)
[GIN-debug] OPTIONS /graphql                  --> github.com/semanser/ai-coder/router.graphqlHandler.func4 (5 handlers)
[GIN-debug] DELETE /graphql                  --> github.com/semanser/ai-coder/router.graphqlHandler.func4 (5 handlers)
[GIN-debug] CONNECT /graphql                  --> github.com/semanser/ai-coder/router.graphqlHandler.func4 (5 handlers)
[GIN-debug] TRACE  /graphql                  --> github.com/semanser/ai-coder/router.graphqlHandler.func4 (5 handlers)
[GIN-debug] GET    /playground               --> github.com/semanser/ai-coder/router.New.playgroundHandler.func3 (5 handlers)
[GIN-debug] GET    /terminal/:id             --> github.com/semanser/ai-coder/router.New.wsHandler.func4 (5 handlers)
[GIN-debug] GET    /browser/*filepath        --> github.com/gin-gonic/gin.(*RouterGroup).createStaticHandler.func1 (5 handlers)
[GIN-debug] HEAD   /browser/*filepath        --> github.com/gin-gonic/gin.(*RouterGroup).createStaticHandler.func1 (5 handlers)
2024/05/12 04:35:20 Docker client initialized: mystic-virtual-machine, x86_64
2024/05/12 04:35:20 Docker server API version: 24.0.5
2024/05/12 04:35:20 Docker client API version: 1.43
2024/05/12 04:35:20 Spawning container ghcr.io/go-rod/rod "codel-browser"
2024/05/12 04:35:20 Image ghcr.io/go-rod/rod found locally: true
2024/05/12 04:35:20 Creating container codel-browser...
2024/05/12 04:35:20 Container codel-browser created
2024/05/12 04:35:20 Container codel-browser started
2024/05/12 04:35:20 connect to http://localhost:8080/playground for GraphQL playground
[GIN] 2024/05/12 - 04:35:25 | 301 |      93.697µs |    20.1.200.148 | GET      "/chat/1"
[GIN] 2024/05/12 - 04:35:25 | 304 |      77.008µs |    20.1.200.148 | GET      "/"
[GIN] 2024/05/12 - 04:35:26 | 200 |     453.811µs |    20.1.200.148 | POST     "/graphql"
[GIN] 2024/05/12 - 04:35:26 | 200 |     192.196µs |    20.1.200.148 | POST     "/graphql"
2024/05/12 04:35:34 Starting tasks processor for queue 1
2024/05/12 04:35:34 Using provider: ollama. Model: LM Studio Community/Meta-Llama-3-8B-Instruct-GGUF
[GIN] 2024/05/12 - 04:35:34 | 200 |    3.323563ms |    20.1.200.148 | POST     "/graphql"
2024/05/12 04:35:34 Waiting for a task
[GIN] 2024/05/12 - 04:35:34 | 200 |     301.219µs |    20.1.200.148 | POST     "/graphql"
2024/05/12 04:35:34 Command 1 added to the queue 1
[GIN] 2024/05/12 - 04:35:34 | 200 |    3.879094ms |    20.1.200.148 | POST     "/graphql"
2024/05/12 04:35:34 Processing command 1 of type input
[GIN] 2024/05/12 - 04:35:34 | 200 |    4.254372ms |    20.1.200.148 | POST     "/graphql"
2024/05/12 04:35:34 failed to process input: failed to get message summary: Unexpected endpoint or method. (POST /v1/api/chat)
2024/05/12 04:35:34 Waiting for a task
^C2024/05/12 04:37:52 Shutting down...
2024/05/12 04:37:52 Removing tmp files...
2024/05/12 04:37:52 Cleaning up containers and making all flows finished...
2024/05/12 04:37:52 Deleting container eda73e1e27395f263f155993cf554719d7af45212c7f1b56cc7407584f7e301a...
2024/05/12 04:37:52 Container eda73e1e27395f263f155993cf554719d7af45212c7f1b56cc7407584f7e301a stopped
2024/05/12 04:37:52 Container eda73e1e27395f263f155993cf554719d7af45212c7f1b56cc7407584f7e301a removed
2024/05/12 04:37:52 Shutdown complete

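Judging from the log above, the app runs with the ollama provider and appends Ollama's native /api/chat path to the configured OLLAMA_SERVER_URL, which is how the request ends up at /v1/api/chat against a server that only speaks the OpenAI-compatible routes. Until LM Studio is supported directly, pointing the same variables at a real Ollama server should sidestep the mismatch. A sketch, assuming a local Ollama install on its default port 11434 with the model already pulled (note the base URL has no /v1 suffix, since the app appends its own path):

# On Linux, --add-host makes host.docker.internal resolve to the host
docker run \
  -e OLLAMA_MODEL="llama3" \
  -e OLLAMA_SERVER_URL=http://host.docker.internal:11434 \
  --add-host host.docker.internal:host-gateway \
  -p 8080:8080 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  ghcr.io/semanser/codel:latest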

maxcurrent420 commented Jun 28, 2024

Yeah, I have this error also. Can we work LM Studio support into this?
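Since LM Studio already speaks the OpenAI-compatible chat completions API, one plausible route is to reuse the project's existing OpenAI provider pointed at the LM Studio server instead of the Ollama provider. A sketch of that idea, with the caveat that the exact variable names for the OpenAI provider are assumptions to check against the README (LM Studio's local server ignores the API key value):

# Variable names below are hypothetical -- check the README for the
# project's actual OpenAI-provider settings
docker run \
  -e OPEN_AI_KEY="lm-studio" \
  -e OPEN_AI_MODEL="<id reported by /v1/models>" \
  -e OPEN_AI_SERVER_URL=http://20.13.111.165:5480/v1 \
  -p 8080:8080 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  ghcr.io/semanser/codel:latest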
