
Cannot open after incorrect settings used #2

Open
arakinas opened this issue Jul 29, 2024 · 3 comments

@arakinas

I was trying to use an LM Studio-hosted local server, but apparently put in the wrong endpoint. Every endpoint I attempted to enter as the server produced an error. I haven't connected an agent to a server before, so I could easily have done something incorrect, and I assume I added an incorrect setting. When I attempted to use the chat endpoint to start a query, the button looked as though it was attempting to access my LLM.

I saw that there were errors in the log and was waiting for the app to present the error in return. It did not; it spun indefinitely. I closed the app and my browser and restarted my PC. I have uninstalled and reinstalled the extension, but nothing I do brings the extension back up.

Endpoints I attempted to use, per the LM Studio log:
[2024-07-29 08:56:43.019] [INFO] [LM STUDIO SERVER] Supported endpoints:
[2024-07-29 08:56:43.020] [INFO] [LM STUDIO SERVER] -> GET http://localhost:1234/v1/models
[2024-07-29 08:56:43.020] [INFO] [LM STUDIO SERVER] -> POST http://localhost:1234/v1/chat/completions
[2024-07-29 08:56:43.020] [INFO] [LM STUDIO SERVER] -> POST http://localhost:1234/v1/completions
[2024-07-29 08:56:43.021] [INFO] [LM STUDIO SERVER] -> POST http://localhost:1234/v1/embeddings

I also attempted to connect with just http://localhost:1234/ and http://localhost:1234/v1/, but those also produced errors in the UI when I tried to set them up.

I was using /chat/completions when the error occurred.

What do I need to do to reset my instance, and what settings should I have used?

@rabelenda
Contributor

Hello, thank you for reaching out, trying the extension, and asking about this.

The extension is not currently compatible with LM Studio (I assume you mean https://lmstudio.ai/).

It can currently interact with agents that comply with the contract specified here. Some examples of agents that you can use as a starting point for your own can be found here, here, and here.

One thing you can try is creating an agent with one of these projects to proxy requests to the LM Studio server; a rough sketch of the idea follows below.
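For illustration only, here is a minimal sketch of such a proxy agent in Python, assuming Flask and requests are installed. The agent contract is only linked above, not reproduced in this issue, so the /ask route, its JSON shape, and port 8080 are hypothetical placeholders; only the LM Studio endpoint comes from the log posted earlier.

```python
# Illustrative sketch only: a tiny local agent that proxies prompts to the
# LM Studio OpenAI-compatible endpoint shown in the log above. The /ask
# route, its request/response JSON, and port 8080 are hypothetical; the
# real agent contract is the one linked in this comment.
from flask import Flask, jsonify, request
import requests

LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"  # from the LM Studio log

app = Flask(__name__)

@app.post("/ask")  # hypothetical route; adapt to the actual agent contract
def ask():
    prompt = request.get_json(force=True).get("prompt", "")
    upstream = requests.post(
        LM_STUDIO_URL,
        json={
            # LM Studio serves whichever model is loaded; this name may be ignored
            "model": "local-model",
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,  # fail with a visible error instead of spinning forever
    )
    upstream.raise_for_status()
    answer = upstream.json()["choices"][0]["message"]["content"]
    return jsonify({"answer": answer})

if __name__ == "__main__":
    app.run(port=8080)
```

Note the timeout: it makes a misconfigured or unreachable endpoint fail with an error rather than leaving the client spinning indefinitely, which is the symptom you described.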

If you want, you can also leave this issue open as a request for LM Studio support in the extension. You are also welcome to create a PR contributing changes for such support.

Regards

@arakinas
Author

Roger that, I understand. My real concern is not about LM Studio, as I also have Ollama on my system. I had only attempted to use LM Studio because I am aware it exposes an OpenAI-compatible interface, and I hoped it might work. I did not mean to imply the extension does or should use LM Studio by default, just that it was part of my workflow to get where I am.

My bigger concern, though, is the second part of my initial report: when I attempted any chat completion, the send button started spinning and never stopped. I do not see any way to change the settings so I can try Ollama, and I did not see any way to reset the configuration to a correct one. Sorry for not being clear.

@rabelenda
Contributor

Sorry, I still don't understand what issue you are reporting :(. Can you upload a GIF or video showing the issue?
