Context lookup requires tool support in chat models since v.2.3.37 #903
Comments
Hi @EasternPA

Here is a link showing which OpenRouter models are supposed to support tools: https://openrouter.ai/models?supported_parameters=tools

Earlier versions of Smart Chat supported lookup for chat models that didn't use tools, but that was a bit of a hack job that I decided not to carry into the latest version, since tool support is becoming ubiquitous among newly released models. However, if enough people reply to this post saying they want tool-less model retrieval, I will consider re-adding the feature.

Lastly, while semantic lookup will not work without tools, other methods for including context, like specifying notes directly, are still available.
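For anyone who wants to check tool support programmatically rather than through the linked page, here is a minimal sketch. It assumes a models list shaped like OpenRouter's model listing (each entry with an `id` and a `supported_parameters` array); the sample data below is illustrative, not a live API response.

```javascript
// Sketch: filter a list of OpenRouter-style model entries for tool support.
// Assumption: each entry has `id` and an optional `supported_parameters` array,
// matching what the supported_parameters=tools page filter operates on.
function modelsWithToolSupport(models) {
  return models
    .filter((m) => (m.supported_parameters || []).includes("tools"))
    .map((m) => m.id);
}

// Illustrative sample data (not fetched from the API).
const sample = [
  { id: "anthropic/claude-3.5-haiku", supported_parameters: ["tools", "temperature"] },
  { id: "meta-llama/llama-3.2-3b-instruct:free", supported_parameters: ["temperature"] },
];

console.log(modelsWithToolSupport(sample)); // ["anthropic/claude-3.5-haiku"]
```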
@EasternPA Yeah, in my testing I noticed that some models were failing to use tools even when they said they supported them.

In the second screenshot, the message is using the folder syntax, which will still use the lookup tool. Instead, you will have to mention files directly, like this (the screenshot uses the meta-llama-3.2-3b-instruct:free model).

It might still be possible to get tools working with that model, but it will take some time to play around with it. To start, I would try removing the specified property from the request in the chat model adapter for open_router, though that may have other unintended side effects. And the logic should probably be model-specific, since the existing adapter works with other OpenRouter models.

In case anyone is interested, the relevant adapter file is https://github.com/brianpetro/jsbrains/blob/main/smart-chat-model/adapters/open_router.js 🌴
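The model-specific idea above could be sketched roughly like this. This is not the actual open_router.js adapter code; the function name, the deny-list, and the exact field names (`tools`, `tool_choice`) are assumptions for illustration, based on the common OpenAI-style request shape that OpenRouter accepts.

```javascript
// Hypothetical sketch: strip tool-related fields from the request body only
// for models known to reject them, leaving other OpenRouter models untouched.
const MODELS_WITHOUT_TOOLS = new Set([
  "meta-llama/llama-3.2-3b-instruct:free", // observed failing in this thread
]);

function prepareRequestBody(body) {
  if (MODELS_WITHOUT_TOOLS.has(body.model)) {
    // Drop the tool fields; keep everything else (model, messages, etc.).
    const { tools, tool_choice, ...rest } = body;
    return rest;
  }
  return body;
}

const stripped = prepareRequestBody({
  model: "meta-llama/llama-3.2-3b-instruct:free",
  messages: [{ role: "user", content: "hi" }],
  tools: [{ type: "function", function: { name: "lookup" } }],
});
console.log("tools" in stripped); // false
```

A per-model deny-list keeps the fix contained: models that advertise tool support but fail in practice can be added one at a time without touching the request path for models that work.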
I'm still trying models from the list and running into roadblocks. The first attempt is with ministral-3b (not mistral), and the second is with Google Gemini 1.5 Flash 8B. The errors do not appear to be the same.

Edit: claude-3.5-haiku worked as expected.

Edit 2: I updated on a different vault and claude-3.5-haiku did not work. It only shows me lookup and context but no results.
I use OpenRouter, and before SC 2.3.37 I could flip between models and try them out. Now, nearly every attempt to chat with a model results in an error saying the model does not support tools. When I do manage to get past that warning, I'm faced with this instead:
No idea what, if anything, I can do. Any hints on which models I should focus on? I use the SC sidebar as a writing coach to help with fiction.
Thank you.