[Local LLMs] Support ChatGLM3, which has function call abilities #169
-
Yes please, this is a great idea.
-
If ChatGLM3 supports native OpenAI ChatCompletion function calling, all you have to do is swap out the API endpoint. From a brief look at a Google-translated README, it seems like you might have to write your own wrapper code to support their API calls, because they seem custom to ChatGLM3 (e.g. "role" / "content" plus "tool"). If there's enough demand for this we might try to add this ourselves, but in the meantime you can try to implement it following these instructions and open a PR: https://github.com/cpacker/MemGPT/tree/main/memgpt/local_llm#-adding-support-for-new-llms--improving-performance
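For anyone picking this up, here is a minimal sketch of the kind of wrapper code described above: it maps OpenAI-style ChatCompletion inputs (messages plus function schemas) onto a ChatGLM3-style message list. It assumes, per the ChatGLM3 tool-using README, that tools are declared alongside the system message and that tool results use an "observation" role; the function name `to_chatglm3_messages` is hypothetical and not part of MemGPT or the ChatGLM3 API.

```python
from typing import Any, Dict, List


def to_chatglm3_messages(
    messages: List[Dict[str, Any]],
    functions: List[Dict[str, Any]],
) -> List[Dict[str, Any]]:
    """Map OpenAI-style messages/functions onto ChatGLM3's role conventions (sketch)."""
    glm_messages: List[Dict[str, Any]] = []

    # Assumption: ChatGLM3 expects the available tools declared up front,
    # attached to the system message together with its text content.
    system_text = ""
    for msg in messages:
        if msg["role"] == "system":
            system_text = msg.get("content") or ""
            break
    glm_messages.append({"role": "system", "content": system_text, "tools": functions})

    for msg in messages:
        role = msg["role"]
        if role == "system":
            continue  # already folded into the first message
        if role == "function":
            # OpenAI's function-result role maps to ChatGLM3's "observation" role.
            glm_messages.append({"role": "observation", "content": msg.get("content", "")})
        else:
            glm_messages.append({"role": role, "content": msg.get("content", "")})

    return glm_messages
```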
-
Hi,
The latest release of ChatGLM3 enables OpenAI GPT-like function call abilities: https://github.com/THUDM/ChatGLM3/blob/main/tool_using/README.md
It seems we need to re-format the function calling conventions to adapt to ChatGLM3, so that we can have a local LLM able to follow tool calling.
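To illustrate the re-formatting in the other direction, here is a small sketch that turns a ChatGLM3-style tool call into an OpenAI-style `function_call` assistant message. It assumes the model's tool-call output has already been parsed into a dict with "name" and "parameters" keys, as in the ChatGLM3 tool-using demo; the helper name `chatglm3_tool_call_to_openai` is hypothetical.

```python
import json
from typing import Any, Dict


def chatglm3_tool_call_to_openai(tool_call: Dict[str, Any]) -> Dict[str, Any]:
    """Wrap a parsed ChatGLM3 tool call as an OpenAI-style assistant message (sketch)."""
    return {
        "role": "assistant",
        "content": None,
        "function_call": {
            "name": tool_call["name"],
            # OpenAI expects the arguments as a JSON-encoded string.
            "arguments": json.dumps(tool_call.get("parameters", {})),
        },
    }


# Example:
# chatglm3_tool_call_to_openai({"name": "get_weather", "parameters": {"city": "Beijing"}})
# -> {"role": "assistant", "content": None,
#     "function_call": {"name": "get_weather", "arguments": '{"city": "Beijing"}'}}
```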