Add LiteLLM as an agent for model connections #53
Conversation
Please mark whether you used Copilot to assist coding in this PR
Added some comments
src/solace_ai_connector/components/general/litellm/litellm_base.py (outdated; resolved, 2 threads)
src/solace_ai_connector/components/general/litellm/litellm_chat_model_with_history.py (outdated; resolved)
A few things to address
      # add any other parameters here
    - model_name: "claude-3-5-sonnet"  # model alias
      litellm_params:
        model: ${OPENAI_MODEL_NAME}
This example should use different model names and keys here
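To illustrate the point above, a corrected example might pair each alias with its own provider, model, and key. This is only a sketch: the environment variable names and the `load_balancer` key are assumptions based on the snippet in this diff, not the PR's final config.

```yaml
load_balancer:
  # Each alias points at a different provider/model with its own key.
  # Env var names below are illustrative, not the connector's actual ones.
  - model_name: "claude-3-5-sonnet"  # model alias
    litellm_params:
      model: ${ANTHROPIC_MODEL_NAME}
      api_key: ${ANTHROPIC_API_KEY}
  - model_name: "gpt-4o"             # model alias
    litellm_params:
      model: ${OPENAI_MODEL_NAME}
      api_key: ${OPENAI_API_KEY}
```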
        return response

    def prune_history(self, session_id, history):
        current_time = time.time()
I think it may be time to refactor this code, since I expect it is identical to other components
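If the pruning logic is extracted as suggested, the shared piece might look something like the sketch below. All names here (`HistoryPruner`, `prune_history`, the entry dict shape) are illustrative assumptions, not the PR's actual API; the point is only that time-based pruning is self-contained enough to live in one shared helper.

```python
import time

class HistoryPruner:
    """Hypothetical shared helper: drops history entries older than max_age_seconds."""

    def __init__(self, max_age_seconds=3600):
        self.max_age_seconds = max_age_seconds

    def prune_history(self, history, now=None):
        """Return only the entries whose timestamp is still fresh."""
        now = time.time() if now is None else now
        return [
            entry for entry in history
            if now - entry["timestamp"] <= self.max_age_seconds
        ]

# Usage: at now=120 with a 60s window, the entry stamped 0 is dropped.
pruner = HistoryPruner(max_age_seconds=60)
history = [
    {"role": "user", "content": "hi", "timestamp": 0},
    {"role": "user", "content": "recent", "timestamp": 100},
]
fresh = pruner.prune_history(history, now=120)
```

Each component that keeps chat history could then delegate to one instance of this helper instead of duplicating the loop.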
src/solace_ai_connector/components/general/litellm/litellm_chat_model_with_history.py (outdated; resolved)
src/solace_ai_connector/components/general/utils/chat_history_handler.py (outdated; resolved)
- Addressed comments
- Replied to some comments
LGTM
src/solace_ai_connector/components/general/llm/litellm/litellm_base.py (outdated; resolved)
examples/llm/litellm_embedding.yaml (outdated)
          source_expression: |
            template:You are a helpful AI assistant. Please help with the user's request below:
            <user-question>
            {{text://input.payload:text}}
            </user-question>
          dest_expression: user_data.llm_input:messages.0.content
        - type: copy
          source_expression: static:user
          dest_expression: user_data.llm_input:messages.0.role
      input_selection:
The embedding example should not modify the user query. It should pass the text through unchanged and be used to get back the embedding vector.
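The transform the comment above is asking for could be as simple as the sketch below: take the raw text from the payload and hand it to the embedding call as-is, with no chat template wrapped around it. The function and key names are illustrative assumptions, not the connector's actual API.

```python
def build_embedding_input(payload):
    """Hypothetical helper: extract the text to embed without modifying it.

    Unlike the chat example, there is no prompt template here -- embedding
    endpoints expect the raw text(s), typically as a list of strings.
    """
    text = payload["text"]
    return {"items": [text]}

# Usage: the query text reaches the embedder untouched.
payload = {"text": "What is the capital of France?"}
request = build_embedding_input(payload)
```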
src/solace_ai_connector/components/general/llm/litellm/litellm_embeddings.py (outdated; resolved, 2 threads)
Looks good, thanks!
Looks good to me. Thanks for all the refactoring. It is much better organized now.
from ...component_base import ComponentBase
from ....common.log import log


class ChatHistoryHandler(ComponentBase):
It feels strange that this inherits from ComponentBase since it isn't a component, however I see that it does use a bunch of the services of the parent (timer, kv_store, etc). I am not necessarily saying we shouldn't do it, but I would be interested in hearing other opinions on it @cyrus2281
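The composition alternative being weighed above could look like the sketch below: the handler holds a reference to a component and borrows its services, rather than inheriting them. All class and method names here are illustrative assumptions about the parent's services, not the PR's actual code.

```python
class ChatHistoryHandler:
    """Hypothetical composition-based variant: delegates to a component
    for kv_store access instead of inheriting from ComponentBase."""

    def __init__(self, component):
        self.component = component  # borrow services, don't inherit them

    def save(self, session_id, history):
        self.component.kv_store_set(f"history:{session_id}", history)

    def load(self, session_id):
        return self.component.kv_store_get(f"history:{session_id}") or []


class FakeComponent:
    """Minimal stand-in providing the kv_store service, for illustration."""

    def __init__(self):
        self._kv = {}

    def kv_store_set(self, key, value):
        self._kv[key] = value

    def kv_store_get(self, key):
        return self._kv.get(key)


# Usage: the handler works against any object that offers the services.
handler = ChatHistoryHandler(FakeComponent())
handler.save("s1", [{"role": "user", "content": "hi"}])
loaded = handler.load("s1")
```

The trade-off is the usual one: composition keeps the handler out of the component hierarchy, at the cost of explicitly forwarding each service it needs.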
What is the purpose of this change?
How is this accomplished?
Anything reviewers should focus on or be aware of?
The model name typically combines the model provider and the specific model name (e.g., azure/chatgpt-v-2). Find the exact names in LiteLLM's list of Providers.
How to test?
pip install litellm
cd src && python3 -m solace_ai_connector.main ../config.yaml
Repeat steps 3 to 5 for 'litellm_embedding.yaml' and 'litellm_chat_with_history.yaml'