Fix message history limiter for tool call #3178

Merged
merged 10 commits into microsoft:main from gaoxiangluo-fix-message-history-limiter-for-function-call on Aug 9, 2024

Conversation

GaoxiangLuo
Contributor

Why are these changes needed?

It will return an error

BadRequestError: Error code: 400 - {'error': {'message': "Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'.", 'type': 'invalid_request_error', 'param': 'messages.[1].role', 'code': None}}

because the OpenAI API requires that every message with role 'tool' follow an assistant message containing the matching tool_calls, so truncation must not separate the pair.
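
For reference, here is a sketch of the message shape the API expects (illustrative values only; the call id is made up):

# Illustrative only: the assistant message carrying "tool_calls" and the
# matching "tool" result must survive truncation together.
messages = [
    {"role": "user", "content": "What is 44232 + 13312?"},
    {
        "role": "assistant",
        "tool_calls": [
            {
                "id": "call_abc123",  # hypothetical call id
                "type": "function",
                "function": {
                    "name": "calculator",
                    "arguments": '{"a": 44232, "b": 13312, "operator": "+"}',
                },
            }
        ],
    },
    {"role": "tool", "tool_call_id": "call_abc123", "content": "57544"},
]
# If truncation keeps the "tool" message but drops the assistant message
# above it, the API rejects the request with the 400 error shown above.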

The following is a minimal example to reproduce the error.

import os
from autogen import ConversableAgent
from typing import Annotated, Literal
from autogen.agentchat.contrib.capabilities import transform_messages, transforms

# Define a tool
Operator = Literal["+", "-", "*", "/"]
def calculator(a: int, b: int, operator: Annotated[Operator, "operator"]) -> int:
    if operator == "+":
        return a + b
    elif operator == "-":
        return a - b
    elif operator == "*":
        return a * b
    elif operator == "/":
        return int(a / b)
    else:
        raise ValueError("Invalid operator")

# Initialize agents
assistant = ConversableAgent(
    name="Assistant",
    system_message="You are a helpful AI assistant. "
    "You can help with simple calculations. "
    "Return 'TERMINATE' when the task is done.",
    llm_config={"config_list": [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}]},
)
user_proxy = ConversableAgent(
    name="User",
    llm_config=False,
    is_termination_msg=lambda msg: msg.get("content") is not None and "TERMINATE" in msg["content"],
    human_input_mode="NEVER",
)

# Register the tool
assistant.register_for_llm(name="calculator", description="A simple calculator")(calculator)
user_proxy.register_for_execution(name="calculator")(calculator)

# Add MessageHistoryLimiter
context_handling = transform_messages.TransformMessages(
    transforms=[
        transforms.MessageHistoryLimiter(max_messages=7),  # 7 splits the "tool_calls"/"tool" pair at the truncation boundary
    ]
)
context_handling.add_to_agent(assistant)
# Start the chat
chat_result = user_proxy.initiate_chat(assistant, message="What is (44232 + 13312 / (232 - 32)) * 5?", cache=None)

Related issue number

#3121


@GaoxiangLuo
Contributor Author

@microsoft-github-policy-service agree

@codecov-commenter

codecov-commenter commented Jul 23, 2024

Codecov Report

Attention: Patch coverage is 0% with 18 lines in your changes missing coverage. Please review.

Project coverage is 19.79%. Comparing base (6279247) to head (06770c1).
Report is 72 commits behind head on main.

Files                                                  Patch %   Lines
autogen/agentchat/contrib/capabilities/transforms.py   0.00%     18 Missing ⚠️
@@             Coverage Diff             @@
##             main    #3178       +/-   ##
===========================================
- Coverage   32.90%   19.79%   -13.12%     
===========================================
  Files          94      103        +9     
  Lines       10235    10913      +678     
  Branches     2193     2491      +298     
===========================================
- Hits         3368     2160     -1208     
- Misses       6580     8547     +1967     
+ Partials      287      206       -81     
Flag Coverage Δ
unittests 19.73% <0.00%> (-13.17%) ⬇️



@marklysze marklysze self-requested a review July 24, 2024 08:41
@marklysze
Collaborator

marklysze commented Jul 24, 2024

Thanks @GaoxiangLuo for putting this together and for fixing this error.

I've tested your updated transform code and it works. However, in testing I found that truncating the messages further (e.g. max_messages=5) removes the original user message, after which the LLM can no longer understand the task and terminates early. I think this is a general issue with the use of this history limiter (separate from your tools issue, but also affecting it).

I'm proposing a change to the MessageHistoryLimiter that adds a parameter, keep_first_message, which always keeps the first message and prepends it to the truncated list. This ensures that the first message, which typically contains the core task information, stays in the conversation.

Let me know what you think.

Change class MessageHistoryLimiter's __init__ function:

    def __init__(self, max_messages: Optional[int] = None, keep_first_message: Optional[bool] = False):
        """
        Args:
            max_messages Optional[int]: Maximum number of messages to keep in the context. Must be greater than 0 if not None.
            keep_first_message Optional[bool]: Whether to keep the original first message in the conversation history.
                Defaults to False. Does not count towards truncation.
        """
        self._validate_max_messages(max_messages)
        self._max_messages = max_messages
        self._keep_first_message = keep_first_message

and the apply_transform updated (with your code followed by new code):

    def apply_transform(self, messages: List[Dict]) -> List[Dict]:
        """Truncates the conversation history to the specified maximum number of messages.

        This method returns a new list containing the most recent messages up to the specified
        maximum number of messages (max_messages). If max_messages is None, it returns the
        original list of messages unmodified.

        Args:
            messages (List[Dict]): The list of messages representing the conversation history.

        Returns:
            List[Dict]: A new list containing the most recent messages up to the specified maximum.
        """

        if self._max_messages is None:
            return messages

        truncated_messages = messages[-self._max_messages :]
        # If the truncated window starts with a "tool" message, include the preceding message, which must be its "tool_calls" message
        if truncated_messages[0].get("role") == "tool":
            start_index = max(-self._max_messages - 1, -len(messages))
            truncated_messages = messages[start_index:]

        # Keep the first message if required
        if self._keep_first_message and messages[0] != truncated_messages[0]:
            truncated_messages = [messages[0]] + truncated_messages

        return truncated_messages
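
To make the combined behavior concrete, here is a minimal sketch assuming the updated class above (message contents are made up):

limiter = MessageHistoryLimiter(max_messages=3, keep_first_message=True)

history = [
    {"role": "user", "content": "Compute (44232 + 13312) * 5."},  # original task
    {"role": "assistant", "content": "step 1"},
    {"role": "user", "content": "step 2"},
    {"role": "assistant", "content": "step 3"},
    {"role": "user", "content": "step 4"},
]

# Keeps the last 3 messages and prepends the original first message, so the
# task description survives truncation (4 messages in total here):
print(limiter.apply_transform(history))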

Let me know what you think and whether you want to include it in this change, otherwise I'll create a new PR.

@GaoxiangLuo
Contributor Author

Thank you @marklysze. LGTM. Please include it in this change.

@marklysze
Collaborator

> Thank you @marklysze. LGTM. Please include it in this change.

Great, thanks @GaoxiangLuo, I've updated the branch to include those changes. It would be worth testing, so let us know how you go :).

Contributor

@WaelKarkoub WaelKarkoub left a comment


Thank you @GaoxiangLuo @marklysze for these fixes! I'm just worried that the total count of messages, after applying this transform, becomes unintuitive. For example, if I set max_messages = 5, keep_first_message = True, intuitively I would think I would have a list of 5 messages with the first being the first created message. But with this fix, I believe I get back 6 messages.

What do you think of using a queue and just anchoring the important messages (first message, tool calls, etc.) so the transform respects the max_messages argument? Everything else looks great and I'll let you guys decide on the final solution!
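
For illustration, one possible reading of that suggestion, as a hedged sketch (the function name and logic are made up, not part of this PR):

from typing import Dict, List

def limit_with_anchor(messages: List[Dict], max_messages: int) -> List[Dict]:
    """Sketch: the anchored first message counts towards max_messages,
    so the result never exceeds the requested total."""
    if len(messages) <= max_messages:
        return messages
    # Reserve one slot for the anchor; fill the rest with the newest messages.
    tail = messages[len(messages) - (max_messages - 1) :]
    return [messages[0]] + tail

With max_messages=5 this returns exactly 5 messages, the first being the original first message; a full version would also keep tool_calls/tool pairs together.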

@WaelKarkoub WaelKarkoub self-requested a review August 7, 2024 01:00
Contributor

@WaelKarkoub WaelKarkoub left a comment


Looks good!

@marklysze
Collaborator

> Looks good!

Thanks @WaelKarkoub!

@marklysze
Collaborator

CC @qingyun-wu, @sonichi - I think we're good to go on this one.

@sonichi sonichi enabled auto-merge August 9, 2024 03:11
@sonichi sonichi added this pull request to the merge queue Aug 9, 2024
Merged via the queue into microsoft:main with commit 972b4ed Aug 9, 2024
137 of 151 checks passed
@GaoxiangLuo GaoxiangLuo deleted the gaoxiangluo-fix-message-history-limiter-for-function-call branch August 9, 2024 17:07
victordibia pushed a commit that referenced this pull request Aug 28, 2024
* fix: message history limiter to support tool calls

* add: pytest and docs for message history limiter for tool calls

* Added keep_first_message for HistoryLimiter transform

* Update to inbetween to between

* Updated keep_first_message to non-optional, logic for history limiter

* Update transforms.py

* Update test_transforms to match utils introduction, add keep_first_message testing

* Update test_transforms.py for pre-commit checks

---------

Co-authored-by: Mark Sze <[email protected]>
Co-authored-by: Chi Wang <[email protected]>