
Wrong input key for Cohere models in BedrockLLM #43

Closed · HFR1994 opened this issue May 9, 2024 · 5 comments

HFR1994 commented May 9, 2024

I'm using the cohere.command-r-v1:0 model to test some prompts. After noticing that Cohere doesn't support Chat (#33), I switched to BedrockLLM. However, I keep getting the following error:

ValueError: Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModel operation: Malformed input request: #: required key [message] not found#: extraneous key [prompt] is not permitted, please reformat your input and try again.
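The exception points at the mismatch directly: BedrockLLM sends a prompt key, but the native Command R API requires message. A minimal boto3 sketch calling the model directly (not from the original report; region and model ID as used below) illustrates this:

import json

import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# {"prompt": "Hey"} is rejected for Command R, exactly as the exception says;
# the native API expects a "message" key instead.
response = client.invoke_model(
    modelId="cohere.command-r-v1:0",
    body=json.dumps({"message": "Hey"}),
)
print(json.loads(response["body"].read()))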

I have tried several approaches, including:

import boto3

from langchain_aws import BedrockLLM

region = "us-east-1"
verbose = True

session = boto3.Session(region_name=region)
boto3_bedrock = session.client(service_name="bedrock-runtime")

llm = BedrockLLM(region_name=region, model_id="cohere.command-r-v1:0", client=boto3_bedrock, verbose=verbose)
llm.invoke("Hey")  # Fails

# ----------------------------------

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a greeting chatbot, greet users only"),
        ("human", "{greeting}"),
    ]
)
chain = prompt | llm

chain.invoke({"greeting": "Hey"})  # Fails

After checking the error message against line 134 of libs/aws/langchain_aws/llms/bedrock.py, I think prepare_input should be rewritten as:

@classmethod
def prepare_input(
    cls,
    provider: str,
    model_kwargs: Dict[str, Any],
    prompt: Optional[str] = None,
    system: Optional[str] = None,
    messages: Optional[List[Dict]] = None,
) -> Dict[str, Any]:
    input_body = {**model_kwargs}
    if provider == "anthropic":
        if messages:
            input_body["anthropic_version"] = "bedrock-2023-05-31"
            input_body["messages"] = messages
            if system:
                input_body["system"] = system
            if "max_tokens" not in input_body:
                input_body["max_tokens"] = 1024
        if prompt:
            input_body["prompt"] = _human_assistant_format(prompt)
            if "max_tokens_to_sample" not in input_body:
                input_body["max_tokens_to_sample"] = 1024
    elif provider == "cohere":
        # Command R expects a "message" key instead of "prompt"
        input_body["message"] = prompt
    elif provider in ("ai21", "meta", "mistral"):
        input_body["prompt"] = prompt
    elif provider == "amazon":
        input_body = dict()
        input_body["inputText"] = prompt
        input_body["textGenerationConfig"] = {**model_kwargs}
    else:
        input_body["inputText"] = prompt

    return input_body
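With that change, a quick sanity check would produce the body the service expects (a sketch; LLMInputOutputAdapter is the class hosting prepare_input in langchain_aws/llms/bedrock.py):

from langchain_aws.llms.bedrock import LLMInputOutputAdapter

# Assumes prepare_input lives on LLMInputOutputAdapter, as in the repo.
body = LLMInputOutputAdapter.prepare_input(
    provider="cohere",
    model_kwargs={"max_tokens": 256},
    prompt="Hey",
)
assert body == {"max_tokens": 256, "message": "Hey"}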

Here are my dependencies:

langchain==0.1.19
langchain-aws==0.1.3

I'm using Python 3.11

ksaegusa commented

Hello.
I am facing the same problem.

The format of the JSON body differs between Command R and Command, so the Cohere branch has to be rewritten along the following lines:

# Command R
if messages:
    input_body["chat_history"] = messages["chat_history"]
    input_body["message"] = messages["message"]
# Command
else:
    input_body["prompt"] = prompt

I assume that Command R is currently unsupported here, just as Claude 3 is rejected by BedrockBase:
https://github.com/langchain-ai/langchain-aws/blob/main/libs/aws/langchain_aws/llms/bedrock.py#L744

    @root_validator()
    def validate_environment(cls, values: Dict) -> Dict:
        model_id = values["model_id"]
        if model_id.startswith("anthropic.claude-3"):
            raise ValueError(
                "Claude v3 models are not supported by this LLM."
                "Please use `from langchain_community.chat_models import BedrockChat` "
                "instead."
            )
        return super().validate_environment(values)

It would be better to use ChatBedrock for this purpose.

I am now implementing my own Command R request handling in the following PR:
#42

HFR1994 (Author) commented May 17, 2024

I was trying to put together my own commit; however, I'm not sure how to test it locally. Any pointers, @ksaegusa?

ksaegusa commented

@HFR1994
I followed the steps below to install my repository locally:

git clone https://github.com/ksaegusa/langchain-aws.git
cd langchain-aws
git switch add_cohere_model
pip install -e libs/aws/

After that, I wrote a simple script to verify the functionality.
[screenshot of the verification script and its output]
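A minimal version of such a script might look like this (a sketch standing in for the screenshot, reusing the model ID and region from earlier in the thread):

# Hypothetical stand-in for the screenshot: check that the patched branch
# builds a valid Command R request instead of raising a ValidationException.
from langchain_aws import BedrockLLM

llm = BedrockLLM(region_name="us-east-1", model_id="cohere.command-r-v1:0")
print(llm.invoke("Hey"))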

I hope this answers your question

3coins added the bedrock label May 23, 2024

ventz commented Jul 1, 2024

Thanks for this update!

Any news on when #43 will be merged?

3coins (Collaborator) commented Oct 11, 2024

Use ChatBedrockConverse or ChatBedrock(beta_use_converse_api=True), as mentioned in #238. We will be tracking any new gaps with Cohere in that issue; please feel free to comment there.
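For reference, the recommended route looks roughly like this (a sketch; ChatBedrockConverse is the Converse API chat class in langchain_aws, and the parameters shown are illustrative):

# Sketch of the recommended Converse API route for Command R.
from langchain_aws import ChatBedrockConverse

chat = ChatBedrockConverse(model="cohere.command-r-v1:0", region_name="us-east-1")
print(chat.invoke("Hey").content)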

3coins closed this as completed Oct 11, 2024