
Extra config required for with_structured_output with Provisioned Throughput #264

Open
athewsey opened this issue Oct 30, 2024 · 2 comments

Comments

@athewsey

(Sorry there's not as much detail here as I'd like: this is a second-hand report from ~3 weeks ago, from a customer we've been working with, and I haven't yet been able to get my hands on a PT deployment to fully reproduce it, collect stack traces, etc.)

Symptom: ChatBedrockConverse.with_structured_output fails when using Claude (3) on Provisioned Throughput, but works with On-Demand

Cause:

  • ChatBedrock and ChatBedrockConverse both use model_id-based logic to determine default values for supports_tool_choice_values.
  • When using a Provisioned Throughput endpoint, this logic fails because the PT ARN (passed as model_id) is opaque as to what model is deployed.
  • With the inferred supports_tool_choice_values = () value, with_structured_output fails to parse the response into the provided data model.

Workaround: Explicitly specify supports_tool_choice_values = ["auto", "any", "tool"] in the constructor

Asks:

  1. Would it be feasible for LangChain to:
    • Detect when the passed model_id is a PT ARN
    • Attempt to look up the underlying model ID via e.g. the GetProvisionedModelThroughput API, and use that for the tool choice (and whatever else needs model ID) if possible
  • (I guess log a warning in case the model ID couldn't be automatically looked up, e.g. because the caller lacks IAM permission for that separate API call)
  2. It seems unexpected that this feature would fail when tool choice isn't available: shouldn't it fall back to client-side orchestration? It would be great to check exactly what the expected behaviour here is, and maybe expand the documentation.
@3coins
Collaborator

3coins commented Oct 30, 2024

@athewsey
A current workaround for this is to pass the provider value when initializing the class. This should pick the right set of settings for Anthropic models when used with the ChatBedrockConverse class. We are tracking a fix for this in #253, which will include some of the suggestions you mentioned.

@3coins
Collaborator

3coins commented Dec 9, 2024

@athewsey
Checking back on this issue: were you able to use the provider field to fix it?
