Return model selection metadata when using a prompt router #304

Open
nickovs opened this issue Dec 11, 2024 · 0 comments
nickovs commented Dec 11, 2024

Bedrock now supports the use of prompt routers to choose between multiple models based on the input. This can reduce cost and latency for simple questions while still using more powerful models for more complex inputs.

When a prompt router is used, the Bedrock API will include information about the selected model alongside other response metadata, either in response['trace']['promptRouter'] for invocation with the Converse endpoint or as part of the final metadata event in event['metadata']['trace']['promptRouter'] when using the ConverseStream endpoint.
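A minimal sketch of pulling the router trace out of both response shapes described above. The dict layouts follow the field paths named in this issue; the sample payloads (and the truncated ARNs) are illustrative stand-ins, not real API output.

```python
def router_trace_from_converse(response):
    """Extract the prompt-router trace from a Converse response, if present."""
    return response.get("trace", {}).get("promptRouter")

def router_trace_from_stream(events):
    """Scan ConverseStream events for the final metadata event's router trace."""
    for event in events:
        trace = event.get("metadata", {}).get("trace", {}).get("promptRouter")
        if trace is not None:
            return trace
    return None

# Illustrative payloads mirroring the field paths described above.
converse_response = {
    "output": {"message": {"role": "assistant", "content": [{"text": "Hi"}]}},
    "trace": {"promptRouter": {"invokedModelId": "arn:aws:bedrock:...:model/example"}},
}
stream_events = [
    {"contentBlockDelta": {"delta": {"text": "Hi"}}},
    {"metadata": {"trace": {"promptRouter": {"invokedModelId": "arn:aws:bedrock:...:model/example"}}}},
]

print(router_trace_from_converse(converse_response)["invokedModelId"])
print(router_trace_from_stream(stream_events)["invokedModelId"])
```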

It would be helpful to return this metadata to the caller when accessing routed LLMs through langchain-aws, since the model selected by the router determines the cost of executing the query, and that may matter to the caller.
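To illustrate why a caller would want this: if the router trace were surfaced (say, under a hypothetical response_metadata["trace"]["promptRouter"] key on the returned message), per-request cost attribution becomes a simple lookup. The pricing table and field layout below are made up for illustration and are not real Bedrock prices or a real langchain-aws API.

```python
# Hypothetical: assumes langchain-aws exposed the router trace under
# response_metadata["trace"]["promptRouter"]. Prices are illustrative only.
PRICE_PER_1K_OUTPUT_TOKENS = {
    "anthropic.claude-3-haiku": 0.00125,   # made-up price
    "anthropic.claude-3-5-sonnet": 0.015,  # made-up price
}

def attribute_cost(response_metadata, output_tokens):
    """Return the estimated cost of a routed call, or None if the model is unknown."""
    trace = response_metadata.get("trace", {}).get("promptRouter", {})
    model_id = trace.get("invokedModelId", "")
    for name, price in PRICE_PER_1K_OUTPUT_TOKENS.items():
        if name in model_id:
            return price * output_tokens / 1000
    return None  # no trace, or an unrecognized model

meta = {"trace": {"promptRouter": {"invokedModelId": "arn:...:model/anthropic.claude-3-haiku"}}}
print(attribute_cost(meta, 800))  # 0.001
```

Without the trace, the caller only knows what the router *could* have picked, not what it actually cost.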
