ChatBedrock does not support Custom Model Import models #153
@baskaryan could you please let us know if there is a fix for this? The API example can be found here: https://github.com/aws-samples/amazon-bedrock-samples/blob/main/custom_models/import_models/llama-3/llama3-ngrammedqa-fine-tuning.ipynb (scroll to the `def call_invoke_model_and_print(native_request):` function).
@lopezfelipe @rsgrewal-aws
Here is an example of the same request being sent to the model using the `invoke_model` API directly.
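A minimal sketch of such a request, assuming a boto3 `bedrock-runtime` client; the region, ARN, prompt, and inference parameters below are placeholders modeled on the linked notebook, not the exact original snippet:

```python
import json

import boto3

# Placeholders: region, account ID, and imported-model ID are assumptions.
client = boto3.client("bedrock-runtime", region_name="us-east-1")
model_arn = "arn:aws:bedrock:us-east-1:123456789012:imported-model/EXAMPLE_ID"

# Native request body for the imported Llama 3 model; parameters are illustrative.
native_request = {
    "prompt": "What is the first-line treatment for a migraine?",
    "max_gen_len": 512,
    "temperature": 0.5,
}

response = client.invoke_model(
    modelId=model_arn,
    body=json.dumps(native_request),
    contentType="application/json",
    accept="application/json",
)

# The response body is a stream; read and decode it as JSON.
print(json.loads(response["body"].read()))
```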
The code from above returns `null`.
I am wondering why it would respond with a null value; does that mean there is an error being thrown somewhere? The second part to this is how we want to handle no response / null from the model. Ideally we should return the response as is, which means if it is null, then a string like "null"?
@3coins could you please let us know when we can expect some fixes? Secondly, as part of the restructure, can we move invoke into a separate method? That would leave this class open to extension. A sketch of that idea follows.
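A hedged sketch of what that restructure could look like: all class and method names here are hypothetical illustrations of the suggested override point, not the actual langchain-aws API.

```python
import json
from typing import Any

import boto3


class BedrockChatBase:
    """Hypothetical base class: all transport logic is isolated in a
    single _invoke method so subclasses can adapt it."""

    def __init__(self, model_id: str, region_name: str = "us-east-1") -> None:
        self.model_id = model_id
        self.client = boto3.client("bedrock-runtime", region_name=region_name)

    def _invoke(self, request: dict[str, Any]) -> dict[str, Any]:
        # The one override point: send a native request, parse the body.
        response = self.client.invoke_model(
            modelId=self.model_id, body=json.dumps(request)
        )
        return json.loads(response["body"].read())


class CustomImportChat(BedrockChatBase):
    """Extension point for Custom Model Import models."""

    def _invoke(self, request: dict[str, Any]) -> dict[str, Any]:
        out = super()._invoke(request)
        # Normalize a null generation to the string "null", per the
        # discussion above, so callers never receive None.
        if out.get("generation") is None:
            out["generation"] = "null"
        return out
```

With invoke isolated like this, supporting a new model family is a subclass rather than a change to the core class.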
Issue: Trying to invoke a custom Llama 3 8B model imported with Bedrock Custom Model Import. The model was fine-tuned and tested with the `invoke_model` function as described here.

Observed issue: The following validation error is returned.
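For context, the failing call presumably looked something like the sketch below (the ARN is a placeholder). ChatBedrock infers the model provider from the `provider.model-name` prefix of `model_id`, and an imported-model ARN carries no such prefix, which appears to be what triggers the validation error:

```python
from langchain_aws import ChatBedrock

# Placeholder ARN for a model imported via Bedrock Custom Model Import.
model_arn = "arn:aws:bedrock:us-east-1:123456789012:imported-model/EXAMPLE_ID"

llm = ChatBedrock(
    # An ARN, not a "provider.model" ID such as "meta.llama3-8b-instruct-v1:0",
    # so provider inference from the model_id fails.
    model_id=model_arn,
    region_name="us-east-1",
)

# Raises the validation error reported above on affected versions.
print(llm.invoke("Hello"))
```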