Change OpenAI API key at Runtime, uses placeholder key instead of configurable field #695
Unanswered · rarchitgupta asked this question in Q&A · Replies: 1 comment
-
I’m encountering the same issue: the placeholder key is being used instead of the configurable key, even though debugging shows the correct API key is present in the config object. Has anyone found a solution or workaround for this?
-
I've tried to implement a ChatOpenAI runnable that receives the openai_api_key from the request headers, and I use it to extend my chain.
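The llm is initialized roughly along these lines; a minimal sketch, assuming the configurable-fields pattern from the linked LangServe configurable_chain example (the "placeholder" value and the joke prompt are illustrative stand-ins, not the actual code):

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import ConfigurableField
from langchain_openai import ChatOpenAI

# Mark openai_api_key as configurable so it can be overridden per request;
# the "placeholder" value is never meant to reach OpenAI.
llm = ChatOpenAI(openai_api_key="placeholder").configurable_fields(
    openai_api_key=ConfigurableField(
        id="openai_api_key",
        name="OpenAI API Key",
        description="API key supplied by the caller at request time",
    ),
)

prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")
chain = prompt | llm
```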
When I run the chain, the call fails because the placeholder key is used for the request to OpenAI instead of the configured one.
Is there a reason why the placeholder key is being picked up instead of the configurable key, even though I have debugged the per_req_config_modifier and confirmed that the config object does contain the correct API key?
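For reference, the per_req_config_modifier is wired up roughly like this; a sketch following the linked example, where the header name, route path, and FastAPI app are assumptions:

```python
from typing import Any, Dict

from fastapi import FastAPI, HTTPException, Request
from langserve import add_routes

app = FastAPI()


def fetch_api_key_from_header(config: Dict[str, Any], req: Request) -> Dict[str, Any]:
    # Copy the caller's OpenAI key from a request header (name assumed here)
    # into the "configurable" section of the config, keyed by the same id
    # used in ConfigurableField(id="openai_api_key") above.
    if "x-api-key" in req.headers:
        config.setdefault("configurable", {})["openai_api_key"] = req.headers["x-api-key"]
    else:
        raise HTTPException(401, "No API key provided")
    return config


add_routes(
    app,
    chain,
    path="/configurable_chain",
    per_req_config_modifier=fetch_api_key_from_header,
)
```

The idea is that the value written into config["configurable"]["openai_api_key"] should override the placeholder on the field whose ConfigurableField id matches.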
Things I've referred to so far for this:
- https://stackoverflow.com/questions/77857466/with-langchain-how-can-i-change-the-openai-api-key-at-runtime
- https://github.com/langchain-ai/langserve/blob/main/examples/configurable_chain/server.py#L95-#L112
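One way to narrow this down, as a sketch assuming the setup above: invoke the chain directly with the key passed in the config, bypassing LangServe, to check whether the ConfigurableField override itself is applied (the key string here is illustrative):

```python
# If this direct call succeeds with a real key but the served route still
# sends the placeholder, the configurable field works and the problem is in
# how the per-request config reaches the runnable.
result = chain.invoke(
    {"topic": "cats"},
    config={"configurable": {"openai_api_key": "sk-my-real-key"}},  # illustrative key
)
print(result)
```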