Breaking change: consolidate LLM callback functions #228
Breaking Changes
How LLM callbacks are registered has changed, and so have the arguments passed to the callback functions. The callbacks themselves are still supported. Specifically, this refers to:

- `on_llm_new_delta`
- `on_llm_new_message`
- `on_llm_ratelimit_info`
- `on_llm_token_usage`
Previously, an LLM callback's first argument was the chat model; it is now the `LLMChain` that is running it. A `ChatModel` still has the `callbacks` struct attribute, but it should be considered private.

Why the change
Having some callback functions registered on the chat model and some registered on the chain was confusing. What goes where? Why the difference?
This change moves them all to the same place, removing a source of confusion.
The primary reason for the change is that important information about the context of the callback event was not available to the callback function. Information stored in the chain's `custom_context` can be valuable and important, like a user's account ID, but it was not easily accessible in a callback like `on_llm_token_usage`, where we might want to record the user's token usage linked to their account.

This change passes the entire `LLMChain` through to the callback function, giving the function access to the `custom_context`. It makes the LLM (aka chat model) callback functions expect the same arguments as the other chain-focused callback functions. This both unifies how the callbacks operate and what data they have available, and it groups them all together.
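For example, this makes it straightforward to tie token usage back to a user's account. A hedged sketch, where the `:user_id` key in `custom_context` and the `MyApp.Billing` module are hypothetical names for illustration:

```elixir
# Sketch: a chain built with a user's account ID in custom_context can now
# expose that ID to the token-usage callback. MyApp.Billing is hypothetical.
handler = %{
  on_llm_token_usage: fn chain, usage ->
    user_id = chain.custom_context[:user_id]
    MyApp.Billing.record_usage(user_id, usage)
  end
}
```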
Adapting to the change
A before example:
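As an illustrative sketch of the old style (not necessarily the PR's exact snippet; the model name and handler bodies are placeholders):

```elixir
# Pre-change style (sketch): the handler is registered on the chat model via
# the :callbacks option, and each callback's first argument is the chat model.
handler = %{
  on_llm_new_message: fn _model, message ->
    IO.inspect(message, label: "NEW MESSAGE")
  end,
  on_llm_token_usage: fn _model, usage ->
    IO.inspect(usage, label: "TOKEN USAGE")
  end
}

{:ok, chat_model} =
  LangChain.ChatModels.ChatOpenAI.new(%{model: "gpt-4", callbacks: [handler]})
```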
This is updated to: (comments highlight changes)
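A sketch of the updated style (again illustrative; the model name and handler bodies are placeholders):

```elixir
# Post-change style (sketch): the handler is registered on the chain, and
# each callback's first argument is now the running LLMChain, which gives
# access to chain.custom_context.
handler = %{
  on_llm_new_message: fn _chain, message ->            # <-- chain, not model
    IO.inspect(message, label: "NEW MESSAGE")
  end,
  on_llm_token_usage: fn chain, usage ->               # <-- chain, not model
    IO.inspect({chain.custom_context, usage}, label: "TOKEN USAGE")
  end
}

chain =
  %{llm: LangChain.ChatModels.ChatOpenAI.new!(%{model: "gpt-4"})}
  |> LangChain.Chains.LLMChain.new!()
  |> LangChain.Chains.LLMChain.add_callback(handler)   # <-- registered on the chain
```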
If you still need access to the LLM in the callback functions, it is available in `chain.llm`. This is a breaking change, but it should be fairly easy to update. It consolidates how callback events work and makes them more powerful by exposing important information to the callback functions.
If you were using `LLMChain.add_llm_callback/2`, the change is even easier.

From:
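A sketch of the old call (illustrative; `chain` and `handler` as set up earlier):

```elixir
# Old: LLM callbacks were added with the LLM-specific function.
chain = LangChain.Chains.LLMChain.add_llm_callback(chain, handler)
```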
To:
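And a sketch of the new call (the same handler map, now registered with the general chain callback function):

```elixir
# New: the handler is registered with the general chain callback function.
chain = LangChain.Chains.LLMChain.add_callback(chain, handler)
```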
Details of the change

- The LLM-specific callbacks were moved from the `LangChain.ChatModels.LLMCallbacks` module to `LangChain.Chains.ChainCallbacks`.
- `LangChain.Chains.LLMChain.add_llm_callback/2` was removed; use `LangChain.Chains.LLMChain.add_callback/2` instead.
- `LangChain.ChatModels.ChatOpenAI.new/1` and `LangChain.ChatModels.ChatOpenAI.new!/1` no longer accept `:callbacks` on the chat model.
- `LangChain.ChatModels.ChatModel.add_callback/2`