Releases · ju-bezdek/langchain-decorators
0.6.0
Version 0.5.4
v0.5.0
v0.4.2
v0.4.1
v0.4.0
Version 0.4.0 (2023-11-25)
- Input kwargs augmentation by implementing the llm_prompt function body (check out the example: code_examples/augmenting_llm_prompt_inputs.py) - see the sketch below
- Support for automatic JSON repair if json_repair is installed (even the OpenAI JSON format is not yet perfect)
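A minimal sketch of what the input kwargs augmentation might look like, assuming the dict returned from the decorated function's body is merged into the prompt inputs before the template is rendered; the prompt wording, the topic_hint variable, and the guess_topic helper are illustrative assumptions, and code_examples/augmenting_llm_prompt_inputs.py is the authoritative reference:

```python
from langchain_decorators import llm_prompt

def guess_topic(text: str) -> str:
    # Hypothetical helper - any preprocessing of the inputs can go here.
    return text.split(".")[0][:60]

@llm_prompt
def summarize(text: str) -> str:
    """
    Write a two-sentence summary of the text below.
    Topic hint: {topic_hint}

    {text}
    """
    # Assumed mechanism: the dict returned from the body augments the prompt
    # inputs, so {topic_hint} gets filled even though it is not an argument.
    return {"topic_hint": guess_topic(text)}

print(summarize("LangChain Decorators add a thin syntactic layer over LangChain."))
```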
0.3.0
Version 0.3.0 (2023-11-15)
- Support for the new OpenAI models (set as default; you can turn this off by setting the env variable LANGCHAIN_DECORATORS_USE_PREVIEW_MODELS=0)
- Automatically turn on the new OpenAI JSON mode if dict is the output type / JSON output parser (see the sketch below)
- Added timeouts to the default model definitions
- You can now reference input variables from __self__ of the object the llm_function is bound to (not only the llm_prompt)
- A few bug fixes
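A minimal sketch of the dict-output behavior, assuming the library's usual pattern of deriving the output parser from the return type annotation; the extract_contact prompt and its field names are illustrative, not taken from the release:

```python
from langchain_decorators import llm_prompt

# To opt out of the new (preview) OpenAI models being used as defaults, set the
# environment variable LANGCHAIN_DECORATORS_USE_PREVIEW_MODELS=0 before running.

@llm_prompt
def extract_contact(text: str) -> dict:
    """
    Extract the contact information from the text below and return it as JSON
    with the keys "name", "email" and "phone".

    {text}
    """
    return

# Because the declared return type is dict, a JSON output parser is used and,
# per the 0.3.0 notes, OpenAI's JSON mode is switched on automatically.
contact = extract_contact("Reach Jane Doe at jane@example.com or +1 555 0100")
print(contact.get("email"))
```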
v0.2.1
v0.2.0
Version 0.2.0 (2023-09-20)
- Support for custom template building, to support any kind of prompt block types (#5)
- Support for retrieving a chain object with preconfigured kwargs, for more convenient use with the rest of the LangChain ecosystem
- Support for a follow-up handle, for a convenient simple follow-up to a response without using a history object (see the sketch below)
- Hotfix support for pydantic v2
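A minimal sketch of the follow-up handle, assuming a FollowupHandle class passed in via a followup_handle argument and exposing a followup() method; these names are assumptions based on the feature description above, so check the repository README for the exact API:

```python
from langchain_decorators import llm_prompt, FollowupHandle

@llm_prompt
def ask(question: str, followup_handle: FollowupHandle = None) -> str:
    """
    Answer the question: {question}
    """
    return

# Ask once, then follow up on the same response without keeping a history object.
handle = FollowupHandle()
answer = ask("Who wrote The Hobbit?", followup_handle=handle)
clarification = handle.followup("And when was it first published?")
print(answer, clarification, sep="\n")
```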