Releases: ju-bezdek/langchain-decorators

Version 0.6.0 (2024-05-07)

  • support for LangChain Runnables as llms, which allows using the llm.with_fallbacks syntax when defining llms
  • support for passing the llm directly as a kwarg to the prompt
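
A minimal sketch of both options, assuming the llm argument of the decorator accepts any LangChain Runnable (the model names and the summarize prompt are placeholders, not part of this release):

```python
from langchain_openai import ChatOpenAI
from langchain_decorators import llm_prompt

# any LangChain Runnable can now act as the llm, so with_fallbacks works too
primary = ChatOpenAI(model="gpt-4-turbo")
backup = ChatOpenAI(model="gpt-3.5-turbo")
resilient_llm = primary.with_fallbacks([backup])

@llm_prompt(llm=resilient_llm)
def summarize(text: str) -> str:
    """Summarize the following text in one sentence: {text}"""

# ...or pass the llm directly as a kwarg when calling the prompt
summarize(text="LangChain Decorators adds a declarative layer over LangChain.", llm=backup)
```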

Version 0.5.4 (2024-04-02)

  • minor improvement in JSON output parser

Version 0.5.0 (2024-01-06)

  • ability to pass in a function that augments the function arguments before execution in OutputWithFunctionCall

Version 0.4.2 (2023-12-20)

  • critical bugfix - Assistant messages without content (text) that carried only function-call arguments were being ignored

Version 0.4.1 (2023-12-18)

  • support for func_description passed as an argument of the llm_function decorator
  • func_description can now be omitted entirely
  • minor fixes
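
For illustration, a small sketch of the first two items, using the func_description keyword named above (the weather function itself is just a placeholder):

```python
from langchain_decorators import llm_function

@llm_function(func_description="Look up the current weather for a given city")
def get_weather(city: str) -> str:
    # no docstring needed: the description comes from the decorator,
    # and since this release it can also be omitted entirely
    return f"It is sunny in {city}"
```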

Version 0.4.0 (2023-11-25)

  • Input kwargs augmentation by implementing the body of the llm_prompt function (check out the example: code_examples/augmenting_llm_prompt_inputs.py)
  • support for automatic JSON fixing if json_repair is installed
    (even OpenAI's JSON format is not yet perfect)
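
The automatic fix relies on the optional json_repair package; a standalone sketch of the kind of repair it performs (independent of the decorator internals):

```python
from json_repair import repair_json  # optional dependency: pip install json-repair

# LLMs sometimes emit almost-valid JSON (trailing commas, missing quotes, truncated output)
broken = '{"title": "My post", "tags": ["ai", "langchain",]}'
fixed = repair_json(broken)  # returns a valid JSON string with the trailing comma removed
```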

Version 0.3.0 (2023-11-15)

  • Support for the new OpenAI models (set as default; you can turn this off by setting the env variable LANGCHAIN_DECORATORS_USE_PREVIEW_MODELS=0)
  • automatically turns on the new OpenAI JSON mode if dict is the output type (JSON output parser)
  • added timeouts to the default model definitions
  • you can now reference input variables from __self__ of the object an llm_function is bound to (not only an llm_prompt)
  • a few bug fixes
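
A small sketch of the first two items above, i.e. the env-variable opt-out and a dict return type triggering the JSON output parser (the prompt itself is a placeholder):

```python
import os
# opt out of the new preview OpenAI models being used as the defaults
os.environ["LANGCHAIN_DECORATORS_USE_PREVIEW_MODELS"] = "0"

from langchain_decorators import llm_prompt

@llm_prompt
def extract_contact(text: str) -> dict:
    """
    Extract the name and email address from the following text and return them as JSON:

    {text}
    """
    # dict return type -> JSON output parser; with OpenAI models, JSON mode is enabled automatically
```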

Version 0.2.1 (2023-09-21)

  • Hotfix for a bug that caused simple prompts (without prompt blocks) to not work

Version 0.2.0 (2023-09-20)

  • Support for custom template building, to allow any kind of prompt block types (#5)
  • Support for retrieving a chain object with preconfigured kwargs, for more convenient use with the rest of the LangChain ecosystem
  • support for a followup handle, for convenient simple followups to a response without using a history object
  • hotfix for pydantic v2 support
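
A sketch of the followup handle item, assuming a FollowupHandle object passed via a followup_handle kwarg; the class name, kwarg name, and followup() method here are assumptions rather than a confirmed signature:

```python
from langchain_decorators import llm_prompt, FollowupHandle  # FollowupHandle import is an assumption

@llm_prompt
def ask(question: str) -> str:
    """{question}"""

handle = FollowupHandle()
answer = ask(question="What is the capital of France?", followup_handle=handle)
# continue the same conversation without managing a history object yourself
followup_answer = handle.followup("And roughly how many people live there?")
```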

Version 0.1.0 (2023-08-09)

  • Support for dynamic function schemas, which allows augmenting the function schema dynamically based on the input (more here)
  • Support for a Functions provider, which allows controlling which functions/tools will be fed into the LLM (more here)
  • Minor fix in the JSON output parser for array scenarios