Release 0.19a0
Refs #610, #641
simonw committed Nov 20, 2024
1 parent cfb10f4 commit 02852fe
Showing 2 changed files with 9 additions and 1 deletion.
8 changes: 8 additions & 0 deletions docs/changelog.md
@@ -1,5 +1,13 @@
# Changelog

+(v0_19a0)=
+## 0.19a0 (2024-11-19)
+
+- Tokens used by a response are now logged to new `input_tokens` and `output_tokens` integer columns and a `token_details` JSON string column, for the default OpenAI models and models from other plugins that {ref}`implement this feature <advanced-model-plugins-usage>`. [#610](https://github.com/simonw/llm/issues/610)
+- `llm prompt` now takes a `-u/--usage` flag to display token usage at the end of the response.
+- `llm logs -u/--usage` shows token usage information for logged responses.
+- `llm prompt ... --async` responses are now logged to the database. [#641](https://github.com/simonw/llm/issues/641)
+
(v0_18)=
## 0.18 (2024-11-17)

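To give a feel for the new columns described above, here is a minimal sketch of reading the logged token usage back out of the SQLite database that `llm` writes to. Only the `input_tokens`, `output_tokens` and `token_details` column names come from the changelog entry; the database filename, the `responses` table and the `id` column are assumptions for illustration (run `llm logs path` to locate the real database and check its schema before relying on this):

```python
import json
import sqlite3

# "logs.db" is a placeholder path and "responses" an assumed table name;
# only input_tokens, output_tokens and token_details are documented above.
db = sqlite3.connect("logs.db")
db.row_factory = sqlite3.Row

rows = db.execute(
    """
    SELECT id, input_tokens, output_tokens, token_details
    FROM responses
    ORDER BY id DESC
    LIMIT 5
    """
).fetchall()

for row in rows:
    # token_details is a JSON string column and may be NULL
    details = json.loads(row["token_details"]) if row["token_details"] else {}
    print(row["id"], row["input_tokens"], row["output_tokens"], details)
```

For everyday use the new `-u/--usage` flag covers this from the command line (`llm prompt -u ...` or `llm logs --usage`); dropping to SQL like this is only worthwhile if you want to aggregate token counts yourself.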
2 changes: 1 addition & 1 deletion setup.py
@@ -1,7 +1,7 @@
from setuptools import setup, find_packages
import os

VERSION = "0.18"
VERSION = "0.19a0"


def get_long_description():
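One note on the version bump itself: "0.19a0" is a PEP 440 alpha pre-release, so pip skips it by default and installing it requires `--pre` or an explicit pin. A quick check with the third-party `packaging` library (not something this setup.py depends on, used here only for illustration) shows how the string is interpreted:

```python
from packaging.version import Version  # pip install packaging

v = Version("0.19a0")
print(v.is_prerelease)       # True - "a0" marks a PEP 440 alpha release
print(v.base_version)        # 0.19
print(Version("0.18") < v)   # True - the alpha still sorts after 0.18
```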
