
#477 Estimate tokens usage with streaming completions #507

Merged

Conversation

alxmiron (Contributor)

Resolves #477

When making requests in stream mode (passing onProgress()) we don't have access to the token usage object.
As an alternative, we can calculate tokens with tiktoken, the tokenizer that, as they say, is itself used in ChatGPT. In fact, chatgpt-api is already using it to calculate the allowed length of the message history sent to ChatGPT.
So the only things I've added:

  • calculate the token count of the complete response in a streamed message
  • add it to the message object before returning it to the user
  • add types for ChatMessage.detail so those fields can be read safely in TS
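The steps above can be sketched as follows. This is a minimal illustration, not the PR's actual code: `estimateUsage` and the `Usage` shape are assumptions for this sketch, and `countTokens` is a trivial whitespace stand-in for a real tiktoken encoder.

```typescript
// Token usage estimated client-side for streamed completions (sketch).
type Usage = {
  prompt_tokens: number
  completion_tokens: number
  total_tokens: number
  estimated: true // flags that these counts were computed locally, not by the API
}

// Stand-in for tiktoken's encode(); a real implementation would use the
// same tokenizer the model uses (e.g. via the js-tiktoken package).
const countTokens = (text: string): number =>
  text.split(/\s+/).filter(Boolean).length

function estimateUsage(prompt: string, completion: string): Usage {
  const prompt_tokens = countTokens(prompt)
  const completion_tokens = countTokens(completion)
  return {
    prompt_tokens,
    completion_tokens,
    total_tokens: prompt_tokens + completion_tokens,
    estimated: true
  }
}

// Streaming: concatenate the deltas, then count the complete response once
// and attach the result to the final message object.
const deltas = ['Hello', ' there', ', how can I help?']
const text = deltas.join('')
const usage = estimateUsage('Say hello', text)
```

The key point is that counting happens once, on the fully assembled response, rather than per delta.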

src/chatgpt-api.ts
export interface ChatMessage {
  id: string
  text: string
  role: Role
  name?: string
  delta?: string
- detail?: any
+ detail?:
alxmiron (Contributor, Author)

It actually should not be nullable: with the current logic it can never be undefined in the final message object. But I could not find a good way to type it correctly given the current mutation-style code.


it's fine to leave it nullable for now. appreciate the note
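For reference, one possible way to make `detail` non-optional on the final message without reworking the mutation-style internals is a type-level split. This is a sketch with assumed shapes, not the merged code:

```typescript
// Assumed shape of the usage info attached to a message (illustrative only).
interface UsageDetail {
  usage?: {
    prompt_tokens: number
    completion_tokens: number
    total_tokens: number
    estimated?: boolean
  }
}

// While streaming, `detail` may still be absent.
interface StreamingChatMessage {
  id: string
  text: string
  delta?: string
  detail?: UsageDetail
}

// The final message narrows `detail` to required, so TS callers can read it
// without a null check.
type FinalChatMessage = StreamingChatMessage & { detail: UsageDetail }

function finalize(
  msg: StreamingChatMessage,
  detail: UsageDetail
): FinalChatMessage {
  return { ...msg, detail }
}
```

The intersection type lets the internals keep mutating a single object while the public return type still guarantees `detail` is present.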

src/chatgpt-api.ts
@transitive-bullshit (Owner) left a comment


I like this; will merge and make a few small fixes as noted in my comments.

huge thanks! 🙏

@@ -59,13 +59,25 @@ export type SendMessageBrowserOptions = {
abortSignal?: AbortSignal
}

interface CreateChatCompletionStreamResponse

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

i like to export all interfaces for ease of access by libs
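To illustrate the point about exporting interfaces (identifiers below are simplified, not the library's real definitions): an exported interface lets downstream TypeScript code annotate and handle the library's values directly.

```typescript
// Exported so consuming libraries can import the type (simplified sketch).
export interface CreateChatCompletionStreamResponse {
  id: string
  choices: Array<{ delta?: { content?: string } }>
}

// Example consumer-side helper using the exported type.
export function extractDelta(res: CreateChatCompletionStreamResponse): string {
  return res.choices[0]?.delta?.content ?? ''
}

// In a consuming library this would look like:
// import type { CreateChatCompletionStreamResponse } from 'chatgpt'
```

If the interface were not exported, consumers would have to re-declare the shape themselves or fall back to `any`.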

Successfully merging this pull request may close these issues.

How to get the usage of "CompletionResponseUsage" returned by OpenAI API?