#477 Estimate tokens usage with streaming completions #507
Conversation
```diff
 export interface ChatMessage {
   id: string
   text: string
   role: Role
   name?: string
   delta?: string
-  detail?: any
+  detail?:
```
It actually should not be nullable. In the current logic it can never be undefined in the final message object, but I could not find a good way to type it correctly with the current mutation-like code style.
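To illustrate the typing problem described above, here is a minimal sketch (all names hypothetical, not this PR's actual code) of a mutation-style builder: because `detail` is only attached after the last chunk arrives, the compiler cannot prove it is set on the finished object, so the field stays optional even though it is always present at the end.

```typescript
// Hypothetical shapes, for illustration only.
interface Detail {
  usage?: { total_tokens: number }
}

interface ChatMessage {
  id: string
  text: string
  // Optional: undefined while streaming is still in progress,
  // even though it is always set by the time the message is returned.
  detail?: Detail
}

function buildMessage(chunks: string[], finalDetail: Detail): ChatMessage {
  // The message object is created first and mutated as deltas arrive.
  const msg: ChatMessage = { id: 'msg-1', text: '' }
  for (const chunk of chunks) {
    msg.text += chunk
  }
  // `detail` is attached only at the end, so its type must stay `?`.
  msg.detail = finalDetail
  return msg
}

const done = buildMessage(['Hello', ', ', 'world'], {
  usage: { total_tokens: 3 }
})
console.log(done.text) // "Hello, world"
```

Avoiding the `?` would require building a separate "finished" type or returning a new object instead of mutating, which is the code-style tension the comment refers to.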
it's fine to leave it nullable for now. appreciate the note
I like this; will merge and make a few small fixes as noted in my comments.
huge thanks! 🙏
```diff
@@ -59,13 +59,25 @@ export type SendMessageBrowserOptions = {
   abortSignal?: AbortSignal
 }

+interface CreateChatCompletionStreamResponse
```
I like to export all interfaces for ease of access by downstream libs.
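For context, an interface like `CreateChatCompletionStreamResponse` above would model the shape of a streamed chat-completion chunk. This sketch follows OpenAI's publicly documented SSE chunk format; it is an assumption about the shape, not this PR's actual definition.

```typescript
// Sketch of a streamed chat-completion chunk, based on OpenAI's
// documented SSE format (assumption: field names match the public
// API docs, not necessarily the interface added in this PR).
interface ChatCompletionChunk {
  id: string
  object: 'chat.completion.chunk'
  created: number
  model: string
  choices: Array<{
    // Each chunk carries an incremental delta, not the full message.
    delta: { role?: string; content?: string }
    index: number
    finish_reason: string | null
  }>
}

// Example chunk carrying one content delta.
const chunk: ChatCompletionChunk = {
  id: 'chatcmpl-123',
  object: 'chat.completion.chunk',
  created: 1700000000,
  model: 'gpt-3.5-turbo',
  choices: [{ delta: { content: 'Hello' }, index: 0, finish_reason: null }]
}

console.log(chunk.choices[0].delta.content) // "Hello"
```

Note that streamed responses carry no `usage` object, which is why this PR estimates token usage client-side.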
Resolves #477

When making requests in `stream` mode (passing `onProgress()`) we don't have access to the token `usage` object.

Alternatively, we can calculate tokens with `tiktoken`, the tokenizer that, as they say, is used in ChatGPT itself. Actually, chatgpt-api is already using it to calculate the allowed length of the message history sent to ChatGPT.

So the only thing I've added: `ChatMessage.detail` typing to safely read those fields in TS.
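The approach above can be sketched as follows. The PR uses `tiktoken`-style tokenization; to keep this example dependency-free, a rough ~4-characters-per-token heuristic stands in for the real tokenizer, and all names are illustrative rather than taken from the PR.

```typescript
// Hypothetical usage shape mirroring OpenAI's non-streaming `usage` object.
interface Usage {
  prompt_tokens: number
  completion_tokens: number
  total_tokens: number
}

// Crude stand-in for a real tokenizer such as tiktoken
// (assumption: roughly 4 characters per token).
const estimateTokens = (text: string): number => Math.ceil(text.length / 4)

// Estimate usage for a streamed completion by joining the received
// deltas and tokenizing prompt and completion separately.
function estimateUsage(prompt: string, completionDeltas: string[]): Usage {
  const completion = completionDeltas.join('')
  const prompt_tokens = estimateTokens(prompt)
  const completion_tokens = estimateTokens(completion)
  return {
    prompt_tokens,
    completion_tokens,
    total_tokens: prompt_tokens + completion_tokens
  }
}

const usage = estimateUsage('Say hi', ['Hi', ' there', '!'])
console.log(usage) // { prompt_tokens: 2, completion_tokens: 3, total_tokens: 5 }
```

In the real implementation this estimated `usage` would be attached to `ChatMessage.detail`, matching the shape the API returns in non-streaming mode.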