Token usage stats from responses should be returned to the client. #618

Closed
SOE-YoungS opened this issue Apr 24, 2023 · 1 comment · Fixed by #1181

@SOE-YoungS (Contributor)

OpenAI and Azure OpenAI both return token usage statistics with each response.

These are useful for tracking usage on a case-by-case basis, for example for inter-departmental billing.

Currently SK only returns the completion string, so token usage is not accessible to the caller and cannot be evaluated or tracked later.

SK should return an object containing both the token usage stats and the completion string.
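
For illustration only, the request amounts to returning something like the shape below alongside the completion text. The type and property names here are hypothetical, not SK's actual API:

```csharp
// Hypothetical sketch only: names are illustrative, not the actual SK API.
public sealed class TokenUsage
{
    public int PromptTokens { get; init; }
    public int CompletionTokens { get; init; }
    public int TotalTokens => PromptTokens + CompletionTokens;
}

public sealed class CompletionResult
{
    // The generated text, as SK returns today.
    public string Completion { get; init; } = string.Empty;

    // Usage stats surfaced from the OpenAI / Azure OpenAI response.
    public TokenUsage Usage { get; init; } = new();
}
```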

Example use case(s):

  1. A large enterprise (the parent company pays the bills) with subsidiaries (the end users of the AI implementation). The parent company will want to track token usage by each subsidiary to ensure cross-charging for usage is correct.
  2. Stored as a statistic in an end user's account, allowing the end user to see and track token usage over time as well as per individual query.
@evchaki (Contributor) commented Apr 27, 2023

@SOE-YoungS, great idea, we will take a look at getting this added.

lemillermicrosoft pushed a commit that referenced this issue Jun 7, 2023
… etc) (#1181)

### Motivation and Context

A common feature request and a missing capability: acquiring the actual result details from the models we integrate with.

Resolves #802
Resolves #618
Partially resolves #351
Alternative approach to getting the results, compared with #693

### Description

This change introduces a new property, `object? SKContext.LastPromptResults`, which contains a list of the result details for a given prompt.

The Connector namespace also provides extension methods that convert this object into the concrete class you can use to serialize it or read detailed information about the model result.

Typical usage:
```csharp
var textResult = await excuseFunction.InvokeAsync("I missed the F1 final race");
```

Getting the model result as JSON:
```csharp
var modelResultJson = JsonSerializer.Serialize(textResult.LastPromptResults);
```

Getting the model result as a traversable object (using a connector extension):
```csharp
var totalTokens = textResult.GetOpenAILastPromptResult()?.Usage.TotalTokens;
```
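As a rough sketch of how a caller might build the per-department tracking described in the issue on top of this, the aggregation helper below is hypothetical and not part of SK:

```csharp
// Hypothetical sketch: accumulate token usage per department for cross-charging.
// Assumes the connector extension above exposes Usage.TotalTokens on the last prompt result.
using System.Collections.Concurrent;

public sealed class TokenUsageTracker
{
    private readonly ConcurrentDictionary<string, long> _totals = new();

    public void Record(string department, int? totalTokens)
    {
        if (totalTokens is int tokens)
        {
            // Add the tokens for this call to the department's running total.
            _totals.AddOrUpdate(department, tokens, (_, sum) => sum + tokens);
        }
    }

    public IReadOnlyDictionary<string, long> Totals => _totals;
}
```

A caller would then record usage after each invocation, for example `tracker.Record("marketing", textResult.GetOpenAILastPromptResult()?.Usage.TotalTokens);`.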
shawncal pushed a commit to shawncal/semantic-kernel that referenced this issue Jul 6, 2023
… etc) (microsoft#1181)
