UsageMetadata - add cachedContentTokenCount field (#178)
Add a `cachedContentTokenCount` field to the `UsageMetadata` interface returned by `generateContent` responses.
DellaBitta authored Jun 24, 2024
1 parent 92662ca commit fb1c0f2
Showing 5 changed files with 22 additions and 0 deletions.
5 changes: 5 additions & 0 deletions .changeset/angry-hotels-learn.md
@@ -0,0 +1,5 @@
---
"@google/generative-ai": minor
---

Add a `cachedContentTokenCount` field to the `UsageMetadata` interface returned by `generateContent` responses.
1 change: 1 addition & 0 deletions common/api-review/generative-ai.api.md
@@ -638,6 +638,7 @@ export interface ToolConfig {

// @public
export interface UsageMetadata {
cachedContentTokenCount?: number;
candidatesTokenCount: number;
promptTokenCount: number;
totalTokenCount: number;
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.usagemetadata.cachedcontenttokencount.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [UsageMetadata](./generative-ai.usagemetadata.md) &gt; [cachedContentTokenCount](./generative-ai.usagemetadata.cachedcontenttokencount.md)

## UsageMetadata.cachedContentTokenCount property

Total token count in the cached part of the prompt, i.e. in the cached content.

**Signature:**

```typescript
cachedContentTokenCount?: number;
```
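
For orientation, here is a minimal sketch of how this field might be read from a `generateContent` response. The model name and API-key handling below are placeholder assumptions rather than part of this change, and both `usageMetadata` and `cachedContentTokenCount` are treated as optional:

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

// Placeholder API key and model name; substitute your own.
const genAI = new GoogleGenerativeAI(process.env.API_KEY ?? "");
const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });

async function logUsage(prompt: string): Promise<void> {
  const result = await model.generateContent(prompt);
  const usage = result.response.usageMetadata;

  // usageMetadata may be absent, and cachedContentTokenCount is optional,
  // so both reads are guarded.
  console.log("prompt tokens:", usage?.promptTokenCount ?? 0);
  console.log("cached content tokens:", usage?.cachedContentTokenCount ?? 0);
}
```

The count is expected to be populated only when the request actually draws on cached content; otherwise the field is simply absent.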
1 change: 1 addition & 0 deletions docs/reference/main/generative-ai.usagemetadata.md
@@ -16,6 +16,7 @@ export interface UsageMetadata

| Property | Modifiers | Type | Description |
| --- | --- | --- | --- |
| [cachedContentTokenCount?](./generative-ai.usagemetadata.cachedcontenttokencount.md) | | number | _(Optional)_ Total token count in the cached part of the prompt, i.e. in the cached content. |
| [candidatesTokenCount](./generative-ai.usagemetadata.candidatestokencount.md) | | number | Total number of tokens across the generated candidates. |
| [promptTokenCount](./generative-ai.usagemetadata.prompttokencount.md) | | number | Number of tokens in the prompt. |
| [totalTokenCount](./generative-ai.usagemetadata.totaltokencount.md) | | number | Total token count for the generation request (prompt + candidates). |
2 changes: 2 additions & 0 deletions packages/main/types/responses.ts
@@ -98,6 +98,8 @@ export interface UsageMetadata {
candidatesTokenCount: number;
/** Total token count for the generation request (prompt + candidates). */
totalTokenCount: number;
/** Total token count in the cached part of the prompt, i.e. in the cached content. */
cachedContentTokenCount?: number;
}

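Since the documentation describes `cachedContentTokenCount` as the token count of the cached part of the prompt, it can be used, under that assumption, to separate cached from non-cached prompt tokens. A small sketch, where `usage` is the `UsageMetadata` taken from a response:

```typescript
import type { UsageMetadata } from "@google/generative-ai";

// Sketch only: assumes cachedContentTokenCount is a subset of promptTokenCount,
// as the field description ("cached part of the prompt") suggests.
function splitPromptTokens(usage: UsageMetadata): { cached: number; uncached: number } {
  const cached = usage.cachedContentTokenCount ?? 0;
  return { cached, uncached: usage.promptTokenCount - cached };
}
```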
