
feat: log wrap_openai runs with unified usage_metadata #1071

Merged: 11 commits from ankush/wrap-openai-token-counts into main, Oct 14, 2024

Conversation

@agola11 (Contributor) commented Oct 7, 2024

No description provided.
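For context, a minimal sketch of the wrap_openai usage this PR affects, assuming the documented langsmith.wrappers.wrap_openai API; per the PR title, runs traced through the wrapped client should now carry unified usage_metadata (input_tokens, output_tokens, total_tokens) rather than only the raw provider token fields:

from openai import OpenAI
from langsmith.wrappers import wrap_openai

# wrap_openai returns a drop-in replacement client whose calls are traced to LangSmith
client = wrap_openai(OpenAI())

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello"}],
)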

Review thread on python/docs/create_api_rst.py (outdated, resolved)
@agola11 changed the title from "[draft]: openai token counting" to "feat: log wrap_openai runs with unified usage_metadata" on Oct 8, 2024
@agola11 marked this pull request as ready for review on October 8, 2024 at 05:08
@agola11 (Contributor, Author) commented:

These test files will be synced over to the langsmith repo for test data.

@@ -160,12 +167,59 @@ def _reduce_completions(all_chunks: List[Completion]) -> dict:
return d


def _create_usage_metadata(oai_token_usage: dict) -> UsageMetadata:
input_tokens = oai_token_usage.get("prompt_tokens", 0)
output_tokens = oai_token_usage.get("completion_tokens", 0)
A reviewer (Collaborator) commented:

would you ever get "None" as a value here? Maybe better to do "or 0" to avoid a None + None scenario?
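A minimal sketch of the guard the reviewer suggests; the function body here is illustrative rather than the exact merged code, and the real function returns a UsageMetadata TypedDict:

def _create_usage_metadata(oai_token_usage: dict) -> dict:
    # ".get(key, 0) or 0" covers both a missing key and an explicit None value,
    # so the addition below can never hit a None + None TypeError.
    input_tokens = oai_token_usage.get("prompt_tokens", 0) or 0
    output_tokens = oai_token_usage.get("completion_tokens", 0) or 0
    return {
        "input_tokens": input_tokens,
        "output_tokens": output_tokens,
        "total_tokens": input_tokens + output_tokens,
    }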

@@ -891,3 +891,64 @@ class PromptSortField(str, Enum):
"""Last updated time."""
num_likes = "num_likes"
"""Number of likes."""


class InputTokenDetails(TypedDict, total=False):
@baskaryan (Contributor) commented Oct 8, 2024:

unrelated to this pr — wonder if langchain-core should import these from langsmith sdk now
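For reference, a minimal sketch of the TypedDict shapes being added; the keys shown are modeled on the unified usage-metadata format shared with langchain-core, and the exact fields in the merged schema may differ:

from typing_extensions import TypedDict

class InputTokenDetails(TypedDict, total=False):
    # total=False makes every key optional; providers report only what they support
    audio: int           # tokens of audio input
    cache_creation: int  # tokens written to a prompt cache
    cache_read: int      # tokens read back from a prompt cache

class UsageMetadata(TypedDict):
    # unified counts logged on each run, regardless of provider field names
    input_tokens: int
    output_tokens: int
    total_tokens: int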

@nfcampos merged commit 5cfe416 into main on Oct 14, 2024
17 checks passed
@nfcampos deleted the ankush/wrap-openai-token-counts branch on October 14, 2024 at 17:12