
Updated get_max_token_limit with latest models and numbers #972

Merged

merged 4 commits into main from update_max_token_limits on Dec 18, 2023

Conversation

afourney (Member)

Why are these changes needed?

New models are available.

Related issue number

Closes #971
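
For context, a minimal sketch of what a model-to-max-token lookup along the lines of get_max_token_limit could look like. The model names, the limits, and the Azure deployment-name aliases below are illustrative assumptions for this PR page, not the code that was actually merged:

```python
# Hypothetical sketch in the spirit of autogen/token_count_utils.py::get_max_token_limit.
# The table entries and the Azure-alias handling are assumptions, not the merged code.

# Context-window sizes keyed by model name (values are illustrative).
_MAX_TOKEN_LIMIT = {
    "gpt-3.5-turbo": 4096,
    "gpt-3.5-turbo-16k": 16384,
    "gpt-4": 8192,
    "gpt-4-32k": 32768,
    "gpt-4-1106-preview": 128000,
    # Common Azure deployment names map to the same limits as their
    # OpenAI counterparts (e.g. "gpt-35-turbo" vs. "gpt-3.5-turbo").
    "gpt-35-turbo": 4096,
    "gpt-35-turbo-16k": 16384,
}


def get_max_token_limit(model: str = "gpt-3.5-turbo") -> int:
    """Return the assumed context-window size for ``model``.

    Raises KeyError for unknown models so callers notice a missing entry
    instead of silently truncating at the wrong limit.
    """
    return _MAX_TOKEN_LIMIT[model]
```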


codecov-commenter commented Dec 13, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison: base (605882b) 26.46% vs. head (365c67d) 32.25%.
Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #972      +/-   ##
==========================================
+ Coverage   26.46%   32.25%   +5.78%     
==========================================
  Files          28       28              
  Lines        3801     3804       +3     
  Branches      865      907      +42     
==========================================
+ Hits         1006     1227     +221     
+ Misses       2724     2469     -255     
- Partials       71      108      +37     
Flag        Coverage Δ
unittests   32.15% <100.00%> (+5.73%) ⬆️

Flags with carried forward coverage won't be shown.


afourney requested a review from a team December 13, 2023 22:44
thinkall (Collaborator) left a comment

Thanks @afourney!

Review thread on autogen/token_count_utils.py (resolved)
afourney (Member, Author)

@thinkall Ahh, now I understand. Ok, re-added. I also added tests along those lines.
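
As noted above, tests were added along those lines. A minimal sketch of what such a test might look like follows; the specific model names and the alias-equality assertion are assumptions used only to illustrate the shape of the test, not the tests that were merged:

```python
# Hypothetical pytest-style checks that known model names (including an
# Azure deployment alias) resolve to positive limits. The chosen names and
# the alias/canonical equality are assumptions, not the merged test code.
import pytest

from autogen.token_count_utils import get_max_token_limit


@pytest.mark.parametrize(
    "model", ["gpt-3.5-turbo", "gpt-4", "gpt-4-1106-preview", "gpt-35-turbo"]
)
def test_get_max_token_limit_known_models(model):
    # Every known model should report a positive context-window size.
    assert get_max_token_limit(model) > 0


def test_azure_alias_matches_openai_name():
    # Assumed behavior: Azure's "gpt-35-turbo" deployment name shares the
    # limit of the OpenAI "gpt-3.5-turbo" model.
    assert get_max_token_limit("gpt-35-turbo") == get_max_token_limit("gpt-3.5-turbo")
```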

thinkall (Collaborator) left a comment

Thanks @afourney, LGTM.

qingyun-wu added this pull request to the merge queue Dec 18, 2023
Merged via the queue into main with commit b29f9a9 Dec 18, 2023
76 of 84 checks passed
rlam3 pushed a commit to rlam3/autogen that referenced this pull request Dec 19, 2023
Updated get_max_token_limit with latest models and numbers (#972)

* Updated with latest models and numbers

* Updated as per comments.

* Added common Azure deployment names for models.

---------

Co-authored-by: Qingyun Wu <[email protected]>
sonichi deleted the update_max_token_limits branch December 27, 2023 17:25
whiskyboy pushed a commit to whiskyboy/autogen that referenced this pull request Apr 17, 2024
Updated get_max_token_limit with latest models and numbers (#972)

* Updated with latest models and numbers

* Updated as per comments.

* Added common Azure deployment names for models.

---------

Co-authored-by: Qingyun Wu <[email protected]>
Successfully merging this pull request may close these issues:

[Bug]: token_count_utils.py does not list new OpenAI models (#971)