Swap model and text parameters in count_tokens
nicovank committed Oct 4, 2023
1 parent b9a207d commit a783141
Showing 2 changed files with 1 addition and 2 deletions.
1 change: 0 additions & 1 deletion src/llm_utils/__init__.py
@@ -1 +0,0 @@
-__all__ = ["llm_utils"]
2 changes: 1 addition & 1 deletion src/llm_utils/llm_utils.py
@@ -1,7 +1,7 @@
 import tiktoken

 # OpenAI specific.
-def count_tokens(string: str, model: str) -> int:
+def count_tokens(model: str, string: str) -> int:
     """Returns the number of tokens in a text string."""
     encoding = tiktoken.encoding_for_model(model)
     num_tokens = len(encoding.encode(string))
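
For reference, a minimal sketch of the function after this commit and how a caller would use the new argument order (model first, then text). The model name and sample text below are illustrative only and are not taken from the repository.

import tiktoken

def count_tokens(model: str, string: str) -> int:
    """Returns the number of tokens in a text string."""
    # Look up the tokenizer associated with the given OpenAI model name.
    encoding = tiktoken.encoding_for_model(model)
    return len(encoding.encode(string))

# Callers now pass the model name first, then the text.
print(count_tokens("gpt-3.5-turbo", "Hello, world!"))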
