
Commit
s/sentencepiece/tiktoken
ruanslv committed Apr 14, 2024
1 parent 9e8608f commit 3afbe13
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion llama/tokenizer.py
@@ -82,7 +82,7 @@ def __init__(self, model_path: str):
             mergeable_ranks=mergeable_ranks,
             special_tokens=self.special_tokens,
         )
-        logger.info(f"Reloaded SentencePiece model from {model_path}")
+        logger.info(f"Reloaded tiktoken model from {model_path}")

        # BOS / EOS token IDs
        self.n_words: int = self.model.n_vocab
