Merge pull request meta-llama#703 from yanxiyue/main
fix max_batch_size for chat example
jspisak authored Aug 26, 2023
2 parents ea9f33d + c25b02d commit a668741
Showing 1 changed file with 1 addition and 1 deletion: README.md
```diff
@@ -72,7 +72,7 @@ Examples using llama-2-7b-chat:
 torchrun --nproc_per_node 1 example_chat_completion.py \
     --ckpt_dir llama-2-7b-chat/ \
     --tokenizer_path tokenizer.model \
-    --max_seq_len 512 --max_batch_size 4
+    --max_seq_len 512 --max_batch_size 6
```
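The bump from 4 to 6 matters because the generator pre-allocates its KV-cache for at most `max_batch_size` sequences and rejects any batch larger than that; the chat example presumably submits more dialogs in one batch than a limit of 4 allows. A minimal sketch of that constraint, using a hypothetical `Generator` class rather than the actual llama API:

```python
# Sketch of why max_batch_size must cover the whole batch of dialogs.
# Names here (Generator, chat_completion) are illustrative, not the real API.

class Generator:
    def __init__(self, max_batch_size: int):
        # In the real model, the KV-cache buffers are allocated for
        # exactly this many sequences at load time.
        self.max_batch_size = max_batch_size

    def chat_completion(self, dialogs: list) -> list:
        # A batch larger than the pre-allocated buffers cannot be served.
        if len(dialogs) > self.max_batch_size:
            raise AssertionError(
                f"batch of {len(dialogs)} exceeds max_batch_size={self.max_batch_size}"
            )
        return [f"reply to dialog {i}" for i in range(len(dialogs))]

gen = Generator(max_batch_size=6)
replies = gen.chat_completion([f"dialog {i}" for i in range(6)])
print(len(replies))  # → 6
```

With `max_batch_size=4`, the same six-dialog batch would fail, which is the situation this commit fixes by raising the flag in the README example.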

Llama 2 is a new technology that carries potential risks with use. Testing conducted to date has not — and could not — cover all scenarios.
