
lighteval with llama3.2 [RuntimeError: No executable batch size found, reached zero.] #525

Open
Nevermetyou65 opened this issue Jan 29, 2025 · 3 comments

Comments

@Nevermetyou65

Hi,

I tried to use lighteval with meta-llama/Llama-3.2-1B. I want to run the evaluation for the Thai FineTasks, but I got this error instead: RuntimeError: No executable batch size found, reached zero.
Any ideas? I have the full log with me if you want.
Here is my command:

echo "Running lighteval for model: $model"
lighteval accelerate \
    "pretrained=${model},dtype=bfloat16" \
    "examples/tasks/fine_tasks/mcf/th.txt" \
    --custom-tasks "src/lighteval/tasks/multilingual/tasks.py" \
    --no-use-chat-template

I am using lighteval 0.6.0.dev0 and torch 2.2.2+cu121.
I didn't install lighteval with pip; I cloned the repo instead and ran pip install -e .[dev]
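For context on what this error means: "No executable batch size found, reached zero" is raised by Hugging Face accelerate's automatic batch-size finder, which retries the evaluation step with progressively smaller batch sizes after each CUDA out-of-memory failure and gives up when the batch size reaches zero. The sketch below is a simplified, pure-Python illustration of that mechanism (not lighteval's actual code); `fake_forward_pass` and its memory threshold are hypothetical stand-ins for a real GPU-bound model step.

```python
def find_executable_batch_size(function, starting_batch_size=128):
    """Keep halving the batch size until `function` succeeds or we hit zero.

    Simplified sketch of accelerate's batch-size finder: the real utility
    specifically detects CUDA OOM errors and re-runs the decorated function.
    """
    batch_size = starting_batch_size
    while batch_size > 0:
        try:
            return function(batch_size)
        except RuntimeError as e:
            if "out of memory" in str(e):
                batch_size //= 2  # halve and retry on OOM
            else:
                raise  # unrelated errors propagate unchanged
    # This is the error reported in this issue: every batch size,
    # all the way down to 1, failed with OOM.
    raise RuntimeError("No executable batch size found, reached zero.")


def fake_forward_pass(batch_size):
    """Hypothetical model step that 'runs out of memory' above a budget."""
    if batch_size > 16:
        raise RuntimeError("CUDA out of memory.")
    return batch_size


print(find_executable_batch_size(fake_forward_pass))  # 16
```

So if the log ends with this error, every retried batch size (including 1) hit OOM, which usually points to the model or context length simply not fitting on the available GPU memory rather than to a batch-size setting.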

@clefourrier
Member

Hi!
Yes, the full log would be useful :)

@Nevermetyou65
Author

log_llama_3_2_03.txt

Please scroll down to the bottom of the file.

@Nevermetyou65
Author

Hi, any update, sir?
