Issues: huggingface/lighteval
#531 [FT] Faster generation with TransformersModel by using less padding (feature request), opened Feb 3, 2025 by rolshoven
#525 lighteval with llama3.2 [RuntimeError: No executable batch size found, reached zero.], opened Jan 29, 2025 by Nevermetyou65
#523 [BUG] ImportError: cannot import name 'ExprExtractionConfig' from 'lighteval.metrics.dynamic_metrics' (bug), opened Jan 28, 2025 by hlzhang109
#496 [FT] Enable lazy model initialization (feature request), opened Jan 11, 2025 by JoelNiklaus
#489 [FT] Custom model to TransformersModel (feature request), opened Jan 7, 2025 by Giuseppe5
#487 [BUG] By default pip install lighteval is installing the cpu only torch version, it's killing dependencies. (bug), opened Jan 6, 2025 by kzos
#482 [FT] Add and test multinode runs back (feature request), opened Jan 2, 2025 by clefourrier
#478 [FT] Enhancing CorpusLevelTranslationMetric with Asian Language Support (feature request), opened Dec 27, 2024 by ryan-minato
#474 [FT] JudgeLLM should support litellm backend (feature request), opened Dec 22, 2024 by JoelNiklaus
#462 [BUG] Issue with LightevalTaskConfig.stop_sequence Attribute When Unset (bug), opened Dec 19, 2024 by ryan-minato
#460 [BUG] Issue with CACHE_DIR Default Value in Accelerate Pipeline (bug), opened Dec 19, 2024 by ryan-minato
#458 [FT] remove openai endpoint and only use litellm (feature request), opened Dec 18, 2024 by NathanHB
#439 [FT] Align parameter names in config files and config classes (feature request), opened Dec 12, 2024 by albertvillanova
#436 [FT] Fail faster when passing unsupported metrics to InferenceEndpointModel (feature request), opened Dec 11, 2024 by albertvillanova
#430 [FT] Enable the evaluation of any function (feature request), opened Dec 10, 2024 by JoelNiklaus
#417 [FT] Adding caching for each dataset run (feature request), opened Dec 2, 2024 by JoelNiklaus
#410 [FT] Add System Prompt field in LightevalTaskConfig that can be used by model clients (feature request), opened Nov 28, 2024 by JoelNiklaus
#405 [FT] The word "pretrained" is required in model_args but not in model_config_path (feature request), opened Nov 25, 2024 by albertvillanova