
[Bug] Not able to update transformers; coqui-tts==0.25.3 relies on transformers==4.46.2 #306

Closed
bigsk1 opened this issue Feb 20, 2025 · 3 comments · Fixed by #319
Labels: bug (Something isn't working)

Comments

bigsk1 commented Feb 20, 2025

Describe the bug

Unable to update transformers to 4.48.0 due to a CVE; coqui-tts==0.25.3 relies on transformers==4.46.2. Does anyone else have this issue?

CVE ID
CVE-2024-11393
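
One way to confirm whether the installed transformers version is affected (a sketch; pip-audit is a general-purpose vulnerability scanner and is not part of this project's tooling):

```bash
# scan the current environment for packages with known advisories (sketch)
pip install pip-audit
pip-audit | grep -i transformers   # shows any advisories reported for transformers
```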

To Reproduce

My requirements.txt:

https://github.com/bigsk1/voice-chat-ai/blob/main/requirements.txt

When trying to update transformers, I get a warning that coqui-tts==0.25.3 relies on a specific transformers version (==4.46.2):

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
coqui-tts 0.25.3 requires transformers<=4.46.2,>=4.43.0, but you have transformers 4.49.0 which is incompatible.
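
For reference, a minimal way to hit the same conflict outside the full requirements.txt (a sketch, assuming a clean virtual environment; the version bounds come from the resolver message above):

```bash
# fresh environment so only these two packages interact (sketch)
python -m venv .venv && source .venv/bin/activate

pip install coqui-tts==0.25.3        # declares transformers<=4.46.2,>=4.43.0
pip install "transformers>=4.48.0"   # steps outside that range -> resolver error above
```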

Expected behavior

Being able to update transformers to a non-CVE-affected version (4.50.0+).

Logs

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
coqui-tts 0.25.3 requires transformers<=4.46.2,>=4.43.0, but you have transformers 4.49.0 which is incompatible.

Environment

- Python version: 3.10.14
- PyTorch CPU wheels:
  torch==2.3.1+cpu
  torchaudio==2.3.1+cpu
  torchvision==0.18.1+cpu
  -f https://download.pytorch.org/whl/torch_stable.html

The same conflict occurs with the PyTorch CUDA wheels:

  torch==2.3.1+cu121
  torchaudio==2.3.1+cu121
  torchvision==0.18.1+cu121
  -f https://download.pytorch.org/whl/torch_stable.html

Additional context

No response

bigsk1 added the bug label on Feb 20, 2025
bigsk1 edited the issue title on Feb 20, 2025
eginhard (Member) commented

Yes, 4.46.2 is the highest transformers version currently supported. Newer versions require changes in the XTTS streaming code - this should be done in the next Coqui release (#150).
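
Until that release lands, the only way to keep pip's resolver consistent is to stay inside the declared range; a minimal sketch (note this avoids the conflict but does not address CVE-2024-11393):

```bash
# pin both packages to the range coqui-tts 0.25.3 declares (sketch)
pip install "coqui-tts==0.25.3" "transformers>=4.43.0,<=4.46.2"
pip check   # confirm no remaining dependency conflicts
```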

eginhard (Member) commented

Any transformers>=4.47 can now be used with the dev branch and soon with coqui-tts release 0.26.0.
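
For anyone following along, a possible upgrade path once 0.26.0 is published (a sketch; the bounds mirror what is stated above rather than tested pins):

```bash
# upgrade both packages in one resolver pass so the new ranges are applied together (sketch)
pip install --upgrade "coqui-tts>=0.26.0" "transformers>=4.47"
pip check   # verify the environment is consistent
```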

ROBERT-MCDOWELL commented

GREAT JOB! @eginhard !
