CUDA 11.8 not working #27
Hey! I haven't tried other versions of onnxruntime-gpu, but by default it installs for CUDA 12. onnxruntime-gpu does publish legacy CUDA 11.8 packages, but they must be specified manually; there are directions here: https://onnxruntime.ai/docs/install/#requirements Also, I'm using a 2060 on Windows 11, and CUDA 12 installs and runs without an issue. Support appears to go back to the 2060, so it's worth checking whether CUDA 12 now supports your 3060 Ti. Hope this helps.
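The provider fallback the comment above describes can be sketched as follows. This is a minimal illustration, not audiblez's actual code, and it assumes onnxruntime may or may not be installed in the environment:

```python
# Minimal sketch: prefer CUDAExecutionProvider when onnxruntime-gpu
# reports it as available, otherwise fall back to the CPU provider.
try:
    import onnxruntime as ort
    available = ort.get_available_providers()
except ImportError:  # onnxruntime not installed at all
    available = ["CPUExecutionProvider"]

if "CUDAExecutionProvider" in available:
    providers = ["CUDAExecutionProvider", "CPUExecutionProvider"]
else:
    providers = ["CPUExecutionProvider"]

print(providers)
```

A provider list ordered CUDA-first with a CPU fallback is the usual pattern for onnxruntime sessions, since an unavailable provider is silently skipped.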
I tried a bunch of things to get CUDA 12 working. I reinstalled all NVIDIA-related packages. Installed
What I think finally got it working was referencing #29: I added torch to v0.20.2, which I had installed, but also during
at the suggestion of Copilot. With a combination of these things I got the following warning while running:
but it is using the GPU now and looks to be significantly faster 👍
Fixed in v3, now it uses Torch.
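Since v3 reportedly uses Torch, a quick way to see whether the GPU path will be taken is Torch's own CUDA query. A minimal sketch, assuming torch may be absent and falling back to CPU in that case:

```python
# Report which device Torch would use: "cpu" when torch is missing,
# or when no CUDA-capable GPU/driver is visible to it.
try:
    import torch
    device = "cuda" if torch.cuda.is_available() else "cpu"
except ImportError:
    device = "cpu"
print(device)
```

If this prints `cpu` on a machine with an NVIDIA GPU, the installed torch wheel is likely a CPU-only build or the driver is not visible.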
I have an NVIDIA GeForce 3060 Ti that I have gotten to work in WSL2 with CUDA 11.8 in other projects, such as those involving whisper and immich-machine-learning.
After installing
in a virtualenv with Python 3.12, I then run:
```shell
audiblez Katamari\ Damacy\ -\ L\ E\ Hall.epub -l en-us -v am_michael -s 1.1 --providers CUDAExecutionProvider
```
I have confirmed that CUDA is running.
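One driver-level sanity check, independent of any Python package, is whether `nvidia-smi` is on the PATH and exits cleanly; if it does, the driver-side CUDA stack is at least present. A small sketch:

```python
import shutil
import subprocess

# Sanity check: nvidia-smi talks to the NVIDIA driver, so a zero exit
# code means the driver-side stack is installed and responding.
smi = shutil.which("nvidia-smi")
if smi:
    result = subprocess.run([smi], capture_output=True, text=True)
    status = "ok" if result.returncode == 0 else "error"
else:
    status = "missing"
print(status)
```

Under WSL2 this should report `ok` when GPU passthrough is configured; `missing` suggests the Windows driver's WSL integration is not set up.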
My GPU only supports CUDA 11.8, and I was wondering whether I am missing something, whether the script needs updating to support 11.8 (I saw CUDA support was only recently added), or whether I will have to use the CPU?