
[bug]: Can not use local model as input #148

Closed
zjcDM opened this issue May 9, 2024 · 2 comments
Assignees
Labels
bug Something isn't working

Comments

@zjcDM
zjcDM commented May 9, 2024

Describe the bug

I downloaded a model and stored it in a local path like this:
(screenshot of the local model directory)

When I pass the local model path to LLMLingua, it raises an error:
(error screenshots)

Please tell me how to fix this, thanks.

@zjcDM zjcDM added the bug Something isn't working label May 9, 2024
@iofu728 iofu728 self-assigned this May 10, 2024
@iofu728
Contributor

iofu728 commented May 10, 2024

Hi @zjcDM, thanks for your support. It seems like you are encountering the same issue as in #106, where your environment cannot connect to openaipublic.blob.core.windows.net.

You can follow the solution in #106 and comment out the tiktoken-related code.
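Besides commenting out the tiktoken-related code, a common alternative for environments that cannot reach openaipublic.blob.core.windows.net is to force everything offline before the library is imported. The sketch below is an assumption-based workaround, not LLMLingua's documented API: the cache path and model directory are hypothetical placeholders, and the environment variables are standard Hugging Face / tiktoken knobs, not something #106 prescribes.

```python
import os

# Force offline operation BEFORE importing llmlingua/transformers/tiktoken,
# so no loader attempts a network fetch at import time.
# (Paths below are hypothetical; point them at your own local files.)
os.environ["TIKTOKEN_CACHE_DIR"] = "/path/to/tiktoken_cache"  # pre-downloaded BPE files
os.environ["HF_HUB_OFFLINE"] = "1"          # Hugging Face Hub: no network calls
os.environ["TRANSFORMERS_OFFLINE"] = "1"    # transformers: load from local files only

# With the environment prepared, the local model directory can then be passed
# to LLMLingua (illustrative only -- requires llmlingua to be installed):
# from llmlingua import PromptCompressor
# compressor = PromptCompressor(model_name="/models/my-local-model", device_map="cpu")

print(os.environ["HF_HUB_OFFLINE"])
```

The key detail is ordering: environment variables set after the first `import llmlingua` may be read too late to prevent the download attempt.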

@zjcDM
Author

zjcDM commented May 14, 2024

Thanks for the help.

@zjcDM zjcDM closed this as completed May 14, 2024