
Add support for custom LLM provider. #7335

Closed

remyleone opened this issue Feb 19, 2025 · 1 comment

remyleone commented Feb 19, 2025

As an LLM user, I would like to be able to use either a local provider (Ollama, ...) or a remote one that respects the OpenAI API. This could be helpful for the commit message generation offered by GitButler. Could you add a way to configure a different LLM provider?
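Concretely, what I have in mind is something like the following minimal sketch, using the `openai` Python client pointed at a configurable base URL. The base URL, key, and model name are placeholders, not existing GitButler settings:

```python
from openai import OpenAI

# Any OpenAI-compatible endpoint would work, local or remote.
# The base URL and model name below are illustrative placeholders.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # e.g. a local Ollama server
    api_key="unused-for-local-ollama",     # Ollama ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="llama3",
    messages=[
        {"role": "system", "content": "Write a concise git commit message for the given diff."},
        {"role": "user", "content": "diff --git a/main.rs b/main.rs ..."},
    ],
)
print(response.choices[0].message.content)
```

Swapping the `base_url` to a remote OpenAI-compatible provider would be the only change needed for the non-local case.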

Byron (Collaborator) commented Feb 20, 2025

Thanks for sharing!

In the current stable version (0.14.7 (20250204.195312) at the time of writing), one can configure Ollama and compatible protocols under Preferences → AI options.

[Screenshot: the AI options in the GitButler preferences, where Ollama can be configured]

Thus I think this issue can be closed, as the Ollama option already supports custom LLM providers.

Please feel free, however, to open a new issue that is specific to your use case and as narrowly scoped as possible. My feeling is that what's intended here is a custom remote endpoint that is OpenAI compatible and non-local.
Thanks again.
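For concreteness, "OpenAI compatible" here means the provider accepts the same `/v1/chat/completions` request shape as the OpenAI API. A minimal sketch with Python's `requests`; the host, key, and model name are placeholders for whatever custom endpoint one would run:

```python
import requests

# Hypothetical custom endpoint; any server speaking the OpenAI chat protocol works.
BASE_URL = "https://llm.example.com/v1"  # placeholder host
API_KEY = "sk-..."                       # placeholder key

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "my-model",  # placeholder model name
        "messages": [
            {"role": "user", "content": "Summarize this diff as a commit message."},
        ],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```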

Byron closed this as not planned on Feb 20, 2025.