As an LLM user, I would like to be able to use either a local provider (Ollama, ...) or a remote one, as long as it respects the OpenAI API. This would be helpful for the commit message generation offered by GitButler. Could you add a way to configure a different LLM provider?
In the current stable version, 0.14.7 (20250204.195312) at the time of writing, you can configure Ollama and compatible protocols under Preferences → AI options.
Thus I think this issue can be closed as Ollama would support custom LLM providers.
Please feel free, however, to open a new issue that is specific to your use case and as narrowly scoped as possible. My understanding is that what's intended here is a custom remote endpoint that is OpenAI-compatible and non-local.
Thanks again.
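For anyone wanting to verify this outside the app: since Ollama exposes an OpenAI-compatible endpoint (`/v1/chat/completions`), the same request body works against either a local or a remote provider, with only the base URL (and API key, if any) changing. The sketch below is illustrative only; the model name and prompt are hypothetical, and it only builds the payload rather than calling a server.

```python
import json

def build_chat_request(model: str, prompt: str) -> str:
    """Build the JSON body for an OpenAI-compatible /v1/chat/completions call.

    The same body can be POSTed to a local Ollama instance
    (http://localhost:11434/v1/chat/completions) or to any remote
    OpenAI-compatible endpoint; only the URL and auth header differ.
    """
    return json.dumps({
        "model": model,  # hypothetical model name; use whatever your provider serves
        "messages": [{"role": "user", "content": prompt}],
    })

body = build_chat_request("llama3", "Write a commit message for these changes.")
```

This provider-agnostic payload is exactly why supporting "Ollama and compatible protocols" covers custom OpenAI-compatible remotes as well.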