Support for Local/Private models on SuperAGI #1259
5 comments · 3 replies
-
A very much needed feature. Other people have also expressed interest. Local LLMs are the future! +1
-
+1 on this feature; as integrations, LM Studio and Oobabooga are the most common ones.
-
I am being asked for a local LLM auth token? The release video showed nothing about that.
-
I have no idea whether LiteLLM has been mentioned elsewhere in these discussions, but I just got SuperAGI to work with LiteLLM. LiteLLM has pretty much become the sole proxy I use now. Just want to say that I'm highly supportive of integrating such frameworks with local/self-hosted models, and I appreciate the hard work!!
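For anyone else trying this, here is a minimal sketch of the pattern, assuming LiteLLM's OpenAI-compatible proxy is already running locally. The port, model alias, and the idea of pointing SuperAGI's OpenAI base URL at the proxy are assumptions based on typical LiteLLM setups, not something confirmed in this thread:

```python
# Minimal sketch: talk to a local LiteLLM proxy through the standard
# OpenAI client. Assumes the proxy is running and listening on
# localhost:4000 (port and model alias are assumptions; adjust to your setup).
import openai

client = openai.OpenAI(
    base_url="http://localhost:4000",  # LiteLLM proxy instead of api.openai.com
    api_key="not-needed-locally",      # local proxies typically ignore the key
)

response = client.chat.completions.create(
    model="ollama/mistral",  # hypothetical alias; use whatever your proxy exposes
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```

The same idea should carry over to SuperAGI itself: point its OpenAI base URL setting at the proxy instead of api.openai.com. The exact setting name may differ between releases.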
-
Yeah, nice! I tried to run a new LLM on SuperAGI but it wasn't able to load. Is it correct that not all LLMs are supported? The one I used was a GGUF Dolphin 2.5 Mixtral 8x7B!?
-
We're working to bring you a feature that has been highly requested by the community: support for local/private models on SuperAGI.
We're exploring possible integrations with Oobabooga, GPT4All, LM Studio, and FastChat. Please share your thoughts and insights: which one works best for you, and why?
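One thing all four candidates have in common is that they can expose an OpenAI-compatible HTTP server, so a quick smoke test looks the same regardless of backend. A hedged sketch follows; the endpoint URLs are assumptions based on each tool's usual defaults and may vary by version or configuration:

```python
# Smoke test for OpenAI-compatible local servers (LM Studio, FastChat, etc.).
# The endpoint URLs below are assumptions and may differ on your machine.
import openai

LOCAL_ENDPOINTS = {
    "lm-studio": "http://localhost:1234/v1",
    "fastchat": "http://localhost:8000/v1",
}

def ping(name: str, base_url: str) -> None:
    client = openai.OpenAI(base_url=base_url, api_key="local")
    models = client.models.list()  # most of these servers implement /v1/models
    print(f"{name}: {[m.id for m in models.data]}")

for name, url in LOCAL_ENDPOINTS.items():
    try:
        ping(name, url)
    except Exception as exc:
        print(f"{name}: not reachable ({exc})")
```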