Insights: mudler/LocalAI
Overview
- 4 Merged pull requests
- 0 Open pull requests
- 0 Closed issues
- 2 New issues
4 Pull requests merged by 2 people
- chore(model gallery): add mistralai_devstral-small-2507
  #5834 merged Jul 11, 2025
- chore(model gallery): add huihui-ai_huihui-gemma-3n-e4b-it-abliterated
  #5833 merged Jul 11, 2025
- chore(model gallery): add microsoft_nextcoder-32b
  #5832 merged Jul 11, 2025
- chore: ⬆️ Update ggml-org/llama.cpp to 0b8855775c6b873931d40b77a5e42558aacbde52
  #5830 merged Jul 10, 2025
2 Issues opened by 2 people
- AIO entrypoint.sh fails to detect AMD GPU/APU
  #5831 opened Jul 11, 2025
- GUI unusable due to CONNECTION_RESET errors
  #5829 opened Jul 10, 2025
4 Unresolved conversations
Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.
- Failure to detect GPU driver
  #4571 commented on Jul 10, 2025 • 0 new comments
- Add support for the parallel_tool_calls parameter for the /chat/completions API (see the request sketch after this list)
  #3672 commented on Jul 11, 2025 • 0 new comments
- chore: ⬆️ Update leejet/stable-diffusion.cpp to `6d84a30c66cc619fe41a78bc87f83ba41f059cc0`
  #5732 commented on Jul 10, 2025 • 0 new comments
- [WIP] feat: build llama cpp externally
  #5790 commented on Jul 10, 2025 • 0 new comments
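For context on #3672: `parallel_tool_calls` is the OpenAI chat-completions flag that controls whether the model may return several tool calls in a single response. Below is a minimal sketch of the kind of OpenAI-compatible request the issue asks LocalAI to honor; the endpoint URL, model name, and tool definition are illustrative assumptions, not details taken from the issue.

```python
# Hedged sketch: a /chat/completions request carrying the proposed
# parallel_tool_calls parameter, sent to a local OpenAI-compatible server.
# The URL, model name, and get_weather tool are placeholders.
import requests

payload = {
    "model": "gpt-4",  # placeholder model name
    "messages": [
        {"role": "user", "content": "What is the weather in Paris and in Rome?"}
    ],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    # Per the OpenAI API, false forces at most one tool call per response;
    # the linked issue asks that this flag be respected.
    "parallel_tool_calls": False,
}

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",  # assumed local endpoint
    json=payload,
    timeout=60,
)
print(resp.json())
```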