Insights: mlc-ai/mlc-llm
Overview
- 0 Active pull requests
- 0 Merged pull requests
- 0 Open pull requests
- 0 Closed issues
- 4 New issues
There hasn’t been any commit activity on mlc-ai/mlc-llm in the last week.
4 Issues opened by 4 people
- [Model Request] phi-4-mini-instruct (#3146, opened Feb 27, 2025)
- [Bug] Mlc cli server gets stuck (#3145, opened Feb 26, 2025)
- Question about flashinfer constraints (#3144, opened Feb 25, 2025)
- [Bug] (#3143, opened Feb 20, 2025)
10 Unresolved conversations
Conversations sometimes continue on older items that are not yet closed. Below is a list of all Issues and Pull Requests with unresolved conversations.
- [Bug] Compiling the MLC from source is failed (cuda_fp8.h) (#3111, commented on Feb 20, 2025, 0 new comments)
- [Bug] Failed to compile model on aarch64 platform with cuda12.8 (#3110, commented on Feb 20, 2025, 0 new comments)
- [Bug] mlc-llm server cannot return correct logprobs (#3130, commented on Feb 20, 2025, 0 new comments)
- [Question] While waiting for the model's response on an Android phone, performing other operations may cause the phone to become unresponsive or reboot (#3131, commented on Feb 21, 2025, 0 new comments)
- [Question] (#3059, commented on Feb 21, 2025, 0 new comments)
- How to stop a stream? (#3113, commented on Feb 21, 2025, 0 new comments)
- [Question] Support for Flutter (#766, commented on Feb 21, 2025, 0 new comments)
- [Bug] Android app does not take input; 'user 'role' is not defined' error (#3117, commented on Feb 21, 2025, 0 new comments)
- [Question] mlc-llm server cannot return correct logprobs (#3142, commented on Feb 24, 2025, 0 new comments)
- Very slow time to first token on ROCM (#3119, commented on Feb 26, 2025, 0 new comments)