Issues: InternLM/lmdeploy
Issues list

Problems found when using locust
#417 opened Sep 14, 2023 by frankxyy updated Sep 19, 2023
[Bug] llama2-70B with NTK: is the input length only supported up to 8K?
#479 opened Sep 26, 2023 by sjzhou4 updated Sep 27, 2023
2 tasks done
[Bug] int4 awq kernel
#498 opened Sep 27, 2023 by liang872 updated Oct 11, 2023
2 tasks done
codellama-Instruct: local test runs and the deployed service return inconsistent results
#579 opened Oct 18, 2023 by for-just-we updated Oct 19, 2023
1 of 2 tasks
Significant accuracy loss
#617 opened Oct 26, 2023 by seeyourcell updated Nov 1, 2023
[Bug] "cos_cached" shape incorrect error.
#664 opened Nov 8, 2023 by WarrenZhao updated Nov 8, 2023
2 tasks
[Feature] how to enable window attention in qwen-14B? backlog
#638 opened Nov 2, 2023 by amulil updated Nov 17, 2023
[Feature] Support frequency_penalty sampling backlog
#704 opened Nov 17, 2023 by RytonLi updated Nov 19, 2023
[Bug] The GPU memory doesn't change after changing batch_size
#731 opened Nov 22, 2023 by hxdbf updated Nov 22, 2023
2 tasks
[Bug] group_size = 64 has a bug backlog
#322 opened Aug 28, 2023 by lippman1125 updated Dec 11, 2023
Qwen-7B inference at 32K length produces no output (all empty)
#883 opened Dec 22, 2023 by CocaColaKing updated Dec 27, 2023
How can cached information be added to the kv cache?
#994 opened Jan 19, 2024 by WCwalker updated Jan 19, 2024
internLM2 20B: API call fails after long-text deployment
#1078 opened Jan 31, 2024 by testTech92 updated Feb 2, 2024
2 tasks
[Docs] presence_penalty setting has no effect
#1046 opened Jan 26, 2024 by RytonLi updated Feb 6, 2024