Pull requests: vllm-project/vllm-gaudi

[FIX_FOR_VLLM_LATEST] Fix for #27022
#418 opened Oct 17, 2025 by adobrzyn
Fix docker cmdlines for v.0.11.0 work arounds
#417 opened Oct 17, 2025 by nngokhale
Update troubleshooting.md
#416 opened Oct 17, 2025 by michalkuligowski
reuse DP allgather tensor across layers
#415 opened Oct 17, 2025 by wuxun-zhang
Add async scheduling for unified attention
#414 opened Oct 16, 2025 by tianmu-li
Update docs: Quickstart - Executing inference (labels: documentation, skip-gaudi-tests)
#410 opened Oct 15, 2025 by pawel-olejniczak
Gemma3 Multimodal optimization
#404 opened Oct 14, 2025 by jiminha
Change max bucket while using conti pa + defrag
#397 opened Oct 13, 2025 by adobrzyn
Docs installation, quick start and build fixes (#384) (labels: documentation, skip-gaudi-tests)
#392 opened Oct 13, 2025 by PatrykWo
Buckets from file - alpha version
#375 opened Oct 9, 2025 by adobrzyn (4 of 5 tasks complete)
Install PyTorch from habanalabs-installer
#345 opened Oct 8, 2025 by cabelo