Insights: huggingface/diffusers
Overview
6 Pull requests merged by 5 people
- Wan VACE (#11582, merged Jun 6, 2025)
- [tests] add test for torch.compile + group offloading (#11670, merged Jun 6, 2025)
- use deterministic to get stable result (#11663, merged Jun 6, 2025)
- [examples] flux-control: use num_training_steps_for_scheduler (#11662, merged Jun 5, 2025)
- [chore] bring PipelineQuantizationConfig at the top of the import chain. (#11656, merged Jun 5, 2025)
- [CI] Some improvements to Nightly reports summaries (#11166, merged Jun 5, 2025)
10 Pull requests opened by 7 people
- Update pipeline_flux_inpaint.py to fix padding_mask_crop returning only the inpainted area (#11658, opened Jun 4, 2025)
- [WIP] [LoRA] support omi hidream lora. (#11660, opened Jun 5, 2025)
- Bump torch from 2.2.0 to 2.7.1 in /examples/research_projects/realfill (#11664, opened Jun 5, 2025)
- ⚡️ Speed up method `AutoencoderKLWan.clear_cache` by 886% (#11665, opened Jun 5, 2025)
- ⚡️ Speed up method `BlipImageProcessor.postprocess` by 51% (#11666, opened Jun 5, 2025)
- ⚡️ Speed up method `Kandinsky3ConditionalGroupNorm.forward` by 7% (#11667, opened Jun 5, 2025)
- Fix wrong param types, docs, and handles noise=None in scale_noise of FlowMatching schedulers (#11669, opened Jun 6, 2025)
- enable cpu offloading of new pipelines on XPU & use device agnostic empty to make pipelines work on XPU (#11671, opened Jun 6, 2025)
- [tests] tests for compilation + quantization (bnb) (#11672, opened Jun 6, 2025)
- Support Expert loss for HiDream (#11673, opened Jun 6, 2025)
3 Issues closed by 3 people
- Error in loading the pretrained lora weights (#11675, closed Jun 7, 2025)
- [BUG]: Using args.max_train_steps even if it is None in diffusers/examples/flux-control (#11661, closed Jun 5, 2025)
3 Issues opened by 3 people
- HunyuanVideoImageToVideoPipeline memory leak (#11676, opened Jun 7, 2025)
- [FR] Please support ref image and multiple control videos in Wan VACE (#11674, opened Jun 6, 2025)
- LoRA load issue (#11659, opened Jun 4, 2025)
15 Unresolved conversations
Sometimes conversations continue on older items that aren't yet closed. Below is a list of all Issues and Pull Requests with unresolved conversations.
- [LoRA] parse metadata from LoRA and save metadata (#11324, commented on Jun 6, 2025; 11 new comments)
- [benchmarks] overhaul benchmarks (#11565, commented on Jun 7, 2025; 5 new comments)
- Add SkyReels V2: Infinite-Length Film Generative Model (#11518, commented on Jun 7, 2025; 1 new comment)
- Allow remote code repo names to contain "." (#11652, commented on Jun 4, 2025; 1 new comment)
- enable torchao test cases on XPU and switch to device agnostic APIs for test cases (#11654, commented on Jun 5, 2025; 1 new comment)
- SD3 ControlNet Script (and others?): dataset preprocessing cache depends on unrelated arguments (#11497, commented on Jun 5, 2025; 0 new comments)
- [BUG] [CleanCode] Tuple[int] = (16, 56, 56) in FluxTransformer2DModel (#11641, commented on Jun 5, 2025; 0 new comments)
- Can't load flux-fill-lora with FluxControl (#11651, commented on Jun 6, 2025; 0 new comments)
- how to load lora weight with fp8 transfomer model? (#11648, commented on Jun 6, 2025; 0 new comments)
- [performance] investigating FluxPipeline for recompilations on resolution changes (#11360, commented on Jun 7, 2025; 0 new comments)
- Attention Dispatcher (#11368, commented on Jun 4, 2025; 0 new comments)
- [torch.compile] Make HiDream torch.compile ready (#11477, commented on Jun 6, 2025; 0 new comments)
- Chroma as a FLUX.1 variant (#11566, commented on Jun 4, 2025; 0 new comments)
- Added PhotoDoodle Pipeline (#11621, commented on Jun 5, 2025; 0 new comments)
- [LoRA] support Flux Control LoRA with bnb 8bit. (#11655, commented on Jun 6, 2025; 0 new comments)