Currently not supported, but I can easily support it.
However, the alpha boundary will typically coincide with the depth boundary, so artifacts are likely to occur there and the quality of the resulting alpha channel is probably not going to be good.
Feeding the alpha channel to the depth generation model might have some benefit for segmentation, as is already implemented via bg_session, but the result of the embedded ONNX u2net is not good. Why not just support loading an externally edited alpha channel as the segmentation mask? The stereorizer would also have to generate two new alpha channels, one for each eye's rendering, and encode them as PNG and as yuva420 VP8/VP9 WebM video.
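This is not iw3's actual rendering path, just a minimal sketch of the per-eye alpha idea: if the frame is treated as a 4-channel RGBA tensor and warped with the same disparity-derived grid as the color, the transparency follows the geometry for each eye. The function name `warp_rgba`, the tensor layout, and the backward-sampling simplification are my assumptions, not the project's API.

```python
import torch
import torch.nn.functional as F

def warp_rgba(rgba: torch.Tensor, disparity: torch.Tensor, sign: float = 1.0) -> torch.Tensor:
    """Shift an RGBA image horizontally by a per-pixel disparity (hypothetical helper).

    rgba:      (1, 4, H, W) tensor in [0, 1]
    disparity: (1, 1, H, W) tensor in pixels
    sign:      +1.0 / -1.0 selects the left / right eye view
    """
    _, _, h, w = rgba.shape
    # Normalized sampling grid in [-1, 1], as expected by grid_sample.
    ys, xs = torch.meshgrid(
        torch.linspace(-1.0, 1.0, h, device=rgba.device),
        torch.linspace(-1.0, 1.0, w, device=rgba.device),
        indexing="ij",
    )
    # Convert pixel disparity to normalized x-offset and apply it.
    shift = sign * disparity[:, 0] * (2.0 / max(w - 1, 1))
    grid = torch.stack((xs.unsqueeze(0) + shift, ys.unsqueeze(0).expand_as(shift)), dim=-1)
    # Sampling RGB and alpha with the same grid keeps transparency aligned with the geometry.
    return F.grid_sample(rgba, grid, mode="bilinear", padding_mode="border", align_corners=True)
```

The two warped RGBA sequences could then be written out as PNG frames and packed into a WebM with alpha via something like `ffmpeg -i left_%04d.png -c:v libvpx-vp9 -pix_fmt yuva420p left.webm`.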
I found that keep_alpha is already implemented for waifu2x, but for iw3, if a PNG with an alpha channel is fed in, the transparent parts of the output turn pure white. Where can I modify the code to make that work for now, or at least turn the pure white into pure green?
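I don't know exactly where iw3 flattens the alpha, but conceptually the white fill is just compositing over a white background; switching the background color is a one-line change wherever that happens. A minimal sketch with PIL (the function name `flatten_rgba` is mine, not from the repo):

```python
from PIL import Image

def flatten_rgba(im: Image.Image, bg: tuple[int, int, int] = (0, 255, 0)) -> Image.Image:
    """Composite an RGBA image over a solid background color.

    bg=(255, 255, 255) reproduces the current white fill;
    bg=(0, 255, 0) gives pure green for later chroma keying.
    """
    if im.mode != "RGBA":
        return im.convert("RGB")
    background = Image.new("RGB", im.size, bg)
    background.paste(im, mask=im.split()[3])  # use the alpha band as the paste mask
    return background
```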
Does the iw3 project support importing a transparent PNG and exporting with the alpha channel kept?