normalization in BGR #116

Closed
98mxr opened this issue Dec 1, 2022 · 3 comments

@98mxr

98mxr commented Dec 1, 2022

I want to use LPIPS on BGR images instead of RGB images. Can I just swap the order of the mean/std values in ScalingLayer?

@richzhang
Owner

That alone won't fix it, since the network is expecting RGB. Simply reverse the channels in your image and you're good to go.
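
A minimal sketch of that fix, assuming NCHW tensors already scaled to [-1, 1] (the range LPIPS expects) and inputs in BGR order; the tensor names and sizes are illustrative:

```python
import torch
import lpips

loss_fn = lpips.LPIPS(net='alex')  # LPIPS metric with an AlexNet backbone

# Dummy BGR images in [-1, 1]; stand-ins for real data.
img0_bgr = torch.rand(1, 3, 64, 64) * 2 - 1
img1_bgr = torch.rand(1, 3, 64, 64) * 2 - 1

# Reverse the channel axis, (B, G, R) -> (R, G, B), before calling LPIPS.
img0_rgb = img0_bgr[:, [2, 1, 0], :, :]
img1_rgb = img1_bgr[:, [2, 1, 0], :, :]

d = loss_fn(img0_rgb, img1_rgb)  # LPIPS distance between the two images
```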

@98mxr
Author

98mxr commented Dec 2, 2022

I use LPIPS as a loss for the backward pass. Reversing the image channels after the forward pass causes my model to fail to converge. I tried many times and traced the problem to the channel reversal; I suspect this operation causes an issue with the gradient. I can't even clamp the image to (-1, 1).
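
For what it's worth, reversing channels by indexing (or with torch.flip) is differentiable in PyTorch, so gradients do propagate through it. Below is a minimal sketch of a training step that reverses the model output's channels before the LPIPS loss and squashes values into (-1, 1) with tanh instead of a hard clamp (clamp has zero gradient outside the range, which can stall training). The model, optimizer, and tensors are illustrative placeholders, not taken from this thread:

```python
import torch
import lpips

# Illustrative stand-ins for the real generator and data.
model = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = lpips.LPIPS(net='alex')

inp = torch.rand(1, 3, 64, 64)                 # dummy input
target_rgb = torch.rand(1, 3, 64, 64) * 2 - 1  # dummy RGB target in [-1, 1]

optimizer.zero_grad()
out_bgr = model(inp)                    # model output, assumed to be BGR
out_rgb = out_bgr[:, [2, 1, 0], :, :]   # differentiable channel reversal
out_rgb = torch.tanh(out_rgb)           # keep values in (-1, 1) without clamping

loss = loss_fn(out_rgb, target_rgb).mean()
loss.backward()                         # gradients flow back through the reversal
optimizer.step()
```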

@98mxr
Author

98mxr commented Dec 3, 2022

I modified my model so that the channel reversal works well. Thank you for your reply.

98mxr closed this as completed Dec 3, 2022