LPIPS Loss producing negative values #72

Open
GuillaumeRochette opened this issue Jul 10, 2021 · 3 comments · May be fixed by #73


@GuillaumeRochette

Hi,

While running the LPIPS loss based on AlexNet, I obtained negative values:

```python
import torch
from lpips import LPIPS

a = LPIPS(net="alex", verbose=False)
x = torch.rand(4, 3, 256, 256)
y = torch.rand(4, 3, 256, 256)
z = a(x, y, normalize=True)
print(z)
```

While looking at the values contained in res (defined in forward()), I noticed that the implementation does not match Eq. 1 from the paper.

Here's Eq. 1:

$$d(x, x_0) = \sum_l \frac{1}{H_l W_l} \sum_{h,w} \left\lVert w_l \odot \left( \hat{y}^l_{hw} - \hat{y}^l_{0hw} \right) \right\rVert_2^2$$

While this is what is implemented (the feature difference is squared first, and the learned weights are then applied to the squared difference):

$$d(x, x_0) = \sum_l \frac{1}{H_l W_l} \sum_{h,w} w_l^\top \left( \hat{y}^l_{hw} - \hat{y}^l_{0hw} \right)^2$$

The square operation ** 2 at line 94 should be removed and instead applied to self.lins[kk].model(diffs[kk]) (at lines 98 and 100) and to diffs[kk] (at lines 103 and 105); see the sketch below.
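A minimal sketch of the two orderings, written as a standalone helper rather than the actual lpips.py code (lin is a stand-in for the learned 1x1 conv self.lins[kk].model, and the spatial mean stands in for spatial_average):

```python
import torch
import torch.nn as nn

def lpips_scale(f0, f1, lin, proposed=False):
    # f0, f1: feature maps of shape (B, C, H, W) for one scale kk;
    # lin: 1x1 conv mapping C channels to 1 (stand-in for self.lins[kk].model).
    if proposed:
        # suggested order: apply the learned weights to the raw difference,
        # then square, as in Eq. 1
        return (lin(f0 - f1) ** 2).mean(dim=(2, 3), keepdim=True)
    # current order (line 94): square the difference, then apply the weights
    return lin((f0 - f1) ** 2).mean(dim=(2, 3), keepdim=True)

lin = nn.Conv2d(64, 1, kernel_size=1, bias=False)
f0, f1 = torch.rand(2, 64, 32, 32), torch.rand(2, 64, 32, 32)
print(lpips_scale(f0, f1, lin), lpips_scale(f0, f1, lin, proposed=True))
```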

Thanks in advance,

Guillaume

@markdjwilliams

Is there a good workaround for this?

@richzhang
Owner

richzhang commented Dec 8, 2022

If the code is installed and the weights are loaded properly (and weren't changed by accidentally fine-tuning them, for example), it is not possible to get negative values: the squared feature differences are non-negative, and the linear layers that average them have non-negative weights.

Check that the weights are all non-negative by doing the following:

```python
import lpips

loss_fn_vgg = lpips.LPIPS(net='vgg')
for ll in range(5):
    print(loss_fn_vgg.lins[ll].model[1].weight.flatten())
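```

For a pass/fail check instead of visual inspection, assuming the same layer layout as above, something like:

```python
for ll in range(5):
    w = loss_fn_vgg.lins[ll].model[1].weight
    assert (w >= 0).all(), f"lins[{ll}] contains negative weights"
```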

@markdjwilliams
Copy link

Thank you, this makes perfect sense.
