
upsample function leads to tensor size mismatch for certain input image sizes when spatial=True #51

Open
DanAndersen opened this issue Oct 2, 2020 · 3 comments


DanAndersen commented Oct 2, 2020

Currently, the upsample function is as follows:

def upsample(in_tens, out_HW=(64,64)): # assumes scale factor is same for H and W
    in_H, in_W = in_tens.shape[2], in_tens.shape[3]
    scale_factor_H, scale_factor_W = 1.*out_HW[0]/in_H, 1.*out_HW[1]/in_W

    return nn.Upsample(scale_factor=(scale_factor_H, scale_factor_W), mode='bilinear', align_corners=False)(in_tens)

This fails when the input images being compared have resolution 800x600. In that case, one of the layers passed in as in_tens has shape (1, 1, 149, 199). Due to floating-point rounding, in_H * scale_factor_H = 600.0000000000001 while in_W * scale_factor_W = 799.9999999999999, so the Upsample produces an output tensor of size (1, 1, 600, 799), which raises an exception when it is added to other tensors of size (1, 1, 600, 800).
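The off-by-one can be reproduced without PyTorch at all, assuming (as the shapes above indicate) that the upsampler floors input_size * scale_factor when computing the output size:

```python
import math

def upsampled_size(in_size, out_size):
    # PyTorch derives the output size as floor(input_size * scale_factor),
    # so any downward floating-point rounding error loses a pixel
    scale = 1.0 * out_size / in_size
    return math.floor(in_size * scale)

print(199 * (800 / 199))         # 799.9999999999999, not 800.0
print(upsampled_size(199, 800))  # 799 -- one pixel short of the expected 800
print(upsampled_size(149, 600))  # 600 -- H happens to round upward, so it survives
```

Whether a given dimension is affected depends on which way the rounding error falls, which is why only some image sizes trigger the mismatch.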

Instead of computing the scale_factor, a more robust solution is to just set the size parameter directly:

    return nn.Upsample(size=out_HW, mode='bilinear', align_corners=False)(in_tens)
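As a quick sanity check of the size-based variant (a sketch, assuming the same torch / torch.nn imports the library already uses):

```python
import torch
import torch.nn as nn

def upsample(in_tens, out_HW=(64, 64)):
    # Passing size= pins the exact output shape, sidestepping the
    # floating-point rounding that scale_factor= is subject to
    return nn.Upsample(size=out_HW, mode='bilinear', align_corners=False)(in_tens)

# The problematic intermediate layer from an 800x600 input:
x = torch.randn(1, 1, 149, 199)
out = upsample(x, out_HW=(600, 800))
print(tuple(out.shape))  # (1, 1, 600, 800)
```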

This might also be the cause of this specific comment: #45 (comment)

@richzhang (Owner) commented:

Thanks for pointing it out. I updated it!

@DanAndersen (Author) commented:

Just wanted to mention that this bug is still present in the pip version of the lpips library (when doing pip install lpips, the upsample function hasn't been changed).

DanAndersen reopened this Jun 29, 2021
@richzhang (Owner) commented:

I updated the pip package, so hopefully it should work now. Thanks!
