Can we measure the difference of 2 gray scale images? #29
Comments
I've run the model on my two grayscale images, but I'm not sure the value the network returns is correct. Also, am I right that the lower the LPIPS, the more similar the images are? Thank you.
Yes, lower means closer. We didn't verify perceptual judgments on grayscale images specifically, but I think it is reasonable to expect results consistent with color images.
Thank you so much.
My apologies for re-opening this thread after such a long while: is it possible to obtain a grayscale version of vgg16_perceptual? I trained a StyleGAN model on a grayscale dataset, and the projector currently uses this model to project images into latent space. Unfortunately, when running the projector I get: If not, I have to retrain the GAN, which I'm leaving as a last option.
I think you can convert the images to 3 channels using expand.
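A minimal sketch of that suggestion, assuming the images are PyTorch tensors in NCHW layout (the variable names are illustrative): broadcasting the size-1 channel dimension to 3 produces an RGB-shaped tensor that a network trained on color input will accept.

```python
import torch

# Grayscale batch: (N, C=1, H, W)
gray = torch.rand(1, 1, 64, 64)

# Broadcast the channel dimension from 1 to 3 without copying data.
# The result is a view with identical values in all three channels.
rgb_like = gray.expand(-1, 3, -1, -1)

print(rgb_like.shape)  # torch.Size([1, 3, 64, 64])
```

Note that `expand` returns a view that shares memory with the original tensor; if a downstream operation writes to the tensor in place, `gray.repeat(1, 3, 1, 1)` makes an actual copy instead.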
Hello author,
Can we measure the difference between 2 grayscale images, or is this metric only for RGB images?
Thank you.