
Very strange GANed cats #6

Open

MNCTTY opened this issue Apr 26, 2018 · 4 comments

Comments

@MNCTTY commented Apr 26, 2018

Hello!
Thank you for your implementation, it was a really cool experience to run it.
But I've got some strange results.
I trained the model on a dataset of cats: 200 pictures, 100 epochs, batch size 10. And here is what it generated.
I really don't know how to interpret the results.
[Attached images: generated samples, "4 step 5batch 100 epochs"]

@jhayes14 (Owner)
Hi, it could be for a hundred reasons tbh. Maybe mode collapse? Do the losses get stuck at some values?
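
A quick way to test the mode-collapse theory is to compare samples generated from several different noise vectors: if they all come out nearly identical, the generator has collapsed. A rough sketch, where the generator handle and the 100-dim noise size are placeholders rather than this repo's actual names:

import numpy as np

# Rough mode-collapse check: samples drawn from different noise
# vectors should differ; a near-zero spread across the batch means
# the generator maps every input to (almost) the same image.
def sample_spread(samples):
    flat = samples.reshape(len(samples), -1)
    return float(np.mean(np.std(flat, axis=0)))

noise = np.random.normal(size=(16, 100)).astype(np.float32)
# samples = generator.predict(noise)   # hypothetical generator handle
# print(sample_spread(samples))        # ~0 suggests mode collapse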

@MNCTTY (Author) commented Apr 26, 2018

Yes, that's the first strange thing: the Generator loss was 1.1920933e-07 almost the whole time, while the Discriminator loss changed from one batch to the next.
Second, it looks like the model doesn't train much on each batch. Here is some output from the terminal:


Epoch 1 Batch 1856
Training first discriminator..
Training first generator..
Initial batch losses :  Generator loss 1.1920933e-07 Discriminator loss 1.4226825e-07 Total: 2.6147757e-07
Final batch losses (after updates) :  Generator loss 1.1920933e-07 Discriminator loss 1.4226825e-07 Total: 2.6147757e-07

Epoch 1 Batch 1857
Training first discriminator..
Training first generator..
Initial batch losses :  Generator loss 1.1920933e-07 Discriminator loss 4.04652e-05 Total: 4.0584408e-05
Final batch losses (after updates) :  Generator loss 1.1920933e-07 Discriminator loss 4.04652e-05 Total: 4.0584408e-05

Epoch 1 Batch 1858
Training first discriminator..
Training first generator..
Initial batch losses :  Generator loss 1.1920933e-07 Discriminator loss 1.0960467e-07 Total: 2.28814e-07
Final batch losses (after updates) :  Generator loss 1.1920933e-07 Discriminator loss 1.0960467e-07 Total: 2.28814e-07

When I ran it the very first time, there were many lines of text for each batch, but after a while, only this.
This output is from a different run (on the whole dataset: 2000 batches of size 2, over 2 epochs), but the number of text lines per batch was the same.
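
For what it's worth, 1.1920933e-07 is (up to rounding of the printed result) the float32 machine epsilon. Assuming these losses come from a Keras-style binary cross-entropy, which clips predictions to [1e-7, 1 - 1e-7], a sigmoid output saturated at exactly 1.0 produces precisely this stuck value; a minimal sketch of the arithmetic:

import numpy as np

# np.finfo(np.float32).eps == 1.1920929e-07, the float32 machine epsilon.
# A Keras-style binary cross-entropy clips predictions to
# [1e-7, 1 - 1e-7]; in float32 arithmetic, 1.0 - 1e-7 rounds to 1 - eps.
y_pred = np.float32(1.0) - np.float32(1e-7)  # a saturated, clipped sigmoid
loss = -np.log(y_pred)                       # cross-entropy for label 1
print(loss)  # ~1.192093e-07, the value the generator loss is stuck at

In other words, the loss isn't merely small: it has bottomed out at the numerical floor, so the gradients reaching the generator are effectively zero.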

@jhayes14 (Owner) commented Apr 27, 2018

Your losses are essentially zero, which means it is not learning. Try a larger batch size. After that I would honestly try a more well-tested implementation such as https://github.com/jacobgil/keras-dcgan, since I haven't optimised this implementation at all.
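
If you do keep experimenting with this code, one generic GAN trick worth trying (not something this repo does already) is one-sided label smoothing, which keeps the discriminator's sigmoid from saturating at exactly 1.0 and pinning the loss to the epsilon floor shown above. A sketch, where discriminator, real_images and generated_images stand in for this repo's actual objects:

import numpy as np

# One-sided label smoothing: train the discriminator against 0.9 for
# real images instead of 1.0, so its output cannot saturate at 1.0.
# discriminator, real_images and generated_images are placeholders
# for the repo's actual objects.
batch_size = 50
real_labels = np.full((batch_size, 1), 0.9, dtype=np.float32)
fake_labels = np.zeros((batch_size, 1), dtype=np.float32)
# d_loss_real = discriminator.train_on_batch(real_images, real_labels)
# d_loss_fake = discriminator.train_on_batch(generated_images, fake_labels)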

@MNCTTY (Author) commented May 2, 2018

Ok, thank you, I'll try.
By the way, I tried batch sizes of 50, 100, and 200 with 2 epochs. It generates something new, but it doesn't look like a cat :)
But this time I could see the losses really changing.
[Attached image: generated sample, "batch size 200 epochs 2"]
