Bad results when testing #162
Comments
overfitting
Did you solve the problem? I am also running into this problem during testing.
Can we add a validation set to prevent overfitting in the pix2pixHD model? I haven't seen anyone try this.
I don't think it's because of overfitting. He says that he is "testing the model with images from the training set". If it were overfitting, the model would fail on unseen images but should still work on the images it was trained on. I'm having this same issue, with no good outcome yet.
I'm having this same issue. How did you fix it? @ShaniGam @wang-zm18 @cszer @yxwang1794 @edd2110-jac
@ShaniGam @wang-zm18 @cszer @yxwang1794 @edd2110-jac @songyn95
I trained the model on the 2975 images from the Cityscapes dataset with the following command:
python train.py --name label2city_512p_feat_cityscapes --instance_feat
The outputs I get during the training seem fine:
But when I test the model (on images from the training set) I get:
Does anyone have an idea why this happens?
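For reference, a test invocation matching the training options above might look like the sketch below. This is an assumption based on pix2pixHD's convention that test.py shares the --name and --instance_feat options with train.py; exact flags (e.g., for selecting a checkpoint epoch or output count) may differ between repo versions, so check options/ in your checkout.

```shell
# Hypothetical test command mirroring the training flags above:
# --name must match the experiment name used during training so the
# correct checkpoint directory is loaded; --instance_feat must also
# match, since it changes the generator's input channels.
python test.py --name label2city_512p_feat_cityscapes --instance_feat
```

If the test-time flags do not mirror the training flags (a mismatched --instance_feat is a common culprit), the network loads with a different input layout and produces exactly this kind of degraded output even on training images.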