I get an error at line 156 of fsns.py when running fsns_demo.py with an image from the SVHN test dataset:

images = F.reshape(images, (batch_size, num_channels, height, 4, -1))

with
batch_size = 1
num_channels = 3
height = 48
width = 182

The error is raised in chainer.functions.reshape():
Invalid operation is performed in: Reshape (Forward)
I don't get this error when using a cropped (64x64) version of the image obtained with the prepare_svhn_crops.py script, but then I get a different error:
File "C:\Users\U043511\Anaconda3\envs\SEE\lib\site-packages\chainer\functions\pooling\average_pooling_2d.py", line 227, in average_pooling_2d
    return AveragePooling2D(ksize, stride, pad, False).apply((x,))[0]
File ".\SEE\lib\site-packages\chainer\function_node.py", line 334, in apply
    outputs = self.forward(in_data)
File ".\SEE\lib\site-packages\chainer\function_node.py", line 592, in forward
    return self.forward_cpu(inputs)
File ".\SEE\lib\site-packages\chainer\functions\pooling\average_pooling_2d.py", line 27, in forward_cpu
    col = conv.im2col_cpu(x[0], self.kh, self.kw, self.sy, self.sx,
File ".\SEE\lib\site-packages\chainer\utils\conv.py", line 74, in im2col_cpu
    assert out_w > 0, 'Width in the output should be positive.'
AssertionError: Width in the output should be positive.
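For context on this assertion: Chainer computes a pooling layer's output width roughly as out_w = (in_w + 2*pad - ksize) // stride + 1 and asserts it is positive. A small sketch of that arithmetic (a simplified stand-in, not the actual SEE network configuration; the pool sizes below are made-up examples) shows how a small input crop, after repeated downsampling, can leave a feature map narrower than the pooling kernel:

```python
def pooled_width(in_w, ksize, stride, pad=0):
    # Simplified version of Chainer's conv/pool output-size formula
    # (the cover_all=False case): out_w must end up > 0.
    return (in_w + 2 * pad - ksize) // stride + 1

print(pooled_width(64, 2, 2))  # 32: a 64-wide input pools fine
print(pooled_width(1, 5, 5))   # 0: feature map narrower than the kernel
                               # -> "Width in the output should be positive."
```

So the assertion suggests that by the time average_pooling_2d runs, earlier layers have already shrunk the 64x64 crop's width below the kernel size, i.e. the input image is smaller than what the trained model expects.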
For the original (uncropped) image, the full reshape error message is:

Invalid operation is performed in: Reshape (Forward)
Expect: prod(x.shape) % known_size(=576) == 0
Actual: 288 != 0
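The mismatch is easy to reproduce with plain NumPy (a sketch mirroring the F.reshape call in fsns.py, not the actual SEE code): the target shape fixes known_size = 1 * 3 * 48 * 4 = 576, while the image holds 1 * 3 * 48 * 182 = 26208 elements, and 26208 % 576 = 288. Since both numbers share the factor 144, the reshape can only succeed when the width is a multiple of 4, and 182 % 4 == 2:

```python
import numpy as np

batch_size, num_channels, height, width = 1, 3, 48, 182
images = np.zeros((batch_size, num_channels, height, width), dtype=np.float32)

known_size = batch_size * num_channels * height * 4  # 576, fixed by the target shape
print(images.size % known_size)  # 288 -> the -1 dimension cannot be resolved

try:
    images.reshape(batch_size, num_channels, height, 4, -1)
except ValueError as e:
    print("reshape fails:", e)

# images.size = 144 * width and known_size = 144 * 4, so the reshape
# succeeds only for widths divisible by 4, e.g. 184:
ok = np.zeros((batch_size, num_channels, height, 184), dtype=np.float32)
print(ok.reshape(batch_size, num_channels, height, 4, -1).shape)  # (1, 3, 48, 4, 46)
```

This points to the demo expecting an input whose width matches the model's training geometry (the FSNS-style split into 4 horizontal views), rather than a raw 182-pixel-wide SVHN image.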
The command line is:

python ./chainer/fsns_demo.py --gpu -1 ./downloads/model/ model_35000.npz ./downloads/svhn/test/2.png ./datasets/svhn/svhn_char_map.json

Thanks!