Hi Flux developers,

I am doing some comparisons between PyTorch and Flux, and one of the problems I found with Flux makes coding a bit more difficult.

If I have a 4D tensor with dimensions (size_x, size_y, channels, batch size), e.g. x = randn(10, 10, 3, 20), and I set up a dense layer with Layer = Dense(3, 5), then Layer(x) reports an error:

ERROR: MethodError: no method matching *(::Array{Float32,2}, ::Array{Float32,4})

simply because Dense does not accept tensors with three or more dimensions. In PyTorch, by contrast, nn.Linear handles multi-dimensional tensors easily as long as the channel is at the last index.

Would it be a good idea for Flux's Dense layer to behave the way Torch's does? For example, Conv already handles multi-dimensional tensors, just as Torch does.
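In the meantime, the same result can be obtained by reshaping around an ordinary 2-D matrix multiply. Below is a minimal sketch in plain Julia (no Flux dependency; dense_nd is a hypothetical helper name, and W, b stand in for the weights and bias that Dense(3, 5) would hold):

```julia
# Workaround sketch: apply a 2-D dense transform along the channel
# dimension of a 4-D array laid out as (size_x, size_y, channels, batch).
# W, b, and dense_nd are illustrative names, not Flux API.

x = randn(Float32, 10, 10, 3, 20)   # (size_x, size_y, channels, batch)
W = randn(Float32, 5, 3)            # stand-in for Dense(3, 5) weights
b = randn(Float32, 5)               # stand-in for Dense(3, 5) bias

function dense_nd(W, b, x)
    sx = size(x)                              # (10, 10, 3, 20)
    xp = permutedims(x, (3, 1, 2, 4))         # channels first: (3, 10, 10, 20)
    xf = reshape(xp, sx[3], :)                # flatten to (3, 10*10*20)
    yf = W * xf .+ b                          # the ordinary 2-D Dense computation
    yp = reshape(yf, size(W, 1), sx[1], sx[2], sx[4])  # (5, 10, 10, 20)
    return permutedims(yp, (2, 3, 1, 4))      # back to (10, 10, 5, 20)
end

y = dense_nd(W, b, x)
size(y)  # (10, 10, 5, 20)
```

This is exactly the reshape–multiply–reshape that nn.Linear does implicitly (modulo PyTorch's channel-last convention), which is why a built-in version in Dense seems feasible.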
Thanks!
Francis