NNlib.jl
`pad_reflect` broken
It seems the new release (v0.9.18) breaks padding. Before, on v0.9.17:
julia> using NNlib
julia> x = rand(1:9, 3,3,1,1)
3×3×1×1 Array{Int64, 4}:
[:, :, 1, 1] =
7 3 2
1 5 5
7 7 6
julia> NNlib.pad_reflect(x, (1,0,1,0); dims=1:2)
4×4×1×1 Array{Int64, 4}:
[:, :, 1, 1] =
5 1 5 5
3 7 3 2
5 1 5 5
7 7 7 6
(@v1.10) pkg> st
Status `~/.julia/environments/v1.10/Project.toml`
[872c559c] NNlib v0.9.17
After updating to v0.9.18:
(@v1.10) pkg> st
Status `~/.julia/environments/v1.10/Project.toml`
[872c559c] NNlib v0.9.18
julia> NNlib.pad_reflect(x, (1,0,1,0); dims=1:2)
ERROR: DimensionMismatch: mismatch in dimension 2 (expected 3 got 1)
Stacktrace:
[1] _cs
@ ./abstractarray.jl:1785 [inlined]
[2] _cshp
@ ./abstractarray.jl:1771 [inlined]
[3] _cshp
@ ./abstractarray.jl:1782 [inlined]
[4] _cat_size_shape
@ ./abstractarray.jl:1761 [inlined]
[5] _cat_size_shape(dims::Tuple{Bool}, shape::NTuple{4, Int64}, X::Array{Int64, 4}, tail::Vector{Int64})
@ Base ./abstractarray.jl:1761
[6] cat_size_shape(::Tuple{Bool}, ::Array{Int64, 4}, ::Array{Int64, 4}, ::Vararg{Any})
@ Base ./abstractarray.jl:1759
[7] _cat_t(::Int64, ::Type{Int64}, ::Array{Int64, 4}, ::Vararg{Any})
@ Base ./abstractarray.jl:1800
[8] _cat(::Int64, ::Array{Int64, 4}, ::Array{Int64, 4}, ::Vararg{AbstractArray{Int64}})
@ Base ./abstractarray.jl:1995
[9] cat
@ ./abstractarray.jl:1993 [inlined]
[10] pad_reflect(x::Array{Int64, 4}, pad::Tuple{Int64, Int64}; dims::Int64)
@ NNlib ~/.julia/packages/NNlib/jLaeV/src/padding.jl:275
[11] pad_reflect(x::Array{Int64, 4}, pad::NTuple{4, Int64}; dims::UnitRange{Int64})
@ NNlib ~/.julia/packages/NNlib/jLaeV/src/padding.jl:263
[12] top-level scope
@ REPL[7]:1
It still works with symmetric padding sizes (i.e. the same padding on the left and right).
julia> NNlib.pad_reflect(x, (1,1,1,1); dims=1:2)
5×5×1×1 Array{Int64, 4}:
[:, :, 1, 1] =
5 1 5 5 5
3 7 3 2 3
5 1 5 5 5
7 7 7 6 7
5 1 5 5 5
Perhaps you know the fix, @pxl-th?
The tests should probably be updated to catch this; right now they only test the same padding on the left and right.
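For instance, a test along these lines (a hypothetical sketch, not the actual NNlib test suite; the expected values are taken from the v0.9.17 output above) would exercise asymmetric and zero-width padding:

```julia
using NNlib, Test

x = reshape([7 3 2; 1 5 5; 7 7 6], 3, 3, 1, 1)

# Asymmetric reflect padding: one extra row on top, one extra column on the left.
# Expected values match the v0.9.17 output shown above.
expected = [5 1 5 5;
            3 7 3 2;
            5 1 5 5;
            7 7 7 6]
@test NNlib.pad_reflect(x, (1, 0, 1, 0); dims = 1:2)[:, :, 1, 1] == expected

# Zero padding on every side should leave the array unchanged.
@test NNlib.pad_reflect(x, (0, 0, 0, 0); dims = 1:2) == x
```

The zero-width cases are exactly the ones that trigger the regression reported here.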
Ah, the problem occurs when one of the padding values is 0. I'll push the fix later today. Thanks for posting this!
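For reference, here is a minimal illustration of the same kind of `cat` failure, assuming (as the `Vector{Int64}` in frame [5] of the stack trace suggests) that a zero-width reflected slice ends up as a plain 1-D vector rather than a zero-size 4-D array before the `cat` call; this is not the actual NNlib code path, just the mismatch it runs into:

```julia
julia> x = rand(1:9, 3, 3, 1, 1);

julia> cat(x, Int[]; dims = 1)   # a 4-D array concatenated with a 1-D, zero-length piece
ERROR: DimensionMismatch: mismatch in dimension 2 (expected 3 got 1)
```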
@pxl-th, any chance of fixing this soon?
Sorry for the delay; this should be fixed by https://github.com/FluxML/NNlib.jl/pull/595
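Until a release containing that fix is out, one possible workaround (a sketch based on the observation above that symmetric padding still works) is to pad symmetrically and then trim the side that should have received zero padding:

```julia
using NNlib

x = rand(1:9, 3, 3, 1, 1)

# Emulate pad_reflect(x, (1, 0, 1, 0); dims = 1:2) on NNlib v0.9.18:
# pad by 1 on every side, then drop the extra bottom row and right column.
y = NNlib.pad_reflect(x, (1, 1, 1, 1); dims = 1:2)  # 5×5×1×1
y = y[1:end-1, 1:end-1, :, :]                       # 4×4×1×1, same as (1, 0, 1, 0) padding
```

This works because each side's reflection depends only on the input array, so trimming a symmetric pad yields the same values as the asymmetric pad would.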