Element-wise comparisons accept invalid dimension mismatch
```julia
using JuMP

function foo()
    model = Model()
    N = 3
    M = 4
    @variable(model, x[1:N, 1:M], Bin)
    LHS = sum(x, dims = 4)
    RHS = fill(1, (N,))
    @assert size(LHS) == (N, M)
    @assert size(RHS) == (N,)
    @assert size(LHS) != size(RHS)
    @constraints(model, begin
        LHS .<= RHS
    end);
    print(model)
end

foo()
```
The LHS (erroneously) has dimensionality different from that of the RHS. I would have hoped this would be caught, but perhaps this is some kind of intentional broadcasting?

There's also a difference in behavior depending on which side the reduced expression appears on (let me know if this should be filed as a new issue):
```julia
using JuMP

function foo()
    model = Model()
    N = 2
    M = 3
    K = 4
    @variable(model, x[1:N, 1:M, 1:K], Bin)
    @constraints(model, begin
        fill(1, (N, K)) .>= sum(x, dims = 2)  # fine.
        sum(x, dims = 2) .<= fill(1, (N, K))  # DimensionMismatch: destination axes (Base.OneTo(2), Base.OneTo(1), Base.OneTo(4)) are not compatible with source axes (Base.OneTo(2), Base.OneTo(4), Base.OneTo(4))
    end);
    print(model)
end

foo()
```
I'm guessing the bug in the original example arises from an attempt to ignore such singleton dimensions?
Your first example is expected behavior for broadcasting.
```julia
julia> x = rand(3, 4)
3×4 Matrix{Float64}:
 0.724199   0.0505117  0.789073   0.977107
 0.203153   0.245935   0.651512   0.641174
 0.0745526  0.138428   0.0774511  0.900541

julia> sum(x; dims = 4)
3×4 Matrix{Float64}:
 0.724199   0.0505117  0.789073   0.977107
 0.203153   0.245935   0.651512   0.641174
 0.0745526  0.138428   0.0774511  0.900541

julia> LHS = sum(x; dims = 4)
3×4 Matrix{Float64}:
 0.724199   0.0505117  0.789073   0.977107
 0.203153   0.245935   0.651512   0.641174
 0.0745526  0.138428   0.0774511  0.900541

julia> RHS = fill(1, (3))
3-element Vector{Int64}:
 1
 1
 1

julia> LHS .<= RHS
3×4 BitMatrix:
 1  1  1  1
 1  1  1  1
 1  1  1  1
```
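The shape logic behind this can be sketched in Python (a hypothetical helper, not JuMP or Julia code): Julia pads the shorter shape with trailing singleton dimensions, and then each dimension pair must be equal or contain a 1.

```python
# Sketch of Julia's broadcast shape rule (hypothetical helper, names are mine).
# Julia pads the *trailing* dimensions of the shorter shape with 1s, then each
# dimension pair must be equal or contain a 1.
def julia_broadcast_shape(a, b):
    n = max(len(a), len(b))
    a = a + (1,) * (n - len(a))  # pad trailing dims with singletons
    b = b + (1,) * (n - len(b))
    out = []
    for da, db in zip(a, b):
        if da == db or db == 1:
            out.append(da)
        elif da == 1:
            out.append(db)
        else:
            raise ValueError(f"dimensions {da} and {db} are incompatible")
    return tuple(out)

# A (3, 4) matrix against a (3,) vector: the vector is treated as (3, 1),
# so the comparison silently broadcasts to (3, 4) -- no error.
print(julia_broadcast_shape((3, 4), (3,)))  # (3, 4)
```

Under this rule the original example is well-formed, which is why no error is thrown.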
I don't understand why you have `sum(x; dims = 4)`.

I need to look into your second example.
The second example is:
```julia
julia> using JuMP

julia> import JuMP._MA as MA

julia> y = ones(2, 3);

julia> model = Model();

julia> @variable(model, x[1:2, 1:1, 1:3]);

julia> MA.broadcast!!(MA.sub_mul, x, y);

julia> MA.broadcast!!(MA.sub_mul, y, x);

julia> MA.broadcast!!(MA.sub_mul, 1.0 .* x, y);
ERROR: DimensionMismatch: destination axes (Base.OneTo(2), Base.OneTo(1), Base.OneTo(3)) are not compatible with source axes (Base.OneTo(2), Base.OneTo(3), Base.OneTo(3))
Stacktrace:
 [1] throwdm(axdest::Tuple{Base.OneTo{…}, Base.OneTo{…}, Base.OneTo{…}}, axsrc::Tuple{Base.OneTo{…}, Base.OneTo{…}, Base.OneTo{…}})
   @ Base.Broadcast ./broadcast.jl:1080
 [2] copyto!
   @ ./broadcast.jl:992 [inlined]
 [3] copyto!
   @ ./broadcast.jl:956 [inlined]
 [4] broadcast!(op::typeof(MutableArithmetics.sub_mul), A::Array{AffExpr, 3}, args::Matrix{Float64})
   @ MutableArithmetics ~/.julia/packages/MutableArithmetics/iovKe/src/broadcast.jl:157
 [5] broadcast_fallback!
   @ ~/.julia/packages/MutableArithmetics/iovKe/src/broadcast.jl:213 [inlined]
 [6] broadcast!!(::typeof(MutableArithmetics.sub_mul), ::Array{AffExpr, 3}, ::Matrix{Float64})
   @ MutableArithmetics ~/.julia/packages/MutableArithmetics/iovKe/src/broadcast.jl:185
 [7] top-level scope
   @ REPL[105]:1
Some type information was truncated. Use `show(err)` to see complete types.

julia> MA.broadcast!!(MA.sub_mul, y, 1.0 .* x);
```
> I don't understand why you have `sum(x; dims = 4)`.
Because I've summed along the wrong dimension, and "expected" it would get caught in the comparison. It's possible I've oversimplified it and that it needs one more dimension, as in the second example.
This is what should happen:
```julia
julia> A = rand(2, 1, 3)
2×1×3 Array{Float64, 3}:
[:, :, 1] =
 0.1524404350076013
 0.8671340260214189

[:, :, 2] =
 0.9476682352354726
 0.6380799807793498

[:, :, 3] =
 0.8768902839999501
 0.4313691470659615

julia> B = rand(2, 3)
2×3 Matrix{Float64}:
 0.777273  0.494098  0.824876
 0.440939  0.450255  0.642786

julia> A .- B
2×3×3 Array{Float64, 3}:
[:, :, 1] =
 -0.624833  -0.341658  -0.672435
  0.426195   0.416879   0.224348

[:, :, 2] =
 0.170395  0.45357   0.122792
 0.197141  0.187825  -0.00470633

[:, :, 3] =
  0.0996173   0.382792   0.0520145
 -0.00956984  -0.0188854  -0.211417

julia> B .- A
2×3×3 Array{Float64, 3}:
[:, :, 1] =
  0.624833   0.341658   0.672435
 -0.426195  -0.416879  -0.224348

[:, :, 2] =
 -0.170395  -0.45357   -0.122792
 -0.197141  -0.187825   0.00470633

[:, :, 3] =
 -0.0996173  -0.382792  -0.0520145
  0.00956984  0.0188854  0.211417
```
> Because I've summed along the wrong dimension, and "expected" it would get caught in the comparison.

JuMP can't "see" that you've summed. We just see a matrix, and the sizes happen to be compatible with broadcasting.
```julia
julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.

julia> N = 3
3

julia> M = 4
4

julia> @variable(model, x[1:N, 1:M], Bin)
3×4 Matrix{VariableRef}:
 x[1,1]  x[1,2]  x[1,3]  x[1,4]
 x[2,1]  x[2,2]  x[2,3]  x[2,4]
 x[3,1]  x[3,2]  x[3,3]  x[3,4]

julia> LHS = sum(x, dims=4)
3×4 Matrix{AffExpr}:
 x[1,1]  x[1,2]  x[1,3]  x[1,4]
 x[2,1]  x[2,2]  x[2,3]  x[2,4]
 x[3,1]  x[3,2]  x[3,3]  x[3,4]
```
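The shape behavior of `sum(x; dims = d)` can be sketched in Python (hypothetical helper, names are mine): a reduced dimension collapses to 1, while a `dims` beyond `ndims` leaves the shape untouched, which is why `sum(x, dims = 4)` above is a silent no-op.

```python
# Sketch of the shape of Julia's sum(x; dims = d) -- hypothetical helper.
def sum_shape(shape, dim):
    if dim > len(shape):
        # dims beyond ndims: the reduction is a no-op on the shape
        return shape
    # the reduced dimension collapses to a singleton
    return shape[:dim - 1] + (1,) + shape[dim:]

print(sum_shape((3, 4), 4))     # (3, 4)   -- unchanged, the typo goes unnoticed
print(sum_shape((2, 3, 4), 2))  # (2, 1, 4) -- the intended reduction
```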
> Because I've summed along the wrong dimension, and "expected" it would get caught in the comparison.

> JuMP can't "see" that you've summed. We just see a matrix, and the sizes happen to be compatible with broadcasting.

I'm not really following. Those two statements are not in conflict.
It didn't get caught because your typo still resulted in valid code.
> This is what should happen:

Ouch. Well, now you have your answer to https://github.com/jump-dev/JuMP.jl/issues/3770#issuecomment-2162054182 :]
> It didn't get caught because your typo still resulted in valid code.

Correct. I'm just saying that I *thought* it would have been caught, instead of me having to notice it and spend time tracking it down through the model...
> Ouch. Well, now you have your answer

No, I think that is expected behavior (to match Base Julia). You probably want to use `dropdims`:
```julia
julia> A = rand(2, 3, 4)
2×3×4 Array{Float64, 3}:
[:, :, 1] =
 0.659558   0.671503  0.442636
 0.0383281  0.738598  0.235671

[:, :, 2] =
 0.183098  0.608992  0.432493
 0.761186  0.909809  0.670562

[:, :, 3] =
 0.198515  0.510995  0.517809
 0.776882  0.513872  0.982992

[:, :, 4] =
 0.659864  0.772425  0.0286673
 0.992181  0.220161  0.573799

julia> sum(A; dims = 2)
2×1×4 Array{Float64, 3}:
[:, :, 1] =
 1.773697126725578
 1.012597206611262

[:, :, 2] =
 1.2245835938090597
 2.341556988316256

[:, :, 3] =
 1.2273194375111922
 2.2737469813821107

[:, :, 4] =
 1.4609564897911356
 1.7861408833538186

julia> dropdims(sum(A; dims = 2); dims = 2)
2×4 Matrix{Float64}:
 1.7737  1.22458  1.22732  1.46096
 1.0126  2.34156  2.27375  1.78614
```
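The effect of `dropdims` on shapes can be sketched in Python (hypothetical helper, 0-based indices unlike Julia's 1-based `dims`): removing the reduced singleton dimension restores the shape you probably expected, so a plain shape-equality check then catches mismatches that broadcasting would silently accept.

```python
# Hypothetical sketch mirroring Julia's dropdims on shapes.
# `dims` here uses 0-based indices (Julia's dims are 1-based).
def drop_dims(shape, dims):
    for d in dims:
        if shape[d] != 1:
            raise ValueError(f"dimension {d} has size {shape[d]}, expected 1")
    return tuple(s for i, s in enumerate(shape) if i not in set(dims))

# sum(A; dims = 2) on a (2, 3, 4) array leaves a (2, 1, 4) result;
# dropping the singleton gives the (2, 4) matrix you expected.
print(drop_dims((2, 1, 4), (1,)))  # (2, 4)
```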
> Ouch. Well, now you have your answer

> No, I think that is expected behavior (to match Base Julia).

I do not disagree, but I do think we may be talking a bit past each other.
Moving to MutableArithmetics:
```julia
julia> using Revise

julia> import MutableArithmetics as MA

julia> A = rand(2, 1, 3)
2×1×3 Array{Float64, 3}:
[:, :, 1] =
 0.19212165666948622
 0.22056184518998745

[:, :, 2] =
 0.24383425805314962
 0.28541745336519664

[:, :, 3] =
 0.92728595640186
 0.8477968088951745

julia> B = rand(2, 3)
2×3 Matrix{Float64}:
 0.899054  0.757108  0.113272
 0.718785  0.552237  0.15519

julia> MA.broadcast!!(MA.sub_mul, A, B) ≈ A .- B
ERROR: DimensionMismatch: destination axes (Base.OneTo(2), Base.OneTo(1), Base.OneTo(3)) are not compatible with source axes (Base.OneTo(2), Base.OneTo(3), Base.OneTo(3))
Stacktrace:
 [1] throwdm(axdest::Tuple{Base.OneTo{…}, Base.OneTo{…}, Base.OneTo{…}}, axsrc::Tuple{Base.OneTo{…}, Base.OneTo{…}, Base.OneTo{…}})
   @ Base.Broadcast ./broadcast.jl:1080
 [2] copyto!
   @ ./broadcast.jl:992 [inlined]
 [3] copyto!
   @ ./broadcast.jl:956 [inlined]
 [4] broadcast!(op::typeof(MutableArithmetics.sub_mul), A::Array{Float64, 3}, args::Matrix{Float64})
   @ MutableArithmetics ~/.julia/dev/MutableArithmetics/src/broadcast.jl:161
 [5] broadcast_fallback!
   @ ~/.julia/dev/MutableArithmetics/src/broadcast.jl:217 [inlined]
 [6] broadcast!!(::typeof(MutableArithmetics.sub_mul), ::Array{Float64, 3}, ::Matrix{Float64})
   @ MutableArithmetics ~/.julia/dev/MutableArithmetics/src/broadcast.jl:189
 [7] top-level scope
   @ REPL[25]:1
Some type information was truncated. Use `show(err)` to see complete types.

julia> MA.broadcast!!(MA.sub_mul, B, A) ≈ B .- A
true
```
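The asymmetry looks consistent with the contract of in-place `broadcast!`: the destination's axes must equal the broadcast result's axes, so mutating the first argument only works when it already has the result's shape. A rough Python sketch of that check (all names here are hypothetical; the padding rule mirrors Julia's broadcasting):

```python
# Hypothetical sketch of why broadcast! fails in one direction.
def broadcast_shape(a, b):
    # Julia-style rule: pad trailing dims with 1s, then combine pairwise.
    n = max(len(a), len(b))
    a = a + (1,) * (n - len(a))
    b = b + (1,) * (n - len(b))
    out = []
    for da, db in zip(a, b):
        if da == db or db == 1:
            out.append(da)
        elif da == 1:
            out.append(db)
        else:
            raise ValueError("incompatible dimensions")
    return tuple(out)

def can_broadcast_into(dest, src):
    # In-place broadcast! is only valid if the combined shape fits the destination.
    try:
        return broadcast_shape(dest, src) == dest
    except ValueError:
        return False

# A is (2, 1, 3), B is (2, 3): the combined result is (2, 3, 3),
# which fits neither argument, so broadcast! into A must throw.
print(can_broadcast_into((2, 1, 3), (2, 3)))  # False
print(can_broadcast_into((2, 3), (2, 1, 3)))  # False
```

If this sketch is right, the direction that succeeds above can only do so via an allocating fallback, since the `(2, 3, 3)` result does not fit into `B` either.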
@odow thank you!