
Remove deprecations for breaking release

Open ChrisRackauckas opened this issue 3 years ago • 11 comments

Fixes https://github.com/SciML/DiffEqFlux.jl/issues/707

ChrisRackauckas avatar Jun 04 '22 00:06 ChrisRackauckas

@Abhishek-1Bhatt can you take this one from here? All that needs to be done is to update the tests in the same way the tutorials are being updated (many of the tests and examples should be the same).

ChrisRackauckas avatar Jun 05 '22 11:06 ChrisRackauckas

Sure 👍

ba2tripleO avatar Jun 05 '22 11:06 ba2tripleO

Issues related to sciml_train should be closed or moved after this PR is merged. https://github.com/search?q=sciml_train+repo%3ASciML%2FDiffEqFlux.jl+state%3Aopen+in%3Atitle&type=Issues

prbzrg avatar Jun 05 '22 12:06 prbzrg
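
For context, the sciml_train deprecation points users at the Optimization.jl API instead. A minimal migration sketch, assuming a user-defined loss function `loss` and initial parameter vector `p0` (both hypothetical placeholders), not the exact code from this PR:

```julia
using Optimization, OptimizationOptimJL, Zygote

# Old (deprecated) pattern:
#   result = DiffEqFlux.sciml_train(loss, p0, BFGS())

# Optimization.jl replacement. Objectives take (u, p): `u` is the
# optimization state, `p` holds fixed hyperparameters (unused here).
optf = OptimizationFunction((u, p) -> loss(u), Optimization.AutoZygote())
optprob = OptimizationProblem(optf, p0)
result = solve(optprob, BFGS(); maxiters = 300)
result.u  # optimized parameters
```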

Did a lot of issue cleanup. Most were solved already.

ChrisRackauckas avatar Jun 11 '22 04:06 ChrisRackauckas

Still doesn't run the full CI. Also, there are some conflicts with master; I can't see what the conflicts are because of access issues. We may also need to merge master into this branch, since it says it's 10 commits behind.

ba2tripleO avatar Jun 15 '22 01:06 ba2tripleO

This line errors in the Newton NeuralODE test: https://github.com/SciML/DiffEqFlux.jl/blob/c48a0e147bb2196ed277dcfa300ca4c90351683b/test/newton_neural_ode.jl#L51

Is this a compatibility issue with Lux?

ERROR: MethodError: no method matching initial_state(::Optim.KrylovTrustRegion{Float64}, ::Optim.Options{Float64, OptimizationOptimJL.var"#_cb#11"{var"#5#6", Optim.KrylovTrustRegion{Float64}, Base.Iterators.Cycle{Tuple{Optimization.NullData}}}}, ::NLSolversBase.TwiceDifferentiableHV{Float32, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}}, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}}, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}}}, ::ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}})
Closest candidates are:
  initial_state(::AcceleratedGradientDescent, ::Any, ::Any, ::AbstractArray{T}) where T at C:\Users\user\.julia\packages\Optim\6Lpjy\src\multivariate\solvers\first_order\accelerated_gradient_descent.jl:35
  initial_state(::Optim.KrylovTrustRegion, ::Any, ::Any, ::Array{T}) where T at C:\Users\user\.julia\packages\Optim\6Lpjy\src\multivariate\solvers\second_order\krylov_trust_region.jl:39
  initial_state(::SimulatedAnnealing, ::Any, ::Any, ::AbstractArray{T}) where T at C:\Users\user\.julia\packages\Optim\6Lpjy\src\multivariate\solvers\zeroth_order\simulated_annealing.jl:62

ba2tripleO avatar Jun 18 '22 12:06 ba2tripleO

See if you can isolate it to Optim.KrylovTrustRegion() requiring standard Array types; that would be my guess.

ChrisRackauckas avatar Jun 18 '22 12:06 ChrisRackauckas
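
One way to check that guess (an illustrative sketch; `p_ca` is a made-up stand-in for the test's Lux parameters): KrylovTrustRegion's `initial_state` is only defined for plain Arrays, so a ComponentVector falls through dispatch while a collapsed Vector does not.

```julia
using ComponentArrays, Optim

# Stand-in for the ComponentVector of Lux parameters in the failing test.
p_ca = ComponentVector(weight = rand(Float32, 5), bias = rand(Float32, 5))
p = collect(p_ca)  # collapse to a plain Vector{Float32}

# KrylovTrustRegion's initial_state is annotated ::Array{T}, so only
# the plain-Vector signature has a matching method:
hasmethod(Optim.initial_state,
          Tuple{Optim.KrylovTrustRegion, Any, Any, typeof(p)})     # true
hasmethod(Optim.initial_state,
          Tuple{Optim.KrylovTrustRegion, Any, Any, typeof(p_ca)})  # false
```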

https://github.com/JuliaNLSolvers/Optim.jl/blob/d5cb5dae049dcb7bcff94d691a099b3650d5d9d8/src/multivariate/solvers/second_order/krylov_trust_region.jl#L39

ba2tripleO avatar Jun 18 '22 12:06 ba2tripleO

What would be the best way to allow initial_x above to be a ComponentVector?

ba2tripleO avatar Jun 19 '22 02:06 ba2tripleO
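
For illustration only, a toy sketch of the kind of upstream change being discussed (hypothetical names, not actual Optim.jl code): widening the `::Array{T}` annotation to `::AbstractArray{T}` would let a ComponentVector dispatch, provided the method body sticks to generic operations like `similar` and `copyto!`.

```julia
using ComponentArrays

# Toy stand-ins for the two signatures:
f_current(x::Array) = sum(abs2, x)           # mirrors the ::Array{T} restriction
f_proposed(x::AbstractArray) = sum(abs2, x)  # mirrors the suggested widening

p = ComponentVector(a = [1.0f0, 2.0f0])
f_proposed(p)     # works: ComponentVector <: AbstractArray
# f_current(p)    # MethodError: a ComponentVector is not an Array
```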

Just use Flux.jl for that for now. The real answer is an upstream fix to Optim.jl, but that shouldn't block this.

ChrisRackauckas avatar Jun 19 '22 10:06 ChrisRackauckas
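
A minimal sketch of that interim workaround, assuming a small 1 → 5 → 1 network like the one in the failing test: Flux.destructure flattens the parameters into a plain Vector{Float32}, which satisfies Optim's ::Array dispatch.

```julia
using Flux

model = Chain(Dense(1, 5, tanh), Dense(5, 1))

# destructure returns a flat parameter vector plus a function `re`
# that rebuilds the model from any vector of the same length.
p0, re = Flux.destructure(model)
p0 isa Vector{Float32}  # true — no ComponentVector involved

x = rand(Float32, 1, 10)
re(p0)(x)  # evaluate the rebuilt model at parameters p0
```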

Just a heads up here: I will be deprecating quite a few functionalities in v0.4.8 for removal in v0.5 (mostly these were undocumented, but they ended up in user code; see https://github.com/avik-pal/Lux.jl/blob/ap/tests/src/deprecated.jl). It might be worthwhile to avoid using these (most notably ActivationFunction).

avik-pal avatar Jun 30 '22 06:06 avik-pal
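
A hedged sketch of steering around the most notable of those deprecations, assuming WrappedFunction remains the supported replacement for ActivationFunction (the deprecated.jl file linked above is the authoritative mapping):

```julia
using Lux

# Deprecated in Lux v0.4.8, slated for removal in v0.5:
#   layer = ActivationFunction(tanh)

# Assumed replacement: wrap the broadcasting call explicitly...
layer = WrappedFunction(Base.Fix1(broadcast, tanh))

# ...or simply fuse the activation into the neighboring layer:
model = Chain(Dense(1 => 5, tanh), Dense(5 => 1))
```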

Continued in https://github.com/SciML/DiffEqFlux.jl/pull/794

ChrisRackauckas avatar Jan 17 '23 16:01 ChrisRackauckas