DiffEqFlux.jl
Remove deprecations for breaking release
Fixes https://github.com/SciML/DiffEqFlux.jl/issues/707
@Abhishek-1Bhatt can you take this one from here? All that needs to be done is to update the tests in the same way the tutorials are being updated (a lot of the tests and examples should be the same).
Sure 👍
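For reference, the pattern the updated tutorials use in place of the removed sciml_train looks roughly like this; the model and loss below are placeholders, not the actual test code.

```julia
# Rough sketch of the Lux + Optimization.jl training pattern that replaces
# DiffEqFlux.sciml_train; model and loss are illustrative only.
using Lux, Optimization, OptimizationOptimJL, ComponentArrays, Random, Zygote

rng = Random.default_rng()
model = Lux.Chain(Lux.Dense(2 => 16, tanh), Lux.Dense(16 => 2))
ps, st = Lux.setup(rng, model)
ps = ComponentArray(ps)

# Dummy loss: drive the model output toward zero on a fixed input.
loss(p, _) = sum(abs2, first(model(ones(Float32, 2), p, st)))

optf = OptimizationFunction(loss, Optimization.AutoZygote())
prob = OptimizationProblem(optf, ps)
res  = solve(prob, BFGS(); maxiters = 100)  # BFGS is re-exported by OptimizationOptimJL
```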
Issues related to sciml_train should be closed or moved once this PR is merged.
https://github.com/search?q=sciml_train+repo%3ASciML%2FDiffEqFlux.jl+state%3Aopen+in%3Atitle&type=Issues
Did a lot of issue cleanup. Most were solved already.
Full CI still isn't running. There are also some conflicts with master; I can't see what the conflicts are because of access issues. We may also need to merge master into this branch, since it says it is 10 commits behind.
This line errors in the Newton NeuralODE test: https://github.com/SciML/DiffEqFlux.jl/blob/c48a0e147bb2196ed277dcfa300ca4c90351683b/test/newton_neural_ode.jl#L51
Is this a compatibility issue with Lux?
ERROR: MethodError: no method matching initial_state(::Optim.KrylovTrustRegion{Float64}, ::Optim.Options{Float64, OptimizationOptimJL.var"#_cb#11"{var"#5#6", Optim.KrylovTrustRegion{Float64}, Base.Iterators.Cycle{Tuple{Optimization.NullData}}}}, ::NLSolversBase.TwiceDifferentiableHV{Float32, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}}, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}}, ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}}}, ::ComponentArrays.ComponentVector{Float32, Vector{Float32}, Tuple{ComponentArrays.Axis{(layer_1 = ViewAxis(1:10, Axis(weight = ViewAxis(1:5, ShapedAxis((5, 1), NamedTuple())), bias = ViewAxis(6:10, ShapedAxis((5, 1), NamedTuple())))), layer_2 = ViewAxis(11:16, Axis(weight = ViewAxis(1:5, ShapedAxis((1, 5), NamedTuple())), bias = ViewAxis(6:6, ShapedAxis((1, 1), NamedTuple())))))}}})
Closest candidates are:
initial_state(::AcceleratedGradientDescent, ::Any, ::Any, ::AbstractArray{T}) where T at C:\Users\user\.julia\packages\Optim\6Lpjy\src\multivariate\solvers\first_order\accelerated_gradient_descent.jl:35
initial_state(::Optim.KrylovTrustRegion, ::Any, ::Any, ::Array{T}) where T at C:\Users\user\.julia\packages\Optim\6Lpjy\src\multivariate\solvers\second_order\krylov_trust_region.jl:39
initial_state(::SimulatedAnnealing, ::Any, ::Any, ::AbstractArray{T}) where T at C:\Users\user\.julia\packages\Optim\6Lpjy\src\multivariate\solvers\zeroth_order\simulated_annealing.jl:62
See if you can isolate it to Optim.KrylovTrustRegion() just requiring standard Array types: that would be my guess.
https://github.com/JuliaNLSolvers/Optim.jl/blob/d5cb5dae049dcb7bcff94d691a099b3650d5d9d8/src/multivariate/solvers/second_order/krylov_trust_region.jl#L39
What would be the best way to allow initial_x above to be a ComponentVector?
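If I'm reading the linked signature right, the mismatch is purely a dispatch issue: that initial_state method is restricted to ::Array{T}, and a ComponentVector is only an AbstractArray. A quick illustration (not from the test):

```julia
using ComponentArrays

x = ComponentVector(a = 0.0, b = 0.0)
x isa AbstractArray  # true
x isa Array          # false — so initial_state(::Optim.KrylovTrustRegion, ::Any, ::Any, ::Array{T}) doesn't apply
```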
Just use Flux.jl for that for now. The real answer is an upstream fix to Optim.jl but that shouldn't block this.
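A sketch of that workaround (illustrative names, not the actual test code): Flux.destructure returns a flat Vector{Float32}, which satisfies the ::Array{T} restriction.

```julia
# Illustrative workaround: destructure the Flux model to a plain Vector{Float32},
# which matches the ::Array{T} signature that KrylovTrustRegion requires.
using Flux

model = Flux.Chain(Flux.Dense(1 => 5, tanh), Flux.Dense(5 => 1))
p0, re = Flux.destructure(model)   # p0 isa Vector{Float32}
# build the OptimizationProblem with p0 as the initial point and solve with
# Optim.KrylovTrustRegion() as before
```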
Just a heads up here: I will be deprecating quite a few functionalities in v0.4.8 for removal in v0.5 (mostly these were undocumented, but they ended up in user code; see https://github.com/avik-pal/Lux.jl/blob/ap/tests/src/deprecated.jl). It might be worthwhile to avoid using these (most notably ActivationFunction).
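For anything here that currently uses ActivationFunction, a possible replacement looks like the sketch below, assuming WrappedFunction is the non-deprecated way to wrap a bare function as a layer (check the deprecation list linked above):

```julia
using Lux

# Instead of Lux.ActivationFunction(tanh), wrap the broadcasted function as a layer:
act = Lux.WrappedFunction(Base.Fix1(broadcast, tanh))
model = Lux.Chain(Lux.Dense(2 => 5), act, Lux.Dense(5 => 1))

# Or simply pass the activation directly to Dense where possible:
model2 = Lux.Chain(Lux.Dense(2 => 5, tanh), Lux.Dense(5 => 1))
```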
Continued in https://github.com/SciML/DiffEqFlux.jl/pull/794