Xiaokai Yang
> Is your custom layer supposed to be trainable?
> If it is, then you can try these changes
>
> ```
> Flux.@functor S
> dudt = Chain(
> ...
> ```
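For context, a minimal sketch of the kind of layer and `Chain` the quoted suggestion seems to be pointing at (the field names `a`/`b`, the call definition, and all sizes below are assumptions, not the actual model from this issue):

```julia
using Flux

# Hypothetical custom layer: `a` is meant to be trained, `b` is an extra field.
struct S
    a
    b
end

# Make the layer callable so it can sit inside a Chain.
(m::S)(x) = m.a * x .+ m.b

# Register the struct with Flux so its fields are visited by Flux.params.
Flux.@functor S

# Hypothetical network; layer sizes are made up.
dudt = Chain(Dense(2, 50, tanh), S(rand(Float32, 2, 50), rand(Float32, 2)))

Flux.params(dudt)  # now includes S's fields alongside the Dense weights
```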
> Just use what the neural de layers use:
>
> https://github.com/SciML/DiffEqFlux.jl/blob/master/src/neural_de.jl#L3

I added

```
Flux.trainable(n_ode::NeuralODE) = (n_ode.p[1:680],)
```

to discard the parameters for `S.b`, which is `p[681:684]`. But the...
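In case it is useful to anyone reading along: as far as I understand, `NeuralODE` keeps a single flat parameter vector, and `Flux.destructure` is one way to see how such a vector is laid out, which is how you find the slice that belongs to a given field. The model below is a stand-in, so the 680/684 split from this issue does not apply to it:

```julia
using Flux

# Stand-in network; the real one in this issue has 684 flat parameters in total.
model = Chain(Dense(2, 16, tanh), Dense(16, 2))

# destructure flattens every parameter array into one vector (in traversal
# order) and returns `re`, which rebuilds the model from such a vector.
p, re = Flux.destructure(model)

# The sizes of the individual arrays tell you which slice of `p` is which.
for q in Flux.params(model)
    println(size(q), " -> ", length(q), " entries")
end
println("total flat parameters: ", length(p))
```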
> @xiaoky97 try these lines
>
> ```
> Flux.@functor S
> Flux.trainable(m::S) = (m.a,)
> ```

I added these lines. It seems that everything in `n_ode.p` was updated/optimized...
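For reference, a quick way to check what those two lines do to an `S` on its own (the field names and sizes are assumed here; whether this also restricts the flat `n_ode.p` of the `NeuralODE` is exactly the open question):

```julia
using Flux

# Assumed layer definition; the real S in this issue may differ.
struct S
    a
    b
end
(m::S)(x) = m.a * x .+ m.b

Flux.@functor S
Flux.trainable(m::S) = (m.a,)  # expose only `a` to the optimiser
# (on newer Flux versions this may need to be a NamedTuple: (; a = m.a))

s = S(rand(Float32, 4, 4), rand(Float32, 4))
Flux.params(s)  # expected to contain only s.a, not s.b
```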
> What do you see printing parameters?
>
> ```
> ps = Flux.params(dudt)
>
> for p in ps
>     println(length(p))
> end
> ```
>
> My code...
I changed the code from

```julia
Flux.@functor S
Flux.trainable(m::S) = (m.a,)
```

to:

```julia
Flux.@functor S (a,)
```

------

Then I got:

```julia
Flux.params(S) # I got Params([])
```

```julia
...
```
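One guess about the `Params([])` above: `Flux.params` is called on the *type* `S` there, while `@functor S (a,)` only has an effect when walking an *instance*. A small sketch with the same assumed fields as before:

```julia
using Flux

struct S
    a
    b
end
(m::S)(x) = m.a * x .+ m.b

# Restrict functor traversal (and hence params collection) to the field `a`.
Flux.@functor S (a,)

s = S(rand(Float32, 3, 3), rand(Float32, 3))

Flux.params(S)  # the type itself: nothing to collect, so Params([])
Flux.params(s)  # an instance: should contain only s.a
```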
Hi @DhairyaLGandhi, I am not quite sure whether this is a Flux or a DiffEqFlux error... But I am still getting the `DimensionMismatch` error even if I try `@functor S`...