manyfeatures

7 comments of manyfeatures

Is your custom layer supposed to be trainable? If it is, you can try these changes:

```julia
Flux.@functor S

dudt = Chain(
    Dense(2, 30, tanh),
    Dense(30, 10),
    S(10, 2),
)
```
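The definition of `S` is not shown in the thread, so here is a minimal sketch of what a functor-tagged custom layer could look like. The struct name `S`, the fields `a` and `b`, and the affine forward pass are all assumptions (the fields are only suggested by the later mentions of `S.a` and `S.b`):

```julia
using Flux

# Hypothetical definition of the custom layer S; the real one is not in the thread.
struct S
    a::Matrix{Float32}   # weight, later kept trainable
    b::Vector{Float32}   # bias, later kept fixed via Flux.trainable
end

# Convenience constructor in the Dense(in, out) spirit.
S(in::Integer, out::Integer) = S(randn(Float32, out, in), zeros(Float32, out))

# Placeholder forward pass: a plain affine map.
(m::S)(x) = m.a * x .+ m.b

# Register the struct with Flux so its fields are collected as parameters.
Flux.@functor S
```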

@xiaoky97 try these lines:

```julia
Flux.@functor S
Flux.trainable(m::S) = (m.a,)
```
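A quick check of what this does, assuming the sketched `S` from the previous comment: once `trainable` is restricted to `a`, `Flux.params` should stop collecting `b`.

```julia
# Only `a` is marked trainable, so params should contain a single array.
Flux.trainable(m::S) = (m.a,)

layer = S(10, 2)
ps = Flux.params(layer)
@assert length(ps) == 1          # just `a`; `b` is excluded
println(size.(collect(ps)))      # e.g. [(2, 10)]
```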

What do you see when printing the parameters?

```julia
ps = Flux.params(dudt)
for p in ps
    println(length(p))
end
```

My code keeps S.b fixed. Why do you even have 684 weights? In...
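To make the count easy to compare against the architecture, here is a sketch that also prints shapes and the total number of scalar weights (assuming the `Chain` and hypothetical `S` sketched above):

```julia
# Per-layer shapes and total scalar parameter count.
ps = Flux.params(dudt)
for p in ps
    println(size(p), " -> ", length(p))
end
println("total = ", sum(length, ps))

# For Chain(Dense(2,30,tanh), Dense(30,10), S(10,2)) with S.b excluded:
# 2*30 + 30 + 30*10 + 10 + 10*2 = 420 scalar parameters.
```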

Ok, this is weird. In the NeuralODE layer the parameters come back again after `destructure`:

```julia
_p, re = Flux.destructure(dudt) # part from the NeuralODE layer
_p1 = Flux.params(dudt)
println("Original network _p $(length(_p))") #...
```
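One way to see the discrepancy is to compare the length of the flat vector returned by `destructure` (which NeuralODE uses) with the number of scalars collected by `Flux.params`. This is only a sketch; whether the two numbers differ depends on whether the Flux version in use respects the custom `trainable` method inside `destructure`:

```julia
# Compare the flat parameter vector with the params count.
flat, re = Flux.destructure(dudt)   # flat vector + reconstructor, as used by NeuralODE
ps = Flux.params(dudt)

println("destructure length: ", length(flat))
println("params length:      ", sum(length, ps))
# If destructure ignores the custom `trainable`, the first number is larger,
# i.e. S.b gets flattened into the vector even though params excludes it.
```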

@Vaibhavdixit02 Yep, I'll try to rework the PR in a couple of days

I've run into a problem: I can't limit the hidden state amplitude once I turn the task into a NeuralODE layer. I'll describe the case on Discourse.

I created the [post](https://discourse.julialang.org/t/custom-neuralode-layer-trains-innefficiently/61602) with the code and will also keep examining it myself.