NeuralPDE.jl
generate_training_sets improvements
We use `generate_training_sets` to set up models with `GridTraining`. While experimenting with some more complex models, I found a few ways to break `generate_training_sets`.
- `dif` is defined as

```julia
dif = [eltypeθ[] for i=1:size(domains)[1]]
```
but is then used inside this loop:

```julia
for _args in bound_args
    for (i,x) in enumerate(_args)
        if x isa Number
            push!(dif[i],x)
        end
    end
end
```
If `length(bound_args) > length(domains)`, this breaks. It will also throw a `BoundsError` if any `_args` is longer than `domains` or `bound_args`.
`GridTraining` is not the best training strategy, so this may be acceptable, but ideally this code would be refactored to be more robust.
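One possible hardening, sketched below with made-up inputs (this is not NeuralPDE's actual code, just one way the loop could be made defensive): validate each argument position against the number of domains and raise a descriptive error instead of an opaque `BoundsError`.

```julia
eltypeθ = Float64
domains = [1, 2]                       # stand-in for the real domain list
dif = [eltypeθ[] for _ in 1:length(domains)]
bound_args = [[0.5, 1.5]]              # well-formed input for the happy path

for _args in bound_args
    for (i, x) in enumerate(_args)
        x isa Number || continue
        # Guard the index before pushing, so a malformed _args fails loudly
        # with a message instead of a raw BoundsError.
        i <= length(dif) || throw(ArgumentError(
            "argument position $i exceeds the number of domains ($(length(dif)))"))
        push!(dif[i], x)
    end
end

println(dif)                           # per-domain buckets: [[0.5], [1.5]]
```

Whether an out-of-range position should throw, skip, or grow `dif` is a design choice for the maintainers; the point is only that the failure should be explicit rather than an internal indexing error.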