
generate_training_sets improvements

Open killah-t-cell opened this issue 2 years ago • 0 comments

We use generate_training_sets to set up models with GridTraining. While experimenting with some more complex models, I found a few ways to break generate_training_sets.

  1. dif is defined as dif = [eltypeθ[] for i = 1:size(domains)[1]] but is then used inside this loop:

```julia
for _args in bound_args
    for (i, x) in enumerate(_args)
        if x isa Number
            push!(dif[i], x)
        end
    end
end
```

If length(bound_args) > length(domains), this breaks. Likewise, if any _args tuple is longer than domains (and hence longer than dif), the push!(dif[i], x) call errors with a BoundsError.
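A minimal reproduction of the mismatch, assuming a toy dif sized to two domains and a hypothetical _args tuple of length three (the values here are made up for illustration):

```julia
# `dif` is sized to the number of domains (here 2), but an `_args`
# tuple can be longer than that, so `dif[i]` can go out of bounds.
eltypeθ = Float64
dif = [eltypeθ[] for _ in 1:2]          # length(domains) == 2

bound_args = [(0.0, 1.0, 0.5)]          # an _args tuple of length 3

err = try
    for _args in bound_args
        for (i, x) in enumerate(_args)
            if x isa Number
                push!(dif[i], x)        # i == 3 exceeds length(dif) == 2
            end
        end
    end
    nothing
catch e
    e
end

err isa BoundsError   # → true
```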

GridTraining is not the best training strategy, so this may be acceptable, but ideally generate_training_sets should be refactored to be more robust.
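One possible hardening, sketched here as a suggestion rather than NeuralPDE's actual code: size dif to the longest argument tuple instead of to length(domains), so the inner loop can never index past the end (bound_args values are again hypothetical):

```julia
# Sketch of a more robust sizing for `dif`: use the longest `_args`
# tuple rather than the number of domains.
eltypeθ = Float64
bound_args = [(0.0, 1.0, 0.5), (2.0,)]

maxlen = maximum(length, bound_args; init = 0)
dif = [eltypeθ[] for _ in 1:maxlen]     # never shorter than any _args

for _args in bound_args
    for (i, x) in enumerate(_args)
        if x isa Number
            push!(dif[i], x)            # i <= maxlen by construction
        end
    end
end

dif   # → [[0.0, 2.0], [1.0], [0.5]]
```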

killah-t-cell avatar Nov 22 '21 07:11 killah-t-cell