David Widmann

1463 comments by David Widmann

I still think one could consider dropping support for `Flux.params` and switching to scalar parameters instead. Both `Ref` and `Vector` seem a bit unnatural.
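To make the `Ref`/`Vector` point concrete (a toy sketch with made-up kernel types): `Flux.params` only collects arrays while traversing a model, which is the only reason parameters end up boxed in length-1 vectors or `Ref`s in the first place; a plain scalar field is simply invisible to it:

```julia
using Flux

# Made-up types, for illustration only:
struct VecKernel{T}
    σ²::Vector{T}  # length-1 vector, solely so that Flux.params can find it
end
struct ScalarKernel{T<:Real}
    σ²::T  # the natural representation, but invisible to Flux.params
end
Flux.@functor VecKernel
Flux.@functor ScalarKernel

Flux.params(VecKernel([1.0]))   # Params([[1.0]]): the array is collected
Flux.params(ScalarKernel(1.0))  # Params([]): scalar fields are not collected
```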

I haven't thought this through, but maybe it is possible to write a custom Flux layer that contains only the parameter vector and a function that builds the kernel: ...
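Roughly something like this (an untested sketch; `KernelLayer` and its fields are placeholder names, not an existing type):

```julia
using Flux, KernelFunctions

# The layer owns the only trainable state, a flat parameter vector.
# The kernel itself stays immutable and is rebuilt from θ on demand.
struct KernelLayer{V<:AbstractVector,F}
    θ::V  # trainable parameters
    f::F  # θ -> kernel
end

Flux.@functor KernelLayer (θ,)  # only θ is trainable

# Evaluate the kernel defined by the current parameters.
(layer::KernelLayer)(x, y) = layer.f(layer.θ)(x, y)

# Example: squared-exponential kernel with a positive length scale.
layer = KernelLayer([0.0], θ -> with_lengthscale(SqExponentialKernel(), exp(θ[1])))
layer(1.0, 2.0)
```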

As I said, I haven't given it much thought. But the main idea would be that instead of forcing all kernels to be mutable, we just have one kernel...

> We would need a conversion from any kernel to a `KernelLayer` to then be used with any optimization tool? The main idea is that it only requires the output...
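A hedged sketch of what such a conversion could look like, reusing the hypothetical `KernelLayer` from above and assuming the kernel's parameters are reachable via Functors; `Flux.destructure` already returns the two required ingredients, a flat parameter vector and a rebuilding closure:

```julia
using Flux, KernelFunctions

# Hypothetical convenience constructor, not an existing API:
# destructure yields (θ::Vector, rebuild), which is exactly
# what the KernelLayer sketch above stores.
KernelLayer(k::Kernel) = KernelLayer(Flux.destructure(k)...)
```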

I came back to this issue a few days ago, thought a bit more about the problems and the suggestion above, and started to work on a draft. Currently my...

E.g., in https://juliagaussianprocesses.github.io/KernelFunctions.jl/stable/examples/deep-kernel-learning/ one approach would be to remove the `@functor` definitions (I don't think they are needed currently either; they are already defined in KernelFunctions and not useful for...
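For context, the pattern in question looks along these lines (the concrete type names are from memory, not a verbatim quote of the example):

```julia
using Flux, KernelFunctions

# Redundant if KernelFunctions already registers its types with Functors:
Flux.@functor TransformedKernel
Flux.@functor ScaledKernel
```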

I was worried that such generic fallbacks introduce a) type piracy and b) surprising behaviour. I also became more and more convinced that it would be better to error if...
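To make the piracy concern concrete, here is a generic toy example (nothing KernelFunctions-specific): a fallback counts as type piracy when the package defining it owns neither the function nor the type it is defined for:

```julia
module PackageA           # owns the function
    flatten_params(x) = error("no method")
end

module PackageB           # owns the type
    struct MyKernel end
end

module GluePackage        # owns neither, so the fallback below is type piracy
    import ..PackageA, ..PackageB
    PackageA.flatten_params(::PackageB.MyKernel) = Float64[]
end

PackageA.flatten_params(PackageB.MyKernel())  # behaviour changed from afar
```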

No, this was not what I had in mind or wanted to suggest. I do not want to allow ParameterHandling types in kernels; they are not even `Real`s and cause...
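Concretely, with ParameterHandling.jl:

```julia
using ParameterHandling

p = positive(2.5)
p isa Real                  # false: a constrained-parameter wrapper, not a number
ParameterHandling.value(p)  # 2.5, the underlying Real
# exp(-p) and similar would throw a MethodError; kernels expect plain `Real`s
```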

One doesn't. That's the whole point of `flatten` IMO - it is just one opinionated way of flattening the parameters and usually not the most efficient one. It doesn't allow...
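For reference, this is the `flatten` from ParameterHandling.jl I mean; it allocates a fresh vector and rebuilds the whole structure on every round trip, which is convenient but generally not the most efficient representation:

```julia
using ParameterHandling

ps = (σ² = positive(1.0), ℓ = positive(0.5))
v, unflatten = ParameterHandling.flatten(ps)  # v isa Vector{Float64}, unconstrained
ps2 = unflatten(v)            # allocates and rebuilds the whole structure
ParameterHandling.value(ps2)  # (σ² = 1.0, ℓ = 0.5)
```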

I already have something in a local branch; I had to play around a bit before I ended up with the suggestion above. But before finishing and polishing it, I...