Can sparse grid interpolation use the GPU?
I tested bbai's sparse grid interpolation with a Gaussian function. The time required to fit varies with the dimension as follows. My project needs to repeatedly solve an optimization over a high-dimensional function, so it is very sensitive to runtime.
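For reference, here is a minimal sketch of this kind of per-dimension timing test. The bbai fit call itself is not shown; it is injected as a `fit_fn` placeholder, so the harness is library-agnostic:

```python
import time
import numpy as np

def gaussian(*xs):
    # Isotropic Gaussian test function in len(xs) dimensions.
    return np.exp(-sum(np.asarray(x) ** 2 for x in xs))

def time_fits(fit_fn, dims):
    # fit_fn(f, dim) should wrap the actual bbai sparse grid fit
    # (e.g. over [-1, 1]^dim); it is injected so this sketch does not
    # assume any particular bbai API.
    for dim in dims:
        start = time.perf_counter()
        fit_fn(gaussian, dim)
        print(f"dim={dim}: fit took {time.perf_counter() - start:.3f} s")

# Example with a trivial stand-in (replace with the real bbai fit call):
time_fits(lambda f, dim: f(*np.zeros(dim)), range(2, 8))
```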
Hi @Buantum, GPU support is something I might consider adding.
Have you done any benchmarking with the actual function you plan on fitting? Can you provide a ballpark figure for how long the fit takes now vs. how much of an improvement you'd need (excluding the cost of evaluating your function at the grid points, since that's not done in the library)?
Also, are you fitting the function in order to evaluate the interpolant at other points, or are you just using the integral value?
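If it helps, one library-agnostic way to separate the two costs is to wrap your function so its evaluation time can be subtracted from the total fit time, along these lines (the fit call in the usage comment is just a placeholder for the real bbai call):

```python
import time

class TimedFunction:
    # Wraps the target function so the time spent evaluating it at grid
    # points can be subtracted from the total fit time.
    def __init__(self, f):
        self.f = f
        self.eval_time = 0.0
        self.num_calls = 0

    def __call__(self, *args):
        start = time.perf_counter()
        result = self.f(*args)
        self.eval_time += time.perf_counter() - start
        self.num_calls += 1
        return result

# Hypothetical usage (substitute the real bbai fit for fit_sparse_grid):
#   g = TimedFunction(my_function)
#   start = time.perf_counter()
#   fit_sparse_grid(g, dim)
#   total = time.perf_counter() - start
#   print(f"library time: {total - g.eval_time:.3f} s, "
#         f"evaluations: {g.eval_time:.3f} s over {g.num_calls} calls")
```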