
Can sparse grid interpolation use GPU?

Open · Buantum opened this issue 4 months ago · 1 comment

I used a Gaussian function to test bbai's sparse grid interpolation. The time required grows with the dimension as shown below. My project needs to repeatedly optimize a high-dimensional function, so it is very sensitive to runtime.

[Figure: fit time vs. dimension for the Gaussian test function]
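
For reference, the timing loop looked roughly like this. The import path, class name, and `fit()` signature here are paraphrased for illustration, not bbai's documented API; check the library's documentation for the exact interface:

```python
import time
import numpy as np

# NOTE: the import path, class name, and fit() signature below are
# assumptions for illustration; consult bbai's documentation for the
# actual sparse grid interpolation interface.
from bbai.numeric import SparseGridInterpolator

def gaussian(*xs):
    # Separable d-dimensional Gaussian; xs holds one coordinate array
    # per dimension.
    return np.exp(-sum(x * x for x in xs))

for d in range(2, 8):
    interp = SparseGridInterpolator()
    t0 = time.perf_counter()
    # Fit over the domain [-1, 1]^d.
    interp.fit(gaussian, np.full(d, -1.0), np.full(d, 1.0))
    print(f"d = {d}: fit took {time.perf_counter() - t0:.3f} s")
```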

Buantum · Aug 26 '25

Hi @Buantum, GPU support is something I might consider adding.

Have you done any benchmarking with the actual function you plan on fitting? Can you give a ballpark figure for how long the fit takes now versus how much of an improvement you'd need (excluding the cost of evaluating the function at grid points, since that isn't done in the library)?
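
If it helps, one way to separate the two costs is to wrap the target function so that time spent in its evaluations is accumulated independently of the fit. A minimal sketch in plain Python, independent of bbai's API:

```python
import time

def timed(f):
    """Wrap a target function so the time spent in its evaluations is
    accumulated on the wrapper; subtracting wrapper.elapsed from the
    total wall-clock fit time isolates the library's own cost."""
    def wrapper(*args):
        t0 = time.perf_counter()
        out = f(*args)
        wrapper.elapsed += time.perf_counter() - t0
        wrapper.calls += 1
        return out
    wrapper.elapsed = 0.0
    wrapper.calls = 0
    return wrapper
```

Usage: fit with `timed(gaussian)` in place of `gaussian`, then compare the wrapper's `elapsed` against the overall fit time to see how much of the wall time the library itself accounts for.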

Also, are you fitting the function in order to evaluate the interpolant at other points, or are you just using the integral value?

rnburn · Aug 27 '25