Jonathan Feinberg
As sparse grids are sensitive to noise, removing potential noise sources becomes important to maintain numerical stability. And finding out what those are often requires a bit of diagnostics. I have...
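For context, a minimal sketch of how a sparse grid is typically set up in chaospy; the distribution and order are placeholders for illustration only:

```python
import chaospy

# Hypothetical joint distribution; any chaospy distribution works here.
distribution = chaospy.J(chaospy.Normal(0, 1), chaospy.Uniform(-1, 1))

# Smolyak sparse grid built from a nested univariate rule; the negative
# weights that show up in such grids are one common source of numerical noise.
nodes, weights = chaospy.generate_quadrature(
    4, distribution, rule="clenshaw_curtis", sparse=True)
print(nodes.shape, weights.shape)  # (2, n_nodes) and (n_nodes,)
```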
The former syntax is correct, and that is a bug. Good catch. I will start working on a fix. I'll let you know when it is in place.
The solution to the problem required a little bit of a workaround, which results in the code being a bit slower at calculating moments for MvNormal now. That being said,...
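For reference, this is the kind of call affected, a small sketch assuming the usual MvNormal constructor with a mean vector and covariance matrix:

```python
import chaospy

# Two correlated Gaussian variables (values chosen only for illustration).
mean = [0.0, 1.0]
covariance = [[1.0, 0.5],
              [0.5, 2.0]]
distribution = chaospy.MvNormal(mean, covariance)

# Raw moment E[q0**2 * q1]; this is the moment evaluation that the
# workaround makes somewhat slower.
print(distribution.mom((2, 1)))
```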
Yes, 256 random variables is quite a lot, and beyond what chaospy was intended for. The fact that this bugfix required me to move some numpy functionality to...
Note to self (or @Oo1Insane1oO, if you so graciously want to help):

* Add pybind11 machinery
* Move `chaospy.bertran.operator:bindex` to C++ (with numpy arrays instead of pure Python objects; see the sketch below)
* Move...
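For context, here is a rough pure-Python stand-in for the kind of work `bindex` does, namely graded multi-index enumeration. The actual signature and ordering in `chaospy.bertran` may differ; this is only meant to illustrate what would be moved to C++:

```python
import itertools
import numpy

def multi_indices(order, dim):
    """Illustrative stand-in for bindex-style work: enumerate all
    multi-indices with total degree <= order, in graded order."""
    indices = [
        idx for idx in itertools.product(range(order + 1), repeat=dim)
        if sum(idx) <= order
    ]
    indices.sort(key=lambda idx: (sum(idx), idx))
    return numpy.array(indices, dtype=int)

print(multi_indices(order=2, dim=2))
# rows: [0 0], [0 1], [1 0], [0 2], [1 1], [2 0]
```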
Yes, you are right. The issue has been on my radar for a while now. In fact I am working on switching the backend for the polynomials these days, which hopefully...
Cool. The major components of the refactoring are already in place. It consists of redefining the polynomial class as a numpy data type. I have split it out into a...
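As a small illustration of what that means in practice (using current numpoly names, so details may differ from the version discussed here):

```python
import numpy
import numpoly

# Indeterminants; numpoly names them q0, q1, ... by default.
q0, q1 = numpoly.variable(2)

# Polynomials are stored in a numpy-compatible array type, so ordinary
# numpy functions dispatch to polynomial-aware implementations.
poly = numpoly.polynomial([1, q0, q0 * q1 + 1])
print(poly.shape)        # (3,)
print(numpy.sum(poly))   # q0*q1+q0+2
```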
You are understanding the problem correctly. numpoly allows the indeterminants to have any name, not only `q0, q1, ...`. The system is still positional in the sense that if you...
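A small sketch of the distinction; it uses the default `q0, q1` names for brevity, but the binding works the same way whatever the indeterminants are called:

```python
import numpoly

q0, q1 = numpoly.variable(2)
poly = q0**2 + q1

# Positional evaluation binds arguments to indeterminants by their order
# in poly.names; keyword evaluation binds by name instead.
print(poly(2, 3))          # 7
print(poly(q1=3, q0=2))    # 7
```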
So, update time, since a couple of months have passed. Looking through the different options, I have come to the conclusion that though `numpoly` can do things by keyword, `chaospy`...
@tostenzel, the current implementation has two sets of definitions: `Sens_*` and `Sens_*_sample`. The former bypasses the sampling and calculates sensitivity without Monte Carlo. Though this implementation is faster than Monte Carlo, it is...
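For reference, a minimal sketch of how the non-sampling variant is typically used on a fitted polynomial expansion; the toy model is a placeholder, and the function names follow recent chaospy, which may differ from the version in this thread:

```python
import chaospy

# Toy model and distribution, for illustration only.
distribution = chaospy.J(chaospy.Uniform(0, 1), chaospy.Uniform(0, 1))
samples = distribution.sample(200)
evaluations = samples[0] + 2 * samples[1] ** 2

# Fit a polynomial chaos expansion and get first-order Sobol indices
# directly from the expansion (no extra Monte Carlo step).
expansion = chaospy.generate_expansion(3, distribution)
approx = chaospy.fit_regression(expansion, samples, evaluations)
print(chaospy.Sens_m(approx, distribution))
```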