
Additional Numerical Methods

Beakerboy opened this issue on Aug 01 '16 · 7 comments

It would be good to have numerical integration and differentiation: integration using the Newton–Cotes formulas, and differentiation using the symmetric derivative.
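As a sketch of what that could look like (hypothetical function names, not an existing math-php API): the composite trapezoidal rule is the simplest closed Newton–Cotes formula, and the symmetric derivative is just the central difference quotient.

```php
<?php
// Composite trapezoidal rule (the simplest closed Newton–Cotes formula):
// approximate the integral of f over [a, b] with n equal subintervals.
function trapezoidal(callable $f, float $a, float $b, int $n): float
{
    $h   = ($b - $a) / $n;
    $sum = ($f($a) + $f($b)) / 2;
    for ($i = 1; $i < $n; $i++) {
        $sum += $f($a + $i * $h);
    }
    return $h * $sum;
}

// Symmetric derivative: the central difference quotient (f(x+h) - f(x-h)) / 2h.
function symmetricDerivative(callable $f, float $x, float $h = 1e-5): float
{
    return ($f($x + $h) - $f($x - $h)) / (2 * $h);
}

echo trapezoidal('sin', 0, M_PI, 1000), "\n"; // ≈ 2.0 (exact value is 2)
echo symmetricDerivative('sin', 0.0), "\n";   // ≈ 1.0 (cos(0) = 1)
```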

Beakerboy · Aug 01 '16

Sounds good. Can you implement this and do a pull request? Thanks!

markrogoyski · Aug 01 '16

I can eventually. I figured I'd leave it as a feature request that I think would benefit the project as a whole.

Beakerboy · Aug 01 '16

I have a lot of recent experience with numerical methods, so I can tackle a lot of these. Off the top of my head, we can include things like:

  • Interpolation (Lagrange polynomials, Hermite, cubic splines, etc.)
  • Numerical differentiation and integration (Richardson extrapolation, Gaussian quadrature, Romberg integration, etc.)
  • Methods for solving IVPs (Euler's method, Runge-Kutta, etc.; see the sketch after this list)
  • Methods for solving BVPs (linear shooting, finite difference, Rayleigh-Ritz, etc.)
  • Iterative techniques (Jacobi, Gauss-Seidel, Conjugate Gradient, etc.)
  • Approximation theory (least squares, Chebyshev polynomials, fast Fourier transform, etc.)

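To give a flavor of the IVP methods above, here is a minimal Euler's method sketch (PHP 7.4+; the function name and signature are hypothetical, not an existing math-php API):

```php
<?php
// Euler's method for y' = f(t, y), y(t0) = y0:
// step forward with y_{n+1} = y_n + h * f(t_n, y_n).
function eulerMethod(callable $f, float $t0, float $y0, float $h, int $steps): array
{
    $t = $t0;
    $y = $y0;
    $points = [[$t, $y]];
    for ($i = 0; $i < $steps; $i++) {
        $y += $h * $f($t, $y);
        $t += $h;
        $points[] = [$t, $y];
    }
    return $points;
}

// y' = y, y(0) = 1  =>  y(1) = e ≈ 2.71828
$solution = eulerMethod(fn($t, $y) => $y, 0.0, 1.0, 0.001, 1000);
echo end($solution)[1], "\n"; // ≈ 2.7169 (Euler is only first-order accurate)
```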
We could even include numerical methods for linear algebra (e.g. approximating eigenvalues), though it may make sense to build those on top of the existing linear algebra tools.
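For the eigenvalue case, the textbook starting point is power iteration, which approximates the dominant eigenvalue by repeated matrix-vector products. A rough sketch on plain nested arrays (deliberately not assuming a Matrix class):

```php
<?php
// Power iteration: repeatedly apply A to a vector and normalize;
// the scaling factor converges to the dominant eigenvalue of A.
function powerIteration(array $A, int $iterations = 100): float
{
    $n = count($A);
    $v = array_fill(0, $n, 1.0);
    $lambda = 0.0;
    for ($k = 0; $k < $iterations; $k++) {
        // w = A * v
        $w = array_fill(0, $n, 0.0);
        for ($i = 0; $i < $n; $i++) {
            for ($j = 0; $j < $n; $j++) {
                $w[$i] += $A[$i][$j] * $v[$j];
            }
        }
        // Normalize by the largest component (this drops the eigenvalue's
        // sign, which a real implementation would track).
        $lambda = max(array_map('abs', $w));
        foreach ($w as $i => $wi) {
            $v[$i] = $wi / $lambda;
        }
    }
    return $lambda;
}

// Eigenvalues of [[2, 1], [1, 2]] are 3 and 1; the dominant one is 3.
echo powerIteration([[2, 1], [1, 2]]), "\n"; // ≈ 3.0
```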

jakobsandberg · Aug 29 '16

Thanks for offering to contribute!

markrogoyski · Aug 29 '16

I was thinking about working on some of these techniques when I started on this project. My plan for interpolation was to add it to the regression namespace; my reasoning was that interpolation is just another technique for producing a Y from a new X given a larger dataset, like linear or non-linear regression. Look at what's there already, and you may be able to build on the existing regression design to fit in what you want to add. I have non-linear least squares in the pipeline, using Jacobian matrices and the Gauss-Newton algorithm (https://github.com/Beakerboy/math-php/blob/Nonlinear/src/Statistics/Regression/Methods/GaussNewton.php). I've coded it up but haven't tested anything, and it is currently specialized for the Michaelis-Menten equation rather than a generic "Model".
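For context, each Gauss-Newton iteration linearizes the residuals with the Jacobian and solves the normal equations (J^T J) Δβ = J^T r for a parameter update. Here is a self-contained sketch for the Michaelis-Menten model v = V*x / (K + x); this is only an illustration with made-up names, not the linked GaussNewton.php:

```php
<?php
// Gauss-Newton fit of the Michaelis-Menten model v = V*x / (K + x).
// Each iteration solves (J^T J) Δβ = J^T r, where r holds the residuals
// and J is the Jacobian of the model with respect to (V, K).
function fitMichaelisMenten(array $xs, array $ys, float $V, float $K, int $iterations = 20): array
{
    for ($it = 0; $it < $iterations; $it++) {
        // Accumulate J^T J (2x2, symmetric) and J^T r (2x1) directly.
        $a11 = $a12 = $a22 = $b1 = $b2 = 0.0;
        foreach ($xs as $i => $x) {
            $denom = $K + $x;
            $j1 = $x / $denom;                 // ∂v/∂V
            $j2 = -$V * $x / ($denom ** 2);    // ∂v/∂K
            $r  = $ys[$i] - $V * $x / $denom;  // residual
            $a11 += $j1 * $j1; $a12 += $j1 * $j2; $a22 += $j2 * $j2;
            $b1  += $j1 * $r;  $b2  += $j2 * $r;
        }
        // Solve the 2x2 normal equations by Cramer's rule and update.
        $det = $a11 * $a22 - $a12 * $a12;
        $V  += ($b1 * $a22 - $b2 * $a12) / $det;
        $K  += ($a11 * $b2 - $a12 * $b1) / $det;
    }
    return ['V' => $V, 'K' => $K];
}
```

For the 2x2 case Cramer's rule is enough; a generic "Model" version would swap in a matrix solve and a model-supplied Jacobian.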

Beakerboy · Aug 30 '16

@jakobsandberg, I like to look at the C++ Boost library for advice on implementing different functions. They seem to prefer the TOMS 748 algorithm for non-derivative root finding. Is this something you have under a different name, or are planning on using? Boost reference: http://www.boost.org/doc/libs/1_61_0/libs/math/doc/html/math_toolkit/roots/roots_noderiv/TOMS748.html

Beakerboy · Sep 04 '16

It looks like the TOMS 748 algorithm uses "a mixture of cubic, quadratic and linear (secant) interpolation". I'm in the process of building interpolation techniques right now, so once the linear, quadratic, and cubic ones are done we could definitely build something like (or identical to) TOMS 748 by combining them.
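To make that "mixture" concrete, here are two of those building blocks as single steps; a full TOMS 748 implementation wraps steps like these in bracketing logic and fallbacks (this is an illustration, not the Boost algorithm):

```php
<?php
// Secant step: fit a line through (a, f(a)) and (b, f(b)) and return
// its root. This is the "linear interpolation" fallback.
function secantStep(float $a, float $fa, float $b, float $fb): float
{
    return $b - $fb * ($b - $a) / ($fb - $fa);
}

// Inverse quadratic interpolation step: fit x as a quadratic in y
// through three points and evaluate it at y = 0.
function inverseQuadraticStep(
    float $a, float $fa,
    float $b, float $fb,
    float $c, float $fc
): float {
    return $a * $fb * $fc / (($fa - $fb) * ($fa - $fc))
         + $b * $fa * $fc / (($fb - $fa) * ($fb - $fc))
         + $c * $fa * $fb / (($fc - $fa) * ($fc - $fb));
}
```

Each step takes the current bracket points and proposes the next root estimate; the surrounding algorithm keeps whichever proposal stays inside the bracket and shrinks it fastest.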


jakobsandberg · Sep 04 '16