Calculus.jl

Calculus functions in Julia

42 Calculus.jl issues

https://github.com/JuliaMath/Calculus.jl/blob/ffeaee8ab516c4dd598be246e2ec8745f1dd70ca/src/differentiate.jl#L172

tl;dr The `finite_difference` methods for the gradient and jacobian (but not hessian) temporarily mutate the input `x` vector in order to compute the derivatives. I think this is unwise since...
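
For comparison, here is a minimal sketch (not Calculus' actual code, and making no assumptions about its internals) of a central-difference gradient that works on a copy of the input, so the caller's `x` is never touched, even transiently:

```julia
# Sketch: central-difference gradient against a work copy of `x`.
function fd_gradient(f, x::AbstractVector)
    h  = sqrt(eps(float(eltype(x))))
    g  = similar(x, float(eltype(x)))
    xp = float.(x)                    # fresh buffer; the caller's `x` stays untouched
    for i in eachindex(xp)
        step = h * max(abs(xp[i]), one(xp[i]))
        xi = xp[i]
        xp[i] = xi + step
        fplus = f(xp)
        xp[i] = xi - step
        fminus = f(xp)
        xp[i] = xi                    # restore the copy for the next component
        g[i] = (fplus - fminus) / (2 * step)
    end
    return g
end

# e.g. fd_gradient(x -> sum(abs2, x), [1.0, 2.0, 3.0]) ≈ [2.0, 4.0, 6.0]
```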

I think there is a problem with the `central` difference code. Here is a simple example:

```julia
julia> Calculus.derivative(x -> x/(x + 1.4424183196362515e-9), 2e-8, :central)
-39.33717713979761
```

```julia
julia> Calculus.derivative(x -> x/(x + 1.4424183196362515e-9), 2e-8, :forward)...
```
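
For reference, the analytic derivative at that point is easy to check by hand (an illustration, not part of the original report): for f(x) = x/(x + c), f'(x) = c/(x + c)^2, which is large and positive here, so the `:central` result above is wrong in both sign and magnitude.

```julia
# Illustration only: analytic derivative of f(x) = x/(x + c) is c/(x + c)^2.
c  = 1.4424183196362515e-9
x0 = 2e-8
c / (x0 + c)^2    # ≈ 3.1e6, while :central above returns ≈ -39.3
```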

This PR basically removes the lower bound on the size of `epsilon`. It fixes the current problem for small `x` with some functions, such as `f(x) = log(x)`. MWE of...
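
A hypothetical illustration of the failure mode the PR targets (not the PR's own MWE): if `epsilon` is floored at roughly `sqrt(eps())`, the stencil for a much smaller `x` steps across zero, which `log` cannot tolerate.

```julia
# Assumed floor of sqrt(eps(Float64)) ≈ 1.5e-8 for illustration; the exact
# bound in Calculus.jl may differ by a constant or use cbrt(eps()).
x0      = 1e-9
epsilon = sqrt(eps(Float64)) * max(one(x0), abs(x0))   # ≈ 1.5e-8, dominated by the floor
x0 - epsilon                                           # ≈ -1.4e-8 < 0, so log(x0 - epsilon) is a DomainError
```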

Is there a reason that `abs` is not supported by `differentiate`, or is it just an omission?

Ref: https://discourse.julialang.org/t/package-compatibility-caps/15301

I'm using `Calculus.jl` in `FEMBasis.jl` to calculate partial derivatives of interpolation polynomials symbolically before code generation. During precompilation of the package, I get a lot of the following messages:

```
WARNING: eval...
```

It looks like this package may be using `norm(matrix)`. In Julia 0.7, this will compute the Frobenius norm (`vecnorm` in Julia 0.6), due to JuliaLang/julia#27401. If you want the induced/operator norm as...
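
For context, the post-JuliaLang/julia#27401 split looks like this (a quick sketch, not code from Calculus.jl):

```julia
using LinearAlgebra

A = [1.0 2.0; 3.0 4.0]
norm(A)      # Frobenius norm, sqrt(sum(abs2, A)) ≈ 5.477 (vecnorm in 0.6)
opnorm(A)    # induced/operator 2-norm, i.e. the largest singular value ≈ 5.465
```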

Since a recent update, there seems to be a problem with the `gradient` function. The following works fine:

```julia
julia> gradient([1:3;])
3-element Array{Float64,1}:
 1.0
 1.0
 1.0

julia> using Calculus
julia>...
```
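
If the breakage is a name clash between `Base.gradient` (Julia 0.6) and the exported `Calculus.gradient`, the usual workaround is to qualify the call with the module you mean; a sketch, assuming the standard `Calculus.gradient(f, x)` finite-difference API:

```julia
using Calculus

# Qualifying the call sidesteps any ambiguity with another exported `gradient`.
Calculus.gradient(x -> sum(sin, x), [1.0, 2.0, 3.0])   # ≈ cos.([1.0, 2.0, 3.0])
```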

It seems that the finite difference routines would "just work" if `epsilon` was changed to the "correct value". Unless there's some kind of norm argument, I would propose changing each...
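
One possible reading of the proposal, sketched with a per-component relative step rather than one absolute `epsilon` (an illustration only, since the issue text is truncated):

```julia
# Step proportional to the magnitude of each component; the sqrt(eps()) and
# cbrt(eps()) factors are the usual textbook choices for forward and central
# differences, not necessarily what the issue proposes.
forward_step(xi::AbstractFloat) = sqrt(eps(typeof(xi))) * abs(xi)
central_step(xi::AbstractFloat) = cbrt(eps(typeof(xi))) * abs(xi)
# (a real implementation would still need a fallback for xi == 0)

x0 = 1e-10
h  = central_step(x0)
(log(x0 + h) - log(x0 - h)) / (2h)   # ≈ 1/x0 = 1.0e10, instead of a DomainError
```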