Implement more methods for a faster PDMats
In my understanding, one of the major advantages of this package over standard approaches is that it makes computations with positive-definite matrices faster. However, it seems to me that some methods are missing, so PDMats does not always offer the most efficient implementation.
As an example, the product of a ScalMat by a scalar or a matrix is implemented in
https://github.com/JuliaStats/PDMats.jl/blob/4aaf961d15583b866849b7af0b2c190d9cb8232c/src/scalmat.jl#L47-L50
but there is no implementation for the reverse order; the product of a matrix by a ScalMat, for example, falls back to the standard LinearAlgebra multiplication:
```julia
julia> s = PDMats.ScalMat(3, 0.25);

julia> M = rand(3,3);

julia> @which s * M
*(a::PDMats.ScalMat, x::AbstractMatrix)
     @ PDMats ~/.julia/packages/PDMats/8CRip/src/scalmat.jl:47

julia> @which M * s
*(A::AbstractMatrix, B::AbstractMatrix)
     @ LinearAlgebra ~/.julia/juliaup/julia-1.12.2+0.aarch64.apple.darwin14/share/julia/stdlib/v1.12/LinearAlgebra/src/matmul.jl:114
```
The same happens for products with PDiagMat or PDMat. This, of course, can have a very negative impact on the efficiency of this package.
Similarly:
```julia
julia> x = rand(3);

julia> @which s * x
*(a::PDMats.ScalMat, x::AbstractVector)
     @ PDMats ~/.julia/packages/PDMats/8CRip/src/scalmat.jl:43

julia> @which x' * s
*(x::Adjoint{T, <:AbstractVector} where T, A::AbstractMatrix)
     @ LinearAlgebra ~/.julia/juliaup/julia-1.12.2+0.aarch64.apple.darwin14/share/julia/stdlib/v1.12/LinearAlgebra/src/matmul.jl:97

julia> @which dot(x, s, x)
dot(x::AbstractVector, A::AbstractMatrix, y::AbstractVector)
     @ LinearAlgebra ~/.julia/juliaup/julia-1.12.2+0.aarch64.apple.darwin14/share/julia/stdlib/v1.12/LinearAlgebra/src/generic.jl:1028
```
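For the quadratic form, PDMats already provides an efficient `quad(s, x)`, but the generic `dot(x, s, x)` does not take advantage of it. As a purely illustrative sketch (these methods are not part of PDMats, and I am assuming ScalMat's scaling factor is stored in its `value` field), the missing specializations could look roughly like this:

```julia
using LinearAlgebra
using PDMats

# Hypothetical specializations, for illustration only (dimension checks omitted).
# A ScalMat represents value * I, so both operations reduce to cheap scalings.
LinearAlgebra.dot(x::AbstractVector, a::PDMats.ScalMat, y::AbstractVector) =
    a.value * dot(x, y)

Base.:*(x::Adjoint{<:Any,<:AbstractVector}, a::PDMats.ScalMat) = a.value * x
```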
Along these lines, I am confused by issue #201 (see my comment there).
In summary: if we think that PDMats should always provide the most efficient solution, we should make an effort to implement the missing methods. If there is agreement, I am willing to help.
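To make this concrete, here is a rough sketch of what one of the missing methods might look like (hypothetical code, just to illustrate the idea; again I am assuming the `value` field of ScalMat):

```julia
using PDMats

# Hypothetical method, for illustration only: a ScalMat acts as value * I,
# so right-multiplying a matrix by it is just a scaling (dimension check omitted).
Base.:*(x::AbstractMatrix, a::PDMats.ScalMat) = x * a.value
```

Analogous methods would presumably be needed for PDiagMat and PDMat.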
This method for ScalMat should actually be removed, or rather it should not return a ScalMat. The result is just not a pd matrix in general, and changing the output depending on the value would introduce a type instability.
Sorry @devmotion, I do not think I understand your point. Obviously the return type of the method will depend on the inputs, and will not in general be a ScalMat. But this does not necessarily introduce a type instability, as the return type would depend not on the values but on the types of the inputs. Also, why do you think the method returns a ScalMat? It does not.
The product of a ScalMat and a scalar currently returns a ScalMat. That's wrong, though, unless the scalar is positive. Returning, e.g., a Diagonal for negative scalars and a ScalMat for positive scalars would introduce a type instability. Thus the method should just always return a Diagonal (ideally with a FillArrays.Fill diagonal).
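For concreteness, a minimal sketch of that suggestion (illustration only, assuming ScalMat's `dim` and `value` fields) could be:

```julia
using FillArrays, LinearAlgebra
using PDMats

# Sketch: always return a lazy Diagonal, so the result type does not depend
# on the sign of the scalar.
Base.:*(a::PDMats.ScalMat, c::Real) = Diagonal(Fill(a.value * c, a.dim))
```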
Sorry, I missed the positivity requirement. However, I tend to agree with @andreasnoack, as I tried to argue here: https://github.com/JuliaStats/PDMats.jl/issues/201#issuecomment-3677892352