Vladimir Kostyukov
I think if you switch to a dense matrix (from CRS) you should be fine in that test. It's cheaper to write a 15000x15000 dense matrix than to write a 15000x15000 sparse matrix...
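A back-of-the-envelope sketch of why that is. This is not la4j's actual binary format (an assumption for illustration): a dense dump stores 8 bytes per cell in one sequential pass, while a CRS-style dump stores a value plus a column index per non-zero, plus a row-pointer array. For a mostly full matrix, the dense image is smaller and simpler to write.

```java
// Hypothetical size model, NOT la4j's real on-disk layout.
public class SerializedSizes {

    // Dense binary dump: 8 bytes (one double) per cell.
    static long denseBytes(long rows, long cols) {
        return rows * cols * 8L;
    }

    // CRS-style dump: 8 bytes (value) + 4 bytes (column index) per
    // non-zero, plus 4 bytes per row pointer (rows + 1 of them).
    static long crsBytes(long rows, long nonZeros) {
        return nonZeros * (8L + 4L) + (rows + 1) * 4L;
    }

    public static void main(String[] args) {
        long n = 15_000;
        long cells = n * n;
        System.out.println("dense:      " + denseBytes(n, n));   // 1.8 GB
        System.out.println("crs (full): " + crsBytes(n, cells)); // ~2.7 GB
    }
}
```

Under this model the break-even point is roughly two-thirds fill: above that, the CRS image is strictly larger than the dense one on top of the extra index bookkeeping.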
Matrices/Vectors used to be serializable. I decided not to do that, for the same reason no one else should: Java's serialization is completely broken (both performance-wise and...
I'm afraid not, there is no easy fix to that.
There is a limit on the maximum sparse matrix size in la4j: `MxN = Int.MaxValue`. There is no way you can do 1m x 1m with la4j, just because there is...
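The cap follows from cell counts being held in an `int`: a 1m x 1m matrix has 10^12 cells, which does not even survive 32-bit multiplication. A minimal standalone check (the `fits` helper is mine, not a la4j API):

```java
// Why MxN <= Int.MaxValue: the cell count must fit a 32-bit int.
public class SizeLimit {
    static boolean fits(long rows, long cols) {
        return rows * cols <= Integer.MAX_VALUE; // computed in 64-bit
    }

    public static void main(String[] args) {
        System.out.println(fits(1_000_000, 1_000_000)); // false: 10^12 cells
        System.out.println(fits(18_600, 18_600));       // true: ~3.46e8 cells

        // The naive 32-bit product silently wraps to a negative number:
        int wrapped = 1_000_000 * 1_000_000;
        System.out.println(wrapped);
    }
}
```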
Yeah, you should be fine with 18600x18600 matrices. I admit there is a bug in `.toBinary` (w.r.t. integer overflow), and if it's possible to allocate this matrix with la4j...
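A guess at the shape of that overflow (the helper names are mine, not la4j's): 18600x18600 = 345,960,000 cells fits an `int`, but the *byte* length of the binary image (8 bytes per double) is ~2.77 GB, which wraps negative in 32-bit arithmetic.

```java
// Illustrates the overflow class of bug: cell count fits an int,
// the byte length of the serialized image does not.
public class ToBinaryOverflow {
    static int brokenLength(int rows, int cols) {
        return rows * cols * 8;         // 32-bit: wraps for 18600^2 cells
    }

    static long fixedLength(int rows, int cols) {
        return (long) rows * cols * 8L; // promote before multiplying
    }

    public static void main(String[] args) {
        System.out.println(brokenLength(18_600, 18_600)); // negative garbage
        System.out.println(fixedLength(18_600, 18_600));  // 2767680000
    }
}
```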
This might be a good idea for the next milestone (0.5.0).
Right. Caching the intermediate result (aka memoization) is the most common practice for performance improvements. Since the la4j library is a single-threaded solution, we could do that w/o worrying about...
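A minimal sketch of that idea, assuming a hypothetical wrapper type with an expensive `determinant()` (not la4j's actual classes). Because everything is single-threaded, a plain nullable field works as the cache with no synchronization or `volatile`:

```java
// Memoization sketch: compute once, return the cached value after.
public class CachedMatrix {
    private final double[][] data;
    private Double cachedDeterminant; // null until first computed

    CachedMatrix(double[][] data) {
        this.data = data;
    }

    double determinant() {
        if (cachedDeterminant == null) {
            cachedDeterminant = computeDeterminant(data);
        }
        return cachedDeterminant;
    }

    // Naive 2x2 determinant just so the sketch runs; a real routine
    // would use LU decomposition.
    private static double computeDeterminant(double[][] a) {
        return a[0][0] * a[1][1] - a[0][1] * a[1][0];
    }

    public static void main(String[] args) {
        CachedMatrix m = new CachedMatrix(new double[][] {{1, 2}, {3, 4}});
        System.out.println(m.determinant()); // computed
        System.out.println(m.determinant()); // served from cache
    }
}
```

The one caveat even single-threaded: any mutating operation must reset the cached field to `null`, or stale values leak out.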
Thanks for the ticket @LsKoder! Is `symmetric` a part of the MatrixMarket format spec? Also, maybe you want to work on a fix?
Right. The plan is the following:

- Establish robust functional testing
- Establish performance testing with JMH

Then migrate to double if it's possible.
Does it mean calculation of `e^A` where `A` is a matrix?
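If so, `e^A` is the matrix exponential, `e^A = I + A + A^2/2! + A^3/3! + ...`. A minimal dense truncated-Taylor sketch (fine for small, well-scaled matrices; a production routine would use scaling-and-squaring with Padé approximants):

```java
// Matrix exponential via truncated Taylor series.
public class MatrixExp {
    static double[][] multiply(double[][] a, double[][] b) {
        int n = a.length;
        double[][] c = new double[n][n];
        for (int i = 0; i < n; i++)
            for (int k = 0; k < n; k++)
                for (int j = 0; j < n; j++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    static double[][] exp(double[][] a, int terms) {
        int n = a.length;
        double[][] result = new double[n][n];
        double[][] term = new double[n][n];
        for (int i = 0; i < n; i++) {      // start both at the identity
            result[i][i] = 1.0;
            term[i][i] = 1.0;
        }
        for (int k = 1; k <= terms; k++) {
            term = multiply(term, a);      // term becomes A^k / k!
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++) {
                    term[i][j] /= k;
                    result[i][j] += term[i][j];
                }
        }
        return result;
    }

    public static void main(String[] args) {
        // For diagonal A = diag(1, 2), e^A = diag(e, e^2).
        double[][] e = exp(new double[][] {{1, 0}, {0, 2}}, 30);
        System.out.println(e[0][0]); // ~2.71828
        System.out.println(e[1][1]); // ~7.38906
    }
}
```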