Carlo Lucibello
Replacement for #19. Many [outstanding issues](https://github.com/JuliaSparse/SuiteSparseGraphBLAS.jl/issues?q=is%3Aissue+author%3ACarloLucibello) in SuiteSparseGraphBLAS still prevent finalizing this PR.
Replacement for #136, built on top of CUDA.jl#master. Besides the need to wait for a new CUDA.jl release, the following issues still stand in the way of proper integration...
This package is incompatible with Flux 0.14 and NNlib 0.9. Can the compatibility be updated?
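For reference, a minimal sketch of how the compat bounds could be widened from the package's own environment, assuming Julia ≥ 1.8's `Pkg.compat`; the version bounds shown are illustrative and the real update still requires checking the affected code against the Flux 0.14 / NNlib 0.9 API changes.

```julia
using Pkg

# Run from the package's environment; bounds are illustrative only.
Pkg.compat("Flux", "0.13, 0.14")
Pkg.compat("NNlib", "0.8, 0.9")
```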
Rebuilding the cache is more painful than it needs to be, especially for new contributors; see e.g. https://github.com/CarloLucibello/GraphNeuralNetworks.jl/pull/253. First of all, you have to know that you have to rebuild the cache, then you...
Reductions on `SparseMatrixCSC` return dense arrays. Should the same happen for `GBMatrix`? The current behavior is

```julia
julia> using SparseArrays, SuiteSparseGraphBLAS

# dense matrices from sparse arrays' reductions for base types...
```
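For context, a minimal demonstration of the SparseArrays side of the question (the `GBMatrix` behavior is exactly what the issue asks about, so it is not shown here):

```julia
using SparseArrays

x = sprand(10, 10, 0.5)

# Dimension-wise reductions on a SparseMatrixCSC materialize a dense result:
sum(x, dims=2)   # 10×1 Matrix{Float64}, not a sparse matrix
sum(x, dims=1)   # 1×10 Matrix{Float64}
```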
The types here could hook into SparseArrays' `show` implementation for nicer printing.

```julia
julia> using SparseArrays, SuiteSparseGraphBLAS

julia> x = sprand(10, 10, 0.5)
10×10 SparseMatrixCSC{Float64, Int64} with 39 stored entries:
...
```
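One possible shape of the hookup, as a hedged sketch: delegate `text/plain` display to a `SparseMatrixCSC` conversion. The helper name below is made up for illustration, and a GraphBLAS type would want a cheaper, package-specific conversion than the generic `sparse(A)` used here.

```julia
using SparseArrays

# Sketch: reuse SparseArrays' compact display by converting just for printing.
function show_as_sparse(io::IO, A::AbstractMatrix)
    show(io, MIME"text/plain"(), sparse(A))
end

# Usage with a plain dense matrix, to show the target output format:
show_as_sparse(stdout, rand(10, 10) .* (rand(10, 10) .> 0.5))
```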
Can we implement something like `reinterpret` here?
https://graphblas.juliasparse.org/dev/utilities/
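For reference, this is what `reinterpret` provides on Base arrays; whether an analogous zero-copy view fits GraphBLAS-backed storage is the open question here.

```julia
# Base.reinterpret gives a zero-copy view of the same buffer
# under a different element type of matching size.
x = rand(UInt64, 4)
y = reinterpret(Float64, x)
y isa AbstractVector{Float64}   # true; mutating y also mutates x
```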
It would be useful to display the fill value when printing a `GBMatrix`.

```julia
julia> x = GBMatrix([1,2], [2, 3], [1,2], fill=17)
2x3 GraphBLAS int64_t matrix, bitmap by row
2...
```
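A hedged sketch of what the requested printing could look like, assuming an accessor for the fill value is available (`getfill` below is an assumed name, not necessarily the package's API):

```julia
using SuiteSparseGraphBLAS

# Sketch only: append the fill value to the summary line when printing.
# `getfill(A)` is assumed to return the matrix's fill value.
function summary_with_fill(io::IO, A::GBMatrix)
    print(io, join(size(A), "x"), " GBMatrix{", eltype(A), "}, fill value: ", getfill(A))
end
```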
https://graphblas.juliasparse.org/dev/arrays/#Construction still shows the now-removed constructor taking `nrows` and `ncols` keywords.

```julia
julia> GBMatrix([1,2],[2,3],[1,1], nrows=3, ncols=3)
ERROR: MethodError: no method matching GBMatrix(::Vector{Int64}, ::Vector{Int64}, ::Vector{Int64}; nrows=3, ncols=3)
Closest candidates...
```