Johannes Blaschke

Results 113 comments of Johannes Blaschke

Another question @andreasnoack -- why do you use the deprecated elemental repo instead of https://github.com/LLNL/Elemental? The LLNL version comes with CUDA support.

@andreasnoack a user might want a specific version or to make their own changes. This seems more maintainable: unless the `libEl` build instructions change -- if I understand this correctly --...

Hi @dhiepler -- I also saw your trouble ticket at NERSC; let me respond here so that the community can weigh in. Without having tried it myself (I am currently...

I have more information thanks also to @dhiepler; here is an expanded test program:

```julia
using Distributed
using Elemental
using DistributedArrays
using LinearAlgebra
@everywhere using Random
@everywhere Random.seed!(123)
A =
```
...
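The excerpt cuts off at the array construction. As a point of reference, a complete program in this shape might look like the sketch below -- the `drandn` call and the `svdvals` dispatch to Elemental are my assumptions about where the excerpt was heading, not the original code:

```julia
# Hedged sketch, NOT the original test program. Assumes Elemental.jl's
# DistributedArrays integration, where linear-algebra calls on a DArray
# are routed to Elemental's distributed backend.
using Distributed
using Elemental
using DistributedArrays
using LinearAlgebra
@everywhere using Random
@everywhere Random.seed!(123)

A = drandn(50, 50)      # distributed 50x50 Gaussian random matrix (assumed)
s = svdvals(A)          # singular values via Elemental (assumed dispatch)
println(first(s))       # print the largest singular value
```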

Thanks @andreasnoack -- I just verified that this also works on Cori. This leaves an open question, given what @dhiepler is trying to do with the original code: how to...

Thanks @vchuravy for looking into this also. Unfortunately it doesn't work on Cori. Is the `MPIClusterManagers` part necessary? (I find that it seems to hang/time out on Cori -- maybe...

Oh Feck! So now I have to fix `MPIClusterManagers`. @vchuravy, should I open an issue on https://github.com/JuliaParallel/MPIClusterManagers.jl?

@vchuravy Based on our discussion in JuliaParallel/MPIClusterManagers.jl#26, your solution was to run the `addprocs` version _without_ `srun`. That won't work here, though, because without `srun`, running:

```julia
using Elemental
```
...
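For context, the `addprocs` version referred to above follows the `MPIManager` pattern from the MPIClusterManagers.jl README; the worker count and the body of the `@mpi_do` block here are illustrative:

```julia
using MPIClusterManagers, Distributed

# Launch 4 MPI workers via mpiexec and connect them as Julia workers.
# This is the mode that works without srun, but (as noted above) it
# breaks `using Elemental` on Cori.
manager = MPIManager(np=4)
addprocs(manager)

# Run an MPI-aware block on every worker.
@mpi_do manager begin
    using MPI
    comm = MPI.COMM_WORLD
    println("rank $(MPI.Comm_rank(comm)) of $(MPI.Comm_size(comm))")
end
```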

And if I avoid `addprocs` and run the following within `srun`, I get:

```julia
using MPIClusterManagers, Distributed
manager = MPIClusterManagers.start_main_loop(MPI_TRANSPORT_ALL)
using Elemental
using LinearAlgebra
using DistributedArrays
A = drandn(50,50)
```
...
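For completeness, a session started with `MPI_TRANSPORT_ALL` is normally shut down with `MPIClusterManagers.stop_main_loop`. A sketch of the full pattern follows; the final `svdvals` line is my guess at the intended computation, not the original code:

```julia
# Launch with something like: srun -n 4 julia script.jl
using MPIClusterManagers, Distributed

# Use MPI as the transport for all Distributed traffic: rank 0 becomes
# the master process and the remaining ranks become workers.
manager = MPIClusterManagers.start_main_loop(MPI_TRANSPORT_ALL)

using Elemental
using LinearAlgebra
using DistributedArrays

A = drandn(50, 50)
println(svdvals(A)[1])   # assumed continuation: distributed SVD via Elemental

# Tear down the workers and finalize MPI.
MPIClusterManagers.stop_main_loop(manager)
```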