
AMICA on distributed systems crashes with a large number of models and high-density decompositions

Open vlawhern opened this issue 1 year ago • 19 comments

I'm trying to run AMICA across a large number of nodes in a distributed fashion, and I'm finding that for certain combinations of (1) num_models, (2) the number of compute nodes, and (3) the number of channels of the EEG data, AMICA crashes. I suspect that with a very large number of compute nodes there isn't enough data per node to support high-density decompositions.

I was wondering if there were general rules about when AMICA would work in a distributed manner for say

T = length of the data
N = number of compute nodes
M = number of models
C = number of channels

Since each node receives roughly T/N samples of data, the question comes down to some relationship between T/N, M, and C. It would be helpful to have guidance on what combinations of T, N, M, and C are expected to work for distributed AMICA (a rough sanity check of the kind I have in mind is sketched below).
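For illustration, here is a minimal sketch of the sort of rule of thumb I'm imagining. It assumes each of the M models fits roughly a C×C unmixing matrix (so on the order of M·C² parameters) and that each node should see at least some multiple k of that many samples; both the parameter count and the threshold k are assumptions on my part, not anything from the AMICA docs.

```python
# Hypothetical sanity check for distributed AMICA data sufficiency.
# Assumption: each of the M models estimates ~C^2 parameters (a CxC
# unmixing matrix), and each node should see at least k samples per
# parameter. The threshold k=10 is a guess, not from the AMICA docs.

def samples_per_node(T: int, N: int) -> int:
    """Approximate number of samples each compute node receives."""
    return T // N

def enough_data(T: int, N: int, M: int, C: int, k: int = 10) -> bool:
    """Heuristic: require T/N >= k * M * C^2 samples on each node."""
    return samples_per_node(T, N) >= k * M * C * C

# Example: 1 hour of 256-channel EEG at 500 Hz, 4 models, 64 nodes.
T = 3600 * 500
print(enough_data(T, N=64, M=4, C=256))  # False -> likely too little data per node
```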

vlawhern · May 04 '23 15:05