Core dump with continuous covariates
Good morning, I'm trying to test differential expression according to a model with three continuous independent variables (here A, B, and C). Since I'm interested in all coefficients, I'm calling it like this:
test = de.test.wald(
    data=adata,
    formula_loc="~ 1 + A + B + C",
    factor_loc_totest=["A", "B", "C"],
    as_numeric=["A", "B", "C"]
)
but I get a segmentation fault (plus a runtime warning about a logarithm):
/home/dcittaro/miniconda3/envs/graph_tool/lib/python3.7/site-packages/dask/core.py:121: RuntimeWarning: invalid value encountered in log
return func(*(_execute_task(a, cache) for a in args))
training location model: True
training scale model: True
Segmentation fault (core dumped)
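For reference, a quick way to check the inputs for values that would make the log invalid (NaN covariates or negative expression entries) could look like this; this is just a rough sketch of mine, assuming adata is the AnnData object above:

import numpy as np

# Check the continuous covariates for NaNs, which would break the design matrix.
for cov in ["A", "B", "C"]:
    vals = adata.obs[cov].to_numpy(dtype=float)
    print(cov, "NaNs:", np.isnan(vals).sum())

# Check for negative expression values; .min() works for dense and scipy sparse X.
print("min expression value:", adata.X.min())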
Since I had some issues installing diffxpy in the past, I thought it was a system error, so I tried a simpler model on cell groups (a t-test, here):
simple_test = de.test.t_test(
    data=adata,
    grouping="leiden"
)
and it works. I also tried a model in which only a single coefficient is used:
test = de.test.wald(
    data=adata,
    formula_loc="~ 1 + C",
    factor_loc_totest="C",
    as_numeric=["C"]
)
and it fails. Any hint would be appreciated.
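To help rule out a pure resource effect, one thing worth trying is rerunning the failing call on a small random subset of cells; a sketch, reusing the same de.test.wald arguments as above:

import numpy as np

# Subsample cells: if the crash is memory-related, the small run should succeed.
rng = np.random.default_rng(0)
idx = rng.choice(adata.n_obs, size=min(1000, adata.n_obs), replace=False)
adata_small = adata[idx].copy()

test_small = de.test.wald(
    data=adata_small,
    formula_loc="~ 1 + C",
    factor_loc_totest="C",
    as_numeric=["C"]
)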
de.__version__
'v0.7.4+16.g3689ea8'
I should add that it crashes on old hardware (CentOS 7); it seems to work on my laptop (latest OS X), although I cannot complete the calculations there because it consumes all available resources.
Thanks for the note! I think this is likely because dask or numpy try to use more swap than they are allowed to on that Linux distribution; macOS is very lenient with such memory spikes. I am working on solving this and will update once it's done!
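In the meantime, a possible stop-gap (untested, and assuming the crash really is a memory spike and that diffxpy's dask arrays pick up the default scheduler) is to cap dask's memory explicitly before running the test:

# Hypothetical workaround: run dask through a single local worker with a hard
# memory limit, so spikes are managed by the worker's memory manager
# (spilling to disk or killing the worker) instead of crashing the process.
from dask.distributed import Client

client = Client(
    n_workers=1,
    threads_per_worker=1,
    memory_limit="8GB",  # adjust to the machine's available RAM
)

Alternatively, dask.config.set(scheduler="single-threaded") reduces parallel memory pressure, at the cost of speed.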