Deeplexicon run error
Hello again!
I ran DeePlexiCon, but an error occurred.
How can I solve this?
This looks related to the numpy version. Double-check that it's at the correct version (not 2.0).
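A quick sanity check, as a minimal sketch, is to print the version that the Python interpreter running DeePlexiCon actually imports:

```python
# Print the numpy version this environment will actually import; as noted
# above, it should not be a 2.x release for DeePlexiCon.
import numpy

print("numpy:", numpy.__version__)
if int(numpy.__version__.split(".")[0]) >= 2:
    print("numpy 2.x detected: downgrade to a 1.x release")
```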
@Psy-Fer Thanks for the advice!
I encountered another error: a core dump. It may be an issue with my operating environment; I ran DeePlexiCon on CPU.
Does it work well on CPU, or does it only work with a GPU?
It was designed to use a GPU. It can use a CPU, but it will be very slow.
It is also very sensitive to package and CUDA/driver versions, as this all pre-dates TensorFlow 2.0.
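As a rough sketch (assuming the pre-2.0 TensorFlow API that DeePlexiCon targets), the build and GPU visibility can be checked like this:

```python
# Report the TensorFlow version and any GPUs visible to it. If no GPU is
# listed, DeePlexiCon will fall back to (much slower) CPU execution.
import tensorflow as tf
from tensorflow.python.client import device_lib

print("tensorflow:", tf.__version__)
gpus = [d.name for d in device_lib.list_local_devices() if d.device_type == "GPU"]
print("visible GPUs:", gpus if gpus else "none (CPU only)")
```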
I would highly recommend using these docker containers
https://github.com/Psy-Fer/deeplexicon/tree/master/dockerfiles
James
@Psy-Fer
I cannot use Docker because I don't have the necessary permissions on the server, so I tried again, and another error occurred.
How can I resolve this problem? I've attached the list of installed tools.
Ahh, this seems to be related to h5py: https://github.com/Psy-Fer/deeplexicon/issues/13#issuecomment-876567110
try using h5py==2.10
james
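To confirm the downgrade actually takes effect in the environment that runs DeePlexiCon, a minimal check is:

```python
# Print the h5py version (should be 2.10.x after the downgrade) and the
# version of the underlying HDF5 library it was built against.
import h5py

print("h5py:", h5py.version.version)
print("HDF5:", h5py.version.hdf5_version)
```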
@Psy-Fer
Thank you, James. I downgraded h5py and re-ran, but there is still an error, shown below.
When I ran the job, I saw an error message saying that ont-fast5-api==4.1.3 cannot work with h5py==2.10, so the installation was cancelled.
So I re-installed ont-fast5-api, and h5py was automatically upgraded, as shown below.
Is this related to my problem? It seems like DeePlexiCon cannot read my raw data...
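As a rough first check, independent of DeePlexiCon, the fast5 file can be opened directly with h5py to see whether its reads are listed at all (the file path below is only a placeholder):

```python
# Minimal check that a multi-read fast5 file opens and its read groups are
# listed; listing groups does not require any compression plugin.
import h5py

path = "example_reads.fast5"  # placeholder: one of your multi-read fast5 files
with h5py.File(path, "r") as f:
    read_groups = list(f.keys())
    print(len(read_groups), "read groups found, e.g.", read_groups[:3])
```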
Deeplexicon was written in 2018/19, and a lot has changed since then in terms of file schemas, libraries, and data. Could it be related to VBZ compression of the fast5 files?
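One way to check this (a minimal sketch, assuming the multi-read fast5 layout with a Raw/Signal dataset per read; VBZ registers as HDF5 filter id 32020, and the path is a placeholder):

```python
# List the HDF5 filters applied to the raw signal of the first read;
# filter id 32020 is the registered id for VBZ compression.
import h5py

VBZ_FILTER_ID = 32020
path = "example_reads.fast5"  # placeholder path

with h5py.File(path, "r") as f:
    read = list(f.keys())[0]
    dset = f[read]["Raw/Signal"]  # multi-read fast5 layout assumed
    plist = dset.id.get_create_plist()
    filter_ids = [plist.get_filter(i)[0] for i in range(plist.get_nfilters())]
    print(read, "filters:", filter_ids,
          "-> VBZ compressed" if VBZ_FILTER_ID in filter_ids else "-> not VBZ")
```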
@enovoa has a working tool that she is happy to share with researchers (tool in review atm) if deeplexicon can't work on your data. See other issues for more details about the current limited support for deeplexicon.
Regards, James
Hi @Seongmin-Jang-1165,
In addition to what @Psy-Fer mentions, I'd just like to add that DeePlexiCon (as well as EpiNano, for which you also opened a GitHub issue recently) is available within MasterOfPores, a Nextflow workflow that will precisely overcome all installation/dependency issues. I'd suggest you try it.
Also, I'd suggest giving the dockerfile a go as @Psy-Fer suggests, but running it via Singularity, which is typically the solution used by many institutes with computing clusters. You might want to try running some other dockerfile as a "test" to check whether your issue is related to running dockerfiles in general, or to the deeplexicon dockerfile specifically.
Best, Eva
@Psy-Fer @enovoa Thanks for the kind replies; I will take a look at MasterOfPores.
I already e-mailed @enovoa about the new demultiplexing tool for direct RNA-seq data that is in development, and got a friendly response. Thank you so much, @enovoa!
But my professor doesn't seem too thrilled about meeting all the requirements to use the tool, so I'm trying out other methods instead.
I thought the error was about VBZ compression, so I tried to pre-process my data as follows: POD5 (raw) --> convert to fast5 --> split into single-read files and merge back into a single multi-read fast5 (multi_to_single_fast5, single_to_multi_fast5) --> repack the data with VBZ compression (h5repack).
So I ended up with one fast5 file and ran DeePlexiCon.
But, as always, an error occurred, like the one below:
How can I solve this?
I've read several issue threads, and I think I have to decompress the VBZ compression and use the VBZ plugin...
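For reference, a minimal sketch of the plugin route: point HDF5 at the directory containing the VBZ plugin via HDF5_PLUGIN_PATH before importing h5py, then try decoding a few raw samples (the plugin directory and file path are placeholders):

```python
# Rough check that the VBZ HDF5 plugin can be loaded. HDF5 searches the
# directory named in HDF5_PLUGIN_PATH for dynamically loaded filters, so
# set it before h5py (and the HDF5 library) is imported.
import os
os.environ.setdefault("HDF5_PLUGIN_PATH", "/path/to/ont-vbz-hdf-plugin/lib")  # placeholder

import h5py

path = "example_reads.fast5"  # placeholder path
with h5py.File(path, "r") as f:
    read = list(f.keys())[0]
    try:
        samples = f[read]["Raw/Signal"][:10]  # this read fails if the filter is unavailable
        print("VBZ signal decoded OK, first samples:", samples)
    except OSError as err:
        print("could not decode raw signal (VBZ plugin probably not found):", err)
```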