Alexios Chatzigoulas

Results: 17 comments of Alexios Chatzigoulas

Thank you @guillaumeeb for your answer. Yes, I wait until the nodes are active. The stderr and stdout files are empty. I work in a miniconda environment with Python version...

Thank you for your answer and the readability corrections @lesteve. I'll keep that in mind for the future. I have already specified the interface to be ib0. If...

The work-around and the alternative way both worked, and the IP is now the Infiniband one. I also found out that I was using a wrong shebang and I fixed that...

Yeap, still empty. I tried the manual way:

```
qsub -b y ~/miniconda3/envs/alekos/bin/dask-scheduler --scheduler-file /users/pr008/user/scheduler.json --interface ib0
qsub -b y ~/miniconda3/envs/alekos/bin/dask-worker --scheduler-file /users/pr008/user/scheduler.json
squeue -u user
```

```
805813 compute...
```

I just saw that it does not create the `scheduler.json` file.
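For context, the `--scheduler-file` mechanism is simple: the scheduler writes a small JSON file containing its contact address, and workers and clients read that file to find it. A minimal sketch of that round trip using only the standard library (the `tcp://10.0.0.1:8786` address is a hypothetical Infiniband IP, not one from this thread):

```python
import json
import os
import tempfile

# The scheduler writes a JSON file containing (at least) its address;
# dask-worker --scheduler-file and Client(scheduler_file=...) read it back.
scheduler_info = {"address": "tcp://10.0.0.1:8786"}  # hypothetical ib0 address

path = os.path.join(tempfile.mkdtemp(), "scheduler.json")
with open(path, "w") as f:
    json.dump(scheduler_info, f)

# The worker/client side read. If this file never appears, the scheduler
# process died before it could write it, so its stderr log is the place to look.
with open(path) as f:
    print(json.load(f)["address"])  # → tcp://10.0.0.1:8786
```

So an absent `scheduler.json` points at the scheduler process itself failing to start, rather than at a worker or client problem.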

Yes, these logs are again empty. I tried both `qsub` and `sbatch`, and both of them start the scheduler and the worker, but there is still no connection with the client. Also,...

The webpage is http://doc.aris.grnet.gr/. Yes, I mean the login node, and indeed `ib0` is an interface on the login node. The sysadmin told me that although he doesn't know...

I managed to run a job with a simple python script with correct logs. Then a job with the `dask-scheduler`:

```
#!/bin/bash -l
#SBATCH --job-name=pyth_script  # Job name
#SBATCH --output=jobname.%j.out...
```
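For reference, a complete batch script along those lines might look like the following. This is a minimal sketch, not the script from this thread: the job name, output filenames, time limit, and conda environment path are assumptions that must be adapted to the cluster (only the scheduler-file path and `--interface ib0` appear earlier in the discussion):

```shell
#!/bin/bash -l
#SBATCH --job-name=dask-scheduler        # Job name (hypothetical)
#SBATCH --output=dask-scheduler.%j.out   # Stdout log
#SBATCH --error=dask-scheduler.%j.err    # Stderr log
#SBATCH --ntasks=1
#SBATCH --time=01:00:00                  # Hypothetical time limit

# Hypothetical conda env path; adjust to your installation.
~/miniconda3/envs/alekos/bin/dask-scheduler \
    --scheduler-file /users/pr008/user/scheduler.json \
    --interface ib0
```

Keeping stdout and stderr in separate files makes it easier to see why the scheduler exits before writing `scheduler.json`.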

@lesteve thank you very much for your help so far. Indeed, my conda environment's Python version was 3.6.9 and I changed it to 3.6.5 to be the same as the...

Dear @lesteve, happy new year! That solved the issue. I can now manually connect the workers with the scheduler. Thank you very much for your help and cooperation. I guess...