Giridhar Pemmasani
In that case, you can try a couple of approaches. First, dispynode.py from 4.6.8 should work with the rest of dispy 4.6.9 (to isolate/confirm that the issue is in dispynode and not the...
Can you also describe an outline of your program? Are you using a 'setup' function to load modules? Where are you executing the block you mentioned above: ``` for m in to_delete:...
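For reference, here is a minimal sketch of dispy's setup/cleanup pattern (the numpy import and the compute function are illustrative, not from this thread): 'setup' runs once on each node and can load modules into the node's global scope for jobs to use, and 'cleanup' undoes it when the computation is closed.

```python
import dispy

def setup():
    # runs once on each node; make numpy available to all jobs there
    global numpy
    import numpy
    return 0          # 0 tells dispynode that setup succeeded

def cleanup():
    # runs when the computation is closed; remove what setup added
    global numpy
    del numpy

def compute(n):
    return numpy.sqrt(n)   # uses the module loaded by 'setup'

if __name__ == '__main__':
    cluster = dispy.JobCluster(compute, setup=setup, cleanup=cleanup)
    job = cluster.submit(16)
    print(job())           # waits for the job and prints its result
    cluster.close()
```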
It is puzzling that those 4 lines affect your program, as you don't use 'setup' and 'cleanup' (which are the reason for those 4 lines). Can you clarify that this...
While dispynode should work with your implementation, I have a few suggestions: - A cluster's attributes with a leading underscore are not meant for users; they are implementation details and can change....
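To illustrate, a small sketch that sticks to the documented public API (submit, job(), print_status, close) instead of underscore-prefixed attributes; the compute function is just a placeholder:

```python
import dispy

def compute(n):
    return n * n    # placeholder computation

if __name__ == '__main__':
    cluster = dispy.JobCluster(compute)
    jobs = []
    for i in range(4):
        job = cluster.submit(i)
        job.id = i               # 'id' is documented as user-settable
        jobs.append(job)
    for job in jobs:
        print(job.id, job())     # job() waits and returns the result
    cluster.print_status()       # documented way to inspect the cluster
    cluster.close()
```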
I am guessing that your computation's jobs (core_job) depend on modules loaded in the setup_env cluster. This would have worked prior to 4.6.9 because modules were not cleaned up after a computation is...
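If that is the case, one workaround (a sketch, assuming core_job only needs standard modules) is to import what the job needs inside the job itself, so it does not depend on state left behind by another computation:

```python
def core_job(n):
    # import inside the job so it works even after setup_env's
    # modules have been cleaned up on the node
    import math
    return math.factorial(n)
```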
Well, when you create a cluster for 'setup_env' and bind it to 'cluster' again and again, and later use the same variable for creating the 'core_job' cluster, behind the scenes the 'cluster' is being closed...
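A sketch of what I mean (setup_env and core_job here are placeholders): keep each cluster in its own variable, so rebinding one name does not close the cluster the other computation still needs.

```python
import dispy

def setup_env(path):
    return path      # placeholder for the environment-setup computation

def core_job(n):
    return n * n     # placeholder for the main computation

if __name__ == '__main__':
    # one variable per cluster; reusing a single 'cluster' variable
    # closes the previous cluster when the name is rebound
    setup_cluster = dispy.JobCluster(setup_env)
    core_cluster = dispy.JobCluster(core_job)

    setup_cluster.submit('/tmp/env')()   # submit and wait
    print(core_cluster.submit(7)())

    setup_cluster.close()
    core_cluster.close()
```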
Can you update whether this problem is resolved, or if you are still having issues?
Can you update if this is still an issue, or close it if solved?
The data is sent from the client to the nodes over the network, so the data has to be serialized (not because of multiprocessing). As mentioned in the documentation, if objects can't be serialized...
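A quick way to check on the client side (illustrative; dispy's serialization is pickle-compatible, so if pickle fails here, sending the object will fail too):

```python
import pickle
import threading

def is_serializable(obj):
    try:
        pickle.dumps(obj)
        return True
    except Exception:
        return False

print(is_serializable({'a': 1}))          # True: plain data pickles fine
print(is_serializable(threading.Lock()))  # False: locks can't be pickled
```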
If you are sending an object with the `submit` method of a cluster, then that object must be serializable. Python's pickle can serialize data in most cases, but if objects have attributes...
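One common fix (a sketch with a made-up class; the file handle stands in for any unserializable attribute) is to implement `__getstate__`/`__setstate__` so pickle skips the problematic attribute and it is re-created after unpickling:

```python
import pickle

class Payload:
    def __init__(self, path):
        self.path = path
        self.fh = open(path)       # file handles can't be pickled

    def __getstate__(self):
        state = self.__dict__.copy()
        del state['fh']            # drop the unpicklable attribute
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        self.fh = open(self.path)  # re-create it after unpickling

# round-trips without error even though 'fh' is not picklable
obj = pickle.loads(pickle.dumps(Payload(__file__)))
```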