
How to update globals on all nodes?

tigermask1978 opened this issue 5 years ago · 5 comments

I know that dispy can update globals on one node by using the multiprocessing module, but I want to share globals that can be updated by any node in the cluster. Any ideas? Thanks a lot.

tigermask1978 avatar Dec 05 '19 09:12 tigermask1978

I don't quite understand what the question is, so this may not work: you could run another function (e.g., by creating another cluster, as done in the MapReduce example) and have its jobs update the nodes (use submit_node to run the update on each node).

pgiri avatar Dec 10 '19 04:12 pgiri
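For illustration, here is a minimal sketch of the submit_node mechanics pgiri describes; it is an assumption of this writeup, not code from the thread, and the update_counter function and node addresses are hypothetical:

    import dispy

    def update_counter(value):
        # intended to run once per node, rebinding a node-level global
        global COUNTER
        COUNTER = value
        return 0

    if __name__ == '__main__':
        update_cluster = dispy.JobCluster(update_counter)
        nodes = ['192.168.1.10', '192.168.1.11']  # hypothetical node addresses
        jobs = [update_cluster.submit_node(node, 100) for node in nodes]
        for job in jobs:
            job()  # wait until every node has run the update
        update_cluster.close()

Note that each dispy computation generally runs in its own environment on a node, so a global set by one cluster's jobs is not necessarily visible to another cluster's computation; that may be why pgiri hedges that this "may not work".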

@pgiri Thanks for your reply. I have read the MapReduce example in the docs, but I think it may not be what I want. For example, I have a variable (maybe a counter) in the client; could any node read/write it simultaneously (with a lock)?

def compute():
    # How to read and update COUNTER here (maybe need a lock)?
    COUNTER += 1
    return 0

if __name__ == '__main__':
    import dispy
    # Here is a COUNTER I want to share with all nodes.
    COUNTER = 100
    cluster = dispy.JobCluster(compute)
    jobs = []
    for i in range(20):
        job = cluster.submit()
        jobs.append(job)
    for job in jobs:
        job()  # waits for job to finish and returns results
        stdout = job.stdout
        print(stdout)
    cluster.print_status()  # shows which nodes executed how many jobs etc.

tigermask1978 avatar Dec 11 '19 02:12 tigermask1978

See node_shvars.py in examples.

pgiri avatar Dec 15 '19 17:12 pgiri
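For readers without the examples handy, this is roughly the pattern node_shvars.py demonstrates, reconstructed here as a sketch rather than a verbatim copy: a setup function creates a lock-protected multiprocessing.Value on each node, and every job that lands on that node updates it.

    def setup():
        # runs once on each node; globals created here are available
        # to all jobs that run on that node
        import multiprocessing
        global shvar
        shvar = multiprocessing.Value('i', 0)  # lock-protected by default
        return 0

    def cleanup():
        global shvar
        del shvar

    def compute():
        global shvar
        with shvar.get_lock():
            shvar.value += 1
            return shvar.value

    if __name__ == '__main__':
        import dispy
        cluster = dispy.JobCluster(compute, setup=setup, cleanup=cleanup)
        jobs = [cluster.submit() for _ in range(20)]
        for job in jobs:
            print(job())  # counts grow per node, not cluster-wide
        cluster.close()

As tigermask1978 observes below, the shared value lives on each node separately, so this shares state among jobs on one node only.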

@pgiri Thanks for your reply. I think the example in node_shvars.py can only share variables among jobs on the same node (not with jobs on OTHER nodes). So how can variables be shared across all nodes in the cluster? Thanks again.

tigermask1978 avatar Dec 16 '19 11:12 tigermask1978

I am not sure I understand your question, but in case you are asking about replacing in-memory data, see the latest release and the example replace_inmem.py.

pgiri avatar Mar 15 '20 13:03 pgiri
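The thread ends without a cluster-wide answer. One standard-library route, not a dispy feature, is to serve the counter from the client with multiprocessing.managers and have each job connect back to it. The sketch below is under stated assumptions: the Counter class, port 51000, and authkey are inventions here, the nodes must be able to reach the client host, and the default fork start method (Linux) is assumed.

    import threading
    from multiprocessing.managers import BaseManager

    class Counter:
        def __init__(self, value=0):
            self._value = value
            self._lock = threading.Lock()

        def increment(self, amount=1):
            # the lock lives in the server process, so updates
            # arriving from any node are serialized
            with self._lock:
                self._value += amount
                return self._value

    counter = Counter(100)

    def get_counter():
        return counter

    class CounterManager(BaseManager):
        pass

    CounterManager.register('get_counter', callable=get_counter)

    def compute(manager_host):
        # each job, on whatever node it runs, connects back to the
        # client and updates the single shared counter
        from multiprocessing.managers import BaseManager

        class CounterManager(BaseManager):
            pass

        CounterManager.register('get_counter')
        mgr = CounterManager(address=(manager_host, 51000), authkey=b'counter')
        mgr.connect()
        return mgr.get_counter().increment()

    if __name__ == '__main__':
        import socket
        import dispy

        mgr = CounterManager(address=('', 51000), authkey=b'counter')
        mgr.start()  # serve the counter from a child process on the client
        host = socket.gethostname()  # nodes must be able to resolve this
        cluster = dispy.JobCluster(compute)
        jobs = [cluster.submit(host) for _ in range(20)]
        for job in jobs:
            print(job())  # the values 101..120, in completion order
        cluster.close()
        mgr.shutdown()

The trade-off is that every increment is a network round trip to the client, which is fine for an occasional counter but would bottleneck jobs that update shared state in a tight loop.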