signac-flow
Workflow management for signac-managed data spaces.
This issue is part of glotzerlab/signac#527 but requires a more focused thread. signac-flow currently stores a number of pieces of internal data, such as bundles, submission status, and configuration information...
@bdice @mikemhenry @csadorf following up on #498, I think we should consider removing scheduler-based environment detection entirely. @bdice correct me if I've misunderstood the problem, but #498 came up because...
### Description
When a job operation is run with `@with_job` and `@cmd` in an environment that uses `jsrun` to run jobs on the compute node (i.e., Summit), the job will...
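For context, a minimal sketch of the affected pattern, assuming the pre-1.0 decorator-style API (`flow.cmd`, `flow.with_job`); the executable name is a placeholder:

```python
import flow
from flow import FlowProject


class Project(FlowProject):
    pass


@Project.operation
@flow.with_job
@flow.cmd
def run_simulation(job):
    # @cmd means the returned string is executed as a shell command;
    # @with_job runs it from inside the job's workspace directory.
    # On Summit the environment template wraps such commands with jsrun.
    return "simulate.x --input config.json"  # placeholder executable


if __name__ == "__main__":
    Project().main()
```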
### Feature description
The `next` branch has many ambiguous variable names that are easy to confuse. A specific example would be the use of jobs vs. aggregate...
When requesting more GPUs than are available on a single node during submission, signac-flow needs to consider how the CPU tasks are distributed. It doesn't make sense to allocate all...
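For concreteness, a hedged sketch using signac-flow's `directives` decorator; the node geometry (6 GPUs per node) and the operation name are assumptions, not part of the report:

```python
from flow import FlowProject, directives


class Project(FlowProject):
    pass


# Requesting 12 GPUs on a machine with 6 GPUs per node forces a two-node
# allocation; the 12 CPU tasks should then be split 6 per node rather than
# all being packed onto the first node.
@Project.operation
@directives(ngpu=12, np=12)
def train_model(job):
    pass


if __name__ == "__main__":
    Project().main()
```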
### Feature description
I have a somewhat large signac project (~500 jobs) with some complicated conditions. I've profiled and cleaned up the conditions, but I was still finding the status...
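One possible mitigation, sketched as an illustration rather than signac-flow API: memoize an expensive condition so repeated status checks do not recompute it. The `functools.lru_cache` keyed on `job.ws`, the flag file, and the operation name are all assumptions:

```python
import functools
import os

from flow import FlowProject


class Project(FlowProject):
    pass


@functools.lru_cache(maxsize=None)
def results_converged(workspace):
    # Placeholder for a slow check, e.g. parsing a large output file.
    return os.path.exists(os.path.join(workspace, "converged.flag"))


@Project.operation
@Project.pre(lambda job: results_converged(job.ws))
def analyze(job):
    pass


if __name__ == "__main__":
    Project().main()
```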
### Feature description
Right now submission on a specific environment uses a template filter to pull the account from the environment's namespace in the config. This behavior is undocumented, but...
By default, we perform thread-parallel execution of status fetching operations, but it's not clear that this mode is always desirable relative to serial execution since it only accelerates highly I/O...
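The trade-off in miniature (illustrative only, not signac-flow internals): a thread pool only pays off when each per-job check is dominated by I/O, while CPU-bound checks gain little under the GIL. The file name and worker count below are arbitrary:

```python
import os
from concurrent.futures import ThreadPoolExecutor


def check_job(job_dir):
    # I/O-bound placeholder: test whether an output file exists.
    return os.path.exists(os.path.join(job_dir, "results.json"))


def fetch_serial(job_dirs):
    return [check_job(d) for d in job_dirs]


def fetch_threaded(job_dirs, max_workers=8):
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(check_job, job_dirs))
```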
### Description
Submission of bundles on stampede2 produces an incorrect submission script. This might be a template issue.
### To reproduce
`python3 project.py submit -n 3 -b 3 -w 0.2 --parallel...`
### Feature description
tl;dr: We need a way to control parallelism within and between groups. Parallel operation within a group would be "intra-group" and parallel operation between groups would be...
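For reference, a sketch of how groups are declared today with `FlowProject.make_group`; the group and operation names are placeholders. The proposed distinction would govern whether `equilibrate` and `sample` run concurrently within one submission (intra-group) versus whether separate groups run concurrently (inter-group):

```python
from flow import FlowProject


class Project(FlowProject):
    pass


simulation = Project.make_group(name="simulation")


@simulation
@Project.operation
def equilibrate(job):
    pass


@simulation
@Project.operation
def sample(job):
    pass


if __name__ == "__main__":
    Project().main()
```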