Guido Petretto
Thanks @rkingsbury for updating the MemoryStore. The limited number of features supported by mongomock was indeed an issue. Concerning the test that fails, I made a few tests and figured...
> Thanks for investigating! So if I understand correctly, the _original_ problem that prompted #791 is resolved by this PR. Indeed I think that this would have avoided the need...
Thanks @JaGeo for opening this. I am linking the jobflow-remote issue, as there is a more detailed description of the problems encountered: https://github.com/Matgenix/jobflow-remote/issues/79. And I believe this is also linked...
Good point. Then to me this is an even more compelling reason to focus on a solution that would cover a larger number of cases. If you have...
I think that one of the main issues, even with the solution that I have implemented, is that it requires reading the full output and creating a trajectory object...
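As a rough illustration of the memory concern above (a sketch with hypothetical names such as `parse_steps`; this is not jobflow or pymatgen API): streaming parsed steps through a generator keeps only one step in memory at a time, whereas materializing a full trajectory-like list holds every step at once.

```python
from typing import Iterator

def parse_steps(n_steps: int) -> Iterator[dict]:
    """Hypothetical parser: yield one ionic step at a time instead of a full list."""
    for i in range(n_steps):
        yield {"step": i, "energy": -1.0 * i}

def last_energy_streaming(n_steps: int) -> float:
    """Scan steps lazily; only one step is held in memory at any moment."""
    energy = 0.0
    for step in parse_steps(n_steps):
        energy = step["energy"]
    return energy

def last_energy_full(n_steps: int) -> float:
    """Materialize the whole 'trajectory' first -- the costly pattern discussed above."""
    trajectory = list(parse_steps(n_steps))  # every step resident in memory
    return trajectory[-1]["energy"]
```

Both return the same result; the difference is only in peak memory, which is the point being raised in the comment.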
Just to clarify: while I think it would be beneficial for many reasons, I understand that switching to parsing the HDF5 file for all the outputs would...
Thanks @utf for bringing up the topic again. After discussing with @davidwaroquiers, here is what we came up with. Given the typical use cases for jobflow's workflows, I would mainly...
Hi @shyuep, thanks for looking into this. Indeed the first point is one to be careful about. My idea was to clear the cached_data dictionary in the `_do_check` method, after...
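A minimal sketch of the cache-clearing idea mentioned above (the class and method bodies here are illustrative only; the real `_do_check` and `cached_data` live in the actual codebase and are not reproduced here):

```python
class CachedChecker:
    """Illustrative: parsed data is kept in cached_data and cleared after the check."""

    def __init__(self) -> None:
        self.cached_data: dict = {}

    def load(self, key: str, value) -> None:
        """Stand-in for whatever populates the cache before the check runs."""
        self.cached_data[key] = value

    def _do_check(self) -> bool:
        # Use the cached data to perform the check...
        ok = bool(self.cached_data)
        # ...then clear it so it does not linger in memory afterwards,
        # which is the concern raised in the first point.
        self.cached_data.clear()
        return ok
```

The design choice being sketched is simply that the cache's lifetime ends with the check itself, rather than with the object holding it.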
If the option to use the files as input for the workflows is needed, I believe it would be better to add it directly here. Otherwise the risk is that...
I like @utf's proposals. An additional downside of the second one is that `file` would not be a good argument for a Job function anymore. Overall I would say the...