Raimondas Galvelis
Could you share `struct.pdb` and a script to generate `model.pt`, so it is possible to reproduce the issue?
Also, could you add the imports to the script, so it is possible to run it?
I have created the environment:
```
conda env create mmh/openmm-8-beta-linux
conda activate openmm-8-beta-linux
conda install -c conda-forge pytorch_cluster
```
The script works without problem. @JustinAiras try to create a new environment.
@JustinAiras this might be a `conda` issue (https://github.com/openmm/openmm-torch/issues/88#issuecomment-1310477870). Could you try to install with `mamba`?
OpenMM-PLUMED passes the script to the PLUMED parser (https://github.com/openmm/openmm-plumed/blob/5406068b5a36b1208f1340180cfc5b8e4ee7aff9/platforms/reference/src/ReferencePlumedKernels.cpp#L97). Most likely, the FunnelMD or Maze tags aren't recognized because PLUMED was built without these extra modules.
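A minimal sketch (not a tested reproducer; file names, force field, and the PLUMED actions are illustrative) showing that the whole PLUMED input is handed to the parser as a single string:
```python
from openmm.app import PDBFile, ForceField, Simulation
from openmm import LangevinMiddleIntegrator
from openmm.unit import kelvin, picosecond, femtoseconds
from openmmplumed import PlumedForce

pdb = PDBFile('struct.pdb')  # hypothetical structure file
system = ForceField('amber14-all.xml').createSystem(pdb.topology)

# The whole string is passed verbatim to PLUMED's parser when the Context is created.
# An action from a module PLUMED was built without (e.g. the funnel or maze modules)
# is reported as an unrecognized keyword at that point.
plumed_script = """
d: DISTANCE ATOMS=1,2
PRINT ARG=d FILE=COLVAR
"""
system.addForce(PlumedForce(plumed_script))

integrator = LangevinMiddleIntegrator(300*kelvin, 1/picosecond, 2*femtoseconds)
simulation = Simulation(pdb.topology, system, integrator)  # PLUMED parses the script here
```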
It would be possible to implement support for such models:
```python
import torch

class IntegratorModule(torch.nn.Module):
    def forward(self, positions, velocities, forces, arbitrary_number_of_scalars):
        # Do some computation
        return new_positions, new_velocities
```
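For illustration, here is a concrete (hypothetical) instance of such a module, filling in the sketch with a simple explicit-Euler step and two scalar tensors for the mass and time step. The `TorchIntegrator`-style loader mentioned in the comment does not exist yet; the module is just compiled and saved the same way a `TorchForce` model would be:
```python
import torch

class EulerIntegratorModule(torch.nn.Module):
    """Hypothetical example: a single explicit-Euler step as a TorchScript module."""

    def forward(self, positions, velocities, forces, mass, dt):
        # Advance velocities from the forces, then positions from the new velocities.
        new_velocities = velocities + forces / mass * dt
        new_positions = positions + new_velocities * dt
        return new_positions, new_velocities

# Compile and save the module; a hypothetical TorchIntegrator could load the
# resulting file in the same way TorchForce loads `model.pt`.
module = torch.jit.script(EulerIntegratorModule())
module.save('integrator.pt')
```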
Under the hood, PyTorch constructs a computational graph, which represents the operations and associated input-output dependencies. It is probably possible to wrap `openmm::Context` into a `torch::CustomClass` and let PyTorch act...
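As a toy illustration of that graph (not specific to OpenMM, with a made-up energy function): autograd records the operations linking the inputs to the outputs, which is what allows forces to be obtained as the negative gradient of an energy:
```python
import torch

# Autograd records the operations connecting positions to the placeholder energy,
# so the gradient can be propagated back through the recorded graph.
positions = torch.randn(10, 3, requires_grad=True)
energy = (positions ** 2).sum()                      # placeholder energy function
forces = -torch.autograd.grad(energy, positions)[0]  # forces as -dE/dx
print(forces.shape)
```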
@jchodera do you have any examples which go beyond the capabilities of `CustomIntegrator`? How much more flexibility do we want?
No, I don't have any.
@sukritsingh feel free to create a new directory. Something like `tutorials` should be good.