Michelle Richer
OK, can we have a meeting?
@FarnazH @PaulWAyers I think I was able to make some progress with this. I changed line 60 of `gbasis/evals/density.py` and got this:
```
112    993.4 MiB    0.0 MiB    1    if...
```
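For anyone who wants to reproduce that kind of line-by-line report, a minimal sketch with `memory_profiler` is below. The wrapper function and the way `basis`, `one_dm`, and `points` are built are placeholders rather than the actual script I used, and I'm assuming gbasis's `evaluate_density(one_density_matrix, basis, points)` call.
```python3
# sketch: line-by-line memory report for the density evaluation
# (assumes `basis`, `one_dm`, and `points` are constructed elsewhere)
from memory_profiler import profile

from gbasis.evals.density import evaluate_density


@profile  # prints Line #, Mem usage, Increment, Occurrences, Line Contents
def profiled_density(one_dm, basis, points):
    return evaluate_density(one_dm, basis, points)
```
Calling `profiled_density(...)` prints a table like the excerpt above to stdout.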
@leila-pujal I believe it's ok. `einsum` will try to avoid intermediates where possible.
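As a small standalone illustration of that point (plain NumPy, not gbasis code): with `optimize=True`, `np.einsum` picks a contraction order itself instead of us allocating each intermediate by hand.
```python3
import numpy as np

rng = np.random.default_rng(0)
dm = rng.random((50, 50))       # stand-in for a density matrix
orb = rng.random((50, 10000))   # stand-in for orbitals evaluated on points

# explicit route: materializes a (50, 10000) intermediate by hand
naive = np.sum(orb * (dm @ orb), axis=0)

# einsum with optimize=True chooses the pairwise contraction path itself
fused = np.einsum("ij,ik,jk->k", dm, orb, orb, optimize=True)

assert np.allclose(naive, fused)
```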
[mprof plots: OLD vs NEW]

The memory is reduced somewhat. I used the `mprof` executable that comes with `memory_profiler` for this, which tracks memory over time. EDIT: I had these mixed...
I have some more information here. I profiled this functionality while indicating the boundaries between functions. It seems the biggest spike occurs in `evaluate_basis()`, while evaluating contractions. Then, I was...
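In case it helps anyone reproduce the plot: the function boundaries come from decorating functions with `@profile` and running the script through `mprof run`; decorated functions then show up as brackets in `mprof plot`. To mark a library-internal function such as `evaluate_basis()`, the decorator would be added temporarily inside the library source; the wrapper below only shows the mechanism.
```python3
# sketch: time-based profiling with function boundaries
from gbasis.evals.density import evaluate_density

# `mprof run` injects @profile into builtins; this fallback keeps the
# script runnable outside of mprof as well
try:
    profile
except NameError:
    def profile(func):
        return func


@profile  # appears as a bracketed region in `mprof plot`
def density_step(one_dm, basis, points):
    return evaluate_density(one_dm, basis, points)
```
Run with `mprof run script.py`, then `mprof plot` to see the memory timeline with the brackets.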
I have come up with a mask function:
```python3
def _evaluate_orb_mask(basis, points, tol_screen):
    orb_mask = np.zeros((len(basis), points.shape[0]), dtype=bool)
    for mask, shell in zip(orb_mask, basis):
        cmin = shell.coeffs.min()
        amin = shell.exps.min()
        ...
```
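Filling in the truncated part above with my own guess at the rest (the cutoff formula and the `shell.coord` attribute are assumptions on my side, not the committed code): a shell is kept at a point only while its most diffuse primitive is still above `tol_screen`.
```python3
import numpy as np


def _evaluate_orb_mask(basis, points, tol_screen):
    """Sketch: flag, for each shell, the points where it is non-negligible."""
    orb_mask = np.zeros((len(basis), points.shape[0]), dtype=bool)
    for mask, shell in zip(orb_mask, basis):
        cmin = np.abs(shell.coeffs).min()
        amin = shell.exps.min()
        # assumed screening: keep points where cmin * exp(-amin * r**2) >= tol_screen,
        # i.e. r**2 <= ln(cmin / tol_screen) / amin
        r2_cut = max(np.log(cmin / tol_screen), 0.0) / amin
        r2 = np.sum((points - shell.coord) ** 2, axis=1)
        mask[:] = r2 <= r2_cut
    return orb_mask
```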
OK, so I have the mask function, and the chunking is implemented. My issue now is that, using the mask, I am not sure of where I can save memory....
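One place the mask could plausibly help (a sketch of the idea only, with `evaluate_shell` and `shell.num_sph` as stand-ins for whatever gbasis actually uses): evaluate each contraction only at its unmasked points and scatter the result back, so the per-shell temporaries shrink with the mask even though the final array stays full size.
```python3
import numpy as np


def _evaluate_basis_masked(basis, points, orb_mask, evaluate_shell):
    """Sketch: evaluate each shell only where its mask is True."""
    nbasis = sum(shell.num_sph for shell in basis)  # assumed attribute
    out = np.zeros((nbasis, points.shape[0]))
    row = 0
    for shell, mask in zip(basis, orb_mask):
        nfn = shell.num_sph
        # evaluate_shell(shell, pts) is assumed to return an (nfn, len(pts)) block;
        # the temporary here is (nfn, mask.sum()) rather than (nfn, npoints)
        out[row:row + nfn, mask] = evaluate_shell(shell, points[mask])
        row += nfn
    return out
```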
See [this comment](https://github.com/theochem/gbasis/issues/121#issuecomment-2214160910). It's dynamic, though, based on the amount of memory available. I just set it manually to 4 chunks to show this.
I was able to make the chunking better. This is with 8 chunks. Normally, the chunks will be dynamically determined so that no chunking occurs if there is enough...
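For reference, the dynamic part is along these lines (a sketch only; `psutil` and the bytes-per-point estimate are my stand-ins, not necessarily what the branch does): pick the chunk size so the per-chunk working arrays fit in a fraction of the available memory, which collapses to a single chunk when there is enough.
```python3
import psutil


def _chunk_points(points, nbasis, itemsize=8, safety=0.5):
    """Sketch: yield point chunks sized to the currently available memory."""
    avail = psutil.virtual_memory().available * safety
    # rough working set per point: one column of the (nbasis, npoints)
    # basis-evaluation array plus a comparable amount of intermediates
    bytes_per_point = 2 * nbasis * itemsize
    chunk_size = max(1, int(avail // bytes_per_point))
    for start in range(0, points.shape[0], chunk_size):
        yield points[start:start + chunk_size]
```
If `chunk_size` covers all of the points, the loop runs once, i.e. no chunking.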
Libcint does not support Windows.