
Enable (and ensure proper) memory management for arrays embedded in temporary or only-non-top-level tensors

lukstafi opened this issue · 2 comments

Fortunately, by their nature such arrays are not needed on the host, so we just need to ensure they are correctly initialized on the devices (i.e., the from_host direction).

Most prominently, these are temporary tensors in optimized updates (currently, Train.sgd_one).
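To make the idea concrete, here is a minimal sketch (hypothetical types, not OCANNL's actual API) of the distinction: an array backing a non-top-level tensor needs no host copy at all, so the from_host direction collapses to a direct device-side initialization. Device memory is simulated with a plain array.

```ocaml
(* Hypothetical sketch, not OCANNL code: storage for a tensor node is
   either hosted (needs a from_host transfer) or device-only (no host
   copy exists; the array is initialized directly on the device). *)
type storage =
  | Hosted of float array          (* lives on host, copied to device *)
  | Device_only of { size : int }  (* no host copy; device init only *)

(* "Device" buffers are simulated here as plain arrays. *)
let init_on_device = function
  | Hosted host -> Array.copy host               (* from_host transfer *)
  | Device_only { size } -> Array.make size 0.0  (* direct device init *)

let () =
  let tmp = Device_only { size = 4 } in
  let buf = init_on_device tmp in
  assert (Array.length buf = 4 && buf.(0) = 0.0)
```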

— lukstafi, Sep 26 '23

To clarify the wording: the tensors are temporary, but the arrays persist across optimizer steps. This makes sense in OCANNL -- in PyTorch and similar frameworks, the arrays are the tensors. It's better to avoid using "temporary tensor" in this broader sense, as it would be confusing.
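The temporary-tensor / persistent-array split above can be sketched as follows (an illustrative model under assumed names, not OCANNL's implementation): the tensor structure is rebuilt on every optimizer step, but its backing array is cached by node id and reused, so its values survive across steps.

```ocaml
(* Illustrative sketch: arrays are cached by node id, so they persist
   even though the tensor that refers to them is rebuilt each step. *)
let array_cache : (int, float array) Hashtbl.t = Hashtbl.create 16

let get_array ~node_id ~size =
  match Hashtbl.find_opt array_cache node_id with
  | Some a -> a (* reuse: state persists across optimizer steps *)
  | None ->
    let a = Array.make size 0.0 in
    Hashtbl.add array_cache node_id a;
    a

(* One "optimizer step": the step's tensor is temporary, but the
   accumulator array it uses is not. *)
let step ~node_id =
  let acc = get_array ~node_id ~size:1 in
  acc.(0) <- acc.(0) +. 1.0;
  acc.(0)

let () =
  assert (step ~node_id:0 = 1.0);
  assert (step ~node_id:0 = 2.0) (* the array persisted *)
```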

— lukstafi, Sep 26 '23

I no longer remember what the issue is about... Maybe memory modes make the situation clearer now.

— lukstafi, Mar 18 '24

There's a related issue that I will fix tomorrow: at link time, require that the tensor nodes in a tensor's graph are either embedded or already part of the parent context (the context passed to link, which will become the parent of the new context).
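The proposed link-time invariant could look roughly like this (a sketch with hypothetical names; OCANNL's real node and context types differ): every node reachable from the tensor being linked must either be embedded in that tensor or already be present in the parent context.

```ocaml
(* Hypothetical sketch of the link-time check: a node is acceptable if
   it is embedded, or its id is already registered in the parent
   context (modeled here as a set of node ids). *)
module Int_set = Set.Make (Int)

type node = { id : int; embedded : bool }

let check_link ~parent_ctx_nodes (nodes : node list) =
  List.for_all
    (fun n -> n.embedded || Int_set.mem n.id parent_ctx_nodes)
    nodes

let () =
  let parent = Int_set.of_list [ 1; 2 ] in
  (* ok: node 1 is in the parent context, node 7 is embedded *)
  assert (check_link ~parent_ctx_nodes:parent
            [ { id = 1; embedded = false }; { id = 7; embedded = true } ]);
  (* rejected: node 9 is neither embedded nor in the parent context *)
  assert (not (check_link ~parent_ctx_nodes:parent
                 [ { id = 9; embedded = false } ]))
```

At link time the check would run before creating the child context, turning a silent missing-array failure into an immediate error.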

— lukstafi, Jul 15 '24