TensorNetwork
Array class
It seems there is widespread agreement that, in the longer term, `Node` is eventually to be replaced by, or supplemented with, some sort of "I can't believe it's not an array!" class, `Tensor`, that is less stateful. Users would then by default interact with `Tensor` rather than with `Node` directly.
In the shorter term, to motivate adoption by existing physicist practitioners, who will primarily interact with the library via `ncon`, we need something now that behaves more or less like an array but nevertheless shields the user from the backend. Originally, I had thought this might be done by modifying the interface of `Node` to include a subset of NumPy functions, but I now see this might be complicated, since operations such as `reshape` or `transpose` would need to account for the various possible `Edge` states.
I therefore propose that we instead directly provide an array class (I suggest we call it `Array`), which in essence would just be a wrapper around the backend array. `Node.tensor` would then be an instance of `Array` instead of the backend array directly.
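To make the idea concrete, here is a minimal sketch of what such a wrapper might look like, using NumPy as a stand-in backend. The class and method names are illustrative assumptions on my part, not an agreed API; the point is only that every operation delegates to the backend and hands back another `Array`, so the user never touches the backend array directly.

```python
import numpy as np


class Array:
    """Hypothetical thin wrapper around a backend array.

    NumPy serves as a stand-in backend here; in the library this would
    dispatch to whichever backend module is configured.
    """

    def __init__(self, data, backend=np):
        self.backend = backend
        self.data = backend.asarray(data)

    @property
    def shape(self):
        return self.data.shape

    def reshape(self, shape):
        # Delegate to the backend and re-wrap, so the backend array
        # never leaks out to the user.
        return Array(self.backend.reshape(self.data, shape), self.backend)

    def transpose(self, axes=None):
        return Array(self.backend.transpose(self.data, axes), self.backend)
```

With this shape, `Node.tensor` could hold an `Array`, and a NumPy-like function subset (`reshape`, `transpose`, etc.) would be trivial to expose, since there is no `Edge` state to reconcile.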
We would provide a subset of NumPy-like functions to act on `Array`. Contractions could be handled either via `ncon` or by building `Node`s and `Edge`s around `Array`s; users who only want to use `Node` would just instantiate them from backend arrays. I think this would simplify e.g. PEPS a lot; it would also lay much of the groundwork for `Tensor`, which would differ from `Array` at the interface level more or less only via the contraction API.
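For readers less familiar with the `ncon` convention mentioned above, the contraction style in question can be sketched in a few lines on plain backend arrays: positive index labels are summed over, and negative labels (-1, -2, ...) order the free output indices. This is a toy `einsum`-based illustration of the convention, not the library's actual `ncon` implementation.

```python
import numpy as np


def toy_ncon(tensors, network):
    """Toy ncon-style contraction via einsum (illustrative only).

    Positive labels shared between tensors are contracted; negative
    labels -1, -2, ... give the output index order.
    """
    labels = sorted({label for spec in network for label in spec})
    # Assign one einsum letter per distinct label (fine for small networks).
    letters = {label: chr(ord("a") + i) for i, label in enumerate(labels)}
    inputs = ["".join(letters[l] for l in spec) for spec in network]
    # Free (negative) labels, ordered -1, -2, ...
    out = "".join(letters[l] for l in sorted(
        (l for l in labels if l < 0), reverse=True))
    return np.einsum(",".join(inputs) + "->" + out, *tensors)
```

For example, `toy_ncon([A, B], [(-1, 1), (1, -2)])` contracts the shared label `1` and is just the matrix product `A @ B`. An `Array`-aware `ncon` would do the same bookkeeping while unwrapping and re-wrapping the backend arrays.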
Thoughts?