Giving ncon a pythonic makeover
The function ncon is powerful and beloved by veteran tensor networkers. Yet despite its prowess, its matlabian exterior prevents it from feeling like a natural python function. First, edge indices that start at 1 don't mesh well with python, nor with TN in general, where tensor axes start at index 0. Second, python has a nicer alternative to passing in two lists of associated things.
Here are a few ideas on how to adapt ncon to python.
Given tensors left, mid and right, an example of the current syntax is:
ncon([left, mid, right], [[1, -1], [1, -2, 2], [2, -3]])
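For reference, if I'm reading the convention right, the positive labels are contracted over and the negative labels become output axes ordered -1, -2, -3, so the call above corresponds to the following numpy contraction (a sketch assuming left, mid and right are numpy arrays):

import numpy as np

# label 1 -> i, label 2 -> j (contracted); -1 -> a, -2 -> b, -3 -> c (output order)
result = np.einsum('ia,ibj,jc->abc', left, mid, right)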
If complex numbers are used instead of negative numbers to represent dangling edges, 0-based indexing becomes possible. In addition, replacing the two related lists with a dictionary brings edge indices next to their associated tensors, removes a level of list nesting and reduces the number of function arguments by one:
ncon({left: [0, 0j], mid: [0, 1j, 1], right: [1, 2j]})
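As a rough sketch, this could sit on top of the existing function (the wrapper name ncon_dict is hypothetical, and it assumes the tensor objects can be used as dict keys):

from tensornetwork import ncon

def ncon_dict(network):
  # Map the proposed 0-based / complex labels back to the classic
  # 1-based / negative convention understood by the current ncon.
  tensors, index_lists = [], []
  for tensor, labels in network.items():
    converted = []
    for label in labels:
      if isinstance(label, complex):      # dangling edge, e.g. 1j -> -2
        converted.append(-(int(label.imag) + 1))
      else:                               # contracted edge, e.g. 1 -> 2
        converted.append(int(label) + 1)
    tensors.append(tensor)
    index_lists.append(converted)
  return ncon(tensors, index_lists)

# e.g. ncon_dict({left: [0, 0j], mid: [0, 1j, 1], right: [1, 2j]})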
Alternatively, one could use strings instead of lists, with d representing dangling edges:
ncon({left:'0, 0d', mid:'0, 1d, 1', right:'1, 2d'})
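Parsing that form back into the classic convention would be straightforward; a sketch (the helper name parse_labels is hypothetical):

def parse_labels(spec):
  # '0, 1d, 1' -> [1, -2, 2] in the classic ncon convention.
  labels = []
  for token in spec.split(','):
    token = token.strip()
    if token.endswith('d'):               # trailing 'd' marks a dangling edge
      labels.append(-(int(token[:-1]) + 1))
    else:
      labels.append(int(token) + 1)
  return labels

# e.g. parse_labels('0, 1d, 1') == [1, -2, 2]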
I'd be curious to hear thoughts from the various stakeholders, especially the black belt ncon users.