All UniTensor initializers should share the same standard arguments
Could we check that all UniTensor initializers support the same arguments as a direct call to UniTensor(Tensor, ...)?
For example, this currently does not work:
uT = cytnx.UniTensor.uniform([4,4], low=-1., high=1., labels=["a","b"], name="uT", rowrank=1)
because the argument name of uniform() is in_labels instead of labels, and rowrank is not supported at all.
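For reference, here is a minimal sketch of the workaround under the current API, assuming (as reported above) that uniform() takes in_labels and that rowrank and name have to be set afterwards:

import cytnx

# Works under the current API (per this report): labels are passed as `in_labels`.
uT = cytnx.UniTensor.uniform([4, 4], low=-1., high=1., in_labels=["a", "b"])
uT.set_name("uT")   # set the name as a separate step
uT.set_rowrank(1)   # rowrank is not a keyword of uniform(), so set it here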
This is exactly why I want to move these things out of the constructor, __init__. Some of them don't need to be known at the init stage.
You can do the same thing with:
cytnx.UniTensor.uniform([4,4], low=-1., high=1.).relabels(["a","b"]).set_name("uT").set_rowrank(1)
In my opinion, we should decide on one way or the other to initialize UniTensors and use it consistently, both in the API and in the paper. Currently, some initializers work in the standard way of UniTensor(Tensor, ...), while others do not. If we want the user to use the form
cytnx.UniTensor.uniform([4,4], low=-1., high=1.).relabels(["a","b"]).set_name("uT").set_rowrank(1)
it might even be better not to provide the arguments labels, rowrank, and name in any of the initializers at all.
Personally, though, I prefer initializers that create the UniTensor with the right attributes directly, rather than creating it first and then setting each attribute one by one.
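For comparison, a minimal sketch of that preferred style using the standard constructor, assuming UniTensor(Tensor, ...) accepts labels and rowrank, and that cytnx.random.Make_uniform fills a Tensor in place (both taken from my reading of the API, not verified here):

import cytnx
from cytnx import random

blk = cytnx.zeros([4, 4])           # create the block data
random.Make_uniform(blk, -1., 1.)   # fill it in place with uniform values (assumed API)
uT = cytnx.UniTensor(blk, labels=["a", "b"], rowrank=1)  # attributes set at creation
uT.set_name("uT")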
We agreed that the chained ("train") syntax like
cytnx.UniTensor.uniform([4,4], low=-1., high=1.).relabels(["a","b"]).set_name("uT").set_rowrank(1)
is the better option, and it works the same way in C++.
Still, if we do allow optional arguments like dtype, name, labels, and rowrank, I think they should be provided in all initializers in the same way. Currently, some arguments are missing, or have different names, in some of the initializers.
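To make this concrete, a hypothetical unified keyword set that every initializer (zeros, ones, uniform, normal, ...) could share; the signature below is illustrative only, not the current API:

import cytnx

# Hypothetical, for illustration: one keyword convention shared by all initializers.
def uniform(shape, low=0., high=1., *,
            dtype=cytnx.Type.Double, device=cytnx.Device.cpu,
            labels=None, rowrank=-1, name=""):
    ...

# Every call from this thread would then work uniformly, e.g.:
# cytnx.UniTensor.uniform([4, 4], low=-1., high=1.,
#                         labels=["a", "b"], rowrank=1, name="uT")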
This is not a high priority, though, since we use the chained syntax now.