Zachary DeVito
It is partially fixed, but we should move the issues we still want to fix over to pytorch.
Yes, I think all things on Type can be const.
(Also, because Tensor is actually a shared pointer to the tensor implementation, most things on Tensor can be const too! But it seems disingenuous to mark them so since 'const...
Yes, the whole point of having `Tensor` rather than `shared_ptr` is to make sure the library's API stays simple. It is intended to be directly usable as a lower-overhead way...
I think that would probably work, assuming writing a Type by hand doesn't force you to implement so many methods that it becomes overly verbose (I think it is...
Is there something wrong with `Scalar(0).toTensor().expand(the_size)`?
I've considered this but until we see perf problems based on the current approach I don't want to do it. It adds complexity to the public API -- people need...
Undefined Tensor means a TH tensor with no backing storage, but `info` itself is pointing to a valid TensorImpl? Just trying to understand the situation.
I see, I think we need to mark it optional like @killeent did in some THNN functions.