Why not impl Sync for Tensor?
I don't know much about this, but I hit the same problem when I wanted to use the same tensor from multiple threads. After a lot of trial and error, I wrapped the tensor in a TensorWrapper struct and implemented Send and Sync for it, which finally worked:
```rust
use std::sync::Arc;
use tch::Tensor;

struct TensorWrapper { tensor: Arc<Tensor> }

// Safety: sound only if the wrapped tensor is never mutated concurrently.
unsafe impl Send for TensorWrapper {}
unsafe impl Sync for TensorWrapper {}
```
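A minimal usage sketch (the tensor, the loop, and the variable names here are just illustrative, not something tch prescribes; it assumes the `TensorWrapper` snippet above is in the same file):

```rust
use std::thread;
use tch::{Device, Kind};

fn main() {
    // Any read-only tensor would do here; zeros is just a placeholder.
    let wrapper = Arc::new(TensorWrapper {
        tensor: Arc::new(Tensor::zeros(&[4, 4], (Kind::Float, Device::Cpu))),
    });

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let w = Arc::clone(&wrapper);
            // Crossing the thread boundary compiles only because of the
            // `unsafe impl Sync` above; it is sound only if every thread
            // strictly reads and never mutates the tensor.
            thread::spawn(move || w.tensor.size())
        })
        .collect();

    for h in handles {
        println!("shape seen by a worker: {:?}", h.join().unwrap());
    }
}
```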
I hope this solves your problem. The tensor wrapped in TensorWrapper is only used to train a model, so I never modify it, but I'm also curious what dangers this could pose. I'm still a Rust beginner, so I don't know whether Tensor is !Sync because sharing it between threads is genuinely dangerous, or because the underlying design simply never needed Tensor to be Sync.
Originally posted by @neveranever98 in https://github.com/LaurentMazare/tch-rs/issues/460#issuecomment-1064077067
I don't think Tensor is thread-safe in PyTorch. In general, PyTorch objects are thread-safe to read from, but not to write to.
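As a concrete sketch of the "safe to read" case: assuming a recent tch where Tensor is Send (but not Sync), each thread can own its own `shallow_clone` handle, a new handle to the same underlying storage, and read through it without any `unsafe`:

```rust
use std::thread;
use tch::{Device, Kind, Tensor};

fn main() {
    let weights = Tensor::zeros(&[4, 4], (Kind::Float, Device::Cpu));

    // `shallow_clone` copies no data; each thread moves in its own handle.
    let handles: Vec<_> = (0..4)
        .map(|i| {
            let view = weights.shallow_clone();
            thread::spawn(move || {
                // Read-only access is fine; concurrent writes through
                // these handles would be a data race.
                println!("thread {i} sees shape {:?}", view.size());
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }
}
```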