
Question: What is the best practice for working with PyTorch?


Hi PyJulia team,

I'm just getting started with porting a Julia package to a Python module. On the Python side, all the interaction will be with PyTorch tensors.

I noticed that one can pass numpy arrays and lists to Julia, which are then converted to Julia's built-in Array type. AFAIK, this is not possible with PyTorch tensors. A simple workaround would be to convert the tensors from torch to numpy on the way in (and back again on the way out). However, I am curious whether there is a more elegant way. How can I control this mapping?
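For concreteness, here is a minimal sketch of the numpy round trip I mean (the Julia function `double_it` is just a placeholder for the real package code):

```python
import torch
from julia import Main

# Placeholder Julia function standing in for the actual package.
Main.eval("double_it(x) = 2 .* x")

t = torch.arange(6, dtype=torch.float64).reshape(2, 3)

# torch -> numpy on the way in; PyJulia converts the numpy array to a Julia Array.
result = Main.double_it(t.numpy())

# numpy -> torch on the way back.
t_out = torch.from_numpy(result)
```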

Best regards, Fabio

famura · Jun 26 '23 16:06

I believe any Python array that supports the buffer protocol should be wrappable by Julia without copying. From the Julia perspective, we just need the dimensions, element (d)type, and a pointer to the underlying data. We could then use Base.unsafe_wrap or an equivalent.

https://docs.julialang.org/en/v1/base/c/#Base.unsafe_wrap-Union{Tuple{N},%20Tuple{T},%20Tuple{Union{Type{Array},%20Type{Array{T}},%20Type{Array{T,%20N}}},%20Ptr{T},%20Tuple{Vararg{Int64,%20N}}}}%20where%20{T,%20N}
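Roughly, something like the following untested sketch (assumes a contiguous Float64 CPU tensor; note that Julia Arrays are column-major, so the dimensions are reversed relative to the torch shape, and `t` must stay alive as long as the Julia view is used):

```python
import torch
from julia import Main

t = torch.arange(6, dtype=torch.float64).reshape(2, 3).contiguous()

Main.ptr = t.data_ptr()               # raw address of the tensor's data
Main.dims = (t.shape[1], t.shape[0])  # reversed dims: Julia is column-major

# Wrap the existing memory as a Julia Array; no data is copied.
Main.eval("A = unsafe_wrap(Array, Ptr{Float64}(ptr), dims)")

# Mutating A in Julia mutates the torch tensor, confirming the memory is shared.
Main.eval("A .*= 2")
print(t)  # values are doubled
```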

It is unclear to me what the status of buffer-protocol support is in PyTorch. The following issue appears relevant: https://github.com/pytorch/pytorch/issues/19143

mkitti · Jun 26 '23 20:06

Thank you @mkitti for the hints and explanation. I'll have a look at them.

famura · Jun 27 '23 06:06