
Do you have an example under windows?

Open yeyuxmf opened this issue 5 years ago • 17 comments

The custom layer uses the JIT mechanism to be invoked on the C++ side, which seems to require registration in a script, such as:

```cpp
#include <torch/script.h>

torch::Tensor warp_perspective(torch::Tensor image, torch::Tensor warp) {
  torch::Tensor output = torch::add(image, warp);
  return output.clone();
}

static auto registry =
    torch::jit::RegisterOperators("my_ops::warp_perspective", &warp_perspective);
```

That is the usage under Ubuntu. Do you have an example under Windows? I don't know how to run the example provided in extension-script under Windows.

yeyuxmf avatar Oct 11 '19 08:10 yeyuxmf

cc: @peterjc123

soumith avatar Oct 11 '19 22:10 soumith

Have you experienced any error when using this piece of code? The suggested way is to write a CMake script like this and then run `mkdir build && cd build && cmake ..` followed by `cmake --build .`

peterjc123 avatar Oct 12 '19 03:10 peterjc123
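For reference, once the operator library has been built with CMake, the usual way to use it from Python is to load the resulting shared library explicitly. A minimal sketch, assuming the CMake target is called warp_perspective and the DLL ends up under build/Release (both the path and the target name are assumptions here):

```python
import torch

# Load the shared library produced by the CMake build. On Windows this is a .dll;
# the exact path/name depend on the CMake target and configuration (assumed here).
torch.ops.load_library("build/Release/warp_perspective.dll")

# Once the library is loaded, the operator registered as "my_ops::warp_perspective"
# is available under torch.ops and can be called like any other op.
image = torch.randn(8, 8)
warp = torch.randn(8, 8)
print(torch.ops.my_ops.warp_perspective(image, warp))
```

With the op loaded this way, it can also be used from inside a `torch.jit.script` function, which is what the extension-script example relies on.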

@peterjc123 Thank you very much for your answer. First, I tried this under the Windows platform; it compiles and passes. But the problem is that I want to use the torch.jit.script mechanism to export the model so that it can be easily deployed on the C++ side, and I couldn't successfully export the model. The code is as follows:

```python
import torch
import warp_perspective

@torch.jit.script
def compute(x, y):
    x = torch.ops.my_ops.warp_perspective.forward(y, y)
    return x.matmul(y)

print(compute.graph)
print(compute(torch.randn(8, 8), torch.randn(8, 8)))
compute.save("example.pt")
```

The implementation code of the custom warp_perspective function is as follows (warp_perspective.cpp):

```cpp
#include <torch/script.h>
#include <torch/extension.h>

torch::Tensor warp_perspective_forward(torch::Tensor image, torch::Tensor warp) {
  torch::Tensor output = torch::add(image, warp);
  return output.clone();
}

static auto registry =
    torch::jit::RegisterOperators("my_ops::warp_perspective", &warp_perspective_forward);

PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
  m.def("forward", &warp_perspective_forward, "WARP_PERSPECTIVE forward (CUDA)");
}
```

yeyuxmf avatar Oct 12 '19 03:10 yeyuxmf

@huang229 So the issue is on the python side, right? What error does it throw?

peterjc123 avatar Oct 12 '19 03:10 peterjc123

@peterjc123 I think the problem is that the warp_perspective function has not been successfully registered with the JIT mechanism. Therefore, the JIT mechanism cannot be used to export the model on the Python side. I don't know what to do.

yeyuxmf avatar Oct 12 '19 03:10 yeyuxmf

If you remove `x = torch.ops.my_ops.warp_perspective.forward(y, y)`, can you manage to export the model?

peterjc123 avatar Oct 12 '19 03:10 peterjc123

With `x = torch.add(y, y)` instead, the model can be exported successfully.

yeyuxmf avatar Oct 12 '19 03:10 yeyuxmf

Okay, another question: will `x = torch.ops.my_ops.warp_perspective.forward(y, y)` work when it is not wrapped with `@torch.jit.script`?

peterjc123 avatar Oct 12 '19 03:10 peterjc123
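For illustration, the check being suggested here can look something like the following: try the same expression in plain eager mode, outside `@torch.jit.script`, so that any failure is independent of the script compiler. This is only a sketch and assumes the extension from the earlier setup.py builds and imports cleanly; the `.forward` form is kept on purpose to mirror the line in question.

```python
import torch
import warp_perspective  # importing the built extension runs the static registration

y = torch.randn(8, 8)

# Call the registered op directly. If the static registration ran on import,
# this resolves to my_ops::warp_perspective and simply returns y + y.
print(torch.ops.my_ops.warp_perspective(y, y))

# The exact expression used in the scripted function, also tried in eager mode.
try:
    print(torch.ops.my_ops.warp_perspective.forward(y, y))
except Exception as e:
    print("eager .forward call failed:", e)
```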

Yes. It throws an error directly. The error is as follows:

```
File "D:\python-3.7.3\lib\site-packages\torch\jit\__init__.py", line 1077, in _compile_function
    script_fn = torch._C._jit_script_compile(qualified_name, ast, _rcb, get_default_args(fn))
```

yeyuxmf avatar Oct 12 '19 03:10 yeyuxmf

Without `@torch.jit.script`, the function has no `save` attribute.

yeyuxmf avatar Oct 12 '19 03:10 yeyuxmf

I think there should be something wrong with the jit compiler code. It will be easier to deal with if you could post a full C++ stack trace.

peterjc123 avatar Oct 12 '19 03:10 peterjc123

@peterjc123 The script (warp_perspective.cpp) compiles successfully; no errors are displayed. I suspect that the following code doesn't take effect under Windows:

```cpp
static auto registry =
    torch::jit::RegisterOperators("my_ops::warp_perspective", &warp_perspective_forward);
```

The error produced when exporting the model is as follows:

```
Connected to pydev debugger (build 191.6605.12)
Traceback (most recent call last):
  File "D:\PyCharm Community Edition 2019.1.1\helpers\pydev\pydevd.py", line 1741, in <module>
    main()
  File "D:\PyCharm Community Edition 2019.1.1\helpers\pydev\pydevd.py", line 1735, in main
    globals = debugger.run(setup['file'], None, None, is_module)
  File "D:\PyCharm Community Edition 2019.1.1\helpers\pydev\pydevd.py", line 1135, in run
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "D:\PyCharm Community Edition 2019.1.1\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "D:/vs+opencv/tmp/script.py", line 4, in <module>
    @torch.jit.script
  File "D:\python-3.7.3\lib\site-packages\torch\jit\__init__.py", line 1181, in script
    return _compile_function(fn=obj, qualified_name=qualified_name, _frames_up=_frames_up + 1, _rcb=rcb)
  File "D:\python-3.7.3\lib\site-packages\torch\jit\__init__.py", line 1077, in _compile_function
    script_fn = torch._C._jit_script_compile(qualified_name, ast, _rcb, get_default_args(fn))
RuntimeError: attribute lookup is not defined on builtin:
at D:/vs+opencv/tmp/script.py:6:9
@torch.jit.script
def compute(x, y):
    x = torch.ops.my_ops.warp_perspective.forward(y, y)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
    return x.matmul(y)
```

yeyuxmf avatar Oct 12 '19 03:10 yeyuxmf
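One detail worth noting about this trace: the error points at the `.forward` attribute access, not at the operator lookup itself, which suggests the registration was actually found. An operator registered under the schema `my_ops::warp_perspective` is exposed by TorchScript as the callable `torch.ops.my_ops.warp_perspective` directly, and attribute lookup on such a builtin is what raises "attribute lookup is not defined on builtin", regardless of platform. A sketch of script.py with the call adjusted accordingly (assuming the same extension build as above):

```python
import torch
import warp_perspective  # the import triggers the static RegisterOperators call

@torch.jit.script
def compute(x, y):
    # Call the registered op directly; no .forward attribute on the builtin op.
    x = torch.ops.my_ops.warp_perspective(y, y)
    return x.matmul(y)

print(compute.graph)
print(compute(torch.randn(8, 8), torch.randn(8, 8)))
compute.save("example.pt")
```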

cc @ezyang Do you know what might be the reason? I'm not so familiar with the JIT things here.

peterjc123 avatar Oct 12 '19 03:10 peterjc123

Here's all my code. It contains three scripts: setup.py, the cpp file, and script.py.

The cpp file:

```cpp
#include <torch/script.h>
#include <torch/extension.h>

torch::Tensor warp_perspective_forward(torch::Tensor image, torch::Tensor warp) {
  torch::Tensor output = torch::add(image, warp);
  return output.clone();
}

static auto registry =
    torch::jit::RegisterOperators("my_ops::warp_perspective", &warp_perspective_forward);

PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
  m.def("forward", &warp_perspective_forward, "WARP_PERSPECTIVE forward (CUDA)");
}
```

setup.py:

```python
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CppExtension

setup(
    name="warp_perspective",
    ext_modules=[
        CppExtension(
            "warp_perspective",
            ["example_app/warp_perspective/op.cpp"],
        )
    ],
    cmdclass={"build_ext": BuildExtension},
)
```

script.py:

```python
import torch
import warp_perspective

@torch.jit.script
def compute(x, y):
    x = torch.ops.my_ops.warp_perspective.forward(y, y)
    return x.matmul(y)

print(compute.graph)
print(compute(torch.randn(8, 8), torch.randn(8, 8)))
compute.save("example.pt")
```

yeyuxmf avatar Oct 12 '19 03:10 yeyuxmf
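As an aside, the same .cpp file can also be compiled on the fly with `torch.utils.cpp_extension.load` instead of setup.py, which is sometimes easier to iterate with on Windows. A minimal sketch, assuming the source path from setup.py above and an MSVC toolchain available on PATH (e.g. an "x64 Native Tools" prompt):

```python
import torch
from torch.utils.cpp_extension import load

# JIT-compile the extension; loading the module also runs the static
# RegisterOperators call, so the custom op becomes visible under torch.ops.
warp_perspective = load(
    name="warp_perspective",
    sources=["example_app/warp_perspective/op.cpp"],
    verbose=True,
)

y = torch.randn(8, 8)
print(warp_perspective.forward(y, y))           # pybind11 binding from PYBIND11_MODULE
print(torch.ops.my_ops.warp_perspective(y, y))  # the TorchScript-registered op
```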

@peterjc123 Thank you very much for taking the time to help me. @ezyang Hello, I want to get this working under Windows.

yeyuxmf avatar Oct 12 '19 03:10 yeyuxmf

There might be problems with how the static initialization works in Windows, which wouldn't surprise me as we don't do as heavy testing on Windows. Does it work if you move the static initialization to, e.g., a main function?

ezyang avatar Oct 14 '19 15:10 ezyang

@ezyang Thank you very much for taking the time to help me. As I understand the official documentation, when the Python interface is built with `python setup.py install` under Windows, the Torch JIT mechanism does not support exporting the model; only after the operator is registered can the model be exported on the Python side and deployed on the C++ side. Therefore, is there any way to register the extended C++ functions with the JIT mechanism under Windows? According to the tutorial, the extended C++ function can only be used when exporting the trained model on the Python side if it is registered with the JIT mechanism. The registration code is as follows:

```cpp
static auto registry =
    torch::jit::RegisterOperators("my_ops::warp_perspective", &warp_perspective_forward);
```

I don't know if it's convenient for you to provide an example of an application under Windows. I would be very grateful for that.

yeyuxmf avatar Oct 15 '19 06:10 yeyuxmf