
Support for custom PyTorch ops

Open mike-grayhat opened this issue 4 years ago • 4 comments

Is there a way to use a TorchScript model with custom ops?

I'm trying to run a PyTorch Geometric model with DJL and have a problem with custom ops from the pytorch_scatter library. The error is

Exception in thread "main" ai.djl.engine.EngineException: 
Unknown builtin op: torch_sparse::ptr2ind.
Could not find any similar ops to torch_sparse::ptr2ind. This op may not exist or may not be currently supported in TorchScript.

Will it be enough to just compile and install the required library, or is it a more complex case with JNI involved?

Thanks in advance

mike-grayhat avatar Dec 17 '20 14:12 mike-grayhat

I checked it out. You can try to build the JNI from source and link it against the torch_sparse package. The CMakeFile is here.

find_package(TorchSparse REQUIRED)
target_link_libraries(lmp "${TorchSparse_LIBRARIES}")

then run ./gradlew :pytorch:pytorch-native compileJNI

stu1130 avatar Dec 18 '20 02:12 stu1130

@stu1130 Maybe it's time to think of how we can dynamically load customOps instead of rebuild the JNI package. I think it should be doable through calling the registration along with the library itself?

lanking520 avatar Dec 18 '20 05:12 lanking520

I tried it, and for some reason libdjl_torch fails to include this library, but I managed to get it working just by calling

System.load("libtorchsparse.so")
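For anyone following along, a minimal sketch of that workaround (the library location here is an assumption; adjust it to wherever your build put `libtorchsparse.so`). Note that `System.load` requires an absolute path, so resolve it first:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class LoadTorchSparse {
    public static void main(String[] args) {
        // Hypothetical build-output location -- adjust to where your
        // libtorchsparse.so actually lives.
        Path lib = Paths.get("build", "libtorchsparse.so").toAbsolutePath();

        // System.load needs an absolute path (System.loadLibrary takes a bare
        // library name instead).
        System.out.println(lib.isAbsolute());

        // Uncomment once the file exists. Loading it registers the
        // torch_sparse ops with the libtorch already loaded by DJL, so do
        // this before loading the TorchScript model:
        // System.load(lib.toString());
    }
}
```

The call has to happen before the model is loaded, so the ops are already registered when TorchScript resolves `torch_sparse::ptr2ind`.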

mike-grayhat avatar Dec 18 '20 14:12 mike-grayhat

Good to know. This should work as long as you have the ops registered.

lanking520 avatar Dec 22 '20 16:12 lanking520

👋 there, we are running into the same issue. Does anyone have tips on building the JNI from source with TorchSparse? I'm not too sure how to get the build system to resolve the library. Or is there a prebuilt libtorchsparse.so available to download somewhere?

pgherveou avatar Oct 20 '22 16:10 pgherveou

@pgherveou torch_sparse::ptr2ind can be found in _convert_cpu.so in the torch-sparse pip package. You can install the package and locate it:

pip install torch-sparse
# find your python install location:
# python -m site
# the following assume you are using virtual env:
ls $(python -m site | grep $VIRTUAL_ENV | awk -F"'" '{print $2}')/torch_sparse/_convert_cpu.so

You can load _convert_cpu.so in DJL with the following environment variable:

export PYTORCH_EXTRA_LIBRARY_PATH=$(python -m site | grep $VIRTUAL_ENV | awk -F"'" '{print $2}')/torch_sparse/_convert_cpu.so
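A small sanity check at startup can save a confusing engine-load failure later. This sketch only verifies that the file the environment variable points at actually exists; the variable name comes from the comment above, everything else is illustrative:

```java
import java.nio.file.Files;
import java.nio.file.Paths;

public class CheckExtraLib {
    public static void main(String[] args) {
        // DJL's PyTorch engine reads PYTORCH_EXTRA_LIBRARY_PATH when it
        // initializes, so verify the path before the first Engine call.
        String path = System.getenv("PYTORCH_EXTRA_LIBRARY_PATH");
        if (path == null || path.isEmpty()) {
            System.out.println("PYTORCH_EXTRA_LIBRARY_PATH is not set");
        } else if (Files.exists(Paths.get(path))) {
            System.out.println("found: " + path);
        } else {
            System.out.println("missing: " + path);
        }
    }
}
```

Run it in the same shell where you did the `export` to confirm the resolved `_convert_cpu.so` path is correct before starting your DJL application.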

frankfliu avatar Oct 22 '22 19:10 frankfliu

@frankfliu thanks a lot will give it a try 🙏

pgherveou avatar Oct 22 '22 19:10 pgherveou