Support for custom pytorch ops
Is there a way to use torchscript model with custom ops?
I'm trying to run a PyTorch Geometric model with DJL and have a problem with custom ops from the torch-sparse library. The error is:
Exception in thread "main" ai.djl.engine.EngineException:
Unknown builtin op: torch_sparse::ptr2ind.
Could not find any similar ops to torch_sparse::ptr2ind. This op may not exist or may not be currently supported in TorchScript.
Will it be enough to just compile and install the required library, or is it a more complex case with JNI involved?
Thanks in advance
I checked it out. You can try to build the JNI from source and link it with the torch-sparse package. The CMake file is here.
find_package(TorchSparse REQUIRED)
target_link_libraries(lmp "${TorchSparse_LIBRARIES}")
then run ./gradlew :pytorch:pytorch-native compileJNI
@stu1130 Maybe it's time to think about how we can dynamically load custom ops instead of rebuilding the JNI package. I think it should be doable by calling the op registration along with loading the library itself?
I tried it, and for some reason libdjl_torch fails to include this library, but I managed to get it working just by calling
System.load("libtorchsparse.so")
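The manual-load workaround above can be sketched as a small helper that resolves the shared object and loads it before any DJL/PyTorch call, so the custom ops (e.g. torch_sparse::ptr2ind) are registered with libtorch by the time the model is loaded. This is an illustrative sketch only: the TORCH_SPARSE_LIB environment variable and the fallback path are hypothetical, not part of DJL.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CustomOpLoader {

    /**
     * Resolve the torch-sparse native library path.
     * TORCH_SPARSE_LIB and the fallback location are hypothetical names
     * chosen for this sketch; adapt them to your environment.
     */
    static Path resolveLibrary() {
        String override = System.getenv("TORCH_SPARSE_LIB");
        return Paths.get(override != null ? override : "/usr/local/lib/libtorchsparse.so");
    }

    public static void main(String[] args) {
        Path lib = resolveLibrary();
        if (Files.exists(lib)) {
            // System.load registers the custom ops with the already-loaded
            // libtorch; this must happen before the TorchScript model is loaded.
            System.load(lib.toAbsolutePath().toString());
            System.out.println("Loaded " + lib);
        } else {
            System.out.println("libtorchsparse.so not found at " + lib);
        }
    }
}
```

The key point is ordering: the System.load call has to run before DJL deserializes the TorchScript model, otherwise the op lookup fails with the "Unknown builtin op" error above.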
Good to know. This should work as long as you have it registered.
👋 there, we are running into the same issue.
Does anyone have some tips for building the JNI from source with TorchSparse? I am not too sure how to get the build system to resolve the library. Or are there prebuilt libtorchsparse.so binaries available to download somewhere?
@pgherveou
torch_sparse::ptr2ind can be found in _convert_cpu.so in the torch-sparse pip package. You can install the package and find it:
pip install torch-sparse
# find your python install location:
# python -m site
# the following assume you are using virtual env:
ls $(python -m site | grep $VIRTUAL_ENV | awk -F"'" '{print $2}')/torch_sparse/_convert_cpu.so
You can load _convert_cpu.so in DJL with the following environment variable:
export PYTORCH_EXTRA_LIBRARY_PATH=$(python -m site | grep $VIRTUAL_ENV | awk -F"'" '{print $2}')/torch_sparse/_convert_cpu.so
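Before starting the Java side, it can help to sanity-check that the variable actually reached the JVM process and points at an existing file. This is a minimal stdlib-only sketch, not DJL API; the check mirrors what you'd expect the export above to provide.

```java
import java.nio.file.Files;
import java.nio.file.Paths;

public class ExtraLibCheck {

    /** Return the configured extra-library path, or null if the env var is unset or empty. */
    static String extraLibraryPath() {
        String p = System.getenv("PYTORCH_EXTRA_LIBRARY_PATH");
        return (p == null || p.isEmpty()) ? null : p;
    }

    public static void main(String[] args) {
        String p = extraLibraryPath();
        if (p == null) {
            System.out.println("PYTORCH_EXTRA_LIBRARY_PATH is not set; custom ops will be missing");
        } else if (!Files.exists(Paths.get(p))) {
            System.out.println("PYTORCH_EXTRA_LIBRARY_PATH points at a missing file: " + p);
        } else {
            System.out.println("Extra library found: " + p);
        }
    }
}
```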
@frankfliu thanks a lot will give it a try 🙏