
Provide a way to set LD_PRELOAD for tensorflow_text

Open josephykwang opened this issue 2 years ago • 2 comments

Feature


josephykwang · Jan 13 '23 02:01

When running an NLP TF model, we get:

```
01/13/23 02:46:41.752060: E neuropod/backends/tensorflow/tf_utils.cc:73] [thread 310, process 149] TensorFlow error: {{function_node __inference_signature_wrapper_60293}} {{function_node __inference_signature_wrapper_60293}} {{function_node __inference__wrapped_model_54675}} {{function_node __inference__wrapped_model_54675}} {{function_node __inference_restored_function_body_53812}} {{function_node __inference_restored_function_body_53812}} {{function_node __inference_model_layer_call_fn_2449}} {{function_node __inference_model_layer_call_fn_2449}} Op type not registered 'CaseFoldUTF8' in binary running on phx4-tqp. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) tf.contrib.resampler should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
```

even though the corresponding libraries are in LD_LIBRARY_PATH.
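
For context, `CaseFoldUTF8` ships as a custom op inside the tensorflow_text Python package; LD_LIBRARY_PATH only tells the dynamic linker where to search for libraries that something actually links against or loads, so the op library never gets loaded into the process. Below is a minimal sketch for locating the shared objects one would want to preload or register; the `*.so` layout inside the package is an assumption and may differ between tensorflow_text versions and platforms:

```python
# Sketch: find the custom-op shared libraries bundled with tensorflow_text.
# Assumes tensorflow_text is installed; the "*.so" glob is an assumption and
# may not match every version/platform.
import glob
import os

import tensorflow_text

pkg_dir = os.path.dirname(tensorflow_text.__file__)
op_libs = glob.glob(os.path.join(pkg_dir, "**", "*.so"), recursive=True)

# These paths are candidates for LD_PRELOAD or for the packager's custom_ops
# list (see the comment below).
print("\n".join(op_libs))
```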

josephykwang · Jan 13 '23 02:01

Note: this was explored offline. Passing custom_ops when creating the model (https://neuropod.ai/docs/master/packagers/tensorflow/#custom_ops) instead of relying on LD_LIBRARY_PATH solved the issue for a test model.
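
For reference, a minimal packaging sketch along the lines of the linked docs: the import and the custom_ops argument follow the TensorFlow packager documentation, while the model itself, the node names/specs, and the tensorflow_text .so path are placeholders for illustration only.

```python
# Sketch: bundle the tensorflow_text op libraries with the neuropod via
# custom_ops, so the backend can register ops like CaseFoldUTF8 without
# relying on LD_LIBRARY_PATH at inference time.
import tensorflow as tf
from neuropod.packagers import create_tensorflow_neuropod

# Placeholder model: a trivial graph that just echoes its input. A real text
# model would use tensorflow_text ops, which is what makes custom_ops necessary.
with tf.Graph().as_default() as g:
    x = tf.compat.v1.placeholder(tf.string, shape=(None,), name="input_text")
    tf.identity(x, name="output")

create_tensorflow_neuropod(
    neuropod_path="./text_model_neuropod",
    model_name="text_model",
    graph_def=g.as_graph_def(),
    node_name_mapping={
        "text": "input_text:0",
        "out": "output:0",
    },
    input_spec=[{"name": "text", "dtype": "string", "shape": ("batch_size",)}],
    output_spec=[{"name": "out", "dtype": "string", "shape": ("batch_size",)}],
    # Paths to the custom-op shared libraries (e.g. the tensorflow_text .so
    # files located with the earlier sketch); exact filenames vary by version.
    custom_ops=["/path/to/tensorflow_text/_custom_ops.so"],
)
```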

VivekPanyam · Jan 31 '23 22:01