
Saving the compiled graph

bhagatindia opened this issue · 1 comment

Hi,

I have two questions:

  • How do I save the compiled graph? Does saving it preserve the tvm::CompilationGroup symbols along with the compiled subgraphs? I hope it will not re-compile when loading the saved graph/ScriptModule.
  • I am hitting an issue in the basic test at HEAD. Any idea what is going wrong?

Appreciate your help

import torch
import torch_tvm

shape = 8
x = torch.rand(shape)
y = torch.rand(shape)
z = torch.rand(shape)

def add(a, b, c):
    return a + b + c

inputs = [x, y, z]

torch_tvm.enable()

trace_tvm = torch.jit.trace(add, inputs)

relay_graph = torch_tvm.to_relay(trace_tvm, inputs)

print(relay_graph)

Traceback (most recent call last):
  File "basic_tvm.py", line 18, in <module>
    relay_graph = torch_tvm.to_relay(trace_tvm, inputs)
  File "/root/inferentia/tvm/torch_tvm/__init__.py", line 18, in to_relay
    handle = _push_relay_expr(pt_func.graph_for(*inputs), inputs)
RuntimeError: This program cannot be exported as a single Relay expression. (operator() at /root/inferentia/tvm/torch_tvm/register.cpp:53)
frame #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x6c (0x7f4bed67ba4c in /root/inferentia/tvm/env/lib/python3.6/site-packages/torch/lib/libc10.so)
frame #1: + 0x8f8bf (0x7f4bdd4fd8bf in /root/inferentia/tvm/torch_tvm/_torch_tvm.cpython-36m-x86_64-linux-gnu.so)
frame #2: + 0x86cbb (0x7f4bdd4f4cbb in /root/inferentia/tvm/torch_tvm/_torch_tvm.cpython-36m-x86_64-linux-gnu.so)
frame #3: python() [0x50abc5]
frame #5: python() [0x509ce8]
frame #6: python() [0x50aa1d]
frame #8: python() [0x5081d5]
frame #10: python() [0x635082]
frame #15: __libc_start_main + 0xe7 (0x7f4bf243fb97 in /lib/x86_64-linux-gnu/libc.so.6)


bhagatindia · Nov 16 '19

Hey, you've certainly caught a bug. Would you mind printing trace_tvm.graph?

Generally, to use the to_relay API, the entire graph needs to be representable in Relay (note the error message), which doesn't seem to be the case here. (The bug is that we should be handling this graph.)
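For anyone following along without torch_tvm installed, a minimal sketch of the requested inspection with stock PyTorch is below. With torch_tvm.enable() in effect, graph_for(*inputs) would additionally show any fused tvm::CompilationGroup nodes; plain torch only shows the aten ops.

```python
# Minimal sketch (stock PyTorch, no torch_tvm): print the TorchScript IR
# of a traced function, as asked for above with trace_tvm.graph.
import torch

def add(a, b, c):
    return a + b + c

inputs = [torch.rand(8), torch.rand(8), torch.rand(8)]
traced = torch.jit.trace(add, inputs)

# The IR for a + b + c contains two aten::add nodes; with torch_tvm
# enabled these are the ops that would be fused into a TVM subgraph.
print(traced.graph)
```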

bwasti · Jan 07 '20
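On the first question: a traced TorchScript function or module can be serialized with torch.jit.save and restored with torch.jit.load without re-tracing. Whether the TVM-compiled subgraphs (the tvm::CompilationGroup nodes) survive that round trip is exactly the open question above, so this sketch only shows the stock TorchScript path:

```python
# Hedged sketch: save/load a traced TorchScript function with stock PyTorch.
# This does NOT demonstrate that torch_tvm's compiled subgraphs are preserved.
import torch

def add(a, b, c):
    return a + b + c

inputs = [torch.rand(8), torch.rand(8), torch.rand(8)]
traced = torch.jit.trace(add, inputs)

torch.jit.save(traced, "add.pt")   # serialize the TorchScript artifact to disk
loaded = torch.jit.load("add.pt")  # restore it without re-tracing

assert torch.allclose(loaded(*inputs), traced(*inputs))
```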