
meta graph saving and loading similar to python

Open silky opened this issue 8 years ago • 9 comments

as-is we can save and load individual tensors

it would be quite good if we could also save/load the metagraph thing, as in python

this way, it would be very easy to train in python and run in haskell, without having to re-create the whole model in haskell (and likewise the other way)

silky avatar Jun 05 '17 07:06 silky

There is limited support for this already; one of the oldest tests uses it to load a really old Python-created GraphDef and run inference with it: https://github.com/tensorflow/haskell/blob/master/tensorflow-mnist/tests/ParseTest.hs#L141

The graphs are saved as protobufs, so you can use proto-lens to read one, call addGraphDef to load it into TensorFlow, and then use tensorFromName to get references to parts of the graph.
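A minimal sketch of that flow, assuming a GraphDef saved from Python containing nodes named "input" and "output" (the file path, node names, and shapes here are placeholders, and exact import locations may vary between tensorflow-haskell versions; treat this as an untested sketch modeled on ParseTest.hs, not as a verified recipe):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Control.Monad.IO.Class (liftIO)
import qualified Data.ByteString as B
import Data.ProtoLens (decodeMessageOrDie)
import qualified Data.Vector as V
import TensorFlow.Core

main :: IO ()
main = do
    -- Read the serialized GraphDef written from Python.
    pb <- B.readFile "graph_def.pb"
    runSession $ do
        -- Parse the protobuf with proto-lens and add it to the session's graph.
        addGraphDef (decodeMessageOrDie pb)
        -- Look the nodes up by the names they were given in Python.
        let input  = tensorFromName "input"  :: Tensor Value Float
            output = tensorFromName "output" :: Tensor Value Float
        -- Feed the input and fetch the output.
        result <- runWithFeeds
            [feed input (encodeTensorData (Shape [1, 4]) (V.fromList [1, 2, 3, 4]))]
            output
        liftIO $ print (result :: V.Vector Float)
```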

fkm3 avatar Jun 05 '17 07:06 fkm3

Is this pull request relevant? https://github.com/tensorflow/haskell/pull/104

jcberentsen avatar Jun 16 '17 08:06 jcberentsen

Hello @fkm3, as far as I can tell from the files you pointed to (and the related imports), mnistPb loads the MNIST model from data/MNIST.pb and gets the weights from the latest checkpoint. I wonder whether this would still work with recent versions of TensorFlow (version 1.3.0 as of this comment). What would the workflow be? More details below:

Suppose a graph saved with a function like this in Python:

def save_model(session, input_tensor, output_tensor):
    signature = tf.saved_model.signature_def_utils.build_signature_def(
        inputs  = {'input': tf.saved_model.utils.build_tensor_info(input_tensor)},
        outputs = {'output': tf.saved_model.utils.build_tensor_info(output_tensor)},
    )
    b = tf.saved_model.builder.SavedModelBuilder('/tmp/model')
    b.add_meta_graph_and_variables( session
                                  , [tf.saved_model.tag_constants.SERVING]
                                  , signature_def_map={tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature})
    b.save()

Then you get your graph model:

model
├── saved_model.pb
└── variables
    ├── variables.data-00000-of-00001
    └── variables.index

(there are no checkpoint files any more in the latest version, if I am correct -- everything is in variables/*)

How can this be used in a Haskell workflow? Is it possible with the current version of tensorflow-haskell? Thanks for any help.

delanoe avatar Sep 26 '17 06:09 delanoe

Hi. I haven't played with 1.3 yet or dug into the variable checkpoint format, so I'm not sure what the answer is.

The current restore function we have uses the "Restore" op: https://github.com/tensorflow/haskell/blob/master/tensorflow-ops/src/TensorFlow/Ops.hs#L273

There is a "RestoreV2" op available that is probably required to load the new format: https://tensorflow.github.io/haskell/haddock/tensorflow-core-ops-0.1.0.0/TensorFlow-GenOps-Core.html#v:restoreV2

You could try adding a restoreV2FromName based on restoreFromName.
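For reference, a hypothetical restoreV2FromName might look something like the sketch below, modeled on the existing restore/restoreFromName in TensorFlow.Ops. The generated restoreV2 wrapper returns one output per requested tensor name as a heterogeneous TensorList; the module paths and signatures here are best-effort guesses against the linked haddocks, not tested code:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.ByteString (ByteString)
import qualified TensorFlow.GenOps.Core as CoreOps
import TensorFlow.Build (MonadBuild)
import TensorFlow.ControlFlow (group)
import TensorFlow.Ops (scalar, vector)
import TensorFlow.Output (ControlNode)
import TensorFlow.Tensor (Ref, Tensor)
import TensorFlow.Types (ListOf (..), TensorType)

-- | Restore a single tensor from a RestoreV2-format checkpoint
-- (e.g. prefix "/tmp/model/variables/variables") into a variable.
restoreV2FromName :: (MonadBuild m, TensorType a)
                  => ByteString    -- ^ checkpoint prefix
                  -> ByteString    -- ^ tensor name as saved in Python
                  -> Tensor Ref a  -- ^ variable to restore into
                  -> m ControlNode
restoreV2FromName prefix name var = do
    -- RestoreV2 takes the checkpoint prefix, the tensor names to read,
    -- and per-tensor shape/slice specs ("" means the whole tensor).
    restoredList <- CoreOps.restoreV2 (scalar prefix) (vector [name]) (vector [""])
    case restoredList of
        -- One requested name, so the result list has exactly one element.
        restored :/ Nil -> group =<< CoreOps.assign var restored
```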

fkm3 avatar Sep 26 '17 19:09 fkm3

Thanks for your reply and the pointer. I wonder whether using the serving API might be more productive.

Do you know if there is an equivalent of mnist_client.py in Haskell?

delanoe avatar Sep 28 '17 14:09 delanoe

I haven't used TensorFlow Serving before, but it looks like it simply exposes a GRPC service, so I think you are right: that is probably the easier route if you just want to evaluate a model from Haskell. It looks like there are some GRPC libraries, but I've never tried them: https://github.com/awakesecurity/gRPC-haskell and https://github.com/grpc/grpc-haskell

fkm3 avatar Sep 28 '17 21:09 fkm3

*GRPC service

fkm3 avatar Sep 28 '17 21:09 fkm3

Would love to be able to pull in modules from the TensorFlow Hub and use those for e.g. easy out-of-the-box embeddings.

Looks like they're all stored in the newer format described in this issue. Has anyone looked into this more in the last year? Looking at the source for restoreV2, it's not immediately clear to me how to use it to load the new format, but it is difficult to overstate my lack of familiarity with the codebase.

Happy to learn and try to build out a working connector to the TensorFlow Hub if folks can point me in the right direction!

cbeav avatar Dec 31 '18 19:12 cbeav

Has there been any update on support for loading tensorflow graphs from protobufs? What TF versions (if any) are currently supported?

o1lo01ol1o avatar Aug 14 '19 09:08 o1lo01ol1o