tensorrt
C++ serving example: adding saved model path
This PR extends the current C++ image classification example with a saved model path. The two workflows are:
- Keras saved model -> TF-TRT Python API -> frozen graph -> C++ serving
- Keras saved model -> TF-TRT Python API -> saved model -> C++ serving
The two paths are reorganized into separate subfolders, and a master README is added.
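For reference, a minimal sketch of the Python side of the new saved-model path, assuming TF 2.x and the TrtGraphConverterV2 API; the directory names are hypothetical, and the example's actual conversion script may set additional options:

```python
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Hypothetical directories; substitute the paths used by the example.
INPUT_SAVED_MODEL_DIR = "resnet50_keras_saved_model"
OUTPUT_SAVED_MODEL_DIR = "resnet50_tftrt_saved_model"

# Convert the Keras SavedModel with TF-TRT and re-export it as a SavedModel
# that the C++ serving example can load directly.
converter = trt.TrtGraphConverterV2(input_saved_model_dir=INPUT_SAVED_MODEL_DIR)
converter.convert()
converter.save(OUTPUT_SAVED_MODEL_DIR)
```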
@meena-at-work: for review
@DEKHTIARJonathan @meena-at-work any feedback please?
Thanks @meena-at-work for the review. By and large, I'd keep the two paths separate and clean.
Conceptually it is possible to lump everything together and use "flags" to redirect, but I don't see much benefit, if any. Most users will use one path or the other in their code base, not both, so it's better to keep them separate.
I think we can move the loading code into a separate file.
@vinhngx -- I was considering that too (a command line option vs. just making a new example), and I think that's fair from a demo application point of view. However, if that's the case and we want to get the most value from these examples, we should keep the common code and the model-loading code in separate files (and I think we're in agreement on that, based on your comment!). So if you can refactor the code to reflect the above, along with the other changes, I can approve the PR.
@vinhngx this path is completely useless:
- Keras saved model -> TF-TRT Python API -> frozen graph -> C++ serving
TF-TRT already freezes the graph during the conversion process.
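A short sketch of why the frozen-graph intermediate is redundant, assuming TF 2.x and TrtGraphConverterV2 (the paths and the GraphDef export below are illustrative, not part of this PR): the function returned by convert() already has its variables folded into constants, so a frozen GraphDef could be serialized straight from it if one were ever needed, and for C++ serving the exported SavedModel is enough on its own.

```python
import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

converter = trt.TrtGraphConverterV2(
    input_saved_model_dir="resnet50_keras_saved_model")  # hypothetical path
converted_func = converter.convert()  # variables are folded to constants here

# The converted function's graph is already frozen; if a frozen .pb were ever
# needed, it could be written out directly -- no separate freezing step.
frozen_graph_def = converted_func.graph.as_graph_def()
tf.io.write_graph(frozen_graph_def, "export", "frozen_tftrt.pb", as_text=False)

# For the C++ serving example, simply exporting the SavedModel suffices:
converter.save("resnet50_tftrt_saved_model")  # hypothetical path
```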