Deploy Libtorch C++ Model with Drogon
Most developers who use PyTorch today deploy their algorithms into production in Python using Django/Flask. With PyTorch 1.0, scientists and developers can create their algorithms in Python and then use TorchScript to serialize the model from Python to C++. Libtorch doesn't provide a web API layer, but it runs model inference faster than Python. So instead of using Python with Django/Flask to set up an API for the model, use C++ with Drogon to set up an API for the Libtorch model.
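For anyone who hasn't touched the Libtorch side yet, loading a TorchScript model in C++ looks roughly like this. This is a minimal sketch in the spirit of the official cpp_export tutorial linked further down; the file name "model.pt" and the 1x3x224x224 dummy input are placeholders, not anything Drogon-specific:

```cpp
#include <torch/script.h>
#include <iostream>

int main()
{
    // Load the TorchScript module that was serialized from Python
    // (e.g. with torch.jit.trace or torch.jit.script).
    torch::jit::script::Module module = torch::jit::load("model.pt");
    module.eval();

    // Run inference on a dummy input, e.g. one 3x224x224 image.
    std::vector<torch::jit::IValue> inputs;
    inputs.push_back(torch::ones({1, 3, 224, 224}));
    at::Tensor output = module.forward(inputs).toTensor();

    std::cout << output.sizes() << '\n';
}
```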
@rchavezj Thank you so much. @rbugajewski @vedranmiletic @interfector18 What do you think about this topic?
I use PyTorch to mess around and test some ideas from time to time. I'd say it's a good idea. What would the implementation look like: a plugin, or a separate project that builds on Drogon? What does Django/Flask support currently look like? Is the scope pretty much covered by the official tutorial?
The link you provided is exactly what I had in mind, but for the C++ community. I was thinking of a Libtorch C++ project that builds on Drogon and covers the following:
- How to wrap your trained PyTorch model (via Libtorch's C++ API) in a Drogon handler to expose it through a web API
- How to translate incoming web requests into PyTorch tensors for your model
- How to package your model's output for an HTTP response
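Something like the handler below covers all three points. This is only a rough sketch, not a finished library: the model file name ("model.pt"), the /predict route, the JSON field names ("input", "scores"), and the flat-vector input shape are placeholder assumptions, and a real project would still need batching, richer error handling, and a thread-safety strategy for the module.

```cpp
#include <drogon/drogon.h>
#include <torch/script.h>

int main()
{
    // Load the serialized TorchScript model once at startup.
    // Note: a production setup should guard concurrent access to the
    // module (mutex or one module per worker thread).
    auto module = torch::jit::load("model.pt");
    module.eval();

    drogon::app().registerHandler(
        "/predict",
        [&module](const drogon::HttpRequestPtr &req,
                  std::function<void(const drogon::HttpResponsePtr &)> &&callback) {
            // 1. Translate the incoming request into a tensor.
            //    Expected body: {"input": [0.1, 0.2, ...]}
            auto json = req->getJsonObject();
            if (!json || !(*json)["input"].isArray())
            {
                auto resp = drogon::HttpResponse::newHttpResponse();
                resp->setStatusCode(drogon::k400BadRequest);
                callback(resp);
                return;
            }
            std::vector<float> values;
            for (const auto &v : (*json)["input"])
                values.push_back(v.asFloat());
            auto input = torch::from_blob(values.data(),
                                          {1, static_cast<long>(values.size())})
                             .clone();

            // 2. Run inference.
            torch::NoGradGuard noGrad;
            auto output = module.forward({input}).toTensor().flatten();

            // 3. Package the model's output as a JSON HTTP response.
            Json::Value result;
            for (int64_t i = 0; i < output.size(0); ++i)
                result["scores"].append(output[i].item<float>());
            callback(drogon::HttpResponse::newHttpJsonResponse(result));
        },
        {drogon::Post});

    drogon::app().addListener("0.0.0.0", 8080).run();
}
```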
The Rust community attempted to integrate the language with PyTorch via the Libtorch C++ API (https://github.com/LaurentMazare/tch-rs), but the build is failing.
Additional links:
https://pytorch.org/blog/model-serving-in-pyorch/
https://pytorch.org/tutorials/advanced/cpp_export.html
I don't use Libtorch (at least not yet), but I think that's a great idea and it could help make Drogon more popular. I believe a separate library would be the proper way to approach it.
Here's an example that covers ImageNet classification, if anyone is looking for it: https://github.com/SABER-labs/Drogon-torch-serve