
Can I load a wav2letter model and use it to serve (infer) as a service?

Open phamvandan opened this issue 5 years ago • 4 comments

In the context of service applications, we need to load the model once and then serve client requests. How can I do this?

phamvandan avatar Sep 28 '20 06:09 phamvandan

cc @avidov

tlikhomanenko avatar Sep 29 '20 18:09 tlikhomanenko

Probably this could help https://github.com/facebookresearch/wav2letter/wiki/Inference-Run-Examples#interactive-streaming-asr-example.

Or do you need to do this with the w2l model rather than the inference model?

tlikhomanenko avatar Sep 29 '20 18:09 tlikhomanenko

I think it only supports the streaming convnet model?

phamvandan avatar Sep 30 '20 09:09 phamvandan

You can simply write your own main.cpp where you load the model once and then wait, communicating with some buffer from which you receive the data. Have a look at the streaming example and the example I sent, go directly into the implementation, and adapt it to your use case.

tlikhomanenko avatar Oct 01 '20 05:10 tlikhomanenko
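A minimal sketch of the "load once, serve many" pattern described above. The `AsrEngine` class, its constructor, and `transcribe()` are hypothetical placeholders, not the actual wav2letter API; the real model-loading and decoding calls live in the streaming inference examples linked earlier and would replace the stub bodies here.

```cpp
// Sketch: load the model once at startup, then reuse it for every request.
// AsrEngine is a hypothetical wrapper standing in for the wav2letter inference API.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>
#include <vector>

class AsrEngine {
 public:
  explicit AsrEngine(const std::string& modelPath) {
    // Placeholder: load acoustic model, tokens, and decoder exactly once here.
    std::cout << "Loading model from " << modelPath << std::endl;
  }
  std::string transcribe(const std::vector<float>& audio) {
    // Placeholder: run the (streaming) inference pipeline on the audio buffer here.
    return "<transcription of " + std::to_string(audio.size()) + " samples>";
  }
};

int main() {
  AsrEngine engine("/path/to/model");  // loaded once, kept alive for the process lifetime

  std::queue<std::vector<float>> requests;  // filled by your server code (socket, gRPC, ...)
  std::mutex m;
  std::condition_variable cv;
  bool done = false;

  // Producer thread: stands in for client requests arriving over the network.
  std::thread producer([&] {
    for (int i = 0; i < 3; ++i) {
      {
        std::lock_guard<std::mutex> lock(m);
        requests.push(std::vector<float>(16000, 0.0f));  // 1 s of dummy 16 kHz audio
      }
      cv.notify_one();
    }
    {
      std::lock_guard<std::mutex> lock(m);
      done = true;
    }
    cv.notify_one();
  });

  // Server loop: the same loaded model handles every request.
  while (true) {
    std::unique_lock<std::mutex> lock(m);
    cv.wait(lock, [&] { return !requests.empty() || done; });
    if (requests.empty() && done) break;
    auto audio = std::move(requests.front());
    requests.pop();
    lock.unlock();
    std::cout << engine.transcribe(audio) << std::endl;
  }
  producer.join();
  return 0;
}
```

The key point is that model loading happens in one place before the serving loop starts, so per-request latency only includes inference, not deserialization.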