
TFLite Android

Open · zeynepgulhanuslu opened this issue 3 years ago • 6 comments

Hi, I want to use the tflite model in an Android project. When I load the model into Android Studio, it generates code like the following:

```kotlin
val model = Dtln.newInstance(context)

// Creates inputs for reference.
val inputFeature0 = TensorBuffer.createFixedSize(intArrayOf(1, 1, 512), DataType.FLOAT32)
inputFeature0.loadBuffer(byteBuffer)
val inputFeature1 = TensorBuffer.createFixedSize(intArrayOf(1, 2, 128, 2), DataType.FLOAT32)
inputFeature1.loadBuffer(byteBuffer)

// Runs model inference and gets result.
val outputs = model.process(inputFeature0, inputFeature1)
val outputFeature0 = outputs.outputFeature0AsTensorBuffer
val outputFeature1 = outputs.outputFeature1AsTensorBuffer

// Releases model resources if no longer used.
model.close()
```

My question is: what are inputFeature0 and inputFeature1 in this code? Should I read the wav file as a byte array and then reshape it, or should I create a feature vector from the wav file? Can you help me with this?

Thanks
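For anyone hitting the same question later: judging purely from the tensor shapes, inputFeature0 looks like one 512-sample time-domain audio block and inputFeature1 looks like the model's recurrent LSTM states, but that is an assumption, not something confirmed in this thread. A minimal Kotlin sketch of filling the two inputs, with hypothetical helper functions and 16-bit PCM read from a wav file (header already skipped):

```kotlin
import org.tensorflow.lite.DataType
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer

// Hypothetical helper: one 512-sample block of 16-bit PCM packed into the
// model's first input. The Python reference code gets floats in [-1, 1] from
// soundfile, so raw PCM is scaled the same way here.
fun audioBlockToInput(pcm: ShortArray): TensorBuffer {
    require(pcm.size == 512) { "expected a block of 512 samples" }
    val floats = FloatArray(512) { i -> pcm[i] / 32768f }
    val input = TensorBuffer.createFixedSize(intArrayOf(1, 1, 512), DataType.FLOAT32)
    input.loadArray(floats)
    return input
}

// Assumed to be the LSTM states: start from zeros for the first block, then
// feed the states returned by the model back in for every following block.
fun zeroStates(): TensorBuffer {
    val states = TensorBuffer.createFixedSize(intArrayOf(1, 2, 128, 2), DataType.FLOAT32)
    states.loadArray(FloatArray(2 * 128 * 2))
    return states
}
```

A single inference step would then look roughly like `model.process(audioBlockToInput(block), states)`, feeding the returned state tensor back in for the next block.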

zeynepgulhanuslu avatar Jan 21 '22 11:01 zeynepgulhanuslu

Hi, I also want to use this model on Android. If you have used it on Android, could you help?

ShivamSrivastavaAkaike avatar Oct 09 '22 15:10 ShivamSrivastavaAkaike

Hi,

I used it, but I didn't get a chance to run it as streaming; it only runs as a batch. If you want, I can send the code, but it's a little messy since I didn't end up using it and didn't get the time to finish it properly. If you give me your email, I can send it to you as a reference.

zeynepgulhanuslu avatar Oct 10 '22 04:10 zeynepgulhanuslu

Yeah, sure, it will help. Thanks in advance. Here is my email: [email protected]

ShivamSrivastavaAkaike avatar Oct 10 '22 04:10 ShivamSrivastavaAkaike

Okay, I will send it to you. If you want, you can delete the comment. I hope it helps, have a nice day.

zeynepgulhanuslu avatar Oct 10 '22 05:10 zeynepgulhanuslu

Hello all 👋, I have also been trying to make this model work on Android devices with Kotlin. Unfortunately, like you @zeynepgulhanuslu, I got stuck running the model inference on both a recorded batch and real-time data. I tried to reimplement real_time_processing_tf_lite.py in Kotlin, but performing the array operations and FFT calculations without numpy got way too complicated.
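In case it helps anyone porting that script: the buffering part of real_time_processing_tf_lite.py translates to plain Kotlin fairly directly; only the FFT and the model inference really need a numpy replacement (a JVM FFT library such as JTransforms, or a small TFLite helper model as suggested in the next comment). A rough sketch of just the block-shift / overlap-add loop, with a hypothetical processBlock callback standing in for the FFT + model step:

```kotlin
// Rough Kotlin translation of the block-shift / overlap-add buffering in
// real_time_processing_tf_lite.py. processBlock is a hypothetical stand-in
// for the FFT + TFLite inference step; only the buffering is shown here.
const val BLOCK_LEN = 512
const val BLOCK_SHIFT = 128

fun denoise(audio: FloatArray, processBlock: (FloatArray) -> FloatArray): FloatArray {
    val output = FloatArray(audio.size)
    val inBuffer = FloatArray(BLOCK_LEN)
    val outBuffer = FloatArray(BLOCK_LEN)
    val numBlocks = (audio.size - (BLOCK_LEN - BLOCK_SHIFT)) / BLOCK_SHIFT
    for (idx in 0 until numBlocks) {
        // Slide the input buffer left by BLOCK_SHIFT and append the new samples.
        System.arraycopy(inBuffer, BLOCK_SHIFT, inBuffer, 0, BLOCK_LEN - BLOCK_SHIFT)
        System.arraycopy(audio, idx * BLOCK_SHIFT, inBuffer, BLOCK_LEN - BLOCK_SHIFT, BLOCK_SHIFT)
        // FFT + model inference on the current 512-sample block (not shown).
        val estimated = processBlock(inBuffer)
        // Overlap-add the estimated block into the output buffer.
        System.arraycopy(outBuffer, BLOCK_SHIFT, outBuffer, 0, BLOCK_LEN - BLOCK_SHIFT)
        outBuffer.fill(0f, BLOCK_LEN - BLOCK_SHIFT, BLOCK_LEN)
        for (i in 0 until BLOCK_LEN) outBuffer[i] += estimated[i]
        // The first BLOCK_SHIFT samples are now complete and can be emitted.
        System.arraycopy(outBuffer, 0, output, idx * BLOCK_SHIFT, BLOCK_SHIFT)
    }
    return output
}
```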

Have you open sourced or uploaded your version anywhere? Honestly, it would help me a lot at this point 😅.

mcig avatar May 02 '23 09:05 mcig

Hello Mustafa, I didn't complete the code either, but I created a tflite model for the FFT calculations. You can find more information by searching for "create a TFLite model from one or more concrete functions in TensorFlow". This way you can do the numpy-style operations inside a tflite model. I hope this helps.
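For completeness, a rough sketch of what consuming such a helper model could look like on the Kotlin side, assuming an rfft concrete function was exported to a file named fft.tflite and returns the real and imaginary parts as two float outputs; the file name, shapes, and output layout are all assumptions about the export, not something shown in this thread:

```kotlin
import org.tensorflow.lite.Interpreter
import java.io.File
import kotlin.math.sqrt

// Hypothetical consumer of a small "fft.tflite" model exported in Python from
// a tf.signal.rfft concrete function, used here to get the magnitude spectrum
// of one 512-sample block (257 bins).
fun rfftMagnitude(block: FloatArray, modelFile: File): FloatArray {
    val interpreter = Interpreter(modelFile)
    try {
        val input = arrayOf(block)                  // shape [1, 512]
        val real = Array(1) { FloatArray(257) }     // assumed output 0: real part
        val imag = Array(1) { FloatArray(257) }     // assumed output 1: imaginary part
        interpreter.runForMultipleInputsOutputs(arrayOf<Any>(input), mapOf(0 to real, 1 to imag))
        return FloatArray(257) { i -> sqrt(real[0][i] * real[0][i] + imag[0][i] * imag[0][i]) }
    } finally {
        interpreter.close()
    }
}
```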

zeynepgulhanuslu avatar May 02 '23 11:05 zeynepgulhanuslu