Empty Output on iOS with “TensorFlowLiteTaskAudio 0.4.3” and “TensorFlow Lite Model Maker Model”
TLDR:
On iOS, calling `classify` (i.e. `let results = try classifier.classify(audioTensor: inputAudioTensor)`) on a model created with "TensorFlow Lite Model Maker" leaves `results.classifications[1].categories` always empty (in my case it was expected to contain up to 5 values).
I think there's a bug in the pod "TensorFlowLiteTaskAudio", because the same model works as expected on Android.
Sample repository to reproduce this bug: https://github.com/ghashi/tensor-flow-lite-task-audio-poc
Long version
I created a model using "TensorFlow Lite Model Maker"
I followed this tutorial https://www.tensorflow.org/lite/models/modify/model_maker/audio_classification and created a .tflite model that recognizes 5 kinds of bird songs. Here are the versions I used to create it:
- Python: 3.7
- NumPy: 1.21.6
- TensorFlow: 2.9.3
- Model Maker: 0.4.2 (and 0.3.4)
- portaudio: stable 19.7.0
Then, I integrated this model into an Android and an iOS application, using the TensorFlow "audio classification" demo repositories as a starting point.
Android works ✅
The Android app works fine. In other words, when I execute

```kotlin
tensorAudio.load(recorder)
val output = classifier.classify(tensorAudio)
```
then:
- `output[0].categories` has a maximum of 521 values (as expected)
- `output[1].categories` has a maximum of 5 values (as expected)
The image below shows a screenshot of the app (left) and a debugging session (right) with the values mentioned above.
iOS doesn't work ❌
The iOS app doesn't work as expected. In other words, when I execute

```swift
try inputAudioTensor.load(audioRecord: audioRecord)
let results = try classifier.classify(audioTensor: inputAudioTensor)
```
then:
- `results.classifications[0].categories` has a maximum of 5 values (the expected value is 521)
- `results.classifications[1].categories` is always empty (the expected value is 5)
The image below shows a screenshot of the app (left) and a debugging session (right) with the values mentioned above.
ps1: I'm using TensorFlowLiteTaskAudio 0.4.3
ps2: the iOS app works fine when using the default yamnet model. You can see in the image below that results.classifications[0].categories
has a maximum of 521 values (as expected)
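For context, here is a minimal sketch of the iOS code path where I see the problem, based on the calls shown above and the Task Library Swift API (the model file name `birds_model` is hypothetical):

```swift
import TensorFlowLiteTaskAudio

func runClassification() throws {
    // "birds_model" is a hypothetical file name for the Model Maker model.
    guard let modelPath = Bundle.main.path(forResource: "birds_model", ofType: "tflite") else {
        fatalError("model file not found in the app bundle")
    }

    let options = AudioClassifierOptions(modelPath: modelPath)
    let classifier = try AudioClassifier.classifier(options: options)

    // Record a buffer of audio and feed it to the input tensor.
    let inputAudioTensor = classifier.createInputAudioTensor()
    let audioRecord = try classifier.createAudioRecord()
    try audioRecord.startRecording()
    try inputAudioTensor.load(audioRecord: audioRecord)

    // The Model Maker model has two output heads: head 0 should carry
    // the 521 YAMNet classes and head 1 the 5 custom bird classes.
    let results = try classifier.classify(audioTensor: inputAudioTensor)
    for classifications in results.classifications {
        print("head \(classifications.headIndex): \(classifications.categories.count) categories")
    }
}
```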
Question
Is there a bug in the pod "TensorFlowLiteTaskAudio"? Or is there some special configuration that should be done on iOS when integrating a model generated by "TensorFlow Lite Model Maker"? Or should I change something when generating the .tflite model for iOS?
Steps to Reproduce
- create a model using "TensorFlow Lite Model Maker"
- integrate this model into an iOS app
- run the app
- the model is expected to return some results, but `results.classifications[1].categories` always comes back empty
ps: I created a repository to help you reproduce this behavior. It shows:
- how the .tflite model is created
- how it is integrated into the TensorFlow iOS "audio classification" demo code
- how it is integrated into the TensorFlow Android "audio classification" demo code
This is the link: https://github.com/ghashi/tensor-flow-lite-task-audio-poc . If you want to use it:
- clone it
- create your own model using `model-generator/model_maker_audio_classification-toreport.ipynb`, or just skip this step
- run the iOS app (folder `ios`)
- select "birds"
- debug line 162 of `ios/AudioClassification/TFLite/AudioClassificationHelper.swift`
- check the variable `results` in the debugger (see the sketch below)
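Roughly, this is what I evaluate at that breakpoint (a sketch; `results` is the value returned by `classifier.classify(audioTensor:)` on that line):

```swift
// Evaluated at the breakpoint on line 162 (sketch).
print(results.classifications[0].categories.count)  // prints 5 on iOS (expected 521)
print(results.classifications[1].categories.count)  // prints 0 on iOS (expected 5)
```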
@priankakariatyml Can you take a look when you have time?