dialogflow-unity-client
Microphone listen not working and no errors
Hi all, Send Text works properly (with both the default key and my client one), but I'm not able to make it work using the microphone. Once I press "Stop Listen", this error is logged:
NullReferenceException: Object reference not set to an instance of an object
ApiAiSDK.Http.MultipartHttpClient.getResponse ()
ApiAiSDK.AIDataService.VoiceRequest (System.IO.Stream voiceStream, ApiAiSDK.RequestExtras requestExtras)
Rethrow as AIServiceException: Object reference not set to an instance of an object
ApiAiSDK.AIDataService.VoiceRequest (System.IO.Stream voiceStream, ApiAiSDK.RequestExtras requestExtras)
ApiAiSDK.ApiAi.VoiceRequest (System.IO.Stream voiceStream, ApiAiSDK.RequestExtras requestExtras)
ApiAiSDK.ApiAi.VoiceRequest (System.Single[] samples)
UnityEngine.Debug:LogException(Exception)
<HandleOnError>c__AnonStorey2:<>m__0() (at Assets/Examples/ApiAiModule.cs:99)
ApiAiModule:Update() (at Assets/Examples/ApiAiModule.cs:116)
I'm quite sure the microphone is recording, because I can print Microphone.IsRecording(null) every frame and it is true. I even tried speaking only when Microphone.GetPosition(null) equals 0. My microphone works (tested with the API.ai web interface) and I have no problems with access authorization. I'm using Unity 2017.1.0f3 for a desktop app.
Help? The API.ai Unity forum is deserted.
Here is another kind of log:
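For reference, a minimal sketch of the per-frame check described above, assuming the default microphone device and a looping 10-second, 16 kHz buffer (the actual ApiAiModule may record with different settings):

using UnityEngine;

// Logs the default microphone's recording state and read position every frame,
// the same checks described in this report.
public class MicDebug : MonoBehaviour
{
    private AudioClip clip;

    void Start()
    {
        // null = default microphone device; loop for 10 seconds at 16 kHz
        clip = Microphone.Start(null, true, 10, 16000);
    }

    void Update()
    {
        Debug.LogFormat("recording: {0}, position: {1}",
            Microphone.IsRecording(null), Microphone.GetPosition(null));
    }
}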
ObjectDisposedException: The object was used after being disposed.
System.Net.WebConnection.BeginWrite (System.Net.HttpWebRequest request, System.Byte[] buffer, Int32 offset, Int32 size, System.AsyncCallback cb, System.Object state)
System.Net.WebConnectionStream.BeginWrite (System.Byte[] buffer, Int32 offset, Int32 size, System.AsyncCallback cb, System.Object state)
System.Net.WebConnectionStream.Write (System.Byte[] buffer, Int32 offset, Int32 size)
System.IO.BinaryWriter.Write (System.Byte[] buffer, Int32 index, Int32 count) (at /Users/builduser/buildslave/mono/build/mcs/class/corlib/System.IO/BinaryWriter.cs:140)
ApiAiSDK.Http.MultipartHttpClient.addFilePart (System.String paramName, System.String fileName, System.IO.Stream data)
ApiAiSDK.AIDataService.VoiceRequest (System.IO.Stream voiceStream, ApiAiSDK.RequestExtras requestExtras)
Rethrow as AIServiceException: The object was used after being disposed.
ApiAiSDK.AIDataService.VoiceRequest (System.IO.Stream voiceStream, ApiAiSDK.RequestExtras requestExtras)
ApiAiSDK.ApiAi.VoiceRequest (System.IO.Stream voiceStream, ApiAiSDK.RequestExtras requestExtras)
ApiAiSDK.ApiAi.VoiceRequest (System.Single[] samples)
UnityEngine.Debug:LogException(Exception)
<HandleOnError>c__AnonStorey2:<>m__0() (at Assets/Examples/ApiAiModule.cs:99)
ApiAiModule:Update() (at Assets/Examples/ApiAiModule.cs:116)
Is there any update on this?? @xVir @matthewayne
It turns out that API.ai no longer supports automatic speech recognition (ASR) and text-to-speech (TTS) solutions as of May 29th, 2017:
https://github.com/api-ai/apiai-xamarin-client/issues/5#issuecomment-324031772
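Since TextRequest still works, one possible workaround on a Windows desktop build is to run speech recognition locally and only send text to API.ai. Below is a hedged sketch using Unity's DictationRecognizer; the ApiAi setup mirrors the Examples/ApiAiModule.cs pattern, but the access token is a placeholder and the exact initialization may differ in your project:

using UnityEngine;
using UnityEngine.Windows.Speech;
using ApiAiSDK;

// Recognize speech locally (Windows dictation) and send the resulting text
// through the SDK's TextRequest instead of streaming raw audio.
public class LocalDictationToApiAi : MonoBehaviour
{
    private ApiAi apiAi;
    private DictationRecognizer dictation;

    void Start()
    {
        // Placeholder token; configure as in Examples/ApiAiModule.cs.
        var config = new AIConfiguration("YOUR_CLIENT_ACCESS_TOKEN", SupportedLanguage.English);
        apiAi = new ApiAi(config);

        dictation = new DictationRecognizer();
        dictation.DictationResult += (text, confidence) =>
        {
            // Send the locally recognized text instead of the voice stream.
            var response = apiAi.TextRequest(text);
            Debug.Log(response.Result.Fulfillment.Speech);
        };
        dictation.Start();
    }

    void OnDestroy()
    {
        if (dictation != null)
        {
            dictation.Dispose();
        }
    }
}

Note that DictationRecognizer requires Windows 10 with dictation enabled, so this only covers the desktop case mentioned above.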