siriwave

Wave is not visible on iOS

Open Saintenr opened this issue 3 years ago • 4 comments

Hello kopiro and team,

First of all, I would like to say thank you for setting up such a great project.

Good work. I'm already looking forward to the new version.

Unfortunately, I have the problem that on iOS devices (no matter which browser is used) the Siri wave is not displayed. The div is rendered, but the line does not move.

I also don't get any errors in the developer console.

Any idea what this could be?

Saintenr avatar Apr 29 '21 09:04 Saintenr

Hej! Does the same page render correctly on desktop?

kopiro avatar Apr 29 '21 09:04 kopiro

Hey,

It works perfectly on desktop. It also works without problems on Android.


On the iPhone or iPad, only the grey div and a flat line are displayed, but no waves.

For better understanding, here are my code blocks (sorry for the bad format):

```ts
navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then((stream) => {
  var siriWave = this.createVoiceVisualization();
  this.styleVoiceVisualization(siriWave, stream, context);
});
```

```ts
createVoiceVisualization() {
  var siriWave = new SiriWave({
    container: document.getElementById("fi-wave-container"),
    width: 250,
    height: 75,
    style: "ios9",
    speed: 0.0,
    amplitude: 0.0,
    // autostart: true
  });
  return siriWave;
}
```

```ts
public styleVoiceVisualization(siriWave: SiriWave, stream: MediaStream, context) {
  try {
    console.log("style voice inner try{}");
    console.log(stream);

    // Create a source node for the microphone input.
    let source = context.createMediaStreamSource(stream);

    // Create a processor node.
    let processor = context.createScriptProcessor(1024, 1, 1);

    // Create an analyser node with an fftSize of 4096.
    let analyser = context.createAnalyser();
    analyser.fftSize = 4096;

    // Array for the frequency data.
    let myDataArray = new Float32Array(analyser.frequencyBinCount);

    // Connect source -> analyser -> processor -> destination.
    source.connect(analyser);
    analyser.connect(processor);
    processor.connect(context.destination);

    // Start the SiriWave animation.
    siriWave.start();

    // Fires whenever a new block of audio data is available.
    processor.onaudioprocess = function (e) {
      let amplitude = 0;
      let frequency = 0;

      // Copy the current frequency data from the analyser.
      analyser.getFloatFrequencyData(myDataArray);

      // Find the highest frequency bin that is louder than -100 dB.
      myDataArray.forEach((givenFrequencyDB, index) => {
        if (givenFrequencyDB > -100) {
          frequency = Math.max(index, frequency);
        }
      });

      // Convert the bin index to Hz (sampleRate / fftSize ≈ 11.7185 Hz per bin
      // at 48 kHz) and normalize by the Nyquist frequency (24 kHz) to set the speed.
      frequency = ((1 + frequency) * 11.7185) / 24000;
      siriWave.setSpeed(frequency);

      // Find the peak amplitude in this block and scale it up.
      e.inputBuffer.getChannelData(0).forEach((item) => {
        amplitude = Math.max(amplitude, Math.abs(item));
      });
      amplitude = amplitude * 17;

      // Clamp to the range [1, 3] so the wave neither vanishes nor explodes.
      if (amplitude < 1 && amplitude > 0.1) {
        amplitude = 1;
      }
      if (amplitude > 3) {
        amplitude = 3;
      }

      // amplitude is always >= 0 here, so it can be set directly.
      siriWave.setAmplitude(amplitude);
    };
  } catch (e) {
    console.log(e);
  }
}
```

Saintenr avatar Apr 29 '21 10:04 Saintenr

Hey, sorry, I found out it's not a problem with SiriWave, but maybe someone can help me anyway (if not, just close the issue :( ).

I think it's because I want to use the MediaStream in two functions.

Since Apple doesn't allow this, iOS mutes the "first" stream, which is the one feeding my SiriWave processing.

The stream is then sent to Watson Speech to Text. That is the second stream, which gets marked as active, and that is why the first one is muted.

So it's an Apple thing, but maybe someone has an idea how I can combine the two?

Thank you very much,

cheers Saintenr

Saintenr avatar May 05 '21 05:05 Saintenr
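One way to avoid the second `getUserMedia()` call entirely is to request the microphone once and hand each consumer its own `MediaStream` via `clone()`, so only a single capture session is ever active. A minimal sketch, not tested on iOS; `fanOutStream`, `visualize`, and `sendToWatson` are hypothetical names, not part of SiriWave or the Watson client:

```javascript
// Hypothetical helper: hand one captured stream out to several consumers.
// The first consumer gets the original MediaStream; each additional consumer
// gets a clone() backed by the same tracks, so the browser sees only one
// active microphone capture.
function fanOutStream(stream, consumers) {
  return consumers.map((use, i) => {
    const s = i === 0 ? stream : stream.clone();
    use(s);
    return s;
  });
}

// Browser usage (visualize/sendToWatson stand in for the SiriWave setup and
// the Watson Speech to Text client from the snippets above):
//
// navigator.mediaDevices.getUserMedia({ audio: true, video: false })
//   .then((stream) => {
//     const [waveStream, sttStream] =
//       fanOutStream(stream, [visualize, sendToWatson]);
//   });
```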


Thanks for sharing this! In my case, I moved the code out of the function that loads and plays, and it started working again.

acosme avatar Apr 12 '24 17:04 acosme
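One reading of the fix above is that long-lived audio objects should be created once, outside the function that loads and plays, and reused afterwards. A sketch under that assumption (`getAudioContext` is a hypothetical helper, not from the thread):

```javascript
// Assumed interpretation: create the AudioContext once at module scope and
// reuse it, instead of constructing a new one on every playback. Safari caps
// how many AudioContexts a page may create, so repeated construction inside
// a play handler can silently stop producing sound.
let audioContext = null;

function getAudioContext() {
  if (audioContext === null) {
    const Ctx = window.AudioContext || window.webkitAudioContext;
    audioContext = new Ctx();
  }
  return audioContext;
}
```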