guest271314

Results: 602 comments of guest271314

You can slice off the first 44 bytes (the WAV header) from each discrete WAV file, then concatenate the remaining raw PCM into a single file. In brief, see https://github.com/guest271314/audioInputToWav and https://github.com/guest271314/AudioWorkletStream/blob/master/audioWorklet.js#L19.
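A minimal sketch of that approach, assuming each chunk is a complete WAV file held in an `ArrayBuffer` and that all chunks share the same sample rate, bit depth and channel count (the function name is illustrative, not from the linked repositories):

```
// Strip the 44-byte RIFF/WAVE header from each chunk and join the raw PCM.
function concatenateWavChunks(arrayBuffers) {
  const pcmParts = arrayBuffers.map((buffer) => new Uint8Array(buffer, 44));
  const totalLength = pcmParts.reduce((length, part) => length + part.length, 0);
  const pcm = new Uint8Array(totalLength);
  let offset = 0;
  for (const part of pcmParts) {
    pcm.set(part, offset);
    offset += part.length;
  }
  // Headerless PCM; prepend a single WAV header to make one playable file.
  return pcm;
}
```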

To play the audio at an `HTMLMediaElement` without creating a discrete WAV file you can first convert the 16-bit PCM to floating point, for example

```
function int16ToFloat32(inputArray) {
  const output = new Float32Array(inputArray.length);
  for (let i = 0; i < inputArray.length; i++) {
    output[i] = inputArray[i] / 0x8000;
  }
  return output;
}
```
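The converted samples can then be routed to an `<audio>` element through a `MediaStreamAudioDestinationNode`; a rough sketch, assuming `float32Samples` is the output of `int16ToFloat32()` and was recorded at the context's sample rate:

```
const audioContext = new AudioContext();
// Copy the converted samples into a one-channel AudioBuffer.
const audioBuffer = audioContext.createBuffer(1, float32Samples.length, audioContext.sampleRate);
audioBuffer.copyToChannel(float32Samples, 0);
const source = new AudioBufferSourceNode(audioContext, { buffer: audioBuffer });
// Route the graph into a MediaStream that an <audio> element can play.
const destination = audioContext.createMediaStreamDestination();
source.connect(destination);
const audio = new Audio();
audio.srcObject = destination.stream;
audio.play(); // may require a prior user gesture due to autoplay policies
source.start();
```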

See also [merging / layering multiple ArrayBuffers into one AudioBuffer using Web Audio API](https://stackoverflow.com/a/18920291)

A WAV file can be created from each 'chunk'; see https://github.com/guest271314/audioInputToWav/blob/master/array-typedarray-with-audio-context.html.
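The general shape of that conversion is writing a 44-byte RIFF header in front of the PCM data; a minimal sketch for 16-bit PCM (not taken from the linked file, parameter names are illustrative):

```
// Prepend a 44-byte RIFF/WAVE header to 16-bit PCM samples and return a Blob.
function encodeWav(int16Samples, sampleRate = 44100, numChannels = 1) {
  const bytesPerSample = 2;
  const dataSize = int16Samples.length * bytesPerSample;
  const buffer = new ArrayBuffer(44 + dataSize);
  const view = new DataView(buffer);
  const writeString = (offset, text) => {
    for (let i = 0; i < text.length; i++) view.setUint8(offset + i, text.charCodeAt(i));
  };
  writeString(0, 'RIFF');
  view.setUint32(4, 36 + dataSize, true); // RIFF chunk size
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  view.setUint32(16, 16, true); // fmt chunk size
  view.setUint16(20, 1, true); // PCM format
  view.setUint16(22, numChannels, true);
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * numChannels * bytesPerSample, true); // byte rate
  view.setUint16(32, numChannels * bytesPerSample, true); // block align
  view.setUint16(34, 16, true); // bits per sample
  writeString(36, 'data');
  view.setUint32(40, dataSize, true);
  new Int16Array(buffer, 44).set(int16Samples);
  return new Blob([buffer], { type: 'audio/wav' });
}
```

`URL.createObjectURL()` on the resulting `Blob` can then be set as the `src` of an `<audio>` element.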

I used DevTools Local Overrides https://developers.google.com/web/updates/2018/01/devtools#overrides to work around this issue for `AudioContext.audioWorklet.addModule(url)`; see https://github.com/WebAudio/web-audio-api-v2/issues/109.

Found this while trying to convert a JavaScript `RegExp` to a `grep` pattern. It would be useful to be able to view a JavaScript `RegExp` in other languages' formats, if that is possible.

If you use `decodeAudioData()` first, you can get any part of the resulting `AudioBuffer` from any offset. If you do not split the file first you can use `AudioWorklet` or Media...
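A rough sketch of the `decodeAudioData()` route, with an illustrative URL and slice bounds, run inside an async function or module:

```
const audioContext = new AudioContext();
const response = await fetch('audio.wav'); // illustrative URL
const decoded = await audioContext.decodeAudioData(await response.arrayBuffer());
// For example, extract the samples between 1.5s and 3.0s of channel 0.
const channelData = decoded.getChannelData(0);
const slice = channelData.subarray(
  Math.floor(1.5 * decoded.sampleRate),
  Math.floor(3.0 * decoded.sampleRate)
);
```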

This https://github.com/guest271314/audioInputToWav (which is not fast) will create a `WAV` file from an `AudioBuffer`, or, when modified, from `Float32Array`(s). If you are trying to concatenate specific time slices of multiple audio...
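One way to combine specific time slices of several decoded buffers into one `AudioBuffer` is an `OfflineAudioContext`; a sketch assuming `buffers` is an array of already-decoded `AudioBuffer`s and the one-second slice length is illustrative:

```
// Render a slice of each decoded buffer back to back into one AudioBuffer.
async function concatenateSlices(buffers, sliceDuration = 1, sampleRate = 44100) {
  const offline = new OfflineAudioContext(
    1,
    Math.ceil(buffers.length * sliceDuration * sampleRate),
    sampleRate
  );
  buffers.forEach((buffer, index) => {
    const source = new AudioBufferSourceNode(offline, { buffer });
    source.connect(offline.destination);
    // start(when, offset, duration): schedule each slice at its own position.
    source.start(index * sliceDuration, 0, sliceDuration);
  });
  return offline.startRendering(); // resolves with the rendered AudioBuffer
}
```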

Can't you just send the data over a data channel?