
DSP graph should run in a different thread

Open sebpiq opened this issue 12 years ago • 10 comments

One idea is that nodes can be only proxies who send commands to the nodes in the DSP thread.
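
To make the proxy idea concrete, here is a minimal sketch (not the library's actual API; `GainNodeProxy`, `dsp-worker.js` and the message shapes are all hypothetical): the object that user code manipulates does no audio work itself and only relays commands to a forked rendering process.

```js
// Hypothetical sketch: the "node" seen by user code is only a proxy that
// forwards commands; the real DSP node would live in ./dsp-worker.js,
// a separate rendering process that understands these messages.
const { fork } = require('child_process');

const dspProcess = fork('./dsp-worker.js');

let nextNodeId = 0;

class GainNodeProxy {
  constructor() {
    this.id = nextNodeId++;
    dspProcess.send({ op: 'create', type: 'GainNode', id: this.id });
  }

  set gain(value) {
    // No DSP happens here; the command is relayed to the rendering process.
    dspProcess.send({ op: 'setParam', id: this.id, param: 'gain', value });
  }

  connect(other) {
    dspProcess.send({ op: 'connect', from: this.id, to: other.id });
  }
}
```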

sebpiq avatar Aug 16 '13 21:08 sebpiq

This will be hard, as Node doesn't really support threading or forking. Objects sent between two processes are all serialized and deserialized, which is not acceptable in our case.
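
For reference, the default `child_process` IPC channel serializes messages through JSON, so a typed array never survives the trip; the snippet below just mirrors that round trip locally to show what a `child.send(...)` message turns into.

```js
// What the default (JSON-based) IPC serialization does to a Float32Array:
// it arrives on the other side as a plain object, and the data is a full copy.
const samples = new Float32Array([0.1, 0.2, 0.3]);

// Equivalent to what child.send({ samples }) does under the hood by default.
const received = JSON.parse(JSON.stringify({ samples }));

console.log(received.samples instanceof Float32Array); // false
console.log(received.samples); // { '0': 0.100000..., '1': 0.200000..., '2': 0.300000... }
```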

sebpiq avatar Sep 10 '13 12:09 sebpiq

http://neilk.net/blog/2013/04/30/why-you-should-use-nodejs-for-CPU-bound-tasks/

sebpiq avatar Sep 10 '13 12:09 sebpiq

Try and benchmark solutions for avoiding buffer copies between processes

https://github.com/3rd-Eden/node-memcached
https://github.com/supipd/node-shm
https://github.com/kazupon/node-mappedbuffer
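
As a baseline for such a benchmark, one simple measurement is the round-trip time of a large buffer over plain `child_process` IPC; shared-memory approaches would then be compared against that number. This is only a sketch, and it assumes a hypothetical `echo-worker.js` that does nothing but echo messages back.

```js
// Baseline: time a ~10 s mono buffer doing a round trip over JSON-based IPC.
// echo-worker.js (assumed): process.on('message', (m) => process.send(m));
const { fork } = require('child_process');

const child = fork('./echo-worker.js');
const FRAMES = 44100 * 10;
const samples = Array.from(new Float32Array(FRAMES)); // plain array so JSON keeps it an array

const start = process.hrtime();
child.send({ samples });

child.on('message', () => {
  const [s, ns] = process.hrtime(start);
  console.log(`round trip for ${FRAMES} frames: ${(s * 1e3 + ns / 1e6).toFixed(1)} ms`);
  child.disconnect();
});
```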

sebpiq avatar Sep 19 '13 06:09 sebpiq

This is one of the major shortcomings of the Web Audio API as it currently stands. You need something like shared memory and/or locks to actually implement the Web Audio API. You can, however, implement most of the important parts without shared memory or running in the main thread. One should hardly ever be transferring big buffers between the main and audio threads. The problematic parts are when you are getting stuff from the audio thread, since you can't make it synchronous without using some native addon.

I hope we can fix the situation sooner rather than later on the spec side.

jussi-kalliokoski avatar Sep 20 '13 06:09 jussi-kalliokoski

Hmm ... I don't really understand why AudioBuffer#getChannelData() causes a problem. Could you explain it to me? Actually, thinking about it, I don't see any operation between the audio thread and the main thread that needs to be synchronous. Is there something I forgot?

My concern here is more the copying of big buffers between processes, which happens for example when using AudioBufferSourceNode, or when decoding some audio. And in fact, that's not even a blocking issue, as it can be implemented simply with copies ... which is really, really inefficient, but works (probably).

sebpiq avatar Sep 20 '13 07:09 sebpiq

> Objects sent between two processes are all serialized and deserialized, which is not acceptable in our case.

In the context of Node.js, the inter-process memory bandwidth is unlikely to be a bottleneck. (See https://gist.github.com/srikumarks/6180450) It is, in other words, acceptable to have all the audio rendering happen in another process. This is, btw, the architecture of SuperCollider.

srikumarks avatar Sep 20 '13 07:09 srikumarks

Yes ... but in SuperCollider, buffers are allocated server-side. The client-side Buffer instance is merely a proxy for a selected set of operations. For example, you don't have direct access to the buffer data. You can set/get values, but (being a SuperCollider user myself) I never had to do that. Buffer API in the SC docs: http://doc.sccode.org/Classes/Buffer.html
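
Roughly, something like the following hypothetical sketch, modelled on SC's client-side Buffer (none of these names, message shapes or the `dspProcess` handle are anyone's actual API): the data lives in the DSP process, and the client object only holds an id and sends commands.

```js
// Hypothetical client-side handle: all sample data lives in the DSP process.
let nextBufferId = 0;
const pendingGets = new Map();

class BufferProxy {
  constructor(dspProcess, numFrames, numChannels = 1) {
    this.id = nextBufferId++;
    this.dsp = dspProcess;
    // Allocation happens "server-side", like SC's Buffer.alloc.
    this.dsp.send({ op: 'allocBuffer', id: this.id, numFrames, numChannels });
  }

  // Like SC's Buffer.set: write a single value, no bulk data transfer.
  set(index, value) {
    this.dsp.send({ op: 'setSample', id: this.id, index, value });
  }

  // Like SC's Buffer.get: reading is asynchronous.
  get(index, callback) {
    pendingGets.set(`${this.id}:${index}`, callback);
    this.dsp.send({ op: 'getSample', id: this.id, index });
  }
}

// Wire up replies from the DSP process so get() callbacks can fire.
function handleDspMessages(dspProcess) {
  dspProcess.on('message', (msg) => {
    if (msg.op === 'gotSample') {
      const cb = pendingGets.get(`${msg.id}:${msg.index}`);
      if (cb) {
        pendingGets.delete(`${msg.id}:${msg.index}`);
        cb(msg.value);
      }
    }
  });
}
```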

Here the size of data I'm thinking of is not 4096 frames but more like 13,230,000 frames (the number of frames in a 5-minute sound file at a 44100 Hz sample rate), so it's another order of magnitude! That's why I think it's not a good idea to copy between processes.

I think I will try putting all the buffers in shared memory, as suggested in https://wiki.mozilla.org/User:Roc/AudioBufferProposal#Implementation_Sketch. The small problem is that there is no established solution for shared memory in Node.js ...

sebpiq avatar Sep 20 '13 07:09 sebpiq

To me this is the single most important issue before development can progress on the library. The architecture should be right, and I think it won't be right if audio can't run in a separate thread.

sebpiq avatar Dec 15 '14 09:12 sebpiq

The right solution here is to make it possible to run the DSP code in another thread in JS. That's how Chrome, Safari and Firefox do it, and it's the only way to get decent latency. Maybe salvaging https://github.com/audreyt/node-webworker-threads to make it work with an audio thread would be a good start? Or maybe the goal here is to have no native code?

When you have threading working in your Node process, you can then implement efficient buffer transfers (the way roc outlines it in the document you linked).
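
For illustration, a minimal sketch of that setup with current Node.js `worker_threads` (which postdates this comment; `dsp-thread.js` and its contents are assumed): the DSP loop runs in a separate JS thread, and the sample data is shared rather than copied.

```js
// main.js — sketch only: share one buffer of samples with a DSP thread.
const { Worker } = require('worker_threads');

const FRAMES = 44100;
const shared = new SharedArrayBuffer(FRAMES * Float32Array.BYTES_PER_ELEMENT);
const samples = new Float32Array(shared);

// dsp-thread.js (assumed) would read the same memory, e.g.:
//   const { workerData } = require('worker_threads');
//   const samples = new Float32Array(workerData.shared);
//   ... then fill `samples` in its render loop; nothing is copied across the boundary.
const worker = new Worker('./dsp-thread.js', { workerData: { shared } });

worker.on('online', () => {
  console.log('DSP thread running; main thread sees the same memory:', samples.length, 'frames');
});
```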

Also, getChannelData is not an issue anymore (it was an issue when roc and Jussi were talking, but we fixed it, at both the implementation and spec level).

padenot avatar Feb 18 '15 15:02 padenot

I remember considering this solution but discarding it ... and I can't remember exactly why! I think there is no support for typed arrays in workers, but I'm not sure about that: https://github.com/audreyt/node-webworker-threads/issues/18

Also, it seems like Node.js clusters have come a long way since I last looked at them (Sept 2013!) and now you can fork ... which means that it would probably be the way to go. As you say @padenot, first take the audio processing out of the main thread, then optimize buffer transfers.

Also, I don't mind having native code there, as long as it can be replaced by something else in the browser.

sebpiq avatar Feb 18 '15 15:02 sebpiq