Visualization Example
In the buffered approach it was reasonably easy to show a decent graph of a buffer, but in the single-sample approach it seems quite difficult to get similar results. Currently, the included visualization example does not show the graph. Is there a working example of this? Or could you give more detail on the ramifications of the change to single-sample ticking that would allow us to graph it well?
Thanks, Elliott
Hey Elliott. Yeah, I forgot about this on moving over. Probably the solution for the moment is a new node which subclasses PassThroughNode and fills a fixed length buffer in its tick method. Then you should be able to use it in the same way as before, just grabbing data from the buffer and visualising it.
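Something along these lines might work as a starting point (just a rough, untested sketch: the node name and constructor arguments are invented, and reading the current input sample as this.inputs[0].samples[0] is an assumption about the single-sample API):
var VisualiserNode = function(audiolet, bufferLength) {
    // One input, one output, so it can sit anywhere in a chain
    PassThroughNode.call(this, audiolet, 1, 1);
    this.visBuffer = new Float32Array(bufferLength || 1024);
    this.writeIndex = 0;
};
extend(VisualiserNode, PassThroughNode);

VisualiserNode.prototype.tick = function() {
    // Run the normal pass-through tick first
    PassThroughNode.prototype.tick.call(this);
    // Then record the current sample of the first input channel into a
    // ring buffer (this.inputs[0].samples[0] is assumed here)
    this.visBuffer[this.writeIndex] = this.inputs[0].samples[0];
    this.writeIndex = (this.writeIndex + 1) % this.visBuffer.length;
};
You'd connect it inline between the node you want to watch and the output, and read visBuffer from your drawing loop.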
I can have a look at some point in the next week or so, but if you fancy having a shot at fixing it then feel free to ask any questions here, and chuck in a pull request. Thanks for the report, Joe
That makes sense. However, I'd like to be able to show a visualization for any arbitrary node. Short of adding a fixed length buffer back into every node purely for visualization, is there a sensible way to accomplish this? It may be that this is outside the general scope of the library, in which case I suppose I'll have to do it in a fork.
Thanks for the quick reply and any other info you can offer, Elliott
Other than adding a fixed-length buffer to each node I can't really think of how you could do this. It's a bit of an unfortunate consequence of shifting to single-sample ticking (although visualising the variable-length buffers wasn't ideal when working with blocks).
You can probably monkey-patch the behaviour into the base AudioletNode fairly simply though. So something like this would probably do what you want:
AudioletNode.prototype.tick = function() {
    // First three lines are the standard AudioletNode tick method
    this.createInputSamples();
    this.createOutputSamples();
    this.generate();
    // This is where we make sure each node stores a visualisation buffer
    if (!this.buffer) {
        // Create a fixed-length buffer to store the samples
        // (1024 is an arbitrary length)
        this.buffer = new Float32Array(1024);
        this.bufferIndex = 0;
    }
    // Push the current sample from the first output into the buffer,
    // wrapping around when it fills (assuming the current output sample
    // is available as this.outputs[0].samples[0])
    this.buffer[this.bufferIndex] = this.outputs[0].samples[0];
    this.bufferIndex = (this.bufferIndex + 1) % this.buffer.length;
};
Hope this helps, Joe
Okay, thanks. I'll give that a shot. I'd like to switch to single-sample so that I can use the FFT stuff when that's all wrapped up; I just can't lose the visualizations!
If you think it'll be useful for others, I can put up a pull request if I get it ironed out. If not, then I'll just keep a fork around.
Thanks again, Elliott
I think this is a fairly specific case (being able to simultaneously visualise all nodes), so it's probably best kept in a fork given the processing and memory overhead involved. I do think there should probably be a separate node for visualisation so we're not losing functionality, and I'll keep the bug open until that's in place. Cheers, Joe
Hey, I'm trying to add a visualizer to an Audiolet test, and I've reduced the code down to a simple sine wave, but I can't get it to visualize no matter what I try. I've looked at the example (it isn't working), tried the suggestions here, and also tried using Web Audio's real-time analyser like this:
analyzer = audiolet.output.device.sink._node.context.createAnalyser()
data = new Uint8Array(analyzer.frequencyBinCount)
analyzer.getByteFrequencyData(data)
// data is [0,0,0 ...]
What would be the best way to visualize Audiolet?
I forked and added a buffer for visualization to each node. You can look at the specifics here: https://github.com/ewdicus/Audiolet. Then I'm plotting it using Canvas and JavaScript.
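Something like this is the general idea for the Canvas side (a simplified sketch, not the exact code from the fork; it assumes a <canvas id="scope"> element and that node.visBuffer.channels[0] holds recent samples in the range -1 to 1):
// Draw one node's visualisation buffer as a waveform on a canvas
var canvas = document.getElementById('scope');
var context = canvas.getContext('2d');

var draw = function(node) {
    var samples = node.visBuffer.channels[0];
    context.clearRect(0, 0, canvas.width, canvas.height);
    context.beginPath();
    for (var i = 0; i < samples.length; i++) {
        var x = (i / samples.length) * canvas.width;
        // Map a sample in [-1, 1] to a y position on the canvas
        var y = (1 - samples[i]) * canvas.height / 2;
        if (i == 0) {
            context.moveTo(x, y);
        }
        else {
            context.lineTo(x, y);
        }
    }
    context.stroke();
    // Redraw on the next animation frame
    requestAnimationFrame(function() { draw(node); });
};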
great man! I'll use that. thanks!
I should clarify, I haven't updated the examples there. I've just made the changes to the code. If you need help figuring it out, let me know.
I have collected the vizData successfully with your branch, but my visualizer needs a lot of work, lolz thanks!
Just a quick note that this probably falls under the Audiolet 2 remit now, where you will be able to use the RealTimeAnalyserNode.
@oampo, ok, I'll give this a test run tomorrow...
@ewdicus so, I found a slight bug with your code...
(CoffeeScript syntax, contrived example)
sine_l = new Sine @audiolet, 440
sine_r = new Sine @audiolet, 440
pan_l = new Pan @audiolet, 0
pan_r = new Pan @audiolet, 1
sine_l.connect pan_l
sine_r.connect pan_r
vol_l = new Gain @audiolet, 0.1
vol_r = new Gain @audiolet, 0.1
pan_l.connect vol_l
pan_r.connect vol_r
vol_l.connect @audiolet.output
vol_r.connect @audiolet.output
vol_r.visBuffer.channels[0] will be correct; however, vol_l.visBuffer.channels[0] will be filled with basically zeros (very small floats).
I'm not sure what the problem could be, and I may not even be connecting things in Audiolet correctly, but either way I think it's time for me to call it a night... I'll look at it tomorrow. gg guys!
@heavyk The code to make it work in the version-2 branch isn't quite ready yet. I'll ping you when you can try it. Joe