
X-Y oscilloscope music

benbovy opened this issue 4 years ago • 5 comments

I'd love to be able to experiment with something like this (well, perhaps less ambitious animations) in a notebook at some point.

This would require some X-Y oscilloscope widget into which we could plug an ipytone audio node. I've found some examples, either using custom shaders (https://www.shadertoy.com/view/XttSzf, https://github.com/m1el/woscope) or based on the Canvas API (https://github.com/Sean-Bradley/Oscilloscope).

I guess it shouldn't be too hard to implement a basic version in ipytone, so that we could have a very flexible emulator (code + widgets). I have zero experience with shaders / the Canvas API, though.
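
For reference, the core idea is simply that the left channel drives the X deflection and the right channel drives the Y deflection. Something like this (plain numpy, nothing ipytone-specific, just an illustrative sketch) already gives a valid test signal that would trace a circle on such a scope:

```python
import numpy as np

sr = 44100                    # audio sample rate (Hz)
freq = 220.0                  # how many times per second the shape is traced
t = np.arange(sr) / sr        # one second of sample times

# Equal-frequency cosine/sine pair: left channel = X, right channel = Y,
# which draws a circle on an X-Y oscilloscope.
left = np.cos(2 * np.pi * freq * t)
right = np.sin(2 * np.pi * freq * t)
stereo = np.stack([left, right], axis=1)   # shape (n_samples, 2)
```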

With both ipytone and ipycanvas, I imagine a pipeline like: real-time drawing -> generate an audio signal from the drawing -> feed the X-Y oscilloscope emulator with the audio signal, like this: https://www.youtube.com/watch?v=AGeHwNEwbZk. :-)
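
As a rough sketch of the drawing -> audio step (names and parameters below are made up for illustration; this is not an existing ipytone API): resample the drawn polyline at audio rate by walking along its arc length, looping the whole shape a few tens or hundreds of times per second:

```python
import numpy as np

def path_to_stereo(points, sr=44100, trace_hz=100.0, seconds=2.0):
    """Turn an (N, 2) polyline of canvas coordinates into an (M, 2) stereo buffer."""
    pts = np.asarray(points, dtype=float)

    # Normalize canvas coordinates to [-1, 1]; flip Y (canvas Y grows downward).
    span = np.ptp(pts, axis=0)
    span[span == 0] = 1.0
    pts = 2 * (pts - pts.min(axis=0)) / span - 1
    pts[:, 1] *= -1

    # Parametrize the path by normalized cumulative arc length.
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    s /= s[-1]

    # Trace the whole shape `trace_hz` times per second.
    n = int(sr * seconds)
    phase = (np.arange(n) * trace_hz / sr) % 1.0
    x = np.interp(phase, s, pts[:, 0])
    y = np.interp(phase, s, pts[:, 1])
    return np.stack([x, y], axis=1)   # left channel = X, right channel = Y
```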

Cc @martinRenou (any thoughts / suggestions on the X-Y oscilloscope widget would be really helpful!).

benbovy, Mar 18 '21 14:03

Not sure it's 100% relevant, but could be interesting: https://github.com/meyda/meyda https://github.com/vcync/modV

davidbrochart, Mar 18 '21 14:03

My two cents: I wouldn't use ipycanvas for that, for performance reasons. You'd get way better performance using the Web canvas API directly in JavaScript.

martinRenou, Mar 18 '21 14:03

Not sure it's 100% relevant, but could be interesting: https://github.com/meyda/meyda https://github.com/vcync/modV

Hmm, modV looks like a full application rather than a library? I'm not sure how meyda would be useful here, as it goes in the opposite direction of ipytone (sound/music -> parameters rather than parameters -> sound/music), and there are similar libraries in Python such as librosa... Maybe for real-time analysis of an imported audio track, so that we could programmatically generate some smart overdubs with ipytone...?

My two cents: I wouldn't use ipycanvas for that, for performance reasons. You'd get way better performance using the Web canvas API directly in JavaScript.

Yes, I was rather thinking of using ipycanvas to draw a shape "by hand", grab the canvas image, and do some computation on the Python side to convert it into an "X-Y-oscilloscope-compatible" audio signal generated with ipytone... However, there's some way to go before even trying something like that!
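
A possible sketch for the image -> path part, assuming the drawing can be pulled back to Python as a 2D grayscale numpy array (the exact ipycanvas call for that is an assumption here), using scikit-image's contour tracing; the resulting path could then go through the resampling sketch above:

```python
import numpy as np
from skimage import measure

def image_to_path(img, level=0.5):
    """Trace the longest contour of a grayscale drawing as an (N, 2) point path."""
    # `img` is assumed to be a 2D float array in [0, 1] grabbed from the canvas.
    contours = measure.find_contours(img, level)
    contour = max(contours, key=len)       # keep the longest outline
    # find_contours returns (row, col) pairs; swap to (x, y) canvas order.
    return contour[:, ::-1]
```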

Do you know if the Canvas API is performant enough for responsive drawing at the audio sample rate?

benbovy, Mar 18 '21 15:03

Do you know if the Canvas API is performant enough for responsive drawing at the audio sample rate?

The Canvas API is quite fast; as long as you don't draw tens of thousands of shapes and you use the API wisely, it should be fine. Of course, WebGL shaders are faster, but they are also a lot more complicated.

I am not an audio expert: what is the order of magnitude of the audio rate? I am not sure you want to match the audio frequency; you'll be limited by your screen refresh rate anyway (60 fps on my laptop). You can definitely make 60 fps animations with a 2D Web canvas.

martinRenou, Mar 18 '21 15:03

I am not an audio expert: what is the order of magnitude of the audio rate? I am not sure you want to match the audio frequency; you'll be limited by your screen refresh rate anyway (60 fps on my laptop). You can definitely make 60 fps animations with a 2D Web canvas.

Sorry, my comment was dumb (the typical audio rate is 44.1 kHz vs. a 60 Hz screen refresh rate, haha).
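
Just to put a number on it: at a 60 fps refresh rate each frame only covers a few hundred audio samples, which should be well within what the 2D canvas can draw as line segments:

```python
sr = 44100                  # audio sample rate (Hz)
fps = 60                    # screen refresh rate (Hz)
samples_per_frame = sr // fps
print(samples_per_frame)    # 735 samples, i.e. at most ~735 segments per frame
```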

The Canvas API is quite fast; as long as you don't draw tens of thousands of shapes and you use the API wisely, it should be fine. Of course, WebGL shaders are faster, but they are also a lot more complicated.

It'll be wiser to start with the Canvas API, then, for a basic rendering (without trying to emulate the nice look of an analog oscilloscope). Thanks!

benbovy, Mar 18 '21 16:03