violetmage
I think magicgoose is more concerned with the synchronization of when a sample of the spectrum is displayed and when the corresponding sound reaches their ears. It seems to me...
No, that's exactly the point, it's a system-specific thing. Sometimes output devices have quite significant latency that we can't do anything about (Bluetooth, etc.). The idea is to give...
I always thought that the synchronization was only automatic for the audio side of things, and that if you're trying to synchronize other stuff with the audio (like a video...
Yes, but the part you've forgotten is that while the *source player* may be synchronized between its audio and video, since we passed along the latency, and maybe even our...
Yes, I am talking about the second situation you described. All that would be necessary is adding either a delay to the drawing of the spectrum, or simply a delay...
Couldn't there be a FIFO queue added somewhere around this bit instead? [https://github.com/wwmm/easyeffects/blob/master/src/effects_box.cpp#L344](https://github.com/wwmm/easyeffects/blob/master/src/effects_box.cpp#L344) Edit: you could alternatively make the queue out of ui::chart instead of spectrum_mag, but I'm assuming spectrum_mag...
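Something like this is roughly what I have in mind (just a sketch; `SpectrumFifo`, `latency_ms` and `frame_interval_ms` are hypothetical names, not anything that exists in effects_box.cpp):

```cpp
// Rough sketch only: delay spectrum magnitude frames by holding them in a
// FIFO until enough frames have accumulated to cover the output latency.
#include <cstddef>
#include <deque>
#include <optional>
#include <vector>

class SpectrumFifo {
 public:
  SpectrumFifo(double latency_ms, double frame_interval_ms)
      : depth_(static_cast<std::size_t>(latency_ms / frame_interval_ms)) {}

  // Push the newest magnitudes; a frame only comes back out once the queue
  // is long enough to represent the desired delay.
  auto push(std::vector<float> spectrum_mag) -> std::optional<std::vector<float>> {
    queue_.push_back(std::move(spectrum_mag));

    if (queue_.size() <= depth_) {
      return std::nullopt;  // still filling the delay line
    }

    auto delayed = std::move(queue_.front());
    queue_.pop_front();
    return delayed;
  }

 private:
  std::size_t depth_;
  std::deque<std::vector<float>> queue_;
};
```

The idea would be to push into the queue at the point where spectrum_mag currently goes straight to the chart, and only draw whatever falls out of the other end.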
Also, since I still haven't managed to fully understand pipe_manager, I would be delighted to know why branching in the pipeline would be difficult.
So basically it is the branching path idea, except implemented within the spectrum code itself, using a FIFO queue on raw audio frames. I don't think you need to insert...
Oh, I understand now. You mean that every time playback begins, the spectrum would feed itself n milliseconds of silence, then start processing the audio. That makes sense. The part...
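For what it's worth, the silence-priming version could be as simple as a circular delay line pre-filled with zeros (hypothetical sketch, none of these names come from the spectrum plugin):

```cpp
// Minimal sketch of the "feed itself n ms of silence" idea: a ring buffer
// initialized with zeros, so the first delay_ms of output is silence.
#include <algorithm>
#include <cstddef>
#include <vector>

class DelayLine {
 public:
  DelayLine(double delay_ms, double sample_rate)
      : buffer_(std::max<std::size_t>(
            1, static_cast<std::size_t>(delay_ms * sample_rate / 1000.0)),
        0.0F) {}

  // Write one raw sample, read back the sample from delay_ms ago.
  auto process(float in) -> float {
    float out = buffer_[pos_];
    buffer_[pos_] = in;
    pos_ = (pos_ + 1) % buffer_.size();
    return out;
  }

 private:
  std::vector<float> buffer_;
  std::size_t pos_ = 0;
};
```

For the first n milliseconds after playback starts it only returns the zeros it was initialized with, which is exactly the "feed itself silence" behaviour, and after that every sample comes out n milliseconds late.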
Isn't the reported latency value from PipeWire for easyeffects' target output just that? It seems like it would be trivial to test whether the value is reasonable, and if not,...
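As far as I understand the PipeWire stream API, that check could look roughly like this (a sketch only; the fallback value and the 2000 ms sanity cap are made-up numbers, and this is not how pipe_manager currently does it):

```cpp
// Hedged sketch: read the stream's reported delay and fall back to a manual
// value when the report looks unusable.
#include <pipewire/pipewire.h>

auto spectrum_delay_ms(pw_stream* stream, double manual_fallback_ms) -> double {
  pw_time time{};

  if (pw_stream_get_time_n(stream, &time, sizeof(time)) < 0 || time.rate.denom == 0) {
    return manual_fallback_ms;  // no usable report from PipeWire
  }

  // pw_time::delay is expressed in units of pw_time::rate (e.g. 1/48000 s).
  const double delay_ms =
      1000.0 * static_cast<double>(time.delay) * time.rate.num / time.rate.denom;

  // "Test if the value is reasonable": anything negative or absurdly large
  // gets treated as unreliable and overridden by the manual setting.
  if (delay_ms < 0.0 || delay_ms > 2000.0) {
    return manual_fallback_ms;
  }

  return delay_ms;
}
```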