
Audio Normalization

Open · hp27596 opened this issue 3 years ago

Since Nuclear pulls songs from many different sources, it'd be nice if it had a built-in feature to normalize volume.

Currently the jump can be quite steep from song to song, and on more than one occasion I've turned the volume up for one song only to be deafened by the next one's loudness.

hp27596 avatar Jul 08 '22 04:07 hp27596

I know, and I actually got really annoyed by this just yesterday while listening to some songs. We've had a discussion around this in #703, but I never got around to implementing it. It would be really good to have this. I don't see any existing ReplayGain wrappers for node, so we'd have to integrate this from scratch.

nukeop avatar Jul 08 '22 08:07 nukeop

Hello, I'll take this issue.

SimonKacir avatar Oct 03 '22 23:10 SimonKacir

Ok, can you please first say how you're planning to approach this?

nukeop avatar Oct 03 '22 23:10 nukeop

I'll be working on it over the next couple of weeks. I just managed to build the app today; I'll do some research on audio normalization and let you know once I have a proper plan.

SimonKacir avatar Oct 03 '22 23:10 SimonKacir

Sounds good, thanks for contributing.

nukeop avatar Oct 03 '22 23:10 nukeop

So after digging a bit, I think implementing an application-level normalizer seems more reasonable. To achieve this I need to retrieve the dB level of what's currently playing at the time of streaming. Is there some way to get the dB values somewhere in the code? Or possibly to just retrieve the raw audio data? Just making sure, since I can't find it.
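For reference, here's a rough sketch of the dB math that normalization boils down to; the -14 dB target below is an arbitrary example value, not something taken from Nuclear:

```typescript
// Sketch only: relationship between linear amplitude, dB, and the gain
// needed to hit a target level. The -14 dB target is an arbitrary example.

// Convert a linear amplitude (0..1) to decibels relative to full scale.
const amplitudeToDb = (amplitude: number): number =>
  20 * Math.log10(Math.max(amplitude, 1e-10));

// Gain factor that brings a measured level up or down to the target level.
const normalizationGain = (measuredDb: number, targetDb = -14): number =>
  Math.pow(10, (targetDb - measuredDb) / 20);

console.log(amplitudeToDb(0.5));      // ≈ -6 dBFS
console.log(normalizationGain(-20));  // ≈ 2x gain to bring -20 dBFS up to -14 dBFS
```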

SimonKacir avatar Oct 09 '22 16:10 SimonKacir

We use the HTML5 audio API to stream (SoundContainer and the react-hifi package), so it might be possible to get it from there: https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API
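For anyone following along, the plain Web Audio API pattern (independent of how react-hifi wires things internally) looks roughly like this; the URL is a placeholder:

```typescript
// Plain Web Audio API sketch, not the actual SoundContainer wiring:
// route an <audio> element through an AudioContext so that extra
// processing nodes (e.g. a normalization gain) can be inserted later.
const audioElement = new Audio('https://example.com/track.mp3'); // placeholder URL
const context = new AudioContext();

const source = context.createMediaElementSource(audioElement);
const normalization = context.createGain(); // could later be driven by measured loudness

source.connect(normalization);
normalization.connect(context.destination);

audioElement.play();
```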

nukeop avatar Oct 10 '22 12:10 nukeop

I tried messing around with the Web Audio API, but I can't seem to access the app's audio context. I see an audio context being used in the HLS player, but is it even working? react-hifi doesn't appear to expose any function to use the audio context (or maybe I missed something); could I use https://github.com/audiojs/web-audio-api as a complement to it?

SimonKacir avatar Oct 15 '22 20:10 SimonKacir

You can search for the string audioContext; e.g. the visualizer uses this. Anything that's rendered inside the <Sound> component will have the audio context passed in.
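A hypothetical sketch of such a child component; the prop names below are placeholders and don't necessarily reflect react-hifi's actual plugin API:

```tsx
import React from 'react';

// Hypothetical shape of a component rendered inside <Sound>. The prop names
// are placeholders; the real ones come from react-hifi's plugin API.
type NormalizerProps = {
  audioContext: AudioContext;
  previousNode: AudioNode; // whatever node precedes this one in the chain
};

const Normalizer: React.FC<NormalizerProps> = ({ audioContext, previousNode }) => {
  React.useEffect(() => {
    const gain = audioContext.createGain();
    // In a real chain this would connect to the next plugin's node rather
    // than straight to the destination.
    previousNode.connect(gain);
    gain.connect(audioContext.destination);
    return () => gain.disconnect();
  }, [audioContext, previousNode]);

  return null; // audio-only component, renders nothing
};
```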

nukeop avatar Oct 15 '22 20:10 nukeop

In that case, would a reasonable approach be creating a component that controls audio, rendered inside the <Sound> component? Feel free to correct me if I'm misunderstanding something, since I'm still trying to understand how audio flows in this app.

SimonKacir avatar Oct 16 '22 16:10 SimonKacir

Yeah, that's a sensible approach; that's how the visualizer and the equalizer work. Since this is similar to what the equalizer is doing, I think you could add it there. If it starts getting messy, that component (SoundContainer) can be refactored somehow, but we'll cross that bridge when we get there.

nukeop avatar Oct 16 '22 17:10 nukeop

I'm currently taking a look at the Equalizer, but I'm confused as to where I could add the normalization implementation. In SoundContainer there is an Equalizer being rendered, but as far as I can see it is imported from 'react-hifi'. Is there something I'm missing?

Edit: I finally managed to control the volume; now I just need to retrieve the file data so I can calculate the normalization.

SimonKacir avatar Oct 22 '22 12:10 SimonKacir

The Web Audio API lets you add processing nodes in a chain. Each node gets its input after it's been processed by the nodes before it, so you shouldn't worry about the equalizer.

The imported equalizer is there to process the sound according to the values selected on the equalizer screen. To normalize audio, you can simply create your own audio node and put it somewhere in the chain, probably at the very beginning, before the volume node.
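A minimal sketch of that chain idea, with illustrative names rather than the actual SoundContainer code:

```typescript
// Illustrative only: a normalization gain node placed at the start of the
// chain, before the user-controlled volume node.
function buildChain(context: AudioContext, source: AudioNode) {
  const normalization = context.createGain(); // driven by the measured loudness
  const userVolume = context.createGain();    // driven by the volume slider
  // ...the equalizer and any other react-hifi nodes would slot in between.

  source
    .connect(normalization)
    .connect(userVolume)
    .connect(context.destination);

  return { normalization, userVolume };
}
```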

nukeop avatar Oct 22 '22 13:10 nukeop

I managed to create a component with its own gain node to control the volume, which could serve to actually normalize it. But now I need to calculate the average amplitude of the waveform. Looking at the Web Audio API, it has an AnalyserNode: https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode

Could the AnalyserNode help with that? I'm not sure to what degree it would yet.
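A sketch of what measuring with an AnalyserNode might look like; note that it can only observe audio as it plays, not the whole file up front:

```typescript
// Sketch: estimate the current RMS amplitude of whatever flows through the
// chain. An AnalyserNode only sees audio as it plays, so it cannot measure
// the whole file ahead of time.
function createRmsMeter(context: AudioContext, input: AudioNode) {
  const analyser = context.createAnalyser();
  analyser.fftSize = 2048;
  input.connect(analyser);

  const buffer = new Float32Array(analyser.fftSize);

  return function currentRms(): number {
    analyser.getFloatTimeDomainData(buffer);
    let sumOfSquares = 0;
    for (const sample of buffer) {
      sumOfSquares += sample * sample;
    }
    return Math.sqrt(sumOfSquares / buffer.length);
  };
}
```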

SimonKacir avatar Oct 22 '22 13:10 SimonKacir

I pretty much have an implementation at this point; there is just one issue I'm facing. Is there any way to wait for normalization to take effect? At the moment the song's audio starts, and half a second or so later the normalization kicks in. That's because I need to fetch the source URL to get the audio data and then perform the calculations on it. I tried using await, but that creates a lot of other problems.
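Roughly, the fetch-and-measure step being described could look like the sketch below; the -14 dB target and the single-channel RMS are simplifications, and error handling is omitted:

```typescript
// Sketch of the fetch-and-measure step: download the track, decode it,
// compute its RMS, and derive a gain factor. Placeholder target level,
// first channel only, no error handling.
async function computeNormalizationGain(
  context: AudioContext,
  trackUrl: string,
  targetDb = -14
): Promise<number> {
  const response = await fetch(trackUrl);
  const encoded = await response.arrayBuffer();
  const decoded = await context.decodeAudioData(encoded);

  const samples = decoded.getChannelData(0);
  let sumOfSquares = 0;
  for (let i = 0; i < samples.length; i++) {
    sumOfSquares += samples[i] * samples[i];
  }
  const rms = Math.sqrt(sumOfSquares / samples.length);
  const measuredDb = 20 * Math.log10(Math.max(rms, 1e-10));

  return Math.pow(10, (targetDb - measuredDb) / 20);
}
```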

SimonKacir avatar Oct 30 '22 12:10 SimonKacir

Hm, is it possible to perform it on a stream? I'd like to avoid interruptions in playback if possible. Otherwise, streaming audio to the sinks is left to the Web Audio API, and we don't (yet) have a mechanism for awaiting.

nukeop avatar Oct 30 '22 13:10 nukeop

From the research I've done, I need the entire audio file to normalize properly; otherwise I would be guessing more than anything. I tried suspending the audio context while normalizing and then resuming (which seems to mostly prevent playback interruption), and it sort of works. However, the normalization only happens while the currently playing song is loading in: when a new song in the queue finishes loading and you click on it afterwards, the normalization doesn't run again, and it keeps the gain calculated for the first song. So basically, the song loads in, the normalization happens, and that same normalization is then applied to every song in the queue until you select a new song outside the queue.

It's almost as if I need to force a re-render or something every time a new song is played, for it to take effect.

I'm not sure if there is a way to avoid this problem. All I've done is create my own component (like the volume plugin in react-hifi), perform a fetch and the normalization calculations in it, and then simply set the gain to whatever it should be.

SimonKacir avatar Oct 30 '22 15:10 SimonKacir

Maybe the component needs to be re-rendered when the current item in the queue is updated? This could be achieved with a hook. Could you open a pull request so I can take a look?

Btw for the first iteration it's fine if the solution is not perfect. We can put it behind a toggle in the settings to let users test it, then decide how to improve it.
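A rough sketch of the hook-based re-render idea; useCurrentTrackUrl and computeNormalizationGain are hypothetical helpers, not existing Nuclear code:

```tsx
import React from 'react';

// Hypothetical helpers: a selector for the current queue item's stream URL,
// and the gain computation sketched earlier in this thread.
declare function useCurrentTrackUrl(): string;
declare function computeNormalizationGain(
  context: AudioContext,
  url: string
): Promise<number>;

// Sketch only: recompute the gain whenever the current stream URL changes.
const NormalizationGain: React.FC<{
  audioContext: AudioContext;
  gainNode: GainNode;
}> = ({ audioContext, gainNode }) => {
  const trackUrl = useCurrentTrackUrl();

  React.useEffect(() => {
    let cancelled = false;
    computeNormalizationGain(audioContext, trackUrl).then((gain) => {
      if (!cancelled) {
        gainNode.gain.value = gain; // apply the freshly computed gain
      }
    });
    return () => {
      cancelled = true; // ignore results for a track that is no longer current
    };
  }, [trackUrl, audioContext, gainNode]);

  return null;
};
```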

nukeop avatar Nov 03 '22 20:11 nukeop

PR is up, feel free to take a look.

SimonKacir avatar Nov 03 '22 21:11 SimonKacir

Since this is merged, maybe close the issue and add this to the feature list? https://github.com/nukeop/nuclear/pull/1355

kohend avatar Apr 01 '23 20:04 kohend

Good idea

nukeop avatar Apr 01 '23 21:04 nukeop