icecast-metadata-js
Pass custom parameter into icecast-metadata-player-1.17.1.main.min.js
How can I pass a custom int parameter into the script that is available globally from the configuration of the instance?
I'm trying to display the currently received volume for a stream - the below implementation logs it to the console. Ideally this would be implemented as a callback that could update labels or canvases for multiple players on a single page with the currently received audio volume.
I'm attempting to add the following code, which connects a Web Audio analyser into the chain and calculates the currently received volume:
const audioContext = new (window.AudioContext || window.webkitAudioContext)();
const myAudio = document.querySelector("audio");
const source = audioContext.createMediaElementSource(myAudio);
const analyser = audioContext.createAnalyser();
source.connect(analyser);
analyser.connect(audioContext.destination);

// Resume the context if it is suspended (e.g. by an autoplay policy).
audioContext.onstatechange = () => {
  if (audioContext.state !== "running") {
    audioContext.resume().catch(console.error);
  }
};

analyser.fftSize = 2048;
const sampleBuffer = new Float32Array(analyser.fftSize);

setInterval(function () {
  analyser.getFloatTimeDomainData(sampleBuffer);

  // Compute average power over the interval.
  let sumOfSquares = 0;
  for (let i = 0; i < sampleBuffer.length; i++) {
    sumOfSquares += sampleBuffer[i] ** 2;
  }
  const avgPowerDecibels = 10 * Math.log10(sumOfSquares / sampleBuffer.length);

  // Compute peak instantaneous power over the interval.
  let peakInstantaneousPower = 0;
  for (let i = 0; i < sampleBuffer.length; i++) {
    const power = sampleBuffer[i] ** 2;
    peakInstantaneousPower = Math.max(power, peakInstantaneousPower);
  }
  const peakInstantaneousPowerDecibels = 10 * Math.log10(peakInstantaneousPower);

  console.log(avgPowerDecibels.toFixed(2));
}, 1000);
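As a sanity check on the average/peak power math, the same computation can be run as plain JavaScript with no Web Audio APIs at all (`powerStats` is a made-up helper name, not part of any library). A full-scale sine wave has average power 0.5, so it should come out at roughly -3.01 dB average and 0 dB peak:

```javascript
// Standalone version of the average/peak power math above.
function powerStats(sampleBuffer) {
  let sumOfSquares = 0;
  let peakInstantaneousPower = 0;
  for (let i = 0; i < sampleBuffer.length; i++) {
    const power = sampleBuffer[i] ** 2;
    sumOfSquares += power;
    peakInstantaneousPower = Math.max(power, peakInstantaneousPower);
  }
  return {
    avgPowerDecibels: 10 * Math.log10(sumOfSquares / sampleBuffer.length),
    peakPowerDecibels: 10 * Math.log10(peakInstantaneousPower),
  };
}

// 2048 samples covering exactly 16 periods of a full-scale sine wave.
const sine = Float32Array.from({ length: 2048 }, (_, i) =>
  Math.sin((2 * Math.PI * i) / 128)
);
console.log(powerStats(sine)); // avg ≈ -3.01 dB, peak 0 dB
```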
You can do this by creating an Audio element before instantiating IcecastMetadataPlayer and passing that reference into the constructor's options.audioElement. You'll need to create a unique audio element for each unique stream that you want to play on your webpage. You should just be able to replace the query selector with the audio element reference in your example.
This is how the visualizers on the demo page get their audio data.
const myAudio = new Audio();
const player = new IcecastMetadataPlayer("https://stream.example.com", {
  audioElement: myAudio,
  // any other options
});

const audioContext = new AudioContext();
const source = audioContext.createMediaElementSource(myAudio);
const analyser = audioContext.createAnalyser();
// ... the rest of your analyzer code
Note that if you want to reuse the existing audio element with a new instance of IcecastMetadataPlayer, you will need to call player.detachAudioElement() before using it with another instance.
Note that I'm currently using the constructor's options.audioElement, but I'm passing in the result of document.getElementById on an <audio> HTML5 tag. The behavior I get is no audio in the browser.
I'll need to test further and see how you've done it on the visualizers on the demo page.
Ok, I've gotten this working; however, I'm unable to get analyser.getFloatTimeDomainData to output anything other than zeros in Safari on Mac/iOS. How are you handling Safari in your visualizations?
I'm making sure the AudioContext is started by a user action (I start the audio context by clicking a button that runs a function that sets up the context, the player, etc.). Any suggestions? Here is my code, which works in all browsers EXCEPT Safari:
<button onclick="playStream()" id="playButton" class="btn btn-primary">Play</button>
<script src="icecast-metadata-player-1.17.1.main.min.js"></script>
<script>
  // streamURL is defined elsewhere.
  const myAudio = new Audio();
  const icecastMetadataPlayer = new IcecastMetadataPlayer(streamURL, {
    audioElement: myAudio,
    playbackMethod: "html5",
  });

  function playStream() {
    icecastMetadataPlayer.play();

    const audioContext = new (window.AudioContext || window.webkitAudioContext)();
    const source = audioContext.createMediaElementSource(myAudio);
    const analyser = audioContext.createAnalyser();
    analyser.fftSize = 2048;
    source.connect(analyser);
    source.connect(audioContext.destination);

    setInterval(function () {
      const sampleBuffer = new Float32Array(analyser.fftSize);
      analyser.getFloatTimeDomainData(sampleBuffer);
      console.log(sampleBuffer); // All zeros in Safari!

      // Compute average power over the interval.
      let sumOfSquares = 0;
      for (let i = 0; i < sampleBuffer.length; i++) {
        sumOfSquares += sampleBuffer[i] ** 2;
      }
      const avgPowerDecibels = 10 * Math.log10(sumOfSquares / sampleBuffer.length);

      // Compute peak instantaneous power over the interval.
      let peakInstantaneousPower = 0;
      for (let i = 0; i < sampleBuffer.length; i++) {
        const power = sampleBuffer[i] ** 2;
        peakInstantaneousPower = Math.max(power, peakInstantaneousPower);
      }

      // Log the sampled volume to the console.
      console.log(avgPowerDecibels.toFixed(2));
    }, 1000);
  }
</script>
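To go from the logged dB value to a label or meter update (the original goal), the reading can be mapped onto a 0-100 scale inside the setInterval callback. This is only a sketch: dbToMeter is a made-up helper, and the -60 dB floor is an arbitrary display choice, not anything from the library:

```javascript
// Map a dBFS power reading onto a 0-100 meter value.
// The -60 dB floor is an arbitrary display choice (assumption, not library API).
function dbToMeter(db, floorDb = -60) {
  if (!Number.isFinite(db)) return 0; // avgPowerDecibels is -Infinity on pure silence
  const clamped = Math.min(0, Math.max(floorDb, db));
  return Math.round(((clamped - floorDb) / -floorDb) * 100);
}

console.log(dbToMeter(0)); // 100 (full scale)
console.log(dbToMeter(-30)); // 50
console.log(dbToMeter(-90)); // 0 (below the floor)
```

Inside the interval callback, that value could then drive e.g. a per-player <progress> element or a canvas bar instead of console.log.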
Could you check if the demo visualizer works on your Safari device? On the upper right of the React demo, there's a dropdown that allows the visualizer plugin to be changed. You'll want to cycle through those.
It's been a while since I've tried this with iOS, and it's possible something broke. I don't own a physical iOS device, so I use a VM running macOS and Xcode to simulate an iPhone for iOS testing, which is inconvenient to set up.
You might try removing the playbackMethod: "html5" option. Removing it also allows the more bandwidth-efficient MediaSource and WebAudio playback APIs to be used when supported.
Also, you might try changing the order in which the audio context is attached to the audio element. I would try moving it before the play() call, and then before the player is instantiated.
I definitely tried all of those. I've found that working with the Web Audio APIs requires what I call "brute force coding": you have to try every permutation and combination of code examples until something finally works, and most of the time you don't know why. Hahaha!
Interestingly enough, I just checked your visualizers in Safari on Mac OS X and now they are NOT working. But I believe they were working earlier in the week. It's possible a Safari upgrade I did this week broke it (again).