wavesurfer.js
HLS Support?
Is there a way to support HLS audio in wavesurfer? The file format is .m3u8. I tried using backend: MediaElement but I keep getting this error
@jeremylach Did you check videojs-wavesurfer? https://github.com/collab-project/videojs-wavesurfer
I think videojs has a plugin for HLS
videojs-wavesurfer can support HLS when wavesurfer.js does, see https://github.com/collab-project/videojs-wavesurfer/issues/17 for related ticket.
For anyone looking for this: It's pretty straightforward to use hls.js for this.
- Initialize a <video> element with your desired src (e.g. an m3u8 file) with hls.js.
- Once hls.js has initialized the video, pass the DOM element for the video via wavesurfer.load(videoElement, waveformData, 'none') (not sure if all those parameters are needed, but it wasn't able to generate the waveform itself). The backend has to be set to MediaElement.
- Now you can use all your usual wavesurfer controls to jump around in the audio, play, pause, etc.
As I understand it, hls.js exposes a MediaElement-compatible interface, or at least one that is close enough that I didn't run into issues.
thanks for sharing @hobofan. can you add an example that uses this solution?
@hobofan Hey, can you give more details? I tried to set backend to "MediaElement" and use .load(), but it's not working for me.
are you using wavesurfer.backend._load(video, peaks, preload) ? how do you generate the peaks?
getting these errors when trying to use wavesurfer.load(videoElement, [0.5, 0.4, ...], 'none')
Uncaught (in promise) DOMException: The element has no supported sources.
Uncaught TypeError: Failed to set the 'currentTime' property on 'HTMLMediaElement': The provided double value is non-finite.
@Nadav42 I'll try to give a minimal example (though I use it slightly different in my project for a number of reasons), which should work:
var wavesurferNode = document.getElementById('wavesurfer');
var video = document.getElementById('video');
var videoSrc = 'https://test-streams.mux.dev/x36xhzz/x36xhzz.m3u8';

if (Hls.isSupported()) {
  var hls = new Hls();
  hls.loadSource(videoSrc);
  hls.attachMedia(video);
  hls.on(Hls.Events.MANIFEST_PARSED, function() {
    const wavesurfer = WaveSurfer.create({
      container: wavesurferNode,
      partialRender: true, // Not sure if this is necessary
      backend: 'MediaElement',
    });
    // Not sure if necessary; I think I put it there to prevent strange initial rendering artifacts.
    // `waveform` is the pre-generated peaks JSON.
    wavesurfer.backend.setPeaks(waveform.data, waveform.length);
    wavesurfer.load(video, waveform.data);
  });
}
The waveform data is generated via audiowaveform, as explained in the FAQ.
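As an aside (my sketch, not part of the thread): audiowaveform's JSON output stores integer samples plus a `bits` field (8 or 16), while wavesurfer works with peak values between -1 and 1, so the data may need normalizing before being passed to setPeaks()/load():

```javascript
// Normalize audiowaveform's integer samples (8- or 16-bit, per the `bits`
// field of its JSON output) to the -1..1 floats wavesurfer typically expects.
function normalizePeaks(waveformJson) {
  const max = Math.pow(2, waveformJson.bits - 1); // 128 for 8-bit, 32768 for 16-bit
  return waveformJson.data.map((sample) => sample / max);
}
// Example: normalizePeaks({ bits: 8, data: [128, -128, 64] }) → [1, -1, 0.5]
```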
Hey, thanks. As I thought, it seems I can't avoid creating the full waveform data first. In that case I think I will just have the server send the waveform data and use wavesurfer as display only, with no audio (hls.js will play the audio).
Yeah, I think for most of the use cases where you want HLS (long media where you want to seek quickly), it makes sense to generate the waveforms server-side (and, given the speed of audiowaveform, pre-generate them), since generating the waveforms client-side would take really long.
For future readers: consider generating the waveform data live with the following library (if that's something you need):
https://github.com/bbc/waveform-data.js/blob/master/README.md
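For reference, a minimal promise wrapper around waveform-data.js's documented `createFromAudio` API might look like this (`peaksFromArrayBuffer` is a hypothetical helper name of mine; you still need a real `AudioContext` and an `ArrayBuffer` of audio data to use it):

```javascript
// Turn an ArrayBuffer of audio into a flat array of peak values using
// waveform-data.js. The WaveformData argument is the library's export;
// nothing runs until you call this with real inputs.
function peaksFromArrayBuffer(WaveformData, audioContext, arrayBuffer) {
  return new Promise((resolve, reject) => {
    WaveformData.createFromAudio(
      { audio_context: audioContext, array_buffer: arrayBuffer, scale: 512 },
      (err, waveform) => {
        if (err) return reject(err);
        resolve(waveform.channel(0).max_array()); // per-channel max peaks
      }
    );
  });
}
```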
You can set a custom loader for hls.js to process array buffers from the stream.
@Nadav42 I am trying to use pre-loaded waveform data, but hls.js seems to blow up every time I call: wavesurfer.load(video, waveform.data);
I'm curious if there is anything else you're doing in your implementation? I get Uncaught (in promise) DOMException: The element has no supported sources. any time I try to load.
I'm also curious whether you got a client-side implementation going using waveform-data.js?
What I ended up doing is writing a custom loader that saves the HLS fragment buffers. I then generate a one-minute audio object with JavaScript and load that into wavesurfer; you also need to set wavesurfer's time with the proper offset for it to work.
So if the HLS audio is playing at 15:30, I generate one minute of audio from the 15:00-16:00 fragments, load it into wavesurfer, then set its time to 0:30.
When the player reaches 16:00 I load the next minute into wavesurfer, and so on.
This worked great for me and for my use case.
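The sliding-window bookkeeping described above can be sketched like this (a hypothetical helper of mine, not @Nadav42's actual code; times are in seconds):

```javascript
// Given the HLS player's absolute position, compute which one-minute
// window of fragments to assemble for wavesurfer, and the offset to
// seek to inside that window once it is loaded.
const WINDOW_SECONDS = 60;

function windowFor(hlsTimeSeconds) {
  const windowStart = Math.floor(hlsTimeSeconds / WINDOW_SECONDS) * WINDOW_SECONDS;
  return {
    windowStart,                                  // 15:30 -> 900 (15:00)
    windowEnd: windowStart + WINDOW_SECONDS,      // 960 (16:00)
    offsetInWindow: hlsTimeSeconds - windowStart, // 30 (0:30 into the window)
  };
}
// windowFor(930) → { windowStart: 900, windowEnd: 960, offsetInWindow: 30 }
```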
@Nadav42 Glad you got it working! Any chance you could share the code that loads the hls audio fragments into an audio object and preps it for wavesurfer?
Sadly I'm currently away from home. Remind me in a few days if this is still relevant; I should be back by then.
Ok, Thanks! Hope you have a good trip.
For anyone else referencing this, I didn't realize there is a playlist plugin...I don't think I can use it because it doesn't seem to support video...But looks pretty good.
https://wavesurfer-js.org/plugins/playlist.html
[UPDATE] This is misleading. They're only using m3u files to list independent audio streams...not to chunk them in a streaming sense...kinda odd plugin if you ask me.
Hey, I'd also be really interested in seeing some example code of how you achieved this!
Wish to learn your example code too :-) @Nadav42
If you have any progress, could you update it here? Thanks! @tslater
@Nadav42 I'd be interested to see your implementation as well.
I was able to do this in a couple of minutes, but I do not understand why I see no waves. Can someone help?
https://jsfiddle.net/x0ho1b9m/1/
hls.js can't find the file for me, and I changed the hls config to retry more, but nothing happened. The config is: { manifestLoadingMaxRetry: 4, manifestLoadingTimeOut: 90e3, levelLoadingTimeOut: 90e3, levelLoadingMaxRetry: 8 }
I tried to implement this but I'm having trouble getting the waveform data from the m3u8 video. I've followed the FAQ steps, but it seems that audiowaveform doesn't support m3u8 files. I also tried to use the waveform-data package to get the data client-side, but no success.
Does anyone have a solution for this? I couldn't solve it and I must use m3u8 files.
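One possible workaround (my sketch, not from the thread; paths and the URL are illustrative): flatten the HLS stream to a local audio file with ffmpeg first, since ffmpeg understands HLS input while audiowaveform reads formats like WAV/MP3 but not m3u8 playlists, then run audiowaveform on the result:

```shell
# Hypothetical two-step pipeline: extract the audio track from the HLS
# stream to a WAV file with ffmpeg, then generate 8-bit peaks JSON from it.
generate_peaks() {
  src="$1"  # e.g. https://example.com/stream.m3u8 (illustrative URL)
  ffmpeg -i "$src" -vn -acodec pcm_s16le audio.wav &&
    audiowaveform -i audio.wav -o peaks.json -b 8
}
# Usage: generate_peaks "https://example.com/stream.m3u8"
OUT=peaks.json
```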
This seems to have caught some traction, so I'll explain how I achieved rendering live waveform data smoothly from an HLS stream using hls.js. Unfortunately I can't share any code as it was written for a client.
For hls.js you can pass in a custom fLoader: write your own CustomFragLoader class that extends the base Loader.
Inside the load function you have access to the onSuccess callback, which is invoked when a new frag is fetched. The onSuccess cb makes the fetched response available to you; response.data.slice(0) will give you a copy of the decoded buffer.
You can use the OfflineAudioContext API method decodeAudioData to transform the buffers into audio buffers. Each time you receive a newly fetched buffer, just append it to the previous accumulation of buffers to create a representation of the stream.
Each time you add to the accumulated buffer, you can clear the old waveform data by calling empty on the Wavesurfer class, then load the updated buffer by calling loadDecodedBuffer on the Wavesurfer class.
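A rough sketch of that approach (my assumptions: `makeFragLoader`, `onFragmentData`, and `appendBuffers` are hypothetical names of mine, while `fLoader`, `DefaultConfig.loader`, and the `onSuccess` callback are hls.js's actual extension points):

```javascript
// Accumulate decoded PCM samples: concatenate two Float32Arrays.
// In a real implementation you would copy the result into a fresh
// AudioBuffer before handing it to wavesurfer's loadDecodedBuffer.
function appendBuffers(a, b) {
  const out = new Float32Array(a.length + b.length);
  out.set(a, 0);
  out.set(b, a.length);
  return out;
}

// Build a fragment-loader class for hls.js that copies every fetched
// fragment's bytes out via onFragmentData before hls.js consumes them.
function makeFragLoader(HlsCtor, onFragmentData) {
  return class CustomFragLoader extends HlsCtor.DefaultConfig.loader {
    load(context, config, callbacks) {
      const origOnSuccess = callbacks.onSuccess;
      callbacks.onSuccess = (response, stats, ctx, networkDetails) => {
        onFragmentData(response.data.slice(0)); // copy of the raw fragment buffer
        origOnSuccess(response, stats, ctx, networkDetails);
      };
      super.load(context, config, callbacks);
    }
  };
}
// Usage: new Hls({ fLoader: makeFragLoader(Hls, (buf) => { /* decodeAudioData + append */ }) })
```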
@JordanPawlett, and if you don't want to empty the wavesurfer, just use the function proposed by @thijstriemstra in #272. Works well for me ^^
Unfortunately i can't share any code as it was written for a client.
But the community provided them with wavesurfer.js, saving them enormous development costs, and they don't want to give something back? Strange world we live in.
Hello, I'm also having problems using hls with wavesurfer.
this.canPlay = false;
this.audioPlayer = document.getElementById("audio-player");
this.wavesurferNode = document.getElementById("wave");
this.initEventHandlers();
return new Promise((res, rej) => {
  if (hls && Hls.isSupported()) {
    this.initHLS(
      source + "?hls=true&token=" + this.$store.getters.getToken
    );
    this.hls.on(
      Hls.Events.MANIFEST_PARSED,
      function (event, data) {
        axios
          .get(this.currentAudio.waveform)
          .then((waveform) => {
            const wavesurfer = WaveSurfer.create({
              container: this.wavesurferNode,
              partialRender: true, // Not sure if this is necessary
              backend: "MediaElement",
              normalize: true,
            });
            wavesurfer.backend.setPeaks(
              waveform.data.data,
              waveform.data.length
            );
            wavesurfer.load(this.audioPlayer, waveform.data.data);
          })
          .catch((error) => {
            // Fetching the waveform data failed
          });
        this.canPlay = true;
        res();
      }.bind(this)
    );
  }
});
Unfortunately you encounter this a lot. Someone pays somebody, and in this case @JordanPawlett has integrity and knows better (which is awesome!). Despite this, he gave clear directions on how to achieve this on your own. It's not his fault that his customer is closed source, but he has to do right by his customer...
How are people generating the waveform from an M3U8 because this doesn't seem to be supported by peaks... ?
Yes, I'm in the same situation. Can someone provide an example?
Hi @katspaugh
Thanks for your awesome package! It works great; the only issue I have is HLS support. I had to make quite a few workarounds to get HLS working, such as using vendor players while only using wavesurfer's rendering features, without loading the actual audio.
It would be great if HLS support could be added to wavesurfer. Is this something that could be done in the near future?