
Unable to visualise HLS chunks #67

Open · JordanPawlett opened this issue Apr 24, 2020 · 4 comments
@JordanPawlett

I'm trying to visualise a period of N seconds of an HLS stream handled by hls.js.

I've created an AudioContext and connected it to my media element correctly. I then constructed a new processor from the context using createScriptProcessor.
Binding a function to onaudioprocess, I grab each audioProcessingEvent.inputBuffer (an AudioBuffer) over N seconds and append them to one another, ultimately creating a single AudioBuffer representing the N-second period.
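Roughly, the capture looks like this (a simplified, mono sketch; the 4096 buffer size is arbitrary and mediaElement is my media element):

```js
const audioCtx = new AudioContext();
const source = audioCtx.createMediaElementSource(mediaElement);
const processor = audioCtx.createScriptProcessor(4096, 2, 2);

source.connect(processor);
processor.connect(audioCtx.destination);

const chunks = [];

processor.onaudioprocess = (event) => {
  // The input buffer is reused between events, so copy the samples out
  chunks.push(event.inputBuffer.getChannelData(0).slice());
};

// After N seconds, concatenate the chunks into a single AudioBuffer
function buildBuffer() {
  const totalLength = chunks.reduce((sum, c) => sum + c.length, 0);
  const buffer = audioCtx.createBuffer(1, totalLength, audioCtx.sampleRate);
  const channel = buffer.getChannelData(0);
  let offset = 0;
  for (const chunk of chunks) {
    channel.set(chunk, offset);
    offset += chunk.length;
  }
  return buffer;
}
```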

I then pass the constructed AudioBuffer to WaveformData.createFromAudio with a scale of 128. The output waveform seems OK at a glance, although I'm not too sure how to verify this...
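The conversion step is along these lines (option names as I understand them from the waveform-data.js README):

```js
import WaveformData from 'waveform-data';

WaveformData.createFromAudio(
  {
    audio_context: audioCtx,
    audio_buffer: buildBuffer(), // the AudioBuffer assembled above
    scale: 128
  },
  (err, waveform) => {
    if (err) {
      console.error(err);
      return;
    }
    console.log(waveform.duration, waveform.length);
  }
);
```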

I'm unable to render the waveform data using the canvas example in the README.
Are there any tools I can use to verify that the data I've produced is correct, or at least any points to look for?
Should I normalise the data in the ArrayBuffer produced to between 0 and 1 before trying to render it? I've noticed there are lots of peaks and troughs.

Furthermore, I've tried passing the waveform data I produced to Peaks.js. The duration of the output is correct, but there are no data points displayed.
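For reference, I'm initialising Peaks.js roughly like this (option names as I understand them from the Peaks.js docs, so they may need adjusting for the version in use):

```js
Peaks.init({
  containers: {
    zoomview: document.getElementById('zoomview'),
    overview: document.getElementById('overview')
  },
  mediaElement: document.getElementById('audio'),
  waveformData: {
    arraybuffer: waveformArrayBuffer // the ArrayBuffer mentioned above
  }
}, (err, peaks) => {
  if (err) {
    console.error('Peaks.init failed:', err);
  }
});
```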

@chrisn

chrisn commented Apr 24, 2020

I'd really like to get to a fully working example with HLS and Peaks.js.

I recommend checking the length of the audio segments, the audio sample rate, and the scale value. Try to arrange things so that the segment length (in audio samples) is exactly divisible by the scale.
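For example (illustrative numbers):

```js
const scale = 128;

// 10 seconds at 48 kHz: 480000 samples
console.log((48000 * 10) % scale); // 0  -> divides evenly into scale-sized blocks

// 10 seconds at 44.1 kHz: 441000 samples
console.log((44100 * 10) % scale); // 40 -> leaves a partial block at the end
```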

The ArrayBuffer will contain values in the range -128 to 127. The scaleY function in the canvas example scales this range to fit a given height in pixels, so there's no need to normalise to 0 to 1.
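From memory, scaleY in the README example is along these lines:

```js
function scaleY(amplitude, height) {
  const range = 256;  // waveform values span -128..127
  const offset = 128;

  // Maps -128 to the bottom of the canvas (y = height) and 127 to the top
  return height - ((amplitude + offset) * height) / range;
}
```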

It's hard to answer more specifically about the canvas example or Peaks.js, but I'm happy to try if you can provide more details or an example.

@tslater

tslater commented Feb 2, 2021

Curious whether either of you made progress on this? I'm just starting out on it.

@JordanPawlett

Unfortunately I had to abandon my R&D, as I needed to pivot my current direction.
Funnily enough, I had a meeting a few hours ago (strange world) which means I need to focus my attention back on this!

For my use-case I need to dynamically generate interactable waveforms live, maybe with a 30-second running 'cache'.
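(A sketch of what I have in mind for the cache, with placeholder names and sizes: a rolling buffer that keeps only the most recent 30 seconds of samples.)

```js
const CACHE_SECONDS = 30;
const maxSamples = audioCtx.sampleRate * CACHE_SECONDS;

let cache = new Float32Array(0);

// Append a chunk of samples, dropping the oldest once over 30 seconds.
// Assumes chunk.length <= maxSamples.
function pushChunk(chunk) {
  const keep = Math.min(cache.length + chunk.length, maxSamples);
  const next = new Float32Array(keep);
  const fromCache = keep - chunk.length;
  next.set(cache.subarray(cache.length - fromCache), 0); // tail of the old cache
  next.set(chunk, fromCache);                            // then the new chunk
  cache = next;
}
```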

I think I was along the right lines with what I mentioned above.
I'll let you know if I manage to work anything out. Keep me posted!

@tslater

tslater commented Feb 3, 2021

Coincidences!!!!

For context:
I've opted to use WaveSurfer over Peaks.js, because Peaks.js segments are much harder to click, visualise, etc., and because it renders to canvas it doesn't seem like you can customise the look or behaviour as easily; PRs also look like they're vetted heavily.

I have considered, and am still considering, computing the peaks client-side, but I'm not actually live streaming, I'm merely using HLS, so precomputing isn't a big issue; it just requires more work upfront. If I get things working precomputed, I may try to make it dynamic/client-side. Someone I was chatting with yesterday seems to have got something working; they said to bug them in a few days when they get back home. Here's the issue/conversation for reference: katspaugh/wavesurfer.js#1078
