Unable to visualise HLS chunks #67
Comments
I'd really like to get to a fully working example with HLS and Peaks.js. I recommend checking the length of the audio segments, the audio sample rate, and the scale value. Try to arrange things such that the segment length (in audio samples) is exactly divisible by the scale. The ArrayBuffer will contain values in the range -128 to 127. It's hard to answer more specifically about the canvas example or Peaks.js; happy to try if you can provide more details or an example.
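For example, with some hypothetical numbers (a 44.1 kHz stream and 10-second segments; adjust for your actual stream):

```js
// Hypothetical figures for illustration only
const sampleRate = 44100;      // audio sample rate of the stream
const segmentDuration = 10;    // HLS segment length in seconds
const scale = 128;             // audio samples per waveform data point

const segmentSamples = segmentDuration * sampleRate; // 441000 samples

// Aim for an exact fit: 441000 % 128 !== 0 here, so either change the
// segment duration or pick a scale that divides the segment length evenly.
console.log(segmentSamples % scale === 0);

// Also note that the generated waveform data points are 8-bit signed values,
// so each min/max sample is in the range -128..127.
```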
Curious if either of you made progress on this or not? I'm just starting out on it.
Unfortunately I had to abandon my R&D, as I needed to pivot away from my current direction. For my use case I need to dynamically generate interactable waveforms live, maybe with a 30-second running 'cache'. I think I was along the right lines with what I mentioned above.
What a coincidence! For context: I have considered, and am still considering, trying to compute the peaks client-side, but I'm not actually live streaming, I'm merely using HLS, so precomputing isn't a big issue; it just requires more work upfront. If I get things working precomputed, I may try to make it dynamic/client-side. There is someone I was chatting with yesterday who seems to have got something working; they said to bug them in a few days when they get back home. Here's the issue/conversation for reference: katspaugh/wavesurfer.js#1078
I'm trying to visualise a period of N seconds of an HLS stream handled by hls.js.

I've created an AudioContext and connected it to my media element correctly. I then created a processor node from the context using `createScriptProcessor`. Binding a function to `onaudioprocess`, I grab each `audioProcessingEvent.inputBuffer` (an `AudioBuffer`) over N seconds and append them to each other, ultimately creating a single `AudioBuffer` representing the N-second period.

I then pass the constructed `AudioBuffer` to `WaveformData.createFromAudio` with a scale of 128. The output waveform seems OK at a glance, although I'm not too sure how to verify this. I'm unable to render the waveform data using the canvas example in the README.
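Roughly, the capture and conversion steps look like this (a simplified sketch of what I described above; the buffer size, channel count, and N = 30 are placeholders):

```js
import WaveformData from 'waveform-data';

const audioContext = new AudioContext();
const mediaElement = document.querySelector('audio'); // hls.js is attached to this elsewhere
const source = audioContext.createMediaElementSource(mediaElement);

// Keep the audio audible and also route it through a script processor
const processor = audioContext.createScriptProcessor(4096, 2, 2);
source.connect(audioContext.destination);
source.connect(processor);
processor.connect(audioContext.destination);

const N = 30;        // seconds to capture (placeholder)
const captured = []; // per-channel arrays of Float32Array chunks
let capturedSamples = 0;

processor.onaudioprocess = (event) => {
  const input = event.inputBuffer;

  for (let ch = 0; ch < input.numberOfChannels; ch++) {
    captured[ch] = captured[ch] || [];
    // Copy the samples, as the inputBuffer's memory is reused between callbacks
    captured[ch].push(new Float32Array(input.getChannelData(ch)));
  }

  capturedSamples += input.length;

  if (capturedSamples >= N * audioContext.sampleRate) {
    processor.onaudioprocess = null;
    buildWaveform();
  }
};

function buildWaveform() {
  // Concatenate the captured chunks into one AudioBuffer covering ~N seconds
  const combined = audioContext.createBuffer(
    captured.length, capturedSamples, audioContext.sampleRate);

  for (let ch = 0; ch < captured.length; ch++) {
    const channelData = combined.getChannelData(ch);
    let offset = 0;

    for (const chunk of captured[ch]) {
      channelData.set(chunk, offset);
      offset += chunk.length;
    }
  }

  // v3-style options object (assumption about the waveform-data.js version)
  WaveformData.createFromAudio(
    { audio_buffer: combined, scale: 128 },
    (err, waveform) => {
      if (err) {
        console.error(err);
        return;
      }
      console.log('waveform duration:', waveform.duration);
    }
  );
}
```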
Are there any tools I can use to verify that the data I've produced is correct? Or at least, are there particular things to look for?
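For example, would comparing a few basic properties against expected values be a reasonable start? Something like this, assuming the v3-style fields on the returned WaveformData object:

```js
// Basic sanity checks (v3-style properties; the expectations are placeholders)
console.log('duration (s):', waveform.duration);    // should be roughly N
console.log('points:', waveform.length);            // roughly (N * sample rate) / scale
console.log('sample rate:', waveform.sample_rate);  // should match the AudioContext
console.log('scale:', waveform.scale);              // should be 128
```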
Should I normalise the data in the ArrayBuffer produced to between 0 and 1 before trying to render it? I've noticed there are lots of peaks and troughs.
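For reference, the canvas drawing I'm attempting follows the README example, roughly like this (v3-style `channel()` / `min_sample()` / `max_sample()` accessors, with `canvas` and `waveform` assumed to be in scope):

```js
// Map an 8-bit sample (-128..127) onto a canvas y coordinate,
// so no separate normalisation to 0..1 is done here
function scaleY(amplitude, height) {
  const range = 256;
  const offset = 128;
  return height - ((amplitude + offset) * height) / range;
}

const ctx = canvas.getContext('2d');
const channel = waveform.channel(0);

ctx.beginPath();

// Upper half of the waveform: max samples, left to right
for (let x = 0; x < waveform.length; x++) {
  ctx.lineTo(x + 0.5, scaleY(channel.max_sample(x), canvas.height) + 0.5);
}

// Lower half of the waveform: min samples, right to left
for (let x = waveform.length - 1; x >= 0; x--) {
  ctx.lineTo(x + 0.5, scaleY(channel.min_sample(x), canvas.height) + 0.5);
}

ctx.closePath();
ctx.stroke();
```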
Furthermore, I've tried to pass the waveform data produced to Peaks.js for display. The duration of the output is correct, however no data points are displayed.
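To rule out problems on the rendering side, I may first try the `dataUri` route from the Peaks.js README with a known-good precomputed file, something along these lines (the container ids and the `.dat` URL are placeholders, and option names differ slightly between Peaks.js versions):

```js
import Peaks from 'peaks.js';

Peaks.init({
  containers: {
    overview: document.getElementById('overview-container'),
    zoomview: document.getElementById('zoomview-container')
  },
  mediaElement: document.getElementById('audio'),
  // Binary waveform data precomputed server-side (e.g. with audiowaveform);
  // the URL is a placeholder
  dataUri: {
    arraybuffer: '/waveforms/sample.dat'
  }
}, (err, peaks) => {
  if (err) {
    console.error('Peaks.js failed to initialise:', err);
    return;
  }
  console.log('Peaks.js ready');
});
```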