
How to create a seekable webm from the media-recorder example without using TypeScript? #14

Closed
guest271314 opened this issue Aug 26, 2017 · 23 comments

Comments

@guest271314

I have no experience using TypeScript. The code example using your library could possibly be a solution for w3c/media-source#191. How can we convert the "get seekable webm from media-recorder" example to plain JavaScript that can be used directly in the browser?

@guest271314
Author

Interestingly, I am getting errors at plnkr https://plnkr.co/edit/oxE8JTGKPWa21tHpzuCc?p=preview using the same code as https://jsfiddle.net/ub2jej7c/

VM501 script.js:4278 Uncaught (in promise) RangeError: "value" argument is out of bounds
    at checkInt (VM501 script.js:4278)
    at Uint8Array.writeUIntBE (VM501 script.js:4307)
    at createUIntBuffer (VM501 script.js:1112)
    at VM501 script.js:1020
    at Array.forEach (<anonymous>)
    at create_cue (VM501 script.js:1015)
    at refineMetadata (VM501 script.js:966)
    at Object.putRefinedMetaData (VM501 script.js:933)
    at VM501 script.js:731
    at step (VM501 script.js:637)

@guest271314
Author

@legokichi I was able to adjust the jsfiddle to run in the browser, though I have not been able to successfully render the ArrayBuffer representation of refinedWebM with MediaSource https://jsfiddle.net/uyo38jcp/.

@legokichi
Owner

legokichi commented Aug 27, 2017

@guest271314 Here is seekable webm creation code in vanilla JS. Bundle it with:

browserify index.js -o bundle.js

const ebml = require("ts-ebml");

navigator.mediaDevices.getUserMedia({video: true, audio: true}).then((stream)=>{
  const decoder = new ebml.Decoder();
  const reader = new ebml.Reader();
  let tasks = Promise.resolve();
  let webM = new Blob([], {type: "video/webm"});
  const rec = new MediaRecorder(stream, { mimeType: 'video/webm; codecs="vp8, opus"'});
  rec.addEventListener("dataavailable", ondataavailable);
  function ondataavailable(ev){
    console.log("data");
    const chunk = ev.data;
    webM = new Blob([webM, chunk], {type: chunk.type});
    const task = ()=> readAsArrayBuffer(chunk).then((buf)=>{
      const elms = decoder.decode(buf);
      elms.forEach((elm)=>{ reader.read(elm); });
    });
    tasks = tasks.then(()=> task() );
  }
  rec.start(100);
  return sleep(10*1000)
    .then(()=>{
      rec.stop();
      rec.removeEventListener("dataavailable", ondataavailable);
      rec.stream.getTracks().map((track) => { track.stop(); });
      reader.stop();
      return tasks; })
    .then(()=>{
      const refinedMetadataBuf = ebml.tools.makeMetadataSeekable(reader.metadatas, reader.duration, reader.cues);
      return readAsArrayBuffer(webM).then((webMBuf)=>{
        const body = webMBuf.slice(reader.metadataSize);
        const refinedWebM = new Blob([refinedMetadataBuf, body], {type: webM.type});
        const refined_video = document.createElement("video");
        refined_video.src = URL.createObjectURL(refinedWebM);
        refined_video.controls = true;
        document.body.appendChild(refined_video);
        return; }); });
});


function readAsArrayBuffer(blob) {
  return new Promise((resolve, reject)=>{
    const reader = new FileReader();
    reader.readAsArrayBuffer(blob);
    reader.onloadend = ()=>{ resolve(reader.result); };
    reader.onerror = ()=>{ reject(reader.error); };
  });
}

function sleep(ms){
  return new Promise((resolve)=> setTimeout(resolve, ms) );
}
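
The resulting bundle.js can then be loaded with a plain script tag (the file name is simply whatever was passed to browserify above):

<script src="bundle.js"></script>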

@guest271314
Author

@legokichi Interesting results using URL.createObjectURL() and MediaSource. Neither approach renders the expected result so far https://jsfiddle.net/258f9038/, https://jsfiddle.net/258f9038/1/.

@legokichi
Owner

legokichi commented Aug 28, 2017

@guest271314 example_mediasource.ts is demo code for split recording and streaming playback, but it was incomplete. Here is code that works.

This code is intended for long-term recording. To avoid building one huge ArrayBuffer, the WebM output of the MediaRecorder API is divided into metadata, clusters, and a MIME type, which can be saved to IndexedDB one piece at a time. Playing the split recording back through the MediaSource API likewise avoids huge Blobs and ArrayBuffers.

const ebml = require("ts-ebml");

navigator.mediaDevices.getUserMedia({video: true, audio: true}).then((stream)=>{
  stream.addEventListener("active", (ev)=>{ console.log(ev.type); });
  stream.addEventListener("inactive", (ev)=>{ console.log(ev.type); });
  stream.addEventListener("addtrack", (ev)=>{ console.log(ev.type); });
  stream.addEventListener("removetrack", (ev)=>{ console.log(ev.type); });

  const rec = new MediaRecorder(stream, { mimeType: 'video/webm; codecs="vp8, opus"'});
  rec.addEventListener("dataavailable", (ev)=>{ console.log(ev.type); });
  rec.addEventListener("pause", (ev)=>{ console.log(ev.type); });
  rec.addEventListener("resume", (ev)=>{ console.log(ev.type); });
  rec.addEventListener("start", (ev)=>{ console.log(ev.type); });
  rec.addEventListener("stop", (ev)=>{ console.log(ev.type); });
  rec.addEventListener("error", (ev)=>{ console.error(ev.type, ev); });

  const decoder = new ebml.Decoder();
  const reader = new ebml.Reader();
  let tasks = Promise.resolve();
  
  rec.addEventListener("dataavailable", ondataavailable);
  function ondataavailable(ev){
    console.log("data")
    const chunk = ev.data;
    const task = ()=> readAsArrayBuffer(chunk).then((buf)=>{
      const elms = decoder.decode(buf);
      elms.forEach((elm)=>{ reader.read(elm); });
    });
    tasks = tasks.then(()=> task() );
  }
  const clusters = [];
  reader.addListener("cluster", (ev)=>{
    console.log("cluster");
    const buf = new ebml.Encoder().encode(ev.data);
    clusters.push(buf);
  });
  rec.start(100);
  return sleep(10*1000).then(()=>{
    rec.stop();
    rec.removeEventListener("dataavailable", ondataavailable);
    rec.stream.getTracks().map((track) => { track.stop(); });
    reader.stop();
    return tasks;
  }).then(()=>{
    const refinedMetadataBuf = ebml.tools.makeMetadataSeekable(reader.metadatas, reader.duration, reader.cues);
    return {metadata: refinedMetadataBuf, clusters, mimeType: rec.mimeType};
  });
}).then((ctx)=>{
  const {metadata, clusters, mimeType} = ctx;

  const ms = new MediaSource();
  ms.addEventListener('sourceopen', (ev)=>{ console.log(ev.type); });
  ms.addEventListener('sourceended', (ev)=>{ console.log(ev.type); });
  ms.addEventListener('sourceclose', (ev)=>{ console.log(ev.type); });
  ms.sourceBuffers.addEventListener('addsourcebuffer', (ev)=>{ console.log(ev.type); });
  ms.sourceBuffers.addEventListener('removesourcebuffer', (ev)=>{ console.log(ev.type); });
    
  const video = document.createElement("video");
  video.addEventListener('loadstart', (ev)=>{ console.log(ev.type); });
  video.addEventListener('progress', (ev)=>{ console.log(ev.type); });
  video.addEventListener('loadedmetadata', (ev)=>{ console.log(ev.type); });
  video.addEventListener('loadeddata', (ev)=>{ console.log(ev.type); });
  video.addEventListener('canplay', (ev)=>{ console.log(ev.type); });
  video.addEventListener('canplaythrough', (ev)=>{ console.log(ev.type); });
  video.addEventListener('playing', (ev)=>{ console.log(ev.type); });
  video.addEventListener('waiting', (ev)=>{ console.log(ev.type); });
  video.addEventListener('seeking', (ev)=>{ console.log(ev.type); });
  video.addEventListener('seeked', (ev)=>{ console.log(ev.type); });
  video.addEventListener('ended', (ev)=>{ console.log(ev.type); });
  video.addEventListener('emptied', (ev)=>{ console.log(ev.type); });
  video.addEventListener('stalled', (ev)=>{ console.log(ev.type); });
  video.addEventListener('timeupdate', (ev)=>{ console.log(ev.type); }); // annoying
  video.addEventListener('durationchange', (ev)=>{ console.log(ev.type); });
  video.addEventListener('ratechange', (ev)=>{ console.log(ev.type); });
  video.addEventListener('play', (ev)=>{ console.log(ev.type); });
  video.addEventListener('pause', (ev)=>{ console.log(ev.type); });
  video.addEventListener('error', (ev)=>{ console.warn(ev.type, ev); });
  //video.srcObject = ms;
  video.src = URL.createObjectURL(ms);
  video.volume = 0;
  video.controls = true;
  video.autoplay = true;
  document.body.appendChild(video);

  return new Promise((resolve)=>{ ms.addEventListener('sourceopen', ()=>{ resolve({metadata, clusters, mimeType, video, ms}); }, {once: true}); });
}).then((ctx)=>{
  const {video, ms, metadata, clusters, mimeType} = ctx;
  const sb = ms.addSourceBuffer(mimeType);
  sb.addEventListener('updatestart', (ev)=>{ console.log(ev.type); }); // annoying
  sb.addEventListener('update', (ev)=>{ console.log(ev.type); }); // annoying
  sb.addEventListener('updateend', (ev)=>{ console.log(ev.type); }); // annoying
  sb.addEventListener('error', (ev)=>{ console.error(ev.type, ev); });
  sb.addEventListener('abort', (ev)=>{ console.log(ev.type); });

  return [metadata].concat(clusters).reduce((o, buf)=> o.then(()=> appendBuffer(video, sb, buf)), Promise.resolve());
});


function readAsArrayBuffer(blob) {
  return new Promise((resolve, reject)=>{
    const reader = new FileReader();
    reader.readAsArrayBuffer(blob);
    reader.onloadend = ()=>{ resolve(reader.result); };
    reader.onerror = ()=>{ reject(reader.error); };
  });
}

function sleep(ms){
  return new Promise((resolve)=> setTimeout(resolve, ms) );
}


function appendBuffer(video, sb, buf) {
  return new Promise((resolve, reject)=>{
    sb.appendBuffer(buf);
    sb.addEventListener('updateend', ()=> resolve(), {once: true});
    sb.addEventListener("error", (ev)=> reject(ev), {once: true});
  }).then(()=>{
    console.log("timestampOffset", sb.timestampOffset);
    console.log("appendWindowStart", sb.appendWindowStart);
    console.log("appendWindowEnd", sb.appendWindowEnd);
    for(let i=0; i<sb.buffered.length; i++){
      console.log("buffered", i, sb.buffered.start(i), sb.buffered.end(i));
    }
    for(let i=0; i<video.seekable.length; i++){
      console.log("seekable", i, video.seekable.start(i), video.seekable.end(i));
    }
    console.log("webkitAudioDecodedByteCount", video["webkitAudioDecodedByteCount"]);
    console.log("webkitVideoDecodedByteCount", video["webkitVideoDecodedByteCount"]);
    console.log("webkitDecodedFrameCount", video["webkitDecodedFrameCount"]);
    console.log("webkitDroppedFrameCount", video["webkitDroppedFrameCount"]);
  
    if (video.buffered.length > 1) {
      console.warn("MSE buffered has a gap!");
      throw new Error("MSE buffered has a gap!");
    }
  });
}
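
A minimal sketch of the IndexedDB persistence mentioned above (the database and store names "recordings" and "chunks" are assumptions for illustration, not part of the example):

function openDB() {
  return new Promise((resolve, reject)=>{
    const req = indexedDB.open("recordings", 1);
    // create the object store the first time the database is opened
    req.onupgradeneeded = ()=>{ req.result.createObjectStore("chunks", {autoIncrement: true}); };
    req.onsuccess = ()=>{ resolve(req.result); };
    req.onerror = ()=>{ reject(req.error); };
  });
}

function saveChunk(db, buf) {
  return new Promise((resolve, reject)=>{
    const tx = db.transaction("chunks", "readwrite");
    tx.objectStore("chunks").add(buf);
    tx.oncomplete = ()=>{ resolve(); };
    tx.onerror = ()=>{ reject(tx.error); };
  });
}

// usage: store the refined metadata first, then each encoded cluster one by one
// openDB().then((db)=>
//   clusters.reduce((p, c)=> p.then(()=> saveChunk(db, c)), saveChunk(db, refinedMetadataBuf)));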

@guest271314
Author

guest271314 commented Aug 28, 2017

@legokichi Here is the code I have composed so far using MediaSource without ts-ebml. I have not included MediaRecorder tests yet. It still has at least one bug: when trying to set .appendWindowStart and .appendWindowEnd for the third buffer the values do not get set, and if a range is set outside of the second (previous) range an error is thrown. Still reading example_mediasource.ts.

<video preload="auto" autoplay="true" width="320" height="280"></video>

    (async() => {

      const mediaSource = new MediaSource();

      const video = document.querySelector("video");

      const urls = ["https://nickdesaulniers.github.io/netfix/demo/frag_bunny.mp4"
     , "https://raw.githubusercontent.com/w3c/web-platform-tests/master/media-source/mp4/test.mp4" 
     , "https://nickdesaulniers.github.io/netfix/demo/frag_bunny.mp4" 
     ];

      const request = url => fetch(url).then(response => response.arrayBuffer());

      const files = await Promise.all(urls.map(request));

      let ranges = [
        [10, 20],
        [2.5, 5.75],
        /* [25, 30] 
          Uncaught (in promise) TypeError: Failed to set 
          the 'appendWindowStart' 
          property on 'SourceBuffer': The value provided (25) 
          is outside the range (0, 5.75]. ?
        */
        [0, 5] 
      ];
      
      const mimeCodec = 'video/mp4; codecs="avc1.42E01E,mp4a.40.2"';

      const media = await Promise.all(files.map((file, index) => {
        return new Promise(resolve => {
          let media = document.createElement("video");
          let blobURL = URL.createObjectURL(new Blob([file]));
          media.onloadedmetadata = async e => {
            resolve({
              mediaDuration: media.duration,
              mediaBuffer: file,
              mediaRanges: ranges[index]
            })
          }
          media.src = blobURL;
        })
      }));

      console.log(media);

      const sourceOpen = async e => {
      
        const sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
        sourceBuffer.mode = "segments";
        if (MediaSource.isTypeSupported(mimeCodec)) {
        
          video.ontimeupdate = e => console.log(video.currentTime, sourceBuffer.buffered.end(0));
          
          for (let {mediaBuffer, mediaDuration, mediaRanges: [from, to]} of media) {
            console.log(from, to);
            await new Promise(resolve => {

              sourceBuffer.appendWindowStart = from;
              sourceBuffer.appendWindowEnd = to;
              sourceBuffer.appendBuffer(mediaBuffer);
            
              sourceBuffer.onupdateend = async e => {
                sourceBuffer.onupdateend = null;
                // sourceBuffer.timestampOffset += to - from;
                console.log(sourceBuffer.buffered.start(0)
                , sourceBuffer.buffered.end(0), mediaSource.duration);
                
                video.currentTime = sourceBuffer.buffered.start(0);

                video.play().then(() => {
                  video.onwaiting = e => {
                    console.log(e);
                    video.onwaiting = null;
                    resolve();
                  };
                })
              }

            })

          }
           mediaSource.endOfStream();

        } else {
          console.warn(mimeCodec + " not supported");
        }
      };
      
      const sourceEnded = e => console.log(e.type);
      
      mediaSource.onsourceended = sourceEnded;
      
      mediaSource.onsourceopen = sourceOpen;
      
      video.src = URL.createObjectURL(mediaSource);
      
    })();

Using .mode = "segments" does not return the expected result https://jsfiddle.net/258f9038/2/

@guest271314
Author

@legokichi Getting closer. I adjusted the MediaSource code. The issue now is that the media durations are the same, though the first should be 10 seconds https://jsfiddle.net/258f9038/4/

@guest271314
Author

@legokichi .map() does not await the previous iteration within Promise.all(). Substituting an async/await for..of loop (alternatively, .reduce() could be used) for .map() with Promise.all() yields the expected result https://jsfiddle.net/258f9038/5/.

Solved https://jsfiddle.net/258f9038/6/.

Nice work.
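
A hedged sketch of that difference (appendOne is a made-up helper for illustration, not from the fiddle):

function appendOne(sourceBuffer, m) {
  return new Promise((resolve, reject)=>{
    sourceBuffer.addEventListener("updateend", ()=> resolve(), {once: true});
    sourceBuffer.addEventListener("error", (ev)=> reject(ev), {once: true});
    sourceBuffer.appendBuffer(m.mediaBuffer);
  });
}

async function appendAllConcurrently(sourceBuffer, media) {
  // .map() starts every append immediately; calling appendBuffer() while a previous
  // append is still updating throws an InvalidStateError, so this does not work here
  await Promise.all(media.map((m)=> appendOne(sourceBuffer, m)));
}

async function appendAllSequentially(sourceBuffer, media) {
  // for..of with await starts the next append only after the previous one has completed
  for (const m of media) {
    await appendOne(sourceBuffer, m);
  }
}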

@guest271314
Author

@legokichi Evidently I did not update the jsfiddle to include the for..of loop substituted for Promise.all() https://jsfiddle.net/258f9038/9/

@guest271314
Author

@legokichi FWIW, the code I have composed so far: https://github.com/guest271314/recordMediaFragments. There are several issues in Firefox, one of which is that the media fragments recorded into a single Blob via MediaSource do not render audio playback.

@thijstriemstra
Contributor

thijstriemstra commented Jan 8, 2019

Here's a simple snippet for Blob instances:

import {Decoder, Encoder, tools, Reader} from 'ts-ebml';

const readAsArrayBuffer = function(blob) {
    return new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.readAsArrayBuffer(blob);
        reader.onloadend = () => { resolve(reader.result); };
        reader.onerror = () => { reject(reader.error); };
    });
}

const injectMetadata = function(blob) {
    const decoder = new Decoder();
    const reader = new Reader();
    reader.logging = false;
    reader.drop_default_duration = false;

    readAsArrayBuffer(blob).then((buffer) => {
        const elms = decoder.decode(buffer);
        elms.forEach((elm) => { reader.read(elm); });
        reader.stop();

        var refinedMetadataBuf = tools.makeMetadataSeekable(
            reader.metadatas, reader.duration, reader.cues);
        var body = buffer.slice(reader.metadataSize);

        const result = new Blob([refinedMetadataBuf, body],
            {type: blob.type});

       return result;
    });
}

// usage: pass in a webm blob
var updatedBlob = injectMetadata(myBlob);

@guest271314
Author

@thijstriemstra The site I was using for ts-ebml is no longer providing the service https://wzrd.in/bundle/ts-ebml@latest/. Is there an online CDN for ts-ebml that you use?

Note, in that code the promise from readAsArrayBuffer(blob) is not returned from the injectMetadata() call.
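
For reference, the same snippet with the promise returned, so the caller can await the resulting Blob (without the return, injectMetadata() yields undefined):

const injectMetadata = function(blob) {
    const decoder = new Decoder();
    const reader = new Reader();
    reader.logging = false;
    reader.drop_default_duration = false;

    // return the promise chain so callers receive the rebuilt Blob
    return readAsArrayBuffer(blob).then((buffer) => {
        const elms = decoder.decode(buffer);
        elms.forEach((elm) => { reader.read(elm); });
        reader.stop();

        const refinedMetadataBuf = tools.makeMetadataSeekable(
            reader.metadatas, reader.duration, reader.cues);
        const body = buffer.slice(reader.metadataSize);

        return new Blob([refinedMetadataBuf, body], {type: blob.type});
    });
}

// usage
// injectMetadata(myBlob).then((updatedBlob) => { /* ... */ });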

@thijstriemstra
Contributor

@guest271314 I use webpack to compile it into my project but perhaps unpkg.com works for you (e.g. https://unpkg.com/[email protected]/lib/index.js)

@guest271314
Author

@thijstriemstra Currently using https://cdn.webrtc-experiment.com/EBML.js. Can we mimic $ mkvmerge -w -o full.webm first.webm + second.webm with ts-ebml by concatenating only the SimpleBlock elements from second.webm, ...N.webm and changing the Duration?

@JimNewaz

No audio duration
I'm facing a problem with the audio duration: I have a Chrome extension which works properly, but when I download the recorded file it shows a 00:00 duration. Can anyone solve this bug? Please see
https://github.com/JimNewaz/Record-audio-chrome-extension-

@guest271314
Author

@JimNewaz MediaRecorder does not support encoding 'audio/mpeg-3', which is requested at

https://github.com/JimNewaz/Record-audio-chrome-extension-/blob/main/record_audio/myapp.js#L109

You can use ts-ebml to set the duration of 'video/webm' files, see #15.

@jessiejs

Yes, I know this is closed, but is there any solution that doesn't require compiling your code with browserify? I am using Thijs's code, but I can't just import ts-ebml since it's client-side JS.

@guest271314
Author

Yes, I know this is closed, but is there any solution that doesn't require compiling your code with browserify? I am using Thijs's code, but I can't just import ts-ebml since it's client-side JS.

There is a minified version of ts-ebml at https://plnkr.co/edit/dTEO4HepKOY3xwqBemm5?p=preview&preview.

@buynao

buynao commented Jan 3, 2022

The above solution doesn't work if the video file size exceeds 2GB, because it uses an ArrayBuffer internally, which has a size limit. Based on this, I wrapped the fix up in webm-duration-fix, which uses Blob.stream() to handle large files while limiting memory usage.

@guest271314
Author

FWIW, a minified version of ts-ebml exported as a JavaScript module: https://github.com/guest271314/captureSystemAudio/blob/master/native_messaging/capture_system_audio/ts-ebml.min.js.

Usage

const {
  Decoder,
  Encoder,
  tools,
  Reader,
  injectMetadata,
} = await import('./ts-ebml.min.js');
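
Assuming the exported injectMetadata() accepts a recorded WebM Blob and resolves with a Blob containing seekable metadata (that signature is an assumption here, check the module), usage could look like:

// recordedWebMBlob is a placeholder for a Blob produced by MediaRecorder
const seekableBlob = await injectMetadata(recordedWebMBlob);
const video = document.createElement("video");
video.src = URL.createObjectURL(seekableBlob);
video.controls = true;
document.body.appendChild(video);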

@alex-coda-13

Hello, since tools.makeMetadataSeekable uses a lot of new Buffer(...) in tool.ts, how can we use the solution from @thijstriemstra? (new Buffer() is deprecated.)

@guest271314
Author

new Buffer() is not used in ts-ebml-min.js here https://plnkr.co/edit/dTEO4HepKOY3xwqBemm5?p=preview&preview.

@guest271314
Author

The above solution doesn't work if the video file size exceeds 2GB, because it uses an ArrayBuffer internally, which has a size limit. Based on this, I wrapped the fix up in webm-duration-fix, which uses Blob.stream() to handle large files while limiting memory usage.

I don't think ArrayBuffer has a 2GB size limit.
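
The maximum ArrayBuffer length is implementation-defined rather than fixed by the language; a quick probe of whether a given engine can allocate more than 2 GiB:

try {
  // exceeding the engine's maximum ArrayBuffer length throws a RangeError
  const buf = new ArrayBuffer(2 * 1024 ** 3 + 1); // 2 GiB + 1 byte
  console.log("allocated", buf.byteLength, "bytes");
} catch (e) {
  console.log("allocation failed:", e.message);
}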
