How to correctly append buffers to play multiple media files in sequence? #190
Comments
Just tried with https://github.com/w3c/web-platform-tests/blob/master/media-source/mp4/test.mp4 and the …
However, if we place the array buffer of the media having the shortest duration before the array buffer of the media having the longest duration, playback stops rendering after the first media file completes, though …
Follow-up to the previous post: if the longest media is placed after the shortest media within …
Have you experimented with different MSE implementations? Is there any chance this is an implementation bug in the browser you are using? /paulc
@paulbrucecotton Trying at Chromium 60.0.3112.78 (Developer Build). Have not tried at Firefox yet. Seeking guidance on what the recommended or working pattern is to achieve the expected result. Can include the code tried here if that would be helpful.
@paulbrucecotton Just tried at Firefox 55, though the codec …
@paulbrucecotton Code tried using …
One issue that we have been facing is sourcing, testing, and using properly encoded media files served with CORS headers. Not sure if files created using …
@guest271314 wrote:
What do you mean it's not supported? It certainly is. [removed off-topic comments]
The quotes surrounding the value of … returns the expected result at Firefox 55. Still have yet to try the pattern with more than two media files.
@jyavenard Just tried with four requests for media files. Chromium 60 plays the files in sequence without an issue using the same code pattern. Firefox 55 logged an apparent …
http://plnkr.co/edit/9FYe4cJ6d4BC0B0LyOmN?p=preview, https://jsfiddle.net/hcfvyx9k/1/
@jyavenard The media finally rendered playback at Firefox 55 with four requests. Not sure about the reason for the previous errors; a timeout?
@guest271314 You had a space in your original MIME type separating the two codecs, so yes, quotes are required then. If you remove the space, there's no need for quotes. Your fiddle worked first go here.
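The point about the quotes can be illustrated with a small helper. This is a hedged sketch, not part of MSE or any spec; `quoteCodecs` is a hypothetical name. It wraps the value of a `codecs=` parameter in double quotes when it is not already quoted, so a type string whose codecs list contains a space can be passed safely to `MediaSource.isTypeSupported()`:

```javascript
// Hypothetical helper (not part of MSE): wrap the value of the
// codecs= parameter in double quotes when it is not already quoted.
// A codecs list containing a space, e.g. 'video/webm; codecs=vp8, vorbis',
// is ambiguous without the quotes.
function quoteCodecs(mimeType) {
  const match = mimeType.match(/^([^;]+;\s*codecs=)(.+)$/i);
  if (!match) return mimeType;                 // no codecs parameter
  const value = match[2].trim();
  if (value.startsWith('"')) return mimeType;  // already quoted
  return `${match[1]}"${value}"`;
}

// In a browser the quoted form would then be checked with:
//   MediaSource.isTypeSupported(quoteCodecs('video/webm; codecs=vp8, vorbis'))
console.log(quoteCodecs('video/webm; codecs=vp8, vorbis'));
// -> video/webm; codecs="vp8, vorbis"
```

Alternatively, as noted above, removing the space from the codecs list avoids the need for quotes entirely.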
Not sure what you mean as to sequence mode. What is the effective rendering difference between "sequence" and "segments" mode relevant to the expected result described in the OP? Can you illuminate?
Oops, my bad. I read the title and understood it as you using sequence mode. My first comment was about using the source buffer in "sequence" mode (as opposed to "segments" mode).
What is the effective difference between the two modes as to processing or rendering the media? Is …
See http://w3c.github.io/media-source/#dom-mediasource-addsourcebuffer, step 7, for the default value. For an explanation of the different modes, …
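The practical difference between the two modes can be sketched with a small simulation. This is illustrative code, not the MSE API: in "segments" mode the presentation timestamps muxed into each file determine placement (unless the application sets `sourceBuffer.timestampOffset` itself), while in "sequence" mode the user agent updates the offset automatically so each appended segment follows the previous one. The `placeSegments` function below is a hypothetical model of that behavior:

```javascript
// Illustrative simulation (not the MSE API) of how "sequence" mode
// places appended segments back-to-back by updating timestampOffset,
// versus "segments" mode, where each segment keeps the timestamps
// encoded in the file.
function placeSegments(mode, segments) {
  // segments: [{ start, duration }] — timestamps as muxed in each file
  let timestampOffset = 0;
  return segments.map(({ start, duration }) => {
    let placedStart;
    if (mode === 'sequence') {
      // The user agent offsets the segment to begin at the current end.
      placedStart = timestampOffset;
      timestampOffset += duration;
    } else {
      // 'segments' mode: media timestamps are used as-is (offset stays 0
      // unless the application sets sourceBuffer.timestampOffset itself).
      placedStart = start;
    }
    return { start: placedStart, end: placedStart + duration };
  });
}

// Two files that each start at media time 0 and last 5 seconds:
const twoFiles = [{ start: 0, duration: 5 }, { start: 0, duration: 5 }];
console.log(placeSegments('segments', twoFiles));
// both occupy [0, 5) — the second overwrites the first
console.log(placeSegments('sequence', twoFiles));
// [0, 5) then [5, 10) — played one after the other
```

This is why two files that each begin at media time 0 appear to "replace" each other in the default "segments" mode unless `timestampOffset` is advanced between appends.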
@jyavenard Utilizing https://github.com/legokichi/ts-ebml, was able to convert a … The use case is trying to play media fragments at … At step 6 at the first link, does the algorithm expect the metadata to be at a certain location within the file being read?
RE: https://bugs.chromium.org/p/chromium/issues/detail?id=820489
Where we want to play specific segments of media content as a single stream to convey the resulting communication when such media is "concatenated" or "merged", not rendered individually, with no noticeable (to perception) gaps in the media playback. The concept is, ideally, to …
The above is a brief synopsis of the concept. Based on the combined implementations of … To an appreciable extent the use case is possible using … Am not sure if there is interest in a unified specification to merge the disparate APIs into a single "media editing" API, or if there is any interest at all in the use case outside of creating a "playlist" of media.
The code could look something like …

or without using the …

though the result from the above is inconsistent as to gaps in playback, or playback at all (no video or audio track?), and we would be using a …
Finally composed a pattern using …
The audio and video are not precisely synchronized, though it is a start at using the … https://plnkr.co/edit/E8OlvwiUmCwIUTOKSNkv at version 5. Have not yet tried with …
Was able to put together code which records …
Observations: The implementations of … Firefox 60 using …

Chromium 64 using … Since presently we know the …

Even where …

Questions: …
@wolenetz The code at #190 (comment) results in a …
@wolenetz Just tried the code at #190 (comment) at Chromium without …
@wolenetz How to remedy the various differences between the Firefox and Chromium implementations of …? Can we work towards creating a specification which decodes any media file, potentially faster than real time, into a media container, e.g., …?
The tab still crashes at Chromium. |
This looks like further investigation is needed to see whether this is a spec issue or an implementation issue. Note also that WebCodecs incubations may support some of the underlying use case better, and that MediaRecorder implementation issues may also be contributing. Further note: MSE `changeType()` could be used to switch among bytestreams and codecs buffered within a single SourceBuffer.
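A minimal sketch of how `SourceBuffer.changeType()` might fit into the append loop described in this thread. The helper `needsChangeType` is my own invention, not part of the spec; it simply decides whether the next queued buffer's type differs from the one currently configured, in which case `changeType()` would be called before appending:

```javascript
// Sketch: before appending the next buffer, call changeType() only
// when the bytestream/codec type differs from the one currently
// configured on the SourceBuffer. needsChangeType() is a hypothetical
// helper, not a spec algorithm.
function needsChangeType(currentType, nextType) {
  const normalize = t => t.toLowerCase().replace(/\s+/g, '');
  return normalize(currentType) !== normalize(nextType);
}

// Browser-side usage (guarded so the sketch also loads outside a browser):
if (typeof MediaSource !== 'undefined') {
  // let currentType = firstType;
  // const sourceBuffer = mediaSource.addSourceBuffer(currentType);
  // for each { buffer, type } in the queue:
  //   if (needsChangeType(currentType, type)) {
  //     sourceBuffer.changeType(type);  // switch bytestream/codecs
  //     currentType = type;
  //   }
  //   sourceBuffer.appendBuffer(buffer);
}

console.log(needsChangeType('video/webm; codecs="vp8, opus"',
                            'video/mp4; codecs="avc1.42E01E, mp4a.40.2"'));
// -> true
```

Whether a given browser supports `changeType()` at all should be feature-detected (`'changeType' in SourceBuffer.prototype`) before relying on it.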
@wolenetz The use case has been largely solved by the experiments that were successful in meeting the requirement at the branches here: https://github.com/guest271314/MediaFragmentRecorder/branches/all. In particular, the master branch was crashing at Chromium because of https://bugs.chromium.org/p/chromium/issues/detail?id=992235. Have not retested the …
This code https://github.com/guest271314/MediaFragmentRecorder/blob/master/MediaFragmentRecorder.html currently does not output the expected results at Chromium 87.0.4270.0 (Official Build) snap (64-bit), Revision 6fc672b0fa6a30d0e3426e4a3f8d418290855a9c-refs/branch-heads/4270@{#1}, Linux, V8 8.7.142, or Nightly 83.0a1 (2020-09-27) (64-bit). At one point the code achieved the same result at both browsers. Will try again substituting …
Chromium audio is still 6 seconds ahead of video, https://bugs.chromium.org/p/chromium/issues/detail?id=1006617, even when using …
Have been attempting to implement, for lack of a more descriptive reference, an "offline media context". The basic concept is to be able to use the tools available at the most recent browsers alone to record or request media fragments capable of independent playback, and to be able to concatenate those discrete media fragments into a single stream of media playback at an `HTMLMediaElement`. A brief summary of the progression of the proof of concept is at Proposal: Implement OfflineMediaContext #2824.

From the outset have had the sense that `MediaSource` could possibly be utilized to achieve part, if not all, of the requirement. However, had not located an existing pattern, or configured an appropriate one during own testing, to realize the concatenation of discrete files using `MediaSource`.

Found this question and answer, "How do i append two video files data to a source buffer using media source api?", which appeared to indicate that setting the `.timestampOffset` property of `SourceBuffer` could result in sequencing media playback of discrete buffers appended to the `SourceBuffer`. Following the question led to a 2012 Editor's Draft, Media Source Extensions, W3C Editor's Draft 8 October 2012, which states at 2.11. Applying Timestamp Offsets: …

Which tried dozens of times using different patterns over the past several days. Interestingly, all attempts using `.webm` video files failed, generally resulting in the following being logged at `console` at plnkr: …

All of the attempts using `.mp4` video files failed save for a single `.mp4` file which is a downloaded copy of the "Big Buck Bunny" trailer. Not entirely sure where the file was downloaded from during testing, though it may have been "https://nickdesaulniers.github.io/netfix/demo/frag_bunny.mp4". Is this fact related to what is revealed at the FFmpeg FAQ? …

Made a copy of the original file in the same directory. Used `<input type="file">` with the `multiple` attribute set to upload the files. Converted the `File` objects to `ArrayBuffer` using `FileReader` and used, in pertinent part, this pattern: …

The questions for authors of and contributors to the specification are:

1. What is the correct code pattern (as clear and definitive as possible) to use to append array buffers from discrete files or media fragments to one or more `SourceBuffer`s of a `MediaSource`, where the `HTMLMediaElement` renders playback of each of the files or media fragments?
2. Why were a single `.mp4` file and its copy the only two files for which `MediaSource` correctly set `.duration` to the total time of the two files and rendered playback?
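The append pattern discussed throughout this thread can be sketched as follows. This is a hedged sketch under stated assumptions, not a definitive implementation: it assumes the default "segments" mode with `timestampOffset` advanced explicitly after each append, and `offsetsFor` and `appendInSequence` are my own names, not spec algorithms. The browser-only portion is guarded so the pure helper also runs outside a browser:

```javascript
// Pure helper, runnable anywhere: given the durations of the files,
// compute the timestampOffset to set before each append so that the
// files play back-to-back. (Illustrative; not a spec algorithm.)
function offsetsFor(durations) {
  const offsets = [];
  let total = 0;
  for (const d of durations) {
    offsets.push(total);
    total += d;
  }
  return offsets;
}

// Browser-only portion, guarded so this file also loads under Node.
if (typeof MediaSource !== 'undefined') {
  // Append each file's ArrayBuffer in turn, waiting for 'updateend'
  // and advancing timestampOffset to the end of what is now buffered.
  async function appendInSequence(video, urls, mimeType) {
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);
    await new Promise(r =>
      mediaSource.addEventListener('sourceopen', r, { once: true }));
    const sourceBuffer = mediaSource.addSourceBuffer(mimeType);
    for (const url of urls) {
      const buffer = await (await fetch(url)).arrayBuffer();
      sourceBuffer.appendBuffer(buffer);
      await new Promise(r =>
        sourceBuffer.addEventListener('updateend', r, { once: true }));
      // Next file starts where this one ended.
      sourceBuffer.timestampOffset =
        sourceBuffer.buffered.end(sourceBuffer.buffered.length - 1);
    }
    mediaSource.endOfStream();
  }
}

console.log(offsetsFor([5, 10, 3]));  // -> [0, 5, 15]
```

Whether this renders gapless playback in practice still depends on the implementation details debated above (bytestream compliance of the files, codec support, and per-browser MSE behavior), which is precisely what the questions in the OP are asking about.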