This repository has been archived by the owner on Oct 7, 2021. It is now read-only.

Support Android scoped storage #440

Merged
merged 12 commits into tanersener:development from alexcohn:feature/scoped-storage
Sep 21, 2020

Conversation

alexcohn
Contributor

@alexcohn alexcohn commented May 30, 2020

This intends to answer #334 and #185. Inspired by https://stackoverflow.com/a/60702386/192373.

This works with all videos received from ACTION_OPEN_DOCUMENT on all Android versions (tested 21, 29, and R), including a custom document provider (I tested Google Drive) and a removable SD card on Android 10.

This also works for output, as set with ACTION_CREATE_DOCUMENT, and understands the media suffix (e.g. mp3) to choose the output file format automatically (if the user so requests).

The same protocol can also be used to properly read bundled assets on Android (see https://stackoverflow.com/q/24701029/192373). This PR does not demo this use case.
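For reference, a minimal sketch of how an app obtains the content: Uris mentioned above, using the standard Storage Access Framework intents (the request codes and MIME types below are arbitrary examples, not part of this PR):

static final int REQUEST_OPEN = 11, REQUEST_CREATE = 12;   // arbitrary request codes

Intent open = new Intent(Intent.ACTION_OPEN_DOCUMENT);
open.addCategory(Intent.CATEGORY_OPENABLE);
open.setType("video/*");
startActivityForResult(open, REQUEST_OPEN);

Intent create = new Intent(Intent.ACTION_CREATE_DOCUMENT);
create.addCategory(Intent.CATEGORY_OPENABLE);
create.setType("video/mp4");
create.putExtra(Intent.EXTRA_TITLE, "output.mp4");
startActivityForResult(create, REQUEST_CREATE);

// onActivityResult() then delivers data.getData(), the content: Uri to hand over to mobile-ffmpeg.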

@alexcohn alexcohn changed the title from "WIP: Support Android scoped storage" to "Support Android scoped storage" on Jun 8, 2020
@alexcohn
Contributor Author

alexcohn commented Jun 9, 2020

I have another improvement for scoped storage, where I take care of the "content" Uri in C, thus cleaning up the Java code and possibly serving people who want to use the C API directly from their Android app. I put it in an "experiment" branch for now, because I am still not certain whether this is a good thing.

@tanersener
Owner

I see that this PR is no longer in progress. First of all, why do we need this? I thought that Android reverted some of their changes in the API and so we no longer had issues with it.

@alexcohn
Contributor Author

alexcohn commented Jun 19, 2020

why do we need this?

First of all, it's cool. You can work with content: just like with any other supported protocol. Actually, it inherits almost everything from the file: protocol. I wanted to see how smooth this could be, and I am quite satisfied with the result.

I have a separate experiment in progress (close to final) that does all this on the C side, which is even cooler. A developer can use mobile_ffmpeg from Java or from native code, pass content: as received from ACTION_OPEN_DOCUMENT, and it will just work. The price is that this modifies the ffmpeg code (some changes must go into libavformat/file.c).

Yes, Android 11 intends to step back from the restrictions on extracting the 'real path' from such a Uri, but it is still not released, and we don't know how it will evolve. And anyway, there are tons of Android 10 devices that are not going to disappear, where the requestLegacyExternalStorage workaround is necessary.

With this protocol, we have a better, cleaner solution. We don't need the external storage permissions, and it works on the SD card, where other approaches seem to have trouble…

Recently, I came across another problem with mobile-ffmpeg: https://stackoverflow.com/a/62439309/192373. The output file must be exposed to MediaStore. With the content: protocol, I could provide such a service out of the box, so that people don't need to employ GallerySaver or other plugins/libraries to copy their video output.
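For illustration (not part of this PR), a hedged sketch of how an app obtains such a content: output Uri with the standard MediaStore API; the column values below are illustrative:

ContentValues values = new ContentValues();
values.put(MediaStore.Video.Media.DISPLAY_NAME, "result.mp4");   // illustrative name
values.put(MediaStore.Video.Media.MIME_TYPE, "video/mp4");
Uri outputUri = getContentResolver().insert(MediaStore.Video.Media.EXTERNAL_CONTENT_URI, values);
// With content: support, outputUri.toString() could then serve directly as the ffmpeg output target.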

I don't want to press you to embrace this change. Please see the two feature branches https://github.com/alexcohn/mobile-ffmpeg/tree/feature/scoped-storage and https://github.com/alexcohn/mobile-ffmpeg/tree/feature/scoped-storage-jni (in your spare time) and if you have any thoughts about it, let me know.

@JayParikh20

JayParikh20 commented Jul 14, 2020

@alexcohn Sorry, I had deleted it after seeing the SAF section in the wiki. But it did not help; I read somewhere that it does not work for mp4 files due to seeking problems. Currently I'm in the process of building ffmpeg using your pull request. How do I use it in the command?

Will this syntax work?

int fd = parcelFileDescriptor.getFd();
String[] cmd ={ "-y", "-i",  "saf:",  fd+"/"+uri.getLastPathSegment(),  "-vf", ....};

Previously commented
How do I use this with an Android InputStream and ffmpeg? This feature can be very useful: when other apps share media to your app, getting a file path from the content URI is pretty hard, unless we copy the whole file to our local app storage. Can you please help? I'm stuck with this same problem! ^

@alexcohn
Contributor Author

alexcohn commented Jul 14, 2020

Your expression has an extra , instead of + between "saf:" and the file "name". So:

int fd = parcelFileDescriptor.getFd();
String[] cmd ={ "-y", "-i",  "saf:" + fd+"/"+uri.getLastPathSegment(),  "-vf", ....};
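For completeness, this is where parcelFileDescriptor typically comes from (standard Android API; error handling omitted in this one-line sketch):

ParcelFileDescriptor parcelFileDescriptor = getContentResolver().openFileDescriptor(uri, "r");   // "r" = open for reading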

But I suggest that you try another branch that hides all these troubles on the C side, https://github.com/alexcohn/mobile-ffmpeg/tree/feature/scoped-storage-jni. The command will simply look like:

String[] cmd ={ "-y", "-i",  uri.toString(), "-vf", ....};

@JayParikh20

JayParikh20 commented Jul 14, 2020

@alexcohn Hey, I have tried both of them. The JNI one was definitely easier, but I am running into some problems.

  1. When I try to overlay a png (on a blank white background), it works using the uri, but before starting the execution I get this in the logcat:
recovered name 'IMG-20200714-WA0025.jpg' for 'nsp'
recovered name 'IMG-20200714-WA0025.jpg' for 'nsv'
recovered name 'IMG-20200714-WA0025.jpg' for 'nut'
recovered name 'IMG-20200714-WA0025.jpg' for 'ogg'
recovered name 'IMG-20200714-WA0025.jpg' for 'oma,omg,aa3'
recovered name 'IMG-20200714-WA0025.jpg' for 'al'

This goes on for quite a while, lasting about 2-3 seconds, and then the command starts executing. This does not happen in the current pull request (it only occurs in the jni branch).

  2. When I tried using an mp4 video file, nothing happens. I get no log or success/failure. It looks like it is waiting for data.
    I am just using mediaUri.toString(); does something more need to be done when using videos?

Thanks a lot for your help!

@alexcohn
Contributor Author

@JayParikh20 Thanks. Could you post the ffmpeg commands that were used in the two cases?

@JayParikh20

JayParikh20 commented Jul 15, 2020

@alexcohn Yes, sure. I have hardcoded some stuff.
This one is for the video file:

rc = FFmpeg.execute(new String[]{"-y", "-f", "lavfi", "-i", "color=0xFFFFFF:1080x1080" , "-i", selectedMediaUri.toString(), "-filter_complex", "[1:v]scale=720:720" + "[vid];[0:v][vid]overlay=0:0:shortest=1", "-c:v", "libx264", "-preset", "veryfast", outputFile.getAbsolutePath()});

This one is for image files:

rc = FFmpeg.execute(new String[]{"-y", "-f", "lavfi", "-i", "color=0xFFFFFF:1080x1080" , "-i", selectedMediaUri.toString(), "-filter_complex", "[1:v]scale=720:720" + "[img];[0:v][img]overlay=0:0:shortest=1", outputFile.getAbsolutePath()});

@tanersener
Owner

I'm looking at the changes in this PR. There are a few things that I don't understand. For example, ParcelFileDescriptor.getFd() returns a native file descriptor. But I see that videoUri.getLastPathSegment() is appended at the end of saf:. What is the reason for doing that? Isn't it possible to open a file with just the file descriptor?

@alexcohn
Contributor Author

videoUri.getLastPathSegment() is appended at the end of saf:

Here is why: ffmpeg looks at the file name (both on input and on output) to determine the file type. On input, this is usually not critical, and sniffing the content can resolve the ambiguity. On output, this is the only source of information, unless you set the format explicitly (people rarely do).

So, the protocol handler for saf: takes both the integer descriptor and the name, and processes the latter as usual (literally, I did not have to change a single line of code for this to work).

The name part is optional. Everything will still work if it is not appended (if the system can choose the output format).
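To illustrate the two forms (the descriptor values and names below are hypothetical):

String input = "saf:41/clip.mov";                       // input: the suffix only helps sniffing
String withName = "saf:42/movie.mp4";                   // output: muxer inferred from the .mp4 suffix
String withoutName = "saf:42";                          // output without a name part...
String[] cmd = {"-y", "-i", input, "-f", "mp4", withoutName};   // ...so force the format with -f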

@tanersener
Owner

tanersener commented Jul 16, 2020

Okay, thanks. Have you considered implementing this protocol as a more general solution that non-Android users can use as well? If the getLastPathSegment() part is removed, this solution will allow users to use file descriptors as input or output, which can be used on other platforms too. Additionally, you could submit this PR to FFmpeg as well. Not sure whether they'll approve it or not, but this is something FFmpeg lacks.

@JayParikh20

This is weird; video files are not working even if I use a file path (both branches).

@alexcohn
Contributor Author

@JayParikh20, are you sure you have libx264 enabled? Also, if it takes a few seconds to run on a single jpg for you, try a really short video to begin with.

@alexcohn
Contributor Author

alexcohn commented Jul 19, 2020

If getLastPathSegment() part is removed, this solution will allow users to use file descriptors as input or output.

Have you got a use case for this protocol on iOS? Because for Android I could simply extend the pipe: protocol, by borrowing the seekable option from file:. The content: protocol (as in https://github.com/alexcohn/mobile-ffmpeg/tree/feature/scoped-storage-jni) is a different story. I hope we will soon resolve the issues with @JayParikh20, and then I will be ready to propose it for this PR.

As you see, using JNI removes a lot of complexity from the API, and makes it much more robust.

@alexcohn alexcohn changed the title from "Support Android scoped storage" to "WIP: Support Android scoped storage" on Jul 19, 2020
@tanersener
Owner

Have you got a use case for this protocol on iOS?

Are you asking if there is a module on iOS similar to Scoped Storage that returns fds? I don't know. In fact, I don't think we need to have such a module to introduce this protocol on iOS/tvOS. This solution is based on file descriptors, and file descriptors can be used to access files in any OS that is POSIX compliant, not only iOS/tvOS.

Because for Android I could simply extend the pipe: protocol, by borrowing the seekable option from file:.

This can break the pipe: protocol. I don't think this is a good idea.

@JayParikh20

JayParikh20 commented Jul 19, 2020

@alexcohn, I checked it again, x264 is enabled.
getExternalLibraries: [x264]

I tried it with a 10-second mp4 video, waited for approx. 5 mins, no output [using both uri & file].
I'm using this command to build the library.
./android.sh --enable-gpl --enable-x264 --enable-android-zlib --enable-android-media-codec
Tried it with both LTS and non-LTS, just in case.

@alexcohn
Contributor Author

@JayParikh20 Let us switch to another thread, maybe here: alexcohn#1.

@alexcohn
Contributor Author

Are you asking if there is a module on iOS similar to Scoped Storage that returns fds?

Not exactly. I wonder what use cases for passing file descriptors may exist on iOS (except classical pipes).

As for breaking the pipe: protocol: don't worry, the idea is only to extend it with an optional seekable option. Such an extension is clean and follows the pattern of other ffmpeg protocols (file: included).

The point is that a pipe: is controlled by the parent, and actually so is the saf: protocol: the Java app opens all the files (inputs and outputs) and passes them by descriptor to ffmpeg.

On the other hand, content: is different: here, ffmpeg decides when a file must be opened and when it will be closed.

@tanersener
Owner

tanersener commented Jul 20, 2020

Extending a protocol is always possible. My main concern is about maintaining that protocol. Those changes are applied inside the ffmpeg source code. Every change made there brings new risks, and modifying an existing feature brings more risks than introducing a new feature. I prefer to choose the safest option here. Remember that we don't have tests for mobile platforms, and manually testing every feature is not always possible.

@alexcohn
Contributor Author

I have found a problem with my approach: the image2 logic breaks because it interprets the content: string as a file path, and it can find % characters there, so it tries to use sprintf() to encode the packet number into the file name.
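For example (a hypothetical Uri, just to illustrate the clash): a document Uri string usually carries percent-escapes, and the image2 pattern expansion treats % sequences as a frame-number template:

String contentUri = "content://com.android.externalstorage.documents/document/primary%3ADCIM%2Fclip%20001.jpg";
// image2 scans this "file name" for printf-style patterns (as in "img%03d.jpg") and tries to
// sprintf() the packet number into it, so it chokes on the %3A / %2F / %20 escapes above.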

@tanersener
Owner

I don't know the details of the content: protocol, but I find it very similar to the data: protocol. Isn't it possible to use the data: protocol for saf?

@alexcohn
Contributor Author

With data:, the entire video (or image) has to be in memory. I was thinking about how the data: protocol could help with ReplayKit, but even there the gap is too wide.

@tanersener
Owner

Apparently content: is very different from data:. I thought it included the actual file data.

@alexcohn alexcohn force-pushed the feature/scoped-storage branch 2 times, most recently from 4ca25d6 to 609d0b9 on July 22, 2020 20:11
@alexcohn alexcohn changed the title from "WIP: Support Android scoped storage" to "Support Android scoped storage" on Jul 22, 2020
@alexcohn
Contributor Author

@tanersener In the end, I feel quite confident with the minimalistic approach. You can see the changes to file.c – this is the only small change necessary in the ffmpeg core code. The PR does not yet include an override of the upstream file.c with the patched one. I believe that a git patch may be a better way to express the intent here, but a simple sed script could also do. What do you think?

@alexcohn
Contributor Author

is it possible to add a tooltip/toast/something about it, so people can get why it is disabled?

sure

@alexcohn
Contributor Author

It took a while to get back to this, but here is a change that seems to answer your expectations. It adds a simple SAF tab. What do you think?

@HBiSoft

HBiSoft commented Sep 7, 2020

Thanks @alexcohn
I've been wanting to test this for some time now and finally got around to it.
It seems to be working perfectly.

@tanersener
Owner

tanersener commented Sep 13, 2020

Thanks, I'm planning to test these changes this week.

I have a few questions regarding the protocol definition. Is it possible to use this protocol without using the Config.getSafParameter() method and its variants?

Also, I'm looking at the syntax of other protocols supported by ffmpeg. There are different implementations out there. Some of the definitions come from the original RFCs. However, I couldn't see a protocol definition that starts with : (not ://) and uses / to define an option. Would you mind using another character to split the file descriptor and the display name?

@alexcohn
Contributor Author

This is not a 'protocol'. In the end, I am using a custom AVIO, and the saf: prefix is used by the wrapper to pass the input substring to this custom context.

You cannot prepare a compliant saf: string without calling Config.getSafParameter() or reproducing its not-very-trivial logic. It's not enough to find out what numeric file descriptor has been assigned for a given content: Uri; you must explicitly request a detached file descriptor (and handle its lifecycle).

The special string prefix closely follows the way custom protocols are defined in ffmpeg, including file:, pipe: and data:. As for the use of /, it does not define an option, but rather adds optional context that may sometimes help when working with the parameter. The choice of the split character is entirely arbitrary; it could be any symbol that can stop the decimal sequence, e.g. saf:123.audio.mp3. But it would be nice if this character is clearly not part of the file name itself, which is why / is one of the natural choices.
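A hedged sketch of the kind of logic such a helper needs (method and variable names below are illustrative, not the PR's actual code):

String buildSafParameter(Context context, Uri uri, String displayName) throws FileNotFoundException {
    ParcelFileDescriptor pfd = context.getContentResolver().openFileDescriptor(uri, "r");
    int fd = pfd.detachFd();   // detached: the native side now owns the descriptor and must close it
    return "saf:" + fd + "/" + displayName;
}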

There is a live discussion about files with spaces in their display name, so before closing this PR, let's make sure that @HBiSoft has no outstanding questions.

@alexcohn
Contributor Author

I guess the spaces problem has finally been resolved for good.

@tanersener
Owner

Tested the changes. Everything except saving to the cloud worked during my tests. Saving to the cloud crashes for me, unfortunately. Was that tested?

@alexcohn
Contributor Author

I confirm that write to Google Drive was failing. Unfortunately, the fix requires JNI. Please see alexcohn@cdb4bdf

@alexcohn
Contributor Author

So, there are no unresolved issues anymore, and writing to Drive works now. Is there anything else missing?

@tanersener tanersener merged commit b898574 into tanersener:development Sep 21, 2020
@tanersener
Owner

No, it was necessary to re-run the tests after your latest commit. Just finished running them and merged the changes.

Thanks for working on that 👍

@PaulWoitaschek

I just tried this and it crashes:

This is the ffprobe command executed:

[-print_format, json=c=1, -show_chapters, -loglevel, quiet, -show_entries, format=duration, -show_entries, format_tags=artist,title,album, -show_entries, stream_tags=artist,title,album, -select_streams, a, -i, saf:69/28 - Kapitel 8_ Mondmänner und Katzonauten.mp3]

And this is the sigabrt crash:

A/libc: fdsan: attempted to close file descriptor 69, expected to be unowned, actually owned by ParcelFileDescriptor 0xcd37e73
A/libc: Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 14414 (DefaultDispatch), pid 14385 (audiobook.debug)

Tested targeting Android 11 using a Pixel 4.

@alexcohn
Contributor Author

Tested targeting Android 11 using a Pixel 4.

Well, I must test Android 11. Also, where is the media file actually located?

@PaulWoitaschek

It's on /storage/sdcard/audiobooks/xyz/abc.mp3

@alexcohn
Contributor Author

I confirm this problem on Android API 30.

imostovyi pushed a commit to imostovyi/mobile-ffmpeg that referenced this pull request Apr 2, 2021
* development: (155 commits)
  fix android async example
  update ios manual test application
  update android test app
  support iOS 14, fixes tanersener#553
  Support Android scoped storage (tanersener#440)
  Update arch-common.sh (tanersener#534)
  fix monitorWait method for android, fixes tanersener#520
  reorganise android.mk for cpu-features
  update the packaging for android arm-v7a and arm-v7a-neon architectures and remove custom abi rules for releases, fixes tanersener#511
  fix proguard rules, fixes tanersener#508
  target android api level 30
  update android gradle version
  Update README
  move tooltips to resources
  fix android release packaging script
  update ios test-app settings
  fix ios/tvos system library indexes in ffmpeg build scripts
  use v4.4.LTS in test applications
  release v4.4.LTS
  fix mobile-ffmpeg videotoolbox flags for tvos
  ...

# Conflicts:
#	ios/configure
#	ios/src/MobileFFmpeg.h
#	ios/src/MobileFFmpeg.m
#	src/ffmpeg/configure
#	src/ffmpeg/libswscale/aarch64/hscale.S
#	src/libvorbis/doc/Vorbis_I_spec.html
#	src/libvorbis/doc/Vorbis_I_spec.pdf