Enable modules snapshot for Android #1563
Can the snapshot be generated on a PC, or does it have to be generated on the device? If it has to be on the device, the developer could initially run a command manually, or call some kind of function, to generate it and then send it back for deployment. If this greatly improves things, I will be happy even if that is the only way :)
Can this be extended to put the actual app code inside the blob?
I would suggest investigating whether an alternative to V8 with multiple execution tiers, such as JavaScriptCore, SpiderMonkey or ChakraCore, would obviate the need to jump through heap snapshotting hoops - over in iOS land we are very happy with our startup times, since bootstrapping the JavaScriptCore interpreter is way faster than bringing up the optimizing JIT tier. Since V8 lacks an interpreter, it takes longer before it can execute JavaScript code because it has to JIT compile it, right?
@atanasovg Do you use the same approach as Atom's https://github.com/atom/node-mksnapshot?
Hello, guys. I'm going to update the issue with the results of more recent research we did on V8 heap snapshots. Feel free to ask questions if something is missing or not quite clear to you.
Results
Here are the startup times of {N} Angular on a Nexus 5 device. Only second runs are included, in release configuration.
{N} Angular Hello World App - Source
{N} Angular Render Test - Source
Snapshot size is about 12-16 MB (4-6 MB when the script is minified) per architecture.
Tasks - 2.0
Tasks - 2.1
Ping @atanasovg, @KristinaKoeva.
P.S. Regarding encryption, the heap snapshot turns out to be a poor fit, because all of the JavaScript source seems to be included inside the snapshot data blob.
There is no `index.js` in the main folder and this breaks bundling. Related to: NativeScript/NativeScript#1563
The repo can be found here: https://github.com/NativeScript/android-snapshot
Is the loading performance improvement already in 2.0?
@x4080 This is not enabled by default for now, but we are looking for ways to do so in the next release.
I see, so the NS demo app is still not using it yet, I guess?
@x4080 Not yet. Once we use it there, you can expect the app to start a whole lot faster 😄
Alright then
I can confirm that the mksnapshot tool from V8, when cross-compiled for ARM, successfully generates ARM snapshots using the V8 ARM simulator on the host machine, without the need for any ARM devices or emulators 🎆
The Story
In a nutshell: application loading time on Android is one of the areas that needs improvement, so we've made a POC implementation that takes advantage of V8's startup snapshot feature to see what can be achieved with it. The results are quite promising - we may gain more than 1 second of improvement by snapshotting all the modules.
Technical Details
Due to V8 API specifics, we need to bundle the entire modules JavaScript into a single file and pass it to the V8::CreateSnapshotDataBlob method. When making the snapshot, V8 parses, compiles and runs this script in a new Context and then saves the state of the heap into a binary representation. On subsequent application runs, this binary file can be used to load the whole representation of the modules directly into memory.
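For reference, a minimal sketch of that flow, assuming the V8 C++ API available around the time of this issue (V8::CreateSnapshotDataBlob; newer V8 versions expose v8::SnapshotCreator instead). The function names are illustrative, not the actual runtime code:

```cpp
#include <v8.h>

// Hedged sketch only - illustrative names, not the NativeScript runtime code.

// Build a snapshot blob from the bundled modules source. V8 parses, compiles
// and runs the script in a fresh Context, then serializes the resulting heap
// into a binary blob.
v8::StartupData MakeModulesSnapshot(const char* bundled_modules_js) {
  return v8::V8::CreateSnapshotDataBlob(bundled_modules_js);
}

// Boot an isolate from a previously generated blob, so the modules are
// already present in the deserialized heap instead of being re-parsed.
v8::Isolate* BootFromSnapshot(v8::StartupData* blob,
                              v8::ArrayBuffer::Allocator* allocator) {
  v8::Isolate::CreateParams params;
  params.snapshot_blob = blob;
  params.array_buffer_allocator = allocator;
  return v8::Isolate::New(params);
}
```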
Here is my proposal for taking advantage of this feature:
Distribute pre-generated BLOBs
The snapshot is CPU-architecture dependent. Hence, if we want to distribute pre-generated versions of the snapshot, we will need to package three files, saved against the three available architectures - `armeabi-v7a`, `x86` and `arm64-v8a`. The average size of one file is ~3 MB, but it compresses very well and an archived version is about 400 KB. This is the most efficient way performance-wise, and it adds a further optimization by skipping the extraction of numerous JS files initially.
Tasks:
Ensure modules are snapshot-ready. This is already done - Atanasovg/snapshot refactorings #1407
Write a custom bundler that puts all the modules content into a single JS file. I've already done a custom Node task for the POC, but we may go with webpack, for example (as @hdeshev suggested).
Think how to automate snapshot generation against the three CPU architectures.
Think how to distribute (package) these three binary files. For example, we may have two packages - `tns-core-modules` and `tns-core-modules-snapshot` - or we may package everything into one package.
Since the debugger does not work with snapshots, we will need to rely on the snapshot for release builds ONLY. Hence, there will be some effort on the CLI side. Plus, for release builds the CLI should not pack the modules JavaScript files but only the BLOBs (ping @teobugslayer, @ligaz).
Enable the Android Runtime to consume such a BLOB directly, depending on the current CPU architecture. We may use the same convention as for the native part of the Runtime itself, for example:
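The concrete convention example is not shown above; as a purely illustrative sketch (the file layout and helper names below are hypothetical, not the runtime's actual convention), consuming the blob for the current ABI could look roughly like this:

```cpp
#include <fstream>
#include <string>
#include <vector>
#include <v8.h>

// Hypothetical sketch: pick the snapshot blob matching the current ABI
// (e.g. "armeabi-v7a", "x86", "arm64-v8a", as reported from the Java side)
// and plug it into the isolate creation parameters. Paths are illustrative.
static std::vector<char> g_blob_storage;  // must outlive the isolate
static v8::StartupData g_blob;

bool TryLoadSnapshotBlob(const std::string& appDir,
                         const std::string& abi,
                         v8::Isolate::CreateParams& params) {
  std::ifstream file(appDir + "/snapshots/" + abi + "/snapshot.blob",
                     std::ios::binary | std::ios::ate);
  if (!file) {
    return false;  // no blob for this ABI -> fall back to loading plain JS
  }
  std::streamsize size = file.tellg();
  file.seekg(0, std::ios::beg);
  g_blob_storage.resize(static_cast<size_t>(size));
  file.read(g_blob_storage.data(), size);

  g_blob.data = g_blob_storage.data();
  g_blob.raw_size = static_cast<int>(size);
  params.snapshot_blob = &g_blob;  // V8 deserializes the modules heap from here
  return true;
}
```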