Releases: mlcommons/mobile_app_open
MLPerf Mobile App v4.0
Description
MLPerf Mobile Inference Benchmark is an open-source benchmark suite for measuring how fast mobile devices (e.g., phones and laptops) can run AI tasks. The benchmark is supported by the MLPerf Mobile App, which currently supports Android and iOS. Please see the MLPerf Mobile Inference benchmark paper for a detailed description of the benchmarks, along with the motivation and guiding principles behind the benchmark suite.
Release Notes
New features
- Added a new benchmark task: Image Classification v2
- Implemented a new design for various screens.
- Added the ability to manage your uploaded results directly in the app.
- Added a web interface to view uploaded results (https://mlperf-mobile.mlcommons.org)
Bug fixes and other changes
- Integrated with Firebase Crashlytics to identify and fix app crashes.
- Updated and improved the CI/CD system.
Full Changelog: v3.1...v4.0
Supported SoCs
Mediatek
- Dimensity 9000 series (9000/9000+/9200/9200+/9300/9300+)
- Dimensity 8000 series (8000/8020/8050/8100/8200)
Qualcomm
- Snapdragon 8 Gen 3
- Snapdragon 8s Gen 3
- Snapdragon 7 Gen 3
- Snapdragon 8 Gen 2
- Snapdragon 7+ Gen 2
- Snapdragon 8+ Gen 1
- Snapdragon 8 Gen 1
- Snapdragon 7 Gen 1
- Snapdragon 888
- Snapdragon 778
Samsung
- Exynos 2400
- Exynos 2300
- Exynos 2200
- Exynos 2100
- Exynos 990
Google Pixel
- Pixel 8 and Pixel 8 Pro (Tensor G3 SoC)
- Pixel 7 and Pixel 7 Pro (Tensor G2 SoC)
- Pixel 6 and Pixel 6 Pro (Tensor G1 SoC)
MLPerf Mobile 4.0 will also run on a host of other devices via our default path, which uses TensorFlow Lite on Android devices.
Installation instructions
- Allow installation of unknown apps in Settings > Apps > Special Access.
- Download the MLPerf Mobile APK.
- Find the APK in 'Downloads' or in your file browser.
- Tap the APK file. Approve installation when prompted.
- Confirm 'Install'.
- Once installed, tap 'Open' to launch MLPerf Mobile.
SHA256 of mlperfbench-v4.0-qsmgt.apk: 816ceb4358b5dc5c90f3e67f72cbadd94531f265799403124a1b1dfe0b56a9a9
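Before installing, you can check that the downloaded APK matches the digest above. A minimal sketch of the workflow using `sha256sum` from GNU coreutils (the commented command uses this release's APK filename; the live demo hashes a small stand-in file so the steps are concrete):

```shell
# For the real APK you would run:
#   sha256sum mlperfbench-v4.0-qsmgt.apk
# and compare the printed digest to the value listed above.

# Demo with a stand-in file:
printf 'hello' > demo.bin
sha256sum demo.bin | awk '{print $1}'
# → 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```

If the digest does not match, the download is corrupt or has been tampered with; re-download rather than installing.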
MLPerf Mobile App v3.1
Release Notes
New features
- Added result upload and download.
- Added result filtering and sorting in the history screen.
- Added delegate choices for multiple backends.
- Added default backend support for Qualcomm devices.
Bug fixes and other changes
- Resolved an issue where the cooldown pause was not cancelled correctly.
- Added PNG support and PNG-based SNUSR dataset.
- Added support for pbtxt files instead of header files for backend settings.
- Updated menu navigation in the main screen for a cleaner UI.
- Added max_duration flag.
- Updated the loadgen to the latest v3.1 version.
- Migrated from Flutter v3.3.5 to v3.7.6
- Updated and improved the CI/CD system.
Full Changelog: v3.0...v3.1
Supported SoCs
Samsung
- Exynos 990
- Exynos 2100
- Exynos 2200
- Exynos 2300
- Exynos 2400
Qualcomm
- Snapdragon 8 Gen 2
- Snapdragon 7+ Gen 2
- Snapdragon 8+ Gen 1
- Snapdragon 8 Gen 1
- Snapdragon 7 Gen 1
- Snapdragon 888
- Snapdragon 778
- Default backend
Mediatek
- Dimensity 9000 series (9000/9000+/9200/9200+)
- Dimensity 8000 series (8000/8020/8050/8100/8200)
Pixel
- Pixel 6 and Pixel 6 Pro (Tensor G1 SoC)
- Pixel 7 and Pixel 7 Pro (Tensor G2 SoC)
Installation instructions
- Allow installation of unknown apps in Settings > Apps > Special Access.
- Download the MLPerf Mobile APK.
- Find the APK in 'Downloads' or in your file browser.
- Tap the APK file. Approve installation when prompted.
- Confirm 'Install'.
- Once installed, tap 'Open' to launch MLPerf Mobile.
SHA256 of mlperfbench-v3.1-qsmgt.apk: 92f6cfdad9fa7c3ec5c1728c70107fd38d07e03e59c372089026c0c75789b3e5
MLPerf Mobile App v3.0
Release Notes
New features
- Added a new Super Resolution task with the SNUSR dataset.
- Added a new Core ML backend for the iOS platform.
- Included additional device information in the result log (SoC, model, platform, etc.)
- Added an Accuracy Only test mode.
Bug fixes and other changes
- Fixed a bug where the accuracy value was not valid.
- Fixed a bug where the integration test would crash.
- Fixed a bug where the dataset info was not saved correctly.
- Fixed a bug where the backend loading error would be ignored.
- Added missing permission to access media on Android 13.
- Updated the loadgen to the latest v2.1 version.
- Migrated from Flutter v2.10.5 to v3.3.5
Supported SoCs
Samsung
- Exynos 990
- Exynos 2100
- Exynos 2200
Qualcomm
- Snapdragon 8 Gen 2
- Snapdragon 7+ Gen 2
- Snapdragon 8+ Gen 1
- Snapdragon 8 Gen 1
- Snapdragon 7 Gen 1
- Snapdragon 888
- Snapdragon 778
Mediatek
- Dimensity 9000 series (9000/9000+/9200/9200+)
- Dimensity 8000 series (8000/8020/8050/8100/8200)
Pixel
- Pixel 6 and Pixel 6 Pro (Tensor G1 SoC)
- Pixel 7 and Pixel 7 Pro (Tensor G2 SoC)
Installation instructions
- Allow installation of unknown apps in Settings > Apps > Special Access.
- Download the MLPerf Mobile APK.
- Find the APK in 'Downloads' or in your file browser.
- Tap the APK file. Approve installation when prompted.
- Confirm 'Install'.
- Once installed, tap 'Open' to launch MLPerf Mobile.
SHA256 of mlperfbench-v3.0-qsmgt.apk: 195a110ab318f153631eb904abcffdda8c291c4e1ad9413ac68a5d447a1d0a1f
MLPerf Windows Command Line App v3.0
Installation instructions:
https://github.com/mlcommons/mobile_app_open/blob/submission-v3.0/mobile_back_qti/README.md
Supported SoC:
Snapdragon 8CX Gen 3 (Windows on Arm)
SHA256 of 2023-03-13_mlperfbench_windows_qualcomm.zip: 7ab12364cfa9d9aea90af51d48242afdfd3e222fc729cc155ecc8d3151ed7ea7
v2.1 Flutter Android
A Flutter-based Android APK with the following vendor backends:
- Q - QTI (SNPE SDK 1.65.0.3676 was used in this build.)
- S - Samsung SLSI
- M - MediaTek
- G - Google Pixel
- T - TFLite
and tasks:
- Image Classification
- Object Detection
- Image Segmentation (v2.0)
- Language Understanding
- Image Classification (Offline)
v2.0 Android
File naming convention for vendor backends:
Q - QTI
S - Samsung SLSI
M - MediaTek
G - Google Pixel
I - Intel
H - Huawei
T - TFLite
Notes:
SNPE SDK 1.59.1.3230 was used in this build.
v1.1 Android
File naming convention for vendor backends:
- Q - QTI
- S - Samsung SLSI
- M - MediaTek
- G - Google Pixel
- I - Intel
- H - Huawei
- T - TFLite
Notes:
- SNPE SDK 1.54.2.2899 was used in this build.
Flutter Android Alpha 4
File naming convention for vendor backends:
- Q - QTI
- S - Samsung SLSI
- M - MediaTek
- G - Google Pixel
- I - Intel
- H - Huawei
- T - TFLite
Note: SNPE SDK 1.54.2.2899 was used in this build.
v2.0.1a MOSAIC task
Old Android app with a new MOSAIC task added.
Flutter Android Alpha 3
Changes since Alpha 1:
- Fixed passing nativeLibPath to the backend.