MLPerf Mobile App v3.1
Description
MLPerf Mobile Inference Benchmark is an open-source benchmark suite for measuring how fast mobile devices (e.g., phones and laptops) can run AI tasks. The benchmark suite is supported by the MLPerf Mobile App, which currently runs on Android and iOS. Please see the MLPerf Mobile Inference benchmark paper for a detailed description of the benchmarks, along with the motivation and guiding principles behind the benchmark suite.
Release Notes
New features
- Added result upload and download.
- Added result filtering and sorting in the history screen.
- Added delegate choices for multiple backends.
- Added a default backend for Qualcomm SOCs.
Bug fixes and other changes
- Resolved an issue where the cooldown pause was not cancelled correctly.
- Added PNG support and a PNG-based SNUSR dataset.
- Added support for pbtxt files instead of header files for backend settings.
- Updated menu navigation in the main screen for a cleaner UI.
- Added a max_duration flag.
- Updated the loadgen to the latest v3.1 version.
- Migrated from Flutter v3.3.5 to v3.7.6.
- Updated and improved the CI/CD system.
Full Changelog: v3.0...v3.1
Supported SOCs
Samsung
- Exynos 990
- Exynos 2100
- Exynos 2200
- Exynos 2300
- Exynos 2400
Qualcomm
- Snapdragon 8 Gen 2
- Snapdragon 7+ Gen 2
- Snapdragon 8+ Gen 1
- Snapdragon 8 Gen 1
- Snapdragon 7 Gen 1
- Snapdragon 888
- Snapdragon 778
- Default backend
MediaTek
- Dimensity 9000 series (9000/9000+/9200/9200+)
- Dimensity 8000 series (8000/8020/8050/8100/8200)
Pixel
- Pixel 6 and Pixel 6 Pro using the Tensor G1 SOC
- Pixel 7 and Pixel 7 Pro using the Tensor G2 SOC
Installation instructions
- Allow installation of unknown apps in Settings > Apps > Special Access.
- Download the MLPerf Mobile APK.
- Find the APK in 'Downloads' or your file browser.
- Tap the APK file. Approve installation when prompted.
- Confirm 'Install'.
- Once installed, tap 'Open' to launch MLPerf Mobile.
SHA256 of mlperfbench-v3.1-qsmgt.apk: 92f6cfdad9fa7c3ec5c1728c70107fd38d07e03e59c372089026c0c75789b3e5
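To confirm the download is intact before installing, you can compare the APK's SHA-256 digest against the value above. Below is a minimal sketch in Python; the download path is an assumption, so adjust it to wherever you saved the file.

```python
import hashlib
from pathlib import Path

# Published checksum from this release page.
EXPECTED_SHA256 = "92f6cfdad9fa7c3ec5c1728c70107fd38d07e03e59c372089026c0c75789b3e5"

# Assumed download location; change this to match where you saved the APK.
apk_path = Path.home() / "Downloads" / "mlperfbench-v3.1-qsmgt.apk"

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

actual = sha256_of(apk_path)
if actual == EXPECTED_SHA256:
    print("Checksum OK: the APK matches the published SHA256.")
else:
    print(f"Checksum mismatch!\n  expected: {EXPECTED_SHA256}\n  actual:   {actual}")
```

If the digests do not match, re-download the APK rather than installing it.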