Describe the feature request
Issue Description
Currently, the onnxruntime-gpu package lacks official support for the ARM64/aarch64 architecture, which limits GPU acceleration on increasingly popular ARM-based platforms.
Current Situation
No official pre-built wheels for onnxruntime-gpu on ARM64/aarch64
Limited documentation for ARM64 GPU deployment
Technical Details
Architecture: ARM64/aarch64
Platform: Various (e.g., AWS Graviton)
Python Versions: 3.8+ compatibility needed
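For context, a quick way to see what the installed wheel actually exposes on one of these platforms is to print the available execution providers. This is only a diagnostic sketch using the standard onnxruntime Python API; with today's CPU-only ARM64 wheel, CUDAExecutionProvider does not appear in the list.

```python
# Diagnostic sketch: report the architecture and the execution providers
# exposed by the installed onnxruntime wheel. On aarch64 today only the
# CPU-only wheel is published, so CUDAExecutionProvider is missing.
import platform

import onnxruntime as ort

print("Architecture:", platform.machine())                   # e.g. "aarch64"
print("Available providers:", ort.get_available_providers())
# With an official onnxruntime-gpu ARM64 wheel this should include:
#   ['CUDAExecutionProvider', 'CPUExecutionProvider']
```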
Proposed Solution
Official pre-built wheels for ARM64/aarch64
CI/CD pipeline additions for ARM64 builds
I would appreciate any feedback or guidance on how to make this happen.
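To illustrate what official GPU wheels would enable, here is a minimal usage sketch; the model path is a placeholder, and ONNX Runtime simply falls back to the next provider in the list when CUDA is unavailable.

```python
# Minimal sketch of GPU-first inference with CPU fallback; "model.onnx" is
# a placeholder path. With an ARM64 onnxruntime-gpu wheel installed, the
# session would use CUDAExecutionProvider on ARM64 hardware with a
# supported GPU and fall back to CPUExecutionProvider otherwise.
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
print("Providers in use:", session.get_providers())
```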
Describe scenario use case
Use Cases:
Growing adoption of ARM64 in edge and HPC computing
Cloud deployments on ARM64-based servers (AWS Graviton, etc.)
Machine learning workloads on newer ARM-based development machines
IoT and embedded systems requiring GPU acceleration