
[Documentation] CoreML python support and how to switch to MLProgram #22792

Open
Rikyf3 opened this issue Nov 10, 2024 · 3 comments · May be fixed by #22958
Labels: documentation, ep:CoreML


Rikyf3 commented Nov 10, 2024

Describe the documentation issue

Hi.

First of all thank you for the incredible work you do :D

I was looking at the Apple - CoreML page for help on fully supporting a production model with the CoreML EP. Since the model contains ConvTranspose, I have to enable MLProgram. The production code is in Python. The documentation states neither whether the Python API is supported nor how to enable COREML_FLAG_ONLY_ENABLE_DEVICE_WITH_ANE.

Could these two pieces of information be added to the documentation?

Thanks,
Riccardo

Page / URL

https://onnxruntime.ai/docs/execution-providers/CoreML-ExecutionProvider.html

@Rikyf3 added the documentation label on Nov 10, 2024
@github-actions bot added the ep:CoreML label on Nov 10, 2024

wejoncy commented Nov 15, 2024

Thanks for filing the issue.

A more general path to enabling MLProgram is being implemented in this PR.

After that, I will make the documentation clearer and more detailed.
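
For anyone landing here later: after that PR, the CoreML EP moves from a flags string to named provider options. A minimal sketch of that style, assuming the option names from the updated documentation (ModelFormat, MLComputeUnits; verify against the docs for your installed version):

import onnxruntime as ort

model_path = "model.onnx"  # placeholder: path to your ONNX model

# Assumed post-PR style: named provider options instead of a flags string.
providers = [
    ('CoreMLExecutionProvider', {
        'ModelFormat': 'MLProgram',   # request the ML Program model format
        'MLComputeUnits': 'ALL',      # let CoreML choose CPU/GPU/ANE
    }),
    'CPUExecutionProvider',
]

session = ort.InferenceSession(model_path, providers=providers)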


Rikyf3 commented Nov 17, 2024

Thank you very much :)
Is there any way I can already enable it in 1.19.0, to make the model fully compatible?


wejoncy commented Nov 18, 2024

import onnxruntime as ort

model_path = "model.onnx"  # placeholder: path to your ONNX model

# Enable the ML Program format via the CoreML EP 'flags' option,
# keeping the CPU EP as a fallback for unsupported nodes.
providers = [
    ('CoreMLExecutionProvider', {
        'flags': "COREML_FLAG_CREATE_MLPROGRAM",
    }),
    'CPUExecutionProvider',
]

session = ort.InferenceSession(model_path, providers=providers)
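
The issue also asked about COREML_FLAG_ONLY_ENABLE_DEVICE_WITH_ANE, which per the C header restricts the CoreML EP to devices with a compatible Apple Neural Engine. A sketch of combining it with the flag above, assuming the 'flags' string accepts several flag names joined with '|' (mirroring how the underlying C API ORs the COREML_FLAG_* bits; not verified against 1.19.0):

providers = [
    ('CoreMLExecutionProvider', {
        # Assumption: '|'-separated flag names are parsed into combined flags.
        'flags': "COREML_FLAG_CREATE_MLPROGRAM|COREML_FLAG_ONLY_ENABLE_DEVICE_WITH_ANE",
    }),
    'CPUExecutionProvider',
]

session = ort.InferenceSession(model_path, providers=providers)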

@wejoncy linked a pull request on Nov 29, 2024 that will close this issue
@wejoncy self-assigned this on Nov 29, 2024