For the taxi MLflow model use case (online deployment), I tried using Kubernetes as the compute resource.
As per the Microsoft docs, I created an AKS cluster, configured the extension, and attached it to the ML workspace. Then I attempted the online deployment via CLI v2, where I am facing the following error: "InferencingClientCallFailed".
When deploying via the UI, even for MLflow models, it asks me to upload a scoring script and environment (which is not the case for managed online endpoints).
So, do we need to manually add/configure a scoring script and environment for MLflow models when we use Kubernetes?
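For reference, here is the kind of deployment spec I was expecting to work without a scoring script, sketched from the CLI v2 docs. The endpoint, model, and instance-type names below are placeholders for my setup, not values from any official sample:

```yaml
# deployment.yml — hypothetical Kubernetes online deployment for an MLflow model.
# With managed online endpoints, an MLflow model needs no code_configuration
# or environment; I assumed the same spec shape would work on Kubernetes compute.
$schema: https://azuremlschemas.azureedge.net/latest/kubernetesOnlineDeployment.schema.json
name: blue
endpoint_name: taxi-endpoint        # placeholder endpoint name
model: azureml:taxi-model@latest    # placeholder registered MLflow model
instance_type: defaultinstancetype  # instance type defined on the attached cluster
instance_count: 1
```

Deployed with `az ml online-deployment create -f deployment.yml --all-traffic`, which is where the "InferencingClientCallFailed" error appears.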