
fix error when deploying a model from mlflow #4413

Merged
merged 3 commits into triton-inference-server:main on Jun 7, 2022

Conversation

Rusteam
Contributor

@Rusteam Rusteam commented May 20, 2022

Add `config_path` key to `copy_paths` to avoid `KeyError` when deploying from mlflow models in **onnx** flavor with `config.pbtxt`
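The fix above can be sketched roughly as follows. This is a hypothetical illustration, not the plugin's actual code: the function name `build_copy_paths` and the directory layout are assumptions; only the idea of pre-registering the `config_path` key in `copy_paths` comes from the PR description.

```python
import os

def build_copy_paths(model_dir):
    # Hypothetical sketch of the copy_paths mapping the fix touches.
    # Before the fix, a later copy_paths["config_path"] lookup raised
    # KeyError when an ONNX-flavor MLflow model shipped its own
    # config.pbtxt; registering the key up front avoids that.
    copy_paths = {
        "model_path": os.path.join(model_dir, "model.onnx"),
        # The fix: always include the config_path key.
        "config_path": os.path.join(model_dir, "config.pbtxt"),
    }
    return copy_paths

paths = build_copy_paths("/models/my_onnx_model")
print(paths["config_path"])
```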
@GuanLuo
Contributor

GuanLuo commented May 23, 2022

Hi @Rusteam, thanks for the contribution. The change looks good, but would you mind signing the CLA as instructed here before we merge this pull request?

@Rusteam
Contributor Author

Rusteam commented May 24, 2022

I don't mind. But I'm not a corporation, I'm an individual contributor. Is that okay?

@Rusteam
Contributor Author

Rusteam commented Jun 1, 2022

@GuanLuo I've sent a signed CLA to the email mentioned. Can you review and merge now?

I've also added two other commits: one to fix ensemble deployment, and one to copy labels.txt if present.
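The "labels.txt if present" behavior might look something like this minimal sketch. The function and directory names here are assumptions for illustration, not the PR's actual code:

```python
import os
import shutil

def copy_labels_if_present(artifact_dir, triton_model_dir):
    # Hypothetical helper: copy labels.txt into the Triton model
    # directory only if the MLflow artifacts include one, so models
    # without labels still deploy cleanly.
    labels = os.path.join(artifact_dir, "labels.txt")
    if not os.path.exists(labels):
        return False  # nothing to copy; deployment proceeds without labels
    os.makedirs(triton_model_dir, exist_ok=True)
    shutil.copy(labels, os.path.join(triton_model_dir, "labels.txt"))
    return True
```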

@GuanLuo
Contributor

GuanLuo commented Jun 1, 2022

@Rusteam sorry that I missed the earlier message; we have received your CLA, and the change looks good to me. I would like to extend the L0_mlflow test to cover the case where an ONNX model is logged with additional files (i.e. config / labels). Do you have any insight on logging the ONNX flavor in this form? Currently the test uses the MLflow ONNX plugin to log the model, but I don't think mlflow.onnx.log_model can accept additional files.

@Rusteam
Contributor Author

Rusteam commented Jun 2, 2022

You're right, it does not. What I do is log the other files as artifacts right after logging the model, i.e.:

import mlflow

with mlflow.start_run():
    mlflow.onnx.log_model(model, path)
    # log config and labels into the same artifact path as the model
    mlflow.log_artifact("labels.txt", path)
    mlflow.log_artifact("config.pbtxt", path)

@GuanLuo
Contributor

GuanLuo commented Jun 4, 2022

Thanks for the hint. I have extended the test to cover the case with config in PR #4469; feel free to leave any comments.

@GuanLuo GuanLuo merged commit 359ebd8 into triton-inference-server:main Jun 7, 2022