
[Inference API] Check pipelines on delete inference endpoint #109123

Merged
merged 20 commits on May 31, 2024

Conversation

maxhniebergall
Member

This PR adds the requirement that an inference endpoint must not be referenced by an InferenceProcessor in any ingest pipeline before it can be deleted. It also adds two new options to the Delete inference endpoint API: dry_run and force. dry_run:true returns a list of the ingest processors which reference this inference endpoint, without deleting anything. force:true deletes the inference endpoint regardless of whether it is referenced by ingest processors.
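
As an illustration of how the two flags are meant to interact, here is a minimal, hedged sketch of the server-side decision logic. The class, record, and method names below are assumptions for illustration only, not the exact ones used in this PR.

import java.util.Set;

// Hypothetical sketch of the dry_run / force semantics described above.
// Names (Request, Response, findReferencingPipelines, deleteEndpoint) are illustrative.
final class DeleteEndpointSemanticsSketch {

    record Request(String endpointId, boolean dryRun, boolean force) {}

    record Response(boolean deleted, Set<String> referencingPipelines) {}

    Response handle(Request request) {
        Set<String> pipelines = findReferencingPipelines(request.endpointId());
        if (request.dryRun()) {
            // Report what would block the delete, but do not delete anything.
            return new Response(false, pipelines);
        }
        if (pipelines.isEmpty() == false && request.force() == false) {
            // Referenced and not forced: refuse to delete.
            throw new IllegalStateException(
                "Inference endpoint [" + request.endpointId() + "] is referenced by pipelines " + pipelines
            );
        }
        deleteEndpoint(request.endpointId());
        return new Response(true, Set.of());
    }

    Set<String> findReferencingPipelines(String endpointId) {
        return Set.of(); // placeholder for the ingest-metadata lookup
    }

    void deleteEndpoint(String endpointId) {
        // placeholder for the actual deletion
    }
}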

@elasticsearchmachine
Collaborator

Hi @maxhniebergall, I've created a changelog YAML for you.

import java.util.Objects;
import java.util.Set;

public class DeleteInferenceEndpointAction extends ActionType<AcknowledgedResponse> {
Member Author

this file already existed, it was just renamed

}
}

public static class Response extends AcknowledgedResponse {
Member Author

the main changes to this file are here, in the Response
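
Presumably the Response type gains a field carrying the IDs of the pipelines that reference the endpoint, so a dry run can report them alongside the acknowledged flag. A minimal sketch under that assumption (field and method names are guesses; the real class extends AcknowledgedResponse and handles stream serialization, which is omitted here):

import java.util.Set;

// Hedged sketch, not the PR's exact code.
class ResponseSketch {
    private final boolean acknowledged;
    private final Set<String> pipelineIds;

    ResponseSketch(boolean acknowledged, Set<String> pipelineIds) {
        this.acknowledged = acknowledged;
        this.pipelineIds = Set.copyOf(pipelineIds);
    }

    boolean isAcknowledged() {
        return acknowledged;
    }

    // Pipelines that still reference the inference endpoint (empty if none).
    Set<String> pipelineIds() {
        return pipelineIds;
    }
}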


import java.util.Set;

public class TransportDeleteInferenceEndpointAction extends AcknowledgedTransportMasterNodeAction<DeleteInferenceEndpointAction.Request> {
Member Author

this file was renamed

Comment on lines +89 to +100
if (request.isDryRun()) {
masterListener.onResponse(
new DeleteInferenceEndpointAction.Response(
false,
InferenceProcessorInfoExtractor.pipelineIdsForResource(state, Set.of(request.getInferenceEndpointId()))
)
);
return;
} else if (request.isForceDelete() == false
&& endpointIsReferencedInPipelines(state, request.getInferenceEndpointId(), listener)) {
return;
}
Member Author

this is the main change in this file

* this class was duplicated from org.elasticsearch.xpack.ml.utils.InferenceProcessorInfoExtractor
*/

public final class InferenceProcessorInfoExtractor {
Member Author

I couldn't figure out how to import this function from the ML module without causing jar hell, so I just duplicated it here.
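
For context, the duplicated extractor's job is to walk the ingest pipeline configurations in cluster state and collect the IDs of pipelines whose inference processors reference a given set of model/endpoint IDs. A rough, hedged sketch of that idea, operating on plain maps rather than the real IngestMetadata/ClusterState types (not the PR's actual code):

import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hedged sketch of "which pipelines reference these IDs"; the real
// InferenceProcessorInfoExtractor works against cluster state.
final class PipelineReferenceSketch {

    // pipelines: pipelineId -> pipeline config, where the config contains a
    // "processors" list and each inference processor has a "model_id" field.
    static Set<String> pipelineIdsForIds(Map<String, Map<String, Object>> pipelines, Set<String> ids) {
        Set<String> referencing = new HashSet<>();
        for (Map.Entry<String, Map<String, Object>> entry : pipelines.entrySet()) {
            Object processors = entry.getValue().get("processors");
            if (processors instanceof List<?> processorList) {
                for (Object processor : processorList) {
                    if (processor instanceof Map<?, ?> processorMap) {
                        Object inference = processorMap.get("inference");
                        if (inference instanceof Map<?, ?> inferenceConfig
                            && ids.contains(inferenceConfig.get("model_id"))) {
                            referencing.add(entry.getKey());
                        }
                    }
                }
            }
        }
        return referencing;
    }
}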

Contributor

I tried moving org.elasticsearch.xpack.ml.utils.InferenceProcessorInfoExtractor to package org.elasticsearch.xpack.core.ml.utils and it seems like I can reference it from both the inference plugin and ml. I didn't replace all the ml plugin's references though. I did have to hardcode the TYPE like you mentioned below (maybe that could be moved too, I didn't look too hard).

Does moving it there cause issues for you?

Member Author

I did try that initially and it caused jar hell, so I am hesitant to try it again. It seems like it should work, but for some reason it didn't when I tried it.

# Conflicts:
#	server/src/main/java/org/elasticsearch/TransportVersions.java
#	x-pack/plugin/inference/src/main/java/org/elasticsearch/xpack/inference/InferencePlugin.java
@maxhniebergall maxhniebergall marked this pull request as ready for review May 28, 2024 19:42
@elasticsearchmachine elasticsearchmachine added the Team:ML label May 28, 2024
@elasticsearchmachine
Collaborator

Pinging @elastic/ml-core (Team:ML)


Contributor

@timgrein timgrein left a comment

Left some smaller comments, cool stuff! 👏

@maxhniebergall maxhniebergall self-assigned this May 29, 2024
if (modelIdsReferencedByPipelines.contains(inferenceEndpointId)) {
listener.onFailure(
new ElasticsearchStatusException(
"Model "
Contributor

Should this say Inference endpoint instead of Model?

Member Author

I think I'll do a follow-up PR that more thoroughly replaces "model" throughout the module.
