[Inference API] Check pipelines on delete inference endpoint #109123
Conversation
Hi @maxhniebergall, I've created a changelog YAML for you.
…hub.com/elastic/elasticsearch into checkPipelinesOnDeleteInferenceEndpoint
import java.util.Objects;
import java.util.Set;

public class DeleteInferenceEndpointAction extends ActionType<AcknowledgedResponse> {
this file already existed, it was just renamed
        }
    }

    public static class Response extends AcknowledgedResponse {
the main changes to this file are here, in the Response
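For orientation, here is a minimal sketch of what the extended Response might contain, inferred from the constructor call in the transport action snippet further down (an acknowledged flag plus the IDs of referencing ingest pipelines); the field and accessor names are assumptions, not the exact code in this PR.

    public static class Response extends AcknowledgedResponse {

        // IDs of ingest pipelines that still reference the inference endpoint (assumed field name)
        private final Set<String> referencedPipelineIds;

        public Response(boolean acknowledged, Set<String> referencedPipelineIds) {
            super(acknowledged);
            this.referencedPipelineIds = referencedPipelineIds;
        }

        // Accessor used when rendering the dry-run result (assumed name)
        public Set<String> pipelineIds() {
            return referencedPipelineIds;
        }

        // StreamInput/StreamOutput constructors, writeTo, and toXContent are omitted from this sketch.
    }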
import java.util.Set;

public class TransportDeleteInferenceEndpointAction extends AcknowledgedTransportMasterNodeAction<DeleteInferenceEndpointAction.Request> {
this file was renamed
if (request.isDryRun()) {
    masterListener.onResponse(
        new DeleteInferenceEndpointAction.Response(
            false,
            InferenceProcessorInfoExtractor.pipelineIdsForResource(state, Set.of(request.getInferenceEndpointId()))
        )
    );
    return;
} else if (request.isForceDelete() == false
    && endpointIsReferencedInPipelines(state, request.getInferenceEndpointId(), listener)) {
        return;
    }
this is the main change in this file
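As a reading aid, here is a hedged sketch of the endpointIsReferencedInPipelines helper called above, pieced together from the snippets in this review (the dry-run call to pipelineIdsForResource and the ElasticsearchStatusException shown later); the exact signature, helper calls, and message text in the PR may differ.

    private static boolean endpointIsReferencedInPipelines(
        ClusterState state,
        String inferenceEndpointId,
        ActionListener<AcknowledgedResponse> listener
    ) {
        // Reuse the same extractor call as the dry-run branch to find referencing pipelines
        Set<String> referencingPipelines = InferenceProcessorInfoExtractor.pipelineIdsForResource(
            state,
            Set.of(inferenceEndpointId)
        );
        if (referencingPipelines.isEmpty()) {
            return false;
        }
        // Illustrative error text only; the PR's actual message starts with "Model " (see the snippet below)
        listener.onFailure(
            new ElasticsearchStatusException(
                "Model [" + inferenceEndpointId + "] is referenced by ingest pipelines " + referencingPipelines
                    + " and cannot be deleted. Use force to delete it anyway, or dry_run to list the referencing pipelines.",
                RestStatus.CONFLICT
            )
        );
        return true;
    }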
 * this class was duplicated from org.elasticsearch.xpack.ml.utils.InferenceProcessorInfoExtractor
 */

public final class InferenceProcessorInfoExtractor {
I couldn't figure out how to import this function from the ML module without causing jar hell, so I just duplicated it here.
I tried moving org.elasticsearch.xpack.ml.utils.InferenceProcessorInfoExtractor to package org.elasticsearch.xpack.core.ml.utils, and it seems like I can reference it from both the inference plugin and ml. I didn't replace all the ml plugin's references though. I did have to hardcode the TYPE like you mentioned below (maybe that could be moved too, I didn't look too hard). Does moving it there cause issues for you?
I did try that initially and it caused jar hell, so I am hesitant to try that again. It seems like it should work, but for some reason it didn't work when I tried it.
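For context on what the duplicated utility does, here is a rough sketch of a pipelineIdsForResource-style scan over the cluster's ingest metadata, assuming the standard pipeline definition layout where a processors list holds entries keyed by processor type and inference entries carry a model_id. The real class handles more cases (for example, processors nested in foreach), and the accessor names here are assumptions from memory rather than the PR's exact code.

    @SuppressWarnings("unchecked")
    public static Set<String> pipelineIdsForResource(ClusterState state, Set<String> ids) {
        Set<String> pipelineIds = new HashSet<>();
        IngestMetadata ingestMetadata = state.metadata().custom(IngestMetadata.TYPE);
        if (ingestMetadata == null) {
            return pipelineIds;
        }
        for (Map.Entry<String, PipelineConfiguration> entry : ingestMetadata.getPipelines().entrySet()) {
            Map<String, Object> config = entry.getValue().getConfigAsMap();
            List<Map<String, Object>> processors = (List<Map<String, Object>>) config.getOrDefault("processors", List.of());
            for (Map<String, Object> processor : processors) {
                // Each processor definition is a map keyed by processor type, e.g. "inference"
                Map<String, Object> inferenceConfig = (Map<String, Object>) processor.get("inference");
                if (inferenceConfig != null && ids.contains(inferenceConfig.get("model_id"))) {
                    pipelineIds.add(entry.getKey());
                }
            }
        }
        return pipelineIds;
    }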
# Conflicts:
#   server/src/main/java/org/elasticsearch/TransportVersions.java
#   x-pack/plugin/inference/src/main/java/org/elasticsearch/xpack/inference/InferencePlugin.java
Pinging @elastic/ml-core (Team:ML)
Left some smaller comments, cool stuff! 👏
if (modelIdsReferencedByPipelines.contains(inferenceEndpointId)) {
    listener.onFailure(
        new ElasticsearchStatusException(
            "Model "
Should this say "Inference endpoint" instead of "Model"?
I think I'll do a follow-up PR that more thoroughly replaces "model" throughout the module.
This PR adds the requirement that an inference endpoint not be referenced by any inference ingest processors in an ingest pipeline before it can be deleted. It also adds two new options to the Delete Inference Endpoint API: dry_run and force. dry_run: true returns a list of the ingest pipelines which reference this inference endpoint, without deleting anything. force: true deletes the inference endpoint regardless of whether or not it is referenced by ingest processors.
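For illustration, a hedged usage sketch with the Elasticsearch low-level Java REST client; the task type (sparse_embedding) and endpoint id (my-elser-endpoint) are placeholders, and the query parameters are the dry_run and force options described above.

    import org.apache.http.HttpHost;
    import org.elasticsearch.client.Request;
    import org.elasticsearch.client.Response;
    import org.elasticsearch.client.RestClient;

    public class DeleteInferenceEndpointExample {
        public static void main(String[] args) throws Exception {
            try (RestClient client = RestClient.builder(new HttpHost("localhost", 9200)).build()) {
                // dry_run=true: nothing is deleted; the response lists the referencing ingest pipelines
                Request dryRun = new Request("DELETE", "/_inference/sparse_embedding/my-elser-endpoint");
                dryRun.addParameter("dry_run", "true");
                Response dryRunResponse = client.performRequest(dryRun);
                System.out.println(dryRunResponse.getStatusLine());

                // force=true: delete even if ingest pipelines still reference the endpoint
                Request force = new Request("DELETE", "/_inference/sparse_embedding/my-elser-endpoint");
                force.addParameter("force", "true");
                client.performRequest(force);
            }
        }
    }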