[ML] Close inference services #104726
Conversation
@@ -20,7 +20,7 @@
import java.util.function.Function;
import java.util.stream.Collectors;

public class InferenceServiceRegistry extends AbstractLifecycleComponent {
I don't believe we need this to be an AbstractLifecycleComponent anymore since it's not part of the NodeConstruction class, right?
What is the purpose of
Yeah, and we have some background threads that run, and it shuts those down as well.
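For context, a minimal, illustrative sketch of the pattern being discussed (not the PR's actual code; the class and field names here are hypothetical): a service that owns background threads and HTTP resources, and releases both when it is closed.

```java
import java.io.Closeable;
import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ExampleInferenceService implements Closeable {

    // Background worker owned by this example service, e.g. for batching or retries.
    private final ExecutorService backgroundWorkers = Executors.newSingleThreadExecutor();
    // Stand-in for an HTTP client manager the service owns.
    private final Closeable httpResources;

    public ExampleInferenceService(Closeable httpResources) {
        this.httpResources = httpResources;
    }

    @Override
    public void close() throws IOException {
        // Stop background threads first so nothing touches the HTTP resources afterwards.
        backgroundWorkers.shutdown();
        try {
            backgroundWorkers.awaitTermination(10, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        // Then release the HTTP resources the service owns.
        httpResources.close();
    }
}
```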
@elasticmachine run elasticsearch-ci/part-1
LGTM
@elasticmachine merge upstream
Pinging @elastic/ml-core (Team:ML)
LGTM
Co-authored-by: Elastic Machine <[email protected]>
This PR cleans up some logic and adds functionality to close the inference services when InferencePlugin::close() is called. This removes the need to keep a reference to the HttpClientManager, since it will be closed by the individual services. It also helps prepare the batching logic to use a separate HTTP pool for search and ingest. A sketch of the resulting shape is shown below.
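As an illustration of that shape only (the names here are hypothetical; the real plugin wires up concrete Elasticsearch services), the plugin's close() can delegate to each registered service, so the plugin itself no longer needs a direct HttpClientManager reference:

```java
import java.io.Closeable;
import java.io.IOException;
import java.util.List;

public class ExamplePlugin implements Closeable {

    // Services registered with the plugin; each owns, and closes, its own HTTP resources.
    private final List<Closeable> inferenceServices;

    public ExamplePlugin(List<Closeable> inferenceServices) {
        this.inferenceServices = inferenceServices;
    }

    @Override
    public void close() throws IOException {
        IOException firstFailure = null;
        // Close every service even if one of them fails, then rethrow the first failure.
        for (Closeable service : inferenceServices) {
            try {
                service.close();
            } catch (IOException e) {
                if (firstFailure == null) {
                    firstFailure = e;
                } else {
                    firstFailure.addSuppressed(e);
                }
            }
        }
        if (firstFailure != null) {
            throw firstFailure;
        }
    }
}
```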