[ML] Support force stop deployment #118563

Merged 6 commits on Nov 25, 2021
Changes from 2 commits
@@ -151,10 +151,13 @@ export function trainedModelsApiProvider(httpService: HttpService) {
       });
     },

-    stopModelAllocation(modelId: string) {
+    stopModelAllocation(modelId: string, options: { force: boolean } = { force: false }) {
+      const force = options?.force;
+
       return httpService.http<{ acknowledge: boolean }>({
         path: `${apiBasePath}/trained_models/${modelId}/deployment/_stop`,
         method: 'POST',
+        query: { force },
       });
     },
   };
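
For context, a minimal caller-side sketch of the updated method, not part of this diff: `service` is assumed to be the object returned by `trainedModelsApiProvider(httpService)`, and the wrapper function is hypothetical.

```ts
type TrainedModelsApiService = ReturnType<typeof trainedModelsApiProvider>;

// A plain stop relies on the { force: false } default; passing { force: true }
// adds ?force=true to the POST against .../deployment/_stop.
async function stopDeployment(service: TrainedModelsApiService, modelId: string, force = false) {
  return service.stopModelAllocation(modelId, { force });
}
```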
force_stop_dialog.tsx (new file)

@@ -0,0 +1,89 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

import React, { FC } from 'react';
import { EuiConfirmModal } from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';
import type { OverlayStart } from 'kibana/public';
import type { ModelItem } from './models_list';
import { toMountPoint } from '../../../../../../../src/plugins/kibana_react/public';

interface ForceStopModelConfirmDialogProps {
model: ModelItem;
onCancel: () => void;
onConfirm: () => void;
}

export const ForceStopModelConfirmDialog: FC<ForceStopModelConfirmDialogProps> = ({
model,
onConfirm,
onCancel,
}) => {
return (
<EuiConfirmModal
title={
<FormattedMessage
id="xpack.ml.trainedModels.modelsList.forceStopDialog.title"
defaultMessage="Force stop model {modelId}?"
Contributor: I'd suggest removing the term Force from this dialog, as it seems like an API level detail that doesn't need to be shown in the UI.

Contributor Author: Changed in 9ed4433

values={{ modelId: model.model_id }}
/>
}
onCancel={onCancel}
onConfirm={onConfirm}
cancelButtonText={
<FormattedMessage
id="xpack.ml.trainedModels.modelsList.forceStopDialog.cancelText"
defaultMessage="Cancel"
/>
}
confirmButtonText={
<FormattedMessage
id="xpack.ml.trainedModels.modelsList.forceStopDialog.confirmText"
defaultMessage="Stop"
/>
}
buttonColor="danger"
>
<FormattedMessage
id="xpack.ml.trainedModels.modelsList.forceStopDialog.pipelinesWarning"
defaultMessage="Selected model has associated pipelines: "
Contributor: Maybe reword to something like "The selected model is being used by the following pipelines:". @lcawl any thoughts on the text in this dialog?

Contributor (@lcawl, Nov 24, 2021): IMO the impact is unclear. Will the ingest pipelines fail until you start a new deployment or will the inference processor just be skipped? Perhaps something like this:

> You can't use these ingest pipelines until you restart the model:

Contributor Author: Changed in 9ed4433

Contributor: Any index request hitting a pipeline that refers to a model that doesn't have a started deployment will fail. The bulk response will contain an item for each one of them explaining the model wasn't deployed.

Contributor: Thanks! I've added similar information in the docs and API spec via elastic/elasticsearch#81026 and elastic/elasticsearch-specification#1061

/>
<ul>
{Object.keys(model.pipelines!)
.sort()
.map((pipelineName) => {
return <li key={pipelineName}>{pipelineName}</li>;
})}
</ul>
</EuiConfirmModal>
);
};

export const getUserConfirmationProvider =
(overlays: OverlayStart) => async (forceStopModel: ModelItem) => {
return new Promise(async (resolve, reject) => {
try {
const modalSession = overlays.openModal(
toMountPoint(
<ForceStopModelConfirmDialog
model={forceStopModel}
onCancel={() => {
modalSession.close();
resolve(false);
}}
onConfirm={() => {
modalSession.close();
resolve(true);
}}
/>
)
);
} catch (e) {
resolve(false);
}
});
};
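
Regarding the review note above that index requests hitting a pipeline without a started model deployment fail with per-item errors: a rough sketch of how a caller could spot those failures in a bulk response, assuming a v7.x-style `@elastic/elasticsearch` client and hypothetical index/pipeline names (not part of this PR).

```ts
import { Client } from '@elastic/elasticsearch';

const client = new Client({ node: 'http://localhost:9200' }); // assumed local cluster

async function indexThroughPipeline() {
  // Index a document through an ingest pipeline that references the stopped model.
  const { body } = await client.bulk({
    body: [
      { index: { _index: 'my-index', pipeline: 'my-inference-pipeline' } },
      { text_field: 'some text to run inference on' },
    ],
  });

  // Each affected item carries a per-item error explaining the model is not deployed.
  for (const item of body.items) {
    const op = item.index ?? item.create;
    if (op?.error) {
      console.log(op.status, op.error.reason);
    }
  }
}
```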
models_list.tsx

@@ -54,6 +54,7 @@ import { useFieldFormatter } from '../../contexts/kibana/use_field_formatter';
import { FIELD_FORMAT_IDS } from '../../../../../../../src/plugins/field_formats/common';
import { useRefresh } from '../../routing/use_refresh';
import { DEPLOYMENT_STATE } from '../../../../common/constants/trained_models';
import { getUserConfirmationProvider } from './force_stop_dialog';

type Stats = Omit<TrainedModelStat, 'model_id'>;

@@ -81,6 +82,7 @@ export const ModelsList: FC = () => {
const {
services: {
application: { navigateToUrl, capabilities },
overlays,
},
} = useMlKibana();
const urlLocator = useMlLocator()!;
@@ -111,6 +113,8 @@ export const ModelsList: FC = () => {
{}
);

const getUserConfirmation = useMemo(() => getUserConfirmationProvider(overlays), []);

const navigateToPath = useNavigateToPath();

const isBuiltInModel = useCallback(
@@ -454,13 +458,21 @@ export const ModelsList: FC = () => {
       available: (item) => item.model_type === 'pytorch',
       enabled: (item) =>
         !isLoading &&
-        !isPopulatedObject(item.pipelines) &&
         isPopulatedObject(item.stats?.deployment_stats) &&
         item.stats?.deployment_stats?.state !== DEPLOYMENT_STATE.STOPPING,
       onClick: async (item) => {
+        const requireForceStop = isPopulatedObject(item.pipelines);
+
+        if (requireForceStop) {
+          const hasUserApproved = await getUserConfirmation(item);
+          if (!hasUserApproved) return;
+        }
+
         try {
           setIsLoading(true);
-          await trainedModelsApiService.stopModelAllocation(item.model_id);
+          await trainedModelsApiService.stopModelAllocation(item.model_id, {
+            force: requireForceStop,
+          });
           displaySuccessToast(
             i18n.translate('xpack.ml.trainedModels.modelsList.stopSuccess', {
               defaultMessage: 'Deployment for "{modelId}" has been stopped successfully.',
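
A note on the gating above: `requireForceStop` hinges on `isPopulatedObject(item.pipelines)`, i.e. the model still has attached pipelines. The helper itself isn't shown in this diff; a sketch of the behavior it is assumed to have:

```ts
// Assumed behavior of the ML plugin's isPopulatedObject helper (actual source not shown here):
// true only for a non-null object with at least one own key.
function isPopulatedObject(value: unknown): value is Record<string, unknown> {
  return typeof value === 'object' && value !== null && Object.keys(value).length > 0;
}
```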
3 changes: 3 additions & 0 deletions x-pack/plugins/ml/server/routes/trained_models.ts
@@ -16,6 +16,7 @@ import { modelsProvider } from '../models/data_frame_analytics';
import { TrainedModelConfigResponse } from '../../common/types/trained_models';
import { memoryOverviewServiceProvider } from '../models/memory_overview';
import { mlLog } from '../lib/log';
import { forceQuerySchema } from './schemas/anomaly_detectors_schema';

export function trainedModelsRoutes({ router, routeGuard }: RouteInitialization) {
/**
@@ -262,6 +263,7 @@ export function trainedModelsRoutes({ router, routeGuard }: RouteInitialization)
path: '/api/ml/trained_models/{modelId}/deployment/_stop',
validate: {
params: modelIdSchema,
query: forceQuerySchema,
},
options: {
tags: ['access:ml:canGetDataFrameAnalytics'],
@@ -272,6 +274,7 @@ export function trainedModelsRoutes({ router, routeGuard }: RouteInitialization)
const { modelId } = request.params;
const { body } = await mlClient.stopTrainedModelDeployment({
model_id: modelId,
force: request.query.force ?? false,
});
return response.ok({
body,
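
On the server side, the route reuses `forceQuerySchema` from the anomaly detectors schemas; its definition is not part of this diff. For illustration only, a query schema consistent with the handler's `request.query.force ?? false` fallback could look like the following `@kbn/config-schema` sketch (a hypothetical stand-in, not the actual definition):

```ts
import { schema } from '@kbn/config-schema';

// Hypothetical example; the real forceQuerySchema lives in ./schemas/anomaly_detectors_schema.
export const exampleForceQuerySchema = schema.object({
  force: schema.maybe(schema.boolean()),
});
```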