Squashed commit of the following:
commit 7dbca1a
Author: Ning <[email protected]>
Date:   Fri Aug 16 12:06:37 2019 -0700

    update gcloud ml-engine to ai-platform (kubeflow#1863)

commit e849f22
Author: Ryan Dawson <[email protected]>
Date:   Fri Aug 16 19:21:23 2019 +0100

    Seldon examples (kubeflow#1405)

commit a6f3f40
Author: sina chavoshi <[email protected]>
Date:   Fri Aug 16 09:56:09 2019 -0700

    Adding a sample for serving component (kubeflow#1830)

    * Adding a sample for serving component

    * removed typo / updated based on PR feedback

    * fixing the jupyter rendering issue

    * adding pip3 for tensorflow

    * Fixed spelling error in VERSION

    * fix indentation based on review feedback

commit 54ff3e6
Author: Alexey Volkov <[email protected]>
Date:   Fri Aug 16 01:22:31 2019 -0700

    SDK - Cleanup - Serialized PipelineParamTuple does not need value or type (kubeflow#1469)

    * SDK - Refactoring - Serialized PipelineParam does not need type
    Only the types in non-serialized PipelineParams are ever used.

    * SDK - Refactoring - Serialized PipelineParam does not need value
    Default values are only relevant when a PipelineParam is used in the pipeline function signature, and even in this case the compiler captures them explicitly from the PipelineParam objects in the signature.
    There are no other uses for them.

commit d2e94e4
Author: Riley Bauer <[email protected]>
Date:   Thu Aug 15 19:52:34 2019 -0700

    Fix run duration bug (kubeflow#1827)

    * Allows durations >=24h and renames 'showLink' in RunList

    * Update, fix tests

commit d8eaeaa
Author: Alexey Volkov <[email protected]>
Date:   Thu Aug 15 17:25:59 2019 -0700

    SDK - Preserving the pipeline input information in the compiled Workflow (kubeflow#1381)

    * SDK - Preserving the pipeline metadata in the compiled Workflow

    * Stabilizing the DSL compiler tests

commit afe8a69
Author: Kirin Patel <[email protected]>
Date:   Thu Aug 15 17:00:28 2019 -0700

    Reduce API usage by utilizing reference name in reference resource API (kubeflow#1824)

    * Regenerated run api for frontend

    * Added support for reference name to resource reference API in frontend

    * Revert "Regenerated run api for frontend"

    * Addressed PR comments

    * Removed extra if statement by setting default value of parameter

    * Removed the whole comment

    * Addressed PR feedback

    * Addressed PR feedback

    * Simplified logic after offline discussion

commit ea67c99
Author: Kirin Patel <[email protected]>
Date:   Thu Aug 15 16:18:35 2019 -0700

    Change how Variables are Provided to Visualizations (kubeflow#1754)

    * Changed way visualization variables are passed from request to NotebookNode

    Visualization variables are now saved to a json file and loaded by a NotebookNode upon execution.

    * Updated roc_curve visualization to reflect changes made to dependency injection

    * Fixed bug where checking if is_generated is provided to the roc_curve visualization would crash the visualization

    Also changed ' -> "

    * Changed text_exporter to always sort variables by key for testing

    * Addressed PR suggestions

commit d238bef
Author: Jiaxiao Zheng <[email protected]>
Date:   Thu Aug 15 12:55:29 2019 -0700

    Add back coveralls. (kubeflow#1849)

    * Remove redundant import.

    * Simplify sample_test.yaml by using withItem syntax.

    * Simplify sample_test.yaml by using withItem syntax.

    * Change dict to str in withItems.

    * Add back coveralls.

commit 0517114
Author: Riley Bauer <[email protected]>
Date:   Thu Aug 15 12:28:35 2019 -0700

    Reduce getPipeline calls in RunList (kubeflow#1852)

    * Skips calling getPipeline in RunList if the pipeline name is in the pipeline_spec

    * Update fixed data to include pipeline names in pipeline specs

    * Remove redundant getRuns call

commit 39e5840
Author: IronPan <[email protected]>
Date:   Thu Aug 15 11:04:34 2019 -0700

    Add retry button in Pipeline UI (kubeflow#1782)

    * add retry button

    * add retry button

    * add retry button

    * address comments

    * fix tests

    * fix tests

    * update image

    * Update StatusUtils.test.tsx

    * Update RunDetails.test.tsx

    * Update Buttons.ts

    * update test

    * update frontend

    * update

    * update

    * address comments

    * update test
numerology committed Aug 25, 2019
1 parent 96fd193 commit ce9b3e9
Showing 1 changed file with 297 additions and 0 deletions.
samples/core/model_serving_component/model_serving_component.ipynb
@@ -0,0 +1,297 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"# Copyright 2019 Google Inc. All Rights Reserved.\n",
"#\n",
"# Licensed under the Apache License, Version 2.0 (the \"License\");\n",
"# you may not use this file except in compliance with the License.\n",
"# You may obtain a copy of the License at\n",
"#\n",
"# http://www.apache.org/licenses/LICENSE-2.0\n",
"#\n",
"# Unless required by applicable law or agreed to in writing, software\n",
"# distributed under the License is distributed on an \"AS IS\" BASIS,\n",
"# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
"# See the License for the specific language governing permissions and\n",
"# limitations under the License."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Install Pipeline SDK - This only needs to be ran once in the enviroment. \n",
"!pip3 install kfp --upgrade\n",
"!pip3 install tensorflow==1.14"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## KubeFlow Pipelines Serving Component\n",
"In this notebook, we will demo:\n",
"\n",
"* Saving a Keras model in a format compatible with TF Serving\n",
"* Creating a pipeline to serve a trained model within a KubeFlow cluster\n",
"\n",
"Reference documentation:\n",
"* https://www.tensorflow.org/tfx/serving/architecture\n",
"* https://www.tensorflow.org/beta/guide/keras/saving_and_serializing\n",
"* https://www.kubeflow.org/docs/components/serving/tfserving_new/"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Setup\n"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"tags": [
"parameters"
]
},
"outputs": [],
"source": [
"# Set your output and project. !!!Must Do before you can proceed!!!\n",
"PROJECT_NAME = 'Your-Gcp-Project-ID' #'Your-GCP-Project-ID'\n",
"MODEL_NAME = 'Model-Name' # Model name matching TF_serve naming requirements \n",
"EXPERIMENT_NAME = 'serving_component'\n",
"MODEL_VERSION = '1' # A number representing the version model \n",
"OUTPUT_BUCKET = 'gs://%s-serving-component' % PROJECT_NAME # A GCS bucket for asset outputs\n",
"KUBEFLOW_DEPLOYER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:fe639f41661d8e17fcda64ff8242127620b80ba0'\n",
"MODEL_PATH = '%s/%s' % (OUTPUT_BUCKET,MODEL_NAME) \n",
"MODEL_VERSION_PATH = '%s/%s/%s' % (OUTPUT_BUCKET,MODEL_NAME,MODEL_VERSION)"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Creating gs://chavoshi-dev-2-serving-component/...\n"]
}
],
"source": [
"!gsutil mb {OUTPUT_BUCKET}"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"#Get or create an experiment and submit a pipeline run\n",
"import kfp\n",
"client = kfp.Client()\n",
"\n",
"try:\n",
" experiment = client.get_experiment(experiment_name=EXPERIMENT_NAME)\n",
"except:\n",
" experiment = client.create_experiment(EXPERIMENT_NAME)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Load a Keras Model \n",
"Loading a pretrained Keras model to use as an example. "
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"import tensorflow as tf"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"WARNING: Logging before flag parsing goes to stderr.\n",
"W0813 22:25:44.771417 139829630277440 deprecation.py:506] From /opt/conda/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:1251: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version.\n",
"Instructions for updating:\n",
"Call initializer instance with the dtype argument instead of passing it to the constructor\n"
]
}
],
"source": [
"model = tf.keras.applications.NASNetMobile(input_shape=None,\n",
" include_top=True,\n",
" weights='imagenet',\n",
" input_tensor=None,\n",
" pooling=None,\n",
" classes=1000)"
]
},
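{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, confirm that the model loaded as expected by printing its input and output shapes. This quick check is not required for the rest of the notebook."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Quick sanity check: NASNetMobile with include_top=True should report\n",
"# a (None, 224, 224, 3) input and a (None, 1000) output.\n",
"print(model.input_shape, model.output_shape)"
]
},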
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Saved the Model for TF-Serve\n",
"Save the model using keras export_saved_model function. Note that specifically for TF-Serve the output directory should be structure as model_name/model_version/saved_model."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"W0813 22:26:30.974738 139829630277440 deprecation.py:506] From /opt/conda/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:97: calling Zeros.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version.\n",
"Instructions for updating:\n",
"Call initializer instance with the dtype argument instead of passing it to the constructor\n",
"W0813 22:26:30.992686 139829630277440 deprecation.py:506] From /opt/conda/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:97: calling Ones.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version.\n",
"Instructions for updating:\n",
"Call initializer instance with the dtype argument instead of passing it to the constructor\n",
"W0813 22:26:31.067510 139829630277440 deprecation.py:506] From /opt/conda/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:97: calling GlorotUniform.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version.\n",
"Instructions for updating:\n",
"Call initializer instance with the dtype argument instead of passing it to the constructor\n",
"W0813 22:27:34.353115 139829630277440 deprecation.py:323] From /opt/conda/lib/python3.6/site-packages/tensorflow/python/saved_model/signature_def_utils_impl.py:201: build_tensor_info (from tensorflow.python.saved_model.utils_impl) is deprecated and will be removed in a future version.\n",
"Instructions for updating:\n",
"This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.build_tensor_info or tf.compat.v1.saved_model.build_tensor_info.\n"
]
}
],
"source": [
"tf.keras.experimental.export_saved_model(model,MODEL_VERSION_PATH)"
]
},
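{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, list the bucket to confirm the model_name/model_version layout described above (this assumes the gsutil CLI is available in this environment)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional sanity check: the exported SavedModel should appear under MODEL_NAME/MODEL_VERSION/.\n",
"!gsutil ls -r {MODEL_PATH}"
]
},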
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Create a pipeline using KFP TF-Serve component\n"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
"def kubeflow_deploy_op():\n",
" return dsl.ContainerOp(\n",
" name = 'deploy',\n",
" image = KUBEFLOW_DEPLOYER_IMAGE,\n",
" arguments = [\n",
" '--model-export-path', MODEL_PATH,\n",
" '--server-name', MODEL_NAME,\n",
" ]\n",
" )"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
"import kfp\n",
"import kfp.dsl as dsl\n",
"from kfp.gcp import use_gcp_secret\n",
"\n",
"# The pipeline definition\n",
"@dsl.pipeline(\n",
" name='Sample model deployer',\n",
" description='Sample for deploying models using KFP model serving component'\n",
")\n",
"def model_server():\n",
" deploy = kubeflow_deploy_op().apply(use_gcp_secret('user-gcp-sa'))"
]
},
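{
"cell_type": "markdown",
"metadata": {},
"source": [
"The cell below is an optional sketch, not required for the rest of the notebook, showing how the model export path could instead be exposed as a pipeline parameter so the same compiled pipeline can point at a different model path at run time."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional sketch: a variant of the pipeline that takes the model export path\n",
"# as a pipeline parameter instead of baking it into the op.\n",
"@dsl.pipeline(\n",
"    name='Sample model deployer (parameterized)',\n",
"    description='Variant that takes the model export path as a pipeline parameter'\n",
")\n",
"def model_server_parameterized(model_export_path=MODEL_PATH):\n",
"    deploy = dsl.ContainerOp(\n",
"        name='deploy',\n",
"        image=KUBEFLOW_DEPLOYER_IMAGE,\n",
"        arguments=[\n",
"            '--model-export-path', model_export_path,\n",
"            '--server-name', MODEL_NAME,\n",
"        ]\n",
"    ).apply(use_gcp_secret('user-gcp-sa'))"
]
},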
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [],
"source": [
"pipeline_func = model_server\n",
"pipeline_filename = pipeline_func.__name__ + '.pipeline.zip'\n",
"\n",
"import kfp.compiler as compiler\n",
"compiler.Compiler().compile(pipeline_func, pipeline_filename)"
]
},
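{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, inspect the compiled package, which is a zip archive containing the generated workflow definition (this assumes the unzip utility is available in this environment)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional: list the contents of the compiled pipeline package produced above.\n",
"!unzip -l {pipeline_filename}"
]
},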
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"Run link <a href=\"/pipeline/#/runs/details/8fb3c380-be19-11e9-b782-42010a8000a3\" target=\"_blank\" >here</a>"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"#Specify pipeline argument values\n",
"arguments = {}\n",
"\n",
"#Submit a pipeline run\n",
"run_name = pipeline_func.__name__ + ' run'\n",
"run_result = client.run_pipeline(experiment.id, run_name, pipeline_filename, arguments)\n",
"\n",
"#This link leads to the run information page. \n",
"#Note: There is a bug in JupyterLab that modifies the URL and makes the link stop working"
]
},
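{
"cell_type": "markdown",
"metadata": {},
"source": [
"As an alternative to following the link above, the run can also be polled programmatically. This is an optional sketch and assumes the installed kfp SDK exposes Client.wait_for_run_completion."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional: block until the run finishes (or the timeout, in seconds, expires),\n",
"# then print its final status. Depending on the kfp version, the run id is exposed\n",
"# as run_result.run_id or run_result.id, so handle both here.\n",
"run_id = getattr(run_result, 'run_id', None) or run_result.id\n",
"completed = client.wait_for_run_completion(run_id, timeout=600)\n",
"print(completed.run.status)"
]
}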
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.7"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
