Squashed commit of the following:
commit 41d39c1
Author: Kirin Patel <[email protected]>
Date:   Wed Aug 21 19:38:31 2019 -0700

    Add run with json data as input within fixed-data.ts for UI testing and development (kubeflow#1895)

    * Added run with json data as input

    * Changed run and uid to not be duplicates of hello-world-runtime

commit 851e7c8
Author: Kirin Patel <[email protected]>
Date:   Wed Aug 21 19:04:31 2019 -0700

    Replace codemirror editor react component with react-ace editor component (kubeflow#1890)

    * Replaced CodeMirror with Editor in PipelineDetails.tsx

    * Replaced codemirror in DetailsTable with react-ace

    * Removed codemirror

    * Updated unit tests for Editor.tsx to test placeholder and value in simplified manner

    * Updated DetailsTable.test.tsx.snap to reflect changes made in DetailsTable.tsx

    * Updated PipelineDetails test snapshot

    * Changed width of Editor in DetailsTable to be 100% instead of 300px

    * Revert "Updated unit tests for Editor.tsx to test placeholder and value in simplified manner"

    This reverts commit 40103f2.

commit 8c3d6fe
Author: Kirin Patel <[email protected]>
Date:   Wed Aug 21 18:30:33 2019 -0700

    Add visualization-server service to lightweight deployment (kubeflow#1844)

    * Add visualization-server service to lightweight deployment

    * Addressed PR suggestions

    * Added field to determine if visualization service is active and fixed unit tests for visualization_server.go

    * Additional small fixes

    * port change from 88888 -> 8888
    * version change from 0.1.15 -> 0.1.26
    * removed visualization-server from base/kustomization.yaml

    * Fixed visualization_server_test.go to reflect new changes

    * Changed implementation to be fail fast

    * Changed host name to be constant provided by environment

    * Added retry and extracted isVisualizationServiceAlive logic to function

    * Fixed deployment.yaml file

    * Fixed serviceURL configuration issue

    serviceURL is now properly obtained from the environment; the service IP address and port are used rather than the service name and namespace

    * Added log message to indicate when visualization service is unreachable

    * Addressed PR comments

    * Removed _HTTP

commit ad307db
Author: Eterna2 <[email protected]>
Date:   Thu Aug 22 08:52:32 2019 +0800

    [Bug Fix] Delete ResourceOp should not have output parameters (kubeflow#1822)

    * Fix bug where a delete ResourceOp incorrectly received success_condition, failure_condition, and output parameters (see the sketch below)

    * remove unnecessary whitespace

    * compiler test for delete resource ops should retrieve templates from spec instead of root
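
    For reference, a hedged sketch of the affected case (the pipeline name and resource manifest are illustrative, not taken from the PR): a `dsl.ResourceOp` with `action='delete'`, whose compiled template should no longer carry success/failure conditions or output parameters after this fix.

    ```python
    import kfp.dsl as dsl

    @dsl.pipeline(name='delete-resource-example')
    def delete_pipeline():
        # A delete-action ResourceOp: after kubeflow#1822 the compiled Argo template
        # for this op should contain no successCondition/failureCondition and no
        # output parameters.
        dsl.ResourceOp(
            name='delete-my-configmap',
            k8s_resource={
                'apiVersion': 'v1',
                'kind': 'ConfigMap',
                'metadata': {'name': 'my-configmap'},
            },
            action='delete',
        )
    ```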

commit 593f25a
Author: Alexey Volkov <[email protected]>
Date:   Wed Aug 21 17:16:33 2019 -0700

    Collecting coverage when running python tests (kubeflow#898)

    * Collecting coverage when running Python tests

    * Added coveralls to python unit tests

    * Try removing the PATH modification

    * Specifying coverage run --source

    * Using the installed package

    * Try getting the correct coverage paths

commit 553885f
Author: Alexey Volkov <[email protected]>
Date:   Wed Aug 21 16:38:12 2019 -0700

    SDK - Components - Fixed ModelBase comparison bug (kubeflow#1874)

commit 2622c67
Author: IronPan <[email protected]>
Date:   Wed Aug 21 16:37:53 2019 -0700

    cleanup test dir (kubeflow#1914)

commit 203307d
Author: Alexey Volkov <[email protected]>
Date:   Wed Aug 21 16:37:21 2019 -0700

    SDK - Lightweight - Fixed custom types in multi-output case (kubeflow#1875)

    The type was mistakenly serialized as `_ForwardRef('CustomType')`.
    The input parameter types and single-output types were not affected.
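
    A hedged sketch of the multi-output pattern in question (function and field names are illustrative): a lightweight component whose NamedTuple return annotation names a custom type for one of its outputs. Such a function would typically be converted with `kfp.components.func_to_container_op`, which is where the output types end up in the generated component spec.

    ```python
    from typing import NamedTuple

    # Illustrative only: the 'payload' output uses the custom type name 'CustomType'.
    # Before this fix, that output's type could be serialized into the component spec
    # as _ForwardRef('CustomType') instead of 'CustomType'.
    def produce_values() -> NamedTuple('Outputs', [('count', int), ('payload', 'CustomType')]):
        from collections import namedtuple
        outputs = namedtuple('Outputs', ['count', 'payload'])
        return outputs(42, '{"key": "value"}')
    ```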

commit 2e7f2d4
Author: IronPan <[email protected]>
Date:   Wed Aug 21 16:36:35 2019 -0700

    Add cloud sql and gcs connection for pipeline-lite deployment (kubeflow#1910)

    * restructure

    * working example

    * working example

    * move mysql

    * moving minio and mysql out

    * add gcp

    * add files

    * fix test

commit 9adf163
Author: Alexey Volkov <[email protected]>
Date:   Wed Aug 21 16:29:54 2019 -0700

    SDK - Airflow - Fixed bug in airflow op creation (kubeflow#1911)

    This PR fixes a bug in Airflow op creation.
    The `_run_airflow_op` helper function was not captured along with the `_run_airflow_op_closure` function because they belong to different modules (`_run_airflow_op_closure` was module-less).
    The bug was not discovered during notebook testing because, in that environment, `_run_airflow_op` was also module-less, having been defined in a notebook rather than in a .py file.
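
    A minimal sketch (not the kfp capture logic itself) of the module distinction described above: code capture typically serializes module-less functions (those reporting `__module__ == '__main__'`) by value and assumes anything defined in an importable module can simply be imported at run time.

    ```python
    import json

    def defined_at_top_level():
        # Defined in a notebook or a script run directly, this function reports
        # __module__ == '__main__' and is a candidate for capture by value.
        return 'module-less'

    print(defined_at_top_level.__module__)  # '__main__' in a notebook or script
    print(json.dumps.__module__)            # 'json' -> referenced, assumed importable
    ```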

commit 7ec5697
Author: hongye-sun <[email protected]>
Date:   Wed Aug 21 16:06:31 2019 -0700

    Release 151c534 (kubeflow#1916)

    * Updated component images to version 151c534

    * Updated components to version a97f1d0

    * Update setup.py

    * Update setup.py

commit 7e062ce
Author: IronPan <[email protected]>
Date:   Wed Aug 21 15:14:31 2019 -0700

    Update README.md

commit 8e1e823
Author: Christian Clauss <[email protected]>
Date:   Thu Aug 22 00:04:31 2019 +0200

    Lint Python code for undefined names (kubeflow#1721)

    * Lint Python code for undefined names

    * Lint Python code for undefined names

    * Exclude tfdv.py to work around an overzealous pytest

    * Fixup for tfdv.py

    * Fixup for tfdv.py

    * Fixup for tfdv.py
numerology committed Aug 22, 2019
1 parent f5a857d commit bed1135
Showing 125 changed files with 919 additions and 584 deletions.
12 changes: 7 additions & 5 deletions .travis.yml
@@ -66,13 +66,15 @@ matrix:
env: TOXENV=py35
script: &1
# Additional dependencies
- pip3 install jsonschema==3.0.1
- pip3 install coverage coveralls jsonschema==3.0.1
# DSL tests
- cd $TRAVIS_BUILD_DIR/sdk/python
- python3 setup.py install
- python3 tests/dsl/main.py
- python3 tests/compiler/main.py
- $TRAVIS_BUILD_DIR/sdk/python/tests/run_tests.sh
- python3 setup.py develop
- cd $TRAVIS_BUILD_DIR # Changing the current directory to the repo root for correct coverall paths
- coverage run --source=kfp --append sdk/python/tests/dsl/main.py
- coverage run --source=kfp --append sdk/python/tests/compiler/main.py
- coverage run --source=kfp --append -m unittest discover --verbose --start-dir sdk/python/tests --top-level-directory=sdk/python
- coveralls

# Visualization test
- cd $TRAVIS_BUILD_DIR/backend/src/apiserver/visualization
3 changes: 3 additions & 0 deletions backend/src/apiserver/client_manager.go
@@ -46,6 +46,9 @@ const (

podNamespace = "POD_NAMESPACE"
initConnectionTimeout = "InitConnectionTimeout"

visualizationServiceHost = "ML_PIPELINE_VISUALIZATIONSERVER_SERVICE_HOST"
visualizationServicePort = "ML_PIPELINE_VISUALIZATIONSERVER_SERVICE_PORT"
)

// Container for all service clients
9 changes: 8 additions & 1 deletion backend/src/apiserver/main.go
@@ -79,7 +79,14 @@ func startRpcServer(resourceManager *resource.ResourceManager) {
api.RegisterRunServiceServer(s, server.NewRunServer(resourceManager))
api.RegisterJobServiceServer(s, server.NewJobServer(resourceManager))
api.RegisterReportServiceServer(s, server.NewReportServer(resourceManager))
api.RegisterVisualizationServiceServer(s, server.NewVisualizationServer(resourceManager))
api.RegisterVisualizationServiceServer(
s,
server.NewVisualizationServer(
resourceManager,
getStringConfig(visualizationServiceHost),
getStringConfig(visualizationServicePort),
getDurationConfig(initConnectionTimeout),
))

// Register reflection service on gRPC server.
reflection.Register(s)
1 change: 1 addition & 0 deletions backend/src/apiserver/server/BUILD.bazel
@@ -27,6 +27,7 @@ go_library(
"//backend/src/common/util:go_default_library",
"//backend/src/crd/pkg/apis/scheduledworkflow/v1beta1:go_default_library",
"@com_github_argoproj_argo//pkg/apis/workflow/v1alpha1:go_default_library",
"@com_github_cenkalti_backoff//:go_default_library",
"@com_github_golang_glog//:go_default_library",
"@com_github_golang_protobuf//jsonpb:go_default_library_gen",
"@com_github_robfig_cron//:go_default_library",
45 changes: 41 additions & 4 deletions backend/src/apiserver/server/visualization_server.go
@@ -4,18 +4,22 @@ import (
"context"
"encoding/json"
"fmt"
"github.com/cenkalti/backoff"
"github.com/golang/glog"
"github.com/kubeflow/pipelines/backend/api/go_client"
"github.com/kubeflow/pipelines/backend/src/apiserver/resource"
"github.com/kubeflow/pipelines/backend/src/common/util"
"io/ioutil"
"net/http"
"net/url"
"strings"
"time"
)

type VisualizationServer struct {
resourceManager *resource.ResourceManager
serviceURL string
resourceManager *resource.ResourceManager
serviceURL string
isServiceAvailable bool
}

func (s *VisualizationServer) CreateVisualization(ctx context.Context, request *go_client.CreateVisualizationRequest) (*go_client.Visualization, error) {
@@ -56,6 +60,12 @@ func (s *VisualizationServer) validateCreateVisualizationRequest(request *go_cli
// service to generate HTML visualizations from a request.
// It returns the generated HTML as a string and any error that is encountered.
func (s *VisualizationServer) generateVisualizationFromRequest(request *go_client.CreateVisualizationRequest) ([]byte, error) {
if !s.isServiceAvailable {
return nil, util.NewInternalServerError(
fmt.Errorf("service not available"),
"Service not available",
)
}
visualizationType := strings.ToLower(go_client.Visualization_Type_name[int32(request.Visualization.Type)])
arguments := fmt.Sprintf("--type %s --source %s --arguments '%s'", visualizationType, request.Visualization.Source, request.Visualization.Arguments)
resp, err := http.PostForm(s.serviceURL, url.Values{"arguments": {arguments}})
@@ -73,6 +83,33 @@ func (s *VisualizationServer) generateVisualizationFromRequest(request *go_clien
return body, nil
}

func NewVisualizationServer(resourceManager *resource.ResourceManager) *VisualizationServer {
return &VisualizationServer{resourceManager: resourceManager, serviceURL: "http://visualization-service.kubeflow"}
func isVisualizationServiceAlive(serviceURL string, initConnectionTimeout time.Duration) bool {
var operation = func() error {
_, err := http.Get(serviceURL)
if err != nil {
glog.Error("Unable to verify visualization service is alive!", err)
return err
}
return nil
}
b := backoff.NewExponentialBackOff()
b.MaxElapsedTime = initConnectionTimeout
err := backoff.Retry(operation, b)
return err == nil
}

func NewVisualizationServer(resourceManager *resource.ResourceManager, serviceHost string, servicePort string, initConnectionTimeout time.Duration) *VisualizationServer {
serviceURL := fmt.Sprintf("http://%s:%s", serviceHost, servicePort)
isServiceAvailable := isVisualizationServiceAlive(serviceURL, initConnectionTimeout)
return &VisualizationServer{
resourceManager: resourceManager,
serviceURL: serviceURL,
// TODO: isServiceAvailable is used to determine if the new visualization
// service is alive. If this is true, then the service is alive and
// requests can be made to it. Otherwise, if it is false, the service is
// not alive and requests should not be made. This prevents timeouts and
// counteracts current instabilities with the service. This should be
// removed after the visualization service is deemed stable.
isServiceAvailable: isServiceAvailable,
}
}
60 changes: 53 additions & 7 deletions backend/src/apiserver/server/visualization_server_test.go
@@ -11,7 +11,10 @@ import (
func TestValidateCreateVisualizationRequest(t *testing.T) {
clients, manager, _ := initWithExperiment(t)
defer clients.Close()
server := NewVisualizationServer(manager)
server := &VisualizationServer{
resourceManager: manager,
isServiceAvailable: false,
}
visualization := &go_client.Visualization{
Type: go_client.Visualization_ROC_CURVE,
Source: "gs://ml-pipeline/roc/data.csv",
@@ -27,7 +30,10 @@ func TestValidateCreateVisualizationRequest(t *testing.T) {
func TestValidateCreateVisualizationRequest_ArgumentsAreEmpty(t *testing.T) {
clients, manager, _ := initWithExperiment(t)
defer clients.Close()
server := NewVisualizationServer(manager)
server := &VisualizationServer{
resourceManager: manager,
isServiceAvailable: false,
}
visualization := &go_client.Visualization{
Type: go_client.Visualization_ROC_CURVE,
Source: "gs://ml-pipeline/roc/data.csv",
@@ -43,7 +49,10 @@ func TestValidateCreateVisualizationRequest_ArgumentsAreEmpty(t *testing.T) {
func TestValidateCreateVisualizationRequest_SourceIsEmpty(t *testing.T) {
clients, manager, _ := initWithExperiment(t)
defer clients.Close()
server := NewVisualizationServer(manager)
server := &VisualizationServer{
resourceManager: manager,
isServiceAvailable: false,
}
visualization := &go_client.Visualization{
Type: go_client.Visualization_ROC_CURVE,
Source: "",
@@ -59,7 +68,10 @@ func TestValidateCreateVisualizationRequest_SourceIsEmpty(t *testing.T) {
func TestValidateCreateVisualizationRequest_ArgumentsNotValidJSON(t *testing.T) {
clients, manager, _ := initWithExperiment(t)
defer clients.Close()
server := NewVisualizationServer(manager)
server := &VisualizationServer{
resourceManager: manager,
isServiceAvailable: false,
}
visualization := &go_client.Visualization{
Type: go_client.Visualization_ROC_CURVE,
Source: "gs://ml-pipeline/roc/data.csv",
@@ -80,7 +92,11 @@ func TestGenerateVisualization(t *testing.T) {
rw.Write([]byte("roc_curve"))
}))
defer httpServer.Close()
server := &VisualizationServer{resourceManager: manager, serviceURL: httpServer.URL}
server := &VisualizationServer{
resourceManager: manager,
serviceURL: httpServer.URL,
isServiceAvailable: true,
}
visualization := &go_client.Visualization{
Type: go_client.Visualization_ROC_CURVE,
Source: "gs://ml-pipeline/roc/data.csv",
@@ -90,8 +106,34 @@ func TestGenerateVisualization(t *testing.T) {
Visualization: visualization,
}
body, err := server.generateVisualizationFromRequest(request)
assert.Equal(t, []byte("roc_curve"), body)
assert.Nil(t, err)
assert.Equal(t, []byte("roc_curve"), body)
}

func TestGenerateVisualization_ServiceNotAvailableError(t *testing.T) {
clients, manager, _ := initWithExperiment(t)
defer clients.Close()
httpServer := httptest.NewServer(http.HandlerFunc(func(rw http.ResponseWriter, req *http.Request) {
assert.Equal(t, "/", req.URL.String())
rw.WriteHeader(500)
}))
defer httpServer.Close()
server := &VisualizationServer{
resourceManager: manager,
serviceURL: httpServer.URL,
isServiceAvailable: false,
}
visualization := &go_client.Visualization{
Type: go_client.Visualization_ROC_CURVE,
Source: "gs://ml-pipeline/roc/data.csv",
Arguments: "{}",
}
request := &go_client.CreateVisualizationRequest{
Visualization: visualization,
}
body, err := server.generateVisualizationFromRequest(request)
assert.Nil(t, body)
assert.Equal(t, "InternalServerError: Service not available: service not available", err.Error())
}

func TestGenerateVisualization_ServerError(t *testing.T) {
@@ -102,7 +144,11 @@ func TestGenerateVisualization_ServerError(t *testing.T) {
rw.WriteHeader(500)
}))
defer httpServer.Close()
server := &VisualizationServer{resourceManager: manager, serviceURL: httpServer.URL}
server := &VisualizationServer{
resourceManager: manager,
serviceURL: httpServer.URL,
isServiceAvailable: true,
}
visualization := &go_client.Visualization{
Type: go_client.Visualization_ROC_CURVE,
Source: "gs://ml-pipeline/roc/data.csv",
2 changes: 1 addition & 1 deletion components/dataflow/predict/component.yaml
@@ -15,7 +15,7 @@ outputs:
- {name: Predictions dir, type: GCSPath, description: 'GCS or local directory.'} #Will contain prediction_results-* and schema.json files; TODO: Split outputs and replace dir with single file # type: {GCSPath: {path_type: Directory}}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:0517114dc2b365a4a6d95424af6157ead774eff3
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:151c5349f13bea9d626c988563c04c0a86210c21
command: [python2, /ml/predict.py]
args: [
--data, {inputValue: Data file pattern},
2 changes: 1 addition & 1 deletion components/dataflow/tfdv/component.yaml
@@ -18,7 +18,7 @@ outputs:
- {name: Validation result, type: String, description: Indicates whether anomalies were detected or not.}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:0517114dc2b365a4a6d95424af6157ead774eff3
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:151c5349f13bea9d626c988563c04c0a86210c21
command: [python2, /ml/validate.py]
args: [
--csv-data-for-inference, {inputValue: Inference data},
2 changes: 1 addition & 1 deletion components/dataflow/tfma/component.yaml
@@ -17,7 +17,7 @@ outputs:
- {name: Analysis results dir, type: GCSPath, description: GCS or local directory where the analysis results should were written.} # type: {GCSPath: {path_type: Directory}}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:0517114dc2b365a4a6d95424af6157ead774eff3
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:151c5349f13bea9d626c988563c04c0a86210c21
command: [python2, /ml/model_analysis.py]
args: [
--model, {inputValue: Model},
2 changes: 1 addition & 1 deletion components/dataflow/tft/component.yaml
@@ -12,7 +12,7 @@ outputs:
- {name: Transformed data dir, type: GCSPath} # type: {GCSPath: {path_type: Directory}}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:0517114dc2b365a4a6d95424af6157ead774eff3
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:151c5349f13bea9d626c988563c04c0a86210c21
command: [python2, /ml/transform.py]
args: [
--train, {inputValue: Training data file pattern},
2 changes: 1 addition & 1 deletion components/gcp/bigquery/query/README.md
@@ -89,7 +89,7 @@ KFP_PACKAGE = 'https://storage.googleapis.com/ml-pipeline/release/0.1.14/kfp.tar
import kfp.components as comp

bigquery_query_op = comp.load_component_from_url(
    'https://raw.githubusercontent.com/kubeflow/pipelines/48dd338c8ab328084633c51704cda77db79ac8c2/components/gcp/bigquery/query/component.yaml')
    'https://raw.githubusercontent.com/kubeflow/pipelines/a97f1d0ad0e7b92203f35c5b0b9af3a314952e05/components/gcp/bigquery/query/component.yaml')
help(bigquery_query_op)
```

2 changes: 1 addition & 1 deletion components/gcp/bigquery/query/component.yaml
@@ -57,7 +57,7 @@ outputs:
type: GCSPath
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:0517114dc2b365a4a6d95424af6157ead774eff3
image: gcr.io/ml-pipeline/ml-pipeline-gcp:151c5349f13bea9d626c988563c04c0a86210c21
args: [
kfp_component.google.bigquery, query,
--query, {inputValue: query},
2 changes: 1 addition & 1 deletion components/gcp/bigquery/query/sample.ipynb
@@ -108,7 +108,7 @@
"import kfp.components as comp\n",
"\n",
"bigquery_query_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/48dd338c8ab328084633c51704cda77db79ac8c2/components/gcp/bigquery/query/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/a97f1d0ad0e7b92203f35c5b0b9af3a314952e05/components/gcp/bigquery/query/component.yaml')\n",
"help(bigquery_query_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/container/component_sdk/python/setup.py
@@ -15,7 +15,7 @@
from setuptools import setup

PACKAGE_NAME = 'kfp-component'
VERSION = '0.1.26'
VERSION = '0.1.27'

setup(
name=PACKAGE_NAME,
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_python/README.md
@@ -77,7 +77,7 @@ KFP_PACKAGE = 'https://storage.googleapis.com/ml-pipeline/release/0.1.14/kfp.tar
import kfp.components as comp

dataflow_python_op = comp.load_component_from_url(
    'https://raw.githubusercontent.com/kubeflow/pipelines/48dd338c8ab328084633c51704cda77db79ac8c2/components/gcp/dataflow/launch_python/component.yaml')
    'https://raw.githubusercontent.com/kubeflow/pipelines/a97f1d0ad0e7b92203f35c5b0b9af3a314952e05/components/gcp/dataflow/launch_python/component.yaml')
help(dataflow_python_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_python/component.yaml
@@ -51,7 +51,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:0517114dc2b365a4a6d95424af6157ead774eff3
image: gcr.io/ml-pipeline/ml-pipeline-gcp:151c5349f13bea9d626c988563c04c0a86210c21
args: [
kfp_component.google.dataflow, launch_python,
--python_file_path, {inputValue: python_file_path},
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_python/sample.ipynb
@@ -95,7 +95,7 @@
"import kfp.components as comp\n",
"\n",
"dataflow_python_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/48dd338c8ab328084633c51704cda77db79ac8c2/components/gcp/dataflow/launch_python/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/a97f1d0ad0e7b92203f35c5b0b9af3a314952e05/components/gcp/dataflow/launch_python/component.yaml')\n",
"help(dataflow_python_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_template/README.md
@@ -67,7 +67,7 @@ KFP_PACKAGE = 'https://storage.googleapis.com/ml-pipeline/release/0.1.14/kfp.tar
import kfp.components as comp

dataflow_template_op = comp.load_component_from_url(
    'https://raw.githubusercontent.com/kubeflow/pipelines/48dd338c8ab328084633c51704cda77db79ac8c2/components/gcp/dataflow/launch_template/component.yaml')
    'https://raw.githubusercontent.com/kubeflow/pipelines/a97f1d0ad0e7b92203f35c5b0b9af3a314952e05/components/gcp/dataflow/launch_template/component.yaml')
help(dataflow_template_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_template/component.yaml
@@ -61,7 +61,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:0517114dc2b365a4a6d95424af6157ead774eff3
image: gcr.io/ml-pipeline/ml-pipeline-gcp:151c5349f13bea9d626c988563c04c0a86210c21
args: [
kfp_component.google.dataflow, launch_template,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_template/sample.ipynb
@@ -85,7 +85,7 @@
"import kfp.components as comp\n",
"\n",
"dataflow_template_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/48dd338c8ab328084633c51704cda77db79ac8c2/components/gcp/dataflow/launch_template/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/a97f1d0ad0e7b92203f35c5b0b9af3a314952e05/components/gcp/dataflow/launch_template/component.yaml')\n",
"help(dataflow_template_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/create_cluster/README.md
@@ -74,7 +74,7 @@ KFP_PACKAGE = 'https://storage.googleapis.com/ml-pipeline/release/0.1.14/kfp.tar
import kfp.components as comp

dataproc_create_cluster_op = comp.load_component_from_url(
    'https://raw.githubusercontent.com/kubeflow/pipelines/48dd338c8ab328084633c51704cda77db79ac8c2/components/gcp/dataproc/create_cluster/component.yaml')
    'https://raw.githubusercontent.com/kubeflow/pipelines/a97f1d0ad0e7b92203f35c5b0b9af3a314952e05/components/gcp/dataproc/create_cluster/component.yaml')
help(dataproc_create_cluster_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataproc/create_cluster/component.yaml
@@ -68,7 +68,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:0517114dc2b365a4a6d95424af6157ead774eff3
image: gcr.io/ml-pipeline/ml-pipeline-gcp:151c5349f13bea9d626c988563c04c0a86210c21
args: [
kfp_component.google.dataproc, create_cluster,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/create_cluster/sample.ipynb
@@ -92,7 +92,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_create_cluster_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/48dd338c8ab328084633c51704cda77db79ac8c2/components/gcp/dataproc/create_cluster/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/a97f1d0ad0e7b92203f35c5b0b9af3a314952e05/components/gcp/dataproc/create_cluster/component.yaml')\n",
"help(dataproc_create_cluster_op)"
]
},
