chore: READMEs and other adjustments
yuwtennis committed Jan 13, 2025
1 parent 0a0f516 commit 2870104
Showing 3 changed files with 77 additions and 18 deletions.
2 changes: 0 additions & 2 deletions Ch09/Makefile
@@ -51,5 +51,3 @@ deploy:
--region=asia-northeast1 \
--display-name=flights \
--model=$(MODEL_ID) \


36 changes: 36 additions & 0 deletions Ch09/README.md
@@ -25,6 +25,42 @@ This chapter uses Vertex AI to run custom training jobs.

## Tutorial

## Deploy endpoint

Register a model

```shell
make import-model GS_ARTIFACT_URI=MY_GS_URI
```
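
The `import-model` target is defined in Ch09/Makefile; as a rough sketch, it presumably wraps `gcloud ai models upload` along the lines below. The container image and exact flags are assumptions, so check the Makefile for the real values:

```shell
# Hypothetical expansion of `make import-model`; see Ch09/Makefile for the authoritative flags.
gcloud ai models upload \
  --region=asia-northeast1 \
  --display-name=flights \
  --container-image-uri=asia-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-8:latest \
  --artifact-uri=MY_GS_URI
```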

Create endpoint

```shell
gcloud ai endpoints create \
--display-name flights \
--region asia-northeast1
```

Deploy the model

```shell
make deploy ENDPOINT=MY_ENDPOINT MODEL_ID=MY_MODEL_ID
```
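
For reference, the Makefile's `deploy` target wraps `gcloud ai endpoints deploy-model` with the flags visible in the Ch09/Makefile diff above. Expanded, it looks roughly like the sketch below; `--machine-type` is an assumption, since it is not shown in the diff:

```shell
# Rough expansion of `make deploy`; see Ch09/Makefile for the authoritative flags.
gcloud ai endpoints deploy-model MY_ENDPOINT \
  --region=asia-northeast1 \
  --display-name=flights \
  --model=MY_MODEL_ID \
  --machine-type=n1-standard-4  # assumed, not shown in the diff
```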

Undeploy the model (look up the deployed model ID with `describe` first)

```shell
gcloud ai endpoints describe ${ENDPOINT_ID} \
--region asia-northeast1
gcloud ai endpoints undeploy-model ${ENDPOINT_ID} \
--region asia-northeast1 \
--deployed-model-id ${DEPLOYED_MODEL_ID}
```
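
The `describe` output lists the deployed models; assuming exactly one model is deployed, its ID can be captured in one step:

```shell
# Pull the first deployed model ID out of the describe output.
DEPLOYED_MODEL_ID=$(gcloud ai endpoints describe ${ENDPOINT_ID} \
  --region asia-northeast1 \
  --format='value(deployedModels[0].id)')
```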

Delete the endpoint when you are done

```shell
gcloud ai endpoints delete ${ENDPOINT_ID} \
--region=asia-northeast1
```

## Run a training job

57 changes: 41 additions & 16 deletions Ch10/README.md
@@ -7,13 +7,50 @@ At a high level, the following functions will be implemented.
* Ingest events from Google Cloud Pub/Sub
* Call the online prediction API built in Chapter 09

## Instructions

### Setting up Bigtable

Create an instance

```shell
gcloud bigtable instances create flights \
--display-name="flights" \
--cluster-config=id=datascienceongcp,nodes=1,zone=asia-northeast1-a
```

Create a table

```shell
gcloud bigtable instances tables create predictions --instance=flights --column-families=FL
```

Read data

For a complete cbt walkthrough, see
https://cloud.google.com/bigtable/docs/create-instance-write-data-cbt-cli#local-shell

```shell
gcloud components install cbt
cbt -project $(gcloud config get core/project) -instance flights read predictions
```
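
If the table is still empty at this point, you can write a throwaway cell first and read it back. The row key and column below (`testrow`, `FL:origin`) are illustrative only; the pipeline defines the real schema within the `FL` column family:

```shell
# Write a single test cell into the FL column family (hypothetical column name).
cbt -project $(gcloud config get core/project) -instance flights \
  set predictions testrow FL:origin=HND
```

Remove it afterwards with `cbt ... deleterow predictions testrow` (same `-project`/`-instance` flags).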

Clean up

```shell
gcloud bigtable instances delete flights
```

### Running Apache Beam

#### Using the direct runner

```shell
PROJECT=$(gcloud config get core/project)
OUTPUT_LOCATION=gs://${PROJECT}/flights/chapter10/output/
GOOGLE_PROJECT_ID=$PROJECT
DEPARTURE_DELAY_CSV_PATH=gs://${PROJECT}/flights/chapter8/output/delays.csv
AI_PLATFORM_LOCATION=$(gcloud config get compute/region)
AI_PLATFORM_ENDPOINT_ID="4765428530316050432"
AI_PLATFORM_ENDPOINT_ID=MY_ENDOINT_ID
TEMP_LOCATION=gs://${PROJECT}/flights/staging
BIGTABLE_INSTANCE_ID=flights
BIGTABLE_TABLE_ID=predictions
@@ -32,14 +69,16 @@ mvn compile exec:java \
--bigtableTableId=$BIGTABLE_TABLE_ID"
```
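
With the direct runner up, you can feed it a test event, since the pipeline ingests from Pub/Sub (see the function list above). The topic name and payload below are assumptions; match them to whatever subscription the pipeline is actually configured with:

```shell
# Hypothetical test event; adjust the topic name and payload to the pipeline's contract.
gcloud pubsub topics publish flights-events \
  --message='{"EVENT_TYPE":"departed","ORIGIN":"HND","DEST":"ITM","DEP_DELAY":10.0}'
```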

#### Using the Dataflow runner (tbc)

```shell
PROJECT=$(gcloud config get core/project)
OUTPUT_LOCATION=gs://${PROJECT}/flights/chapter10/output/
STAGING_LOCATION=gs://${PROJECT}/staging
GOOGLE_PROJECT_ID=$PROJECT
DEPARTURE_DELAY_CSV_PATH=gs://${PROJECT}/flights/chapter8/output/delays.csv
AI_PLATFORM_LOCATION=$(gcloud config get compute/region)
AI_PLATFORM_ENDPOINT_ID="3976559723712348160"
AI_PLATFORM_ENDPOINT_ID=MY_ENDOINT_ID
TEMP_LOCATION=gs://${PROJECT}/flights/staging
RUNNER=DataflowRunner
mvn compile exec:java \
@@ -54,17 +93,3 @@ mvn compile exec:java \
--aiPlatformLocation=$AI_PLATFORM_LOCATION \
--aiPlatformEndpointId=$AI_PLATFORM_ENDPOINT_ID"
```
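
After the job is submitted you can track or stop it from the CLI; a minimal sketch, assuming the job landed in the region held in `$AI_PLATFORM_LOCATION`:

```shell
# List active Dataflow jobs, then cancel one by ID when finished.
gcloud dataflow jobs list --region=$AI_PLATFORM_LOCATION --status=active
gcloud dataflow jobs cancel MY_JOB_ID --region=$AI_PLATFORM_LOCATION
```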

