kubernetes-public: BigQuery Data transfer to k8s-infra-kettle #2747
Conversation
cc @spiffxp
Force-pushed from 7998a37 to 4a3deea
Force-pushed from 4a3deea to 499bd0a
Part of: kubernetes#1308 Ref: kubernetes#787 Add a service account with workload identity to ensure the k8s service account kettle can push data to BQ dataset k8s-infra-kettle. Add a BQ data transfer job to copy data from k8s-gubernator:build to dataset k8s-infra-kettle. The job is not periodically triggered. Add a script to auto-deploy kettle on GKE cluster aaa. Signed-off-by: Arnaud Meukam <[email protected]>
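For readers unfamiliar with the setup, here is a minimal Terraform sketch of what the "service account with workload identity" part looks like. The actual resources come from the `module.aaa_kettle_sa` module seen in the apply output below (which manages a full IAM policy rather than a single member binding), and the Kubernetes namespace and service-account name in the `member` string are assumptions for illustration only.

```hcl
# Sketch only: GCP service account impersonated by the in-cluster kettle pods.
resource "google_service_account" "kettle" {
  project      = "kubernetes-public"
  account_id   = "kettle"
  display_name = "kettle"
}

# Workload Identity binding: lets the Kubernetes service account
# (namespace/name below are assumed, not taken from this PR) act as the GSA.
resource "google_service_account_iam_member" "kettle_workload_identity" {
  service_account_id = google_service_account.kettle.name
  role               = "roles/iam.workloadIdentityUser"
  member             = "serviceAccount:kubernetes-public.svc.id.goog[kettle/kettle]"
}
```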
Force-pushed from 499bd0a to decbde2
/approve
/lgtm
[APPROVALNOTIFIER] This PR is APPROVED
This pull-request has been approved by: ameukam, spiffxp
The full list of commands accepted by this bot can be found here. The pull request process is described here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing `/approve` in a comment
This is partially deployed:
module.aaa_kettle_sa.google_service_account.serviceaccount: Creating...
google_service_account.bq_kettle_data_transfer_writer: Creating...
module.aaa_kettle_sa.google_service_account.serviceaccount: Creation complete after 2s [id=projects/kubernetes-public/serviceAccounts/[email protected]]
google_service_account.bq_kettle_data_transfer_writer: Creation complete after 2s [id=projects/kubernetes-public/serviceAccounts/bq-data-transfer-kettle@kubernetes-public.iam.gserviceaccount.com]
data.google_iam_policy.prod_kettle_dataset_iam_policy: Reading... [id=3302437639]
module.aaa_kettle_sa.google_service_account_iam_policy.serviceaccount_iam: Creating...
google_project_iam_member.bq_kettle_data_transfer_jobuser_binding: Creating...
google_bigquery_data_transfer_config.bq_data_transfer_kettle: Creating...
data.google_iam_policy.prod_kettle_dataset_iam_policy: Read complete after 0s [id=551688621]
google_bigquery_dataset_iam_policy.prod_kettle_dataset: Modifying... [id=projects/kubernetes-public/datasets/k8s_infra_kettle]
module.aaa_kettle_sa.google_service_account_iam_policy.serviceaccount_iam: Creation complete after 1s [id=projects/kubernetes-public/serviceAccounts/[email protected]]
google_bigquery_dataset_iam_policy.prod_kettle_dataset: Modifications complete after 1s [id=projects/kubernetes-public/datasets/k8s_infra_kettle]
google_project_iam_member.bq_kettle_data_transfer_jobuser_binding: Creation complete after 9s [id=kubernetes-public/roles/bigquery.jobUser/serviceAccount:bq-data-transfer-kettle@kubernetes-public.iam.gserviceaccount.com]
╷
│ Error: Error creating Config: googleapi: Error 403: BigQuery Data Transfer service has not been used in project 127754664067 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/bigquerydatatransfer.googleapis.com/overview?project=127754664067 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
│
│ with google_bigquery_data_transfer_config.bq_data_transfer_kettle,
│ on k8s-kettle.tf line 84, in resource "google_bigquery_data_transfer_config" "bq_data_transfer_kettle":
│ 84: resource "google_bigquery_data_transfer_config" "bq_data_transfer_kettle" {
Part of: kubernetes#1308 Ref: kubernetes#787 Follow-up of: kubernetes#2747 Enable the GCP service required for the data transfer. Signed-off-by: Arnaud Meukam <[email protected]>
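The follow-up fixes the 403 above by enabling the BigQuery Data Transfer API on the project before the transfer config is created. A minimal sketch of that change (the exact resource name used in the follow-up PR may differ):

```hcl
# Enable the BigQuery Data Transfer API so that
# google_bigquery_data_transfer_config resources can be created.
resource "google_project_service" "bigquerydatatransfer" {
  project = "kubernetes-public"
  service = "bigquerydatatransfer.googleapis.com"

  # Keep the API enabled even if this resource is later removed from state.
  disable_on_destroy = false
}
```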
Opened and deployed #2749.
Terraform used the selected providers to generate the following execution plan. Resource actions are indicated with the following symbols:
+ create
Terraform will perform the following actions:
# google_bigquery_data_transfer_config.bq_data_transfer_kettle will be created
+ resource "google_bigquery_data_transfer_config" "bq_data_transfer_kettle" {
+ data_source_id = "cross_region_copy"
+ destination_dataset_id = "k8s_infra_kettle"
+ disabled = true
+ display_name = "BigQuery data transfer to k8s_infra_kettle"
+ id = (known after apply)
+ location = "US"
+ name = (known after apply)
+ params = {
+ "overwrite_destination_table" = "true"
+ "source_dataset_id" = "build"
+ "source_project_id" = "k8s-gubernator"
}
+ project = "kubernetes-public"
+ service_account_name = "bq-data-transfer-kettle@kubernetes-public.iam.gserviceaccount.com"
+ email_preferences {
+ enable_failure_email = false
}
}
Plan: 1 to add, 0 to change, 0 to destroy.
google_bigquery_data_transfer_config.bq_data_transfer_kettle: Creating...
google_bigquery_data_transfer_config.bq_data_transfer_kettle: Creation complete after 4s [id=projects/127754664067/locations/us/transferConfigs/617ffed3-0000-274f-bb87-d4f547e654cc]
Releasing state lock. This may take a few moments...
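For reference, the plan applied above corresponds to a resource definition roughly like the following sketch, reconstructed from the plan output; the actual definition lives in k8s-kettle.tf.

```hcl
resource "google_bigquery_data_transfer_config" "bq_data_transfer_kettle" {
  project                = "kubernetes-public"
  display_name           = "BigQuery data transfer to k8s_infra_kettle"
  location               = "US"
  data_source_id         = "cross_region_copy"
  destination_dataset_id = "k8s_infra_kettle"
  # Created disabled: the copy is triggered manually, not on a schedule.
  disabled               = true
  service_account_name   = "bq-data-transfer-kettle@kubernetes-public.iam.gserviceaccount.com"

  params = {
    source_project_id           = "k8s-gubernator"
    source_dataset_id           = "build"
    overwrite_destination_table = "true"
  }

  email_preferences {
    enable_failure_email = false
  }
}
```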
Part of: #1308
Ref: #787

Add a service account with workload identity to ensure the k8s service account kettle can push data to BQ dataset k8s-infra-kettle. Add a BQ data transfer job to copy data from k8s-gubernator:build to dataset k8s-infra-kettle. The job is not periodically triggered. Add a script to auto-deploy kettle on GKE cluster aaa.

Signed-off-by: Arnaud Meukam [email protected]