Bug in Resource "google_bigquery_table": hive_partition_options does not properly create the hive fields #6856
Comments
@nick-hf can you share the debug log? Thanks
There's no debug log to be shared. Terraform thinks it created the table properly; there's no error output at all. In fact, the created table in the BQ UI thinks it has hive partitioning set up, and reports the proper prefix URI and everything. But the partition fields don't exist. Using the exact same configuration via the BQ UI works fine. As I said, it also happens with the BQ command line tool, so there may be something outside just the Terraform module having problems. I don't appear to be the only person experiencing this: https://stackoverflow.com/questions/60838904/creating-external-table-from-gcs-with-hive-partition-information-in-bigquery-usi
@nick-hf I understand. With the debug log, we can verify whether the field is properly set as intended on the Terraform side, and review the responses from the API.
Plan:
Relevant bits from the apply:
Looks like this was fixed in #6693. Thanks for your assistance.
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. If you feel I made an error 🤖 🙉 , please reach out to my human friends 👉 [email protected]. Thanks!
Terraform Version
Terraform v0.12.28
Terraform Configuration Files
resource "google_bigquery_table" "xxx" {
project = xxx
dataset_id = xxx
table_id = xxx
external_data_configuration {
autodetect = false
source_format = "CSV"
max_bad_records = 0
source_uris = [
"xxx"
]
csv_options {
quote = """
field_delimiter = ","
skip_leading_rows = 1
allow_quoted_newlines = true
}
hive_partitioning_options {
mode = "STRINGS"
source_uri_prefix = xxx
}
}
schema = ...
}
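For comparison, here is a minimal sketch (not the reporter's actual configuration; project, dataset, bucket, and paths are hypothetical) of how the wildcard in source_uris, the source_uri_prefix, and the object layout are expected to line up. With mode = "STRINGS", BigQuery should derive one string column per key=value path segment between the prefix and the files, e.g. a dt column for objects under gs://example-bucket/events/dt=2020-06-01/.

resource "google_bigquery_table" "events" {
  project    = "example-project"
  dataset_id = "example_dataset"
  table_id   = "events"

  external_data_configuration {
    autodetect    = true
    source_format = "CSV"

    # The wildcard must cover the partition directories and the files inside them.
    source_uris = ["gs://example-bucket/events/*"]

    hive_partitioning_options {
      mode = "STRINGS"
      # The prefix is the common path up to, but not including, the key=value
      # segments; the dt column should be derived from the path that follows it.
      source_uri_prefix = "gs://example-bucket/events/"
    }
  }
}

The bug reported here is that an equivalent configuration creates the external table, but without the derived partition column.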
Expected Behavior
Table xxx should be created, with hive-partitioned fields that follow the source URI layout.
Actual Behavior
Table xxx is created, but the hive-partitioned fields are not generated.
Steps to Reproduce
Create an external table with this resource, backed by a hive-formatted Cloud Storage layout (such as the one sketched below).
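A "hive-formatted" backing here means objects whose paths encode partition keys as key=value segments under a common prefix, for example (bucket and keys hypothetical):

gs://example-bucket/events/dt=2020-06-01/country=US/part-000.csv
gs://example-bucket/events/dt=2020-06-01/country=DE/part-000.csv
gs://example-bucket/events/dt=2020-06-02/country=US/part-000.csv

With source_uri_prefix set to "gs://example-bucket/events/" and mode = "STRINGS", the created table is expected to expose dt and country as string columns; in the behavior reported above, those columns are missing.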
Additional Context
This may be an API problem, as we have seen the same behavior with the bq command-line interface.