Destination BigQuery Desnormalized: JSON table encountered too many errors, giving up. Rows: 1; errors: 1. Please look into the errors[] collection for more details #6638
Comments
I got a similar issue using a Facebook Pages source. The error:
Sync logs: logs-139-0.txt
Airbyte version: v0.30.9-alpha

Hi guys, as I mentioned in the Slack troubleshooting channel, https://airbytehq.slack.com/archives/C01MFR03D5W/p1633085634063100, I got this problem too on my GCP Compute Engine instance with Airbyte version 0.30.2. I deployed a Kubernetes version of Airbyte on GKE for my business purposes; I'm now on 0.30.9 and I got the same problem, but different this time. Now, in the Airbyte interface, my sync is successful... but...

Sync logs: logs-2-0.txt

Nothing lands in my dataset, not a single table. When I read the BigQuery log, it's the same problem. The problem is quite simple: when the schema is created, all DATE fields are keyed as STRING. But to work with the latest version of the BigQuery (denormalized typed struct) destination, version 0.1.6, all DATE fields need a FORMAT key.
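To illustrate the failure mode described above, here is a minimal, hypothetical sketch (not the actual connector code; the function name and mapping table are assumptions) of how a JSON-schema property could be mapped to a BigQuery column type. Without a `format` key, a date-bearing string property silently maps to STRING, so the destination's generated table type does not match what it later tries to load:

```python
def to_bigquery_type(prop: dict) -> str:
    """Map a JSON-schema property to a BigQuery column type (illustrative only)."""
    json_type = prop.get("type")
    if json_type == "string":
        # Strings become DATETIME/DATE only when the schema carries a
        # format key; a bare {"type": "string"} date stays STRING.
        fmt = prop.get("format")
        if fmt == "date-time":
            return "DATETIME"
        if fmt == "date":
            return "DATE"
        return "STRING"
    return {"integer": "INT64", "number": "FLOAT64", "boolean": "BOOL"}.get(json_type, "STRING")

print(to_bigquery_type({"type": "string"}))                         # STRING
print(to_bigquery_type({"type": "string", "format": "date-time"}))  # DATETIME
```

Under this assumed mapping, a source that emits dates as plain strings produces a STRING column, which is consistent with the "all DATE fields are keyed as STRING" symptom above.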
It looks like BigQuery only understands one specific format for its DATETIME type. The UI displays string, but when you look at the log, we generated something different.
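As a generic illustration of the format mismatch (this is not connector code), BigQuery's DATETIME type takes a timezone-free `YYYY-MM-DD HH:MM:SS[.ffffff]` value, so an ISO-8601 timestamp with a UTC offset, as many source APIs emit, has to be normalized before loading:

```python
from datetime import datetime, timezone

def to_bq_datetime(iso_string: str) -> str:
    """Normalize an ISO-8601 timestamp to a BigQuery DATETIME literal."""
    dt = datetime.fromisoformat(iso_string)
    if dt.tzinfo is not None:
        # DATETIME carries no offset: convert to UTC, then drop the tzinfo.
        dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
    return dt.strftime("%Y-%m-%d %H:%M:%S")

print(to_bq_datetime("2021-11-03T17:46:00+02:00"))  # 2021-11-03 15:46:00
```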
Yes, sure! We used another platform to transfer our data before, and it saved all the Shopify dates as timestamps in BigQuery, so it makes a lot of sense!
Hi everyone. I encountered the same problem. It would be great if the Airbyte platform could provide a feature to let us choose the destination data type ourselves.
This may also be related to #5959.
Hi everyone, could you upgrade to the latest version of the connector and let us know if you're still seeing issues?
Do I have to update Airbyte as well?
Nope, just the connector.
The sync is flagged as successful, and the table and schema are in BigQuery, but there's no data.
@matthieuUrsa can you share the logs from the sync?
I'm still facing the issue: Facebook Marketing 0.2.23 to BigQuery Denormalized 0.1.8. I'm using integration account creds.
It may be due to a different type of source connector (and possibly a different DATETIME format). Let me check.
Unfortunately I can't; I'm facing a log problem and I'm stuck with the message "Waiting for logs...":
https://airbytehq.slack.com/archives/C01MFR03D5W/p1636051204126900?thread_ts=1636050801.126600&cid=C01MFR03D5W
Can you tell me another way to find the logs, please?
I did the investigation. During the investigation I found that all DateTime values were correctly parsed to the BigQuery DATETIME format (this was the root cause of the previous issue). The problem occurs during the processing of the problematic field, which has a schema the destination cannot handle. My assumption is that we will have such errors whenever we process such fields.

CC: @sherifnada, @marcosmarxm
The targeting object seems to declare its type properly. However, there is a related issue which might be relevant: #7725

@alexandertsukanov I think, however, that the destination needs to be more resilient to situations like these. If the input schema is not known, then we should drop this column entirely rather than fail the entire sync.

I exported the catalog for the Facebook source. Zooming in on the problematic field:
This kind of schema is the problem. Could we implement the following fix in the BQ destination? The ticket linked above tracks the fix on the FB connector side.
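A minimal sketch of the kind of fallback discussed here (function names and behavior are hypothetical, not the destination's actual code): when a property's schema is a union (`anyOf`/`oneOf`) that cannot be mapped to one concrete struct layout, serialize the value as a JSON string instead of failing the sync:

```python
import json

def is_union_schema(prop: dict) -> bool:
    """True when the property is declared as a union of alternative shapes."""
    return any(key in prop for key in ("anyOf", "oneOf"))

def coerce_value(prop: dict, value):
    # Fallback idea: if we cannot pick one concrete schema for the column,
    # store the raw value as a JSON string so the row still loads instead
    # of tripping BigQuery's per-load error limit.
    if is_union_schema(prop):
        return json.dumps(value, sort_keys=True)
    return value

prop = {"anyOf": [{"type": "object"}, {"type": "string"}]}
print(coerce_value(prop, {"geo_locations": {"countries": ["US"]}}))
```

The column would then be declared STRING in the generated table, which keeps the sync alive at the cost of deferring parsing of that field to query time.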
@sherifnada I think the second option is preferable. I am going to implement the fix using this approach. However, it looks like this issue is a blocker only for bigquery-denormalized, not for regular BQ.
Environment
Current Behavior
Expected Behavior
Logs
logs-11-0.txt
Steps to Reproduce