The test throws an `IntegrityError`. I suspect the camel casing is breaking this merge SQL: it's looking for a lowercase `id`, but the RECORD sends `Id` in camelCase.
```
sqlalchemy.exc.IntegrityError: (snowflake.connector.errors.IntegrityError) 100072 (22000): None: NULL result in a non-nullable column
E [SQL: merge into MELTANOLABS_RAW.TARGET_SNOWFLAKE_1CE306.TESTCAMELCASE d using (select $1:id::VARCHAR as ID, $1:clientname::VARCHAR as CLIENTNAME, $1:_sdc_batched_at::TIMESTAMP_NTZ as _SDC_BATCHED_AT, $1:_sdc_extracted_at::TIMESTAMP_NTZ as _SDC_EXTRACTED_AT, $1:_sdc_received_at::TIMESTAMP_NTZ as _SDC_RECEIVED_AT, $1:_sdc_deleted_at::TIMESTAMP_NTZ as _SDC_DELETED_AT, $1:_sdc_table_version::DECIMAL as _SDC_TABLE_VERSION, $1:_sdc_sequence::DECIMAL as _SDC_SEQUENCE from '@~/target-snowflake/TestCamelcase-f581638e-d5bc-42a5-a8ae-cec1d109b3ea'(file_format => MELTANOLABS_RAW.TARGET_SNOWFLAKE_1CE306."TestCamelcase-f581638e-d5bc-42a5-a8ae-cec1d109b3ea")) s on d."ID" = s."ID" when matched then update set d."ID" = s."ID", d."CLIENTNAME" = s."CLIENTNAME", d."_SDC_BATCHED_AT" = s."_SDC_BATCHED_AT", d."_SDC_EXTRACTED_AT" = s."_SDC_EXTRACTED_AT", d."_SDC_RECEIVED_AT" = s."_SDC_RECEIVED_AT", d."_SDC_DELETED_AT" = s."_SDC_DELETED_AT", d."_SDC_TABLE_VERSION" = s."_SDC_TABLE_VERSION", d."_SDC_SEQUENCE" = s."_SDC_SEQUENCE" when not matched then insert (ID, CLIENTNAME, _SDC_BATCHED_AT, _SDC_EXTRACTED_AT, _SDC_RECEIVED_AT, _SDC_DELETED_AT, _SDC_TABLE_VERSION, _SDC_SEQUENCE) values (s."ID", s."CLIENTNAME", s."_SDC_BATCHED_AT", s."_SDC_EXTRACTED_AT", s."_SDC_RECEIVED_AT", s."_SDC_DELETED_AT", s."_SDC_TABLE_VERSION", s."_SDC_SEQUENCE")]
```
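For context, here is a minimal Python sketch of the mismatch (the record values are hypothetical): Snowflake's path lookup into staged JSON is case-sensitive on element names, so selecting `$1:id` when the file only contains `Id` yields NULL.

```python
# Minimal sketch (hypothetical record values) of the key-case mismatch.
# The staged JSON record keeps the original camelCase key, while the
# generated merge SQL selects the conformed lowercase key.
staged_record = {"Id": "1", "clientname": "acme"}

# Keys the generated merge SQL selects from the staged file ($1:id, $1:clientname).
selected_keys = ["id", "clientname"]

# Snowflake's semi-structured path lookup is case-sensitive on element names,
# so "id" does not match the staged key "Id" and evaluates to NULL, which
# triggers "NULL result in a non-nullable column" for the primary key.
missing = [key for key in selected_keys if key not in staged_record]
print(missing)  # ['id']
```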
Closes #28
- Implements a fix for the test suites not running properly; also added it to the SDK in meltano/sdk#1749
- Implements many of the default test validation methods to make assertions
- Comments out the tests that fail due to legitimate bugs
- Logged the bugs:
  - #43
  - #41
  - #40
- Also logged #42 because I wrote a test to assert the exception, but I'm not actually sure whether we want that behavior
---------
Co-authored-by: Ken Payne <[email protected]>
Closes #40
The `merge_from_stage` and `copy_from_stage` methods accept a schema argument that they use to build a query selecting from the raw JSON files in the stage by their JSON keys. Since the staging data is raw, we needed to pass in the raw, unconformed schema instead of the conformed schema. The conformed schema converted the camelCase `Id` column to lowercase `id`, so the query it built was trying to select a key that didn't exist in the JSON file: the query used lowercase while the file used camelCase.
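A minimal sketch of why the schema passed to `merge_from_stage` / `copy_from_stage` matters (this is not the actual target-snowflake implementation; `build_stage_select` and the blanket VARCHAR cast are illustrative only): the `$1:<key>` expressions are derived from the schema's property names, and those names must match the keys in the staged JSON files exactly.

```python
# Illustrative sketch only -- not the real target-snowflake code.
def build_stage_select(schema: dict) -> str:
    """Build the SELECT list for reading staged JSON: one $1:<key> per property."""
    # Types are simplified to VARCHAR here; the real query casts each column's type.
    exprs = [f"$1:{name}::VARCHAR as {name.upper()}" for name in schema["properties"]]
    return ", ".join(exprs)

raw_schema = {"properties": {"Id": {}, "clientname": {}}}        # keys as they appear in the staged files
conformed_schema = {"properties": {"id": {}, "clientname": {}}}  # keys after name conformance

print(build_stage_select(conformed_schema))  # $1:id -> no such key in the file, so NULL
print(build_stage_select(raw_schema))        # $1:Id -> matches the staged key
```

Passing the raw schema means the select list is built from the keys actually present in the staged files, while the conformed names are still used for the target table's columns.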