Adding replace_where incremental strategy (#293) #310
Merged
Description
This PR adds a new incremental strategy, `replace_where`. The strategy resolves to an `INSERT INTO ... REPLACE WHERE` statement. This completes the feature set explained here: https://docs.databricks.com/delta/selective-overwrite.html#replace-where&language-python

Most of the code change brings macros over from dbt-spark to dbt-databricks; the only real change is in validating incremental strategies and adding the `replace_where` strategy.
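For illustration, this is roughly the shape of the statement the strategy resolves to on Databricks SQL (the table and predicate here are made up, and the exact SQL dbt emits may differ):

```sql
-- Hypothetical output: atomically replace only the rows matching the
-- predicate while inserting the new batch.
INSERT INTO analytics.events
REPLACE WHERE event_date >= date_sub(current_date(), 3)
SELECT * FROM events__dbt_tmp;
```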
Why do we need it?
It enables use cases where part of the data is always replaced and where MERGE is not possible, such as when there is no primary key. For example: an events table where we always want to replace the last 3 days (see the sketch below).
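A minimal sketch of that use case, assuming the strategy reads its predicates from the `incremental_predicates` config (the model name, source, and column are hypothetical):

```sql
-- models/events.sql
{{
    config(
        materialized = 'incremental',
        incremental_strategy = 'replace_where',
        incremental_predicates = ["event_date >= date_sub(current_date(), 3)"]
    )
}}

select * from {{ ref('stg_events') }}
{% if is_incremental() %}
-- Reselect only the window being replaced, matching the predicate above.
where event_date >= date_sub(current_date(), 3)
{% endif %}
```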
Difference from insert_overwrite
`insert_overwrite` only works when Spark's dynamic partition overwrite setting (`spark.sql.sources.partitionOverwriteMode = dynamic`) is enabled, which is not available on SQL warehouses or any Unity Catalog-enabled cluster. It also operates only on whole partitions, making it difficult to set up and to ensure that the correct data is dropped.
Checklist
I have updated `CHANGELOG.md` and added information about my change to the "dbt-databricks next" section.