Add handling for default column values

nielm committed Jun 8, 2023
1 parent c7e1fb3 commit c3563d9

Showing 9 changed files with 308 additions and 94 deletions.
23 changes: 14 additions & 9 deletions AddingCapabilities.md

## Sync parser code with cloud spanner emulator

1) Clone the Cloud Spanner Emulator repo from `https://github.com/GoogleCloudPlatform/cloud-spanner-emulator`

1) Generate the `ddl_keywords.jjt` file in the emulator's backend/schema/parser
   module
in [src/main/jjtree-sources](src%2Fmain%2Fjjtree-sources) with the emulator's
versions in
the [backend/schema/parser](https://github.com/GoogleCloudPlatform/cloud-spanner-emulator/tree/master/backend/schema/parser)
directory)

* `ddl_expression.jjt`
* `ddl_parser.jjt`
* `ddl_string_bytes_tokens.jjt`
* `ddl_whitespace.jjt` - Note that this file does not have
  the `ValidateStringLiteral()` and `ValidateBytesLiteral()` functions - this
  is intentional.

## Invalidate AST wrappers that have been added

the tests.
## Implement the toString and equals methods for the new AST capability classes

The `toString()` and probably the `equals()` methods need to be implemented in
the new AST classes for implementing the diff capability.

For end-branches of the AST, the toString() method can be as simple as
regenerating the original tokens using the helper method:
```java
ASTTreeUtils.tokensToString(node.firstToken,node.lastToken);
```

This will iterate down the AST, regenerating the tokens into a normalized form
(i.e. single spaces, capitalized reserved words, etc.).

Once you have a `toString()` which generates a normalized form, the `equals()`
method can simply do a string comparison... Lazy but it works!
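This pattern can be sketched in a self-contained way as follows. The `Token` and `AstNode` classes below are illustrative stand-ins, not the project's real parser classes: `toString()` regenerates the tokens in normalized form, and `equals()` just compares the normalized strings.

```java
import java.util.*;
import java.util.stream.*;

// Illustrative stand-in for a generated token class.
final class Token {
  final String image;
  Token(String image) { this.image = image; }
}

// Illustrative stand-in for an end-branch AST wrapper class.
final class AstNode {
  // A tiny illustrative subset of reserved words, upper-cased on output.
  private static final Set<String> RESERVED =
      Set.of("create", "table", "int64", "not", "null");

  final List<Token> tokens;

  AstNode(String... images) {
    this.tokens =
        Arrays.stream(images).map(Token::new).collect(Collectors.toList());
  }

  // Normalized form: single spaces between tokens, reserved words capitalized.
  @Override
  public String toString() {
    return tokens.stream()
        .map(t -> RESERVED.contains(t.image.toLowerCase())
            ? t.image.toUpperCase()
            : t.image)
        .collect(Collectors.joining(" "));
  }

  // Lazy equality: two nodes are equal if their normalized forms match.
  @Override
  public boolean equals(Object o) {
    return o instanceof AstNode && toString().equals(o.toString());
  }

  @Override
  public int hashCode() { return toString().hashCode(); }
}
```

The real wrapper classes delegate to `ASTTreeUtils.tokensToString(node.firstToken, node.lastToken)` rather than building the normalized string themselves.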

However, for more complex classes like Tables, the implementation is necessarily
more complex: extracting the optional and repeated nodes (e.g. for a Table:
columns, constraints, primary key, interleave, etc.), and then rebuilding the
original statement in the `toString()`.

If you only implement some of the functionality, the `toString()` method is a
good place to put some validation - checking only for supported child nodes. See
the Table and column definition classes for examples.

Once this is done, you can run some tests in the `DDLParserTest` class to verify
that the parser works, and that the toString() method regenerates the original
statement.

## Implement difference generation

With a valid `equals()` method, the bulk of the work is handled
in `DdlDiff.build()`. The DDL is split into its components, and
`Maps.difference()` is used to compare them.
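Conceptually, the comparison buckets the named schema objects into dropped, created, and changed sets. The following is a plain-`java.util` sketch of what Guava's `Maps.difference()` computes once the old and new DDL have been split into maps of object name to definition; the object names and DDL strings are illustrative:

```java
import java.util.*;

// Sketch of the three-way diff: entries only in the old map are dropped,
// entries only in the new map are created, and entries present in both
// maps with unequal values are altered.
final class DiffSketch {
  static Map<String, Set<String>> diff(
      Map<String, String> oldDdl, Map<String, String> newDdl) {
    Set<String> dropped = new TreeSet<>(oldDdl.keySet());
    dropped.removeAll(newDdl.keySet());

    Set<String> created = new TreeSet<>(newDdl.keySet());
    created.removeAll(oldDdl.keySet());

    Set<String> altered = new TreeSet<>();
    for (String name : oldDdl.keySet()) {
      if (newDdl.containsKey(name)
          && !oldDdl.get(name).equals(newDdl.get(name))) {
        altered.add(name);
      }
    }
    return Map.of("drop", dropped, "create", created, "alter", altered);
  }
}
```

Guava's `MapDifference` exposes these three buckets directly as `entriesOnlyOnLeft()`, `entriesOnlyOnRight()`, and `entriesDiffering()`.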

Normally you will want tests for:
* Adding a DDL feature/object in the new DDL
* Removing a DDL feature/object in the new DDL
* Changing a DDL feature/object.
* Verifying that not changing the feature/object has no effect!

For a DDL object like a constraint that can be added inline in a Create Table or
by an Alter statement, you will need to add multiple versions of the add/remove
111 changes: 75 additions & 36 deletions README.md
statements in the DDL file. This has the following implications:

* Tables and indexes must be created with a single `CREATE` statement (not by
  using `CREATE` then `ALTER` statements). The exception to this is when
  constraints and row deletion policies are created - the tool supports
  creating them in the table creation DDL statement, and also by using `ALTER`
  statements after the table has been created.

### Note on dropped database objects

differences found:
Constraints and row deletion policies will always be dropped if they are not
present in the new DDL.

This also helps to prevent edge cases which are not handled by this tool.

Consider, for example, a DDL object such as a check constraint, default value
calculation, or a row deletion policy clause that has an expression referencing
a column that is going to be removed, and that will be changed to reference a
column that is being added.

For this to work properly, things need to happen in the following order:

1) the object with the expression referencing the existing column is dropped
2) the old column is dropped
3) the new column is added
4) finally, the object is re-created with the new expression referencing the
   new column.

As this tool does not _understand_ the contents of the expression, it cannot
know that this is required, so by not dropping the column, steps 1 and 2 are
not required.

### Note on modified indexes

Modifications to indexes are not possible via `ALTER` statements, but if the
will cause the tool to fail.

### Note on constraints

`FOREIGN KEY` and `CHECK` constraints _must_ be explicitly named, either within
a `CREATE TABLE` statement, or using an `ALTER TABLE` statement, using the
syntax:

Anonymous `FOREIGN KEY` or `CHECK` constraints of the form:

```sql
CREATE TABLE fk_dest (
  key INT64,
  source_key INT64,
  FOREIGN KEY (source_key) REFERENCES fk_source (key)
) PRIMARY KEY (key);
```

java -jar target/spanner-ddl-diff-*-jar-with-dependencies.jar \
### Original schema DDL input file

```sql
create table test1 (
    col1 int64,
    col2 int64,
    col3 STRING(100),
    col4 ARRAY<STRING(100)>,
    col5 float64 not null,
    col6 timestamp
) primary key (col1 desc);

create index index1 on test1 (col1);

create table test2 (
    col1 int64
) primary key (col1);

create index index2 on test2 (col1);

create table test3 (
    col1 int64,
    col2 int64
) primary key (col1, col2),
interleave in parent test2
on delete cascade;

create table test4 (
    col1 int64,
    col2 int64
) primary key (col1);
create index index3 on test4 (col2);
```

### New schema DDL input file

```sql
create table test1 (
    col1 int64,
    col2 int64 NOT NULL,
    col3 STRING(MAX),
    col4 ARRAY<STRING(200)>,
    col5 float64 not null,
    newcol7 BYTES(100)
) primary key (col1 desc);

create index index1 on test1 (col2);

create table test2 (
    col1 int64,
    newcol2 string(max)
) primary key (col1);

create index index2 on test2 (col1 desc);

create table test3 (
    col1 int64,
    col2 int64,
    col3 timestamp
) primary key (col1, col2),
interleave in parent test2;
```

DROP TABLE test4;

ALTER TABLE test1 DROP COLUMN col6;

ALTER TABLE test1 ADD COLUMN newcol7 BYTES(100);

ALTER TABLE test1 ALTER COLUMN col2 INT64 NOT NULL;

ALTER TABLE test1 ALTER COLUMN col3 STRING(MAX);

ALTER TABLE test1 ALTER COLUMN col4 ARRAY<STRING(200)>;

ALTER TABLE test2 ADD COLUMN newcol2 STRING(MAX);

ALTER TABLE test3 SET ON DELETE NO ACTION;

ALTER TABLE test3 ADD COLUMN col3 TIMESTAMP;

CREATE INDEX index1 ON test1 (col2 ASC);

import com.google.cloud.solutions.spannerddl.parser.ASTalter_table_statement;
import com.google.cloud.solutions.spannerddl.parser.ASTcheck_constraint;
import com.google.cloud.solutions.spannerddl.parser.ASTcolumn_def;
import com.google.cloud.solutions.spannerddl.parser.ASTcolumn_default_clause;
import com.google.cloud.solutions.spannerddl.parser.ASTcolumn_type;
import com.google.cloud.solutions.spannerddl.parser.ASTcreate_index_statement;
import com.google.cloud.solutions.spannerddl.parser.ASTcreate_table_statement;
private static void addColumnDiffs(
+ optionToUpdate.getValue()
+ ")");
}

// Update default values

final ASTcolumn_default_clause oldDefaultValue =
columnDiff.leftValue().getColumnDefaultClause();
final ASTcolumn_default_clause newDefaultValue =
columnDiff.rightValue().getColumnDefaultClause();
if (!Objects.equals(oldDefaultValue, newDefaultValue)) {
if (newDefaultValue == null) {
alterStatements.add(
"ALTER TABLE "
+ tableName
+ " ALTER COLUMN "
+ columnDiff.rightValue().getColumnName()
+ " DROP DEFAULT");
} else {
// add or change default value
alterStatements.add(
"ALTER TABLE "
+ tableName
+ " ALTER COLUMN "
+ columnDiff.rightValue().getColumnName()
+ " SET "
+ newDefaultValue);
}
}
}

static DdlDiff build(String originalDDL, String newDDL) throws DdlDiffException {