Support partial schemas in CT/CS statements #4566
big-andy-coates added a commit to big-andy-coates/ksql that referenced this issue on Feb 25, 2020:

fixes: confluentinc#4566

With this change users can now supply just the key schema and use schema inference to get the value columns. For example, if the key is an `INT` serialized using Kafka's `IntegerSerializer` and the value is an Avro record with the schema stored in the Schema Registry, then such a stream can be registered in ksqlDB with a statement such as:

```sql
-- note: only the key columns are provided between the first set of brackets
-- the value columns will be inferred from the Avro schema in the Schema Registry
CREATE STREAM users (ROWKEY INT KEY) WITH (kafka_topic='users', value_format='avro');
```
big-andy-coates added a commit that referenced this issue on Feb 26, 2020: * fix: support partial schemas (fixes: #4566)
big-andy-coates added a commit that referenced this issue on Feb 26, 2020: fixes: #4566 (cherry picked from commit 4f1ce8a)
colinhicks pushed a commit that referenced this issue on Feb 27, 2020: fixes: #4566 (cherry picked from commit 4f1ce8a)
We have Avro schema inference (and soon Protobuf/JSON schema inference). Schema inference means users don't need to supply the list of value columns. Instead, these are determined by inspecting the schema in the Schema Registry (SR) and inferring the KSQL schema.
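As an illustrative sketch (the `users` topic name is assumed, not taken from the issue), schema inference lets a stream be registered with no column list at all:

```sql
-- Sketch: no columns are listed between brackets.
-- The value columns are inferred from the Avro schema registered in SR;
-- the key defaults to a STRING ROWKEY.
CREATE STREAM users WITH (kafka_topic='users', value_format='avro');
```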
The primitive key work allows users to declare sources with non-STRING keys, e.g.
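For example (a sketch; the topic name and the `NAME`/`AGE` value columns are assumptions for illustration), a non-STRING key can be declared explicitly, but today that forces the full value schema to be written out by hand:

```sql
-- Sketch: the key column is a non-STRING type (INT),
-- but every value column must also be listed explicitly.
CREATE STREAM users (ROWKEY INT KEY, NAME STRING, AGE INT)
  WITH (kafka_topic='users', value_format='avro');
```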
These two features don't currently play nicely together: there is no way to say "I want a non-STRING key, but infer the value columns from the Avro schema".
Proposed syntax is:
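The commit message above shows the shape of the proposed syntax: only the key columns appear between the brackets, and the value columns are inferred from the registered schema:

```sql
-- Only the key columns are provided between the first set of brackets;
-- the value columns will be inferred from the Avro schema in the Schema Registry.
CREATE STREAM users (ROWKEY INT KEY) WITH (kafka_topic='users', value_format='avro');
```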