diff --git a/docs/faq.rst b/docs/faq.rst
index 03c9c4c9ee5d..9a81e4de3954 100644
--- a/docs/faq.rst
+++ b/docs/faq.rst
@@ -145,15 +145,15 @@ How can I lock-down KSQL servers for production and prevent interactive client a
 You can configure your servers to run a set of predefined queries by using ``ksql.queries.file`` or the
 ``--queries-file`` command line flag. For more information, see :ref:`ksql-server-config`.
 
-====================================================================
-How do I use Avro data and integrate with Confluent Schema Registry?
-====================================================================
+====================================================
+How do I use Avro data and integrate with |sr-long|?
+====================================================
 
-Configure the ``ksql.schema.registry.url`` property in the KSQL server configuration to point to Schema Registry
+Configure the ``ksql.schema.registry.url`` property in the KSQL server configuration to point to |sr|
 (see :ref:`install_ksql-avro-schema`).
 
 .. important::
-    - To use Avro data with KSQL you must have Schema Registry installed. This is included by default with |cp|.
+    - To use Avro data with KSQL you must have |sr| installed. This is included by default with |cp|.
     - Avro message values are supported. Avro keys are not yet supported.
 
 =========================
@@ -200,11 +200,11 @@ Will KSQL work with a Apache Kafka cluster secured using Kafka ACLs?
 
 Yes. For more information, see :ref:`config-security-ksql-acl`.
 
-======================================================
-Will KSQL work with a HTTPS Confluent Schema Registry?
-======================================================
+=======================================
+Will KSQL work with an HTTPS |sr-long|?
+=======================================
 
-Yes. KSQL can be configured to communicate with the Confluent Schema Registry over HTTPS. For more information, see
+Yes. KSQL can be configured to communicate with |sr-long| over HTTPS.
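+As a concrete sketch (the hostname and port below are illustrative; substitute the address of your own
+|sr| deployment), the setting in ``ksql-server.properties`` is a single line:
+
+.. code:: bash
+
+    ksql.schema.registry.url=https://schema-registry.example.com:8081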
 For more information, see :ref:`config-security-ksql-sr`.
 
 ================================================
diff --git a/docs/includes/ksql-includes.rst b/docs/includes/ksql-includes.rst
index cc5bebb2ca5c..b523f2cd730e 100644
--- a/docs/includes/ksql-includes.rst
+++ b/docs/includes/ksql-includes.rst
@@ -278,7 +278,7 @@ the latest offset.
 #. Create a new persistent query that counts the pageviews for each region and gender combination in a
    :ref:`tumbling window ` of 30 seconds when the count is greater than one. Results from this query are written
    to the ``PAGEVIEWS_REGIONS`` Kafka topic in the Avro format. KSQL will register the Avro schema with the
-   configured schema registry when it writes the first message to the ``PAGEVIEWS_REGIONS`` topic.
+   configured |sr| when it writes the first message to the ``PAGEVIEWS_REGIONS`` topic.
 
    .. code:: sql
diff --git a/docs/installation/server-config/avro-schema.rst b/docs/installation/server-config/avro-schema.rst
index 3066a4c751e4..1fe32c146d2e 100644
--- a/docs/installation/server-config/avro-schema.rst
+++ b/docs/installation/server-config/avro-schema.rst
@@ -1,106 +1,104 @@
-.. _install_ksql-avro-schema:
-
-Avro and Schema Registry
-========================
-
-KSQL can read and write messages in Avro format by integrating with :ref:`Confluent Schema Registry `.
-KSQL automatically retrieves (read) and registers (write) Avro schemas as needed and thus saves you from both having to
-manually define columns and data types in KSQL and from manual interaction with the Schema Registry.
-
-.. contents:: Contents
-    :local:
-
-Supported functionality
-^^^^^^^^^^^^^^^^^^^^^^^
-
-KSQL currently supports Avro data in the Kafka message values.
-
-Avro schemas with nested fields are supported. In KSQL 5.0 and higher, you can read nested data, in Avro and JSON
-formats, by using the STRUCT type. You can’t create new nested STRUCT data as the result of a query, but you can copy existing
-STRUCT fields as-is. For more info, see the :ref:`KSQL syntax reference`.
-
-The following functionality is not supported:
-
-- Message keys in Avro format are not supported. Message keys in KSQL are always interpreted as STRING format, which means
-  KSQL will ignore Avro schemas that have been registered for message keys, and the key will be read using ``StringDeserializer``.
-
-Configuring KSQL for Avro
-^^^^^^^^^^^^^^^^^^^^^^^^^
-
-You must configure the REST endpoint of |sr| by setting ``ksql.schema.registry.url`` (default: ``http://localhost:8081``)
-in the KSQL server configuration file (``/etc/ksql/ksql-server.properties``). For more information,
-see :ref:`install_ksql-server`.
-
-.. important:: Do not use the SET statement in the KSQL CLI to configure the registry endpoint.
-
-Using Avro in KSQL
-^^^^^^^^^^^^^^^^^^
-
-Before using Avro in KSQL, make sure that |sr| is up and running and that ``ksql.schema.registry.url`` is set correctly
-in the KSQL properties file (defaults to ``http://localhost:8081``). |sr| is :ref:`included by default ` with
-|cp|.
-
-Here's what you can do with Avro in KSQL:
-
-- Declare streams and tables on Kafka topics with Avro-formatted data by using ``CREATE STREAM`` and ``CREATE TABLE`` statements.
-- Read from and write into Avro-formatted data by using ``CREATE STREAM AS SELECT`` and ``CREATE TABLE AS SELECT`` statements.
-- Create derived streams and tables from existing streams and tables with ``CREATE STREAM AS SELECT`` and
-  ``CREATE TABLE AS SELECT`` statements.
-- Convert data to different formats with ``CREATE STREAM AS SELECT`` and ``CREATE TABLE AS SELECT`` statements. For example,
-  you can convert a stream from Avro to JSON.
-
-Example: Create a new stream ``pageviews`` by reading from a Kafka topic with Avro-formatted messages.
-  .. code:: sql
-
-      CREATE STREAM pageviews
-        WITH (KAFKA_TOPIC='pageviews-avro-topic',
-              VALUE_FORMAT='AVRO');
-
-Example: Create a new table ``users`` by reading from a Kafka topic with Avro-formatted messages.
-  In this example you don’t need to define any columns or data types in the CREATE statement because KSQL automatically
-  infers this information from the latest registered Avro schema for topic ``pageviews-avro-topic`` (i.e., the latest
-  schema at the time the statement is first executed).
-
-  .. code:: sql
-
-      CREATE TABLE users
-        WITH (KAFKA_TOPIC='users-avro-topic',
-              VALUE_FORMAT='AVRO',
-              KEY='userid');
-
-
-
-  If you want to create a STREAM or TABLE with only a subset of all the
-  available fields in the Avro schema, then you must explicitly define the
-  columns and data types.
-
-Example: Create a new stream ``pageviews_reduced``
-  Similar to the previous example, but with only a few of all the available fields in the Avro data. In this example,
-  only the ``viewtime`` and ``pageid`` columns are picked).
-
-  .. code:: sql
-
-      CREATE STREAM pageviews_reduced (viewtime BIGINT, pageid VARCHAR)
-        WITH (KAFKA_TOPIC='pageviews-avro-topic',
-              VALUE_FORMAT='AVRO');
-
-  KSQL allows you to work with streams and tables regardless of their underlying data format. This means that you can
-  easily mix and match streams and tables with different data formats and also convert between data formats. For
-  example, you can join a stream backed by Avro data with a table backed by JSON data.
-
-Example: Convert a JSON stream into an Avro stream.
-  In this example only the ``VALUE_FORMAT`` is required for Avro to achieve the data conversion. KSQL automatically
-  generates an appropriate Avro schema for the new ``pageviews_avro`` stream, and it registers the schema with |sr|.
-
-  .. code:: sql
-
-      CREATE STREAM pageviews_json (viewtime BIGINT, userid VARCHAR, pageid VARCHAR)
-        WITH (KAFKA_TOPIC='pageviews-json-topic', VALUE_FORMAT='JSON');
-
-      CREATE STREAM pageviews_avro
-        WITH (VALUE_FORMAT = 'AVRO') AS
-        SELECT * FROM pageviews_json;
-
-
-
-
\ No newline at end of file
+.. _install_ksql-avro-schema:
+
+Configuring Avro and |sr| for KSQL
+==================================
+
+KSQL can read and write messages in Avro format by integrating with :ref:`Confluent Schema Registry `.
+KSQL automatically retrieves (read) and registers (write) Avro schemas as needed and thus saves you from both having to
+manually define columns and data types in KSQL and from manual interaction with |sr|.
+
+.. contents:: Contents
+    :local:
+
+Supported functionality
+^^^^^^^^^^^^^^^^^^^^^^^
+
+KSQL currently supports Avro data in Kafka message values.
+
+Avro schemas with nested fields are supported. In KSQL 5.0 and higher, you can read nested data, in Avro and JSON
+formats, by using the STRUCT type. You can’t create new nested STRUCT data as the result of a query, but you can copy existing
+STRUCT fields as-is. For more information, see the :ref:`KSQL syntax reference`.
+
+The following functionality is not supported:
+
+- Message keys in Avro format are not supported. Message keys in KSQL are always interpreted as STRING format, which means
+  KSQL will ignore Avro schemas that have been registered for message keys, and the key will be read using ``StringDeserializer``.
+
+Configuring KSQL for Avro
+^^^^^^^^^^^^^^^^^^^^^^^^^
+
+You must configure the REST endpoint of |sr| by setting ``ksql.schema.registry.url`` (default: ``http://localhost:8081``)
+in the KSQL server configuration file (``/etc/ksql/ksql-server.properties``). For more information,
+see :ref:`install_ksql-server`.
+
+.. important:: Do not use the SET statement in the KSQL CLI to configure the registry endpoint.
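+For reference, a minimal sketch of the relevant entry in ``/etc/ksql/ksql-server.properties`` (the value shown
+is the default; change it if |sr| listens elsewhere):
+
+.. code:: bash
+
+    ksql.schema.registry.url=http://localhost:8081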
+
+Using Avro in KSQL
+^^^^^^^^^^^^^^^^^^
+
+Before using Avro in KSQL, make sure that |sr| is up and running and that ``ksql.schema.registry.url`` is set correctly
+in the KSQL properties file (defaults to ``http://localhost:8081``). |sr| is :ref:`included by default ` with
+|cp|.
+
+Here's what you can do with Avro in KSQL:
+
+- Declare streams and tables on Kafka topics with Avro-formatted data by using ``CREATE STREAM`` and ``CREATE TABLE`` statements.
+- Read from and write into Avro-formatted data by using ``CREATE STREAM AS SELECT`` and ``CREATE TABLE AS SELECT`` statements.
+- Create derived streams and tables from existing streams and tables with ``CREATE STREAM AS SELECT`` and
+  ``CREATE TABLE AS SELECT`` statements.
+- Convert data to different formats with ``CREATE STREAM AS SELECT`` and ``CREATE TABLE AS SELECT`` statements. For example,
+  you can convert a stream from Avro to JSON.
+
+Example: Create a new stream ``pageviews`` by reading from a Kafka topic with Avro-formatted messages.
+  .. code:: sql
+
+      CREATE STREAM pageviews
+        WITH (KAFKA_TOPIC='pageviews-avro-topic',
+              VALUE_FORMAT='AVRO');
+
+Example: Create a new table ``users`` by reading from a Kafka topic with Avro-formatted messages.
+  In this example you don’t need to define any columns or data types in the CREATE statement because KSQL automatically
+  infers this information from the latest registered Avro schema for topic ``users-avro-topic`` (i.e., the latest
+  schema at the time the statement is first executed).
+
+  .. code:: sql
+
+      CREATE TABLE users
+        WITH (KAFKA_TOPIC='users-avro-topic',
+              VALUE_FORMAT='AVRO',
+              KEY='userid');
+
+  If you want to create a STREAM or TABLE with only a subset of all the
+  available fields in the Avro schema, then you must explicitly define the
+  columns and data types.
+
+Example: Create a new stream ``pageviews_reduced``
+  Similar to the previous example, but with only a few of the available fields in the Avro data. In this example,
+  only the ``viewtime`` and ``pageid`` columns are picked.
+
+  .. code:: sql
+
+      CREATE STREAM pageviews_reduced (viewtime BIGINT, pageid VARCHAR)
+        WITH (KAFKA_TOPIC='pageviews-avro-topic',
+              VALUE_FORMAT='AVRO');
+
+  KSQL allows you to work with streams and tables regardless of their underlying data format. This means that you can
+  easily mix and match streams and tables with different data formats and also convert between data formats. For
+  example, you can join a stream backed by Avro data with a table backed by JSON data.
+
+Example: Convert a JSON stream into an Avro stream.
+  In this example only the ``VALUE_FORMAT`` is required for Avro to achieve the data conversion. KSQL automatically
+  generates an appropriate Avro schema for the new ``pageviews_avro`` stream, and it registers the schema with |sr|.
+
+  .. code:: sql
+
+      CREATE STREAM pageviews_json (viewtime BIGINT, userid VARCHAR, pageid VARCHAR)
+        WITH (KAFKA_TOPIC='pageviews-json-topic', VALUE_FORMAT='JSON');
+
+      CREATE STREAM pageviews_avro
+        WITH (VALUE_FORMAT = 'AVRO') AS
+        SELECT * FROM pageviews_json;
diff --git a/docs/installation/server-config/config-reference.rst b/docs/installation/server-config/config-reference.rst
index 0224191d81a8..8d684ea4820c 100644
--- a/docs/installation/server-config/config-reference.rst
+++ b/docs/installation/server-config/config-reference.rst
@@ -119,7 +119,7 @@ properties file:
 
 ksql.schema.registry.url
 ------------------------
 
-The Schema Registry URL path to connect KSQL to.
+The |sr| URL to connect KSQL to.
 
 .. _ksql-service-id:
diff --git a/docs/installation/server-config/security.rst b/docs/installation/server-config/security.rst
index cbf854487aaf..e4d9b11d5d1f 100644
--- a/docs/installation/server-config/security.rst
+++ b/docs/installation/server-config/security.rst
@@ -3,7 +3,7 @@
 Configuring Security for KSQL
 =============================
 
-KSQL supports many of the security features of both Apache Kafka and the |sr|.
+KSQL supports many of the security features of both Apache Kafka and |sr|.
 
 - KSQL supports Apache Kafka security features such as :ref:`SSL for encryption `,
   :ref:`SASL for authentication `, and :ref:`authorization with ACLs `.
@@ -29,33 +29,33 @@ You can use KSQL with a Kafka cluster in |ccloud|. For more information, see :re
 
 .. _config-security-ksql-sr:
 
-Configuring KSQL for Secured Confluent Schema Registry
------------------------------------------------------
+Configuring KSQL for Secured |sr-long|
+--------------------------------------
 
-KSQL can be configured to connect to the Schema Registry over HTTP by setting the
-``ksql.schema.registry.url`` to the Schema Registry's HTTPS endpoint.
+You can configure KSQL to connect to |sr| over HTTPS by setting the
+``ksql.schema.registry.url`` to the HTTPS endpoint of |sr|.
 
 Depending on your security setup, you might also need to supply additional SSL configuration.
-For example, a trustStore is required if the Schema Registry's SSL certificates are not trusted by
-the JVM by default; a keyStore is required if the Schema Registry requires mutual authentication.
+For example, a trustStore is required if the |sr| SSL certificates are not trusted by
+the JVM by default; a keyStore is required if |sr| requires mutual authentication.
 
-SSL configuration for communication with the Schema Registry can be supplied using none-prefixed,
-e.g. `ssl.truststore.location`, or prefixed e.g. `ksql.schema.registry.ssl.truststore.location`,
-names. Non-prefixed names are used for settings that are shared with other communication
+You can configure SSL for communication with |sr| by using non-prefixed names,
+e.g. ``ssl.truststore.location``, or prefixed names, e.g. ``ksql.schema.registry.ssl.truststore.location``.
+Non-prefixed names are used for settings that are shared with other communication
 channels, i.e. where the same settings are required to configure SSL communication
-with both Kafka and Schema Registry. Prefixed names only affects communication with Schema registry
-and overrides any non-prefixed setting of the same name.
+with both Kafka and |sr|. Prefixed names only affect communication with |sr|
+and override any non-prefixed setting of the same name.
 
-Use the following to configure KSQL to communicate with the Schema Registry over HTTPS,
-where mutual authentication is not required and the Schema Registry's SSL certificates are trusted
+Use the following to configure KSQL to communicate with |sr| over HTTPS,
+where mutual authentication is not required and |sr| SSL certificates are trusted
 by the JVM:
 
 .. code:: bash
 
     ksql.schema.registry.url=https://:
 
-Use the following to configure KSQL to communicate with the Schema Registry over HTTPS, with
+Use the following to configure KSQL to communicate with |sr| over HTTPS, with
 mutual authentication, with an explicit trustStore, and where the SSL configuration is shared
-between Kafka and Schema Registry:
+between Kafka and |sr|:
 
 .. code:: bash
 
@@ -66,9 +66,9 @@ between Kafka and Schema Registry:
     ssl.keystore.password=confluent
     ssl.key.password=confluent
 
-Use the following to configure KSQL to communicate with the Schema Registry over HTTP, without
+Use the following to configure KSQL to communicate with |sr| over HTTPS, without
 mutual authentication and with an explicit trustStore. These settings explicitly configure only
-KSQL to Schema Registry SSL communication.
+KSQL to |sr| SSL communication.
 
 .. code:: bash
 
@@ -76,10 +76,10 @@ KSQL to Schema Registry SSL communication.
     ksql.schema.registry.ssl.truststore.location=/etc/kafka/secrets/sr.truststore.jks
     ksql.schema.registry.ssl.truststore.password=confluent
 
-The exact settings will vary depending on the encryption and authentication mechanisms the
-Confluent Schema Registry is using, and how your SSL certificates are signed.
+The exact settings will vary depending on the encryption and authentication mechanisms
+|sr| is using, and how your SSL certificates are signed.
 
-You can pass authentication settings to the Schema Registry client used by KSQL
+You can pass authentication settings to the |sr| client used by KSQL
 by adding the following to your KSQL server config.
 
 .. code:: bash
diff --git a/docs/operations.rst b/docs/operations.rst
index 6e2e5f578ea4..9a23587ec247 100644
--- a/docs/operations.rst
+++ b/docs/operations.rst
@@ -205,7 +205,7 @@ associated Avro schemas, they aren't automatically matched with the renamed
 topics.
 
 In the KSQL CLI, the ``PRINT`` statement for a replicated topic works, which shows
-that the Avro schema ID exists in the Schema Registry, and KSQL can deserialize
+that the Avro schema ID exists in |sr|, and KSQL can deserialize
 the Avro message. But ``CREATE STREAM`` fails with a deserialization error:
 
 .. code:: bash
diff --git a/docs/tutorials/basics-local.rst b/docs/tutorials/basics-local.rst
index 7ed584b6d7fd..88d8cfba8bc9 100644
--- a/docs/tutorials/basics-local.rst
+++ b/docs/tutorials/basics-local.rst
@@ -13,7 +13,7 @@ Watch the `screencast of Reading Kafka Data from KSQL
 ` is installed and running. This installation includes a Kafka broker, KSQL, |c3-short|,
-  |zk|, Schema Registry, REST Proxy, and Kafka Connect.
+  |zk|, |sr|, REST Proxy, and Kafka Connect.
 - If you installed |cp| via TAR or ZIP, navigate into the installation directory. The paths and commands used
   throughout this tutorial assume that you are in this installation directory.
 - Java: Minimum version 1.8. Install Oracle Java JRE or JDK >= 1.8 on your local machine
diff --git a/docs/tutorials/generate-custom-test-data.rst b/docs/tutorials/generate-custom-test-data.rst
index 0ed4d6c94518..a4dc6cfc7e03 100644
--- a/docs/tutorials/generate-custom-test-data.rst
+++ b/docs/tutorials/generate-custom-test-data.rst
@@ -14,7 +14,7 @@ provide.
 
 - :ref:`Confluent Platform ` is installed and running. This installation
   includes a Kafka broker, KSQL, |c3-short|, |zk|,
-  Schema Registry, REST Proxy, and Kafka Connect.
+  |sr|, REST Proxy, and Kafka Connect.
 - If you installed |cp| via TAR or ZIP, navigate to the installation directory. The paths and commands used throughout
   this tutorial assume that you're in this installation directory.