* docs: initial commit of markdown docs (DOCS-2815)
* docs: add stub topics (DOCS-2817)
* docs: add docs_dir config
* docs: try another docs_dir config
* docs: test push for webhook
* docs: another test push for webhook
* docs: move .readthedocs.yml for RTD integration
* docs: try out another docs_dir path
* docs: try another docs_dir path
* docs: try no docs_dir path
* docs: move requirements.txt
* docs: delete old readthedocs.yml
* docs: test readthedocs.yml push
* docs: try another docs_dir
* docs: another stab in the dark
* docs: move mkdocs.yml to root
* docs: move all content up one level
* docs: fix a couple of broken links
* docs: refactor old REST API ref topic into per-endpoint topics
* docs: fix tables that were munged during migration (#3710)
* docs: try to fix munged list in admonition
* docs: copy new annotation fields content in UDF reference (DOCS-2858) (#3738)
* docs: copy new recommended KSQL server config (DOCS-2816) (#3740)
* docs: migrate partitioning topic to markdown (DOCS-2776) (#3743)
* docs: add Google Analytics tag (DOCS-2754) (#3745)
Commit 5046864, 1 parent (3d9a1e8). Showing 130 changed files with 20,587 additions and 0 deletions.
```yaml
# .readthedocs.yml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Build documentation in the docs/ directory with Sphinx
#sphinx:
#  configuration: docs/conf.py

# Build documentation with MkDocs
mkdocs:
  configuration: mkdocs.yml

# Optionally build your docs in additional formats such as PDF and ePub
formats: all

# Optionally set the version of Python and requirements required to build your docs
python:
  version: 3.7
  install:
    - requirements: docs-md/requirements.txt
```
# ksqldb

Development repo for new ksqlDB documentation
---
layout: page
title: Collections
tagline: Streams and tables
description: Learn about ksqlDB's mutable and immutable collections named tables and streams
keywords: ksqldb, stream, table
---

Collections
===========

Collections provide durable storage for sequences of events. ksqlDB offers
two kinds of collections: streams and tables. Both operate under a simple
key/value model.

Streams
-------

Streams are immutable, append-only collections. They're useful for representing
a series of historical facts. Adding multiple records with the same key means
that they are simply appended to the end of the stream.

Tables
------

Tables are mutable collections. Adding multiple records with the same key means
the table keeps only the most recent value for that key. They're helpful for
modeling change over time, and they are often used to represent aggregations.

Because ksqlDB leverages {{ site.aktm }} for its storage layer, creating a new
collection equates to defining a stream or a table over a Kafka topic. You can
declare a collection over an existing topic, or you can create a new topic for
the collection at declaration time.

TODO: expand on this

How to create a collection and a topic at the same time.
How to declare a collection over an existing topic.
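A minimal sketch of the two approaches (the stream names, columns, and formats here are illustrative assumptions, not part of the original text):

```sql
-- Create a collection and its backing topic at the same time:
-- specifying PARTITIONS asks ksqlDB to create the topic if it doesn't exist.
CREATE STREAM pageviews (viewer VARCHAR, page VARCHAR)
  WITH (kafka_topic = 'pageviews', value_format = 'JSON', partitions = 1);

-- Declare a collection over an existing topic: omit PARTITIONS, and
-- ksqlDB registers the stream over the topic that's already there.
CREATE STREAM clicks (userid VARCHAR, url VARCHAR)
  WITH (kafka_topic = 'clicks', value_format = 'JSON');
```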
---
layout: page
title: KSQL Concepts
tagline: Foundations of KSQL
description: Learn about KSQL under the hood.
keywords: ksql, architecture
---

KSQL Concepts
=============

These topics describe KSQL concepts in {{ site.cp }}.

- [KSQL Architecture](ksql-architecture.md)
- [KSQL and Kafka Streams](ksql-and-kafka-streams.md)
- [Time and Windows in KSQL](time-and-windows-in-ksql-queries.md)

Page last revised on: {{ git_revision_date }}
---
layout: page
title: KSQL and Kafka Streams
tagline: The relationship between KSQL and Kafka Streams
description: How KSQL leverages the technology of Kafka Streams
---

KSQL and Kafka Streams
======================

KSQL is the streaming database for {{ site.aktm }}. With KSQL, you
can write event streaming applications by using a SQL-like query
language.

{{ site.kstreams }} is the {{ site.aktm }} library for writing streaming
applications and microservices in Java and Scala.

KSQL is built on {{ site.kstreams }}, a robust stream processing framework
that is part of Kafka.

![The Confluent Platform stack, with KSQL built on Kafka Streams](../img/ksql-kafka-streams-core-kafka-stack.png)

KSQL gives you a query layer for building event streaming applications on Kafka
topics. KSQL abstracts away much of the complex programming that's required for
real-time operations on streams of data, so that one line of KSQL can do the
work of a dozen lines of Java or Scala.

For example, to implement simple fraud-detection logic on a Kafka topic
named `payments`, you could write one line of KSQL:

```sql
CREATE STREAM fraudulent_payments AS
  SELECT fraudProbability(data) FROM payments
  WHERE fraudProbability(data) > 0.8
  EMIT CHANGES;
```

The equivalent Scala code on Kafka Streams might resemble:
```scala
// Example fraud-detection logic using the Kafka Streams API.
import java.util.Properties

import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.Serdes._
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.scala.kstream.KStream
import org.apache.kafka.streams.{KafkaStreams, StreamsConfig}

object FraudFilteringApplication extends App {

  // The Payment case class and an implicit Serde for it are assumed
  // to be defined elsewhere in the application.
  val builder: StreamsBuilder = new StreamsBuilder()
  val fraudulentPayments: KStream[String, Payment] = builder
    .stream[String, Payment]("payments-kafka-topic")
    .filter((_, payment) => payment.fraudProbability > 0.8)
  fraudulentPayments.to("fraudulent-payments-topic")

  val config = new Properties
  config.put(StreamsConfig.APPLICATION_ID_CONFIG, "fraud-filtering-app")
  config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker1:9092")

  val streams: KafkaStreams = new KafkaStreams(builder.build(), config)
  streams.start()
}
```
KSQL is easier to use, and Kafka Streams is more flexible. Which
technology you choose for your real-time streaming applications depends
on a number of considerations. Keep in mind that you can use both KSQL
and Kafka Streams together in your implementations.

Differences Between KSQL and Kafka Streams
------------------------------------------

The following table summarizes some of the differences between KSQL and
Kafka Streams.

| Differences       | KSQL                                        | Kafka Streams                                             |
| ----------------- | ------------------------------------------- | --------------------------------------------------------- |
| You write:        | KSQL statements                             | JVM applications                                          |
| Graphical UI      | Yes, in {{ site.c3 }} and {{ site.ccloud }} | No                                                        |
| Console           | Yes                                         | No                                                        |
| Data formats      | Avro, JSON, CSV                             | Any data format, including Avro, JSON, CSV, Protobuf, XML |
| REST API included | Yes                                         | No, but you can implement your own                        |
| Runtime included  | Yes, the KSQL server                        | Applications run as standard JVM processes                |
| Queryable state   | No                                          | Yes                                                       |

Developer Workflows
-------------------

There are different workflows for KSQL and Kafka Streams when you
develop streaming applications.
- KSQL: You write KSQL queries interactively and view the results in
  real-time, either in the KSQL CLI or in {{ site.c3 }}. You can save
  a .sql file and deploy it to production as a "headless" application,
  which runs without a GUI, CLI, or REST interface on KSQL servers.
- Kafka Streams: You write code in Java or Scala, recompile, and run
  and test the application in an IDE, like IntelliJ. You deploy the
  application to production as a jar file that runs in a Kafka cluster.
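The headless KSQL deployment described above can be sketched in the server configuration; the broker address and file path are illustrative assumptions:

```properties
# Illustrative server config: pointing the KSQL server at a queries file
# starts it in headless mode, with no interactive CLI or REST access.
bootstrap.servers=kafka-broker1:9092
ksql.queries.file=/path/to/queries.sql
```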
KSQL and Kafka Streams: Where to Start?
---------------------------------------

Use the following table to help you decide between KSQL and Kafka
Streams as a starting point for your real-time streaming application
development.

| Start with KSQL when... | Start with Kafka Streams when... |
| ----------------------- | -------------------------------- |
| New to streaming and Kafka | Prefer writing and deploying JVM applications like Java and Scala; for example, due to people skills, tech environment |
| To quicken and broaden the adoption and value of Kafka in your organization | Use case is not naturally expressible through SQL, for example, finite state machines |
| Prefer an interactive experience with UI and CLI | Building microservices |
| Prefer SQL to writing code in Java or Scala | Must integrate with external services, or use 3rd-party libraries (but KSQL UDFs may help) |
| Use cases include enriching data; joining data sources; filtering, transforming, and masking data; identifying anomalous events | To customize or fine-tune a use case, for example, with the {{ site.kstreams }} Processor API: custom join variants, or probabilistic counting at very large scale with Count-Min Sketch |
| Use case is naturally expressible by using SQL, with optional help from User Defined Functions (UDFs) | Need queryable state, which KSQL doesn't support |
| Want the power of {{ site.kstreams }} but you aren't on the JVM: use the KSQL REST API from Python, Go, C#, JavaScript, shell | |
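As a sketch of that last row, a non-JVM client can submit statements over the KSQL REST API; the host, port, and statement here are assumptions for illustration:

```shell
# Illustrative request to a KSQL server assumed to listen on localhost:8088.
curl -X POST http://localhost:8088/ksql \
     -H "Content-Type: application/vnd.ksql.v1+json; charset=utf-8" \
     -d '{"ksql": "SHOW STREAMS;", "streamsProperties": {}}'
```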
Usually, KSQL isn't a good fit for BI reports, ad-hoc querying, or
queries with random access patterns, because it's a continuous query
system on data streams.

To get started with KSQL, try the [KSQL Tutorials and Examples](../tutorials/index.md).

To get started with {{ site.kstreams }}, try the [Streams Quick Start](https://docs.confluent.io/current/streams/quickstart.html).

Next Steps
----------

- [Write Streaming Queries Against {{ site.aktm }} Using KSQL](../tutorials/basics-docker.md)
- [KSQL Developer Guide](../developer-guide/index.md)
- [Streams Developer Guide](https://docs.confluent.io/current/streams/developer-guide/index.html)