Added information on how to run integration tests against the Confluent stack
Instructions in README.rst
Docker Compose file to start the Confluence stack
juha-aiven committed May 18, 2021
1 parent 90e92a7 commit 80b3e6e
Showing 2 changed files with 69 additions and 0 deletions.
17 changes: 17 additions & 0 deletions README.rst
@@ -166,6 +166,23 @@ create a configuration template with the correct working directory.
The integration tests are run in parallel, e.g. in the CI pipeline.
The tests need to be engineered with this in mind.

Since the integration tests exercise a Kafka cluster, Kafka REST and Kafka Schema Registry through
their APIs, they can be run not only against the Karapace instances that the tests internally
set up and tear down, but also against long-running services. This allows you to start
Karapace independently of the integration tests and, for example, inspect Kafka contents after the tests have run.

The integration tests can be configured to use an already running ZooKeeper,
Kafka (:code:`--kafka-bootstrap-servers`), Kafka REST (:code:`--rest-url`)
and Schema Registry (:code:`--registry-url`), e.g. like this::

python -m pytest -vvv --registry-url http://127.0.0.1:8081 --rest-url http://127.0.0.1:8082/ --kafka-bootstrap-servers 127.0.0.1:9092 tests

You can also run the integration tests against Kafka REST and Schema Registry from Confluent.
You are free to start these services before running the tests in any way you wish, but for convenience
you can use the provided Docker Compose file to start ZooKeeper, Kafka, Kafka REST and Schema Registry::

    docker-compose -f tests/integration/confluent-docker-compose.yml up -d
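
Once the stack is up, the tests can be pointed at these services with the same options as above; a minimal sketch, assuming the default host ports exposed by the Compose file::

    python -m pytest -vvv --registry-url http://127.0.0.1:8081 --rest-url http://127.0.0.1:8082/ --kafka-bootstrap-servers 127.0.0.1:9092 tests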

There are several coding style checks in `GitHub Actions <https://github.com/aiven/karapace/actions>`_.
Your code changes need to pass these tests. To run the checks locally,
you can run them manually::
52 changes: 52 additions & 0 deletions tests/integration/confluent-docker-compose.yml
@@ -0,0 +1,52 @@
---
version: '2'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    hostname: zookeeper
    ports:
      - "2181:2181"
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka:latest
    hostname: kafka
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
      - "9101:9101"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_CONFLUENT_SCHEMA_REGISTRY_URL: http://schema-registry:8081

  schema-registry:
    image: confluentinc/cp-schema-registry:6.1.1
    hostname: schema-registry
    depends_on:
      - kafka
    ports:
      - "8081:8081"
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: 'PLAINTEXT://kafka:29092'
      SCHEMA_REGISTRY_LISTENERS: http://0.0.0.0:8081

  rest:
    image: confluentinc/cp-kafka-rest:latest
    depends_on:
      - kafka
    ports:
      - "8082:8082"
    environment:
      KAFKA_REST_HOST_NAME: confluent-rest
      KAFKA_REST_BOOTSTRAP_SERVERS: 'kafka:29092'
      KAFKA_REST_LISTENERS: "http://rest:8082"
      KAFKA_REST_SCHEMA_REGISTRY_URL: 'http://schema-registry:8081'
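
Once the containers are running, a quick sanity check against the stack can look like this (illustrative curl calls only, assuming the default host ports mapped above)::

    curl http://127.0.0.1:8081/subjects
    curl http://127.0.0.1:8082/topics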
