Kafka processing with Python - an example data pipeline

Add JSON messages to messages.txt. The producer will pick them up and publish them to Kafka; the transformer will then change the data and publish it to a second topic, from which a third consumer reads.
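As a rough sketch of the producer side of that flow (assuming the kafka-python client, a broker on localhost:9092, and a hypothetical topic name raw-messages; the actual code in this repo may differ):

```python
import json
import time

from kafka import KafkaProducer

# Broker address and topic name below are illustrative assumptions.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Tail messages.txt and publish each new JSON line to Kafka.
with open("messages.txt") as f:
    f.seek(0, 2)  # start at the end of the file, like `tail -f`
    while True:
        line = f.readline()
        if not line:
            time.sleep(0.5)
            continue
        producer.send("raw-messages", value=json.loads(line))
```

The transformer would run as a separate process, consuming from the first topic, modifying each message, and re-publishing to a second (again, topic names and the transformation itself are placeholders):

```python
import json

from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "raw-messages",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for record in consumer:
    data = record.value
    data["transformed"] = True  # stand-in for the real transformation
    producer.send("transformed-messages", value=data)
```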

You can view the Kafka pipeline with Kafdrop, which is defined in docker-compose.yml. Once the stack is running, open http://localhost:9000 in a browser.

Prerequisites

Docker, docker-compose.

To run the producer and consumers locally, install Python 3, then use pip3 to install the dependencies listed in requirements.txt:
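$ pip3 install -r requirements.txt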

Local Development

Build:

$ docker-compose build

Run:

$ ./scripts/run.sh

Test Suite

Run tests:

TBA

Debugging

TBA

Logs

Logs are written both to stdout and to log files.
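A minimal sketch of how dual stdout-plus-file logging is typically wired up in Python; the logger name and log file path here are assumptions, not necessarily what this repo uses:

```python
import logging
import sys

# Hypothetical setup; the actual handler configuration and log file
# path ("pipeline.log") in this repo may differ.
logger = logging.getLogger("kafka-pipeline")
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler(sys.stdout))    # stdout
logger.addHandler(logging.FileHandler("pipeline.log"))  # file

logger.info("pipeline started")
```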
