# GreenOps system backend

### UP
docker-compose up --build

### DOWN
docker-compose down
### inbound-telemetry-service
A REST API that accepts raw telemetry messages and relays them to a Kafka topic (a sketch follows the list below).
- Input: REST endpoint
http://localhost:80/process
- Output: Kafka topic
inbound-telemetry
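The relay itself is small: accept a JSON body on /process and publish it to inbound-telemetry. A minimal sketch, assuming Flask and kafka-python with the broker reachable as kafka:9092 inside the compose network (the real service may use different libraries and settings):

```python
# Minimal sketch of the relay; Flask, kafka-python and the broker address
# "kafka:9092" are assumptions, not the actual implementation.
import json
from flask import Flask, request, jsonify
from kafka import KafkaProducer

app = Flask(__name__)

producer = KafkaProducer(
    bootstrap_servers="kafka:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

@app.route("/process", methods=["POST"])
def process():
    payload = request.get_json(force=True)        # raw telemetry message
    producer.send("inbound-telemetry", payload)   # relay unchanged to Kafka
    producer.flush()
    return jsonify({"status": "queued"}), 202

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=80)
```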
Build and run standalone:
cd InboundTelemetryService/
docker build -t gos-inbound-telemetry-service .
docker run -p 80:80 -e PYTHONUNBUFFERED=1 <image id>
Sanity check:
curl --location 'http://localhost:80/process' \
--header 'Content-Type: application/json' \
--data '{
"query":"sanity"
}'
Read the inbound-telemetry topic from the Kafka container (winpty is only needed in Git Bash on Windows):
winpty docker exec -it greenopsstem-kafka-1 kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic inbound-telemetry --from-beginning
Follow the service logs:
docker-compose logs -f inbound-telemetry-service
### telemetry-writing-service
Consumes messages from a Kafka topic and persists them in a Mongo collection (a sketch follows the list below).
- Input: Kafka topic(s):
inbound-telemetry
- Output: Mongo collection (DB:collection):
gos_mongo:inbound_telemetry
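Conceptually this service is a consume-and-insert loop. A minimal sketch, assuming kafka-python and pymongo with the compose hostnames kafka and mongo (the real implementation may be structured differently):

```python
# Minimal sketch of the writer loop; kafka-python, pymongo and the hostnames
# "kafka" / "mongo" are assumptions based on the docker-compose setup.
import json
from kafka import KafkaConsumer
from pymongo import MongoClient

consumer = KafkaConsumer(
    "inbound-telemetry",
    bootstrap_servers="kafka:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    auto_offset_reset="earliest",
)
collection = MongoClient("mongodb://mongo:27017")["gos_mongo"]["inbound_telemetry"]

for message in consumer:
    # Persist each telemetry document as-is; Mongo assigns the _id.
    collection.insert_one(message.value)
```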
Build and run standalone:
cd DataWritingService/
docker build -t gos-telemetry-writing-service .
docker run -e PYTHONUNBUFFERED=1 <image id>
(Running this service standalone will most likely fail, since no Kafka broker or MongoDB is reachable outside docker-compose.)
Get all entries:
docker exec greenopsstem-mongo-1 mongosh --eval 'db.getSiblingDB("gos_mongo").inbound_telemetry.find().pretty()'
Get all entries sorted descending by timestamp:
docker exec greenopsstem-mongo-1 mongosh --eval 'db.getSiblingDB("gos_mongo").inbound_telemetry.find().sort({"timestamp": -1}).pretty()'
Follow the service logs:
docker-compose logs -f telemetry-writing-service
### telemetry-ingest-service
Reads raw telemetry messages from a Kafka topic, parses them, and pushes the results to another Kafka topic (a sketch follows the list below).
- Input: Kafka topic
inbound-telemetry
- Output: Kafka topic
branch-energy
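The core of the service is a consume-parse-produce loop. A minimal sketch, assuming kafka-python; parse_telemetry is a placeholder for the actual parsing logic, and kafka:9092 is an assumed broker address:

```python
# Minimal sketch of the consume-parse-produce loop; kafka-python and the
# broker address are assumptions, and parse_telemetry() is a placeholder.
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "inbound-telemetry",
    bootstrap_servers="kafka:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="kafka:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def parse_telemetry(raw: dict) -> dict:
    """Placeholder: turn a raw telemetry message into a branch-energy record."""
    return raw  # the real service derives the branch energy fields here

for message in consumer:
    producer.send("branch-energy", parse_telemetry(message.value))
```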
Build and run standalone:
cd InboundTelemetryService/
docker build -t gos-telemetry-ingest-service .
docker run -p 80:80 -e PYTHONUNBUFFERED=1 <image id>
Read the branch-energy topic from the Kafka container (winpty is only needed in Git Bash on Windows):
winpty docker exec -it greenopsstem-kafka-1 kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic branch-energy --from-beginning
Follow the service logs:
docker-compose logs -f telemetry-ingest-service
### branch-energy-writing-service
Consumes parsed branch-energy messages from a Kafka topic and persists them in a Mongo collection (same pattern as telemetry-writing-service above).
- Input: Kafka topic(s):
branch-energy
- Output: Mongo collection (DB:collection):
gos_mongo:branch_energy
Build and run standalone:
cd DataWritingService/
docker build -t gos-branch-energy-writing-service .
docker run -e PYTHONUNBUFFERED=1 <image id>
(Running this service standalone will most likely fail, since no Kafka broker or MongoDB is reachable outside docker-compose.)
Get all entries:
docker exec greenopsstem-mongo-1 mongosh --eval 'db.getSiblingDB("gos_mongo").branch_energy.find().pretty()'
Get all entries sorted descending by payload_timestamp:
docker exec greenopsstem-mongo-1 mongosh --eval 'db.getSiblingDB("gos_mongo").branch_energy.find().sort({"payload_timestamp": -1}).pretty()'
Get all entries sorted descending by energy_timestamp:
docker exec greenopsstem-mongo-1 mongosh --eval 'db.getSiblingDB("gos_mongo").branch_energy.find().sort({"energy_timestamp": -1}).pretty()'
Follow the service logs:
docker-compose logs -f branch-energy-writing-service
### branch-energy-reading-service
Serves branch energy data to other services through an internal REST endpoint (a sketch follows below).
- Input/output: internal REST endpoint
http://branch-energy-reading-service:5000/branch/energy
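A minimal sketch of the endpoint, assuming Flask and pymongo; the Mongo query, the repo_name/branch_name document fields, and the response shape are assumptions based on the curl example and the branch_energy collection above:

```python
# Minimal sketch of the read endpoint; Flask, pymongo, the query fields and
# the response shape are assumptions, not the actual implementation.
from flask import Flask, request, jsonify
from pymongo import MongoClient

app = Flask(__name__)
collection = MongoClient("mongodb://mongo:27017")["gos_mongo"]["branch_energy"]

@app.route("/branch/energy", methods=["POST"])
def branch_energy():
    body = request.get_json(force=True)
    # Assumed: documents carry repo_name / branch_name fields matching the request.
    docs = collection.find(
        {"repo_name": body["repo_name"], "branch_name": body["branch_name"]},
        {"_id": 0},
    )
    return jsonify(list(docs))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```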
Build and run standalone:
cd BranchEnergyReadingService/
docker build -t gos-branch-energy-reading-service .
docker run -e PYTHONUNBUFFERED=1 <image id>
(Running this service standalone will most likely fail, since no Kafka broker or MongoDB is reachable outside docker-compose.)
Sanity check (the service hostname only resolves inside the compose network, e.g. from another container):
curl -X POST http://branch-energy-reading-service:5000/branch/energy -H "Content-Type: application/json" -d '{"repo_name": "sanity-repo", "branch_name": "sanity-branch"}'
Follow the service logs:
docker-compose logs -f branch-energy-reading-service