We would like a feature in the coordinator that takes all incoming data arriving via remote write and streams it to a destination such as a Kafka topic. The data can be retained in its protobuf-serialized form, which allows processing on the destination end, e.g. sending it to another local coordinator for DR, or exporting it to an outside system for archival. Per the discussion, the data could simply be handed off from within the HTTP duty loop after normal coordinator processing.
(Per discussion with @robskillington Aug 22)
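A minimal sketch of what that handoff could look like, assuming the handler has access to the raw protobuf payload after the normal write path has run. The `forwarder` type, `enqueue`/`handleRemoteWrite` names, topic name, and the `segmentio/kafka-go` dependency are illustrative assumptions, not part of the coordinator:

```go
package main

import (
	"context"
	"io"
	"log"
	"net/http"
	"time"

	"github.com/segmentio/kafka-go"
)

// forwarder hands serialized remote write payloads off to a Kafka topic
// asynchronously so the HTTP duty loop is never blocked on the broker.
type forwarder struct {
	writer *kafka.Writer
	ch     chan []byte
}

func newForwarder(brokers []string, topic string) *forwarder {
	f := &forwarder{
		writer: &kafka.Writer{
			Addr:         kafka.TCP(brokers...),
			Topic:        topic,
			Balancer:     &kafka.LeastBytes{},
			BatchTimeout: 100 * time.Millisecond,
		},
		ch: make(chan []byte, 1024),
	}
	go f.loop()
	return f
}

func (f *forwarder) loop() {
	for payload := range f.ch {
		// The payload stays in its original protobuf-serialized form;
		// the consumer (a DR coordinator, an archiver) decodes it itself.
		ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
		if err := f.writer.WriteMessages(ctx, kafka.Message{Value: payload}); err != nil {
			log.Printf("kafka forward failed: %v", err)
		}
		cancel()
	}
}

// enqueue drops the payload if the buffer is full, so a slow or
// unavailable Kafka cluster cannot back-pressure the write path.
func (f *forwarder) enqueue(payload []byte) {
	select {
	case f.ch <- payload:
	default:
		log.Printf("forward buffer full, dropping payload")
	}
}

// handleRemoteWrite shows where the handoff would sit: after normal
// coordinator processing of the request body, still inside the handler.
func handleRemoteWrite(f *forwarder, process func(body []byte) error) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		body, err := io.ReadAll(r.Body)
		if err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
		if err := process(body); err != nil { // normal coordinator write path
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		f.enqueue(body) // hand off the still-serialized payload
		w.WriteHeader(http.StatusOK)
	}
}
```

The buffered channel plus drop-on-full policy is one possible design choice: it keeps the new stream strictly best-effort relative to the primary write path, which seems in the spirit of "handed off after normal coordinator processing".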