Auto_flush in multiline codec doesn't work with Kafka input plugin #4771
Comments
Auto_flush doesn't happen, although it is set; the configuration is something like this: input {
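(The configuration above is truncated in the report. For illustration only, a minimal sketch of this kind of setup might look like the following; the connection settings, topic name, and timestamp pattern are placeholders, and the kafka input's option names vary between plugin versions.)

```
input {
  kafka {
    # Connection options differ by logstash-input-kafka version: older releases
    # use zk_connect/topic_id, newer ones use bootstrap_servers/topics.
    # Values here are placeholders.
    zk_connect => "localhost:2181"
    topic_id   => "app-logs"
    codec => multiline {
      # Illustrative pattern: lines that do not start with an ISO date are
      # treated as continuations of the previous line.
      pattern => "^\d{4}-\d{2}-\d{2}"
      negate  => true
      what    => "previous"
      # Flush a pending multiline event if no further line arrives within 5 seconds.
      auto_flush_interval => 5
    }
  }
}

output {
  stdout { codec => rubydebug }
}
```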
I wouldn't expect it to work well with Kafka (based on my expectations of what the multiline codec tries to solve). What are you trying to do?
Multiline is not supported in the Kafka input because messages can come out of order across different partitions. You could use the message key to group all parts of multiline data into one partition, but this is not something we've tested or heard about.
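(For illustration, one way to do that on the producer side, if the producer is itself Logstash, is to set a message key derived from the log source so that all lines from one source hash to the same partition. This is a sketch only, untested for this purpose, with placeholder values; option names depend on the logstash-output-kafka version.)

```
output {
  kafka {
    # Placeholder broker and topic settings.
    bootstrap_servers => "localhost:9092"
    topic_id          => "app-logs"
    # Keying by the originating host makes Kafka hash all events from that host
    # onto the same partition, keeping the parts of a multiline event in order.
    message_key       => "%{host}"
  }
}
```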
OK, that's really good to know! I was testing with only one Kafka partition and it worked pretty well, except that the last message of a burst was not handled because auto_flush was not working. But if there are issues when using multiple partitions, I will have to find another solution. Thanks!
@ciprianpascu I don't know what your source of data is, but if it is files, you can use Filebeat, which has multiline functionality and will soon be able to send to Kafka directly (elastic/beats#942).
Also, any Beat in general will be able to send to Kafka in the future.
According to elastic/beats#943, Beats cannot yet send directly to Kafka. Has that been implemented and released?
@TinLe it's been implemented and merged to master in elastic/beats#942. Not yet released; I believe it's targeted for the 5.0 release.
Nice! :)
My source of data is some processes sending logs, but now it seems I won't need the multiline codec, so auto_flush is not needed either :).