Build using logstash >=8.10 #235
Is Logstash 8.10 Apache v2 license-compatible?

Does the presence of an 8.12 logstash-oss container imply that it is? https://www.docker.elastic.co/r/logstash/logstash-oss:8.12.1

I believe so. I think as long as we use the -oss version we're A-OK, but IANAL. Do you want to attempt an upgrade @eherot?

I don't know if I have cycles for it this week, but if no one else can handle it in the next couple of days I may be able to tackle it...

Hi! I'd love to see updated Logstash for the OpenSearch project. I am also hoping for an updated Docker container 👍
**Is your feature request related to a problem? Please describe.**

Yes. As described in elastic/logstash#15908, JRuby 9.3 does not include the Zlib function needed to process gzipped S3 log files. The necessary function (`Zlib::GzipReader.zcat()`) exists only in JRuby 9.4. Logstash 8.10 is the first Logstash release to ship with JRuby 9.4.

**Describe the solution you'd like**

Upgrade upstream Logstash to >=8.10.

**Describe alternatives you've considered**

Currently I have worked around this problem by modifying the `logstash-input-s3` plugin to shell out to the system `zcat`, but that is not ideal for (hopefully) obvious reasons.

**Additional context**

To briefly describe the problem: CloudFront and AWS load balancers (and probably other services) concatenate multiple gzipped members into a single gzipped log file. When `Zlib::GzipReader.new` encounters such a file, it reads only the first member and then stops, ignoring the rest of the file. The problem is described in this bug report. As a solution, the `zcat` method was added in Ruby 3.
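A minimal Ruby sketch of the behavior described above (the helper name and sample strings are illustrative, not from the plugin code): building a file out of two concatenated gzip members, then comparing `GzipReader.new` against `GzipReader.zcat`.

```ruby
require "zlib"
require "stringio"

# Helper (illustrative): compress a string into one complete gzip member.
def gzip_member(str)
  buf = StringIO.new
  gz = Zlib::GzipWriter.new(buf)
  gz.write(str)
  gz.close
  buf.string
end

# CloudFront/ALB-style log file: two gzip members concatenated back to back.
data = gzip_member("first chunk\n") + gzip_member("second chunk\n")

# GzipReader.new reads only the first member and silently ignores the rest.
first_only = Zlib::GzipReader.new(StringIO.new(data)).read

# GzipReader.zcat (CRuby >= 3.0, JRuby >= 9.4) reads every member.
everything = Zlib::GzipReader.zcat(StringIO.new(data))
```

This is why Logstash needs JRuby 9.4: on JRuby 9.3 only the `first_only` path is available, so the tail of every multi-member log file is dropped.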