fluentd crashes with "Too many open files" #1696
Hmm... The root cause seems to be fluentd/lib/fluent/plugin/in_tail.rb, line 694 at bcdde3c.
But it means in_tail skips the target file. I don't have a Raspbian system, so I'm not sure why this error happens on Raspbian. Maybe one file is larger than 2^32 bytes?
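For illustration, assuming the failure is an offset overflow on a 32-bit build (my guess, not confirmed in the thread), this is the kind of call that produces a Bignum conversion error:

```ruby
# Sketch: on a Ruby build where IO#seek's offset must fit a native long
# (e.g. some 32-bit ARM builds), a position >= 2**32 raises
# RangeError: bignum too big to convert into `long'
File.open("/var/log/syslog") do |f|
  f.seek(2**32, IO::SEEK_SET)
end
```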
How can I trace the origin of the Bignum conversion error? The stack trace above only shows that the in_tail plugin was unable to open that file. The files are definitely not 4 GB in size, rather 500 KB to a few megabytes at most. Could it be that the file descriptor number is too big? When is the file closed during normal operation? What happens if the file disappears (because of rotation) after fluentd has opened it but before it has read it entirely? I am using the Kafka output.
Applying the following patch shows what value is passed to `seek`:
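(The patch body is not preserved above; this is a minimal sketch of the idea, with variable names assumed rather than copied from in_tail.rb: log the offset just before seeking.)

```ruby
# Hypothetical debug hook around the seek in in_tail.rb (names assumed):
pos = @pe.read_pos                          # offset recorded in the pos file
$log.info "in_tail: seeking #{@path} to pos=#{pos}"
begin
  @io.seek(pos, IO::SEEK_SET)
rescue RangeError => e
  $log.error "in_tail: seek to #{pos} failed for #{@path}: #{e.message}"
  raise
end
```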
If this …
BTW, your log has …
Merge #1742.
I have the same error on fluentd 1.0.2, but it seems to be caused by ForwardInput:
I have upgraded to the latest 1.2.1 version and still have the "Too many open files" issue.
I have updated my configuration accordingly and set up debug mode:
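(The poster's configuration is not shown. For reference, turning on debug output globally is usually done with the standard `<system>` directive; this is a generic sketch, not the actual config:)

```
<system>
  log_level debug
</system>
```

Alternatively, starting fluentd with `-v` or `-vv` raises the log level for a single run.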
If you need more debug output, I'm ready to help.
You need to check your fd limits and the number of open files/sockets/threads.
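As a concrete way to run that check (my sketch, not from the thread), Ruby on Linux can report both the limit and the current usage:

```ruby
# Compare the process's fd limit against its actual open-fd count.
soft, hard = Process.getrlimit(:NOFILE)
puts "fd limit: soft=#{soft} hard=#{hard}"
# Count entries under /proc/<pid>/fd; substitute fluentd's pid to inspect it.
puts "open fds: #{Dir.children("/proc/#{Process.pid}/fd").size}"
```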
After some time, fluentd crashes with the error "Too many open files". This problem only arises in connection with fast-rolling log files (sometimes a file exists for only about 10 seconds, which is exactly why fluentd was needed to trace those logs). Other applications without the fast-rolling log behavior do not have this problem. Increasing the hard/soft limits on the number of open files in /etc/security/limits.conf merely delays the first occurrence of the problem.
It seems like fluentd is not able to close those files fast enough to keep reading at that pace. Maybe this is not an issue as long as there are not that many files to read.
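For readers hitting the same symptom with fast-rotating files, in_tail's rotation handling can be tuned. This is a sketch built on its standard parameters (rotate_wait, refresh_interval), not the reporter's actual config; the path and tag are hypothetical:

```
<source>
  @type tail
  path /var/log/app/*.log          # hypothetical path
  pos_file /var/lib/fluentd/app.pos
  tag app.logs
  rotate_wait 2                    # seconds a rotated file is kept open (default 5)
  refresh_interval 10              # rescan the watch list more often (default 60)
  <parse>
    @type none
  </parse>
</source>
```

A shorter rotate_wait releases handles on rotated files sooner, at the risk of missing lines appended just after rotation.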
The fluentd process quits with the following stack trace: