It was a problem in the ruby-lumberjack library: the ack was incorrectly calculated, causing unintended acknowledgements. This problem surfaced when there were IO errors between LSF and logstash.
This should be fixed in the latest version of the logstash-input-lumberjack plugin. Logstash 1.4.x and 1.5.x are affected by this issue; if you are running logstash 1.5, upgrade the logstash-input-lumberjack plugin.
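For intuition, here is a toy model of the windowed-ack behavior described above. This is a sketch, not the actual ruby-lumberjack code; the function and variable names are invented:

```python
# Toy model of a lumberjack-style ack window (illustrative only).
# The sender retransmits, after an IO error, only the events whose sequence
# number is greater than the last ack it received. An ack that is calculated
# too high therefore silently drops events that were never processed.

def resend_after_failure(events, last_ack):
    """Return the (seq, event) pairs the sender will retransmit."""
    return [(seq, e) for seq, e in events if seq > last_ack]

batch = list(enumerate(["a", "b", "c", "d"], start=1))  # seq 1..4

# Correct ack: the receiver processed up to seq 2, so 3 and 4 are resent.
assert resend_after_failure(batch, last_ack=2) == [(3, "c"), (4, "d")]

# Over-counted ack: seq 4 is acknowledged even though only 2 events were
# processed, so events 3 and 4 are never retransmitted and are lost.
assert resend_after_failure(batch, last_ack=4) == []
```

This is consistent with the symptom reported below: no error is raised, the forwarder simply believes more events were delivered than actually were.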
When the forwarder is started after generating heavy load, some of the logs are not processed.
I am getting messages like:
File truncated, seeking to beginning: /usr/share/tomcat/impression_logs/archives/impressions.2015-05-12-11-44.0.log
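That message comes from the forwarder's truncation check: when a tailed file is smaller than the saved read offset, the forwarder assumes it was truncated and restarts from the beginning. A minimal sketch of the idea (names are invented; this is not the actual harvester code):

```python
import os
import tempfile

# Toy sketch of truncation handling (illustrative only). A tailer records how
# far it has read; if the file on disk is now smaller than that offset, the
# file was truncated (e.g. rotated in place), so reading resumes at offset 0.

def resume_offset(path, saved_offset):
    """Where to resume reading: 0 if the file shrank below our offset."""
    size = os.path.getsize(path)
    if size < saved_offset:
        return 0  # "File truncated, seeking to beginning"
    return saved_offset

# Demo: write 10 bytes, pretend we had read all 10, then truncate to 3 bytes.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"0123456789")
    name = f.name
assert resume_offset(name, 10) == 10   # no truncation yet
with open(name, "wb") as f:
    f.write(b"abc")                    # file shrinks to 3 bytes
assert resume_offset(name, 10) == 0    # truncated: restart from the top
os.unlink(name)
```

Note that anything appended between the last read and the truncation is lost, which can look like "missing" log lines.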
My forwarder config file contains:
{
  "network": {
    "servers": [ "mySerever:6782" ],
    "ssl key": "/opt/logstash-forwarder/bin/logstash-forwarder.key",
    "ssl ca": "/opt/logstash-forwarder/bin/logstash-forwarder.crt",
    "timeout": 120
  },
  "files": [
    {
      "paths": [ "/usr/share/tomcat/impression_logs/archives/impressions*.log" ]
    }
  ]
}
I have adjusted the timeout and spool size to avoid the i/o timeout error (#438). Under heavy load with the default spool size, log messages were also being truncated (#252); setting the spool size to 200 resolved that.
I ran this test with 10000 logs on two servers (both sending logs to logstash), but only 4577 logs were processed.
I am not sure why I am getting this issue.
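For anyone reproducing this: the timeout is set in the config file above, while spool size is passed on the command line. A sketch of the invocation, assuming the install paths from the config above (flag values are illustrative, and the config filename is an assumption):

```shell
# Start logstash-forwarder with the config above and a reduced spool size.
# -spool-size controls how many events are batched before being flushed.
/opt/logstash-forwarder/bin/logstash-forwarder \
  -config /opt/logstash-forwarder/bin/logstash-forwarder.conf \
  -spool-size 200
```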