I had the same issue as #112: one line in the log file was too big, and when the ElasticsearchLogShipper tried to send the data, it failed and kept retrying over and over, which can be costly when using AWS Elasticsearch. Could we make a change to skip oversized records? Such a record will fail anyway and makes the whole log shipper stop working, with no way to recover unless the line is deleted manually.
Here is some sample code:
while (count < _batchPostingLimit && TryReadLine(current, ref nextLineBeginsAtOffset, out nextLine))
{
    // Skip lines that exceed the size limit instead of adding them to the payload,
    // where the request would fail and be retried forever.
    if (nextLine.Length > 1000000) // We could make this number configurable
    {
        SelfLog.WriteLine("Line is too long ({0} characters); skipping send to Elasticsearch.", nextLine.Length);
        continue;
    }

    // Add the line to the payload as usual.
    ++count;
}
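If the threshold were made configurable, one way to sketch it is to filter the buffered lines before they are added to the payload. This is only an illustration of the proposal; the helper name and the `singleEventSizeLimit` parameter below are assumptions, not part of the sink's current API:

using System.Collections.Generic;
using Serilog.Debugging;

static class OversizedEventFilter
{
    // Yields only the lines that fit within the configured limit; oversized lines are
    // reported to Serilog's SelfLog and dropped so the rest of the batch can still ship.
    public static IEnumerable<string> WithoutOversizedLines(
        IEnumerable<string> bufferedLines, int singleEventSizeLimit)
    {
        foreach (var line in bufferedLines)
        {
            if (line.Length > singleEventSizeLimit)
            {
                SelfLog.WriteLine(
                    "Skipping oversized event ({0} characters, limit {1}).",
                    line.Length, singleEventSizeLimit);
                continue;
            }
            yield return line;
        }
    }
}

The shipper loop above could then enumerate the filtered lines, with the limit exposed as a setting on the sink options so users can tune it for their cluster.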
Thanks,