[BUG] Document level bulk request error messages are overridden by bulk level error message when max limit is reached #3507
Comments
Related to #3504.

Here is an example I got:

I also saw this while working on #3644.

We should probably keep
Is it possible to add data from the source or the event itself? We have a use case where data comes in from different applications and might fail due to field type collisions. In that case, it would be helpful to identify the origin of the events. For OTel events, this can be done via the resource attributes; for JSON messages, via particular fields of the message. Since DataPrepper parsed the message, it might have access to that kind of data and could add it to the DLQ message.
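As a very rough sketch of what such an enriched DLQ entry could carry, where all of the names (`DlqEntry`, `originAttributes`) are hypothetical stand-ins rather than the actual Data Prepper DLQ object format:

```java
import java.util.Map;

// Hypothetical shape for a DLQ entry enriched with origin metadata; this is
// not the actual Data Prepper DLQ format, just an illustration of the idea.
public class DlqOriginExample {

    record DlqEntry(String failureReason, Map<String, String> originAttributes, String rawDocument) {}

    public static void main(String[] args) {
        // Origin could come from OTel resource attributes, or from selected
        // fields of a parsed JSON message.
        Map<String, String> origin = Map.of(
                "service.name", "checkout-service",
                "source.type", "otel");
        DlqEntry entry = new DlqEntry(
                "mapper_parsing_exception: field type collision on [status]",
                origin,
                "{\"status\":{\"code\":500}}");
        // The DLQ consumer can now tell which application produced the event.
        System.out.println(entry);
    }
}
```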
Describe the bug
When a bulk request fails to write to OpenSearch, the failures are handled after `max_retries` has been exhausted. However, when logging the failure or sending it to the DLQ, the bulk-level message `Number of retries reached the limit of max retries (configured value %d)` is used instead of the document's bulkResponse with its error code and exception. As a result, the root cause of a document-level failure is hidden by the code in `data-prepper/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/BulkRetryStrategy.java` (line 251 at commit 910f451).
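To make the failure mode concrete, here is a minimal, self-contained sketch of the reported pattern. The class and method names below are assumptions, not the actual `BulkRetryStrategy` internals: once retries are exhausted, only the generic bulk-level message survives, and the per-document error is dropped.

```java
import java.util.List;

// Hypothetical illustration of the reported behavior; names are assumptions,
// not the real Data Prepper BulkRetryStrategy code.
public class RetryLimitMessageExample {

    record FailedDocument(String index, String errorReason, int status) {}

    static String buildFailureMessage(List<FailedDocument> failures, int maxRetries) {
        // Reported behavior: after max_retries is exhausted, the generic
        // bulk-level message is used for every failed document, so the
        // per-document errorReason and status never reach the log or the DLQ.
        return String.format(
                "Number of retries reached the limit of max retries (configured value %d)",
                maxRetries);
    }

    public static void main(String[] args) {
        List<FailedDocument> failures = List.of(new FailedDocument(
                "logs-2024.01",
                "mapper_parsing_exception: object mapping for [status] tried to parse field as object",
                400));
        // Prints only the generic message; the mapper_parsing_exception is lost.
        System.out.println(buildFailureMessage(failures, 10));
    }
}
```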
Expected behavior
Given how cluttered the code is here, I think the simplest fix is to always include both the bulk-level error message (if one exists) and the document-level failure in the failure message that is logged or sent to the DLQ.
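As a sketch of that combined message, again using hypothetical names (`FailedDocument`, `buildFailureMessage`) rather than the real `BulkRetryStrategy` API:

```java
import java.util.StringJoiner;

// A minimal sketch of combining bulk-level and document-level failure
// information; the types and method here are assumptions, not Data Prepper code.
public class CombinedFailureMessageExample {

    record FailedDocument(String index, String errorReason, int status) {}

    static String buildFailureMessage(FailedDocument doc, String bulkLevelMessage) {
        StringJoiner message = new StringJoiner("; ");
        // Keep the bulk-level context when it exists...
        if (bulkLevelMessage != null && !bulkLevelMessage.isBlank()) {
            message.add(bulkLevelMessage);
        }
        // ...but always append the document-level root cause.
        message.add(String.format("document failure in index [%s]: status=%d, reason=%s",
                doc.index(), doc.status(), doc.errorReason()));
        return message.toString();
    }

    public static void main(String[] args) {
        FailedDocument doc = new FailedDocument("logs-2024.01",
                "mapper_parsing_exception: field type collision on [status]", 400);
        String bulkMessage =
                "Number of retries reached the limit of max retries (configured value 10)";
        // Both the retry-limit context and the root cause are preserved.
        System.out.println(buildFailureMessage(doc, bulkMessage));
    }
}
```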