I took over maintenance of an application that uses Spring Kafka. My former colleague configured the application to make use of ErrorHandlingDeserializer, but he didn't configure a failedDeserializationFunction and didn't test what happens in case deserialization errors occur.
That application finally got deployed to a real testing environment, and the application's KafkaListener component kept getting records with null values. I couldn't figure out why the values were null. When looking at the data in the topic using the Kafka CLI tools, I saw that the records did have values. Other than those record value NPEs, there weren't any errors in the application logs.
After some time spent debugging, I found out that ErrorHandlingDeserializer silently swallows exceptions when failedDeserializationFunction isn't set and passes null values along instead.
Is this a good design? Would one ever want to use ErrorHandlingDeserializer without a failedDeserializationFunction, or should it be a mandatory field?
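For context, a minimal sketch of how the function can be wired in via consumer properties. MyValue and MyFailureHandler are hypothetical placeholder types, not part of Spring Kafka; the property constants and FailedDeserializationInfo are real Spring Kafka API:

```java
// Sketch: configuring ErrorHandlingDeserializer with a
// failedDeserializationFunction so failures produce a fallback value
// instead of a silent null. MyValue and MyFailureHandler are
// hypothetical placeholders for your own types.
Map<String, Object> props = new HashMap<>();
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
// the real deserializer that ErrorHandlingDeserializer delegates to
props.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);
// invoked when the delegate throws; its return value replaces the null
props.put(ErrorHandlingDeserializer.VALUE_FUNCTION, MyFailureHandler.class);

// where MyFailureHandler implements Function<FailedDeserializationInfo, MyValue>
public class MyFailureHandler implements Function<FailedDeserializationInfo, MyValue> {

    @Override
    public MyValue apply(FailedDeserializationInfo info) {
        // info.getData() holds the raw bytes that failed to deserialize
        return new MyValue(/* fallback derived from info */);
    }

}
```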
Perhaps you have a batch listener - with a record listener, such records are sent directly to the error handler and are never sent to the listener.
With a batch listener, the entire batch is sent to the listener and you have to examine the headers to see if a deserialization exception occurred on any of the records.
When using an ErrorHandlingDeserializer with a batch listener, you must check for the deserialization exceptions in the message headers. When used with a RecoveringBatchErrorHandler, you can use that header to determine which record the deserialization failed on and communicate that to the error handler via a BatchListenerFailedException.
@KafkaListener(id = "test", topics = "test")
void listen(List<Thing> in, @Header(KafkaHeaders.BATCH_CONVERTED_HEADERS) List<Map<String, Object>> headers) {
    for (int i = 0; i < in.size(); i++) {
        Thing thing = in.get(i);
        if (thing == null
                && headers.get(i).get(SerializationUtils.VALUE_DESERIALIZER_EXCEPTION_HEADER) != null) {
            DeserializationException deserEx = ListenerUtils.byteArrayToDeserializationException(this.logger,
                    (byte[]) headers.get(i).get(SerializationUtils.VALUE_DESERIALIZER_EXCEPTION_HEADER));
            if (deserEx != null) {
                this.logger.error(deserEx, "Record at index " + i + " could not be deserialized");
            }
            throw new BatchListenerFailedException("Deserialization", deserEx, i);
        }
        process(thing);
    }
}
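For the BatchListenerFailedException thrown above to result in per-record recovery, the container needs a RecoveringBatchErrorHandler configured on the listener container factory. A hedged sketch, assuming a KafkaTemplate is available for dead-lettering (bean names and the Thing type are illustrative; in newer Spring Kafka versions DefaultErrorHandler supersedes this handler):

```java
// Sketch: wiring a RecoveringBatchErrorHandler so that only the record
// indicated by BatchListenerFailedException is recovered (here: published
// to a dead-letter topic) while the rest of the batch is handled normally.
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Thing> kafkaListenerContainerFactory(
        ConsumerFactory<String, Thing> consumerFactory, KafkaTemplate<String, byte[]> template) {

    ConcurrentKafkaListenerContainerFactory<String, Thing> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    factory.setBatchListener(true);
    // retry the failed record twice with a 1s back-off, then dead-letter it
    factory.setBatchErrorHandler(new RecoveringBatchErrorHandler(
            new DeadLetterPublishingRecoverer(template), new FixedBackOff(1000L, 2)));
    return factory;
}
```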