Add support to add headers to event externalization #855
This is a great idea. I'm playing with a prototype that exposes

```java
EventExternalizationConfiguration.defaults(…)
    .headers(MyType.class, it -> Map.of(…)) // map from event type to headers
    .build();
```

for the implementations of …
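For a sense of how that prototype would be used, here is a minimal sketch of wiring it up as a configuration bean. The `MyType` event and the `tenant-id` header key are invented for illustration, and the argument to `defaults(…)` is elided above, so the package string below is only an assumption:

```java
import java.util.Map;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.modulith.events.EventExternalizationConfiguration;

@Configuration
class ExternalizationConfiguration {

    // Hypothetical event type used only for this example.
    record MyType(String tenantId) {}

    @Bean
    EventExternalizationConfiguration eventExternalizationConfiguration() {
        return EventExternalizationConfiguration.defaults("com.example.orders") // assumed argument
                // The function maps each event instance to the headers to attach on externalization.
                .headers(MyType.class, it -> Map.of("tenant-id", it.tenantId()))
                .build();
    }
}
```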
I just had some time to try it out and the code will throw an exception, as …
I am apparently too stupid to read Javadoc. 🤦🏼♂️ I've tweaked the setup and added a basic test case. Please give it another try. Make sure you're using the latest snapshot (…).
I got it to work; sadly, this fix requires the user to use Spring …

Also, the header and the mapping function are linked now. If in the mapping you have …
I'm not sure I follow.
That's not the intention. The intention is to remove the need for a custom bean declaration solely for the purpose of adding message headers, by exposing …

As you're dealing with a custom event in the first place (which I wouldn't necessarily consider a good idea to start with, as this means that the event producer has to be aware of the mechanics at play here), maybe we should support

```java
EventExternalizationConfiguration.defaults(…)
    .mapping(KafkaEvent.class, it -> MessageBuilder.withPayload(it.getData())
        .copyHeaders(it.customKafkaHeaders())
        .build())
    .…
```

We could discover that the result of the mapping step is a …
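To make that proposal concrete, here is a hypothetical shape for such a self-describing event. The `KafkaEvent` name and the `getData()` / `customKafkaHeaders()` accessors come from the sketch above; everything else is assumed for illustration:

```java
import java.util.Map;

// Hypothetical custom event that carries its own Kafka headers alongside the payload.
// Whether coupling a domain event to messaging concerns like this is desirable is
// exactly the trade-off discussed above.
public record KafkaEvent(Object data, Map<String, Object> headers) {

    public Object getData() {
        return data;
    }

    public Map<String, Object> customKafkaHeaders() {
        return headers;
    }
}
```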
I will have a play with how …
Normally I am able to use something like this in my configuration: …

and modify the serializer with special ones, for example the one from Confluent that can use the Kafka schema registry: …

I guess it is all possible using `send(Message message)` too, but I am not used to it and need to check what extra configuration is needed.
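The configuration snippets above did not survive the export, but the kind of setup being described usually looks roughly like the following. The bootstrap servers, the schema registry URL, and the choice of Confluent's Avro serializer are assumptions made for illustration, not details taken from the issue:

```java
import java.util.Map;

import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

@Configuration
class KafkaProducerConfiguration {

    @Bean
    ProducerFactory<String, Object> producerFactory() {
        return new DefaultKafkaProducerFactory<>(Map.of(
                "bootstrap.servers", "localhost:9092",            // assumed
                "key.serializer", StringSerializer.class,
                "value.serializer", KafkaAvroSerializer.class,    // Confluent, schema-registry aware
                "schema.registry.url", "http://localhost:8081")); // assumed
    }

    @Bean
    KafkaTemplate<String, Object> kafkaTemplate(ProducerFactory<String, Object> producerFactory) {
        return new KafkaTemplate<>(producerFactory);
    }
}
```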
We enable Jackson by default as it seems to be a reasonable intermediate serialization format. That said, we should probably prefer the …

However, if you set …
From my simple test, it works without extra beans/configuration. Mapping the event to a …

I also tried to make something work for my own use case, and that is a bit more complicated, but I know this is an edge case. In my use case I need to send Protobuf messages that use a schema registry to serialise and deserialise. I was able to make this work with a custom MessageConverter … Currently all MessageConverters require the object to be in JSON format.

To complicate things even more, I am also required to encrypt some parts of the data or the entire event, since the event can contain private information that cannot be persisted in the …
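As an aside on the encryption requirement: one way to keep sensitive fields out of the externalized payload is to encrypt them before the event reaches the externalizer, for instance in the mapping step. A minimal AES-GCM helper as a sketch only; key management, field selection, and the exact hook-in point are left open and specific to the use case:

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Base64;

import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

// Sketch: encrypts a single field value with AES-GCM and returns IV + ciphertext, Base64-encoded.
// How the SecretKey is provisioned (KMS, vault, …) is deliberately out of scope here.
final class FieldEncryptor {

    private static final SecureRandom RANDOM = new SecureRandom();

    static String encrypt(String plaintext, SecretKey key) throws Exception {

        byte[] iv = new byte[12];
        RANDOM.nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(plaintext.getBytes(StandardCharsets.UTF_8));

        // Prepend the IV so the consumer can decrypt with the same key.
        byte[] combined = new byte[iv.length + ciphertext.length];
        System.arraycopy(iv, 0, combined, 0, iv.length);
        System.arraycopy(ciphertext, 0, combined, iv.length, ciphertext.length);
        return Base64.getEncoder().encodeToString(combined);
    }
}
```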
I guess we will have to untangle things a bit here. It looks like, with all the particulars of your specific scenario out of the picture, the new API allows defining headers without having to re-declare the entire bean. I'll go ahead and merge the feature then and close this ticket as resolved. Feel free to open further ones in case you think we can improve things to better cater to your special case's needs.
EventExternalizationConfiguration now exposes `….headers(Class<T>, Function<T, Map<String, Object>>)` to allow defining a function that extracts headers from the event that are supposed to be added to the message to be sent out. The Kafka and AMQP implementations have been augmented to consider those configurations. Furthermore, if the mapping step prior to the externalization creates a Spring `Message<?>`, we add routing information as a fallback and send it out as is.
In case no ObjectMapper bean instance is present, we now fall back to creating a default one to render a JSON byte array. This is useful in case Jackson is on the classpath but not necessarily Jackson2ObjectMapperBuilder, which is located in spring-web.
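If you need control over that intermediate JSON rendering, declaring your own ObjectMapper bean should be enough for it to be used instead of the fallback. A minimal sketch; the JavaTimeModule registration is just an example of a typical customization:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;

@Configuration
class JacksonConfiguration {

    @Bean
    ObjectMapper objectMapper() {
        // Example customization: register Java time support so event timestamps
        // serialize cleanly instead of failing on java.time types.
        return new ObjectMapper().registerModule(new JavaTimeModule());
    }
}
```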
Currently the method used to send Kafka messages is

```java
CompletableFuture<SendResult<K, V>> send(String topic, K key, V data);
```

This doesn't allow sending extra custom headers.

The only way to make this possible at the moment is to provide a custom KafkaEventExternalizerConfiguration bean, something like this: …

I had to create the headers in this bean so as not to have to deal with a custom serialiser for the Kafka Header object.
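The custom bean itself did not make it into this export, but the core of such a workaround is sending a fully built ProducerRecord, which can carry headers, instead of using the (topic, key, data) overload. A minimal sketch of that sending side; the topic, key, and header name are placeholders, and the wiring into Spring Modulith's externalizer configuration is omitted:

```java
import java.nio.charset.StandardCharsets;

import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.header.internals.RecordHeader;
import org.springframework.kafka.core.KafkaOperations;

class HeaderAwareSender {

    private final KafkaOperations<String, Object> kafka;

    HeaderAwareSender(KafkaOperations<String, Object> kafka) {
        this.kafka = kafka;
    }

    void send(Object event) {

        // Build the record explicitly so custom headers can be attached,
        // something the send(topic, key, data) overload doesn't support.
        ProducerRecord<String, Object> record =
                new ProducerRecord<>("example-topic", null, "example-key", event); // placeholders
        record.headers().add(new RecordHeader("x-event-type",
                event.getClass().getName().getBytes(StandardCharsets.UTF_8)));

        kafka.send(record);
    }
}
```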