Bridge crashes with OOM error when the Kafka topic contains large amounts of data with /records api #379
My consumer config -

When I publish large quantities of data (several GB) and hit the /records API to fetch messages, the bridge crashes with a Thread blocked error and finally an out of memory - Java heap space error. It looks like the bridge is trying to fetch too many messages from Kafka in a single request. Is there any way to control the maximum number of messages (or the maximum size) that can be fetched per request?

Error logs - https://paste.ubuntu.com/p/9y6RS9yxBZ/
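For readers less familiar with the bridge, the flow under discussion looks roughly like the sketch below; the host, group, consumer, and topic names are hypothetical, and the endpoint shapes follow the bridge's HTTP API.

```bash
BRIDGE=http://my-bridge:8080   # hypothetical bridge address

# 1. Create a consumer instance inside a consumer group.
curl -X POST "$BRIDGE/consumers/my-group" \
  -H 'Content-Type: application/vnd.kafka.v2+json' \
  -d '{"name": "my-consumer", "format": "json"}'

# 2. Subscribe it to the topic.
curl -X POST "$BRIDGE/consumers/my-group/instances/my-consumer/subscription" \
  -H 'Content-Type: application/vnd.kafka.v2+json' \
  -d '{"topics": ["my-topic"]}'

# 3. Poll records -- the request that triggers the OOM in this report.
curl -X GET "$BRIDGE/consumers/my-group/instances/my-consumer/records" \
  -H 'Accept: application/vnd.kafka.json.v2+json'
```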
Comments

Hi,
I get this error when I add the property. I also tried adding it; in the logs, I see that this property is getting picked up. So, the issue might be related to something else.
So I sent a few messages to Kafka after setting the property at the bridge level.
Yes, the consumer-specific configuration supports just a limited set of properties, so as you already did, the way to go is to set the property at the bridge level in the configuration file.
My memory was wrong... in the bridge you can set only these options: https://strimzi.io/docs/bridge/latest/#_consumer
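To make the two configuration levels concrete, here is a hedged sketch: the consumer-creation request accepts only the small set of options from the doc link above, while everything else goes into the bridge's own properties file (the kafka.consumer. prefix, names, and values below are assumptions for illustration, not taken from this thread).

```bash
# Per-consumer level: only a limited set of options is accepted in the
# creation request (hypothetical host/group/consumer names).
curl -X POST http://my-bridge:8080/consumers/my-group \
  -H 'Content-Type: application/vnd.kafka.v2+json' \
  -d '{
        "name": "my-consumer",
        "format": "json",
        "auto.offset.reset": "earliest",
        "fetch.min.bytes": 1024,
        "enable.auto.commit": false
      }'

# Bridge level: any other consumer property goes into the bridge's
# properties file, conventionally namespaced with "kafka.consumer.".
cat >> application.properties <<'EOF'
kafka.consumer.fetch.max.bytes=5242880
EOF
```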
@HarshithBolar can you provide a reproducer so that we can check?
@ppatierno The use case is to deliver messages from Kafka to an external client. Because we cannot expose our internal Kafka architecture, we're using the Strimzi bridge to expose an endpoint through which they can pull messages. The messages will be streaming at approximately 1000-1500 transactions/second and the size of each message will be around 300-500 KB. The functional tests worked very well, but we're currently facing this issue with performance testing.

The issue I see here is that the bridge is fetching more than what is specified in the configuration. I'm using version 0.14.0. This can be reproduced by setting a small fetch size at the bridge level and publishing a large amount of data to the topic.
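Worth noting for the "can we cap a single request" question: the records endpoint accepts timeout and max_bytes query parameters that bound one poll from the HTTP side. A hedged sketch (hypothetical names; how strictly max_bytes applies to data the consumer has already buffered may depend on the bridge version):

```bash
# Ask for at most ~1 MB per HTTP poll, waiting up to 5 seconds for data.
curl -X GET \
  'http://my-bridge:8080/consumers/my-group/instances/my-consumer/records?timeout=5000&max_bytes=1048576' \
  -H 'Accept: application/vnd.kafka.json.v2+json'
```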
Sorry if I ask for more insights: are you producing through the bridge as well, and in what format are the messages sent?
No, the bridge is being used only for consuming; I'm using a different technology (Flink) for publishing messages to Kafka. The producer is sending around 15-20 JSON messages that add up to 5 MB, and the bridge consumes all of them in a single GET request. The producer is sending them as normal JSON strings, not base64 encoded.
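A side note on the base64 point: with the bridge, whether payloads show up base64-encoded is decided by the embedded data format chosen when the consumer is created, not by how the producer wrote them. A hedged sketch (hypothetical names):

```bash
# A consumer created with the "binary" embedded format returns record values
# as base64-encoded strings inside the JSON response envelope; the "json"
# format returns them as plain JSON.
curl -X POST http://my-bridge:8080/consumers/my-group \
  -H 'Content-Type: application/vnd.kafka.v2+json' \
  -d '{"name": "my-binary-consumer", "format": "binary"}'
```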
@HarshithBolar I see you wrote the property name differently from what the bridge expects.
I changed the line as suggested, but it still has no effect: I ran the same test and the bridge still consumed 5 MB in a single request.
So, thinking about it more, the consumer on the bridge is working as expected given how Kafka works. It seems the batch is 5 MB, and the consumer gets it anyway because that is how fetching works: fetch.max.bytes is not an absolute cap, and a record batch larger than the limit is still returned whole so the consumer can make progress. Can I ask for the producer configuration as well (at the Kafka level)?
The issue happens when I publish around 10 GB of data to the topic and try a fetch through the bridge. The topic has 30 partitions. There is just one bridge running at the moment, and for testing I have created just one consumer on it. The producer has default configuration. If there is any specific producer property you're looking for, or any more information from Kafka, I can get it for you.

After sending 10 GB of data and sending a request to the /records API, I immediately see thread-blocked errors in the logs. Do they mean anything?
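For context on why a single fetch can balloon with 30 partitions: the consumer's size limits are per fetch request and per partition, and neither is absolute, so the bridge can buffer far more than one batch's worth at a time. A hedged sketch of the bridge-level knobs that bound this (prefix and values are illustrative assumptions):

```bash
cat >> application.properties <<'EOF'
# Per-partition cap on one fetch; with 30 partitions the consumer may still
# buffer roughly 30x this amount across the topic.
kafka.consumer.max.partition.fetch.bytes=1048576
# Cap on how many records a single poll hands back to the bridge.
kafka.consumer.max.poll.records=100
EOF
```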
I added a new property and the issue seems to be resolved now.
@HarshithBolar so can I close this one?
@ppatierno Yes, I'll close this.
Hi @ppatierno, reopening this issue as setting the property did not resolve it after all.