[Galactic backport] Do not attempt to use void allocators for memory allocation #2001

Conversation

@sykwer sykwer commented Aug 24, 2022

We need to fix the type of the allocator passed to a publisher and a subscriber; otherwise, extra memory is allocated in various parts of rcl.
For instance, a memory area of sizeof(rcutils_string_map_impl_t) * sizeof(MessageT) bytes is allocated here.

This bug not only causes extra memory consumption, but also triggers std::bad_alloc in any publisher (or subscriber) handling a huge message type (larger than a few hundred MB), such as one defined with a large static-size array.
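To make the failure mode concrete, here is a minimal sketch (not rclcpp code; HugeMessage and the sizes are made up for illustration) of why an allocator typed on the message must not stand in for a byte allocator:

```cpp
// Minimal sketch (not the rclcpp code): why an allocator typed on the
// message cannot be used where byte-sized allocations are expected.
// "HugeMessage" is a hypothetical stand-in for a large fixed-size message.
#include <memory>

struct HugeMessage {
  unsigned char payload[200 * 1024 * 1024];  // ~200 MB static-size array
};

int main() {
  // If generic code counts in bytes but calls a MessageT-typed allocator,
  // every request is silently multiplied by sizeof(HugeMessage):
  // std::allocator<HugeMessage>{}.allocate(64);  // asks for 64 * ~200 MB -> std::bad_alloc

  // Rebinding the allocator to a byte-sized value type keeps the request
  // equal to the byte count the caller actually meant.
  using ByteAlloc =
      std::allocator_traits<std::allocator<HugeMessage>>::rebind_alloc<char>;
  ByteAlloc byte_alloc;
  char *bytes = byte_alloc.allocate(64);  // 64 bytes
  byte_alloc.deallocate(bytes, 64);
  return 0;
}
```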

backport #1657

Signed-off-by: Takahiro Ishikawa [email protected]

Keep a rebound allocator for byte-sized memory blocks around
for publisher and subscription options.

Follow-up after 1fc2d58

Signed-off-by: Michel Hidalgo <[email protected]>
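The backported fix, as described by the commit message above, keeps a char-rebound allocator alive alongside the publisher and subscription options, so the byte-level allocations never go through a MessageT-sized allocator. A rough sketch of that idea, assuming a hypothetical Options and CAllocator rather than the actual rclcpp/rcl types:

```cpp
// Rough sketch of "keep a rebound allocator for byte-sized memory blocks
// around": the rebound char allocator is a member of the options object, so
// it outlives the C-style callbacks that allocate through it.
// "Options" and "CAllocator" are hypothetical, not the real rclcpp/rcl API.
#include <cstddef>
#include <memory>

struct CAllocator {                        // stand-in for a C allocator struct
  void *(*allocate)(std::size_t, void *);  // size in bytes, opaque state
  void (*deallocate)(void *, std::size_t, void *);
  void *state;
};

template <typename Alloc>  // Alloc is typically allocator<MessageT>
struct Options {
  using ByteAlloc =
      typename std::allocator_traits<Alloc>::template rebind_alloc<char>;

  explicit Options(const Alloc &alloc) : byte_alloc(alloc) {}

  // The callbacks receive the stored byte allocator as their state, so byte
  // counts pass through unchanged instead of being scaled by sizeof(MessageT).
  CAllocator to_c_allocator() {
    return CAllocator{
        [](std::size_t size, void *state) -> void * {
          return static_cast<ByteAlloc *>(state)->allocate(size);
        },
        [](void *ptr, std::size_t size, void *state) {
          static_cast<ByteAlloc *>(state)->deallocate(
              static_cast<char *>(ptr), size);
        },
        &byte_alloc};
  }

  ByteAlloc byte_alloc;  // kept alive for as long as the options exist
};
```

The design point is lifetime: because the byte allocator is owned by the options, the state pointer handed to the C-level callbacks stays valid for as long as the publisher or subscription uses them.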
@sykwer sykwer closed this Aug 24, 2022
@sykwer sykwer deleted the sykwer/galactic_backport_fix_memory_allocation branch August 24, 2022 13:54
@sykwer sykwer restored the sykwer/galactic_backport_fix_memory_allocation branch August 24, 2022 13:55
@sykwer sykwer reopened this Aug 24, 2022
@sykwer sykwer changed the title [Galactic backport] Do not attempt to use void allocators for memory allocation #1657 [Galactic backport] Do not attempt to use void allocators for memory allocation Aug 24, 2022
@fujitatomoya (Collaborator)

@sykwer thanks for bringing this up. Instead of having a dedicated backport PR, can #1657 be backported to Foxy / Galactic straightforwardly? Let me try.

@clalancette (Contributor)

Given that Galactic is now EOL, closing this.
