
Attempt to remove busywait in ChannelTx #407

Closed
Wants to merge 1 commit

Conversation

@EpicEric (Contributor) commented Dec 4, 2024

Some progress on #401. This passes the poll_mk_msg waker along with the window size in order to avoid busy-waiting for changes. This is currently untested, hence the draft PR.

@EpicEric (Contributor, Author) commented Dec 4, 2024

@PonyPC Could you verify if this PR works for you?

```rust
    .min(buf.len() as u32) as usize;
if writable == 0 {
    // TODO fix this busywait
    cx.waker().wake_by_ref();
if buf.is_empty() || self.sender.is_closed() {
```
@EpicEric (Contributor, Author) commented on this diff:
I am not sure if buf.is_empty() should lead to the channel being closed, but maybe the condition above should only be reached when the window_size is zero?
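The snippet above wakes its own task immediately with `cx.waker().wake_by_ref()`, which is exactly what produces the busy-wait: the task is re-polled at once even though the window size has not changed. A minimal std-only sketch of the alternative the PR describes, parking the waker and waking it only when a window adjust arrives. All names here (`WindowState`, `Shared`, `try_reserve`, `grow`, `Flag`) are invented for illustration and are not russh APIs:

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::{Arc, Mutex};
use std::task::{Wake, Waker};

/// Shared flow-control state: the remaining window plus the waker of a
/// task that is currently blocked on the window being zero.
struct WindowState {
    window_size: u32,
    waker: Option<Waker>,
}

struct Shared(Mutex<WindowState>);

impl Shared {
    /// Called from the sender's poll: if no window is available, register
    /// the waker and report "nothing granted" so the caller can return
    /// Poll::Pending instead of spinning.
    fn try_reserve(&self, wanted: u32, waker: &Waker) -> Option<u32> {
        let mut st = self.0.lock().unwrap();
        if st.window_size == 0 {
            st.waker = Some(waker.clone());
            return None;
        }
        let granted = wanted.min(st.window_size);
        st.window_size -= granted;
        Some(granted)
    }

    /// Called when the peer sends a window adjust: grow the window and
    /// wake the parked sender, if any.
    fn grow(&self, delta: u32) {
        let mut st = self.0.lock().unwrap();
        st.window_size += delta;
        if let Some(w) = st.waker.take() {
            w.wake();
        }
    }
}

/// Toy waker that records whether it was woken (std::task::Wake).
struct Flag(AtomicBool);

impl Wake for Flag {
    fn wake(self: Arc<Self>) {
        self.0.store(true, Ordering::SeqCst);
    }
}

fn main() {
    let shared = Shared(Mutex::new(WindowState { window_size: 0, waker: None }));
    let flag = Arc::new(Flag(AtomicBool::new(false)));
    let waker = Waker::from(flag.clone());

    // Window is empty: the sender parks instead of busy-waiting.
    assert!(shared.try_reserve(8, &waker).is_none());
    assert!(!flag.0.load(Ordering::SeqCst));

    // A window adjust arrives: the parked sender is woken exactly once.
    shared.grow(16);
    assert!(flag.0.load(Ordering::SeqCst));
    assert_eq!(shared.try_reserve(8, &waker), Some(8));
    println!("ok");
}
```

The key difference from the busy-wait is that the task is only re-polled after `grow` runs, rather than on every executor tick.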

@EpicEric (Contributor, Author) commented Dec 4, 2024

Mentioning PR #197 here, as it seems to take a more nuanced approach toward an optimal solution.

@EpicEric EpicEric closed this Dec 4, 2024
@Eugeny (Owner) commented Dec 4, 2024

@EpicEric I've made a very rough attempt at this in #408 (not tested yet). I'm not very good at low-level async and could use a review.
It uses an mpsc::watch as a notification channel for window-size changes, waking up all senders at once when the window size changes. This wastes a poll if the window size is not large enough for all senders, but that should be fine.
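#408 itself uses tokio's watch channel; as a hedged illustration of the same broadcast-on-change idea, here is a synchronous std-only analogue built on `Mutex` + `Condvar`. The `Window` type and its `reserve`/`grow` methods are invented for this sketch and are not code from #408:

```rust
use std::sync::{Arc, Condvar, Mutex};
use std::thread;

/// Window size guarded by a mutex, with a condvar broadcast on change.
struct Window {
    size: Mutex<u32>,
    changed: Condvar,
}

impl Window {
    /// Block (without spinning) until at least one byte of window is
    /// available, then consume up to `wanted` bytes.
    fn reserve(&self, wanted: u32) -> u32 {
        let mut size = self.size.lock().unwrap();
        while *size == 0 {
            // Sleep until a window adjust; no busy-wait.
            size = self.changed.wait(size).unwrap();
        }
        let granted = wanted.min(*size);
        *size -= granted;
        granted
    }

    /// Window adjust from the peer: wake *all* waiting senders at once.
    fn grow(&self, delta: u32) {
        *self.size.lock().unwrap() += delta;
        self.changed.notify_all();
    }
}

fn main() {
    let win = Arc::new(Window { size: Mutex::new(0), changed: Condvar::new() });

    // Two senders block on an empty window.
    let handles: Vec<_> = (0..2)
        .map(|_| {
            let w = Arc::clone(&win);
            thread::spawn(move || w.reserve(8))
        })
        .collect();

    // One window adjust wakes both; together they consume all 16 bytes.
    win.grow(16);
    let total: u32 = handles.into_iter().map(|h| h.join().unwrap()).sum();
    assert_eq!(total, 16);
    println!("total granted: {total}");
}
```

Note the deliberate "wasted poll" Eugeny mentions: `notify_all` wakes every blocked sender, and a sender that still finds the window empty simply loops back into `wait`.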
