
kurtosis service logs --follow creates a busy loop in closeChannelWhenEmpty #2593

Closed
praetoriansentry opened this issue Nov 7, 2024 · 1 comment
Labels
bug (Something isn't working) · cli (For bugs relating to the CLI) · papercut

Comments

@praetoriansentry (Collaborator)

What's your CLI version?

1.4.1

Description & steps to reproduce

Spin up something:

kurtosis run --enclave my-testnet github.com/ethpandaops/[email protected]

Follow the logs a few times:

kurtosis service logs --follow my-testnet el-1-geth-lighthouse &
kurtosis service logs --follow my-testnet el-1-geth-lighthouse &
kurtosis service logs --follow my-testnet el-1-geth-lighthouse &
kurtosis service logs --follow my-testnet el-1-geth-lighthouse &
kurtosis service logs --follow my-testnet el-1-geth-lighthouse &
kurtosis service logs --follow my-testnet el-1-geth-lighthouse &
kurtosis service logs --follow my-testnet el-1-geth-lighthouse &
kurtosis service logs --follow my-testnet el-1-geth-lighthouse &
kurtosis service logs --follow my-testnet el-1-geth-lighthouse &

At this point everything is fine. But if you kill those processes, each of those calls keeps burning 100% CPU on the engine side:

ps aux | grep 'kurtosis service logs' | grep -v grep | awk '{print $2}' | xargs kill

At this point the kurtosis-engine process will be using roughly 1000% CPU.

Desired behavior

I guess this loop should either have a delay or some kind of upper bound on the amount of time that it will wait.
https://github.com/kurtosis-tech/kurtosis/blob/v1.4.1/engine/server/engine/centralized_logs/client_implementations/persistent_volume/persistent_volume_logs_database_client.go#L178-L185

Since it's looping forever, I would guess the check on len(logsChan) isn't doing what we expect.
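
For context, the pattern in question looks roughly like the sketch below. This is illustrative Go, not the actual kurtosis source; the function and channel names just mirror the ones in the linked file. If the consumer goes away while buffered log lines remain, len(logsChan) never reaches zero and the loop spins with no delay:

```go
package main

import "time"

// closeChannelWhenEmpty sketches the problematic pattern: it spins on
// len() with no sleep or backoff, intending to close the channel once
// the consumer has drained it. If the consumer stops reading while
// items remain buffered, this goroutine burns a full core forever.
func closeChannelWhenEmpty(logsChan chan string) {
	for {
		if len(logsChan) == 0 {
			close(logsChan)
			return
		}
		// no delay here: a pure busy loop
	}
}

func main() {
	logsChan := make(chan string, 2)
	logsChan <- "line 1"
	logsChan <- "line 2" // buffered items that no one will ever read

	go closeChannelWhenEmpty(logsChan)

	// The consumer was cancelled (here: simply absent), so len(logsChan)
	// stays at 2 and the goroutine above spins at 100% CPU.
	time.Sleep(2 * time.Second)
}
```

Running this pins one core for the duration of the sleep, which matches the roughly 100% CPU per abandoned stream observed above.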

What is the severity of this bug?

Papercut; this bug is frustrating, but I have a workaround.

What area of the product does this pertain to?

CLI: the Command Line Interface

@praetoriansentry added the bug (Something isn't working) label Nov 7, 2024
@github-actions bot added the cli (For bugs relating to the CLI) and papercut labels Nov 7, 2024
@tedim52 (Contributor) commented Nov 7, 2024

Looking into this now

github-merge-queue bot pushed a commit that referenced this issue Nov 10, 2024
## Description
Addresses #2593 by getting rid of the busy loop entirely. The loop was checking to make sure all values from the channel were read before closing the channel. When following logs, if a stream is cancelled before reading all logs in the channel, it loops infinitely. This changes the consumer to loop until it has read everything from the channel, even after it has been closed.
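
In Go terms, the fix described above amounts to something like this sketch (illustrative names, not the merged diff): the producer closes the channel as soon as it is done sending, and the consumer drains it with a range loop, which keeps yielding buffered values even after close and then terminates cleanly:

```go
package main

import "fmt"

func main() {
	logsChan := make(chan string, 8)

	// Producer: send everything, then close. There is no busy-wait for
	// the consumer to catch up; closing is safe while values are still
	// buffered in the channel.
	go func() {
		defer close(logsChan)
		for i := 1; i <= 3; i++ {
			logsChan <- fmt.Sprintf("log line %d", i)
		}
	}()

	// Consumer: range keeps reading until the channel is both closed
	// and empty, so buffered values are drained even after close.
	for line := range logsChan {
		fmt.Println(line)
	}
}
```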

Also adds a port for pprof to help with profiling in the future.
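
For reference, exposing a pprof port in a Go service is usually just a side-effect import of net/http/pprof plus an HTTP listener. This is a generic sketch; the address is a placeholder, not necessarily what the PR wires up:

```go
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* handlers on the default mux
)

func main() {
	// Placeholder address; the actual engine chooses its own port.
	go func() {
		log.Println(http.ListenAndServe("localhost:6060", nil))
	}()

	select {} // stand-in for the real server's work
}
```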

## Is this change user facing?
YES 

## References 
#2593
@tedim52 tedim52 closed this as completed Nov 12, 2024