discovery: add upper limit for waiting on a retry #6649
Conversation
retryTime := time.Second * (0x1 << d.retries)
var retryTime uint
// logAndBackoffForRetry stops exponential backoff after the 8th try, which is around 256 seconds. Afterward, it does a constant backoff.
if d.retries < 9 {
use floor(log2(maxWaitInSeconds)) instead?
yep, that's better
Due to the type conversion, the code looks a bit ugly:
if d.retries < uint(math.Floor(math.Log2(float64(maxWaitInSeconds)))) {
retryTime = 0x1 << d.retries
} else {
retryTime = maxWaitInSeconds
}
I can change it to
d.retries++
var retryTimeInSecond time.Duration
// logAndBackoffForRetry stops exponential backoff once the retries exceed maxExpoentialRetries and uses a constant backoff afterward.
if d.retries < maxExpoentialRetries {
retryTimeInSecond = time.Duration(0x1<<d.retries) * time.Second
} else {
retryTimeInSecond = time.Duration(0x1<<maxExpoentialRetries) * time.Second
}
plog.Infof("%s: error connecting to %s, retrying in %s", step, d.url, retryTimeInSecond)
d.clock.Sleep(retryTimeInSecond)
ok. sure.
Force-pushed 2c35ab6 to 875f0a9
PTAL cc @xiang90
lgtm.
I prefer something closer to https://github.com/coreos/ignition/blob/5dd13c9c174b40dfefb09d777d2bd890e8ba6fa7/internal/resource/http.go#L31. That way you can actually keep track of the number of attempts. It's probably okay though.
@crawford My code does not modify the retries counter, so I can still track the number of failed attempts.
Okay. This is no longer an exponential backoff.
@crawford Yeah, the exponential backoff stops at the 8th retry attempt; every retry afterward will be a constant backoff of about 2^8 = 256 seconds. Is this a reasonable backoff strategy?
It's linear, not exponential, though. Unless I'm missing something.
if retries > maxExpoentialRetries {
retries = maxExpoentialRetries
}
retryTimeInSecond := time.Duration(retries) * time.Second
this should still be (0x1 << retries)
good catch
For the first N retries, the wait time grows exponentially. After that, the wait time should be fixed. The upper bound is 1 << N seconds.
Adding an upper limit ensures that exponential backoff doesn't exceed 5 minutes on a retry. FIX etcd-io#6648
Force-pushed 875f0a9 to eb9a012
LGTM
CI fails due to #6535, so the failure is not relevant to this PR.
Adding an upper limit ensures that exponential backoff doesn't exceed 5 minutes on a retry.
FIX #6648