Downloading can panic with "supplied instant is later than self" #8042
It's not clear from the link which job failed (it looks like they ran again), so I'm not sure which platform this was on. I assume it was one of the Linux builders. I assume this was triggered by this line, though I don't see how. The only other thing I can think of is that CLOCK_MONOTONIC was not monotonic. I know in the (distant) past, VMs tended to be buggy where clocks were concerned, but I don't know what the current state of modern VMs is. I assume GitHub Actions uses Azure's custom Hyper-V? Googling doesn't turn up any recent reports of issues, so I would assume that is less likely.
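For context, "supplied instant is later than self" is the message from std's `Instant::duration_since` (and the subtraction built on it) when the "earlier" instant turns out to be later, which is exactly what a backwards-stepping monotonic clock would produce. A minimal sketch of the fallible call and its infallible alternatives (assuming an older std where the subtraction still panics; newer versions saturate instead):

```rust
use std::time::{Duration, Instant};

fn main() {
    // Simulate a clock glitch: pretend we recorded a timestamp "in the future".
    let later = Instant::now() + Duration::from_secs(1);
    let now = Instant::now();

    // On older std this panics with "supplied instant is later than self"
    // because `later` is after `now`; newer std saturates to zero instead.
    // let _elapsed = now.duration_since(later);

    // Infallible alternatives that never panic:
    assert_eq!(now.checked_duration_since(later), None);
    assert_eq!(now.saturating_duration_since(later), Duration::from_secs(0));
}
```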
Bah, sorry, I forgot that perma-linking GitHub Actions builds doesn't actually perma-link. If I see this again I'll try to capture more context, but tbh the only other relevant context at the time was probably the Cargo/rustc version, which probably wasn't going to help all that much. I agree that the likely-ish culprit is a non-monotonic monotonic clock (oxymoron much?). I believe GH Actions does use Azure, yeah. I've gone ahead and tried to fix that line at least in #8114, and we can try to collect more info in the future if it shows up again.
Try to avoid panics on buggy (?) clocks
Try to avoid panics with `Instant` by only performing infallible operations. This tweaks the comparison flagged in #8042 to use `Instant` comparisons rather than `Duration` comparisons, which should hopefully eliminate a source of panics in the face of buggy (maybe?) clocks. I'm not sure whether this actually fixes the original issue, but seeing that we have a pretty low chance of the issue recurring, it's probably fine to go ahead and say... Closes #8042
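As a rough sketch of the shape of that change (the names and the timeout logic here are illustrative, not Cargo's actual code): a `Duration`-based check has to subtract `Instant`s, which could panic on a misbehaving clock (at least on older std), whereas comparing two `Instant`s directly is an ordinary, infallible comparison.

```rust
use std::time::{Duration, Instant};

// Duration-based check: subtracting `Instant`s (or calling `duration_since`)
// could panic on older std if the monotonic clock misbehaves and `start`
// reads as later than the current time.
fn timed_out_fallible(start: Instant, timeout: Duration) -> bool {
    Instant::now() - start >= timeout
}

// Instant-based check: compute the deadline up front and compare `Instant`s
// directly, which never panics.
fn timed_out_infallible(deadline: Instant) -> bool {
    Instant::now() >= deadline
}

fn main() {
    let start = Instant::now();
    let timeout = Duration::from_millis(10);
    let deadline = start + timeout;

    std::thread::sleep(timeout);
    assert!(timed_out_fallible(start, timeout));
    assert!(timed_out_infallible(deadline));
}
```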
Linux 5.7.19-2-MANJARO
@motecshine that's a backtrace from rustup, not Cargo. (and this is the Cargo repository, not rustup)
I just saw this happen on CI, where during
cargo fetch --locked
I saw the panic "supplied instant is later than self". That's very bad!
Unfortunately this is likely going to be notoriously hard to reproduce and fix, but I wanted to open an issue at least as a TODO item to try to audit all time-related tracking in the downloading code.