Cache const eval queries #48864
Conversation
@bors try (we should run real perf tests first I guess)
⌛ Trying commit 34e40835466881add360ade69d2c1be23ddc2cb5 with merge 1096950c850a1c21d1e8a5b1caf6593da10997bf...
☀️ Test successful - status-travis
ping @simulacrum can you run a perf test?
hmm... while this works and passes tests, it can cause AllocIds for statics to exist without a corresponding Allocation when running with incremental, which causes an ICE.
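For readers outside the thread, here is a minimal, hypothetical Rust sketch of that failure mode (the types are stand-ins, not rustc's real AllocId/Allocation machinery): an allocation id survives in the incremental cache, but nothing re-interns its backing allocation, so the first lookup blows up.

```rust
// Hypothetical stand-ins for rustc's AllocId/Allocation interning, only to
// illustrate the failure mode: an id restored without its backing allocation.
use std::collections::HashMap;

#[derive(Copy, Clone, PartialEq, Eq, Hash, Debug)]
struct AllocId(u64);

#[derive(Debug)]
struct Allocation {
    bytes: Vec<u8>,
}

#[derive(Default)]
struct AllocMap {
    allocations: HashMap<AllocId, Allocation>,
}

impl AllocMap {
    // rustc would report the missing entry as an ICE; here we just panic.
    fn get(&self, id: AllocId) -> &Allocation {
        self.allocations
            .get(&id)
            .unwrap_or_else(|| panic!("no Allocation interned for {:?}", id))
    }
}

fn main() {
    let map = AllocMap::default();
    // An AllocId that came back from the incremental cache, but whose
    // Allocation was never re-interned in this session:
    let dangling = AllocId(42);
    let _ = map.get(dangling); // panics, mirroring the ICE described above
}
```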
Perf started for 1096950c850a1c21d1e8a5b1caf6593da10997bf.
Force-pushed from 34e4083 to 6cdc8cf
Okay, I figured it out. The incremental cache only works for queries that take a single DefId as their key. You can get the query results serialized, but deserializing them again won't work (and there's no compile-time error 😢). So now I implemented it from scratch.
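A hedged sketch of the limitation being described, with made-up types rather than the real query machinery: the on-disk cache can write a result out under its full key, but the generic load path only knows how to look things up by a single DefId, so a query with a richer key never gets a cache hit again.

```rust
// Hypothetical sketch: a disk cache whose lookup side is keyed by a single
// DefId. A query with a richer key can be written out, but there is no way
// to reconstruct that key on the load path, so the value is never found.
use std::collections::HashMap;

#[derive(Copy, Clone, PartialEq, Eq, Hash, Debug)]
struct DefId(u32);

// The real const-eval key carries more than a DefId (instance, param env, ...).
#[derive(Clone, PartialEq, Eq, Hash, Debug)]
struct ConstEvalKey {
    def_id: DefId,
    substs: Vec<String>, // stand-in for the generic substitutions
}

#[derive(Default)]
struct DiskCache {
    // Serialization happily stores the full key...
    stored: HashMap<ConstEvalKey, u64>,
}

impl DiskCache {
    fn store(&mut self, key: ConstEvalKey, value: u64) {
        self.stored.insert(key, value);
    }

    // ...but the generic load path only knows the DefId, so it cannot rebuild
    // the full ConstEvalKey and never hits the stored entry.
    fn load_by_def_id(&self, def_id: DefId) -> Option<u64> {
        let partial_key = ConstEvalKey { def_id, substs: Vec::new() };
        self.stored.get(&partial_key).copied()
    }
}

fn main() {
    let mut cache = DiskCache::default();
    let key = ConstEvalKey { def_id: DefId(7), substs: vec!["u32".into()] };
    cache.store(key, 42);
    assert_eq!(cache.load_by_def_id(DefId(7)), None); // miss despite the write
}
```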
Force-pushed from 4e566ec to e2a3754
This should now improve incremental builds compared to pre-miri, not just undo the regression.
src/librustc/ty/maps/config.rs (outdated)

    #[inline]
    fn cache_on_disk(key: Self::Key) -> bool {
        key.value.instance.def_id().is_local()
    }
Can non-local consts be loaded from metadata? Otherwise we just might want to cache everything.
Not yet (at least not in general; some are), but it is planned.
Let's just return true here then.
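For concreteness, a minimal sketch of what that suggestion amounts to, written against a hypothetical trait rather than the real query-config code:

```rust
// Minimal sketch (hypothetical trait, not rustc's actual query config): the
// suggestion is simply to cache every const-eval result, not just local ones.
trait QueryCacheConfig {
    type Key;

    fn cache_on_disk(key: Self::Key) -> bool;
}

struct ConstEvalQuery;

impl QueryCacheConfig for ConstEvalQuery {
    type Key = u32; // stand-in for the real const-eval key

    #[inline]
    fn cache_on_disk(_key: Self::Key) -> bool {
        true // cache everything, per the review suggestion above
    }
}

fn main() {
    assert!(ConstEvalQuery::cache_on_disk(0));
}
```

The effect is simply that every const-eval result becomes eligible for the on-disk cache, local or not.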
@@ -223,6 +231,45 @@ impl<'sess> OnDiskCache<'sess> {

        encode_query_results::<trans_fn_attrs, _>(tcx, enc, qri)?;
    }

    // encode successful constant evaluations
    let constants_index = {
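A hedged sketch of what this block is getting at (all names are made up; this is not rustc's encoder): only the successful evaluations are written out, and an index records where each one lives in the encoded blob so it can be looked up again on the next incremental build.

```rust
// Hedged sketch of the idea behind the `constants_index` block above: walk the
// const-eval results, encode only the successful ones, and remember where each
// one landed. The types here are stand-ins, not rustc's real encoder.
use std::collections::HashMap;

type Key = u32;   // stand-in for the const-eval query key
type Value = u64; // stand-in for the evaluated constant
type EvalResult = Result<Value, String>; // errors are not worth serializing

fn encode_successful_constants(
    results: &HashMap<Key, EvalResult>,
) -> (Vec<u8>, Vec<(Key, usize)>) {
    let mut encoded = Vec::new();
    let mut constants_index = Vec::new();

    for (&key, result) in results {
        // Only successful evaluations are cached; errors are just recomputed.
        if let Ok(value) = result {
            let pos = encoded.len();
            encoded.extend_from_slice(&value.to_le_bytes());
            constants_index.push((key, pos));
        }
    }

    (encoded, constants_index)
}

fn main() {
    let mut results = HashMap::new();
    results.insert(1, Ok(42));
    results.insert(2, Err("overflow".to_string()));
    let (encoded, index) = encode_successful_constants(&results);
    assert_eq!(index.len(), 1);   // only the successful evaluation is indexed
    assert_eq!(encoded.len(), 8); // one u64 worth of bytes
}
```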
It would be great if we had a comment (here or elsewhere) explaining why constants even need special handling (like you did on IRC).
@bors try
⌛ Trying commit e2a375426584ebaa8359d598a9ea947057a539eb with merge a04ce41981aa3b98cd562106e77e2a7c0761883c...
☀️ Test successful - status-travis
@Mark-Simulacrum, could you do a perf-run please?
We could reuse the regular caching strategy, if either
The reason for this is that serializing/deserializing the error enum is a huge mess... We'd need to serialize things like
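Purely as illustration of why that is painful, and only guessing at the kind of payload meant here (these types are invented, not the actual error enum): the error side tends to carry lifetime-bound, interned compiler data, which has no cheap on-disk representation, so only the Ok results get cached.

```rust
// Purely illustrative: a made-up error enum whose payloads drag in
// lifetime-bound, interned compiler data, so a straightforward on-disk
// encoding is not available.
#[derive(Debug)]
struct Ty<'tcx>(&'tcx str); // stand-in for an interned type, tied to an arena

#[derive(Debug)]
struct Frame<'tcx> {
    span: usize,
    ty: Ty<'tcx>,
}

#[derive(Debug)]
enum EvalError<'tcx> {
    TypeMismatch { expected: Ty<'tcx>, found: Ty<'tcx> },
    StackOverflow { backtrace: Vec<Frame<'tcx>> },
}

fn main() {
    let err = EvalError::TypeMismatch { expected: Ty("u32"), found: Ty("i64") };
    // Ok results are plain data and easy to cache; Err values like this are not.
    println!("{:?}", err);
}
```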
Perf should be queued.
That sounds fine to me, but how is it related to the problems with statics that we talked about yesterday? Is this an additional problem?
I'd change statics to always return
I don't think anyone is reading the value of statics at compile time atm other than const eval itself, so having it
Force-pushed from e2a3754 to 0d88db1
OK, sounds good! @Mark-Simulacrum, I think that perf-run is unfortunately obsolete then.
@bors try
Cache const eval queries

fixes #48846 (I think, still running more perf tests, but tuple-stress stops recomputing any constants)

r? @michaelwoerister
☀️ Test successful - status-travis
Perf queued.
Style servo regressed in clean incremental, but not in patched incremental. Weird. I probably won't get to investigate this before Monday. Everything else gained compared to before miri. Now there are mostly regular compile-time regressions, not incremental regressions.
@oli-obk the style-servo clean-incr regression is known to be spurious; if you check the graph, it fluctuates widely.
OK, I think we should merge this version. @eddyb, would you mind taking a look at the const evaluator changes?
@bors r+
📌 Commit 0d88db1 has been approved by
@bors p=1 Slightly bumping performance improvement PRs.
Cache const eval queries

fixes #48846 (I think, still running more perf tests, but tuple-stress stops recomputing any constants)

r? @michaelwoerister
☀️ Test successful - status-appveyor, status-travis
fixes #48846 (I think, still running more perf tests, but tuple-stress stops recomputing any constants)
r? @michaelwoerister