No performance improvement on rebuild with new cache #387
Comments
Since it's entirely possible this is intended behavior, is there a way to know when HardSource is done writing its initial cache, so we can automatically restart our webpack process and not require user intervention to enable the beautiful rebuild speeds we could be seeing with HardSource?
@spalger Thank you for this info. I haven't noticed this difference myself, or at least not such a large difference. The first thing I would try is dumping the memory cache webpack builds during the first run. During rebuilds webpack uses the modules it stores in there instead of the modules being returned by the module factory. In the rebuild, those returned modules come from hard-source's cache.
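The suggestion above — dumping webpack's in-memory module cache after the first run — could be sketched as a tiny standalone plugin. This is a minimal sketch, assuming webpack 4's `compiler.hooks.done` API; the plugin name is hypothetical and not part of hard-source:

```javascript
// Hypothetical sketch: after the first build finishes, null out webpack's
// in-memory module cache so the next rebuild asks the module factory for
// every module, letting hard-source supply its cached versions instead.
class DumpMemoryCachePlugin {
  apply(compiler) {
    let flushed = false;
    compiler.hooks.done.tap('DumpMemoryCachePlugin', (stats) => {
      // only flush once, after the initial compilation
      if (flushed) return;
      flushed = true;
      const { compilation } = stats;
      for (const key of Object.keys(compilation.cache)) {
        compilation.cache[key] = null;
      }
    });
  }
}

module.exports = DumpMemoryCachePlugin;
```

Registered as `plugins: [new DumpMemoryCachePlugin()]` in a webpack config, this would trade away the very first rebuild's speed, since every module is re-created once.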
Alright, I took a shot at implementing your suggestion and you're right: after the second compilation, rebuilds are as fast as they are when starting with a populated hard-source cache. To implement this I used the following, which I arrived at by inspecting the hard-source code and logging the modules that were being removed from the cache repeatedly (meaning they aren't being restored by hard-source and get recreated on each compilation):

```js
this.compiler.hooks.done.tap({
  name: 'kibana-flushNonHardSourceCache',
  fn: (stats) => {
    // after a compilation runs for the first time we look through the
    // compilation cache and remove items that are cached by the
    // hard-source-webpack-plugin, leading them to be recreated on the
    // next compilation and behave much faster on the third compilation
    const { compilation } = stats;
    for (const [key, module] of Object.entries(compilation.cache)) {
      if (!module) {
        continue;
      }

      if (module.cacheItem) {
        // item was restored by hard-source-webpack-plugin
        // so don't flush it from the cache
        continue;
      }

      // try to identify if this module is eligible for hard-source caching
      const probablyCachedByHardSource = (
        // only things webpack considers cacheable
        (module.buildInfo ? module.buildInfo.cacheable : module.cacheable)

        // ignored modules are cacheable, but hard-source ignores them too
        && !key.startsWith('mignored ')

        // modules from mini-css-extract are ignored by hard-source because
        // of how they are integrated with webpack
        && !(
          key.startsWith('mcss ')
          || key.startsWith('mini-css-extract-plugin.')
          || key.match(/mini-css-extract-plugin[\\/]dist[\\/]loader\.js/)
        )
      );

      if (probablyCachedByHardSource) {
        compilation.cache[key] = null;
      }
    }
  }
});
```

I wasn't able to easily identify if a compilation was a rebuild, or if there was a hard-source cache to rely on, so instead I opted for a general cache filter that looks for modules which will probably be replaced with hard-source-backed versions if they are removed from the cache. I'm concerned about my check to define `probablyCachedByHardSource`, though.

Would you be interested in such a feature?
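For anyone wanting to try this outside Kibana's build tooling, the hook above could be packaged as a standalone plugin and dropped into a webpack config's `plugins` array. This is a sketch under the same assumptions (webpack 4, hard-source marking restored modules with `cacheItem`); the plugin name is hypothetical:

```javascript
// Hypothetical wrapper around the done-hook from the comment above.
// Flushes memory-cache entries that hard-source would likely cache,
// so the next compilation rebuilds them through hard-source.
class FlushNonHardSourceCachePlugin {
  apply(compiler) {
    compiler.hooks.done.tap('FlushNonHardSourceCachePlugin', (stats) => {
      const { compilation } = stats;
      for (const [key, module] of Object.entries(compilation.cache)) {
        // skip empty slots and modules already restored by hard-source
        if (!module || module.cacheItem) continue;

        // only flush modules webpack considers cacheable
        const cacheable = module.buildInfo
          ? module.buildInfo.cacheable
          : module.cacheable;
        if (!cacheable) continue;

        // skip keys hard-source itself ignores
        if (
          key.startsWith('mignored ')
          || key.startsWith('mcss ')
          || key.startsWith('mini-css-extract-plugin.')
          || /mini-css-extract-plugin[\\/]dist[\\/]loader\.js/.test(key)
        ) continue;

        compilation.cache[key] = null;
      }
    });
  }
}

module.exports = FlushNonHardSourceCachePlugin;
```

Usage would look like `plugins: [new HardSourceWebpackPlugin(), new FlushNonHardSourceCachePlugin()]`.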
Yeah, I'd like to add this. A few things to change:
Okay, great, I was filtering by …
Expected Behavior
I expected that, after the initial build populated a brand-new cache, rebuilds (using the watch compiler) would see a performance improvement.
Actual Behavior
It seems that the initial build must itself be served from cache in order to see a performance benefit, so when webpack starts without a usable HardSource cache I must restart the webpack compiler process before I start to see rebuild improvements.
Is an error being thrown?
No
Steps to Reproduce
I'm not sure how to set up a reproduction large enough to demonstrate this behavior, but if you'd like to try running the PR I'm working on to integrate HardSource into our project, I can share steps for that: elastic/kibana#20105
Operating System, Node, and NPM dependency versions
See dependency versions here: https://github.com/spalger/kibana/blob/ab55f8ae15278bee1d700b901a4f38bf06cd14b2/package.json