[WORKAROUND] Build freezes at 91% at AWS EC2 #3374
Comments
I have the same issue. Thank you for providing at least a partial solution!
@jsve disabling the optimization means having an admin panel that is bigger in file size and slower to load and run on the client. We are going to work towards making the admin lighter to build, but for now you will need more RAM. I would say at least 2GB, and 4GB is the minimum I would recommend for building and running a Node.js app in general.
@jsve By the way, the bugs you are listing are all about WSL (Windows), which isn't the same thing as your issue, which is only due to a lack of RAM. I'll close this issue as there isn't much to do about it for now. You can build your admin in a CI (with enough resources) and deploy it to your server to avoid overloading your instance.
@alexandrebodin - While I agree there is not much to do about it for now, I don't think this issue should be closed, as it's still a current issue. If someone were to search for an issue related to this, it would not pop up unless they search closed issues (which I myself would assume are resolved issues). Perhaps we could add a label of 'Workaround Provided'?
@Naismith I edited the title so people can read through to find the workaround. We are hoping to significantly improve the build performance in the coming months!
This is not correct. Like I said, this is an issue on AWS EC2 Ubuntu instances and has nothing to do with WSL. It also doesn't matter how much I scale the EC2 instance; it still hits the same bug. The bug seems to appear most often on WSL for other projects, but the fixes should address all common platforms. Disabling parallel minification and updating packages seems to be the fix in most other projects; a sketch of what that looks like is below. If there is consensus that this is a bug, it is still very much present, so it seems wrong to close the issue. Limiting official support to CI services or certain machines would also "fix" this, but that is probably not a popular direction to take with this project.
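For reference, here is a minimal sketch of disabling parallel minification with terser-webpack-plugin, along the lines of the workaround used in those other projects; the structure is illustrative and is not Strapi's actual admin config:

```js
// Illustrative webpack config excerpt: keep minification but run it in a
// single process so peak memory stays within a small instance's limits.
const TerserPlugin = require('terser-webpack-plugin');

module.exports = {
  // ...rest of the configuration...
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        parallel: false, // one minifier process instead of one per CPU core
      }),
    ],
  },
};
```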
Have you tried using a 4GB RAM EC2 instance?
@jsve as @alexandrebodin asked, could you please confirm that you used a t3.small or larger instance (2GB RAM)? When we were testing the docs here, we only had issues when using smaller instances with EC2.
A quite old topic, but just in case anyone reads this: one needs to be careful with t2 and t3/t3a instances, as they run under a 'CPU credit' system. The graphs under the monitoring tab of the instance in the console show this. On a lightly loaded t3a.large the CPU credit balance remains stable, but on a free-tier t2.micro, running a simple 'npm install' can consume all the credits and then the instance freezes. Learned it the hard way :) and then moved to a t3a.large for development and builds. This page is worth the read: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/burstable-credits-baseline-concepts.html
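For anyone who wants to watch the credit balance without opening the console, here is a minimal sketch of querying the CPUCreditBalance metric with the AWS SDK for JavaScript (v2); the instance ID and region are placeholders, not values from this thread:

```js
// Sketch: read an instance's CPU credit balance from CloudWatch (aws-sdk v2).
// The instance ID and region below are placeholders.
const AWS = require('aws-sdk');
const cloudwatch = new AWS.CloudWatch({ region: 'us-east-1' });

cloudwatch.getMetricStatistics(
  {
    Namespace: 'AWS/EC2',
    MetricName: 'CPUCreditBalance',
    Dimensions: [{ Name: 'InstanceId', Value: 'i-0123456789abcdef0' }],
    StartTime: new Date(Date.now() - 60 * 60 * 1000), // last hour
    EndTime: new Date(),
    Period: 300, // 5-minute datapoints
    Statistics: ['Average'],
  },
  (err, data) => {
    if (err) return console.error(err);
    data.Datapoints
      .sort((a, b) => a.Timestamp - b.Timestamp)
      .forEach((p) => console.log(p.Timestamp.toISOString(), p.Average));
  }
);
```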
Thank you for this insight |
Describe the bug
This is a known bug, probably related to uglifying / minifying. See webpack-contrib/terser-webpack-plugin#21, nuxt/nuxt#2921, and a lot of other Google results.
Steps to reproduce the behavior
Set up an EC2 t2.micro instance, set up a new Strapi project, and try to build. The build freezes at 91%. Once at 91%, the machine eats up all the memory (climbing from approx. 30% of 1GB before 90%) and crashes after 20 minutes.
Expected behavior
Build should complete.
Screenshots
System
Additional context
Build completes if I change `optimization: { minimize: true, ... }` to `false` in `node_modules/strapi-admin/webpack.configure.js`. The issue might also be fixed by updating terser-webpack-plugin, but I haven't tried that.
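For anyone applying the workaround manually, this is a sketch of the change described above; the surrounding structure of the admin webpack configuration is abbreviated, so treat it as illustrative rather than the exact file contents:

```js
// node_modules/strapi-admin/webpack.configure.js (sketch of the workaround above)
module.exports = {
  // ...existing admin build configuration...
  optimization: {
    // Skipping minification avoids the memory spike at 91%, at the cost of a
    // larger, slower-to-load admin bundle (see the maintainer's note above).
    minimize: false,
  },
};
```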