Gitea keeps taking 100% CPU usage and runs slowly on git push #10661
Also, I observed that there are 3 processes called gitea serv.
Those three processes (939, 998 and 1321) sound like the server side of git clients. The fact that all of them have … But the culprit seems to be:
which is Gitea's update hook on the repository. And yes, those hooks are Gitea's git plumbing that makes things work behind the scenes. Some ideas:
Also, a Raspberry Pi Zero W doesn't have much RAM in it. Try checking whether your system is paging too much. Use the … or maybe the …. (Note: don't take the values in my pictures as what's expected for your system: mine is a CentOS 7 VM with 12GB of RAM assigned to it.)
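For the paging check, something like the following is a common way to see memory pressure and swap activity on a small system (a sketch using standard procps tools; availability on Raspbian may vary):

```bash
# Sketch: check memory and swap pressure on the Pi.
free -h                      # overall RAM and swap usage
vmstat 1 5                   # watch the si/so (swap-in/swap-out) columns for sustained paging
top -b -n 1 | head -n 15     # quick look at which processes eat CPU and memory
```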
[Update] I found these 2 gitea serv processes pop up again, even though I didn't do anything.
Closing for now. I found it could be my GitHub Desktop causing this. I changed to GitExtensions, and it never hangs.
Some git tools may check repo status in the background.
Yes, you can tell from the log of GitHub Desktop, but how come the process stalls on the server side and takes up all the CPU resources, even after the client has quit and the client PC is powered off?
Not sure - we would need to see some logs from Gitea as to what it's doing. For example, the …
Yeah, I will reopen this issue and set my Gitea log level to debug to collect more info.
This is my log conf for debugging:
Is there anything to add to help locate the problem?
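The config snippet itself did not survive the thread export; for reference, a minimal debug-level logging section in Gitea's app.ini usually looks something like the following (the app.ini path and ROOT_PATH are assumptions based on the paths mentioned earlier in the thread):

```bash
# Sketch: append a debug-level [log] section to app.ini (paths are assumptions).
cat >> /home/git/gitea/custom/conf/app.ini <<'EOF'
[log]
MODE      = console, file
LEVEL     = debug
ROOT_PATH = /home/git/gitea/log
EOF
```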
It stalled again.
This process started at 09:46:14. I checked each log around that time; I have attached only the logs I could find below.
Have you tried running the fsck and gc? Those are important, given that you're using an SD card to store the repos. If you are unable to do it via Gitea (e.g. the system gets locked up before you get the chance), you can shut down Gitea and run the commands yourself:
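The bash snippet itself was lost in the export; a sketch of the kind of loop meant here, assuming the bare repositories live under /home/git/gitea-repositories:

```bash
# Sketch: fsck and gc every bare repository (adjust REPO_ROOT to your setup).
REPO_ROOT=/home/git/gitea-repositories
find "$REPO_ROOT" -type d -name '*.git' -prune -print | while read -r repo; do
    echo "==> $repo"
    git -C "$repo" fsck --full
    git -C "$repo" gc --aggressive --prune=now
done
```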
(Change the initial path accordingly if required)
Yes, I did try fsck and gc. Thank you for your bash snippet; it works like a charm and finishes perfectly. I guess some bugs were introduced between 1.11.0+dev-481-g1df701fd1 (good) and 1.10.5+18-gd6657644a (bad) on the linux-arm-6 architecture. I just formatted the SD card, to rule out a file system issue on my side, and restored from the image I backed up days ago, with the Gitea version:
Update: It seems that we had some issues when tagging our latest ….

Original message (just in case it's useful): But... those are not actual releases, and you are in fact downgrading. 🤔 Releases have only three digits in their version number, with nothing else after that. Our latest release is …. The …
Yes, I noticed there is an issue with the version number. But I am sure there is an issue in the new version.
Upgraded to Gitea version 1.11.2 (0768823) from GitHub releases; the problem persists.
Are you definitely running MySQL and not SQLite?
MySQL: mysql Ver 15.1 Distrib 10.3.22-MariaDB, for debian-linux-gnueabihf (armv8l) using readline 5.2. I don't think this is related to the database.
The stacktraces are not helpful - they'd be more useful for figuring out why something is causing trouble once we have identified an issue. That being said, they're not too difficult to remove from the logs for quick review. I've looked at your daemon.log and I can't see any request to Gitea that is opened without being closed, and these all take at most around a second. I'm looking at the others right now.
Do you have any logs from when there is a hang?
These are the logs I caught when there was a hang.
I don't know if this is appropriate, but if you like, I could give you SSH access to my server, just to help navigate the problem.
Running …
sshd[5028] is: …
But I can't see what was happening at that point because there are no logs.
To look at it another way: I pasted the hanging process details up there. This process started at Mar 9 09:46:14, and below is what I extracted from the auth log. I believe this is the hanging session; it doesn't have a stop entry, as I killed it manually or just rebooted the server.
Of course you can't find it. Let me summarize the problem:
I updated some info I collected over the past few days in the issue report (the first comment).
I have the same problem after I updated Gitea to 1.11.2. My system is Linux amd64 and I use SQLite.
One process called …

@jedy If you exit all git clients and then upgrade to v1.11.2, do they still occur?
I can only suggest creating a core dump from one of those processes. We may at least be able to see what it's doing. The problem is that there's no way of sanitizing a core dump, so you should not share its raw contents. If you've got delve or gdb installed, maybe you could open those cores and tell us what they are doing.
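A sketch of how such a core could be captured and inspected with gdb; the process selection and binary path below are assumptions, not the exact steps used in this thread:

```bash
# Sketch: dump a core from a hung process and print backtraces for every thread (paths/PID assumed).
PID=$(pgrep -f 'gitea serv' | head -n 1)
gcore -o /tmp/gitea-core "$PID"                    # writes /tmp/gitea-core.$PID without killing the process
gdb /usr/local/bin/gitea "/tmp/gitea-core.$PID" \
    -batch -ex 'thread apply all bt'               # backtraces for all threads, not just the first
```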
OK, I will get it back to you.
I compiled 1.11.2 with Go 1.13.8. It seems OK now.
@jedy Could you compile a linux-arm-6 version for me if possible?
@duchenpaul Sorry, I don't have an ARM environment. I got some errors trying to cross-compile on Linux.
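For reference, cross-compiling a Go program for linux-arm-6 usually comes down to the GOOS/GOARCH/GOARM variables; a sketch from a Gitea source checkout (build tags and CGO-backed features such as the SQLite driver are a common reason such builds fail and are deliberately left out here):

```bash
# Sketch: cross-compile Gitea for linux-arm-6 from its source tree (no CGO, no extra build tags).
cd gitea
GOOS=linux GOARCH=arm GOARM=6 CGO_ENABLED=0 go build -o gitea-linux-arm-6 .
```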
gdb backtrace of the hanging process:
I ran strace on the hanging process. A lot of SIGURG signals popped up. I thought it might be related to the new preemptible runtime of Go, which uses SIGURG. So I tried Go 1.13.8. After an hour of running well, I think that's the problem.
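For anyone wanting to reproduce that observation, attaching strace and filtering to signals is enough to see the SIGURG traffic (a sketch; the PID is assumed to be whichever gitea process is spinning):

```bash
# Sketch: watch signal delivery on a spinning gitea process (follow forks, timestamp each event).
strace -f -tt -e trace=signal -p "$PID"
```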
Nice catch! We need to look closely into this.
The backtrace shows only the first thread, which is probably not the most useful. 🙁
#10684 may fix this; it will be released in v1.11.3.
Here is my gdb dump; any thoughts on this?
I'm having this same issue after updating to 1.11.2. If someone needs more info regarding this, I'm available.
Language statistics are only in versions 1.12+ (1.12 is the currently unreleased master branch).
OK, but for the issue found in 1.11.2, it looks like it is fixed! Kudos!!
I'm not sure this is actually fixed -- if compiling with Go 1.13 fixes it, then master and new versions should still have the same problem using Go 1.14 going forward.
Yes @mrsdizzie. We're gonna need to figure out what is causing this. The Go release document is not very clear as to how this is supposed to be solved.
@zeripath @mrsdizzie let me know if you need any assistance from me.
…14 in git env (#11237)

As seen in troubleshooting #11032, the new feature of Go 1.14 is causing several-second delays in startup in certain situations. Debugging shows it spending several seconds handling SIGURG signals during init:

```
6922:04:51.984234 trace init() ./modules/queue/unique_queue_wrapped.go
remote: ) = 69 <0.000012>
remote: [pid 15984] 22:04:51 write(1, "\ttime taken: 236.761\302\265s\n\n", 25
time taken: 236.761µs
remote:
remote: ) = 25 <0.000011>
remote: [pid 15984] 22:04:51 --- SIGURG {si_signo=SIGURG, si_code=SI_TKILL, si_pid=15984, si_uid=0} ---
remote: [pid 15984] 22:04:52 --- SIGURG {si_signo=SIGURG, si_code=SI_TKILL, si_pid=15984, si_uid=0} ---
remote: [pid 15984] 22:04:52 --- SIGURG {si_signo=SIGURG, si_code=SI_TKILL, si_pid=15984, si_uid=0} ---
```

This causes up to 20 seconds added to a push in some cases, as it happens for each call of the gitea hook command. This is likely the cause of #10661 as well and would start to affect users once we release 1.12, which would be the first release compiled with Go 1.14.

I suspect this is just a slight issue with the upstream implementation, as there have been a few very similar bugs fixed and reported: golang/go#37741, golang/go#37942.

We should revisit this in the future and see if a newer version of Go has solved it, but for now disable this option in the environment that gitea hook runs in to avoid it.
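The mitigation described in that commit comes down to disabling Go 1.14's asynchronous preemption via the standard GODEBUG runtime setting. A sketch of the same idea applied manually when starting Gitea (the binary and config paths are assumptions; the actual patch sets this only in the environment passed to the hooks):

```bash
# Sketch: start Gitea with Go 1.14 async preemption disabled (workaround, not the merged patch).
export GODEBUG=asyncpreemptoff=1
exec /usr/local/bin/gitea web --config /home/git/gitea/custom/conf/app.ini
```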
Gitea version (or commit ref): 1.12.0+dev-440-gf5a20250a
Git version: git version 2.20.1
Operating system: Raspbian GNU/Linux 10 (buster) on Raspberry Pi zero w
Database (use [x]): MySQL
Can you reproduce the bug at https://try.gitea.io:
Log gist:
Architecture: linux-arm-6
Description
After upgrading Gitea to this version, I found that it runs very slowly.
Checking the CPU usage, it stays at 100%.
And I checked ps and found that the gitea hook processes (2395, 964) seem to be taking up a lot of resources. I would like to know what they are, how I can track them, and how to stop them. As far as I know, there is no git hook set in any repo on this server, so I think these could be Gitea's system hooks.
And I see PID 2366 saying duchenpaul/homebridge-docker.git is executing a hook, but there is no hook script set in that repo. As for PID 961, I don't know what is going on there.
I checked the log; there are only SQL queries in it, no errors.
[Update] Summary
gitea serv key-3 --config=/home/git/gitea/custom/conf/app.ini is spawned by git clients' requests. For some reason, some git requests do not finish properly; instead, they take up all the CPU for hours and slow down the system.
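A quick way to spot these stuck processes and see how long they have been running (a sketch using standard GNU ps; the process names match the ones reported above):

```bash
# Sketch: list gitea serv / gitea hook processes sorted by CPU usage, with their elapsed time.
ps -eo pid,etime,%cpu,cmd --sort=-%cpu | grep -E 'gitea (serv|hook)' | grep -v grep
```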