Apply fails on Gitlab 16.4.0 with "Pull request must be mergeable before running apply." #3722
Comments
I will try 0.25.0 and see.
On Thu, Aug 31, 2023 at 3:07 AM Tomislav Tomašić wrote:
Community Note

- Please vote on this issue by adding a 👍 reaction (https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request. Searching for pre-existing feature requests helps us consolidate datapoints for identical requirements into a single place, thank you!
- Please do not leave "+1" or other comments that do not add relevant new information or questions; they generate extra noise for issue followers and do not help prioritize the request.
- If you are interested in working on this issue or have submitted a pull request, please leave a comment.
Overview of the Issue

Running atlantis apply on an approved GitLab PR fails intermittently with "Apply Failed: Pull request must be mergeable before running apply."

Reproduction Steps

1. Run atlantis apply on an approved PR
2. Atlantis comments with the error: **Apply Failed**: Pull request must be mergeable before running apply.

Logs
August 23, 2023 at 17:03 (UTC+1:00) {"level":"error","ts":"2023-08-23T16:03:28.369Z","caller":"events/instrumented_project_command_runner.go:84","msg":"Failure running apply operation: Pull request must be mergeable before running apply.","json":{"repo":"ume-platform-engineering/tf-env-data-engineering","pull":"189"},"stacktrace":"github.com/runatlantis/atlantis/server/events.RunAndEmitStats\n\tgithub.com/runatlantis/atlantis/server/events/instrumented_project_command_runner.go:84\ngithub.com/runatlantis/atlantis/server/events.(*InstrumentedProjectCommandRunner).Apply\n\tgithub.com/runatlantis/atlantis/server/events/instrumented_project_command_runner.go:46\ngithub.com/runatlantis/atlantis/server/events.runProjectCmds\n\tgithub.com/runatlantis/atlantis/server/events/project_command_pool_executor.go:48\ngithub.com/runatlantis/atlantis/server/events.(*ApplyCommandRunner).Run\n\tgithub.com/runatlantis/atlantis/server/events/apply_command_runner.go:166\ngithub.com/runatlantis/atlantis/server/events.(*DefaultCommandRunner).RunCommentCommand\n\tgithub.com/runatlantis/atlantis/server/events/command_runner.go:298"}
August 23, 2023 at 17:03 (UTC+1:00) {"level":"info","ts":"2023-08-23T16:03:29.799Z","caller":"events/automerger.go:20","msg":"not automerging because project at dir \"XXX\", workspace \"default\" has status \"apply_errored\"","json":{"repo":"XXX","pull":"189"}}
August 23, 2023 at 17:03 (UTC+1:00) {"level":"error","ts":"2023-08-23T16:03:32.912Z","caller":"logging/simple_logger.go:163","msg":"invalid key: ***@***.******@***.******@***.******@***.******@***.******@***.******@***.******@***.***/negroni.go:111\nnet/http.serverHandler.ServeHTTP\n\tnet/http/server.go:2936\nnet/http.(*conn).serve\n\tnet/http/server.go:1995"}
Environment details
- Atlantis version: v0.23.2 & v0.23.4
- Deployment method: ecs
- If not running the latest Atlantis version have you tried to reproduce this issue on the latest version: no
- Gitlab 16.4.0
Atlantis server-side config file:
{
"taskDefinitionArn": "arn:aws:ecs:XXX:XXX",
"containerDefinitions": [
{
"name": "atlantis",
"image": "XXX/atlantis:v0.24.3-0bcdeaa1",
"cpu": 2048,
"memory": 4096,
"memoryReservation": 128,
"portMappings": [
{
"containerPort": 4141,
"hostPort": 4141,
"protocol": "tcp"
}
],
"essential": true,
"environment": [
{
"name": "ATLANTIS_ALLOW_REPO_CONFIG",
"value": "false"
},
{
"name": "ATLANTIS_HIDE_PREV_PLAN_COMMENTS",
"value": "false"
},
{
"name": "ATLANTIS_WRITE_GIT_CREDS",
"value": "true"
},
{
"name": "ATLANTIS_SILENCE_NO_PROJECTS",
"value": "false"
},
{
"name": "ATLANTIS_GITLAB_USER",
"value": "XXX"
},
{
"name": "ATLANTIS_LOG_LEVEL",
"value": "debug"
},
{
"name": "ATLANTIS_AUTOMERGE",
"value": "true"
},
{
"name": "ATLANTIS_BITBUCKET_USER",
"value": ""
},
{
"name": "ATLANTIS_REPO_CONFIG_JSON",
"value": "{\"repos\":[{\"allow_custom_workflows\":true,\"allowed_overrides\":[\"workflow\"],\"apply_requirements\":[\"undiverged\",\"mergeable\",\"approved\"],\"id\":\"/.*/\",\"repo_config_file\":\"atlantis.yaml\"}],\"workflows\":{\"default\":{\"apply\":{\"steps\":[\"apply\",{\"run\":\"[ ! -z \\\"$PROJECT_NAME\\\" ] \\u0026\\u0026 export TAG=$PROJECT_NAME || export TAG=last-applied \\u0026\\u0026 git config --global user.name XXXn \\u0026\\u0026 git config --global user.email XXX \\u0026\\u0026 git fetch --tags -f \\u0026\\u0026 git fetch --all --tags \\u0026\\u0026 (git tag --delete $TAG || true) \\u0026\\u0026 git tag -a $TAG -m \\\"Tagged automatically by atlantis\\\" \\u0026\\u0026 (git push origin --delete $TAG || true) \\u0026\\u0026 git push origin $TAG\"}]},\"plan\":{\"steps\":[\"init\",\"plan\"]}}}}"
},
{
"name": "ATLANTIS_PARALLEL_POOL_SIZE",
"value": "50"
},
{
"name": "ATLANTIS_REPO_ALLOWLIST",
"value": "XXX"
},
{
"name": "ATLANTIS_GITLAB_HOSTNAME",
"value": "gitlab.com"
},
{
"name": "ATLANTIS_DEFAULT_TF_VERSION",
"value": "v0.13.4"
},
{
"name": "ATLANTIS_GH_APP_ID",
"value": ""
},
{
"name": "ATLANTIS_BITBUCKET_BASE_URL",
"value": ""
},
{
"name": "ATLANTIS_PORT",
"value": "4141"
},
{
"name": "ATLANTIS_GH_USER",
"value": ""
},
{
"name": "ATLANTIS_ATLANTIS_URL",
"value": "https://XXX"
}
],
"mountPoints": [],
"volumesFrom": [],
"secrets": [XXX]
Repo atlantis.yaml file:
version: 3projects:
- name: platform
dir: XXX/platform
autoplan:
when_modified: [ "*.tf" ]
enabled: true
...
Additional Context

Issue started happening on Atlantis v0.23.2. Still happening after upgrading to v0.23.4.

- #3277
Hi, my organization is also experiencing this issue after upgrading to GitLab v16.4.0. It affects the latest version of Atlantis (v0.25.0) as well as v0.24.2. The only workaround is to disable required approvals altogether in repos.yaml, which of course is not ideal.

Overview of the Issue

Executing atlantis apply on an approved MR fails with "Apply Failed: Pull request must be mergeable before running apply."

Reproduction Steps

Logs

From systemctl status atlantis:

Environment details

Atlantis server-side config file:

root@cit-prod-atlantis:/home/atlantis# cat atlantis.yaml
---
# Minimum required settings
atlantis-url: https://atlantis.c.bvnt.co:4141
gitlab-hostname: gitlab.c.bvnt.co
gitlab-token: '<SNIP>'
gitlab-user: atlantisbot
gitlab-webhook-secret: '<SNIP>'
repo-whitelist: 'gitlab.c.bvnt.co/*'
# Optional Settings
data-dir: /home/atlantis/.atlantis
default-tf-version: v0.11.15
port: 4141
repo-config: /home/atlantis/repos.yaml
ssl-cert-file: /home/atlantis/ssl/wc.c.bvnt.co.crt
ssl-key-file: /home/atlantis/ssl/wc.c.bvnt.co.key
# This setting means that atlantis will merge the destination branch into the source before running plans
checkout-strategy: merge
...

Repo atlantis.yaml file:

---
version: 3
projects:
- dir: prod/gcp-hashicorp-vault
workflow: vault
terraform_version: v1.5.5
autoplan:
when_modified: ["*.tf"]
.... # Repeated 300-400 times
.... # This also occurs in repos with only one or two Atlantis "projects"
root@cit-prod-atlantis:/home/atlantis# cat repos.yaml
repos:
- id: /.*/
apply_requirements: [approved] # This is the line we removed to work around this issue
allowed_overrides: [workflow]
# Global workflows
workflows:
default:
plan:
steps:
- run: /home/atlantis/atlantis-verify.sh plan $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- init
- plan
- run: /home/atlantis/bv-conftest.sh
apply:
steps:
- run: /home/atlantis/atlantis-verify.sh apply $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- apply
vault:
plan:
steps:
- env:
name: VAULT_TOKEN
command: /home/atlantis/vault-login.sh
- run: /home/atlantis/atlantis-verify.sh plan $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- init:
extra_args: ["-reconfigure"]
- plan
- run: /home/atlantis/bv-conftest.sh
apply:
steps:
- env:
name: VAULT_TOKEN
command: /home/atlantis/vault-login.sh
- run: /home/atlantis/atlantis-verify.sh apply $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- apply
tfvars:
plan:
steps:
- run: /home/atlantis/atlantis-verify.sh plan $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- run: /bin/cp "$DIR/_tfvars/$WORKSPACE.tfvars" "$DIR/temp-tfvars-file-copy.auto.tfvars"
- init
- plan
- run: /home/atlantis/bv-conftest.sh
apply:
steps:
- run: /home/atlantis/atlantis-verify.sh apply $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- apply
tfvars_vault:
plan:
steps:
- env:
name: VAULT_TOKEN
command: /home/atlantis/vault-login.sh
- run: /home/atlantis/atlantis-verify.sh plan $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- run: /bin/cp "$DIR/_tfvars/$WORKSPACE.tfvars" "$DIR/temp-tfvars-file-copy.auto.tfvars"
- init
- plan
- run: /home/atlantis/bv-conftest.sh
apply:
steps:
- env:
name: VAULT_TOKEN
command: /home/atlantis/vault-login.sh
- run: /home/atlantis/atlantis-verify.sh apply $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- apply
global_sgrules:
plan:
steps:
- env:
name: VAULT_TOKEN
command: /home/atlantis/vault-login.sh
- run: /home/atlantis/atlantis-verify.sh plan $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- init
- plan
apply:
steps:
- env:
name: VAULT_TOKEN
command: /home/atlantis/vault-login.sh
- run: /home/atlantis/atlantis-verify.sh apply $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- apply:
extra_args: ['--parallelism=1']
vaultforbluevoyantproduction:
plan:
steps:
- env:
name: VAULT_TOKEN
command: /home/atlantis/vault-login.sh
- run: /home/atlantis/atlantis-verify.sh plan $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- init:
extra_args: ["-reconfigure"]
- plan
- run: /home/atlantis/bv-conftest.sh
apply:
steps:
- env:
name: VAULT_TOKEN
command: /home/atlantis/vault-login.sh
- run: /home/atlantis/atlantis-verify.sh apply $COMMENT_ARGS $BASE_REPO_OWNER $BASE_REPO_NAME
- apply

Additional Context

#3277 seems related, but has not fixed the problem fully.
@jamengual Any updates on this? Thanks!
I don't use GitLab, sadly, so I never got to try this. @lukemassa, have you had this issue?
I don't, but if I had to guess I'd say it's related to the fact that GitLab's determination of whether an MR is "mergeable" is, as far as I understand it, asynchronous, so there might be times where, if Atlantis catches it at the wrong moment, GitLab is still "figuring out" whether it's mergeable. Let me see if I can reproduce this on my setup and get back to you.
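For illustration, a minimal sketch of observing that asynchronous behaviour, assuming the go-gitlab client (github.com/xanzy/go-gitlab) in a version recent enough to expose detailed_merge_status; the host, project ID, and MR IID are placeholders, not values from this thread:

```go
// mergestatus_probe.go: poll a merge request's merge status to watch GitLab
// finish its asynchronous mergeability check. Placeholders: host, project ID,
// MR IID, and the GITLAB_TOKEN environment variable.
package main

import (
	"fmt"
	"log"
	"os"
	"time"

	gitlab "github.com/xanzy/go-gitlab"
)

func main() {
	client, err := gitlab.NewClient(os.Getenv("GITLAB_TOKEN"),
		gitlab.WithBaseURL("https://gitlab.example.com/api/v4")) // placeholder host
	if err != nil {
		log.Fatal(err)
	}

	projectID := 274 // placeholder
	mrIID := 189     // placeholder

	// Shortly after a push, GitLab may report "checking"/"unchecked" before it
	// settles on a final status; a client that reads the value at that moment
	// would conclude the MR is not (yet) mergeable.
	for i := 0; i < 10; i++ {
		mr, _, err := client.MergeRequests.GetMergeRequest(projectID, mrIID, nil)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("merge_status=%s detailed_merge_status=%s\n", mr.MergeStatus, mr.DetailedMergeStatus)
		time.Sleep(2 * time.Second)
	}
}
```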
So I took a look and I'm unable to reproduce this on 0.24.2, 0.25.0, or main. However, my company's GitLab is on v16.2.3-ee, whereas the report says 16.4.0. Additionally, a line in the logs pasted above jumped out at me; it makes me think there's a GitLab bug, not an Atlantis bug.

@saraangelmurphy What happens when you go to https://gitlab.c.bvnt.co/api/v4/projects/274 in your browser? In my case the analogous URL returns the project's JSON metadata.

Also, for what it's worth, I think this is the line that's failing: https://github.com/runatlantis/atlantis/blob/main/server/events/vcs/gitlab_client.go#L293

Otherwise, I'm happy to dig into this bug when my company upgrades their GitLab instance to 16.4.0.
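For anyone who wants to check that endpoint without a browser, a minimal sketch of the same request; the host, project ID, and token are placeholders:

```go
// projects_check.go: hit GET /api/v4/projects/:id and print the HTTP status,
// to see whether the projects endpoint itself is failing. Placeholders: host,
// project ID, and the GITLAB_TOKEN environment variable.
package main

import (
	"fmt"
	"log"
	"net/http"
	"os"
)

func main() {
	url := "https://gitlab.example.com/api/v4/projects/274" // placeholder
	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		log.Fatal(err)
	}
	req.Header.Set("PRIVATE-TOKEN", os.Getenv("GITLAB_TOKEN"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// A 500 here would point at the GitLab instance rather than Atlantis.
	fmt.Println("status:", resp.StatusCode)
}
```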
Given the critical vulnerability in previous versions, I hope that you guys prioritize the upgrade.
@lukemassa you are entirely correct, that URL does return a 500 error in GitLab 16.4.0. We've raised a ticket with them, but it does certainly appear to be a regression on their end if the projects API works fine on 16.2.x.
Ohhhh, GitLab... it's still better than Bitbucket. Let's see what they say in your ticket; if you can link it here, @saraangelmurphy, that will be useful.
I don't see a 500 response in the logs like @saraangelmurphy does.
I was able to get my hands on a server with 16.4.0 and was unfortunately unable to reproduce either the original issue or the 500 issue. The original issue being intermittent makes it especially tricky. Any more logs or attempts to isolate would be helpful.
So this ended up being an issue with an incomplete upgrade to GitLab, caused by SQL migration scripts failing during the upgrade when applying constraints to GitLab project push rule regexes. Specifically, regexes needed to be under 521 characters and we had several that were over; the presence of these project push rules during the upgrade broke the projects API. Once we removed the excessively long push rule regexes and reran the migrations, the projects API started responding again and Atlantis was happy. https://gitlab.com/gitlab-org/gitlab/-/issues/426066#note_1575868016

I also want to applaud @lukemassa, @jamengual, and the other maintainers of Atlantis! While my thanks is poor reward compared to financial support, the alacrity of responses here is better than we get from many paid projects. Atlantis is a fantastic project, and we are profoundly grateful for everyone's hard work making this a production-ready service that solves many needs.
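For anyone checking their own instance for the same condition, a rough sketch that fetches a project's push rule and flags over-long regex fields; the host, project ID, and token are placeholders, and the length limit is taken from the report above rather than from GitLab documentation:

```go
// pushrule_lengths.go: fetch GET /api/v4/projects/:id/push_rule and report any
// regex-style fields at or over a length limit. The limit below comes from the
// report above ("<521 characters") and should be verified against the linked
// GitLab issue. Placeholders: host, project ID, GITLAB_TOKEN.
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
	"os"
	"strings"
)

func main() {
	url := "https://gitlab.example.com/api/v4/projects/274/push_rule" // placeholder
	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		log.Fatal(err)
	}
	req.Header.Set("PRIVATE-TOKEN", os.Getenv("GITLAB_TOKEN"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	var rule map[string]interface{}
	if err := json.NewDecoder(resp.Body).Decode(&rule); err != nil {
		log.Fatal(err)
	}

	const limit = 521 // assumed, per the report above
	for key, val := range rule {
		if s, ok := val.(string); ok && strings.Contains(key, "regex") && len(s) >= limit {
			fmt.Printf("push rule field %q is %d characters (>= %d)\n", key, len(s), limit)
		}
	}
}
```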
I'm so glad to hear you solved the issue and that we were able to help, but I have to say that without @lukemassa we could not have done it. We need more people like that in this project, and in the world too.
@saraangelmurphy that's great to hear! @syst0m any chance saraangelmurphy's fix was helpful in addressing your issue? If not, let us know if you have any luck isolating the issue or finding a way to reproduce it consistently.
I'm afraid not; it looks like they're running a self-managed GitLab instance, IIUC. As for reproducing the issue on my end, I haven't been able to reproduce it consistently, I'm afraid. 😞
My company's GitLab instance is now on 16.4.0, I'm running Atlantis 0.24.4, and I've not yet experienced or had reports of unmergeability for approval-required MRs. If anyone else is experiencing this, let us know and hopefully we can figure out some commonalities!
@lukemassa We are still experiencing this issue with multiple Atlantis instances handling different repos.
Yeah, I don't see any smoking gun there; it's hard to say why exactly Atlantis thinks the MRs are not mergeable. If you're able to run from source, @X-Guardian recently added a lot of debug logs in #3876, including to the code paths involved here.
@syst0m 0.27.0 has been released, which adds a number of debug flags to GitLab calls. When you have a chance, could you upgrade and try to reproduce?
I work with @syst0m and this is still an issue for us. I have been doing some debugging when I have run into this. I noticed that when it occurs, if you look at the MR, the Atlantis plan commit status is stuck in a non-success state. I have an example of this below, and the steps leading up to it were just our standard workflow with no complications:
In the above case, running subsequent plans did not clear the stuck status. With this in mind I was exploring the GitLab commit status API and have written a script to manually update the commit status of the "stuck" Atlantis plan job to "success", which lets me fix a stuck MR and work around the issue. So it feels like the problem is Atlantis not posting the commit status back correctly: either the wrong request is being sent or the request is failing for some reason (e.g. network issues, transient GitLab issues, rate limiting, etc.). I thought I might be able to reverse this workaround to reliably reproduce the problem (raise an MR, then manually set the Atlantis plan status to "Running"), but the behaviour wasn't identical: it still blocked the MR, just not in exactly the same way as the original issue.
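A rough sketch of that kind of workaround script, assuming go-gitlab and that the stuck status is named something like "atlantis/plan" (check what your instance actually reports); the project ID, SHA, and host are placeholders:

```go
// unstick_status.go: manually set a stuck commit status back to "success" via
// the GitLab commit status API, as described in the workaround above.
// Placeholders/assumptions: host, project ID, head SHA, and the status name.
package main

import (
	"fmt"
	"log"
	"os"

	gitlab "github.com/xanzy/go-gitlab"
)

func main() {
	client, err := gitlab.NewClient(os.Getenv("GITLAB_TOKEN"),
		gitlab.WithBaseURL("https://gitlab.example.com/api/v4")) // placeholder host
	if err != nil {
		log.Fatal(err)
	}

	projectID := 274                                  // placeholder
	sha := "deadbeefdeadbeefdeadbeefdeadbeefdeadbeef" // placeholder MR head SHA

	status, _, err := client.Commits.SetCommitStatus(projectID, sha, &gitlab.SetCommitStatusOptions{
		State:       gitlab.Success,
		Name:        gitlab.String("atlantis/plan"), // assumed name; match what Atlantis reported
		Description: gitlab.String("manually marked successful to unblock the MR"),
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("commit status is now:", status.Status)
}
```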
@lukemassa Hello, |
@oana-l the logs you have there are consistent with the situation where Atlantis sees the MR as not mergeable.

@dougbw The pipelines-still-running issue might be related to #3852 (comment), which is an ongoing debate about how to deal with some of the complexities introduced by #3378.

As for the original issue, @syst0m / @dougbw I'm still unable to reproduce any intermittent behavior here, so let me know if you had any luck with any of the new logging. It would help us understand which parts of PullIsMergeable() are failing, and if none of them are, whether there's a log issue elsewhere.
@lukemassa apologies for the misunderstanding, I should have been more specific: I work together with @syst0m and @dougbw. What @dougbw described about the UI is also what we notice when we get the intermittent "Pull request must be mergeable before running apply" error, even though the MR itself is approved. I hope this is a bit clearer now, thanks in advance!
Ah, I see. I was hoping that we'd get some information out of the output from the API calls to GitLab, but they all appear to have succeeded. Given that, it has to be something in the logic itself that's causing the issue.

This indicates to me that the code got to at least this line: https://github.com/runatlantis/atlantis/blob/v0.27.1/server/events/vcs/gitlab_client.go#L331. There are no more log lines between there and the end of the function, where I'm fairly confident it's returning false. Do you have the ability to run Atlantis from source? If so, I added a bunch more debug lines here: #4186; maybe that'll help us understand what's going on. For example, when I run it against an unapproved MR, the new debug lines show up in the output.

If not, you could test out some of those API calls and see what they might say, or I could clean up this PR and hopefully get it into 0.27.2. Also, just a note that Atlantis is fully open source and maintained and contributed to by volunteers. I'm happy to continue to try to debug this issue, but I also encourage you to dive into the code if you have issues with it! :)
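For testing those API calls by hand, a minimal sketch that lists the commit statuses on an MR's head commit (where a stuck Atlantis status would show up), again assuming go-gitlab; the host, project ID, and SHA are placeholders:

```go
// list_statuses.go: list the commit statuses on an MR's head commit to see
// whether any of them (e.g. an Atlantis plan status) is stuck in a non-success
// state. Placeholders: host, project ID, head SHA, GITLAB_TOKEN.
package main

import (
	"fmt"
	"log"
	"os"

	gitlab "github.com/xanzy/go-gitlab"
)

func main() {
	client, err := gitlab.NewClient(os.Getenv("GITLAB_TOKEN"),
		gitlab.WithBaseURL("https://gitlab.example.com/api/v4")) // placeholder host
	if err != nil {
		log.Fatal(err)
	}

	projectID := 274                                  // placeholder
	sha := "deadbeefdeadbeefdeadbeefdeadbeefdeadbeef" // placeholder MR head SHA

	statuses, _, err := client.Commits.GetCommitStatuses(projectID, sha, &gitlab.GetCommitStatusesOptions{
		All: gitlab.Bool(true), // include all statuses, not just the latest per name
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, s := range statuses {
		fmt.Printf("name=%s status=%s ref=%s\n", s.Name, s.Status, s.Ref)
	}
}
```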
We're seeing possibly the same issue with GitLab.com and Atlantis v0.27.1. It looks as though Atlantis has set the pipeline status of the commit on a merge-request ref, and that status never completes. These pipelines with this ref also don't show up in the list of pipelines for the merge request. We've found the easiest fix is to delete this extra pipeline created by Atlantis and then re-run the plan.
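A sketch of that cleanup, assuming go-gitlab and the refs/merge-requests/&lt;iid&gt;/head ref format; the project ID, MR IID, and host are placeholders, and the delete call is left commented out because it is destructive:

```go
// stuck_pipelines.go: find pipelines on a merge-request ref that are stuck in
// "running" so they can be reviewed and, if appropriate, deleted, as in the
// workaround above. Placeholders/assumptions: project ID, MR IID, ref format.
package main

import (
	"fmt"
	"log"
	"os"

	gitlab "github.com/xanzy/go-gitlab"
)

func main() {
	client, err := gitlab.NewClient(os.Getenv("GITLAB_TOKEN"),
		gitlab.WithBaseURL("https://gitlab.com/api/v4"))
	if err != nil {
		log.Fatal(err)
	}

	projectID := 274                      // placeholder
	ref := "refs/merge-requests/189/head" // assumed ref format for MR !189
	running := gitlab.Running

	pipelines, _, err := client.Pipelines.ListProjectPipelines(projectID, &gitlab.ListProjectPipelinesOptions{
		Ref:    gitlab.String(ref),
		Status: &running,
	})
	if err != nil {
		log.Fatal(err)
	}

	for _, p := range pipelines {
		fmt.Printf("pipeline %d on %s is %s\n", p.ID, p.Ref, p.Status)
		// Once you are sure this is the stray pipeline, delete it:
		// if _, err := client.Pipelines.DeletePipeline(projectID, p.ID); err != nil {
		// 	log.Fatal(err)
		// }
	}
}
```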