
Show documents that might be around after N1QL emptying fails #6624

Merged

torcolvin merged 4 commits into master from add-debug-logging-to-bucket-pool on Jan 2, 2024

Conversation

torcolvin (Collaborator)

In some weekly-jenkins runs we see errors like:

2023-12-14T21:04:28.868Z TEST: b:sg_int_1_1702585440608119514 waiting for empty bucket indexes sg_int_1_1702585440608119514._default._default
2023-12-14T21:09:16.752Z TEST: b:sg_int_1_1702585440608119514 waitForPrimaryIndexEmpty returned an error: RetryLoop for Wait for index to be empty giving up after 61 attempts
2023-12-14T21:09:16.752Z TEST: b:sg_int_1_1702585440608119514 Couldn't ready bucket, got error: RetryLoop for Wait for index to be empty giving up after 61 attempts - Retrying
2023-12-14T21:09:16.752Z [WRN] b:sg_int_1_1702585440608119514 RetryLoop for Wait for index to be empty giving up after 61 attempts -- base.RetryLoop() at util.go:460
2023-12-14T21:09:18.753Z TEST: b:sg_int_1_1702585440608119514 Running bucket through readier function
2023-12-14T21:09:18.753Z TEST: b:sg_int_1_1702585440608119514 emptying bucket via N1QL, readying views and indexes
2023-12-14T21:09:18.765Z TEST: b:sg_int_1_1702585440608119514 Bucket not empty (2 items), emptying bucket via N1QL
2023-12-14T21:09:18.783Z TEST: b:sg_int_1_1702585440608119514 Finished emptying all docs index ... Total docs purged: 0
2023-12-14T21:09:18.792Z TEST: b:sg_int_1_1702585440608119514 waiting for empty bucket indexes sg_int_1_1702585440608119514._default._default

You can see here that it has retried 61 times and the number of documents is in fact growing. https://jenkins.sgwdev.com/job/weekly-matrix/BACKING_STORE=enterprise-7.1.6,QUERY_PROVIDER=gsi-default-collection,SYNC_STORE=inline,label=sgw-amazon-linux2023/100/

In this case, the test that seems to be leaking is TestDbConfigPersistentSGVersions, and I have to imagine it's config files that are leaking, but I don't see how this test could be leaking goroutines. Note: I'm modifying this test in #6622 to make it work with rosmar.
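
For context, the "Wait for index to be empty" step is essentially a retry loop polling a N1QL count against the primary index until it reports zero. A minimal sketch of that pattern, assuming a gocb v2 cluster handle (illustrative only, not the actual `base.RetryLoop`/`waitForPrimaryIndexEmpty` code):

```go
package main

import (
	"fmt"
	"time"

	gocb "github.com/couchbase/gocb/v2"
)

// waitForIndexEmpty polls a N1QL count on the given keyspace until it reports
// zero documents, giving up after the supplied number of attempts. This mirrors
// the shape of the retry loop seen in the logs above; it is a sketch, not the
// Sync Gateway implementation.
func waitForIndexEmpty(cluster *gocb.Cluster, keyspace string, attempts int) error {
	statement := fmt.Sprintf("SELECT COUNT(META().id) AS c FROM %s", keyspace)
	for i := 0; i < attempts; i++ {
		results, err := cluster.Query(statement, nil)
		if err != nil {
			return err
		}
		var row struct {
			Count int `json:"c"`
		}
		if results.Next() {
			if err := results.Row(&row); err != nil {
				_ = results.Close()
				return err
			}
		}
		if err := results.Close(); err != nil {
			return err
		}
		if row.Count == 0 {
			return nil
		}
		time.Sleep(time.Second) // back off between polls
	}
	return fmt.Errorf("giving up after %d attempts", attempts)
}
```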

Tested by removing the N1QL delete in bucketReadier and making sure we get a better log message.
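
The change itself boils down to running a diagnostic query when that wait gives up, so the remaining document IDs show up in the test logs. Something along these lines, again assuming a gocb v2 handle and using the same imports as the sketch above (hypothetical helper, not the merged code):

```go
// logRemainingDocs lists the document keys still visible to the primary index
// after emptying has failed, so the leaking test can be identified from the
// logs. Hypothetical sketch of the kind of diagnostic this PR adds.
func logRemainingDocs(cluster *gocb.Cluster, keyspace string) error {
	// META().id is the document key; LIMIT bounds the log output.
	statement := fmt.Sprintf("SELECT META().id AS id FROM %s LIMIT 100", keyspace)
	results, err := cluster.Query(statement, nil)
	if err != nil {
		return err
	}
	defer results.Close()
	for results.Next() {
		var row struct {
			ID string `json:"id"`
		}
		if err := results.Row(&row); err != nil {
			return err
		}
		fmt.Printf("document remaining after N1QL emptying: %s\n", row.ID)
	}
	return results.Err()
}
```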

Pre-review checklist

  • Removed debug logging (fmt.Print, log.Print, ...)
  • Logging sensitive data? Make sure it's tagged (e.g. base.UD(docID), base.MD(dbName))
  • Updated relevant information in the API specifications (such as endpoint descriptions, schemas, ...) in docs/api

db/util_testing.go — 3 review comment threads (outdated, resolved)
torcolvin assigned torcolvin and unassigned adamcfraser Jan 2, 2024
torcolvin assigned bbrks and unassigned torcolvin Jan 2, 2024
bbrks previously approved these changes Jan 2, 2024
db/util_testing.go — 2 review comment threads (outdated, resolved)
torcolvin enabled auto-merge (squash) January 2, 2024 16:37
torcolvin requested a review from bbrks January 2, 2024 16:37
torcolvin merged commit bea26ef into master Jan 2, 2024
16 of 17 checks passed
torcolvin deleted the add-debug-logging-to-bucket-pool branch January 2, 2024 16:57
bbrks pushed a commit that referenced this pull request Mar 28, 2024
* Show documents that might be around after N1QL emptying fails, useful for debugging test failures