
fix(plugins/statsd & opentelemetry): add return values for funcs called in batch queue #10052

Merged

Conversation

liverpool8056
Contributor

@liverpool8056 liverpool8056 commented Jan 4, 2023

Summary

fix(plugins/statsd & opentelemetry): add return values for functions that are processed in the batch queue of these two plugins. Without a return value, the batch queue considers the result of processing falsy, causing the batch to be reprocessed.
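
As a simplified sketch of the contract the fix restores (hypothetical names like `process_batch` and `send_batch_before`, not Kong's actual source): the batch queue treats a nil/false return from its handler as a failure and re-enqueues the batch, and a Lua function that falls off its end implicitly returns nil, so the handlers must end with an explicit `return true`.

```lua
-- Simplified model of the batch-queue contract (hypothetical names,
-- not Kong's actual implementation).
local function process_batch(callback, batch)
  local ok, err = callback(batch)
  if not ok then
    return false, err   -- the queue would re-enqueue the batch for retry
  end
  return true
end

-- Before the fix: the handler falls off its end, implicitly returning nil,
-- so the queue treats every successful send as a failure.
local function send_batch_before(batch)
  -- ... send metrics over the wire ...
end

-- After the fix: explicit success value.
local function send_batch_after(batch)
  -- ... send metrics over the wire ...
  return true
end
```

With `send_batch_before`, `process_batch` reports failure even when the send succeeded, which is exactly the spurious reprocessing this PR fixes.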

Checklist

Issue reference

FTI-4645

@liverpool8056
Contributor Author

opentelemetry always creates its batch queue with retry_count=0, so I didn't find a way to write a test case for it.

@dndx
Member

dndx commented Jan 4, 2023

Tests are still failing.

@chronolaw chronolaw changed the title add-return-values-for-funcs-called-in-batch-queue fix(plugins/statsd & opentelemetry): add return values for funcs called in batch queue Jan 4, 2023
@chronolaw
Contributor

Note: we should use the correct PR title; it is important for reviewers.

@mayocream mayocream force-pushed the plugins/add-return-values-for-processes-used-in-batch_queue branch from 8fad2ec to 5423842 Compare January 4, 2023 10:42
@mayocream mayocream self-requested a review January 4, 2023 11:40
@liverpool8056 liverpool8056 force-pushed the plugins/add-return-values-for-processes-used-in-batch_queue branch from 478822b to db86ed4 Compare January 4, 2023 13:59
@liverpool8056 liverpool8056 force-pushed the plugins/add-return-values-for-processes-used-in-batch_queue branch 3 times, most recently from 66c59d3 to 6e73663 Compare January 5, 2023 05:47
assert(#metrics == metrics_count, err)
end)

-- wait until shdict metrics could be sent again
Contributor

Will this introduce flakiness? I am good if it isn't. 🤔

Contributor Author

This test case aims to ensure that the statsd plugin functions called in the batch_queue (which now have return values added) won't be called repeatedly when they execute successfully. In this test case, I wait for more metrics than there should be, and it times out as expected (because there are no more metrics than expected).
The next step waits for the shdict metrics to be refreshed so that the third test case can pass.
I tried several times in my local environment successfully, so I think it won't be flaky.

@mayocream
Contributor

opentelemetry always creates a batch queue with retry_count=0, I didn't find a way to write a test case for it.

In that case, it shouldn't be a critical problem in the opentelemetry plugin; it has no effect.

@rspurgeon

rspurgeon commented Jan 5, 2023

The DataDog plugin also uses the batch queue and returns a falsey value in its log function:

logger:close_socket()
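
For context on why that last line yields a falsey value (a sketch with a hypothetical `logger` stub, not the DataDog plugin's real code): in Lua, a function call used as a statement discards its results, and a function without an explicit `return` yields nil to its caller.

```lua
-- Hypothetical stub standing in for the plugin's logger object.
local logger = {}
function logger:close_socket()
  -- close the socket; returns nothing meaningful
end

local function log(batch)
  logger:close_socket()  -- last statement, but not a return:
end                      -- log() itself returns nil (falsey)

local function log_fixed(batch)
  logger:close_socket()
  return true            -- explicit success for the batch queue
end
```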

@liverpool8056
Contributor Author

@rspurgeon I have another PR #10044 for datadog

that will be processed in batch queue for these two plugins.
Batch queue considers the result of processing falsy without a return
value, namely nil, resulting in the batch being reprocessed.

FTI-4645
@liverpool8056 liverpool8056 force-pushed the plugins/add-return-values-for-processes-used-in-batch_queue branch from eee869f to bf0d789 Compare January 6, 2023 01:47
@pull-request-size pull-request-size bot added size/L and removed size/M labels Jan 9, 2023
@liverpool8056 liverpool8056 force-pushed the plugins/add-return-values-for-processes-used-in-batch_queue branch from 62ffbdb to f4a6a52 Compare January 9, 2023 13:01
@dndx dndx merged commit 0eae6b9 into master Jan 10, 2023
@dndx dndx deleted the plugins/add-return-values-for-processes-used-in-batch_queue branch January 10, 2023 03:48
windmgc pushed a commit that referenced this pull request Jan 17, 2023
…e callback functions (#10052)

Add return values for functions that will be processed in the batch queue for these two plugins.
The batch queue considers the result of processing unsuccessful without a return
value, resulting in the batch being reprocessed incorrectly.

FTI-4645
hanshuebner pushed a commit that referenced this pull request Jan 17, 2023
…0126)

* fix(plugins/datadog):  add return value for log function in batch queue (#10044)

The log function in Datadog is called by the batch
queue when a batch is processed, and the batch queue relies
on the return value of the callback. The log function
in Datadog always returns `nil`, which makes the batch queue consider
the result of processing as failed. This commit adds the correct return value
indicating success in processing to fix this bug.

* fix(plugins/statsd & opentelemetry): add return values for batch queue callback functions (#10052)

Add return values for functions that will be processed in the batch queue for these two plugins.
The batch queue considers the result of processing unsuccessful without a return
value, resulting in the batch being reprocessed incorrectly.

FTI-4645

Co-authored-by: Robin Xiang <[email protected]>
curiositycasualty pushed a commit that referenced this pull request Oct 15, 2024
…ected when defining an object parameter with form style and exploded. (#10052)

FTI-6179