feat(http_server source): add all headers to the namespace metadata #18922
Conversation
Thanks. It's not completely clear what the use case for this would be, and it does add some extra processing overhead. I'm not seeing a relevant issue that this is solving. Would it be possible to raise one and add all the relevant details? We can discuss there whether there is a need for this.
Filed an issue with more details about why we wanted this.
We had a discussion around this change, and whilst we can see it would be useful for some, it is definitely not something that we want to have enabled all the time due to the overhead it introduces.

The way we would like to have this working is to change the `headers` config to allow for wildcards. You can use the `glob` crate to do the matching.

I'm not sure quite what the overhead of running the matching in `glob` will be. It may be worth detecting at load time whether the header contains any wildcards: if it does, use `glob`; otherwise use `==` like it currently does.

This way, if you want all the headers you can specify `headers: ["*"]`, and other users will be unaffected.

Possibly a little more involved than you initially anticipated, but this provides the best way forward for everyone. Let me know if you have any questions.
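The load-time split described here (plain `==` unless the configured entry contains a wildcard) could be sketched roughly as below. This is an illustrative stand-in, not the PR's code: the enum and function names are made up, and a tiny hand-rolled `*`-only matcher takes the place of the `glob` crate so the example is self-contained.

```rust
// Sketch only: decide once, at config-load time, whether each configured
// header entry needs glob-style matching or plain equality.
#[derive(Debug)]
enum HeaderMatcher {
    Exact(String),    // matched with `==`, as before the change
    Wildcard(String), // entry contains `*`, matched with a glob
}

impl HeaderMatcher {
    fn new(config_entry: &str) -> Self {
        // Header names are case-insensitive (RFC 2616), so normalize once here.
        let entry = config_entry.to_ascii_lowercase();
        if entry.contains('*') {
            HeaderMatcher::Wildcard(entry)
        } else {
            HeaderMatcher::Exact(entry)
        }
    }

    fn matches(&self, header_name: &str) -> bool {
        let name = header_name.to_ascii_lowercase();
        match self {
            HeaderMatcher::Exact(e) => *e == name,
            HeaderMatcher::Wildcard(p) => wildcard_match(p, &name),
        }
    }
}

// Minimal `*`-only glob, standing in for `glob::Pattern::matches`.
fn wildcard_match(pattern: &str, text: &str) -> bool {
    let parts: Vec<&str> = pattern.split('*').collect();
    if parts.len() == 1 {
        return pattern == text; // no `*` at all
    }
    // First segment must anchor at the start.
    let Some(mut rest) = text.strip_prefix(parts[0]) else {
        return false;
    };
    // Middle segments may appear anywhere, in order.
    for part in &parts[1..parts.len() - 1] {
        match rest.find(*part) {
            Some(i) => rest = &rest[i + part.len()..],
            None => return false,
        }
    }
    // Last segment must anchor at the end.
    rest.ends_with(parts[parts.len() - 1])
}

fn main() {
    // `headers: ["User-Agent", "X-*"]` from a hypothetical config.
    let matchers: Vec<HeaderMatcher> =
        ["User-Agent", "X-*"].iter().map(|s| HeaderMatcher::new(s)).collect();
    for header in ["user-agent", "x-request-id", "authorization"] {
        let kept = matchers.iter().any(|m| m.matches(header));
        println!("{header}: kept = {kept}");
    }
}
```

With `headers: ["*"]` this degrades to a single `Wildcard` entry that matches everything, while entries without `*` keep the old `==` fast path.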
@StephenWakely & @neuronull Does something like this look better?
@neuronull Updated this PR: I replaced the extra field with an enum and a function to build a vector of them from a `Vec`. I think that should address the concerns about resource cost. I also added a couple of lines to the docs and the headers test, although I cannot for the life of me find the docs on how to actually run the test suite.
Thanks for iterating on the solution @sonnens. The most direct way is … ; it looks like the modified test case is failing. You'll also want to run … . The format lints can be fixed with … .
@neuronull here you go. This should take care of those concerns. It'll now bail on error if `glob::Pattern` fails, I made the docs changes you requested, added a test for matching everything, and fixed the tests, which work now. In doing so, I double-checked against RFC 2616 that header keys are case-insensitive but header values are case-sensitive, so I added a test for that as well.
One non-blocking grammar nit.
Just some small edits to the user-facing docs, and I added some code comments, but otherwise this looks good to me.
Regression Detector Results

Run ID: 3a9d86f1-c04b-4636-b4a9-42f7df1de8dc

Explanation: A regression test is an A/B test of target performance in a repeatable rig, where "performance" is measured as "comparison variant minus baseline variant" for an optimization goal (e.g., ingress throughput). Due to intrinsic variability in measuring that goal, we can only estimate its mean value for each experiment; we report uncertainty in that value as a 90.00% confidence interval denoted "Δ mean % CI". For each experiment, we decide whether a change in performance is a "regression" (a change worth investigating further) if all of the following criteria are true:

No interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%. Some experiments that were not declared erratic were detected as being so (coefficient of variation cutoff 0.10); see the fine details of change detection per experiment.
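The decision rule the bot describes (estimate Δ mean as a percentage of baseline, attach a 90% confidence interval, and only flag changes with |Δ mean %| ≥ 5 at ≥ 90% confidence) can be sketched in plain Rust. This is a simplified stand-in using a normal approximation, not the detector's actual statistics:

```rust
// Sketch of the regression detector's decision rule: estimate
// "comparison minus baseline" for an optimization goal and report a 90%
// confidence interval on the mean difference, in percent of baseline.

fn mean(xs: &[f64]) -> f64 {
    xs.iter().sum::<f64>() / xs.len() as f64
}

// Standard error of the sample mean (sample std dev / sqrt(n)).
fn std_err(xs: &[f64]) -> f64 {
    let m = mean(xs);
    let var = xs.iter().map(|x| (x - m).powi(2)).sum::<f64>() / (xs.len() as f64 - 1.0);
    (var / xs.len() as f64).sqrt()
}

/// Returns (delta_mean_pct, ci_low_pct, ci_high_pct) for a 90% CI,
/// using the normal approximation (z ≈ 1.645).
fn delta_mean_ci(baseline: &[f64], comparison: &[f64]) -> (f64, f64, f64) {
    let b = mean(baseline);
    let c = mean(comparison);
    let delta_pct = (c - b) / b * 100.0;
    // Standard error of the difference of means, scaled to percent of baseline.
    let se = (std_err(baseline).powi(2) + std_err(comparison).powi(2)).sqrt();
    let half = 1.645 * se / b * 100.0;
    (delta_pct, delta_pct - half, delta_pct + half)
}

fn main() {
    let baseline = [100.0, 101.0, 99.0, 100.5];
    let comparison = [106.0, 107.0, 105.5, 106.5];
    let (d, lo, hi) = delta_mean_ci(&baseline, &comparison);
    // "Worth investigating" only if |Δ mean %| ≥ 5 and the CI excludes zero
    // (i.e. ≥ 90% confidence in the sign of the change).
    let interesting = d.abs() >= 5.0 && (lo > 0.0 || hi < 0.0);
    println!("Δ mean % = {d:.2}, 90% CI = [{lo:.2}, {hi:.2}], interesting = {interesting}");
}
```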
Regression Detector Results (Run ID: 6c317e81-ee64-4af5-a943-b2a672d907c7; explanation as in the first report above). Changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00% were detected, and some experiments that were not declared erratic were detected as being so (coefficient of variation cutoff 0.10).
CI in the merge queue failed due to a "too many requests" error returned from Docker. Take 2.
Regression Detector Results (Run ID: 8c3b1915-8aae-4b65-a78e-9bba6df0ec21; explanation as in the first report above). No interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%; some experiments that were not declared erratic were detected as being so (coefficient of variation cutoff 0.10).
Regression Detector Results (Run ID: b6985e6f-ed10-4f20-847f-ee1946afe760; explanation as in the first report above). No interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%; some experiments that were not declared erratic were detected as being so (coefficient of variation cutoff 0.10).
Regression Detector Results (Run ID: 54209fdd-ba45-4242-82c5-f94424a0cf19; explanation as in the first report above). Changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00% were detected, and some experiments that were not declared erratic were detected as being so (coefficient of variation cutoff 0.10).
This was kicked out of the merge queue due to a known issue that we hope to resolve soon: there was a performance improvement on one of the performance regression test cases, which was incorrectly perceived as a failure.
Regression Detector Results (Run ID: d572889c-f81a-4cb2-85f1-8eb8eb48c7c4; explanation as in the first report above). No interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%; some experiments that were not declared erratic were detected as being so (coefficient of variation cutoff 0.10).
Thanks for the contribution @sonnens!
…ectordotdev#18922)

* feat(http_server source): add all headers to the namespace metadata
* feat(http_server source): allow wildcard matching in headers
* style: whitespace typo
* rework header glob matching, add docs and tests
* examples, docs, tests, error on misconfiguration
* fmt & clippy cleanup
* Generate docs
* docs grammar adjustment
* add some code docs

Ref: LOG-19103

Signed-off-by: Jesse Szwedko <[email protected]>
Co-authored-by: Jesse Szwedko <[email protected]>
Co-authored-by: neuronull <[email protected]>
# [3.2.0](answerbook/vector@v3.1.3...v3.2.0) (2024-01-24)

### Bug Fixes

* **sources**: `http_server` is not saving query params as key/val [733f3a9](answerbook/vector@733f3a9) - Darin Spivey [LOG-19104](https://logdna.atlassian.net/browse/LOG-19104)

### Features

* **http_server source**: add all headers to the namespace metadata (vectordotdev#18922) [3772b19](answerbook/vector@3772b19) - Darin Spivey [LOG-19103](https://logdna.atlassian.net/browse/LOG-19103)
* **sources**: `http_server` accepts query parameter wildcards [6627a95](answerbook/vector@6627a95) - Darin Spivey [LOG-19105](https://logdna.atlassian.net/browse/LOG-19105)

### Miscellaneous

* Merge pull request vectordotdev#403 from answerbook/darinspivey/LOG-19103 [896eaed](answerbook/vector@896eaed) - GitHub [LOG-19103](https://logdna.atlassian.net/browse/LOG-19103)
This adds all of the HTTP headers to the source metadata if `log_namespace` is enabled.
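For context, a minimal sketch of how the finished feature might be used in a Vector config. The option names and placement here are assumptions to be checked against the `http_server` source documentation, not copied from it:

```yaml
# Hypothetical example only. `headers: ["*"]` uses the wildcard form
# discussed above; entries without `*` still use plain equality matching.
sources:
  in_http:
    type: http_server
    address: 0.0.0.0:8080
    headers: ["*"]        # forward every request header into event metadata
    log_namespace: true   # headers land in the namespaced source metadata
```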