[Fleet] Add source information to configuration settings #20591
Conversation
Bloop Bleep... Dogbot Here

Regression Detector Results

Run ID: 81956b41-e370-4435-ae36-bc22042f4b18

Explanation

A regression test is an integrated performance test. Because a target's optimization goal performance in each experiment will vary somewhat each time it is run, we can only estimate mean differences in optimization goal relative to the baseline target. We express these differences as a percentage change relative to the baseline target, denoted "Δ mean %". These estimates are made to a precision that balances accuracy and cost control. We represent this precision as a 90.00% confidence interval denoted "Δ mean % CI": there is a 90.00% chance that the true value of "Δ mean %" is in that interval. We decide that a change in performance is a "regression" -- a change worth investigating further -- if both of the following two criteria are true:

The table below, if present, lists those experiments that have experienced a statistically significant change in mean optimization goal performance between baseline and comparison SHAs with 90.00% confidence OR have been detected as newly erratic. Negative values of "Δ mean %" mean that baseline is faster, whereas positive values of "Δ mean %" mean that comparison is faster. Results that do not exhibit more than a ±5.00% change in their mean optimization goal are discarded. An experiment is erratic if its coefficient of variation is greater than 0.1. The abbreviated table will be omitted if no interesting change is observed.

No interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%.
LGTM for agent-platform owned files
LGTM on changes to OTel
Looks good for platform integrations!
Looks good for Agent Shared Components
Previous PR (before packages got moved around): #19832
TL;DR
Know which source is setting a configuration parameter
Future consequences
When using the `.Set()` method to change a configuration setting in the agent, you must now add a `model.Source` parameter, keeping in mind the following hierarchy between sources:

CLI > Remote Config > Agent Runtime > Env Var > File > Unknown > Default
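To make the precedence concrete, here is a minimal, self-contained sketch of how a source-aware `Set()` could behave. The `Source` constants and `Config` type below are illustrative stand-ins, not the actual datadog-agent API:

```go
package main

import "fmt"

// Source models the configuration-source hierarchy described above
// (illustrative names, not the real model.Source values).
type Source int

const (
	SourceDefault Source = iota
	SourceUnknown
	SourceFile
	SourceEnvVar
	SourceAgentRuntime
	SourceRemoteConfig
	SourceCLI
)

// Config keeps one value per (setting, source) pair.
type Config struct {
	values map[string]map[Source]string
}

func NewConfig() *Config {
	return &Config{values: map[string]map[Source]string{}}
}

// Set records a value for a setting under a given source.
func (c *Config) Set(key, value string, s Source) {
	if c.values[key] == nil {
		c.values[key] = map[Source]string{}
	}
	c.values[key][s] = value
}

// SetWithoutSource is the legacy path: the source is recorded as Unknown.
func (c *Config) SetWithoutSource(key, value string) {
	c.Set(key, value, SourceUnknown)
}

// Get returns the value from the highest-priority source that set the key.
func (c *Config) Get(key string) string {
	for s := SourceCLI; s >= SourceDefault; s-- {
		if v, ok := c.values[key][s]; ok {
			return v
		}
	}
	return ""
}

func main() {
	c := NewConfig()
	c.Set("log_level", "info", SourceFile)
	c.Set("log_level", "debug", SourceEnvVar)
	fmt.Println(c.Get("log_level")) // env var outranks file, so "debug"
}
```

Note that higher-priority writes shadow lower-priority ones without erasing them, which is what makes a later "unset for source" able to restore the previous value.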
You could also use the `.SetWithoutSource()` method; it will then assume `SourceUnknown`.

If you want to override a setting (e.g. for a unit test), you can now call `.Set()` specifying a source higher in the hierarchy, then use the `.UnsetForSource()` method; the applied setting will fall back to its previous value.

What do the sources correspond to?
- `Default` is the lowest in the hierarchy, and also the Go fallback state (empty string). It is also used when calling `.SetDefault()`.
- `Unknown` is used by the `.SetWithoutSource()` method; you should avoid using it on purpose.
- `File` corresponds to the `datadog.yaml` configuration file.
- `Env Var` corresponds to the environment variables.
- `Agent Runtime` is used when the agent itself changes values of its configuration, not to be confused with runtime settings.
- `Remote Config` is used to remotely change the behaviour of the agent.
- `CLI` is used when a user manually enters commands.

What does this PR do?
- Moves the `Source` type from the `settings` package to `model` (Viper wrapper).
- Updates the `.Set()` method to include a `source` parameter.
- Adds `.SetWithoutSource()` as a legacy way to change a setting without specifying a source.

Motivation
For Fleet Automation, we want to display the configuration of the agent and the source of every setting.
Use case: a customer doesn't understand why, when they changed a setting in the `datadog.yaml` file, it didn't have any effect on the agent (because they had forgotten to unset an `ENV VAR`).

Additional Notes
A lot of the changes move `.Set()` calls to `.SetWithoutSource()`; the more impactful changes are in the following files:

More detailed explanations
Because Viper doesn't support multiple sources for a setting, the workaround is to spawn one Viper instance per configuration source. Every time a setting is changed, we check the value in every instance, and the value from the highest-priority source is applied to the "main" Viper instance.
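The one-store-per-source workaround and the `.UnsetForSource()` fallback can be sketched as follows. This is a minimal mock under the assumptions above: each "layer" here is a plain map standing in for a Viper instance, and all names are illustrative:

```go
package main

import "fmt"

// Priority order of sources, highest first, matching the hierarchy
// described in the PR (illustrative strings, not the real identifiers).
var priority = []string{"cli", "remote-config", "agent-runtime", "env-var", "file", "unknown", "default"}

// One store (standing in for one Viper instance) per source.
type layeredConfig map[string]map[string]string

func newLayered() layeredConfig {
	lc := layeredConfig{}
	for _, s := range priority {
		lc[s] = map[string]string{}
	}
	return lc
}

func (lc layeredConfig) set(source, key, value string) { lc[source][key] = value }

// unsetForSource removes a key from one source's store; reads then fall
// back to the next source in the hierarchy, mirroring `.UnsetForSource()`.
func (lc layeredConfig) unsetForSource(source, key string) { delete(lc[source], key) }

// resolve walks the hierarchy and returns the first value found, i.e.
// what would be pushed to the "main" instance.
func (lc layeredConfig) resolve(key string) (string, bool) {
	for _, s := range priority {
		if v, ok := lc[s][key]; ok {
			return v, true
		}
	}
	return "", false
}

func main() {
	lc := newLayered()
	lc.set("file", "log_level", "info")
	lc.set("cli", "log_level", "debug") // override, e.g. in a unit test
	v, _ := lc.resolve("log_level")
	fmt.Println(v) // "debug": CLI wins

	lc.unsetForSource("cli", "log_level")
	v, _ = lc.resolve("log_level")
	fmt.Println(v) // "info": falls back to the file layer
}
```

The key design point is that lower-priority values are never overwritten, only shadowed, so removing the higher-priority value restores the previous effective setting.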
Only the `Set` core concept is changed; the `Get` behavior does not change.

Possible Drawbacks / Trade-offs
Describe how to test/QA your changes
Reviewer's Checklist
- Triage milestone is set.
- Use the `major_change` label if your change either has a major impact on the code base, impacts multiple teams, or changes important well-established internals of the Agent. This label will be used during QA to make sure each team pays extra attention to the changed behavior. For any customer-facing change, use a release note.
- The `changelog/no-changelog` label has been applied.
- The `qa/skip-qa` label is not applied.
- A `team/..` label has been applied, indicating the team(s) that should QA this change.
- The `need-change/operator` and `need-change/helm` labels have been applied.
- The `k8s/<min-version>` label has been applied, indicating the lowest Kubernetes version compatible with this feature.