increase memory request for minimize BTF jobs #30433

Merged: 1 commit merged into main from bryce.kahle/increase-btf-memory on Oct 24, 2024

Conversation

@brycekahle (Member)

What does this PR do?

Increases the memory request and limit for the minimized-BTF generation jobs from 32Gi to 64Gi.

Motivation

This job has started OOMing recently.

Describe how to test/QA your changes

Possible Drawbacks / Trade-offs

Reducing parallelism would also reduce memory usage, but would increase runtime.
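
For illustration, that rejected alternative would look roughly like this in the job variables (hypothetical values; this assumes the minimization task sizes its worker pool from the CPUs the pod requests, which the PR does not state):

```yaml
variables:
  KUBERNETES_CPU_REQUEST: 12       # hypothetical: fewer CPUs -> fewer parallel workers
  KUBERNETES_MEMORY_REQUEST: 32Gi  # memory could then stay at the old value
  KUBERNETES_MEMORY_LIMIT: 32Gi    # at the cost of a longer wall-clock runtime
```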

Additional Notes

@brycekahle added the changelog/no-changelog, team/ebpf-platform, and qa/done (QA done before merge and regressions are covered by tests) labels on Oct 23, 2024
@brycekahle added this to the 7.60.0 milestone on Oct 23, 2024
@brycekahle requested review from a team as code owners on Oct 23, 2024 at 18:23
@agent-platform-auto-pr (Contributor)

Gitlab CI Configuration Changes

Modified Jobs

.generate_minimized_btfs_common
  .generate_minimized_btfs_common:
    artifacts:
      expire_in: 2 weeks
      paths:
      - $CI_PROJECT_DIR/minimized-btfs.tar.xz
    image: 486234852809.dkr.ecr.us-east-1.amazonaws.com/ci/datadog-agent-buildimages/btf-gen$DATADOG_AGENT_BTF_GEN_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BTF_GEN_BUILDIMAGES
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - cd $CI_PROJECT_DIR
    - export BTFS_ETAG=$(aws s3api head-object --region us-east-1 --bucket dd-agent-omnibus
      --key btfs/$BTFHUB_ARCHIVE_BRANCH/btfs-$ARCH.tar --query ETag --output text |
      tr -d \")
    - export OUTPUTS_HASH=$(sha256sum sysprobe-build-outputs.tar.xz.sum | cut -d' '
      -f1)
    - export MIN_BTFS_FILENAME=minimized-btfs-$BTFS_ETAG-$OUTPUTS_HASH.tar.xz
    - "# if running all builds, or this is a release branch, skip the cache check\n\
      if [[ \"$RUN_ALL_BUILDS\" != \"true\" && ! $CI_COMMIT_BRANCH =~ /^[0-9]+\\.[0-9]+\\\
      .x$/ ]]; then\n  if aws s3api head-object --region us-east-1 --bucket dd-ci-artefacts-build-stable\
      \ --key $CI_PROJECT_NAME/btfs/$MIN_BTFS_FILENAME; then\n    $S3_CP_CMD $S3_PROJECT_ARTIFACTS_URI/btfs/$MIN_BTFS_FILENAME\
      \ $CI_PROJECT_DIR/minimized-btfs.tar.xz\n    echo \"cached minimized BTFs exist\"\
      \n    exit 0\n  fi\nfi\n"
    - $S3_CP_CMD $S3_DD_AGENT_OMNIBUS_BTFS_URI/$BTFHUB_ARCHIVE_BRANCH/btfs-$ARCH.tar
      .
    - tar -xf btfs-$ARCH.tar
    - tar -xf sysprobe-build-outputs.tar.xz
    - inv -e system-probe.generate-minimized-btfs --source-dir "$CI_PROJECT_DIR/btfs-$ARCH"
      --output-dir "$CI_PROJECT_DIR/minimized-btfs" --bpf-programs "$CI_PROJECT_DIR/pkg/ebpf/bytecode/build/${ARCH}/co-re"
    - cd minimized-btfs
    - tar -cJf $CI_PROJECT_DIR/minimized-btfs.tar.xz *
    - $S3_CP_CMD $CI_PROJECT_DIR/minimized-btfs.tar.xz $S3_PROJECT_ARTIFACTS_URI/btfs/$MIN_BTFS_FILENAME
    stage: package_deps_build
    tags:
    - arch:amd64
    variables:
      KUBERNETES_CPU_REQUEST: 24
-     KUBERNETES_MEMORY_LIMIT: 32Gi
+     KUBERNETES_MEMORY_LIMIT: 64Gi
-     KUBERNETES_MEMORY_REQUEST: 32Gi
+     KUBERNETES_MEMORY_REQUEST: 64Gi
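
For readability, here is the cache-check step from the script above (it is identical in all three jobs), with the YAML block-scalar escaping removed; the logic is unchanged. The cache key combines the BTF archive's S3 ETag with a hash of the system-probe build outputs, so a hit means these exact inputs have already been minimized:

```bash
# if running all builds, or this is a release branch, skip the cache check
if [[ "$RUN_ALL_BUILDS" != "true" && ! $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/ ]]; then
  # A cached artifact exists iff an object with the same ETag+hash name is in S3.
  if aws s3api head-object --region us-east-1 --bucket dd-ci-artefacts-build-stable \
      --key $CI_PROJECT_NAME/btfs/$MIN_BTFS_FILENAME; then
    # Reuse the cached tarball and skip the expensive minimization entirely.
    $S3_CP_CMD $S3_PROJECT_ARTIFACTS_URI/btfs/$MIN_BTFS_FILENAME $CI_PROJECT_DIR/minimized-btfs.tar.xz
    echo "cached minimized BTFs exist"
    exit 0
  fi
fi
```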
generate_minimized_btfs_arm64
  generate_minimized_btfs_arm64:
    artifacts:
      expire_in: 2 weeks
      paths:
      - $CI_PROJECT_DIR/minimized-btfs.tar.xz
    image: 486234852809.dkr.ecr.us-east-1.amazonaws.com/ci/datadog-agent-buildimages/btf-gen$DATADOG_AGENT_BTF_GEN_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BTF_GEN_BUILDIMAGES
    needs:
    - build_system-probe-arm64
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - cd $CI_PROJECT_DIR
    - export BTFS_ETAG=$(aws s3api head-object --region us-east-1 --bucket dd-agent-omnibus
      --key btfs/$BTFHUB_ARCHIVE_BRANCH/btfs-$ARCH.tar --query ETag --output text |
      tr -d \")
    - export OUTPUTS_HASH=$(sha256sum sysprobe-build-outputs.tar.xz.sum | cut -d' '
      -f1)
    - export MIN_BTFS_FILENAME=minimized-btfs-$BTFS_ETAG-$OUTPUTS_HASH.tar.xz
    - "# if running all builds, or this is a release branch, skip the cache check\n\
      if [[ \"$RUN_ALL_BUILDS\" != \"true\" && ! $CI_COMMIT_BRANCH =~ /^[0-9]+\\.[0-9]+\\\
      .x$/ ]]; then\n  if aws s3api head-object --region us-east-1 --bucket dd-ci-artefacts-build-stable\
      \ --key $CI_PROJECT_NAME/btfs/$MIN_BTFS_FILENAME; then\n    $S3_CP_CMD $S3_PROJECT_ARTIFACTS_URI/btfs/$MIN_BTFS_FILENAME\
      \ $CI_PROJECT_DIR/minimized-btfs.tar.xz\n    echo \"cached minimized BTFs exist\"\
      \n    exit 0\n  fi\nfi\n"
    - $S3_CP_CMD $S3_DD_AGENT_OMNIBUS_BTFS_URI/$BTFHUB_ARCHIVE_BRANCH/btfs-$ARCH.tar
      .
    - tar -xf btfs-$ARCH.tar
    - tar -xf sysprobe-build-outputs.tar.xz
    - inv -e system-probe.generate-minimized-btfs --source-dir "$CI_PROJECT_DIR/btfs-$ARCH"
      --output-dir "$CI_PROJECT_DIR/minimized-btfs" --bpf-programs "$CI_PROJECT_DIR/pkg/ebpf/bytecode/build/${ARCH}/co-re"
    - cd minimized-btfs
    - tar -cJf $CI_PROJECT_DIR/minimized-btfs.tar.xz *
    - $S3_CP_CMD $CI_PROJECT_DIR/minimized-btfs.tar.xz $S3_PROJECT_ARTIFACTS_URI/btfs/$MIN_BTFS_FILENAME
    stage: package_deps_build
    tags:
    - arch:amd64
    variables:
      ARCH: arm64
      KUBERNETES_CPU_REQUEST: 24
-     KUBERNETES_MEMORY_LIMIT: 32Gi
+     KUBERNETES_MEMORY_LIMIT: 64Gi
-     KUBERNETES_MEMORY_REQUEST: 32Gi
+     KUBERNETES_MEMORY_REQUEST: 64Gi
generate_minimized_btfs_x64
  generate_minimized_btfs_x64:
    artifacts:
      expire_in: 2 weeks
      paths:
      - $CI_PROJECT_DIR/minimized-btfs.tar.xz
    image: 486234852809.dkr.ecr.us-east-1.amazonaws.com/ci/datadog-agent-buildimages/btf-gen$DATADOG_AGENT_BTF_GEN_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BTF_GEN_BUILDIMAGES
    needs:
    - build_system-probe-x64
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - cd $CI_PROJECT_DIR
    - export BTFS_ETAG=$(aws s3api head-object --region us-east-1 --bucket dd-agent-omnibus
      --key btfs/$BTFHUB_ARCHIVE_BRANCH/btfs-$ARCH.tar --query ETag --output text |
      tr -d \")
    - export OUTPUTS_HASH=$(sha256sum sysprobe-build-outputs.tar.xz.sum | cut -d' '
      -f1)
    - export MIN_BTFS_FILENAME=minimized-btfs-$BTFS_ETAG-$OUTPUTS_HASH.tar.xz
    - "# if running all builds, or this is a release branch, skip the cache check\n\
      if [[ \"$RUN_ALL_BUILDS\" != \"true\" && ! $CI_COMMIT_BRANCH =~ /^[0-9]+\\.[0-9]+\\\
      .x$/ ]]; then\n  if aws s3api head-object --region us-east-1 --bucket dd-ci-artefacts-build-stable\
      \ --key $CI_PROJECT_NAME/btfs/$MIN_BTFS_FILENAME; then\n    $S3_CP_CMD $S3_PROJECT_ARTIFACTS_URI/btfs/$MIN_BTFS_FILENAME\
      \ $CI_PROJECT_DIR/minimized-btfs.tar.xz\n    echo \"cached minimized BTFs exist\"\
      \n    exit 0\n  fi\nfi\n"
    - $S3_CP_CMD $S3_DD_AGENT_OMNIBUS_BTFS_URI/$BTFHUB_ARCHIVE_BRANCH/btfs-$ARCH.tar
      .
    - tar -xf btfs-$ARCH.tar
    - tar -xf sysprobe-build-outputs.tar.xz
    - inv -e system-probe.generate-minimized-btfs --source-dir "$CI_PROJECT_DIR/btfs-$ARCH"
      --output-dir "$CI_PROJECT_DIR/minimized-btfs" --bpf-programs "$CI_PROJECT_DIR/pkg/ebpf/bytecode/build/${ARCH}/co-re"
    - cd minimized-btfs
    - tar -cJf $CI_PROJECT_DIR/minimized-btfs.tar.xz *
    - $S3_CP_CMD $CI_PROJECT_DIR/minimized-btfs.tar.xz $S3_PROJECT_ARTIFACTS_URI/btfs/$MIN_BTFS_FILENAME
    stage: package_deps_build
    tags:
    - arch:amd64
    variables:
      ARCH: x86_64
      KUBERNETES_CPU_REQUEST: 24
-     KUBERNETES_MEMORY_LIMIT: 32Gi
+     KUBERNETES_MEMORY_LIMIT: 64Gi
-     KUBERNETES_MEMORY_REQUEST: 32Gi
+     KUBERNETES_MEMORY_REQUEST: 64Gi
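
The KUBERNETES_CPU_REQUEST, KUBERNETES_MEMORY_REQUEST, and KUBERNETES_MEMORY_LIMIT variables are GitLab Kubernetes-executor resource overwrites; assuming the runner permits these overwrites, the build container for each job ends up scheduled with roughly the following resources (a sketch, not the literal pod spec):

```yaml
resources:
  requests:
    cpu: "24"      # KUBERNETES_CPU_REQUEST
    memory: 64Gi   # KUBERNETES_MEMORY_REQUEST (was 32Gi)
  limits:
    memory: 64Gi   # KUBERNETES_MEMORY_LIMIT (was 32Gi)
```

Setting the request equal to the limit means the pod only schedules on a node with 64Gi genuinely available, and the job is OOM-killed only past that same 64Gi ceiling.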

Changes Summary

| Removed | Modified | Added | Renamed |
| ------- | -------- | ----- | ------- |
| 0       | 3        | 0     | 0       |

ℹ️ Diff available in the job log.

@agent-platform-auto-pr (Contributor)

Fast Unit Tests Report

On pipeline 47271813 (CI Visibility). The following jobs did not run any unit tests:

Jobs:
  • tests_deb-arm64-py3
  • tests_deb-x64-py3
  • tests_flavor_dogstatsd_deb-x64
  • tests_flavor_heroku_deb-x64
  • tests_flavor_iot_deb-x64
  • tests_rpm-arm64-py3
  • tests_rpm-x64-py3
  • tests_windows-x64

If you modified Go files and expected unit tests to run in these jobs, please double-check the job logs. If you think tests should have been executed, reach out to #agent-devx-help


Regression Detector

Regression Detector Results

Run ID: 54a91a05-2d7a-4959-990a-b36b0fd12500 · Metrics dashboard · Target profiles

Baseline: 0b8caf8
Comparison: 3696682

Performance changes are noted in the perf column of each table:

  • ✅ = significantly better comparison variant performance
  • ❌ = significantly worse comparison variant performance
  • ➖ = no significant change in performance

No significant changes in experiment optimization goals

Confidence level: 90.00%
Effect size tolerance: |Δ mean %| ≥ 5.00%

There were no significant changes in experiment optimization goals at this confidence level and effect size tolerance.

Fine details of change detection per experiment

| perf | experiment | goal | Δ mean % | Δ mean % CI | trials | links |
| ---- | ---------- | ---- | -------- | ----------- | ------ | ----- |
| ➖ | quality_gate_idle_all_features | memory utilization | +1.08 | [+0.96, +1.21] | 1 | Logs, bounds checks dashboard |
| ➖ | file_tree | memory utilization | +0.84 | [+0.70, +0.99] | 1 | Logs |
| ➖ | tcp_syslog_to_blackhole | ingress throughput | +0.29 | [+0.23, +0.35] | 1 | Logs |
| ➖ | file_to_blackhole_500ms_latency | egress throughput | +0.16 | [-0.08, +0.41] | 1 | Logs |
| ➖ | pycheck_lots_of_tags | % cpu utilization | +0.15 | [-2.38, +2.68] | 1 | Logs |
| ➖ | idle | memory utilization | +0.12 | [+0.06, +0.17] | 1 | Logs, bounds checks dashboard |
| ➖ | file_to_blackhole_300ms_latency | egress throughput | +0.09 | [-0.09, +0.27] | 1 | Logs |
| ➖ | file_to_blackhole_100ms_latency | egress throughput | +0.01 | [-0.21, +0.24] | 1 | Logs |
| ➖ | uds_dogstatsd_to_api | ingress throughput | +0.01 | [-0.09, +0.10] | 1 | Logs |
| ➖ | tcp_dd_logs_filter_exclude | ingress throughput | +0.00 | [-0.01, +0.01] | 1 | Logs |
| ➖ | file_to_blackhole_0ms_latency | egress throughput | -0.00 | [-0.34, +0.33] | 1 | Logs |
| ➖ | otel_to_otel_logs | ingress throughput | -0.07 | [-0.87, +0.73] | 1 | Logs |
| ➖ | quality_gate_idle | memory utilization | -0.20 | [-0.24, -0.15] | 1 | Logs, bounds checks dashboard |
| ➖ | file_to_blackhole_1000ms_latency | egress throughput | -0.27 | [-0.75, +0.21] | 1 | Logs |
| ➖ | idle_all_features | memory utilization | -0.33 | [-0.43, -0.23] | 1 | Logs, bounds checks dashboard |
| ➖ | basic_py_check | % cpu utilization | -1.17 | [-3.89, +1.54] | 1 | Logs |
| ➖ | uds_dogstatsd_to_api_cpu | % cpu utilization | -1.69 | [-2.40, -0.98] | 1 | Logs |

Bounds Checks

| perf | experiment | bounds_check_name | replicates_passed |
| ---- | ---------- | ----------------- | ----------------- |
| ✅ | file_to_blackhole_0ms_latency | memory_usage | 10/10 |
| ✅ | file_to_blackhole_1000ms_latency | memory_usage | 10/10 |
| ✅ | file_to_blackhole_100ms_latency | memory_usage | 10/10 |
| ✅ | file_to_blackhole_300ms_latency | memory_usage | 10/10 |
| ✅ | file_to_blackhole_500ms_latency | memory_usage | 10/10 |
| ✅ | idle | memory_usage | 10/10 |
| ✅ | idle_all_features | memory_usage | 10/10 |
| ✅ | quality_gate_idle | memory_usage | 10/10 |
| ✅ | quality_gate_idle_all_features | memory_usage | 10/10 |

Explanation

A regression test is an A/B test of target performance in a repeatable rig, where "performance" is measured as "comparison variant minus baseline variant" for an optimization goal (e.g., ingress throughput). Due to intrinsic variability in measuring that goal, we can only estimate its mean value for each experiment; we report uncertainty in that value as a 90.00% confidence interval denoted "Δ mean % CI".

For each experiment, we decide whether a change in performance is a "regression" -- a change worth investigating further -- if all of the following criteria are true:

  1. Its estimated |Δ mean %| ≥ 5.00%, indicating the change is big enough to merit a closer look.

  2. Its 90.00% confidence interval "Δ mean % CI" does not contain zero, indicating that if our statistical model is accurate, there is at least a 90.00% chance there is a difference in performance between baseline and comparison variants.

  3. Its configuration does not mark it "erratic".
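
Restated compactly: with $\widehat{\Delta}_e$ denoting the estimated mean percent change for experiment $e$, a regression is flagged only when all of

$$
\bigl|\widehat{\Delta}_e\bigr| \ge 5.00\%, \qquad 0 \notin \mathrm{CI}_{90\%}\bigl(\widehat{\Delta}_e\bigr), \qquad e \ \text{not marked erratic}
$$

hold simultaneously. Every experiment in the table above fails the first criterion (the largest observed $|\Delta\text{ mean }\%|$ is 1.69), which is why none is flagged.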

@brycekahle (Member, Author)

/merge

@dd-devflow (bot) commented Oct 24, 2024

🚂 MergeQueue: pull request added to the queue

The median merge time in main is 22m.

Use /merge -c to cancel this operation!

@dd-mergequeue (bot) merged commit f1d83bf into main on Oct 24, 2024
239 of 240 checks passed
@dd-mergequeue (bot) deleted the bryce.kahle/increase-btf-memory branch on Oct 24, 2024 at 19:27