
Delete test/kitchen folder #31525

Merged: 17 commits into main from kfairise/kitchen-remove-everything, Dec 6, 2024

Conversation

@KevinFairise2 (Member) commented Nov 27, 2024

What does this PR do?

Delete the test/kitchen folder and update the tests that used it to store artifacts.

Motivation

Clean up tooling that we no longer use.

Describe how to test/QA your changes

Possible Drawbacks / Trade-offs

Additional Notes


agent-platform-auto-pr bot commented Nov 27, 2024

Test changes on VM

Use this command from test-infra-definitions to manually test this PR's changes on a VM:

inv aws.create-vm --pipeline-id=50418491 --os-family=ubuntu

Note: This applies to commit 47c828f


cit-pr-commenter bot commented Nov 27, 2024

Regression Detector Results

Metrics dashboard
Target profiles
Run ID: 919b299e-c683-412d-b54e-ed03ce65845a

Baseline: 48cbbbf
Comparison: 47c828f
Diff

Optimization Goals: ✅ No significant changes detected

Fine details of change detection per experiment

| perf experiment                              | goal               | Δ mean % | Δ mean % CI    | trials | links                         |
|----------------------------------------------|--------------------|----------|----------------|--------|-------------------------------|
| uds_dogstatsd_to_api_cpu                     | % cpu utilization  | +1.41    | [+0.69, +2.13] | 1      | Logs                          |
| otel_to_otel_logs                            | ingress throughput | +0.90    | [+0.25, +1.55] | 1      | Logs                          |
| file_to_blackhole_300ms_latency              | egress throughput  | +0.15    | [-0.49, +0.78] | 1      | Logs                          |
| file_to_blackhole_1000ms_latency             | egress throughput  | +0.13    | [-0.67, +0.92] | 1      | Logs                          |
| file_to_blackhole_100ms_latency              | egress throughput  | +0.07    | [-0.64, +0.78] | 1      | Logs                          |
| file_to_blackhole_0ms_latency                | egress throughput  | +0.03    | [-0.81, +0.88] | 1      | Logs                          |
| tcp_dd_logs_filter_exclude                   | ingress throughput | -0.00    | [-0.01, +0.01] | 1      | Logs                          |
| uds_dogstatsd_to_api                         | ingress throughput | -0.02    | [-0.10, +0.07] | 1      | Logs                          |
| quality_gate_idle                            | memory utilization | -0.04    | [-0.09, +0.00] | 1      | Logs, bounds checks dashboard |
| file_to_blackhole_500ms_latency              | egress throughput  | -0.09    | [-0.86, +0.67] | 1      | Logs                          |
| file_to_blackhole_1000ms_latency_linear_load | egress throughput  | -0.21    | [-0.68, +0.25] | 1      | Logs                          |
| quality_gate_idle_all_features               | memory utilization | -0.32    | [-0.44, -0.21] | 1      | Logs, bounds checks dashboard |
| tcp_syslog_to_blackhole                      | ingress throughput | -0.34    | [-0.41, -0.28] | 1      | Logs                          |
| file_tree                                    | memory utilization | -0.63    | [-0.77, -0.49] | 1      | Logs                          |
| quality_gate_logs                            | % cpu utilization  | -3.75    | [-6.64, -0.85] | 1      | Logs                          |

Bounds Checks: ❌ Failed

| perf experiment                              | bounds_check_name | replicates_passed | links                   |
|----------------------------------------------|-------------------|-------------------|-------------------------|
| file_to_blackhole_0ms_latency                | lost_bytes        | 9/10              |                         |
| file_to_blackhole_0ms_latency                | memory_usage      | 10/10             |                         |
| file_to_blackhole_1000ms_latency             | memory_usage      | 10/10             |                         |
| file_to_blackhole_1000ms_latency_linear_load | memory_usage      | 10/10             |                         |
| file_to_blackhole_100ms_latency              | lost_bytes        | 10/10             |                         |
| file_to_blackhole_100ms_latency              | memory_usage      | 10/10             |                         |
| file_to_blackhole_300ms_latency              | lost_bytes        | 10/10             |                         |
| file_to_blackhole_300ms_latency              | memory_usage      | 10/10             |                         |
| file_to_blackhole_500ms_latency              | lost_bytes        | 10/10             |                         |
| file_to_blackhole_500ms_latency              | memory_usage      | 10/10             |                         |
| quality_gate_idle                            | memory_usage      | 10/10             | bounds checks dashboard |
| quality_gate_idle_all_features               | memory_usage      | 10/10             | bounds checks dashboard |
| quality_gate_logs                            | lost_bytes        | 10/10             |                         |
| quality_gate_logs                            | memory_usage      | 10/10             |                         |

Explanation

Confidence level: 90.00%
Effect size tolerance: |Δ mean %| ≥ 5.00%

Performance changes are noted in the perf column of each table:

  • ✅ = significantly better comparison variant performance
  • ❌ = significantly worse comparison variant performance
  • ➖ = no significant change in performance

A regression test is an A/B test of target performance in a repeatable rig, where "performance" is measured as "comparison variant minus baseline variant" for an optimization goal (e.g., ingress throughput). Due to intrinsic variability in measuring that goal, we can only estimate its mean value for each experiment; we report uncertainty in that value as a 90.00% confidence interval denoted "Δ mean % CI".

For each experiment, we decide whether a change in performance is a "regression" -- a change worth investigating further -- if all of the following criteria are true (a short code sketch after this list illustrates the rule):

  1. Its estimated |Δ mean %| ≥ 5.00%, indicating the change is big enough to merit a closer look.

  2. Its 90.00% confidence interval "Δ mean % CI" does not contain zero, indicating that if our statistical model is accurate, there is at least a 90.00% chance there is a difference in performance between baseline and comparison variants.

  3. Its configuration does not mark it "erratic".
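
A minimal sketch of this decision rule in Python, assuming each experiment's result carries the estimated Δ mean %, its confidence interval, and an erratic flag (the names below are illustrative, not the detector's actual API):

from dataclasses import dataclass

@dataclass
class ExperimentResult:
    name: str
    delta_mean_pct: float  # estimated Δ mean %
    ci_low: float          # lower bound of the 90.00% CI, in %
    ci_high: float         # upper bound of the 90.00% CI, in %
    erratic: bool = False  # experiments marked "erratic" are never flagged

EFFECT_SIZE_TOLERANCE = 5.0  # |Δ mean %| threshold used by this report

def is_regression(r: ExperimentResult) -> bool:
    # Criterion 1: the estimated change is big enough to merit a closer look.
    big_enough = abs(r.delta_mean_pct) >= EFFECT_SIZE_TOLERANCE
    # Criterion 2: the 90.00% confidence interval does not contain zero.
    ci_excludes_zero = r.ci_low > 0 or r.ci_high < 0
    # Criterion 3: the experiment is not marked erratic in its configuration.
    return big_enough and ci_excludes_zero and not r.erratic

# From the table above: +1.41 [+0.69, +2.13]. The CI excludes zero, but the
# effect size is below 5.00%, so the change is not flagged as a regression.
assert not is_regression(ExperimentResult("uds_dogstatsd_to_api_cpu", 1.41, 0.69, 2.13))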

CI Pass/Fail Decision

Passed. All Quality Gates passed.

  • quality_gate_idle, bounds check memory_usage: 10/10 replicas passed. Gate passed.
  • quality_gate_logs, bounds check lost_bytes: 10/10 replicas passed. Gate passed.
  • quality_gate_logs, bounds check memory_usage: 10/10 replicas passed. Gate passed.
  • quality_gate_idle_all_features, bounds check memory_usage: 10/10 replicas passed. Gate passed.


agent-platform-auto-pr bot commented Nov 27, 2024

Gitlab CI Configuration Changes

Modified Jobs

stages (configuration)
  stages:
  - .pre
  - setup
  - maintenance_jobs
  - deps_build
  - deps_fetch
  - lint
  - source_test
  - source_test_stats
  - software_composition_analysis
  - binary_build
  - package_deps_build
  - kernel_matrix_testing_prepare
  - kernel_matrix_testing_system_probe
  - kernel_matrix_testing_security_agent
  - kernel_matrix_testing_cleanup
  - integration_test
  - benchmarks
  - package_build
  - packaging
  - pkg_metrics
- - kitchen_deploy
- - kitchen_testing
  - container_build
  - container_scan
  - check_deploy
  - dev_container_deploy
  - deploy_containers
  - deploy_packages
  - deploy_cws_instrumentation
  - deploy_dca
  - choco_and_install_script_build
  - trigger_release
  - choco_and_install_script_deploy
  - internal_image_deploy
+ - e2e_deploy
  - install_script_testing
  - e2e_pre_test
  - e2e_init
  - e2e
  - e2e_cleanup
  - e2e_k8s
  - e2e_install_packages
- - kitchen_cleanup
  - functional_test
- - functional_test_cleanup
  - junit_upload
  - internal_kubernetes_deploy
  - post_rc_build
  - check_merge
  - notify
  - .post
variables (configuration)
  variables:
    AGENT_API_KEY_ORG2: agent-api-key-org-2
    AGENT_APP_KEY_ORG2: agent-ci-app-key-org-2
    AGENT_BINARIES_DIR: bin/agent
    AGENT_GITHUB_APP: agent-github-app
    AGENT_QA_E2E: agent-qa-e2e
    API_KEY_ORG2: ci.datadog-agent.datadog_api_key_org2
    ARTIFACT_DOWNLOAD_ATTEMPTS: 2
    ATLASSIAN_WRITE: atlassian-write
    BTFHUB_ARCHIVE_BRANCH: main
    BUCKET_BRANCH: dev
    CHANGELOG_COMMIT_SHA: ci.datadog-agent.gitlab_changelog_commit_sha
    CHOCOLATEY_API_KEY: ci.datadog-agent.chocolatey_api_key
    CI_IMAGE_BTF_GEN: v50263243-1a30c934
    CI_IMAGE_BTF_GEN_SUFFIX: ''
    CI_IMAGE_DD_AGENT_TESTING: v50263243-1a30c934
    CI_IMAGE_DD_AGENT_TESTING_SUFFIX: ''
    CI_IMAGE_DEB_ARM64: v50263243-1a30c934
    CI_IMAGE_DEB_ARM64_SUFFIX: ''
    CI_IMAGE_DEB_ARMHF: v50263243-1a30c934
    CI_IMAGE_DEB_ARMHF_SUFFIX: ''
    CI_IMAGE_DEB_X64: v50263243-1a30c934
    CI_IMAGE_DEB_X64_SUFFIX: ''
    CI_IMAGE_DOCKER_ARM64: v50263243-1a30c934
    CI_IMAGE_DOCKER_ARM64_SUFFIX: ''
    CI_IMAGE_DOCKER_X64: v50263243-1a30c934
    CI_IMAGE_DOCKER_X64_SUFFIX: ''
    CI_IMAGE_GITLAB_AGENT_DEPLOY: v50263243-1a30c934
    CI_IMAGE_GITLAB_AGENT_DEPLOY_SUFFIX: ''
    CI_IMAGE_LINUX_GLIBC_2_17_X64: v50263243-1a30c934
    CI_IMAGE_LINUX_GLIBC_2_17_X64_SUFFIX: ''
    CI_IMAGE_LINUX_GLIBC_2_23_ARM64: v50263243-1a30c934
    CI_IMAGE_LINUX_GLIBC_2_23_ARM64_SUFFIX: ''
    CI_IMAGE_RPM_ARM64: v50263243-1a30c934
    CI_IMAGE_RPM_ARM64_SUFFIX: ''
    CI_IMAGE_RPM_ARMHF: v50263243-1a30c934
    CI_IMAGE_RPM_ARMHF_SUFFIX: ''
    CI_IMAGE_RPM_X64: v50263243-1a30c934
    CI_IMAGE_RPM_X64_SUFFIX: ''
    CI_IMAGE_SYSTEM_PROBE_ARM64: v50263243-1a30c934
    CI_IMAGE_SYSTEM_PROBE_ARM64_SUFFIX: ''
    CI_IMAGE_SYSTEM_PROBE_X64: v50263243-1a30c934
    CI_IMAGE_SYSTEM_PROBE_X64_SUFFIX: ''
    CI_IMAGE_WIN_1809_X64: v50263243-1a30c934
    CI_IMAGE_WIN_1809_X64_SUFFIX: ''
    CI_IMAGE_WIN_LTSC2022_X64: v50263243-1a30c934
    CI_IMAGE_WIN_LTSC2022_X64_SUFFIX: ''
    CLANG_LLVM_VER: 12.0.1
    CLUSTER_AGENT_BINARIES_DIR: bin/datadog-cluster-agent
    CLUSTER_AGENT_CLOUDFOUNDRY_BINARIES_DIR: bin/datadog-cluster-agent-cloudfoundry
    CODECOV: codecov
    CODECOV_TOKEN: ci.datadog-agent.codecov_token
    CWS_INSTRUMENTATION_BINARIES_DIR: bin/cws-instrumentation
    DATADOG_AGENT_ARMBUILDIMAGES: v50263243-1a30c934
    DATADOG_AGENT_ARMBUILDIMAGES_SUFFIX: ''
    DATADOG_AGENT_BTF_GEN_BUILDIMAGES: v50263243-1a30c934
    DATADOG_AGENT_BTF_GEN_BUILDIMAGES_SUFFIX: ''
    DATADOG_AGENT_BUILDIMAGES: v50263243-1a30c934
    DATADOG_AGENT_BUILDIMAGES_SUFFIX: ''
    DATADOG_AGENT_EMBEDDED_PATH: /opt/datadog-agent/embedded
    DATADOG_AGENT_SYSPROBE_BUILDIMAGES: v50263243-1a30c934
    DATADOG_AGENT_SYSPROBE_BUILDIMAGES_SUFFIX: ''
    DATADOG_AGENT_WINBUILDIMAGES: v50263243-1a30c934
    DATADOG_AGENT_WINBUILDIMAGES_SUFFIX: ''
-   DD_AGENT_TESTING_DIR: $CI_PROJECT_DIR/test/kitchen
+   DD_AGENT_TESTING_DIR: $CI_PROJECT_DIR/test/new-e2e/tests
    DD_PKG_VERSION: latest
    DEB_GPG_KEY: ci.datadog-agent.deb_signing_private_key_${DEB_GPG_KEY_ID}
    DEB_GPG_KEY_ID: c0962c7d
    DEB_GPG_KEY_NAME: Datadog, Inc. APT key
    DEB_RPM_TESTING_BUCKET_BRANCH: testing
    DEB_S3_BUCKET: apt.datad0g.com
    DEB_SIGNING_PASSPHRASE: ci.datadog-agent.deb_signing_key_passphrase_${DEB_GPG_KEY_ID}
    DEB_TESTING_S3_BUCKET: apttesting.datad0g.com
    DOCKER_REGISTRY_LOGIN: ci.datadog-agent.docker_hub_login
    DOCKER_REGISTRY_PWD: ci.datadog-agent.docker_hub_pwd
    DOCKER_REGISTRY_RO: dockerhub-readonly
    DOCKER_REGISTRY_URL: docker.io
    DOGSTATSD_BINARIES_DIR: bin/dogstatsd
    E2E_AZURE: e2e-azure
    E2E_GCP: e2e-gcp
    EXECUTOR_JOB_SECTION_ATTEMPTS: 2
    FF_KUBERNETES_HONOR_ENTRYPOINT: true
    FF_SCRIPT_SECTIONS: 1
    GENERAL_ARTIFACTS_CACHE_BUCKET_URL: https://dd-agent-omnibus.s3.amazonaws.com
    GET_SOURCES_ATTEMPTS: 2
    GITLAB_TOKEN: gitlab-token
    GO_TEST_SKIP_FLAKE: 'true'
    INSTALL_SCRIPT_API_KEY_ORG2: install-script-api-key-org-2
    INTEGRATION_WHEELS_CACHE_BUCKET: dd-agent-omnibus
    KERNEL_MATRIX_TESTING_ARM_AMI_ID: ami-0b5f838a19d37fc61
    KERNEL_MATRIX_TESTING_X86_AMI_ID: ami-05b3973acf5422348
-   KITCHEN_AWS: kitchen-aws
-   KITCHEN_AZURE: kitchen-azure
    KITCHEN_INFRASTRUCTURE_FLAKES_RETRY: 2
    MACOS_GITHUB_APP_1: macos-github-app-one
    MACOS_GITHUB_APP_2: macos-github-app-two
    MACOS_S3_BUCKET: dd-agent-macostesting
    OMNIBUS_BASE_DIR: /omnibus
    OMNIBUS_GIT_CACHE_DIR: /tmp/omnibus-git-cache
    OMNIBUS_PACKAGE_DIR: $CI_PROJECT_DIR/omnibus/pkg/
    OMNIBUS_PACKAGE_DIR_SUSE: $CI_PROJECT_DIR/omnibus/suse/pkg
    PROCESS_S3_BUCKET: datad0g-process-agent
    RELEASE_VERSION_6: nightly
    RELEASE_VERSION_7: nightly-a7
    RESTORE_CACHE_ATTEMPTS: 2
    RPM_GPG_KEY: ci.datadog-agent.rpm_signing_private_key_${RPM_GPG_KEY_ID}
    RPM_GPG_KEY_ID: b01082d3
    RPM_GPG_KEY_NAME: Datadog, Inc. RPM key
    RPM_S3_BUCKET: yum.datad0g.com
    RPM_SIGNING_PASSPHRASE: ci.datadog-agent.rpm_signing_key_passphrase_${RPM_GPG_KEY_ID}
    RPM_TESTING_S3_BUCKET: yumtesting.datad0g.com
    RUN_E2E_TESTS: auto
    RUN_KMT_TESTS: auto
    RUN_UNIT_TESTS: auto
    S3_ARTIFACTS_URI: s3://dd-ci-artefacts-build-stable/$CI_PROJECT_NAME/$CI_PIPELINE_ID
    S3_CP_CMD: aws s3 cp $S3_CP_OPTIONS
    S3_CP_OPTIONS: --no-progress --region us-east-1 --sse AES256
    S3_DD_AGENT_OMNIBUS_BTFS_URI: s3://dd-agent-omnibus/btfs
    S3_DD_AGENT_OMNIBUS_JAVA_URI: s3://dd-agent-omnibus/openjdk
    S3_DD_AGENT_OMNIBUS_LLVM_URI: s3://dd-agent-omnibus/llvm
    S3_DSD6_URI: s3://dsd6-staging
    S3_OMNIBUS_CACHE_BUCKET: dd-ci-datadog-agent-omnibus-cache-build-stable
    S3_PERMANENT_ARTIFACTS_URI: s3://dd-ci-persistent-artefacts-build-stable/$CI_PROJECT_NAME
    S3_PROJECT_ARTIFACTS_URI: s3://dd-ci-artefacts-build-stable/$CI_PROJECT_NAME
    S3_RELEASE_ARTIFACTS_URI: s3://dd-release-artifacts/$CI_PROJECT_NAME/$CI_PIPELINE_ID
    S3_RELEASE_INSTALLER_ARTIFACTS_URI: s3://dd-release-artifacts/datadog-installer/$CI_PIPELINE_ID
    S3_SBOM_STORAGE_URI: s3://sbom-root-us1-ddbuild-io/$CI_PROJECT_NAME/$CI_PIPELINE_ID
    SLACK_AGENT: slack-agent-ci
    SMP_ACCOUNT: smp
    STATIC_BINARIES_DIR: bin/static
    SYSTEM_PROBE_BINARIES_DIR: bin/system-probe
    USE_S3_CACHING: --omnibus-s3-cache
    VCPKG_BLOB_SAS_URL: ci.datadog-agent-buildimages.vcpkg_blob_sas_url
    WINDOWS_BUILDS_S3_BUCKET: $WIN_S3_BUCKET/builds
    WINDOWS_POWERSHELL_DIR: $CI_PROJECT_DIR/signed_scripts
    WINDOWS_TESTING_S3_BUCKET_A6: pipelines/A6/$CI_PIPELINE_ID
    WINDOWS_TESTING_S3_BUCKET_A7: pipelines/A7/$CI_PIPELINE_ID
    WINGET_PAT: ci.datadog-agent.winget_pat
    WIN_S3_BUCKET: dd-agent-mstesting
.deploy_deb_testing-a7
  .deploy_deb_testing-a7:
    before_script:
    - ls $OMNIBUS_PACKAGE_DIR
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/gitlab_agent_deploy$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
-   stage: kitchen_deploy
+   stage: e2e_deploy
    tags:
    - arch:amd64
    variables:
      DD_PIPELINE_ID: $CI_PIPELINE_ID-a7
.deploy_rpm_testing-a7
  .deploy_rpm_testing-a7:
    before_script:
    - ls $OMNIBUS_PACKAGE_DIR
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/gitlab_agent_deploy$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
-   stage: kitchen_deploy
+   stage: e2e_deploy
    tags:
    - arch:amd64
    variables:
      DD_PIPELINE_ID: $CI_PIPELINE_ID-a7
.tests_windows_sysprobe
  .tests_windows_sysprobe:
    artifacts:
      paths:
-     - $DD_AGENT_TESTING_DIR/site-cookbooks/dd-system-probe-check/files
+     - $CI_PROJECT_DIR/test/new-e2e/tests/sysprobe-functional/artifacts
      when: always
    needs:
    - go_deps
    - go_tools_deps
    script:
    - $ErrorActionPreference = "Stop"
    - $_instance_id = (iwr  -UseBasicParsing http://169.254.169.254/latest/meta-data/instance-id).content
      ; Write-Host "Running on instance $($_instance_id)"
    - 'docker run --rm -m 16384M -v "$(Get-Location):c:\mnt" -e AWS_NETWORKING=true
      -e CI_PIPELINE_ID=${CI_PIPELINE_ID} -e CI_PROJECT_NAME=${CI_PROJECT_NAME} -e SIGN_WINDOWS_DD_WCS=true
      -e GOMODCACHE="c:\modcache" -e PIP_INDEX_URL=${PIP_INDEX_URL} registry.ddbuild.io/ci/datadog-agent-buildimages/windows_1809_${ARCH}${Env:DATADOG_AGENT_WINBUILDIMAGES_SUFFIX}:${Env:DATADOG_AGENT_WINBUILDIMAGES}
      c:\mnt\tasks\winbuildscripts\sysprobe.bat
  
      '
    - If ($lastExitCode -ne "0") { throw "Previous command returned $lastExitCode" }
    stage: source_test
    tags:
    - runner:windows-docker
    - windowsversion:1809
deploy_deb_testing-a7_arm64
  deploy_deb_testing-a7_arm64:
    before_script:
    - ls $OMNIBUS_PACKAGE_DIR
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/gitlab_agent_deploy$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - installer_deb-arm64
    - agent_deb-arm64-a7
    - lint_linux-arm64
    rules:
    - if: ($CI_COMMIT_BRANCH == "main"  || $DEPLOY_AGENT == "true" || $RUN_E2E_TESTS
        == "on" || $DDR_WORKFLOW_ID != null) && $RUN_E2E_TESTS != "off"
    - if: $RUN_E2E_TESTS == "off"
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - if: $RUN_E2E_TESTS == "on"
      when: on_success
    - if: $CI_COMMIT_BRANCH == "main"
      when: on_success
    - if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
      when: on_success
    - if: $CI_COMMIT_TAG =~ /^[0-9]+\.[0-9]+\.[0-9]+-rc\.[0-9]+$/
      when: on_success
    - changes:
        compare_to: main
        paths:
        - .gitlab/e2e/e2e.yml
        - test/new-e2e/pkg/**/*
        - test/new-e2e/go.mod
        - flakes.yaml
    - changes:
        compare_to: main
        paths:
        - .gitlab/**/*
        - omnibus/config/**/*
        - pkg/fleet/**/*
        - cmd/installer/**/*
        - test/new-e2e/tests/installer/**/*
        - tasks/installer.py
      when: on_success
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - allow_failure: true
      when: manual
    script:
    - printf -- "$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $DEB_GPG_KEY)" | gpg --import
      --batch
    - EXIT="${PIPESTATUS[0]}"; if [ $EXIT -ne 0 ]; then echo "Unable to locate credentials
      needs gitlab runner restart"; exit $EXIT; fi
    - APT_SIGNING_KEY_PASSPHRASE=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $DEB_SIGNING_PASSPHRASE)
      || exit $?
    - set +x
    - echo 'deb [signed-by=/usr/share/keyrings/datadog-archive-keyring.gpg] https://apt.datadoghq.com/
      stable 7' > /etc/apt/sources.list.d/datadog.list
    - touch /usr/share/keyrings/datadog-archive-keyring.gpg
    - chmod a+r /usr/share/keyrings/datadog-archive-keyring.gpg
    - curl https://keys.datadoghq.com/DATADOG_APT_KEY_CURRENT.public | gpg --no-default-keyring
      --keyring /usr/share/keyrings/datadog-archive-keyring.gpg --import --batch
    - apt-get -o Acquire::Retries="5" update
    - apt-get -o Acquire::Retries="5" -o "Dir::Cache::archives=$OMNIBUS_PACKAGE_DIR"
      install --download-only datadog-signing-keys
    - pushd $OMNIBUS_PACKAGE_DIR
    - filename=$(ls datadog-signing-keys*.deb); mv $filename datadog-signing-keys_${DD_PIPELINE_ID}.deb
    - popd
    - echo "$APT_SIGNING_KEY_PASSPHRASE" | deb-s3 upload -c "pipeline-$DD_PIPELINE_ID-arm64"
      -m 7 -b $DEB_TESTING_S3_BUCKET -a arm64 --sign=$DEB_GPG_KEY_ID --gpg_options="--passphrase-fd
      0 --batch --digest-algo SHA512" --preserve_versions --visibility public $OMNIBUS_PACKAGE_DIR/datadog-*_7*arm64.deb
    - echo "$APT_SIGNING_KEY_PASSPHRASE" | deb-s3 upload -c "pipeline-$DD_PIPELINE_ID-arm64"
      -m 7 -b $DEB_TESTING_S3_BUCKET -a arm64 --sign=$DEB_GPG_KEY_ID --gpg_options="--passphrase-fd
      0 --batch --digest-algo SHA512" --preserve_versions --visibility public $OMNIBUS_PACKAGE_DIR/datadog-signing-keys_${DD_PIPELINE_ID}.deb
-   stage: kitchen_deploy
+   stage: e2e_deploy
    tags:
    - arch:amd64
    variables:
      DD_PIPELINE_ID: $CI_PIPELINE_ID-a7
deploy_deb_testing-a7_x64
  deploy_deb_testing-a7_x64:
    before_script:
    - ls $OMNIBUS_PACKAGE_DIR
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/gitlab_agent_deploy$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - installer_deb-amd64
    - agent_deb-x64-a7
    - agent_heroku_deb-x64-a7
    - iot_agent_deb-x64
    - dogstatsd_deb-x64
    - lint_linux-x64
    rules:
    - if: $DEPLOY_AGENT == "false" && $DDR_WORKFLOW_ID == null && $RUN_E2E_TESTS ==
        "off"
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - printf -- "$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $DEB_GPG_KEY)" | gpg --import
      --batch
    - EXIT="${PIPESTATUS[0]}"; if [ $EXIT -ne 0 ]; then echo "Unable to locate credentials
      needs gitlab runner restart"; exit $EXIT; fi
    - APT_SIGNING_KEY_PASSPHRASE=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $DEB_SIGNING_PASSPHRASE)
      || exit $?
    - set +x
    - echo 'deb [signed-by=/usr/share/keyrings/datadog-archive-keyring.gpg] https://apt.datadoghq.com/
      stable 7' > /etc/apt/sources.list.d/datadog.list
    - touch /usr/share/keyrings/datadog-archive-keyring.gpg
    - chmod a+r /usr/share/keyrings/datadog-archive-keyring.gpg
    - curl https://keys.datadoghq.com/DATADOG_APT_KEY_CURRENT.public | gpg --no-default-keyring
      --keyring /usr/share/keyrings/datadog-archive-keyring.gpg --import --batch
    - apt-get -o Acquire::Retries="5" update
    - apt-get -o Acquire::Retries="5" -o "Dir::Cache::archives=$OMNIBUS_PACKAGE_DIR"
      install --download-only datadog-signing-keys
    - pushd $OMNIBUS_PACKAGE_DIR
    - filename=$(ls datadog-signing-keys*.deb); mv $filename datadog-signing-keys_${DD_PIPELINE_ID}.deb
    - popd
    - echo "$APT_SIGNING_KEY_PASSPHRASE" | deb-s3 upload -c "pipeline-$DD_PIPELINE_ID-x86_64"
      -m 7 -b $DEB_TESTING_S3_BUCKET -a amd64 --sign=$DEB_GPG_KEY_ID --gpg_options="--passphrase-fd
      0 --batch --digest-algo SHA512" --preserve_versions --visibility public $OMNIBUS_PACKAGE_DIR/datadog-*_7*amd64.deb
    - echo "$APT_SIGNING_KEY_PASSPHRASE" | deb-s3 upload -c "pipeline-$DD_PIPELINE_ID-x86_64"
      -m 7 -b $DEB_TESTING_S3_BUCKET -a x86_64 --sign=$DEB_GPG_KEY_ID --gpg_options="--passphrase-fd
      0 --batch --digest-algo SHA512" --preserve_versions --visibility public $OMNIBUS_PACKAGE_DIR/datadog-*_7*amd64.deb
    - echo "$APT_SIGNING_KEY_PASSPHRASE" | deb-s3 upload -c "pipeline-$DD_PIPELINE_ID-x86_64"
      -m 7 -b $DEB_TESTING_S3_BUCKET -a amd64 --sign=$DEB_GPG_KEY_ID --gpg_options="--passphrase-fd
      0 --batch --digest-algo SHA512" --preserve_versions --visibility public $OMNIBUS_PACKAGE_DIR/datadog-signing-keys_${DD_PIPELINE_ID}.deb
    - echo "$APT_SIGNING_KEY_PASSPHRASE" | deb-s3 upload -c "pipeline-$DD_PIPELINE_ID-x86_64"
      -m 7 -b $DEB_TESTING_S3_BUCKET -a x86_64 --sign=$DEB_GPG_KEY_ID --gpg_options="--passphrase-fd
      0 --batch --digest-algo SHA512" --preserve_versions --visibility public $OMNIBUS_PACKAGE_DIR/datadog-signing-keys_${DD_PIPELINE_ID}.deb
-   stage: kitchen_deploy
+   stage: e2e_deploy
    tags:
    - arch:amd64
    variables:
      DD_PIPELINE_ID: $CI_PIPELINE_ID-a7
deploy_rpm_testing-a7_arm64
  deploy_rpm_testing-a7_arm64:
    before_script:
    - ls $OMNIBUS_PACKAGE_DIR
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/gitlab_agent_deploy$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - installer_rpm-arm64
    - agent_rpm-arm64-a7
    - lint_linux-arm64
    rules:
    - if: ($CI_COMMIT_BRANCH == "main"  || $DEPLOY_AGENT == "true" || $RUN_E2E_TESTS
        == "on" || $DDR_WORKFLOW_ID != null) && $RUN_E2E_TESTS != "off"
    - if: $RUN_E2E_TESTS == "off"
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - if: $RUN_E2E_TESTS == "on"
      when: on_success
    - if: $CI_COMMIT_BRANCH == "main"
      when: on_success
    - if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
      when: on_success
    - if: $CI_COMMIT_TAG =~ /^[0-9]+\.[0-9]+\.[0-9]+-rc\.[0-9]+$/
      when: on_success
    - changes:
        compare_to: main
        paths:
        - .gitlab/e2e/e2e.yml
        - test/new-e2e/pkg/**/*
        - test/new-e2e/go.mod
        - flakes.yaml
    - changes:
        compare_to: main
        paths:
        - .gitlab/**/*
        - omnibus/config/**/*
        - pkg/fleet/**/*
        - cmd/installer/**/*
        - test/new-e2e/tests/installer/**/*
        - tasks/installer.py
      when: on_success
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - allow_failure: true
      when: manual
    script:
    - printf -- "$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $RPM_GPG_KEY)" | gpg --import
      --batch
    - EXIT="${PIPESTATUS[0]}"; if [ $EXIT -ne 0 ]; then echo "Unable to locate credentials
      needs gitlab runner restart"; exit $EXIT; fi
    - RPM_SIGNING_PASSPHRASE=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $RPM_SIGNING_PASSPHRASE)
      || exit $?
    - set +x
    - echo "$RPM_SIGNING_PASSPHRASE" | python2 /opt/rpm-s3/bin/rpm-s3 --verbose --visibility
      public-read -c "https://s3.amazonaws.com" -b $RPM_TESTING_S3_BUCKET -p "testing/pipeline-$DD_PIPELINE_ID/7/aarch64/"
      -a "aarch64" --sign --metadata-signing-key $RPM_GPG_KEY_ID $OMNIBUS_PACKAGE_DIR/datadog-*-7.*aarch64.rpm
-   stage: kitchen_deploy
+   stage: e2e_deploy
    tags:
    - arch:amd64
    variables:
      DD_PIPELINE_ID: $CI_PIPELINE_ID-a7
deploy_rpm_testing-a7_x64
  deploy_rpm_testing-a7_x64:
    before_script:
    - ls $OMNIBUS_PACKAGE_DIR
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/gitlab_agent_deploy$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - installer_rpm-amd64
    - agent_rpm-x64-a7
    - iot_agent_rpm-x64
    - dogstatsd_rpm-x64
    - lint_linux-x64
    rules:
    - if: $DEPLOY_AGENT == "false" && $DDR_WORKFLOW_ID == null && $RUN_E2E_TESTS ==
        "off"
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - printf -- "$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $RPM_GPG_KEY)" | gpg --import
      --batch
    - EXIT="${PIPESTATUS[0]}"; if [ $EXIT -ne 0 ]; then echo "Unable to locate credentials
      needs gitlab runner restart"; exit $EXIT; fi
    - RPM_SIGNING_PASSPHRASE=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $RPM_SIGNING_PASSPHRASE)
      || exit $?
    - set +x
    - echo "$RPM_SIGNING_PASSPHRASE" | python2 /opt/rpm-s3/bin/rpm-s3 --verbose --visibility
      public-read -c "https://s3.amazonaws.com" -b $RPM_TESTING_S3_BUCKET -p "testing/pipeline-$DD_PIPELINE_ID/7/x86_64/"
      -a "x86_64" --sign --metadata-signing-key $RPM_GPG_KEY_ID $OMNIBUS_PACKAGE_DIR/datadog-*-7.*x86_64.rpm
-   stage: kitchen_deploy
+   stage: e2e_deploy
    tags:
    - arch:amd64
    variables:
      DD_PIPELINE_ID: $CI_PIPELINE_ID-a7
deploy_suse_rpm_testing_arm64-a7
  deploy_suse_rpm_testing_arm64-a7:
    before_script:
    - ls $OMNIBUS_PACKAGE_DIR_SUSE
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/gitlab_agent_deploy$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - installer_suse_rpm-arm64
    - agent_suse-arm64-a7
    - lint_linux-arm64
    rules:
    - if: ($CI_COMMIT_BRANCH == "main"  || $DEPLOY_AGENT == "true" || $RUN_E2E_TESTS
        == "on" || $DDR_WORKFLOW_ID != null) && $RUN_E2E_TESTS != "off"
    - if: $RUN_E2E_TESTS == "off"
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - if: $RUN_E2E_TESTS == "on"
      when: on_success
    - if: $CI_COMMIT_BRANCH == "main"
      when: on_success
    - if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
      when: on_success
    - if: $CI_COMMIT_TAG =~ /^[0-9]+\.[0-9]+\.[0-9]+-rc\.[0-9]+$/
      when: on_success
    - changes:
        compare_to: main
        paths:
        - .gitlab/e2e/e2e.yml
        - test/new-e2e/pkg/**/*
        - test/new-e2e/go.mod
        - flakes.yaml
    - changes:
        compare_to: main
        paths:
        - .gitlab/**/*
        - omnibus/config/**/*
        - pkg/fleet/**/*
        - cmd/installer/**/*
        - test/new-e2e/tests/installer/**/*
        - tasks/installer.py
      when: on_success
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - allow_failure: true
      when: manual
    script:
    - printf -- "$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $RPM_GPG_KEY)" | gpg --import
      --batch
    - EXIT="${PIPESTATUS[0]}"; if [ $EXIT -ne 0 ]; then echo "Unable to locate credentials
      needs gitlab runner restart"; exit $EXIT; fi
    - RPM_SIGNING_PASSPHRASE=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $RPM_SIGNING_PASSPHRASE)
      || exit $?
    - set +x
    - echo "$RPM_SIGNING_PASSPHRASE" | python2 /opt/rpm-s3/bin/rpm-s3 --verbose --visibility
      public-read -c "https://s3.amazonaws.com" -b $RPM_TESTING_S3_BUCKET -p "suse/testing/pipeline-$DD_PIPELINE_ID/7/aarch64/"
      -a "aarch64" --sign --metadata-signing-key $RPM_GPG_KEY_ID --repodata-store-public-key
      $OMNIBUS_PACKAGE_DIR_SUSE/datadog-*-7.*aarch64.rpm
-   stage: kitchen_deploy
+   stage: e2e_deploy
    tags:
    - arch:amd64
    variables:
      DD_PIPELINE_ID: $CI_PIPELINE_ID-a7
deploy_suse_rpm_testing_x64-a7
  deploy_suse_rpm_testing_x64-a7:
    before_script:
    - ls $OMNIBUS_PACKAGE_DIR_SUSE
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/gitlab_agent_deploy$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - installer_suse_rpm-amd64
    - agent_suse-x64-a7
    - iot_agent_suse-x64
    - dogstatsd_suse-x64
    - lint_linux-x64
    rules:
    - if: $DEPLOY_AGENT == "false" && $DDR_WORKFLOW_ID == null && $RUN_E2E_TESTS ==
        "off"
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - printf -- "$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $RPM_GPG_KEY)" | gpg --import
      --batch
    - EXIT="${PIPESTATUS[0]}"; if [ $EXIT -ne 0 ]; then echo "Unable to locate credentials
      needs gitlab runner restart"; exit $EXIT; fi
    - RPM_SIGNING_PASSPHRASE=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $RPM_SIGNING_PASSPHRASE)
      || exit $?
    - set +x
    - echo "$RPM_SIGNING_PASSPHRASE" | python2 /opt/rpm-s3/bin/rpm-s3 --verbose --visibility
      public-read -c "https://s3.amazonaws.com" -b $RPM_TESTING_S3_BUCKET -p "suse/testing/pipeline-$DD_PIPELINE_ID/7/x86_64/"
      -a "x86_64" --sign --metadata-signing-key $RPM_GPG_KEY_ID --repodata-store-public-key
      $OMNIBUS_PACKAGE_DIR_SUSE/datadog-*-7.*x86_64.rpm
-   stage: kitchen_deploy
+   stage: e2e_deploy
    tags:
    - arch:amd64
    variables:
      DD_PIPELINE_ID: $CI_PIPELINE_ID-a7
deploy_windows_testing-a7
  deploy_windows_testing-a7:
    before_script:
    - ls $OMNIBUS_PACKAGE_DIR
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/gitlab_agent_deploy$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - lint_windows-x64
    - windows_msi_and_bosh_zip_x64-a7
    - windows-installer-amd64
    rules:
    - if: $DEPLOY_AGENT == "false" && $DDR_WORKFLOW_ID == null && $RUN_E2E_TESTS ==
        "off"
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - $S3_CP_CMD --recursive --exclude "*" --include "datadog-agent-7.*.msi" --include
      "datadog-installer-*-1-x86_64.msi" --include "datadog-installer-*-1-x86_64.exe"
      $OMNIBUS_PACKAGE_DIR s3://$WIN_S3_BUCKET/$WINDOWS_TESTING_S3_BUCKET_A7 --grants
      read=uri=http://acs.amazonaws.com/groups/global/AllUsers full=id=3a6e02b08553fd157ae3fb918945dd1eaae5a1aa818940381ef07a430cf25732
-   stage: kitchen_deploy
+   stage: e2e_deploy
    tags:
    - arch:amd64
tests_windows_sysprobe_x64
  tests_windows_sysprobe_x64:
    artifacts:
      paths:
-     - $DD_AGENT_TESTING_DIR/site-cookbooks/dd-system-probe-check/files
+     - $CI_PROJECT_DIR/test/new-e2e/tests/sysprobe-functional/artifacts
      when: always
    needs:
    - go_deps
    - go_tools_deps
    script:
    - $ErrorActionPreference = "Stop"
    - $_instance_id = (iwr  -UseBasicParsing http://169.254.169.254/latest/meta-data/instance-id).content
      ; Write-Host "Running on instance $($_instance_id)"
    - 'docker run --rm -m 16384M -v "$(Get-Location):c:\mnt" -e AWS_NETWORKING=true
      -e CI_PIPELINE_ID=${CI_PIPELINE_ID} -e CI_PROJECT_NAME=${CI_PROJECT_NAME} -e SIGN_WINDOWS_DD_WCS=true
      -e GOMODCACHE="c:\modcache" -e PIP_INDEX_URL=${PIP_INDEX_URL} registry.ddbuild.io/ci/datadog-agent-buildimages/windows_1809_${ARCH}${Env:DATADOG_AGENT_WINBUILDIMAGES_SUFFIX}:${Env:DATADOG_AGENT_WINBUILDIMAGES}
      c:\mnt\tasks\winbuildscripts\sysprobe.bat
  
      '
    - If ($lastExitCode -ne "0") { throw "Previous command returned $lastExitCode" }
    stage: source_test
    tags:
    - runner:windows-docker
    - windowsversion:1809
    variables:
      ARCH: x64

Removed Jobs

  • .kitchen_azure_location_west_central_us
  • .kitchen_test_chef_agent
  • .kitchen_datadog_agent_flavor
  • .kitchen_test_upgrade7
  • .kitchen_ec2_spot_instances
  • .kitchen_test_chef
  • .kitchen_azure_x64
  • .kitchen_azure
  • .kitchen_azure_location_north_central_us
  • .kitchen_ec2_x64
  • .kitchen_azure_location_central_us
  • .kitchen_cleanup_s3_common
  • .kitchen_common_with_junit
  • .kitchen_test_upgrade5
  • .on_default_kitchen_tests
  • .kitchen_ec2
  • .on_kitchen_tests_always
  • .kitchen_test_upgrade7_agent
  • .kitchen_agent_a7
  • .kitchen_ec2_arm64
  • .kitchen_test_upgrade5_agent
  • .kitchen_common
  • .on_default_kitchen_tests_always
  • .kitchen_os_windows
  • .kitchen_cleanup_azure_common
  • .kitchen_azure_location_south_central_us
  • .on_kitchen_invoke_tasks_changes
  • cleanup_kitchen_functional_test
  • kitchen_invoke_unit_tests
  • kitchen_cleanup_azure-a7
  • periodic_kitchen_cleanup_azure
  • periodic_kitchen_cleanup_ec2
  • periodic_kitchen_cleanup_s3

Renamed Jobs

  • .kitchen_ec2_location_us_east_1 -> .kmt_ec2_location_us_east_1
  • .on_kitchen_tests -> .on_e2e_tests

Changes Summary

| Removed | Modified | Added | Renamed |
|---------|----------|-------|---------|
| 33      | 13       | 0     | 2       |

ℹ️ Diff available in the job log.

@KevinFairise2 (Member Author)

/trigger-ci --variable RUN_ALL_BUILDS=true --variable RUN_KITCHEN_TESTS=true --variable RUN_E2E_TESTS=on --variable RUN_UNIT_TESTS=on --variable RUN_KMT_TESTS=on


dd-devflow bot commented Nov 28, 2024

Devflow running: /trigger-ci --variable RUN_ALL_BUILDS=true --varia...

View all feedbacks in Devflow UI.


2024-11-28 08:29:46 UTC ℹ️ Gitlab pipeline started

Started pipeline #50095386


dd-devflow bot commented Nov 28, 2024

Devflow running: /trigger-ci --variable RUN_ALL_BUILDS=true --varia...

View all feedbacks in Devflow UI.


2024-11-28 08:29:47 UTC ℹ️ Devflow: /trigger-ci --variable RUN_ALL_BUILDS=true --variable RUN_KITCHEN_TESTS=true --variable RUN_E2E_TESTS=on --variable RUN_UNIT_TESTS=on --variable RUN_KMT_TESTS=on

@KevinFairise2 (Member Author)

/trigger-ci --variable RUN_ALL_BUILDS=true --variable RUN_KITCHEN_TESTS=true --variable RUN_E2E_TESTS=on --variable RUN_UNIT_TESTS=on --variable RUN_KMT_TESTS=on

1 similar comment
@KevinFairise2 (Member Author)

/trigger-ci --variable RUN_ALL_BUILDS=true --variable RUN_KITCHEN_TESTS=true --variable RUN_E2E_TESTS=on --variable RUN_UNIT_TESTS=on --variable RUN_KMT_TESTS=on


dd-devflow bot commented Nov 28, 2024

Devflow running: /trigger-ci --variable RUN_ALL_BUILDS=true --varia...

View all feedbacks in Devflow UI.


2024-11-28 15:44:07 UTC ℹ️ Gitlab pipeline started

Started pipeline #50113238


dd-devflow bot commented Nov 28, 2024

Devflow running: /trigger-ci --variable RUN_ALL_BUILDS=true --varia...

View all feedbacks in Devflow UI.


2024-11-28 15:44:10 UTC ℹ️ Gitlab pipeline started

Started pipeline #50113241

@KevinFairise2 force-pushed the kfairise/kitchen-remove-everything branch from 0537b54 to 77f2225 on November 29, 2024 09:51
@KevinFairise2 (Member Author)

/trigger-ci --variable RUN_ALL_BUILDS=true --variable RUN_KITCHEN_TESTS=true --variable RUN_E2E_TESTS=on --variable RUN_UNIT_TESTS=on --variable RUN_KMT_TESTS=on


dd-devflow bot commented Nov 29, 2024

Devflow running: /trigger-ci --variable RUN_ALL_BUILDS=true --varia...

View all feedbacks in Devflow UI.


2024-11-29 10:08:43 UTC ℹ️ Branch synced on Gitlab

Branch kfairise/kitchen-remove-everything was out-of-sync on Gitlab - it has been resynced


2024-11-29 10:09:16 UTC ℹ️ Gitlab pipeline started

Started pipeline #50136550

@KevinFairise2 marked this pull request as ready for review on November 29, 2024 14:38
@KevinFairise2 requested a review from a team as a code owner on November 29, 2024 14:38
@KevinFairise2 (Member Author)

/merge


dd-devflow bot commented Dec 3, 2024

Devflow running: /merge

View all feedbacks in Devflow UI.


2024-12-03 16:35:08 UTC ℹ️ MergeQueue: waiting for PR to be ready

This merge request is not mergeable yet, because of pending checks/missing approvals. It will be added to the queue as soon as checks pass and/or get approvals.
Note: if you pushed new commits since the last approval, you may need additional approval.
You can remove it from the waiting list with /remove command.


2024-12-03 20:35:11 UTC ⚠️ MergeQueue: This merge request was unqueued

Comment on lines 1013 to 1016
# test/files/default/tests/pkg/network/testsuite
# test/files/default/tests/pkg/network/netlink/testsuite
# test/files/default/tests/pkg/ebpf/testsuite
# test/files/default/tests/pkg/ebpf/bytecode/testsuite
Member:

These comments are not accurate, because the paths will be rooted in ./test/new-e2e/tests/sysprobe-functional/artifacts

tasks/kmt.py Outdated
@@ -1372,7 +1372,7 @@ def clean(ctx: Context, stack: str | None = None, container=False, image=False):
stack
), f"Stack {stack} does not exist. Please create with 'inv kmt.create-stack --stack=<name>'"

ctx.run("rm -rf ./test/kitchen/site-cookbooks/dd-system-probe-check/files/default/tests/pkg")
ctx.run("rm -rf ./test/files/default/tests/pkg")
Member:

This used to be {KITCHEN_ARTIFACT_DIR}/pkg, should it now be {E2E_ARTIFACT_DIR}/pkg?

Member:

This file may need to be modified and moved to the new location where these artifacts end up
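
A hedged sketch of the direction both comments point in, assuming the artifacts now end up under test/new-e2e/tests/sysprobe-functional/artifacts (E2E_ARTIFACT_DIR is a hypothetical name, not an existing constant in tasks/kmt.py):

# Hypothetical constant; the real name and location may differ.
E2E_ARTIFACT_DIR = "./test/new-e2e/tests/sysprobe-functional/artifacts"

def clean_built_tests(ctx):
    # Remove built test suites from the new e2e artifact location
    # rather than from the deleted test/kitchen tree.
    ctx.run(f"rm -rf {E2E_ARTIFACT_DIR}/pkg")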

@@ -217,7 +217,7 @@ def get_test_results(self) -> dict[str, bool | None]:
The values are True if test passed, False if failed, None if skipped.
"""
junit_archive_name = f"junit-{self.arch}-{self.distro}-{self.vmset}.tar.gz"
junit_archive = self.artifact_file_binary(f"test/kitchen/{junit_archive_name}", ignore_not_found=True)
junit_archive = self.artifact_file_binary(f"test/{junit_archive_name}", ignore_not_found=True)
Member:

This was {DD_AGENT_TESTING_DIR}/{junit_archive_name} before, so I think it still needs to match that. I believe the default directory is now test/new-e2e/tests.
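
Following that suggestion, the lookup would keep matching what DD_AGENT_TESTING_DIR now points at; a sketch, assuming the default directory is indeed test/new-e2e/tests:

junit_archive = self.artifact_file_binary(
    f"test/new-e2e/tests/{junit_archive_name}", ignore_not_found=True
)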

tasks/kmt.py Outdated
@@ -2290,7 +2290,7 @@ def download_complexity_data(ctx: Context, commit: str, dest_path: str | Path, k
_, test_jobs = get_all_jobs_for_pipeline(pipeline_id)
for job in test_jobs:
complexity_name = f"verifier-complexity-{job.arch}-{job.distro}-{job.component}"
complexity_data_fname = f"test/kitchen/{complexity_name}.tar.gz"
complexity_data_fname = f"test/{complexity_name}.tar.gz"
Member:

This was {DD_AGENT_TESTING_DIR}/{complexity_name}.tar.gz before, so I think it still needs to match that. I believe the default directory is now test/new-e2e/tests.
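
The same reasoning would apply here; a sketch, assuming the archive lands under the new default directory test/new-e2e/tests:

complexity_data_fname = f"test/new-e2e/tests/{complexity_name}.tar.gz"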

@@ -1628,38 +1628,38 @@ def is_bpftool_compatible(ctx):
return False


- def kitchen_prepare_btfs(ctx, files_dir, arch=CURRENT_ARCH):
+ def e2e_prepare_btfs(ctx, files_dir, arch=CURRENT_ARCH):
Member:

You can delete this entire function and the one reference. It is only needed when launching kitchen tests locally, which is not something that will be possible anymore.

@brycekahle (Member) left a comment:

Looks like some paths need to be synced up with the changes, rather than just removing the kitchen part of the path.

@KevinFairise2 (Member Author)

/merge -c


dd-devflow bot commented Dec 4, 2024

Devflow running: /merge -c

View all feedbacks in Devflow UI.


2024-12-04 09:01:14 UTC Devflow: /merge -c

This merge request was already processed and can't be unqueued anymore.

To get help about command usage, write /merge --help

If you need support, contact us on Slack #devflow with those details!

@KevinFairise2 (Member Author)

Looks like some paths need to be synced up with the changes, rather than just removing the kitchen part of the path.

Indeed, I first wanted to move the artifacts to the root of the test/ folder, then switched to a folder inside test/new-e2e/tests to match what had already been done for the security-agent tests. But it looks like I missed some in the process.
I think I addressed all the comments; let me know if you see anything more.

@KevinFairise2 (Member Author)

/merge


dd-devflow bot commented Dec 5, 2024

Devflow running: /merge

View all feedbacks in Devflow UI.


2024-12-05 12:01:32 UTC ℹ️ MergeQueue: pull request added to the queue

The median merge time in main is 24m.


2024-12-05 12:04:33 UTC MergeQueue: The build pipeline contains failing jobs for this merge request

Build pipeline has failing jobs for 8d61ebe:

⚠️ Do NOT retry failed jobs directly (why?).

What to do next?

  • Investigate the failures and when ready, re-add your pull request to the queue!
  • Any question, go check the FAQ.
Details

Since those jobs are not marked as being allowed to fail, the pipeline will most likely fail.
Therefore, and to allow other builds to be processed, this merge request has been rejected and the pipeline got canceled.

@KevinFairise2 (Member Author)

/merge


dd-devflow bot commented Dec 6, 2024

Devflow running: /merge

View all feedbacks in Devflow UI.


2024-12-06 08:26:17 UTC ℹ️ MergeQueue: pull request added to the queue

The median merge time in main is 23m.

@dd-mergequeue bot merged commit 0cd172f into main on Dec 6, 2024
315 checks passed
@dd-mergequeue bot deleted the kfairise/kitchen-remove-everything branch on December 6, 2024 09:03
@github-actions bot added this to the 7.62.0 milestone on Dec 6, 2024
Labels

  • changelog/no-changelog
  • component/system-probe
  • long review (PR is complex, plan time to review it)
  • qa/no-code-change (No code change in Agent code requiring validation)
  • team/agent-devx-loops