
Update SMP CLI to v0.26.1 #47538

Open
GeorgeHahn wants to merge 3 commits into main from hahn/smp-0.26.1

Conversation

Contributor

GeorgeHahn commented Mar 6, 2026 (edited)

Summary

  • Updates the SMP CLI version from v0.25.1 to v0.26.1 for the agent regression detector
  • Runs the regression detector when the workflow configuration changes

Test plan

  • SMP regression job runs successfully with the new CLI version

Co-Authored-By: Claude Opus 4.6
GeorgeHahn requested a review from a team as a code owner March 6, 2026 22:11
dd-octo-sts bot added the internal (Identify a non-fork PR) and team/agent-devx labels Mar 6, 2026
github-actions bot added the short review (PR is simple enough to be reviewed quickly) label Mar 6, 2026
Contributor

agent-platform-auto-pr bot commented Mar 6, 2026 (edited)

Gitlab CI Configuration Changes

Updated: .gitlab-ci.yml

Modified Jobs

.artifacts_build_impacting_paths
.artifacts_build_impacting_paths:
paths:
- go.mod
- go.sum
- cmd/**/*
- comp/**/*
- internal/**/*
- pkg/**/*
- rtloader/**/*
- tasks/**/*.py
- deps/**/*
- omnibus/**/*
- .bazelrc
- .bazelversion
- bazel/**/*
- BUILD.bazel
- MODULE.bazel
- release.json
- test/regression/**/*
+ - .gitlab/childs/smp-regression-child-pipeline.yml
+ - .gitlab/test/functional_test/regression_detector.yml
- Dockerfiles/**/*
.on_dev_branches_with_artifact_changes
.on_dev_branches_with_artifact_changes:
- if: $CI_COMMIT_BRANCH == "main"
when: never
- if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
when: never
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: never
- if: $CI_COMMIT_TAG != null
when: never
- changes:
compare_to: $COMPARE_TO_BRANCH
paths:
- go.mod
- go.sum
- cmd/**/*
- comp/**/*
- internal/**/*
- pkg/**/*
- rtloader/**/*
- tasks/**/*.py
- deps/**/*
- omnibus/**/*
- .bazelrc
- .bazelversion
- bazel/**/*
- BUILD.bazel
- MODULE.bazel
- release.json
- test/regression/**/*
+ - .gitlab/childs/smp-regression-child-pipeline.yml
+ - .gitlab/test/functional_test/regression_detector.yml
- Dockerfiles/**/*
.on_dev_branches_with_artifact_changes_manual
.on_dev_branches_with_artifact_changes_manual:
- if: $CI_COMMIT_BRANCH == "main"
when: never
- if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
when: never
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: never
- if: $CI_COMMIT_TAG != null
when: never
- allow_failure: true
changes:
compare_to: $COMPARE_TO_BRANCH
paths:
- go.mod
- go.sum
- cmd/**/*
- comp/**/*
- internal/**/*
- pkg/**/*
- rtloader/**/*
- tasks/**/*.py
- deps/**/*
- omnibus/**/*
- .bazelrc
- .bazelversion
- bazel/**/*
- BUILD.bazel
- MODULE.bazel
- release.json
- test/regression/**/*
+ - .gitlab/childs/smp-regression-child-pipeline.yml
+ - .gitlab/test/functional_test/regression_detector.yml
- Dockerfiles/**/*
when: manual
files_inventory_check
files_inventory_check:
allow_failure: true
artifacts:
expire_in: 2 weeks
paths:
- '**/*_size_report_*.yml'
image: registry.ddbuild.io/ci/datadog-agent-buildimages/linux$CI_IMAGE_LINUX_SUFFIX:$CI_IMAGE_LINUX
needs:
- agent_deb-x64-a7
rules:
- if: $E2E_COVERAGE_PIPELINE == "true"
when: never
- if: $CI_COMMIT_BRANCH == "main"
when: never
- if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
when: never
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: never
- if: $CI_COMMIT_TAG != null
when: never
- changes:
compare_to: $COMPARE_TO_BRANCH
paths:
- go.mod
- go.sum
- cmd/**/*
- comp/**/*
- internal/**/*
- pkg/**/*
- rtloader/**/*
- tasks/**/*.py
- deps/**/*
- omnibus/**/*
- .bazelrc
- .bazelversion
- bazel/**/*
- BUILD.bazel
- MODULE.bazel
- release.json
- test/regression/**/*
+ - .gitlab/childs/smp-regression-child-pipeline.yml
+ - .gitlab/test/functional_test/regression_detector.yml
- Dockerfiles/**/*
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: never
- when: on_success
script:
- GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP key_b64)
|| exit $?; export GITHUB_KEY_B64
- GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP app_id)
|| exit $?; export GITHUB_APP_ID
- GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP
installation_id) || exit $?; export GITHUB_INSTALLATION_ID
- echo "Using agent GitHub App"
- dda inv -- files-inventory.check ${CI_COMMIT_BRANCH} ${OMNIBUS_PACKAGE_DIR}
stage: functional_test
tags:
- arch:amd64
- specific:true
manual_gate_threshold_update
manual_gate_threshold_update:
image: registry.ddbuild.io/ci/datadog-agent-buildimages/docker_x64$CI_IMAGE_DOCKER_X64_SUFFIX:$CI_IMAGE_DOCKER_X64
needs:
- static_quality_gates
rules:
- if: $E2E_COVERAGE_PIPELINE == "true"
when: never
- allow_failure: true
if: $CI_COMMIT_BRANCH == "main"
when: manual
- if: $CI_COMMIT_BRANCH == "main"
when: never
- if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
when: never
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: never
- if: $CI_COMMIT_TAG != null
when: never
- allow_failure: true
changes:
compare_to: $COMPARE_TO_BRANCH
paths:
- go.mod
- go.sum
- cmd/**/*
- comp/**/*
- internal/**/*
- pkg/**/*
- rtloader/**/*
- tasks/**/*.py
- deps/**/*
- omnibus/**/*
- .bazelrc
- .bazelversion
- bazel/**/*
- BUILD.bazel
- MODULE.bazel
- release.json
- test/regression/**/*
+ - .gitlab/childs/smp-regression-child-pipeline.yml
+ - .gitlab/test/functional_test/regression_detector.yml
- Dockerfiles/**/*
when: manual
- allow_failure: true
when: manual
script:
- DOCKER_LOGIN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $DOCKER_REGISTRY_RO user)
|| exit $?
- $CI_PROJECT_DIR/tools/ci/fetch_secret.sh $DOCKER_REGISTRY_RO token | crane auth
login --username "$DOCKER_LOGIN" --password-stdin "$DOCKER_REGISTRY_URL"
- EXIT="${PIPESTATUS[0]}"; if [ $EXIT -ne 0 ]; then echo "Unable to locate credentials
needs gitlab runner restart"; exit $EXIT; fi
- GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP key_b64)
|| exit $?; export GITHUB_KEY_B64
- GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP app_id)
|| exit $?; export GITHUB_APP_ID
- GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP
installation_id) || exit $?; export GITHUB_INSTALLATION_ID
- echo "Using agent GitHub App"
- SLACK_DATADOG_AGENT_BOT_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SLACK_AGENT
token) || exit $?; export SLACK_DATADOG_AGENT_BOT_TOKEN
- dda inv -- quality-gates.manual-threshold-update || exit $?
stage: functional_test
tags:
- arch:amd64
- specific:true
single_machine_performance-full-amd64-a7
single_machine_performance-full-amd64-a7:
image: registry.ddbuild.io/ci/datadog-agent-buildimages/docker_x64$CI_IMAGE_DOCKER_X64_SUFFIX:$CI_IMAGE_DOCKER_X64
needs:
- docker_build_agent7_full
rules:
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: never
- if: $CI_COMMIT_BRANCH == "main"
- if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
- if: $CI_COMMIT_BRANCH == "main"
when: never
- if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
when: never
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: never
- if: $CI_COMMIT_TAG != null
when: never
- changes:
compare_to: $COMPARE_TO_BRANCH
paths:
- go.mod
- go.sum
- cmd/**/*
- comp/**/*
- internal/**/*
- pkg/**/*
- rtloader/**/*
- tasks/**/*.py
- deps/**/*
- omnibus/**/*
- .bazelrc
- .bazelversion
- bazel/**/*
- BUILD.bazel
- MODULE.bazel
- release.json
- test/regression/**/*
+ - .gitlab/childs/smp-regression-child-pipeline.yml
+ - .gitlab/test/functional_test/regression_detector.yml
- Dockerfiles/**/*
script:
- "if [[ \"$BUCKET_BRANCH\" == \"nightly\" && ( \"$IMG_SOURCES\" =~ \"$SRC_AGENT\"\
\ || \"$IMG_SOURCES\" =~ \"$SRC_OTEL_AGENT\" || \"$IMG_SOURCES\" =~ \"$SRC_DDOT_EBPF\"\
\ || \"$IMG_SOURCES\" =~ \"$SRC_DCA\" || \"$IMG_SOURCES\" =~ \"$SRC_CWS_INSTRUMENTATION\"\
\ || \"$IMG_VARIABLES\" =~ \"$SRC_AGENT\" || \"$IMG_VARIABLES\" =~ \"$SRC_DDOT_EBPF\"\
\ || \"$IMG_VARIABLES\" =~ \"$SRC_DCA\" || \"$IMG_VARIABLES\" =~ \"$SRC_CWS_INSTRUMENTATION\"\
\ ) ]]; then\n export ECR_RELEASE_SUFFIX=\"-nightly\"\nelse\n export ECR_RELEASE_SUFFIX=\"\
${CI_COMMIT_TAG+-release}\"\nfi\n"
- IMG_VARIABLES="$(sed -E "s#(${SRC_AGENT}|${SRC_OTEL_AGENT}|${SRC_DDOT_EBPF}|${SRC_DSD}|${SRC_DCA}|${SRC_CWS_INSTRUMENTATION})#\1${ECR_RELEASE_SUFFIX}#g"
<<<"$IMG_VARIABLES")"
- IMG_SOURCES="$(sed -E "s#(${SRC_AGENT}|${SRC_OTEL_AGENT}|${SRC_DDOT_EBPF}|${SRC_DSD}|${SRC_DCA}|${SRC_CWS_INSTRUMENTATION})#\1${ECR_RELEASE_SUFFIX}#g"
<<<"$IMG_SOURCES")"
- dda inv pipeline.trigger-child-pipeline --project-name DataDog/public-images --git-ref
main --timeout 1800 --variable IMG_VARIABLES --variable IMG_REGISTRIES --variable
IMG_SOURCES --variable IMG_DESTINATIONS --variable IMG_TAG_REFERENCE --variable
IMG_NEW_TAGS --variable IMG_SIGNING --variable APPS --variable BAZEL_TARGET --variable
DDR --variable DDR_WORKFLOW_ID --variable TARGET_ENV --variable DYNAMIC_BUILD_RENDER_TARGET_FORWARD_PARAMETERS
stage: container_build
tags:
- arch:amd64
- specific:true
variables:
IMG_DESTINATIONS: 08450328-agent:${CI_COMMIT_SHA}-7-full-amd64,52130853-agent:${CI_COMMIT_SHA}-7-full-amd64
IMG_REGISTRIES: smp
IMG_SIGNING: 'false'
IMG_SOURCES: ${SRC_AGENT}:v${CI_PIPELINE_ID}-${CI_COMMIT_SHORT_SHA}-7-full-amd64
IMG_VARIABLES: ''
SRC_AGENT: registry.ddbuild.io/ci/datadog-agent/agent
SRC_CWS_INSTRUMENTATION: registry.ddbuild.io/ci/datadog-agent/cws-instrumentation
SRC_DCA: registry.ddbuild.io/ci/datadog-agent/cluster-agent
SRC_DDOT_EBPF: registry.ddbuild.io/ci/datadog-agent/ddot-ebpf
SRC_DSD: registry.ddbuild.io/ci/datadog-agent/dogstatsd
SRC_OTEL_AGENT: registry.ddbuild.io/ci/datadog-agent/otel-agent
single_machine_performance-regression_detector-merge_base_check
single_machine_performance-regression_detector-merge_base_check:
artifacts:
expire_in: 1 day
paths:
- regression_detector.env
reports:
dotenv:
- regression_detector.env
image: registry.ddbuild.io/ci/datadog-agent-buildimages/docker_x64$CI_IMAGE_DOCKER_X64_SUFFIX:$CI_IMAGE_DOCKER_X64
needs: []
rules:
- if: $E2E_COVERAGE_PIPELINE == "true"
when: never
- if: $CI_COMMIT_BRANCH == "main"
needs:
- artifacts: false
job: single_machine_performance-full-amd64-a7
- if: $CI_COMMIT_BRANCH == "main"
when: never
- if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
when: never
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: never
- if: $CI_COMMIT_TAG != null
when: never
- changes:
compare_to: $COMPARE_TO_BRANCH
paths:
- go.mod
- go.sum
- cmd/**/*
- comp/**/*
- internal/**/*
- pkg/**/*
- rtloader/**/*
- tasks/**/*.py
- deps/**/*
- omnibus/**/*
- .bazelrc
- .bazelversion
- bazel/**/*
- BUILD.bazel
- MODULE.bazel
- release.json
- test/regression/**/*
+ - .gitlab/childs/smp-regression-child-pipeline.yml
+ - .gitlab/test/functional_test/regression_detector.yml
- Dockerfiles/**/*
script:
- DATADOG_API_KEY="$("$CI_PROJECT_DIR"/tools/ci/fetch_secret.sh "$AGENT_API_KEY_ORG2"
token)" || exit $?; export DATADOG_API_KEY
- git fetch origin
- SMP_BASE_BRANCH=$(dda inv release.get-release-json-value base_branch --no-worktree)
- FOUR_DAYS_BEFORE_NOW=$(date --date="-4 days +1 hour" "+%s")
- "if [[ \"$CI_COMMIT_BRANCH\" == \"$SMP_BASE_BRANCH\" ]]; then\n # On the base\
\ branch, use the parent commit as the baseline\n BASELINE_SHA=$(git rev-parse\
\ \"${CI_COMMIT_SHA}^\")\n echo \"On base branch, using parent commit ${BASELINE_SHA}\
\ as initial baseline\"\nelse\n # On a dev branch, compute the merge base with\
\ the base branch\n echo \"Looking for merge base for branch ${SMP_BASE_BRANCH}\"\
\n SMP_MERGE_BASE=$(git merge-base ${CI_COMMIT_SHA} origin/${SMP_BASE_BRANCH})\n\
\ echo \"Merge base is ${SMP_MERGE_BASE}\"\n BASELINE_SHA=\"${SMP_MERGE_BASE}\"\
\nfi\n"
- BASELINE_COMMIT_TIME=$(git -c log.showSignature=false show --no-patch --format=%ct
${BASELINE_SHA})
- "if [[ ${BASELINE_COMMIT_TIME} -le ${FOUR_DAYS_BEFORE_NOW} ]]\nthen\n echo\
\ \"ERROR: Merge-base of this branch is too old for SMP. Please update your branch\
\ by merging an up-to-date main branch into your branch or by rebasing it on an\
\ up-to-date main branch.\"\n datadog-ci tag --level job --tags smp_merge_base_failure_reason:\"\
branch_too_old\"\n exit 1\nfi\n"
- echo "Commit ${BASELINE_SHA} is recent enough"
- AWS_NAMED_PROFILE="single-machine-performance"
- SMP_ACCOUNT_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SMP_ACCOUNT account_id)
|| exit $?
- SMP_AGENT_TEAM_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SMP_ACCOUNT agent_team_id)
|| exit $?
- SMP_BOT_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SMP_ACCOUNT bot_login)
|| exit $?
- SMP_BOT_KEY=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SMP_ACCOUNT bot_token)
|| exit $?
- aws configure set aws_access_key_id "$SMP_BOT_ID" --profile ${AWS_NAMED_PROFILE}
- aws configure set aws_secret_access_key "$SMP_BOT_KEY" --profile ${AWS_NAMED_PROFILE}
- aws configure set region us-west-2 --profile ${AWS_NAMED_PROFILE}
- echo "Checking if image exists for commit ${BASELINE_SHA}..."
- "while [[ ! $(aws ecr describe-images --region us-west-2 --profile single-machine-performance\
\ --registry-id \"${SMP_ACCOUNT_ID}\" --repository-name \"${SMP_AGENT_TEAM_ID}-agent\"\
\ --image-ids imageTag=\"${BASELINE_SHA}-7-full-amd64\") ]]\ndo\n echo \"No\
\ image exists for ${BASELINE_SHA} - checking predecessor of ${BASELINE_SHA} next\"\
\n BASELINE_SHA=$(git rev-parse ${BASELINE_SHA}^)\n echo \"Checking if commit\
\ ${BASELINE_SHA} is recent enough...\"\n BASELINE_COMMIT_TIME=$(git -c log.showSignature=false\
\ show --no-patch --format=%ct ${BASELINE_SHA})\n if [[ ${BASELINE_COMMIT_TIME}\
\ -le ${FOUR_DAYS_BEFORE_NOW} ]]\n then\n echo \"ERROR: Merge-base of\
\ this branch is too old for SMP. Please update your branch by merging an up-to-date\
\ main branch into your branch or by rebasing it on an up-to-date main branch.\"\
\n datadog-ci tag --level job --tags smp_merge_base_failure_reason:\"branch_too_old\"\
\n exit 1\n fi\n echo \"Commit ${BASELINE_SHA} is recent enough\"\
\n echo \"Checking if image exists for commit ${BASELINE_SHA}...\"\ndone\n"
- echo "Image exists for commit ${BASELINE_SHA}"
- echo "BASELINE_SHA=${BASELINE_SHA}" > regression_detector.env
- echo "Merge-base check passed. Baseline SHA saved to artifact."
stage: functional_test
tags:
- arch:amd64
- specific:true
timeout: 10m
variables:
GIT_DEPTH: 0
static_quality_gates
static_quality_gates:
artifacts:
expire_in: 1 week
paths:
- extract_rpm_package_report
- static_gate_report.json
when: always
image: registry.ddbuild.io/ci/datadog-agent-buildimages/docker_x64$CI_IMAGE_DOCKER_X64_SUFFIX:$CI_IMAGE_DOCKER_X64
needs:
- agent_deb-x64-a7
- agent_deb-x64-a7-fips
- agent_rpm-x64-a7
- agent_rpm-x64-a7-fips
- agent_rpm-arm64-a7
- agent_rpm-arm64-a7-fips
- agent_suse-x64-a7
- agent_suse-x64-a7-fips
- agent_suse-arm64-a7
- agent_suse-arm64-a7-fips
- agent_heroku_deb-x64-a7
- docker_build_agent7
- docker_build_agent7_arm64
- docker_build_agent7_jmx
- docker_build_agent7_jmx_arm64
- docker_build_cluster_agent_amd64
- docker_build_cluster_agent_arm64
- docker_build_cws_instrumentation_amd64
- docker_build_cws_instrumentation_arm64
- docker_build_dogstatsd_amd64
- docker_build_dogstatsd_arm64
- dogstatsd_deb-x64
- dogstatsd_deb-arm64
- dogstatsd_rpm-x64
- dogstatsd_suse-x64
- iot_agent_deb-x64
- iot_agent_deb-arm64
- iot_agent_deb-armhf
- iot_agent_rpm-x64
- iot_agent_suse-x64
- job: windows_msi_and_bosh_zip_x64-a7
optional: true
retry:
exit_codes:
- 42
max: 2
when:
- runner_system_failure
- stuck_or_timeout_failure
- unknown_failure
- api_failure
- scheduler_failure
- stale_schedule
- data_integrity_failure
rules:
- if: $E2E_COVERAGE_PIPELINE == "true"
when: never
- if: $CI_COMMIT_BRANCH == "main"
- if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: on_success
- if: $CI_COMMIT_BRANCH == "main"
when: never
- if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
when: never
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: never
- if: $CI_COMMIT_TAG != null
when: never
- changes:
compare_to: $COMPARE_TO_BRANCH
paths:
- go.mod
- go.sum
- cmd/**/*
- comp/**/*
- internal/**/*
- pkg/**/*
- rtloader/**/*
- tasks/**/*.py
- deps/**/*
- omnibus/**/*
- .bazelrc
- .bazelversion
- bazel/**/*
- BUILD.bazel
- MODULE.bazel
- release.json
- test/regression/**/*
+ - .gitlab/childs/smp-regression-child-pipeline.yml
+ - .gitlab/test/functional_test/regression_detector.yml
- Dockerfiles/**/*
- allow_failure: true
when: manual
script:
- DOCKER_LOGIN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $DOCKER_REGISTRY_RO user)
|| exit $?
- $CI_PROJECT_DIR/tools/ci/fetch_secret.sh $DOCKER_REGISTRY_RO token | crane auth
login --username "$DOCKER_LOGIN" --password-stdin "$DOCKER_REGISTRY_URL"
- EXIT="${PIPESTATUS[0]}"; if [ $EXIT -ne 0 ]; then echo "Unable to locate credentials
needs gitlab runner restart"; exit $EXIT; fi
- DATADOG_API_KEY="$("$CI_PROJECT_DIR"/tools/ci/fetch_secret.sh "$AGENT_API_KEY_ORG2"
token)" || exit $?; export DATADOG_API_KEY
- export DD_API_KEY="$DATADOG_API_KEY"
- DD_APP_KEY="$("$CI_PROJECT_DIR"/tools/ci/fetch_secret.sh "$AGENT_APP_KEY_ORG2"
token)" || exit $?; export DD_APP_KEY
- GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP key_b64)
|| exit $?; export GITHUB_KEY_B64
- GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP app_id)
|| exit $?; export GITHUB_APP_ID
- GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP
installation_id) || exit $?; export GITHUB_INSTALLATION_ID
- echo "Using agent GitHub App"
- SLACK_DATADOG_AGENT_BOT_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SLACK_AGENT
token) || exit $?; export SLACK_DATADOG_AGENT_BOT_TOKEN
- dda inv -- quality-gates.parse-and-trigger-gates || exit $?
stage: functional_test
tags:
- arch:amd64
- specific:true
variables:
GIT_DEPTH: 0
KUBERNETES_CPU_REQUEST: 8
OVERRIDE_GIT_STRATEGY: clone
trigger-single-machine-performance-regression_detector
trigger-single-machine-performance-regression_detector:
allow_failure: true
needs:
- artifacts: true
job: single_machine_performance-regression_detector-merge_base_check
- artifacts: false
job: single_machine_performance-full-amd64-a7
optional: true
rules:
- if: $E2E_COVERAGE_PIPELINE == "true"
when: never
- if: $CI_COMMIT_BRANCH == "main"
- if: $CI_COMMIT_BRANCH == "main"
when: never
- if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
when: never
- if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
when: never
- if: $CI_COMMIT_TAG != null
when: never
- changes:
compare_to: $COMPARE_TO_BRANCH
paths:
- go.mod
- go.sum
- cmd/**/*
- comp/**/*
- internal/**/*
- pkg/**/*
- rtloader/**/*
- tasks/**/*.py
- deps/**/*
- omnibus/**/*
- .bazelrc
- .bazelversion
- bazel/**/*
- BUILD.bazel
- MODULE.bazel
- release.json
- test/regression/**/*
+ - .gitlab/childs/smp-regression-child-pipeline.yml
+ - .gitlab/test/functional_test/regression_detector.yml
- Dockerfiles/**/*
stage: functional_test
trigger:
include:
- local: .gitlab/childs/smp-regression-child-pipeline.yml
variables:
BASELINE_SHA: $BASELINE_SHA
FF_KUBERNETES_HONOR_ENTRYPOINT: false
PARENT_PIPELINE_ID: $CI_PIPELINE_ID

Changes Summary

Removed Modified Added Renamed
0 9 0 0
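The merge-base check job above walks the baseline commit back through its predecessors until a commit with a published ECR image is found, failing once the candidate is older than the four-day SMP cutoff. A hedged Python sketch of that control flow; `parent_of`, `commit_time`, and `image_exists` are hypothetical stand-ins for the `git rev-parse`, `git show`, and `aws ecr describe-images` calls:

```python
import time

# Mirrors `date --date="-4 days +1 hour"` from the job script.
MAX_AGE_SECONDS = 4 * 24 * 3600 - 3600

def find_baseline(sha, parent_of, commit_time, image_exists, now=None):
    """Walk back from `sha` until a commit with a published image is found.

    Raises once the candidate commit is older than the cutoff, mirroring the
    `branch_too_old` failure tag in the job above.
    """
    now = time.time() if now is None else now
    cutoff = now - MAX_AGE_SECONDS
    if commit_time(sha) <= cutoff:
        raise RuntimeError("branch_too_old")
    while not image_exists(sha):
        # No image for this commit: fall back to its predecessor, re-checking age.
        sha = parent_of(sha)
        if commit_time(sha) <= cutoff:
            raise RuntimeError("branch_too_old")
    return sha
```

This is only a sketch of the loop's shape, not a replacement for the shell in the job, which also tags the failure mode via datadog-ci.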

Updated: .gitlab/childs/smp-regression-child-pipeline.yml

Modified Jobs

single-machine-performance-regression_detector
single-machine-performance-regression_detector:
allow_failure: true
artifacts:
expire_in: 1 weeks
paths:
- submission_metadata
- outputs/report.md
- outputs/regression_signal.json
- outputs/bounds_check_signal.json
- outputs/junit.xml
- outputs/report.json
- outputs/decision_record.md
when: always
before_script:
- echo "BASELINE_SHA=${BASELINE_SHA}" > regression_detector.env
image: registry.ddbuild.io/ci/datadog-agent-buildimages/docker_x64$CI_IMAGE_DOCKER_X64_SUFFIX:$CI_IMAGE_DOCKER_X64
retry:
exit_codes:
- 42
max: 2
when:
- runner_system_failure
- stuck_or_timeout_failure
- unknown_failure
- api_failure
- scheduler_failure
- stale_schedule
- data_integrity_failure
script:
- DATADOG_API_KEY="$("$CI_PROJECT_DIR"/tools/ci/fetch_secret.sh "$AGENT_API_KEY_ORG2"
token)" || exit $?; export DATADOG_API_KEY
- datadog-ci tag --level job --tags smp_failure_mode:"timeout"
- mkdir outputs
- AWS_NAMED_PROFILE="single-machine-performance"
- "SMP_ACCOUNT_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SMP_ACCOUNT account_id)\
\ || {\n exit_code=$?\n datadog-ci tag --level job --tags smp_failure_mode:\"\
setup-secrets\"\n exit $exit_code\n}\n"
- SMP_ECR_URL=${SMP_ACCOUNT_ID}.dkr.ecr.us-west-2.amazonaws.com
- "SMP_TEAM_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SMP_ACCOUNT \"${SMP_TEAM_NAME}_team_id\"\
) || {\n exit_code=$?\n datadog-ci tag --level job --tags smp_failure_mode:\"\
setup-secrets\"\n exit $exit_code\n}\n"
- "SMP_API=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SMP_ACCOUNT $SMP_API_URL)\
\ || {\n exit_code=$?\n datadog-ci tag --level job --tags smp_failure_mode:\"\
setup-secrets\"\n exit $exit_code\n}\n"
- "SMP_BOT_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SMP_ACCOUNT $BOT_LOGIN)\
\ || {\n exit_code=$?\n datadog-ci tag --level job --tags smp_failure_mode:\"\
setup-secrets\"\n exit $exit_code\n}\n"
- "SMP_BOT_KEY=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SMP_ACCOUNT $BOT_TOKEN)\
\ || {\n exit_code=$?\n datadog-ci tag --level job --tags smp_failure_mode:\"\
setup-secrets\"\n exit $exit_code\n}\n"
- aws configure set aws_access_key_id "$SMP_BOT_ID" --profile ${AWS_NAMED_PROFILE}
- aws configure set aws_secret_access_key "$SMP_BOT_KEY" --profile ${AWS_NAMED_PROFILE}
- aws configure set region us-west-2 --profile ${AWS_NAMED_PROFILE}
- "aws --profile single-machine-performance s3 cp s3://smp-cli-releases/${SMP_VERSION}/x86_64-unknown-linux-musl/smp\
\ smp || {\n exit_code=$?\n datadog-ci tag --level job --tags smp_failure_mode:\"\
setup-fetch-cli\"\n exit $exit_code\n}\n"
- chmod +x smp
- source regression_detector.env
- echo "Baseline SHA is ${BASELINE_SHA}"
- echo -n "${BASELINE_SHA}" > "${CI_COMMIT_SHA}-baseline_sha"
- aws s3 cp --profile single-machine-performance --only-show-errors "${CI_COMMIT_SHA}-baseline_sha"
"s3://${SMP_TEAM_ID}-smp-artifacts/information/"
- BASELINE_IMAGE=${SMP_ECR_URL}/${SMP_TEAM_ID}-agent:${BASELINE_SHA}-7-full-amd64
- echo "${BASELINE_SHA} | ${BASELINE_IMAGE}"
- COMPARISON_IMAGE=${SMP_ECR_URL}/${SMP_TEAM_ID}-agent:${CI_COMMIT_SHA}-7-full-amd64
- echo "${CI_COMMIT_SHA} | ${COMPARISON_IMAGE}"
- SMP_TAGS="ci_pipeline_id=${PARENT_PIPELINE_ID},ci_job_id=${CI_JOB_ID},ci_commit_branch=${CI_COMMIT_BRANCH},purpose=agent_ci"
- echo "Tags passed through SMP are ${SMP_TAGS}"
- RUST_LOG="info,aws_config::profile::credentials=error"
- RUST_LOG_DEBUG="debug,aws_config::profile::credentials=error"
- "RUST_LOG=\"${RUST_LOG}\" ./smp --team-id ${SMP_TEAM_ID} --api-base ${SMP_API}\
\ --aws-named-profile ${AWS_NAMED_PROFILE} \\\njob submit \\\n--baseline-image\
\ ${BASELINE_IMAGE} \\\n--comparison-image ${COMPARISON_IMAGE} \\\n--baseline-sha\
\ ${BASELINE_SHA} \\\n--comparison-sha ${CI_COMMIT_SHA} \\\n--target-config-dir\
\ test/regression/ \\\n--submission-metadata submission_metadata \\\n--tags ${SMP_TAGS}\
\ || {\n exit_code=$?\n echo \"smp job submit command failed with code $exit_code\"\
\n datadog-ci tag --level job --tags smp_failure_mode:\"job-submission\"\n exit\
\ $exit_code\n}\n"
- SMP_JOB_ID=$(jq -r '.jobId' submission_metadata)
- echo "SMP Job Id is ${SMP_JOB_ID}"
- datadog-ci tag --level job --tags smp_job_id:${SMP_JOB_ID}
- "RUST_LOG=\"${RUST_LOG}\" ./smp --team-id ${SMP_TEAM_ID} --api-base ${SMP_API}\
\ --aws-named-profile ${AWS_NAMED_PROFILE} \\\njob status \\\n--wait \\\n--wait-delay-seconds\
\ 60 \\\n--submission-metadata submission_metadata || {\n exit_code=$?\n echo\
\ \"smp job status command failed with code $exit_code\"\n datadog-ci tag --level\
\ job --tags smp_failure_mode:\"job-status\"\n exit $exit_code\n}\n"
- "RUST_LOG=\"${RUST_LOG}\" ./smp --team-id ${SMP_TEAM_ID} --api-base ${SMP_API}\
\ --aws-named-profile ${AWS_NAMED_PROFILE} \\\njob sync \\\n--submission-metadata\
\ submission_metadata \\\n--output-path outputs || {\n exit_code=$?\n echo \"\
smp job sync command failed with code $exit_code\"\n datadog-ci tag --level job\
\ --tags smp_failure_mode:\"job-sync\"\n exit $exit_code\n}\n"
- cat outputs/report.md | sed "s/^\$/$(echo -ne '\uFEFF\u00A0\u200B')/g"
- datadog-ci junit upload --service datadog-agent outputs/junit.xml
- datadog-ci tag --level job --tags smp_failure_mode:"none"
- datadog-ci tag --level job --tags smp_optimization_goal:"passed"
- "RUST_LOG=\"${RUST_LOG}\" ./smp --team-id ${SMP_TEAM_ID} --api-base ${SMP_API}\
\ --aws-named-profile ${AWS_NAMED_PROFILE} \\\n job result \\\n --submission-metadata\
\ submission_metadata --signal regression-detector || {\n exit_code=$?\n echo\
\ \"smp regression detector has detected a regression\"\n datadog-ci tag --level\
\ job --tags smp_optimization_goal:\"failed\"\n}\n"
- datadog-ci tag --level job --tags smp_bounds_check:"passed"
- "RUST_LOG=\"${RUST_LOG}\" ./smp --team-id ${SMP_TEAM_ID} --api-base ${SMP_API}\
\ --aws-named-profile ${AWS_NAMED_PROFILE} \\\n job result \\\n --submission-metadata\
\ submission_metadata --signal bounds-check || {\n exit_code=$?\n echo \"smp\
\ regression detector has detected a failed bounds check\"\n datadog-ci tag --level\
\ job --tags smp_bounds_check:\"failed\"\n}\n"
- datadog-ci tag --level job --tags smp_quality_gates:"failed"
- "python3 <<'EOF'\nimport json\nimport sys\n\ntry:\n with open('outputs/report.json')\
\ as f:\n data = json.load(f)\nexcept FileNotFoundError:\n print(\"\
Machine readable report not found.\")\n sys.exit(1)\nexcept json.JSONDecodeError\
\ as e:\n print(f\"Error parsing JSON report: {e}\")\n sys.exit(1)\n\nexperiments\
\ = data.get('experiments', {})\nfailed = False\ndecision_record = []\n\nfor exp_name,\
\ exp_data in experiments.items():\n if exp_name.startswith('quality_gate_'):\n\
\ bounds_checks = exp_data.get('bounds_checks', {})\n for check_name,\
\ check_data in bounds_checks.items():\n results = check_data.get('results',\
\ {})\n comparison = results.get('comparison', [])\n num_total\
\ = len(comparison)\n failed_replicates = [\n replicate\
\ for replicate in comparison if not replicate.get('passed', False)\n \
\ ]\n num_failed = len(failed_replicates)\n num_passed\
\ = num_total - num_failed\n if failed_replicates:\n \
\ decision_record.append(\n f\"- **{exp_name}**, bounds check\
\ **{check_name}**: {num_passed}/{num_total} replicas passed. Failed {num_failed}\
\ which is > 0. Gate **FAILED**.\"\n )\n failed\
\ = True\n else:\n decision_record.append(\n \
\ f\"- **{exp_name}**, bounds check **{check_name}**: {num_passed}/{num_total}\
\ replicas passed. Gate passed.\"\n )\n\nwith open('outputs/decision_record.md',\
\ 'w') as f:\n # Extra newline since this is appended to another report\n \
\ f.write('\\n\\n## CI Pass/Fail Decision\\n\\n')\n if failed:\n f.write('\u274C\
\ **Failed.** Some Quality Gates were violated.\\n\\n')\n f.write('\\n'.join(decision_record))\n\
\ else:\n f.write('\u2705 **Passed.** All Quality Gates passed.\\n\\\
n')\n f.write('\\n'.join(decision_record))\n\nif failed:\n print(\"\
Quality gate failed, see decision record\")\n sys.exit(1)\nelse:\n print(\"\
Quality gate passed.\")\n sys.exit(0)\nEOF\n"
- datadog-ci tag --level job --tags smp_quality_gates:"passed"
stage: functional_test
tags:
- arch:amd64
- specific:true
timeout: 1h10m
variables:
BOT_LOGIN: bot_login
BOT_TOKEN: bot_token
SMP_API_URL: api_url
SMP_TEAM_NAME: agent
- SMP_VERSION: v0.25.1
? ^
+ SMP_VERSION: v0.26.1
? ^

Changes Summary

Removed Modified Added Renamed
0 1 0 0

Diff available in the job log.
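The decision-record script embedded as a YAML heredoc in the child-pipeline job above is hard to read in escaped form. A plain-Python sketch of the same pass/fail logic, written as a standalone function for clarity (the real script reads `outputs/report.json` and exits nonzero on failure):

```python
def evaluate_quality_gates(report):
    """For each quality_gate_* experiment, every comparison replicate must
    pass its bounds check; any failed replicate fails the gate and the job."""
    failed = False
    record = []
    for exp_name, exp in report.get("experiments", {}).items():
        if not exp_name.startswith("quality_gate_"):
            continue  # only quality-gate experiments are gating
        for check_name, check in exp.get("bounds_checks", {}).items():
            comparison = check.get("results", {}).get("comparison", [])
            total = len(comparison)
            bad = [r for r in comparison if not r.get("passed", False)]
            passed = total - len(bad)
            if bad:
                failed = True
                record.append(f"- **{exp_name}**, bounds check **{check_name}**: "
                              f"{passed}/{total} replicas passed. Gate **FAILED**.")
            else:
                record.append(f"- **{exp_name}**, bounds check **{check_name}**: "
                              f"{passed}/{total} replicas passed. Gate passed.")
    return failed, record
```

The function returns the overall verdict plus the per-check lines that the job writes into outputs/decision_record.md.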

Contributor

agent-platform-auto-pr bot commented Mar 6, 2026 (edited)

Files inventory check summary

File check results against ancestor 7a3c63ad:

Results for datadog-agent_7.78.0~devel.git.413.57b038f.pipeline.101087717-1_amd64.deb:

No change detected

GeorgeHahn added the changelog/no-changelog (No changelog entry needed), team/single-machine-performance (Single Machine Performance), and qa/no-code-change (No code change in Agent code requiring validation) labels and removed the team/agent-devx label Mar 6, 2026
GeorgeHahn requested a review from a team March 6, 2026 22:57
Add the SMP child pipeline and regression detector configs to
artifacts_build_impacting_paths so changes to these files trigger
the SMP regression detector on dev branches.

Co-Authored-By: Claude Opus 4.6
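The commit above adds the two pipeline config files to the change-impact path list. As a rough illustration of how a changed file is tested against those globs, here is a sketch using Python's fnmatch; this only approximates GitLab's rules:changes glob semantics (notably for `**`), though the exact-path entries behave identically:

```python
from fnmatch import fnmatch

# The two paths added by this PR, plus a sample of the pre-existing globs.
TRIGGER_PATHS = [
    ".gitlab/childs/smp-regression-child-pipeline.yml",
    ".gitlab/test/functional_test/regression_detector.yml",
    "test/regression/**/*",
    "cmd/**/*",
]

def is_build_impacting(changed_file, patterns=TRIGGER_PATHS):
    """True when a changed file matches any trigger path (approximate)."""
    return any(fnmatch(changed_file, p) for p in patterns)
```

With this change, editing either child-pipeline file now counts as build-impacting, so the SMP regression detector runs on dev branches that touch them.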
dd-octo-sts bot added the team/agent-devx label Mar 6, 2026
cit-pr-commenter-54b7da bot commented Mar 7, 2026 (edited)

Regression Detector

Regression Detector Results

Metrics dashboard
Target profiles
Run ID: 70c6c8e8-bdde-401e-a9d1-e01b380cd65b

Baseline: 7a3c63a
Comparison: 57b038f
Diff

Optimization Goals: No significant changes detected

Experiments ignored for regressions

Regressions in experiments with settings containing erratic: true are ignored.

perf experiment goal Δ mean % Δ mean % CI trials links
docker_containers_cpu % cpu utilization -4.18 [-7.15, -1.21] 1 Logs

Fine details of change detection per experiment

perf experiment goal Δ mean % Δ mean % CI trials links
quality_gate_logs % cpu utilization +3.37 [+1.63, +5.10] 1 Logs bounds checks dashboard
ddot_logs memory utilization +0.71 [+0.64, +0.79] 1 Logs
uds_dogstatsd_20mb_12k_contexts_20_senders memory utilization +0.62 [+0.56, +0.68] 1 Logs
ddot_metrics_sum_delta memory utilization +0.42 [+0.25, +0.59] 1 Logs
otlp_ingest_metrics memory utilization +0.27 [+0.11, +0.43] 1 Logs
ddot_metrics_sum_cumulativetodelta_exporter memory utilization +0.18 [-0.05, +0.40] 1 Logs
tcp_syslog_to_blackhole ingress throughput +0.09 [-0.05, +0.22] 1 Logs
file_to_blackhole_1000ms_latency egress throughput +0.04 [-0.39, +0.46] 1 Logs
file_to_blackhole_100ms_latency egress throughput +0.04 [-0.04, +0.11] 1 Logs
file_tree memory utilization +0.03 [-0.02, +0.09] 1 Logs
tcp_dd_logs_filter_exclude ingress throughput +0.01 [-0.09, +0.10] 1 Logs
docker_containers_memory memory utilization +0.00 [-0.07, +0.08] 1 Logs
uds_dogstatsd_to_api_v3 ingress throughput -0.00 [-0.20, +0.20] 1 Logs
uds_dogstatsd_to_api ingress throughput -0.01 [-0.20, +0.18] 1 Logs
file_to_blackhole_500ms_latency egress throughput -0.02 [-0.41, +0.36] 1 Logs
quality_gate_idle memory utilization -0.03 [-0.08, +0.03] 1 Logs bounds checks dashboard
file_to_blackhole_0ms_latency egress throughput -0.04 [-0.57, +0.49] 1 Logs
quality_gate_metrics_logs memory utilization -0.07 [-0.30, +0.16] 1 Logs bounds checks dashboard
otlp_ingest_logs memory utilization -0.10 [-0.21, +0.00] 1 Logs
quality_gate_idle_all_features memory utilization -0.25 [-0.29, -0.22] 1 Logs bounds checks dashboard
ddot_metrics memory utilization -0.27 [-0.45, -0.10] 1 Logs
ddot_metrics_sum_cumulative memory utilization -0.47 [-0.61, -0.33] 1 Logs
docker_containers_cpu % cpu utilization -4.18 [-7.15, -1.21] 1 Logs

Bounds Checks: Passed

| perf | experiment | bounds_check_name | replicates_passed | observed_value | links |
|------|------------|-------------------|-------------------|----------------|-------|
| | docker_containers_cpu | simple_check_run | 10/10 | 625 >= 26 | |
| | docker_containers_memory | memory_usage | 10/10 | 274.92MiB <= 370MiB | |
| | docker_containers_memory | simple_check_run | 10/10 | 468 >= 26 | |
| | file_to_blackhole_0ms_latency | memory_usage | 10/10 | 0.20GiB <= 1.20GiB | |
| | file_to_blackhole_0ms_latency | missed_bytes | 10/10 | 0B = 0B | |
| | file_to_blackhole_1000ms_latency | memory_usage | 10/10 | 0.23GiB <= 1.20GiB | |
| | file_to_blackhole_1000ms_latency | missed_bytes | 10/10 | 0B = 0B | |
| | file_to_blackhole_100ms_latency | memory_usage | 10/10 | 0.20GiB <= 1.20GiB | |
| | file_to_blackhole_100ms_latency | missed_bytes | 10/10 | 0B = 0B | |
| | file_to_blackhole_500ms_latency | memory_usage | 10/10 | 0.21GiB <= 1.20GiB | |
| | file_to_blackhole_500ms_latency | missed_bytes | 10/10 | 0B = 0B | |
| | quality_gate_idle | intake_connections | 10/10 | 3 = 3 | bounds checks dashboard |
| | quality_gate_idle | memory_usage | 10/10 | 173.73MiB <= 175MiB | bounds checks dashboard |
| | quality_gate_idle_all_features | intake_connections | 10/10 | 3 = 3 | bounds checks dashboard |
| | quality_gate_idle_all_features | memory_usage | 10/10 | 492.63MiB <= 550MiB | bounds checks dashboard |
| | quality_gate_logs | intake_connections | 10/10 | 4 <= 6 | bounds checks dashboard |
| | quality_gate_logs | memory_usage | 10/10 | 202.56MiB <= 220MiB | bounds checks dashboard |
| | quality_gate_logs | missed_bytes | 10/10 | 0B = 0B | bounds checks dashboard |
| | quality_gate_metrics_logs | cpu_usage | 10/10 | 356.75 <= 2000 | bounds checks dashboard |
| | quality_gate_metrics_logs | intake_connections | 10/10 | 4 <= 6 | bounds checks dashboard |
| | quality_gate_metrics_logs | memory_usage | 10/10 | 400.40MiB <= 475MiB | bounds checks dashboard |
| | quality_gate_metrics_logs | missed_bytes | 10/10 | 0B = 0B | bounds checks dashboard |
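A bounds-check row like the ones above can be read as "all replicates satisfied the comparison against the bound". The sketch below (hypothetical helper names, not the actual SMP tooling) shows that evaluation under those assumed semantics:

```python
import operator

# Map the comparison symbols shown in the observed_value column to
# Python comparison functions.
OPS = {"<=": operator.le, ">=": operator.ge, "=": operator.eq}

def bounds_check(observed, op, bound):
    """Return (replicates_passed, total) for one bounds-check row."""
    passed = sum(1 for v in observed if OPS[op](v, bound))
    return passed, len(observed)

# quality_gate_idle memory_usage: every replicate stayed under 175 MiB
print(bounds_check([173.73] * 10, "<=", 175.0))  # (10, 10)
```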

Explanation

Confidence level: 90.00%
Effect size tolerance: |Δ mean %| >= 5.00%

Performance changes are noted in the perf column of each table:

  • improved = significantly better comparison variant performance
  • regressed = significantly worse comparison variant performance
  • neutral = no significant change in performance

A regression test is an A/B test of target performance in a repeatable rig, where "performance" is measured as "comparison variant minus baseline variant" for an optimization goal (e.g., ingress throughput). Due to intrinsic variability in measuring that goal, we can only estimate its mean value for each experiment; we report uncertainty in that value as a 90.00% confidence interval denoted "Δ mean % CI".

For each experiment, we decide whether a change in performance is a "regression" -- a change worth investigating further -- if all of the following criteria are true:

  1. Its estimated |Δ mean %| >= 5.00%, indicating the change is big enough to merit a closer look.

  2. Its 90.00% confidence interval "Δ mean % CI" does not contain zero, indicating that if our statistical model is accurate, there is at least a 90.00% chance there is a difference in performance between baseline and comparison variants.

  3. Its configuration does not mark it "erratic".
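The three criteria above can be sketched as a single predicate. This is an illustrative reconstruction of the stated rule, not the detector's actual implementation:

```python
def is_regression(delta_mean_pct, ci_low, ci_high, erratic=False,
                  tolerance_pct=5.0):
    big_enough = abs(delta_mean_pct) >= tolerance_pct       # criterion 1
    ci_excludes_zero = ci_low > 0 or ci_high < 0            # criterion 2
    return big_enough and ci_excludes_zero and not erratic  # criterion 3

# quality_gate_logs: +3.37 [+1.63, +5.10] is below the 5% tolerance
print(is_regression(3.37, 1.63, 5.10))   # False
# a hypothetical -6.2% change whose CI excludes zero would be flagged
print(is_regression(-6.2, -8.0, -4.4))   # True
```

Note that an experiment marked erratic is never flagged, no matter how large the measured change.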

CI Pass/Fail Decision

Passed. All Quality Gates passed.

  • quality_gate_idle_all_features, bounds check intake_connections: 10/10 replicas passed. Gate passed.
  • quality_gate_idle_all_features, bounds check memory_usage: 10/10 replicas passed. Gate passed.
  • quality_gate_metrics_logs, bounds check memory_usage: 10/10 replicas passed. Gate passed.
  • quality_gate_metrics_logs, bounds check cpu_usage: 10/10 replicas passed. Gate passed.
  • quality_gate_metrics_logs, bounds check missed_bytes: 10/10 replicas passed. Gate passed.
  • quality_gate_metrics_logs, bounds check intake_connections: 10/10 replicas passed. Gate passed.
  • quality_gate_logs, bounds check memory_usage: 10/10 replicas passed. Gate passed.
  • quality_gate_logs, bounds check intake_connections: 10/10 replicas passed. Gate passed.
  • quality_gate_logs, bounds check missed_bytes: 10/10 replicas passed. Gate passed.
  • quality_gate_idle, bounds check memory_usage: 10/10 replicas passed. Gate passed.
  • quality_gate_idle, bounds check intake_connections: 10/10 replicas passed. Gate passed.

Contributor

agent-platform-auto-pr bot commented Mar 7, 2026 (edited)
Static quality checks

Please find below the results from static quality gates
Comparison made with ancestor 7a3c63a
Static Quality Gates Dashboard
SQG Job

31 successful checks with minimal change (< 2 KiB)
| Quality gate | Current Size |
|--------------|--------------|
| agent_deb_amd64 | 745.367 MiB |
| agent_deb_amd64_fips | 703.715 MiB |
| agent_heroku_amd64 | 311.682 MiB |
| agent_msi | 609.593 MiB |
| agent_rpm_amd64 | 745.351 MiB |
| agent_rpm_amd64_fips | 703.699 MiB |
| agent_rpm_arm64 | 723.382 MiB |
| agent_rpm_arm64_fips | 684.727 MiB |
| agent_suse_amd64 | 745.351 MiB |
| agent_suse_amd64_fips | 703.699 MiB |
| agent_suse_arm64 | 723.382 MiB |
| agent_suse_arm64_fips | 684.727 MiB |
| docker_agent_amd64 | 805.705 MiB |
| docker_agent_arm64 | 808.525 MiB |
| docker_agent_jmx_amd64 | 996.620 MiB |
| docker_agent_jmx_arm64 | 988.219 MiB |
| docker_cluster_agent_amd64 | 203.573 MiB |
| docker_cluster_agent_arm64 | 217.974 MiB |
| docker_cws_instrumentation_amd64 | 7.135 MiB |
| docker_cws_instrumentation_arm64 | 6.689 MiB |
| docker_dogstatsd_amd64 | 38.619 MiB |
| docker_dogstatsd_arm64 | 36.938 MiB |
| dogstatsd_deb_amd64 | 29.839 MiB |
| dogstatsd_deb_arm64 | 27.992 MiB |
| dogstatsd_rpm_amd64 | 29.839 MiB |
| dogstatsd_suse_amd64 | 29.839 MiB |
| iot_agent_deb_amd64 | 43.040 MiB |
| iot_agent_deb_arm64 | 40.099 MiB |
| iot_agent_deb_armhf | 40.839 MiB |
| iot_agent_rpm_amd64 | 43.041 MiB |
| iot_agent_suse_amd64 | 43.041 MiB |
On-wire sizes (compressed)
| Quality gate | Change | Size (prev - curr - max, MiB) |
|--------------|--------|-------------------------------|
| agent_deb_amd64 | -17.33 KiB (0.01% reduction) | 173.961 - 173.944 - 177.700 |
| agent_deb_amd64_fips | -2.46 KiB (0.00% reduction) | 164.865 - 164.863 - 172.230 |
| agent_heroku_amd64 | neutral | 75.160 - 79.970 |
| agent_msi | -4.0 KiB (0.00% reduction) | 137.984 - 137.980 - 146.220 |
| agent_rpm_amd64 | +10.93 KiB (0.01% increase) | 175.939 - 175.950 - 180.780 |
| agent_rpm_amd64_fips | -11.42 KiB (0.01% reduction) | 167.789 - 167.778 - 173.370 |
| agent_rpm_arm64 | -6.93 KiB (0.00% reduction) | 158.337 - 158.331 - 161.610 |
| agent_rpm_arm64_fips | -6.13 KiB (0.00% reduction) | 151.141 - 151.135 - 155.910 |
| agent_suse_amd64 | +10.93 KiB (0.01% increase) | 175.939 - 175.950 - 180.780 |
| agent_suse_amd64_fips | -11.42 KiB (0.01% reduction) | 167.789 - 167.778 - 173.370 |
| agent_suse_arm64 | -6.93 KiB (0.00% reduction) | 158.337 - 158.331 - 161.610 |
| agent_suse_arm64_fips | -6.13 KiB (0.00% reduction) | 151.141 - 151.135 - 155.910 |
| docker_agent_amd64 | neutral | 266.546 - 271.240 |
| docker_agent_arm64 | neutral | 253.824 - 259.800 |
| docker_agent_jmx_amd64 | neutral | 335.204 - 339.870 |
| docker_agent_jmx_arm64 | +3.95 KiB (0.00% increase) | 318.439 - 318.443 - 324.390 |
| docker_cluster_agent_amd64 | neutral | 71.255 - 71.920 |
| docker_cluster_agent_arm64 | neutral | 66.885 - 67.220 |
| docker_cws_instrumentation_amd64 | neutral | 2.995 - 3.330 |
| docker_cws_instrumentation_arm64 | neutral | 2.727 - 3.090 |
| docker_dogstatsd_amd64 | neutral | 14.945 - 15.820 |
| docker_dogstatsd_arm64 | neutral | 14.274 - 14.830 |
| dogstatsd_deb_amd64 | neutral | 7.886 - 8.790 |
| dogstatsd_deb_arm64 | neutral | 6.771 - 7.710 |
| dogstatsd_rpm_amd64 | neutral | 7.897 - 8.800 |
| dogstatsd_suse_amd64 | neutral | 7.897 - 8.800 |
| iot_agent_deb_amd64 | neutral | 11.349 - 12.040 |
| iot_agent_deb_arm64 | neutral | 9.662 - 10.450 |
| iot_agent_deb_armhf | neutral | 9.893 - 10.620 |
| iot_agent_rpm_amd64 | neutral | 11.366 - 12.060 |
| iot_agent_suse_amd64 | neutral | 11.366 - 12.060 |
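Under the semantics suggested by the table above (an assumption; the actual static-quality-gate tooling is not shown here), a package passes while its current compressed size stays at or under the configured maximum, with the delta against the ancestor build reported for context:

```python
def size_gate(prev_mib, curr_mib, max_mib):
    """Return (gate_passed, delta_kib) for one on-wire size row."""
    delta_kib = (curr_mib - prev_mib) * 1024.0
    return curr_mib <= max_mib, delta_kib

# agent_deb_amd64: 173.961 -> 173.944 MiB against a 177.700 MiB limit.
# The delta differs slightly from the reported -17.33 KiB because the
# table's MiB figures are rounded.
ok, delta = size_gate(173.961, 173.944, 177.700)
print(ok, round(delta, 2))  # True -17.41
```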

goxberry approved these changes Mar 7, 2026


Labels: changelog/no-changelog (No changelog entry needed), internal (Identify a non-fork PR), qa/no-code-change (No code change in Agent code requiring validation), short review (PR is simple enough to be reviewed quickly), team/agent-devx, team/single-machine-performance (Single Machine Performance)
