3 changes: 3 additions & 0 deletions .github/workflows/run-end-to-end.yml
@@ -255,6 +255,9 @@ jobs:
- name: Run LIBRARY_CONF_CUSTOM_HEADER_TAGS_INVALID scenario
if: always() && steps.build.outcome == 'success' && contains(inputs.scenarios, '"LIBRARY_CONF_CUSTOM_HEADER_TAGS_INVALID"')
run: ./run.sh LIBRARY_CONF_CUSTOM_HEADER_TAGS_INVALID
- name: Run OTLP_RUNTIME_METRICS scenario
if: always() && steps.build.outcome == 'success' && contains(inputs.scenarios, '"OTLP_RUNTIME_METRICS"')
run: ./run.sh OTLP_RUNTIME_METRICS
- name: Run RUNTIME_METRICS_ENABLED scenario
if: always() && steps.build.outcome == 'success' && contains(inputs.scenarios, '"RUNTIME_METRICS_ENABLED"')
run: ./run.sh RUNTIME_METRICS_ENABLED
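Note the quoting convention in the `contains()` guard above: `inputs.scenarios` appears to be a JSON-encoded list rendered as a string, and GitHub's `contains()` performs a plain substring match, so keeping the quotes inside `'"OTLP_RUNTIME_METRICS"'` prevents one scenario name from matching a longer sibling. A hypothetical Python analogue of that check (the function name and example lists are illustrative, not part of the workflow):

```python
import json


def scenario_selected(scenarios: list[str], name: str) -> bool:
    """Mimic contains(inputs.scenarios, '"NAME"') from the workflow.

    scenarios stands in for the JSON-encoded scenario list; keeping the
    quotes around the name avoids false positives on names that merely
    share a prefix (e.g. RUNTIME_METRICS vs RUNTIME_METRICS_ENABLED).
    """
    return f'"{name}"' in json.dumps(scenarios)


print(scenario_selected(["OTLP_RUNTIME_METRICS"], "OTLP_RUNTIME_METRICS"))  # True
print(scenario_selected(["RUNTIME_METRICS_ENABLED"], "RUNTIME_METRICS"))    # False
```

An unquoted substring check would select `RUNTIME_METRICS` whenever `RUNTIME_METRICS_ENABLED` was requested; the quoted form only matches the exact list element.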
1 change: 1 addition & 0 deletions manifests/cpp.yml
@@ -310,5 +310,6 @@ manifest:
tests/test_library_logs.py::Test_NoExceptions::test_dotnet: irrelevant (only for .NET)
tests/test_library_logs.py::Test_NoExceptions::test_java_logs: irrelevant (only for Java)
tests/test_library_logs.py::Test_NoExceptions::test_java_telemetry_logs: irrelevant (only for Java)
tests/test_otlp_runtime_metrics.py::Test_OtlpRuntimeMetrics: missing_feature
tests/test_telemetry.py::Test_Telemetry::test_telemetry_message_has_datadog_container_id: "irrelevant (cgroup in weblog is 0::/, so this test can't work)"
tests/test_telemetry.py::Test_Telemetry::test_telemetry_message_required_headers: missing_feature
1 change: 1 addition & 0 deletions manifests/dotnet.yml
@@ -1123,6 +1123,7 @@ manifest:
tests/test_library_conf.py::Test_HeaderTags_Wildcard_Response_Headers: missing_feature
tests/test_library_logs.py::Test_NoExceptions::test_java_logs: irrelevant (only for Java)
tests/test_library_logs.py::Test_NoExceptions::test_java_telemetry_logs: irrelevant (only for Java)
tests/test_otlp_runtime_metrics.py::Test_OtlpRuntimeMetrics: v3.42.0
tests/test_profiling.py::Test_Profile:
- weblog_declaration:
"*": v1.0 # real version not known
1 change: 1 addition & 0 deletions manifests/golang.yml
@@ -1392,6 +1392,7 @@ manifest:
tests/test_library_logs.py::Test_NoExceptions::test_dotnet: irrelevant (only for .NET)
tests/test_library_logs.py::Test_NoExceptions::test_java_logs: irrelevant (only for Java)
tests/test_library_logs.py::Test_NoExceptions::test_java_telemetry_logs: irrelevant (only for Java)
tests/test_otlp_runtime_metrics.py::Test_OtlpRuntimeMetrics: missing_feature
tests/test_profiling.py::Test_Profile::test_process_tags_svc: missing_feature
tests/test_protobuf.py: missing_feature
tests/test_resource_renaming.py:
1 change: 1 addition & 0 deletions manifests/java.yml
@@ -4216,6 +4216,7 @@ manifest:
akka-http: irrelevant (integration injects Date header after bytecode injection occurs)
play: irrelevant (integration injects Date header after bytecode injection occurs)
tests/test_library_logs.py::Test_NoExceptions::test_dotnet: irrelevant (only for .NET)
tests/test_otlp_runtime_metrics.py::Test_OtlpRuntimeMetrics: missing_feature
tests/test_profiling.py::Test_Profile:
- weblog_declaration:
akka-http: v1.22.0
1 change: 1 addition & 0 deletions manifests/nodejs.yml
@@ -2310,6 +2310,7 @@ manifest:
tests/test_library_logs.py::Test_NoExceptions::test_dotnet: irrelevant (only for .NET)
tests/test_library_logs.py::Test_NoExceptions::test_java_logs: irrelevant (only for Java)
tests/test_library_logs.py::Test_NoExceptions::test_java_telemetry_logs: irrelevant (only for Java)
tests/test_otlp_runtime_metrics.py::Test_OtlpRuntimeMetrics: missing_feature
tests/test_profiling.py::Test_Profile: *ref_5_16_0
tests/test_profiling.py::Test_Profile::test_process_tags: missing_feature
tests/test_profiling.py::Test_Profile::test_process_tags_svc: missing_feature
1 change: 1 addition & 0 deletions manifests/php.yml
@@ -996,6 +996,7 @@ manifest:
tests/test_library_logs.py::Test_NoExceptions::test_dotnet: irrelevant (only for .NET)
tests/test_library_logs.py::Test_NoExceptions::test_java_logs: irrelevant (only for Java)
tests/test_library_logs.py::Test_NoExceptions::test_java_telemetry_logs: irrelevant (only for Java)
tests/test_otlp_runtime_metrics.py::Test_OtlpRuntimeMetrics: missing_feature
tests/test_profiling.py::Test_Profile:
- declaration: missing_feature (profiling seems not to be activated)
component_version: <1.16.0
1 change: 1 addition & 0 deletions manifests/python.yml
@@ -2152,6 +2152,7 @@ manifest:
tests/test_library_logs.py::Test_NoExceptions::test_dotnet: irrelevant (only for .NET)
tests/test_library_logs.py::Test_NoExceptions::test_java_logs: irrelevant (only for Java)
tests/test_library_logs.py::Test_NoExceptions::test_java_telemetry_logs: irrelevant (only for Java)
tests/test_otlp_runtime_metrics.py::Test_OtlpRuntimeMetrics: missing_feature
tests/test_profiling.py::Test_Profile:
- weblog_declaration:
"*": v0.1 # actual version unknown
1 change: 1 addition & 0 deletions manifests/ruby.yml
@@ -1938,6 +1938,7 @@ manifest:
tests/test_library_logs.py::Test_NoExceptions::test_dotnet: irrelevant (only for .NET)
tests/test_library_logs.py::Test_NoExceptions::test_java_logs: irrelevant (only for Java)
tests/test_library_logs.py::Test_NoExceptions::test_java_telemetry_logs: irrelevant (only for Java)
tests/test_otlp_runtime_metrics.py::Test_OtlpRuntimeMetrics: missing_feature
tests/test_profiling.py::Test_Profile: # Modified by easy win activation script
- declaration: missing_feature (temporary fix, scenario not working on dd-trace-rb CI)
component_version: <2.24.0
141 changes: 141 additions & 0 deletions tests/test_otlp_runtime_metrics.py
@@ -0,0 +1,141 @@
"""Test that runtime metrics are exported via OTLP using OTel semantic convention names.

When DD_RUNTIME_METRICS_ENABLED=true and DD_METRICS_OTEL_ENABLED=true, dd-trace-*
libraries should send runtime metrics via OTLP with OTel-native naming (dotnet.*,
jvm.*, go.*, v8js.*, etc.) instead of DD-proprietary naming (runtime.dotnet.*,
runtime.go.*, runtime.node.*, etc.).

Related PRs:
- .NET: https://github.com/DataDog/dd-trace-dotnet/pull/8299
- Go: https://github.com/DataDog/dd-trace-go/pull/4611
- Node.js: https://github.com/DataDog/dd-trace-js/pull/7869
- Java: https://github.com/DataDog/dd-trace-java/pull/10985
"""

from utils import context, features, interfaces, scenarios, weblog


# All OTel semconv metric names that MUST be present per language.
# These are the complete instrument sets from each tracer's OTLP runtime metrics implementation.
EXPECTED_METRICS = {
"dotnet": [
"dotnet.assembly.count",
"dotnet.exceptions",
"dotnet.gc.collections",
"dotnet.gc.heap.total_allocated",
"dotnet.gc.last_collection.heap.fragmentation.size",
"dotnet.gc.last_collection.heap.size",
"dotnet.gc.last_collection.memory.committed_size",
"dotnet.gc.pause.time",
"dotnet.jit.compilation.time",
"dotnet.jit.compiled_il.size",
"dotnet.jit.compiled_methods",
"dotnet.monitor.lock_contentions",
"dotnet.process.cpu.count",
"dotnet.process.cpu.time",
"dotnet.process.memory.working_set",
"dotnet.thread_pool.queue.length",
"dotnet.thread_pool.thread.count",
"dotnet.thread_pool.work_item.count",
"dotnet.timer.count",
],
"golang": [
"go.config.gogc",
"go.goroutine.count",
"go.memory.allocated",
"go.memory.allocations",
"go.memory.gc.goal",
"go.memory.limit",
"go.memory.used",
"go.processor.limit",
],
"nodejs": [
"nodejs.eventloop.delay.max",
"nodejs.eventloop.delay.mean",
"nodejs.eventloop.delay.min",
"nodejs.eventloop.delay.p50",
"nodejs.eventloop.delay.p90",
"nodejs.eventloop.delay.p99",
"nodejs.eventloop.utilization",
"process.cpu.utilization",
"process.memory.usage",
"v8js.memory.heap.limit",
"v8js.memory.heap.space.available_size",
"v8js.memory.heap.space.physical_size",
"v8js.memory.heap.used",
],
"java": [
"jvm.buffer.count",
"jvm.buffer.memory.limit",
"jvm.buffer.memory.used",
"jvm.class.count",
"jvm.class.loaded",
"jvm.class.unloaded",
"jvm.cpu.count",
"jvm.cpu.recent_utilization",
"jvm.cpu.time",
"jvm.file_descriptor.count",
"jvm.file_descriptor.limit",
"jvm.memory.committed",
"jvm.memory.init",
"jvm.memory.limit",
"jvm.memory.used",
"jvm.memory.used_after_last_gc",
"jvm.system.cpu.utilization",
"jvm.thread.count",
],
}

# DD-proprietary metric name prefixes that should NOT appear when OTLP metrics
# are active. Java is matched on the full "jvm.heap_memory" name rather than a
# prefix, because the OTel semconv names also start with "jvm.".
DD_PROPRIETARY_PREFIXES = {
"dotnet": "runtime.dotnet.",
"golang": "runtime.go.",
"nodejs": "runtime.node.",
"java": "jvm.heap_memory",
}


def get_runtime_metric_names():
"""Extract runtime metric names from the agent interface (agent -> backend series).

Uses interfaces.agent.get_metrics(), the same approach as
Test_Config_RuntimeMetrics_Enabled in test_config_consistency.py.
"""
metric_names = set()
for _, metric in interfaces.agent.get_metrics():
metric_names.add(metric["metric"])
return metric_names


@scenarios.otlp_runtime_metrics
@features.runtime_metrics
class Test_OtlpRuntimeMetrics:
"""Verify runtime metrics are sent via OTLP with OTel names, not DD-proprietary names."""

def setup_main(self):
self.req = weblog.get("/")

def test_main(self):
assert self.req.status_code == 200

library = context.library.name
if library not in EXPECTED_METRICS:
return

metric_names = get_runtime_metric_names()

# All expected OTel-named metrics must be present
expected = EXPECTED_METRICS[library]
for expected_name in expected:
assert expected_name in metric_names, (
f"Expected OTel runtime metric '{expected_name}' not found for {library}. "
f"Got metrics: {sorted(metric_names)}"
)

# DD-proprietary names must NOT be present
dd_prefix = DD_PROPRIETARY_PREFIXES.get(library)
if dd_prefix:
dd_named_metrics = [n for n in metric_names if n.startswith(dd_prefix)]
assert len(dd_named_metrics) == 0, (
f"Found DD-proprietary metric names for {library}: {dd_named_metrics}. Expected OTel-native names only."
)
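As a standalone illustration of the assertion logic in `test_main`, the two checks (every expected OTel name present, no DD-proprietary name present) can be exercised against hypothetical metric sets. `EXPECTED` and `DD_PREFIX` below mirror the Go entries, but the reported names are made up for the example:

```python
# Sketch of the OTel-vs-DD naming check, using hypothetical metric names.
EXPECTED = ["go.goroutine.count", "go.memory.used"]
DD_PREFIX = "runtime.go."


def check(metric_names: set[str]) -> tuple[list[str], list[str]]:
    """Return (missing OTel metrics, unexpected DD-proprietary metrics)."""
    missing = [m for m in EXPECTED if m not in metric_names]
    dd_named = sorted(m for m in metric_names if m.startswith(DD_PREFIX))
    return missing, dd_named


# Compliant tracer: all OTel names present, no DD-proprietary names.
print(check({"go.goroutine.count", "go.memory.used", "go.config.gogc"}))
# → ([], [])

# Non-migrated tracer: a DD name is still emitted and an OTel name is missing.
print(check({"runtime.go.num_goroutine", "go.goroutine.count"}))
# → (['go.memory.used'], ['runtime.go.num_goroutine'])
```

Both result lists must be empty for the test to pass; extra OTel-named metrics beyond `EXPECTED` are tolerated, which matches the test's one-directional membership assertions.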
13 changes: 13 additions & 0 deletions utils/_context/_scenarios/__init__.py
@@ -1188,6 +1188,19 @@ class _Scenarios:
doc="Test runtime metrics",
)

otlp_runtime_metrics = EndToEndScenario(
"OTLP_RUNTIME_METRICS",
weblog_env={
"DD_METRICS_OTEL_ENABLED": "true",
"OTEL_EXPORTER_OTLP_PROTOCOL": "http/protobuf",
"OTEL_EXPORTER_OTLP_METRICS_ENDPOINT": f"http://proxy:{ProxyPorts.open_telemetry_weblog}/v1/metrics",
"OTEL_EXPORTER_OTLP_METRICS_HEADERS": "dd-protocol=otlp,dd-otlp-path=agent",
},
runtime_metrics_enabled=True,
library_interface_timeout=20,
doc="Test runtime metrics exported via OTLP with OTel semantic convention names",
)

# Appsec Lambda Scenarios
appsec_lambda_default = LambdaScenario(
"APPSEC_LAMBDA_DEFAULT",
1 change: 1 addition & 0 deletions utils/scripts/ci_orchestrators/workflow_data.py
@@ -635,6 +635,7 @@ def _is_supported(library: str, weblog: str, scenario: str, _ci_environment: str
"IPV6",
"LIBRARY_CONF_CUSTOM_HEADER_TAGS",
"LIBRARY_CONF_CUSTOM_HEADER_TAGS_INVALID",
"OTLP_RUNTIME_METRICS",
"PERFORMANCES",
"PROFILING",
"REMOTE_CONFIG_MOCKED_BACKEND_ASM_DD",