
Memory leak/unbounded memory growth in ingest-replay-recordings consumer #4294

@timur-ND

Description

Self-Hosted Version

26.3.1

CPU Architecture

x86_64

Docker Version

24.0.5

Docker Compose Version

2.20.3

Machine Specification

  • My system meets the minimum system requirements of Sentry

Installation Type

Upgrade

Steps to Reproduce

  1. Run self-hosted Sentry with ingest-replay-recordings consumer enabled (default)
  2. Have little to no replay traffic (or disable Session Replay in all projects)
  3. Wait several days (21 days in my case)
  4. Observe memory growth in the ingest-replay-recordings container (9.3 GB RAM usage in my case)

Expected Result

Memory usage should remain stable when there is no replay traffic to process.

Actual Result

After 21 days of uptime, ingest-replay-recordings consumes 9.3GB RAM despite processing only 4 replay events in the last 30 days across all projects.

Memory analysis (/proc/smaps_rollup):

  • RSS: 9,527,620 kB (~9.3GB)
  • Anonymous: 9,520,480 kB (100% heap — Python malloc, no file mappings)
  • Private_Dirty: 9,520,480 kB
  • Swap: 0 kB
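The smaps_rollup figures above can be checked programmatically to confirm that essentially all of the RSS is private anonymous memory (i.e. Python heap, not file mappings). A minimal sketch; the parser below is my own illustration for this report, not part of Sentry:

```python
def parse_smaps_rollup(text: str) -> dict:
    """Parse the kB-valued fields from /proc/<pid>/smaps_rollup output."""
    values = {}
    for line in text.splitlines():
        parts = line.split()
        # Field lines look like: "Rss:             9527620 kB"
        if len(parts) == 3 and parts[0].endswith(":") and parts[2] == "kB":
            values[parts[0].rstrip(":")] = int(parts[1])
    return values

# Excerpt of the actual smaps_rollup output from the report.
sample = """\
Rss:             9527620 kB
Anonymous:       9520480 kB
Private_Dirty:   9520480 kB
Swap:                  0 kB
"""

stats = parse_smaps_rollup(sample)
anon_fraction = stats["Anonymous"] / stats["Rss"]
print(f"RSS: {stats['Rss'] / 1024 / 1024:.1f} GiB, anonymous: {anon_fraction:.1%}")
```

In a live container the `sample` string would instead be read from `/proc/<pid>/smaps_rollup` for the consumer's PID.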
docker top sentry-self-hosted-ingest-replay-recordings-1
UID                 PID                 PPID                C                   STIME               TTY                 TIME                CMD
systemd+            2956898             2956864             0                   Mar31               ?                   00:00:31            tini -- sentry run consumer ingest-replay-recordings --consumer-group ingest-replay-recordings --healthcheck-file-path /tmp/health.txt
systemd+            2962173             2956898             8                   Mar31               ?                   1-20:48:10          /.venv/bin/python3 /.venv/bin/sentry run consumer ingest-replay-recordings --consumer-group ingest-replay-recordings --healthcheck-file-path /tmp/health.txt

mem info:

cat /proc/2962173/smaps_rollup
56104f562000-7ffef5532000 ---p 00000000 00:00 0                          [rollup]
Rss:             9527620 kB
Pss:             9520678 kB
Pss_Anon:        9520480 kB
Pss_File:            198 kB
Pss_Shmem:             0 kB
Shared_Clean:       7140 kB
Shared_Dirty:          0 kB
Private_Clean:         0 kB
Private_Dirty:   9520480 kB
Referenced:      9488000 kB
Anonymous:       9520480 kB
LazyFree:              0 kB
AnonHugePages:     94208 kB
ShmemPmdMapped:        0 kB
FilePmdMapped:         0 kB
Shared_Hugetlb:        0 kB
Private_Hugetlb:       0 kB
Swap:                  0 kB
SwapPss:               0 kB
Locked:                0 kB

docker stats:

CONTAINER ID   NAME                                            CPU %     MEM USAGE / LIMIT     MEM %     NET I/O           BLOCK I/O         PIDS
b6eb4b58de09   sentry-self-hosted-ingest-replay-recordings-1   0.24%     9.103GiB / 62.27GiB   14.62%    830MB / 751MB     15.4MB / 1.49MB   15

CPU profile (py-spy, 60s): the process is idle in the Kafka poll loop (arroyo/backends/kafka/consumer.py); no actual message processing is occurring. Uptime: since 2026-03-31
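Since py-spy only samples CPU stacks, it cannot attribute the heap growth to an allocation site. One way to narrow it down would be to run the consumer with the standard-library tracemalloc module enabled and diff snapshots over time. A sketch of the snapshot-diffing pattern; the list comprehension is just a stand-in for whatever the consumer allocates between snapshots:

```python
import tracemalloc

tracemalloc.start(25)  # keep up to 25 frames per allocation

before = tracemalloc.take_snapshot()

# Stand-in allocation; in a real run you would let the
# consumer's poll loop run for a while between snapshots.
leaky = [b"x" * 1024 for _ in range(1000)]

after = tracemalloc.take_snapshot()

# Show the source lines whose allocated size grew the most.
for stat in after.compare_to(before, "lineno")[:5]:
    print(stat)
```

For a deployed container, the same idea can be wired to a signal handler or debug endpoint so snapshots can be taken without restarting the consumer.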

Total replay events across all projects (last 30 days): 4 (almost no replays)

curl -s "https://sentry.my.com/api/0/organizations/sentry/stats_v2/?field=sum(quantity)&category=replay&interval=1d&statsPeriod=30d" -H "Authorization: Bearer $token" | jq '.groups[0].totals["sum(quantity)"]'
4
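The stats_v2 check above can also be scripted for periodic monitoring. A sketch that just builds the same query URL with urllib; the host and organization slug are the placeholders from the curl call, and a real request would add the `Authorization: Bearer <token>` header:

```python
from urllib.parse import urlencode

def replay_count_url(base: str, org: str) -> str:
    """Build the stats_v2 URL used above to count replay events."""
    params = {
        "field": "sum(quantity)",
        "category": "replay",
        "interval": "1d",
        "statsPeriod": "30d",
    }
    return f"{base}/api/0/organizations/{org}/stats_v2/?{urlencode(params)}"

url = replay_count_url("https://sentry.my.com", "sentry")
print(url)
```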

py-spy flame graph (SVG):

py-spy record --pid 2962173 --output /tmp/replay-profile.svg --duration 60 --subprocesses

Event ID

No response
