Guiforge/sleepfake

💤 SleepFake: Time Travel for Your Tests

Ever wish your tests could skip the waiting but keep correct time behavior? SleepFake patches time.sleep and asyncio.sleep so tests return instantly while frozen time moves forward exactly as requested.

📦 Install (30 seconds)

```shell
pip install sleepfake
```

⚡ Quick start (recommended): global autouse

If you want instant wins with almost no boilerplate, make SleepFake apply to every test.

Add to pyproject.toml:

```toml
[tool.pytest.ini_options]
sleepfake_autouse = true
```

Now regular tests automatically skip sleeps:

```python
import time

def test_retry():
    start = time.time()
    time.sleep(30)  # returns instantly
    assert time.time() - start >= 30
```

Async works the same way:

```python
import asyncio

async def test_polling():
    start = asyncio.get_running_loop().time()
    await asyncio.sleep(10)  # returns instantly
    assert asyncio.get_running_loop().time() - start >= 10
```

pytest-asyncio users: add `asyncio_mode = "auto"` to pyproject.toml (or mark each test with `@pytest.mark.asyncio`) so pytest collects async tests correctly.
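For pytest-asyncio, that config-file route might look like this (a minimal sketch, combining it with the autouse option above):

```toml
[tool.pytest.ini_options]
asyncio_mode = "auto"
sleepfake_autouse = true
```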

✅ Result: your suite keeps time-based correctness, minus the wall-clock pain.

🧭 Choose your usage style

| Use case | Best option | Boilerplate |
|---|---|---|
| Apply everywhere (most teams) | Global autouse (`sleepfake_autouse = true` or `--sleepfake`) | Lowest |
| Per-test explicit control | `sleepfake` fixture | Low |
| Decoration-style usage | `@pytest.mark.sleepfake` | Low |
| Non-pytest scripts / direct control | `SleepFake` context manager | Medium |

📚 Full details (expand as needed)

Context manager usage

```python
import time
import asyncio
from sleepfake import SleepFake

# Sync
with SleepFake():
    start = time.time()
    time.sleep(10)           # returns instantly
    assert time.time() - start >= 10

# Async: use async with for proper cleanup of the background processor
async def test_async():
    async with SleepFake():
        start = asyncio.get_running_loop().time()
        await asyncio.sleep(5)   # returns instantly
        assert asyncio.get_running_loop().time() - start >= 5
```

Customize freezegun ignores via `ignore`:

```python
from sleepfake import SleepFake

# `_pytest.timing` is always ignored by default to keep pytest durations sane.
# Add your own modules as needed.
with SleepFake(ignore=["my_project.telemetry"]):
    ...
```

Fixture usage (`sleepfake`)

Install once; the sleepfake fixture is available automatically in tests.

```python
import time

def test_retry_logic(sleepfake):
    start = time.time()
    time.sleep(30)           # instantly skipped
    assert time.time() - start >= 30
```

```python
import asyncio

async def test_polling(sleepfake):
    start = asyncio.get_running_loop().time()
    await asyncio.gather(
        asyncio.sleep(1),
        asyncio.sleep(5),
        asyncio.sleep(3),
    )
    # All three complete instantly; frozen clock sits at +5 s
    assert asyncio.get_running_loop().time() - start >= 5
```

Deprecated: the `asleepfake` fixture. Use `sleepfake` for both sync and async tests.

Marker usage (`@pytest.mark.sleepfake`)

```python
import time
import asyncio
import pytest

@pytest.mark.sleepfake
def test_marked_sync():
    start = time.time()
    time.sleep(100)
    assert time.time() - start >= 100

@pytest.mark.sleepfake
async def test_marked_async():
    start = asyncio.get_running_loop().time()
    await asyncio.sleep(100)
    assert asyncio.get_running_loop().time() - start >= 100
```

If a test already requests the sleepfake fixture, this marker becomes a no-op (no double patching).

Global autouse: all options and opt-out

Option A: config file (pyproject.toml / pytest.ini)

```toml
# pyproject.toml
[tool.pytest.ini_options]
sleepfake_autouse = true
sleepfake_ignore = ["my_project.telemetry", "my_project.metrics"]
```

```ini
# pytest.ini
[pytest]
sleepfake_autouse = true
sleepfake_ignore =
    my_project.telemetry
    my_project.metrics
```

Option B: CLI flag

```shell
pytest --sleepfake
pytest --sleepfake --sleepfake-ignore my_project.telemetry --sleepfake-ignore my_project.metrics
```

Disable autouse per-test

```python
import time
import pytest

# This test runs with SleepFake (autouse applies).
def test_patched():
    start = time.time()
    time.sleep(100)
    assert time.time() - start >= 100

# This test uses real time: SleepFake is NOT applied.
@pytest.mark.no_sleepfake
def test_needs_real_time():
    start = time.time()
    time.sleep(0.01)
    assert time.time() - start < 5
```

`@pytest.mark.no_sleepfake` only disables the autouse layer; if a test explicitly requests the `sleepfake` fixture, patching still happens.

Option C: per-directory autouse in conftest.py

```python
# conftest.py
import pytest

@pytest.fixture(autouse=True)
def _sleepfake_sync(sleepfake):
    """Auto-apply SleepFake for every test (sync and async)."""

@pytest.fixture(autouse=True)
async def _sleepfake_async(sleepfake):
    """Async counterpart: shares the same sleepfake instance; no double-patch."""
```

Both fixtures share one sleepfake instance.

Configure ignores in conftest.py

If you need project- or directory-specific ignore rules without touching pyproject.toml:

```python
# conftest.py
pytest_sleepfake_ignore = ["my_project.telemetry", "my_project.metrics"]
```

This is used by:

  • the `sleepfake` fixture
  • `@pytest.mark.sleepfake`
  • global autouse mode (`sleepfake_autouse = true` or `--sleepfake`)

asyncio.timeout integration

The frozen clock advances before each sleep future resolves, so asyncio.timeout still fires correctly:

```python
import asyncio
import pytest
from sleepfake import SleepFake

async def test_timeout_fires():
    with SleepFake():
        with pytest.raises(TimeoutError):
            async with asyncio.timeout(2):
                await asyncio.sleep(10)   # clock jumps to +10 s → timeout at +2 s fires
```

Options reference (API, CLI, config)
| Where | Option | Example | Purpose |
|---|---|---|---|
| Python API (`SleepFake`) | `ignore: list[str] \| None` | `SleepFake(ignore=["my.module"])` | Add module prefixes freezegun should ignore while freezing time. |
| Pytest CLI | `--sleepfake` | `pytest --sleepfake` | Enable SleepFake for every test in the session. |
| Pytest CLI | `--sleepfake-ignore MODULE` | `pytest --sleepfake-ignore my.module` | Add a module prefix to ignore (repeatable; merged with `sleepfake_ignore`). |
| Pytest config (pytest.ini / pyproject.toml) | `sleepfake_autouse = true` | `[tool.pytest.ini_options]`<br>`sleepfake_autouse = true` | Same as `--sleepfake`, but persisted in config. |
| Pytest config (pytest.ini / pyproject.toml) | `sleepfake_ignore` | `sleepfake_ignore = ["my.module"]` | Add module prefixes to ignore for all pytest-managed SleepFake usage. |
| conftest.py | `pytest_sleepfake_ignore` | `pytest_sleepfake_ignore = ["my.module"]` | Override ignore prefixes for a test subtree (directory-scoped). |
| Pytest marker | `@pytest.mark.no_sleepfake` | `@pytest.mark.no_sleepfake` | Opt a single test out of global autouse patching. Has no effect if the test explicitly requests the `sleepfake` fixture. |

Notes:

  • Every ignore list is merged with `DEFAULT_IGNORE = ["_pytest.timing"]`. This keeps pytest duration measurement on real clocks and prevents epoch-scale `--durations` output.
  • User-provided ignore values are appended after `DEFAULT_IGNORE` and deduplicated.
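The merge rule above can be sketched as follows (`merged_ignore` is a hypothetical helper for illustration, not part of SleepFake's API):

```python
DEFAULT_IGNORE = ["_pytest.timing"]

def merged_ignore(user_ignore=None):
    """Defaults first, then user prefixes, deduplicated in first-seen order."""
    merged = []
    for prefix in DEFAULT_IGNORE + list(user_ignore or []):
        if prefix not in merged:
            merged.append(prefix)
    return merged

print(merged_ignore(["my.module", "_pytest.timing"]))  # ['_pytest.timing', 'my.module']
```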

⚠️ Scope limitation

SleepFake patches time.sleep and asyncio.sleep at two levels:

  1. The source module (time.sleep / asyncio.sleep), via unittest.mock.patch.
  2. Module-level aliases in sys.modules: any attribute that points to the original time.sleep or asyncio.sleep at context entry is patched too. This covers the common from time import sleep pattern at the top of a module.

The one case that cannot be covered is a local variable binding created inside a function body before the context is entered:

```python
import time
from sleepfake import SleepFake

def hard_to_patch():
    _sleep = time.sleep   # local variable, not visible in sys.modules
    with SleepFake():
        _sleep(10)        # ⚠️ calls the real time.sleep; cannot be intercepted
```
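The sys.modules scan behind level 2 can be illustrated with a standalone sketch (the names here are invented for illustration; this is not SleepFake's actual code):

```python
import sys
import time
import types

# Simulate a user module that did `from time import sleep` at import time.
mod = types.ModuleType("my_fake_module")
mod.sleep = time.sleep
sys.modules["my_fake_module"] = mod

def find_sleep_aliases(module):
    """Return attribute names on `module` bound to the original time.sleep."""
    return [a for a in dir(module) if getattr(module, a, None) is time.sleep]

print(find_sleep_aliases(sys.modules["my_fake_module"]))  # ['sleep']
```

A patcher that walks every entry in sys.modules this way can swap each matched attribute on context entry and restore it on exit, which is why the module-level alias pattern is covered while local variables are not.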

🧪 How it works

| Aspect | Detail |
|---|---|
| Sync sleep | `frozen_factory.tick(delta)` advances frozen time immediately |
| Async sleep | `(deadline, seq, future)` goes into an `asyncio.PriorityQueue`; a background task resolves futures in deadline order |
| Broad patching | On context entry, `sys.modules` is scanned for module-level aliases of the originals (e.g. `from time import sleep`); all matched attributes are replaced and restored on exit |
| Timeout safety | After advancing time, the processor yields one event-loop turn so timeout callbacks can fire before futures resolve |
| Cancellation | Cancelled futures are skipped; the processor keeps running |
| pytest durations | `freeze_time(..., ignore=["_pytest.timing", ...])` avoids breaking pytest internal wall-clock timing |
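The async rows above can be modeled with a toy deadline queue (names like `FakeSleeper` are invented for illustration; the real processor uses an `asyncio.PriorityQueue` and also drives the freezegun clock):

```python
import asyncio
import itertools

class FakeSleeper:
    """Toy model: fake sleeps enqueue (deadline, seq, future) and a
    processor resolves them in deadline order without real waiting."""

    def __init__(self):
        self.now = 0.0                 # frozen-clock stand-in
        self._queue = []               # sorted list standing in for a PriorityQueue
        self._seq = itertools.count()  # tie-breaker so equal deadlines stay FIFO

    async def sleep(self, delay):
        fut = asyncio.get_running_loop().create_future()
        self._queue.append((self.now + delay, next(self._seq), fut))
        self._queue.sort()
        await fut

    def process_one(self):
        deadline, _, fut = self._queue.pop(0)
        self.now = max(self.now, deadline)  # advance the clock before resolving
        if not fut.cancelled():             # skip cancelled futures, keep running
            fut.set_result(None)

async def main():
    s = FakeSleeper()
    tasks = [asyncio.ensure_future(s.sleep(d)) for d in (5, 1, 3)]
    await asyncio.sleep(0)      # let the sleep coroutines enqueue their futures
    while s._queue:
        s.process_one()
        await asyncio.sleep(0)  # yield one loop turn, like the timeout-safety step
    await asyncio.gather(*tasks)
    return s.now

print(asyncio.run(main()))  # 5.0
```

All three sleeps resolve instantly, yet the clock ends at the largest deadline, mirroring the gather example earlier.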

🤝 Contributing

PRs and issues welcome! Here's how to get started:

```shell
# Install dependencies and run the test suite
uv run pytest --force-sugar -vvv

# Lint (ruff + mypy) then test
make test-all

# Run against all supported Python versions (3.10–3.15)
make test-all-python
```

Please run make test-all before submitting a PR.

Note: SleepFake uses freezegun under the hood.
