Ever wish your tests could skip the waiting but keep correct time behavior? SleepFake patches `time.sleep` and `asyncio.sleep` so tests return instantly while frozen time moves forward exactly as requested.
```shell
pip install sleepfake
```

If you want instant wins with almost no boilerplate, make SleepFake apply to every test.
Add to `pyproject.toml`:

```toml
[tool.pytest.ini_options]
sleepfake_autouse = true
```

Now regular tests automatically skip sleeps:
```python
import time

def test_retry():
    start = time.time()
    time.sleep(30)  # returns instantly
    assert time.time() - start >= 30
```

Async works the same way:
```python
import asyncio

async def test_polling():
    start = asyncio.get_running_loop().time()
    await asyncio.sleep(10)  # returns instantly
    assert asyncio.get_running_loop().time() - start >= 10
```
`pytest-asyncio` users: add `asyncio_mode = "auto"` to `pyproject.toml` (or mark each test with `@pytest.mark.asyncio`) so pytest collects async tests correctly.
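For `pytest-asyncio` users, the combined configuration could look like this (a minimal sketch; adjust to your project, and drop `sleepfake_autouse` if you prefer per-test opt-in):

```toml
[tool.pytest.ini_options]
asyncio_mode = "auto"
sleepfake_autouse = true
```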
✅ Result: your suite keeps time-based correctness, minus the wall-clock pain.
| Use case | Best option | Boilerplate |
|---|---|---|
| Apply everywhere (most teams) | Global autouse (`sleepfake_autouse = true` or `--sleepfake`) | Lowest |
| Per-test explicit control | `sleepfake` fixture | Low |
| Decoration-style usage | `@pytest.mark.sleepfake` | Low |
| Non-pytest scripts / direct control | `SleepFake` context manager | Medium |
## Context manager usage
```python
import time
import asyncio
from sleepfake import SleepFake

# Sync
with SleepFake():
    start = time.time()
    time.sleep(10)  # returns instantly
    assert time.time() - start >= 10

# Async: use async with for proper cleanup of the background processor
async def test_async():
    async with SleepFake():
        start = asyncio.get_running_loop().time()
        await asyncio.sleep(5)  # returns instantly
        assert asyncio.get_running_loop().time() - start >= 5
```

Customize freezegun ignores via `ignore`:
```python
from sleepfake import SleepFake

# `_pytest.timing` is always ignored by default to keep pytest durations sane.
# Add your own modules as needed.
with SleepFake(ignore=["my_project.telemetry"]):
    ...
```

## Fixture usage (`sleepfake`)
Install once; the `sleepfake` fixture is available automatically in tests.
```python
import time

def test_retry_logic(sleepfake):
    start = time.time()
    time.sleep(30)  # instantly skipped
    assert time.time() - start >= 30
```

```python
import asyncio

async def test_polling(sleepfake):
    start = asyncio.get_running_loop().time()
    await asyncio.gather(
        asyncio.sleep(1),
        asyncio.sleep(5),
        asyncio.sleep(3),
    )
    # All three complete instantly; frozen clock sits at +5 s
    assert asyncio.get_running_loop().time() - start >= 5
```

**Deprecated:** `asleepfake` is deprecated. Use `sleepfake` for both sync and async tests.
## Marker usage (`@pytest.mark.sleepfake`)
```python
import time
import asyncio
import pytest

@pytest.mark.sleepfake
def test_marked_sync():
    start = time.time()
    time.sleep(100)
    assert time.time() - start >= 100

@pytest.mark.sleepfake
async def test_marked_async():
    start = asyncio.get_running_loop().time()
    await asyncio.sleep(100)
    assert asyncio.get_running_loop().time() - start >= 100
```

If a test already requests the `sleepfake` fixture, this marker becomes a no-op (no double patching).
## Global autouse: all options and opt-out
```toml
# pyproject.toml
[tool.pytest.ini_options]
sleepfake_autouse = true
sleepfake_ignore = ["my_project.telemetry", "my_project.metrics"]
```

```ini
# pytest.ini
[pytest]
sleepfake_autouse = true
sleepfake_ignore =
    my_project.telemetry
    my_project.metrics
```

```shell
pytest --sleepfake
pytest --sleepfake --sleepfake-ignore my_project.telemetry --sleepfake-ignore my_project.metrics
```

```python
import time
import pytest

# This test runs with SleepFake (autouse applies).
def test_patched():
    start = time.time()
    time.sleep(100)
    assert time.time() - start >= 100

# This test uses real time; SleepFake is NOT applied.
@pytest.mark.no_sleepfake
def test_needs_real_time():
    start = time.time()
    time.sleep(0.01)
    assert time.time() - start < 5
```

`@pytest.mark.no_sleepfake` only disables the autouse layer. If your test explicitly requests the `sleepfake` fixture, it still patches.
```python
# conftest.py
import pytest

@pytest.fixture(autouse=True)
def _sleepfake_sync(sleepfake):
    """Auto-apply SleepFake for every test (sync and async)."""

@pytest.fixture(autouse=True)
async def _sleepfake_async(sleepfake):
    """Async counterpart: shares the same sleepfake instance; no double-patch."""
```

Both fixtures share one `sleepfake` instance.
## Configure ignores in `conftest.py`
If you need project- or directory-specific ignore rules without touching `pyproject.toml`:

```python
# conftest.py
pytest_sleepfake_ignore = ["my_project.telemetry", "my_project.metrics"]
```

This is used by:

- the `sleepfake` fixture
- `@pytest.mark.sleepfake`
- global autouse mode (`sleepfake_autouse = true` or `--sleepfake`)
## `asyncio.timeout` integration
The frozen clock advances before each sleep future resolves, so `asyncio.timeout` still fires correctly:
```python
import asyncio
import pytest
from sleepfake import SleepFake

async def test_timeout_fires():
    with SleepFake():
        with pytest.raises(TimeoutError):
            async with asyncio.timeout(2):
                await asyncio.sleep(10)  # clock jumps to +10 s; timeout at +2 s fires
```

## Options reference (API, CLI, config)
| Where | Option | Example | Purpose |
|---|---|---|---|
| Python API (`SleepFake`) | `ignore: list[str] \| None` | `SleepFake(ignore=["my.module"])` | Add module prefixes freezegun should ignore while freezing time. |
| Pytest CLI | `--sleepfake` | `pytest --sleepfake` | Enable SleepFake for every test in the session. |
| Pytest CLI | `--sleepfake-ignore MODULE` | `pytest --sleepfake-ignore my.module` | Add a module prefix to ignore (repeatable; merged with `sleepfake_ignore`). |
| Pytest config (`pytest.ini` / `pyproject.toml`) | `sleepfake_autouse = true` | `[tool.pytest.ini_options]`<br>`sleepfake_autouse = true` | Same as `--sleepfake`, but persisted in config. |
| Pytest config (`pytest.ini` / `pyproject.toml`) | `sleepfake_ignore` | `sleepfake_ignore = ["my.module"]` | Add module prefixes to ignore for all pytest-managed SleepFake usage. |
| `conftest.py` | `pytest_sleepfake_ignore` | `pytest_sleepfake_ignore = ["my.module"]` | Override ignore prefixes for a test subtree (directory-scoped). |
| Pytest marker | `@pytest.mark.no_sleepfake` | `@pytest.mark.no_sleepfake` | Opt a single test out of global autouse patching. Has no effect if the test explicitly requests the `sleepfake` fixture. |
Notes:

- Every ignore list is merged with `DEFAULT_IGNORE = ["_pytest.timing"]`. This keeps pytest duration measurement on real clocks and prevents epoch-scale `--durations` output.
- User-provided ignore values are appended after `DEFAULT_IGNORE` and deduplicated.
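That merge behavior can be sketched in plain Python (a hypothetical illustration only; `merge_ignores` is an invented name, not SleepFake's actual code):

```python
# Hypothetical sketch: defaults first, then user values appended in order,
# with duplicates dropped. Not SleepFake's real implementation.
DEFAULT_IGNORE = ["_pytest.timing"]

def merge_ignores(user_ignores=None):
    merged = list(DEFAULT_IGNORE)
    for module in user_ignores or []:
        if module not in merged:
            merged.append(module)
    return merged

print(merge_ignores(["my.module", "_pytest.timing"]))  # ['_pytest.timing', 'my.module']
```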
SleepFake patches `time.sleep` and `asyncio.sleep` at two levels:

- The source module (`time.sleep` / `asyncio.sleep`), via `unittest.mock.patch`.
- Module-level aliases in `sys.modules`: any attribute that points to the original `time.sleep` or `asyncio.sleep` at context entry is patched too. This covers the common `from time import sleep` pattern at the top of a module.
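The alias scan can be sketched like this (a simplified illustration; `find_sleep_aliases` is an invented name, not SleepFake's API):

```python
import sys
import time
from types import ModuleType

def find_sleep_aliases(original=time.sleep):
    """Find (module, attribute) pairs in sys.modules bound to the original time.sleep."""
    hits = set()
    for mod_name, mod in list(sys.modules.items()):
        if not isinstance(mod, ModuleType):
            continue  # sys.modules can contain stub or None entries
        for attr_name, value in list(vars(mod).items()):
            if value is original:
                hits.add((mod_name, attr_name))
    return hits

# The `time` module itself always matches; a `from time import sleep`
# alias in any imported module would show up here as well.
```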
The one case that cannot be covered is a local variable binding created inside a function body before the context is entered:

```python
def hard_to_patch():
    _sleep = time.sleep  # local variable: not visible in sys.modules
    with SleepFake():
        _sleep(10)  # ⚠️ calls the real time.sleep; cannot be intercepted
```

| Aspect | Detail |
|---|---|
| Sync sleep | `frozen_factory.tick(delta)` advances frozen time immediately |
| Async sleep | `(deadline, seq, future)` goes into an `asyncio.PriorityQueue`; a background task resolves futures in deadline order |
| Broad patching | On context entry, `sys.modules` is scanned for module-level aliases of the originals (e.g. `from time import sleep`); all matched attributes are replaced and restored on exit |
| Timeout safety | After advancing time, the processor yields one event-loop turn so timeout callbacks can fire before futures resolve |
| Cancellation | Cancelled futures are skipped; the processor keeps running |
| pytest durations | `freeze_time(..., ignore=["_pytest.timing", ...])` avoids breaking pytest's internal wall-clock timing |
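The deadline ordering in the async path can be illustrated with a plain heap (a simplified stand-in for the `asyncio.PriorityQueue`; strings replace the real futures):

```python
import heapq
from itertools import count

# (deadline, seq, future) tuples: the seq tiebreaker means equal deadlines
# never fall through to comparing the third element, which in the real
# queue is an asyncio.Future and not orderable.
seq = count()
queue = []
for delay in (10.0, 2.0, 5.0):
    heapq.heappush(queue, (delay, next(seq), f"future sleeping {delay}s"))

# The background task would resolve entries in deadline order:
deadlines = [heapq.heappop(queue)[0] for _ in range(len(queue))]
print(deadlines)  # [2.0, 5.0, 10.0]
```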
PRs and issues welcome! Here's how to get started:

```shell
# Install dependencies and run the test suite
uv run pytest --force-sugar -vvv

# Lint (ruff + mypy) then test
make test-all

# Run against all supported Python versions (3.10–3.15)
make test-all-python
```

Please run `make test-all` before submitting a PR.
Note: SleepFake uses freezegun under the hood.
