All notable changes to the durable-workflow Python SDK are documented here.
The format is based on Keep a Changelog,
and this project adheres to Semantic Versioning.
- `Client.get_schedule_history(schedule_id, *, limit=None, after_sequence=None)` returns one `ScheduleHistoryPage` of the schedule's audit stream, and `Client.iter_schedule_history(...)` is an async iterator that walks every remaining `ScheduleHistoryEvent` with paging hidden. `ScheduleHandle` exposes the same surface as `handle.history(...)` and `handle.iter_history(...)`. History remains available for deleted schedules, so post-mortem review still works after a schedule is removed.
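The paging contract above can be sketched with stand-in types; everything below (the event/page fields and the fake backend) is illustrative, not the SDK's actual class layout, but it shows how an async iterator can hide `after_sequence` paging:

```python
import asyncio
from dataclasses import dataclass
from typing import Optional

# Hypothetical stand-ins for the SDK's ScheduleHistoryEvent / ScheduleHistoryPage.
@dataclass
class ScheduleHistoryEvent:
    sequence: int
    action: str

@dataclass
class ScheduleHistoryPage:
    events: list
    next_after_sequence: Optional[int]  # None when the stream is exhausted

async def iter_pages(fetch_page, after_sequence=None):
    """Yield every remaining event, requesting one page at a time."""
    while True:
        page = await fetch_page(after_sequence=after_sequence)
        for event in page.events:
            yield event
        if page.next_after_sequence is None:
            return
        after_sequence = page.next_after_sequence

# Fake two-page backend standing in for the server call.
async def fake_fetch(after_sequence=None):
    if after_sequence is None:
        return ScheduleHistoryPage(
            [ScheduleHistoryEvent(1, "create"), ScheduleHistoryEvent(2, "pause")], 2)
    return ScheduleHistoryPage([ScheduleHistoryEvent(3, "delete")], None)

async def main():
    return [e.action async for e in iter_pages(fake_fetch)]

print(asyncio.run(main()))  # ['create', 'pause', 'delete']
```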
- `WorkflowEnvironment` now drives `continue_as_new` chains end-to-end. Each link's input, workflow type, history, and terminal command are exposed through the `runs`/`run_count` properties, signals can be queued for a specific link via `env.signal(..., run=N)`, and chains that switch workflow types use the new `env.register_workflow(cls)` registration. Chain length is bounded by `continue_as_new_limit` (default `50`); exceeding the limit raises `RuntimeError` so tests catch runaway continuations.
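A minimal sketch of the bounded-chain behavior, with a simplified stand-in for the harness (the `ContinueAsNew` marker and `drive_chain` helper here are illustrative, not the SDK's API):

```python
class ContinueAsNew(Exception):
    """Illustrative marker: the workflow wants to restart as a new run."""
    def __init__(self, next_input):
        self.next_input = next_input

def drive_chain(workflow_fn, first_input, continue_as_new_limit=50):
    """Run each link of a continue_as_new chain, bounded by the limit."""
    runs = []
    current = first_input
    while True:
        if len(runs) >= continue_as_new_limit:
            raise RuntimeError(
                f"continue_as_new chain exceeded limit {continue_as_new_limit}")
        try:
            result = workflow_fn(current)
        except ContinueAsNew as c:
            runs.append(("continued", current))
            current = c.next_input
            continue
        runs.append(("completed", current))
        return runs, result

# A toy workflow that counts down one link at a time.
def countdown(n):
    if n > 0:
        raise ContinueAsNew(n - 1)
    return "done"

runs, result = drive_chain(countdown, 3)
print(len(runs), result)  # 4 done
```

A runaway chain (`countdown(10)` with `continue_as_new_limit=5`) raises `RuntimeError` after the fifth link, which is the test-friendly failure mode the changelog entry describes.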
- `Client.set_namespace_external_storage` (and its sync facade) now takes the namespace as `name`, matching `describe_namespace`, `create_namespace`, and `update_namespace`. The 0.4.0 spelling `namespace=` is still accepted as a deprecated keyword alias that emits a `DeprecationWarning`; it will be removed in a future release. Passing both `name` and `namespace` raises `TypeError`.
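The deprecated-alias pattern described above looks roughly like this; the function body is a sketch (the real method takes more parameters), showing only the `name`/`namespace` handling:

```python
import warnings

def set_namespace_external_storage(name=None, *, namespace=None, **config):
    # Reject ambiguous calls that supply both spellings.
    if name is not None and namespace is not None:
        raise TypeError("pass either name= or the deprecated namespace=, not both")
    # Accept the 0.4.0 spelling, but warn so callers can migrate.
    if namespace is not None:
        warnings.warn(
            "namespace= is deprecated; use name= instead",
            DeprecationWarning, stacklevel=2)
        name = namespace
    if name is None:
        raise TypeError("name is required")
    return {"name": name, **config}
```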
- Workflow control-plane parity across the async and sync clients for list, describe, cancel, terminate, history, history export, and run visibility, plus a public history replayer and released golden replay fixtures.
- Task-queue, worker, and namespace control-plane coverage for build-id rollout visibility, drain/resume mutation, worker build-id reporting, namespace controls, activity-task operations, schedule visibility/mutation, and search attribute management.
- External payload storage support for reference envelopes, object-store drivers, expiry metadata, retention/delete helpers, storage policy parity, and verified-byte caching.
- Bridge webhook client support, invocable activity carrier support, replay-safe UUIDv7 and patch-marker helpers, worker interceptors, payload codec batching, and explicit Avro payload adapters.
- PyPI/TestPyPI publish builds now run the installed-package smoke test before uploading artifacts, so release candidates verify that the wheel and source distribution import from site-packages and replay the README quickstart.
- Polyglot parity coverage now spans CLI/Python shared control-plane fixtures, including workflow maintenance, task queues, storage drivers, and system maintenance endpoints, reducing drift between released SDK behavior and other Durable Workflow surfaces.
- Breaking (pre-1.0): `WorkflowCancelled` and `ActivityCancelled` now inherit from `BaseException` (not `DurableWorkflowError`/`Exception`), so a generic `except Exception:` block in activity code or result handlers no longer silently swallows cancellation. Callers that relied on catching cancellation via `except Exception:` or `except DurableWorkflowError:` must now either catch the class by name (e.g. `except (ActivityCancelled, WorkflowCancelled):`) or catch `BaseException`. Mirrors the standard-library precedent set by `asyncio.CancelledError` and `KeyboardInterrupt`.
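A minimal demonstration of the behavior change, using a local stand-in class named like the SDK's `ActivityCancelled` (the real class lives in the SDK, not here):

```python
class ActivityCancelled(BaseException):  # stand-in for the SDK class
    pass

def call_with_blanket_handler():
    try:
        raise ActivityCancelled("cancelled by server")
    except Exception:
        return "swallowed"  # never reached: BaseException escapes this clause

def call_with_named_handler():
    try:
        raise ActivityCancelled("cancelled by server")
    except ActivityCancelled:
        return "observed cancellation"

try:
    call_with_blanket_handler()
except ActivityCancelled:
    print("blanket except Exception: did not catch the cancellation")
print(call_with_named_handler())  # observed cancellation
```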
- Plane-scoped SDK bearer tokens: `Client(..., control_token=..., worker_token=...)` and the sync wrapper now support least-privilege server deployments where operator/admin credentials are separate from worker credentials. The existing `token=` argument remains the shared fallback.
- `Worker.run_until(workflow_id=..., timeout=...)` for examples, smoke tests, and single-workflow scripts that need to run a worker until one workflow reaches a terminal state.
- A Docker Compose order-processing example under `examples/order_processing` that starts a local server and runs a multi-activity Python workflow end-to-end.
- `ctx.wait_condition(...)` durable primitive with replayer support, for workflows that pause until a signal- or update-driven predicate holds.
- `@workflow.signal`, `@workflow.query`, and `@workflow.update` decorators with in-workflow dispatch: signals apply during replay, queries execute against a replayed workflow instance, and updates run on a worker with acceptance + application recorded in history.
- `ctx.sleep(seconds)` sugar over `StartTimer` for readability.
- In-process `WorkflowEnvironment` testing harness that boots a worker and client against a fake server for unit-style tests without Docker.
- Activity retry policy support: `ActivityRetryPolicy(...)` on `ctx.schedule_activity(...)` serializes retry bounds onto the server-side command.
- SDK metrics hooks (`MetricsRecorder`/`PrometheusMetricsRecorder`) for worker-side operational telemetry.
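As one example of the items above, retry-bound serialization can be sketched as a dataclass that flattens its fields onto a command dict. The field and key names here are illustrative assumptions, not the SDK's actual wire contract:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ActivityRetryPolicy:
    # Hypothetical field names; the real SDK's policy may differ.
    initial_interval_seconds: float = 1.0
    backoff_coefficient: float = 2.0
    max_interval_seconds: Optional[float] = None
    max_attempts: Optional[int] = None

    def to_command_fields(self):
        """Serialize only the bounds that were actually set."""
        fields = {
            "retry_initial_interval": self.initial_interval_seconds,
            "retry_backoff_coefficient": self.backoff_coefficient,
        }
        if self.max_interval_seconds is not None:
            fields["retry_max_interval"] = self.max_interval_seconds
        if self.max_attempts is not None:
            fields["retry_max_attempts"] = self.max_attempts
        return fields

command = {"type": "ScheduleActivity", "activity": "charge_card"}
command.update(ActivityRetryPolicy(max_attempts=5).to_command_fields())
```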
- Worker compatibility checks now use `/api/cluster/info` protocol manifests as the authority instead of the top-level server app version. SDK 0.3.x requires `control_plane.version: "2"`, `control_plane.request_contract` schema `durable-workflow.v2.control-plane-request.contract` version `1`, and `worker_protocol.version: "1.0"`. Missing, unknown, or undiscoverable compatibility states fail closed.
- `Client.get_result()` now decodes `WorkflowCompleted` output with the event or workflow payload codec instead of assuming JSON.
- History-event decoding in `client.py` and `workflow.py` now requires the server's canonical PascalCase `event_type` values (`WorkflowCompleted`, `ActivityCompleted`, `TimerFired`, etc.). The prior snake_case fallback and the `output`-or-`result` key fallback on `WorkflowCompleted` have been removed; unknown event-type shapes are ignored instead of silently tolerated. (#432)
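A fail-closed manifest check like the one described above can be sketched as follows. The manifest shape is taken from the requirements listed in the entry; the helper name and `REQUIRED` constant are illustrative:

```python
REQUIRED = {
    "control_plane_version": "2",
    "request_contract_schema": "durable-workflow.v2.control-plane-request.contract",
    "request_contract_version": 1,
    "worker_protocol_version": "1.0",
}

def is_compatible(manifest):
    """Return True only when every required field is present and matches.

    Missing, malformed, or unknown manifests return False (fail closed).
    """
    try:
        cp = manifest["control_plane"]
        contract = cp["request_contract"]
        return (
            cp["version"] == REQUIRED["control_plane_version"]
            and contract["schema"] == REQUIRED["request_contract_schema"]
            and contract["version"] == REQUIRED["request_contract_version"]
            and manifest["worker_protocol"]["version"]
                == REQUIRED["worker_protocol_version"]
        )
    except (KeyError, TypeError):
        return False  # anything undiscoverable fails closed
```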
- Runtime server version compatibility check at worker registration. On `Worker.run()`, the SDK now calls `/api/cluster/info` and refuses to register against a server whose major version falls outside the set the SDK knows how to talk to. This prevents a 0.2.x worker from silently attempting to drive a future breaking-release server. (#302)
- `Client.get_cluster_info()` fetches the server version and declared capability manifest from `/api/cluster/info`.
- Avro payload codec support as a core runtime dependency. `serializer.encode()`, `serializer.decode()`, and `serializer.envelope()` now accept a `codec=` argument, and `decode_envelope()` honors the inner codec tag. The Worker decodes Avro-coded activity arguments and echoes the inbound codec on its `complete_activity_task` result. The wire format is the Durable Workflow generic-wrapper (base64 of `0x00` + Avro binary of a `{json: string, version: int}` record), byte-compatible with the PHP `Workflow\Serializers\Avro` serializer. (#362)
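The generic-wrapper byte layout can be built with nothing but the stdlib, since Avro's binary encoding of a `{json: string, version: int}` record is a zigzag-varint length, the UTF-8 string bytes, and a zigzag-varint int. This is a sketch assuming the fields are written in that order; the function names are illustrative and it has not been checked against the PHP serializer:

```python
import base64
import json

def _zigzag_varint(n):
    """Avro's zigzag + variable-length encoding for int/long (n >= 0 here)."""
    n = (n << 1) ^ (n >> 63)
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def _read_varint(buf, pos):
    """Decode one zigzag varint from buf at pos; return (value, new_pos)."""
    shift = result = 0
    while True:
        b = buf[pos]
        pos += 1
        result |= (b & 0x7F) << shift
        if not (b & 0x80):
            break
        shift += 7
    return (result >> 1) ^ -(result & 1), pos

def encode_generic_wrapper(payload, version=1):
    """base64 of 0x00 + Avro binary of a {json: string, version: int} record."""
    body = json.dumps(payload).encode("utf-8")
    avro = _zigzag_varint(len(body)) + body + _zigzag_varint(version)
    return base64.b64encode(b"\x00" + avro).decode("ascii")

def decode_generic_wrapper(encoded):
    raw = base64.b64decode(encoded)
    if raw[:1] != b"\x00":
        raise ValueError("missing generic-wrapper marker byte")
    length, pos = _read_varint(raw, 1)
    body = raw[pos:pos + length]
    version, _ = _read_varint(raw, pos + length)
    return json.loads(body.decode("utf-8")), version
```

Round-tripping a payload through `encode_generic_wrapper` and `decode_generic_wrapper` recovers the original object and version tag, which is the property the codec relies on.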
- Avro is now the default codec for new payloads produced by the client, serializer helpers, schedules, workflow commands, and activity results. JSON payloads remain supported for compatibility with existing history.
- Replayed activity results now decode using the event payload codec.
Initial PyPI release. HTTP+JSON worker and client for the Durable Workflow server, covering workflow authoring, activity execution, signal and update commands, and the worker protocol over long-poll HTTP.