diff --git a/.cursor/plans/consolidate-commerce-catalog-phases-plan_2f7429a3.plan.md b/.cursor/plans/consolidate-commerce-catalog-phases-plan_2f7429a3.plan.md new file mode 100644 index 000000000..a82d89d2c --- /dev/null +++ b/.cursor/plans/consolidate-commerce-catalog-phases-plan_2f7429a3.plan.md @@ -0,0 +1,427 @@ +--- +name: consolidate-commerce-catalog-phases-plan +overview: Implement the remaining EmDash commerce catalog v1 phases from `emdash-commerce-product-catalog-v1-spec-updated.md` using small, additive changes and no runtime money-path modifications. Keep kernel closed for checkout/webhook/finalize behavior and add catalog capabilities in phase order. +todos: + - id: phase-1-foundation-hardening + content: "Implement catalog foundation completion: product/SKU update + lifecycle routes, validations, and tests." + status: completed + - id: phase-2-media-assets + content: Introduce provider-neutral asset records and asset-link rows for product/SKU images with route and tests. + status: completed + - id: phase-3-variable-model + content: Implement product attributes, allowed values, sku option map rows, and duplicate-combination enforcement. + status: completed + - id: phase-4-digital-entitlements + content: Implement digital_assets + digital_entitlements storage, schemas, handlers, and retrieval hooks. + status: completed + - id: phase-5-bundles + content: Add bundle component model, discount computation, and derived availability semantics with tests. + status: completed + - id: phase-6-catalog-org + content: Add categories/tags + link tables and catalog list/detail retrieval filters. + status: completed + - id: phase-7-order-snapshots + content: Add order line snapshot payloads at checkout-time and enforce snapshot-based historical correctness. + status: completed +isProject: false +--- + +# Consolidated Execution Plan + +## Scope and constraints + +- Target module: `packages/plugins/commerce`. 
+- Preserve Stage-1 scope lock: no payment provider routing changes, no MCP write surfaces, no changes to checkout webhook finalize semantics. +- Follow the phased order in `emdash-commerce-product-catalog-v1-spec-updated.md`: + - [Phase 1 Foundation](./emdash-commerce-product-catalog-v1-spec-updated.md#phase-1--foundation-schema-and-invariants) + - [Phase 2 Media/assets](./emdash-commerce-product-catalog-v1-spec-updated.md#phase-2--mediaassets-abstraction) + - [Phase 3 Variable product model](./emdash-commerce-product-catalog-v1-spec-updated.md#phase-3--variable-product-model) + - [Phase 4 Digital entitlement model](./emdash-commerce-product-catalog-v1-spec-updated.md#phase-4--digital-entitlement-model) + - [Phase 5 Bundle model](./emdash-commerce-product-catalog-v1-spec-updated.md#phase-5--bundle-model) + - [Phase 6 Catalog organization/retrieval](./emdash-commerce-product-catalog-v1-spec-updated.md#phase-6--catalog-organization-and-retrieval) + - [Phase 7 Order snapshot integration](./emdash-commerce-product-catalog-v1-spec-updated.md#phase-7--order-snapshot-integration) +- Keep edits additive and type-safe; route-level contract remains in [`packages/plugins/commerce/src/index.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/index.ts). +- Preserve strict handler layering: catalog and checkout handlers must not invoke each other directly. +- Add explicit immutable field rules and response-shape contracts before entity expansion work. +- Feature flags should be used only where rollout risk or frontend surface maturity requires it. 
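The immutable-field and merge-on-write constraints above can be sketched as a small guard helper. This is a minimal sketch with hypothetical names (`applyProductUpdate`, `IMMUTABLE_FIELDS`) — the plugin's actual helper shape (e.g. a `sanitizeMutableUpdates`-style function) may differ:

```typescript
// Hypothetical sketch of the "immutable fields" update rule:
// merge the incoming patch over the stored record, then force the
// immutable fields back to their stored values so a buggy or
// malicious patch can never override them.

type StoredProduct = {
  id: string;
  type: string;
  createdAt: string;
  updatedAt: string;
  title: string;
};

const IMMUTABLE_FIELDS = ["id", "type", "createdAt"] as const;

function applyProductUpdate(
  existing: StoredProduct,
  patch: Partial<StoredProduct>,
  nowIso: string,
): StoredProduct {
  const merged: StoredProduct = { ...existing, ...patch, updatedAt: nowIso };
  // Re-assert immutable fields after the merge.
  for (const field of IMMUTABLE_FIELDS) {
    (merged as Record<string, string>)[field] = existing[field];
  }
  return merged;
}
```

The same guard doubles as the validation point for rejecting (rather than silently reverting) forbidden fields, if the route contract prefers explicit errors.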
+ +## High-level architecture flow + +```mermaid +flowchart LR +CatalogHandlers["handlers/catalog.ts"] +CheckoutHandlers["handlers/checkout.ts"] +CatalogService["catalog service"] +CatalogHelpers["catalog domain helpers"] +Storage["storage.ts"] +Domain["types.ts"] +Kernel["orchestration/finalize-payment.ts"] + +CatalogHandlers --> CatalogService +CheckoutHandlers --> CatalogHelpers +CatalogHandlers --> CatalogHelpers +CatalogService --> Storage +CatalogHelpers --> Storage +CheckoutHandlers --> Storage +CheckoutHandlers --> Kernel +``` + +## Canonical catalog-domain contracts (before implementation) + +- Immutable updates: + - `Product` immutable fields: `id`, `type`, `createdAt`, `productCode` (if present), and lifecycle governance of `status` if you introduce hard publication rules. + - `SKU` immutable fields: `id`, `productId`, `createdAt`, and any immutable identity fields in the product type payload. + - Merge-on-write is allowed only after validating incoming fields against the immutable set. +- Asset workflow: + - Asset metadata registration (`catalog asset create/register`) is separated from binary upload transport. + - Upload transport remains in media/asset infrastructure; catalog owns only asset record+link lifecycle. +- Variant product invariants: + - For variable products, each SKU option map must include exactly one value for each variant-defining attribute. + - No missing values, no extra values, and no duplicate values per attribute on a single SKU. + - Duplicate variant signatures are rejected. +- Bundle pricing ownership: + - v1 bundle discount config is explicitly stored on the bundle product record, not on dedicated pricing records or component rows. +- Snapshot boundary: + - Snapshot builders live in `src/lib` and are consumed by checkout; checkout handlers do not contain core catalog logic. 
+- Response shape contract: + - Define DTOs once in `src/lib/catalog-dto.ts`: + - product detail DTO + - catalog listing DTO + - admin product DTO + - bundle summary DTO + - variant matrix DTO + +## Data migration and backfill approach + +- Any new collection/table addition requires `storage` + `database` registration and migration notes for rollback and replay. +- For existing rows, define defaults during migration (status, visibility, bundle pricing defaults, snapshot fields). +- Add backfill tasks where historical rows are impacted: + - Add new fields with nullable defaults in v1, then migrate critical fields in a safe pass. + - For legacy orders without snapshots, render from live catalog when snapshot missing but emit monitoring alerts; prefer hardening `snapshot` as required in phase-7. + +## Feature flags + +- Phase 1–3: no feature flag required (core invariants and foundation). +- Phase 4–6: optional rollout flags if admin UI or search/readers are not yet ready. +- Phase 7: gate snapshot writes behind a deployment flag only if you need a controlled rollout; keep read path backward-tolerant. + +## PLQN approach per phase + +For each phase below, the strategy matrix is explicit and side-by-side comparisons are embedded so we always choose the highest-value implementation before coding. + +### Phase 1 — Foundation hardening (update + lifecycle) + +- **Strategy A (chosen): Minimal additive handlers + schemas** in the existing catalog module. + - Leverages current `StoredProduct`/`StoredProductSku` shapes and route style. Low risk, directly matches phase expectations. +- **Strategy B:** introduce generic catalog command-service first. + - Better abstraction separation, but too much indirection before all later entities exist. +- **Strategy C:** regenerate schema + typed model from metadata. + - Strong long-term consistency, high setup cost and migration risk. +- **Strategy D:** skip update/state endpoints. + - Fails phase-1 exit criteria (`can update it`). 
+ +**Why A wins:** lowest complexity, high YAGNI compliance, enough DRY via helper reuse, scalable for later endpoint growth. + +#### Implement + +1. Extend schemas for updates/state in [`packages/plugins/commerce/src/schemas.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/schemas.ts): + - `productUpdateInputSchema` + - `productSkuUpdateInputSchema` + - `productStateInputSchema` (archive/unarchive) + - `productSkuStateInputSchema` +2. Extend catalog handlers in [`packages/plugins/commerce/src/handlers/catalog.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/handlers/catalog.ts): + - `updateProductHandler` and `updateProductSkuHandler` use immutable-field checks and reject attempts to overwrite forbidden fields. + - `updateProductHandler` + - `updateProductSkuHandler` + - `archiveProductHandler` + - `setSkuStatusHandler` +3. Update route wiring in [`packages/plugins/commerce/src/index.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/index.ts) with new endpoints. +4. Add/extend tests in [`packages/plugins/commerce/src/handlers/catalog.test.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/handlers/catalog.test.ts). +5. Add helper-level tests in `src/lib/catalog-domain.ts` (or existing shared helper module) for immutable-field enforcement and transition guards. + +```ts +// Phase-1 immutable-field merge intent +const nowIso = new Date().toISOString(); +const immutable = { + id: existing.id, + createdAt: existing.createdAt, + type: existing.type, + updatedAt: nowIso, +}; +const input = sanitizeMutableUpdates({ ...existing, ...ctx.input, ...immutable }); +await products.put(existing.id, input); +``` + +### Phase 2 — Media/assets abstraction (upload-first + links) + +- **Strategy A (chosen): Add explicit `product_assets` + `product_asset_links`.** + - Provider-neutral records and link semantics support product and SKU images; aligns with spec and portability. 
+- **Strategy B:** reuse content/assets directly on catalog rows. + - Simple short term, fragile portability and governance. +- **Strategy C:** add full media adapter layer first. + - Over-abstracted for v1. +- **Strategy D:** defer media. + - Violates phase-2 exit criteria and required retrieval shapes. + +**Why A wins:** direct spec alignment, strong DRY boundaries, safe future provider switch. + +#### Implement + +1. Add types in [`packages/plugins/commerce/src/types.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/types.ts): + - `StoredProductAsset` + - `StoredProductAssetLink` +2. Add storage config in [`packages/plugins/commerce/src/storage.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/storage.ts): + - `productAssets` collection + indexes + - `productAssetLinks` collection + unique constraints for primary image role per product +3. Add schemas + handlers: + - `product-assets/register` (asset metadata row from existing media reference) + - `catalog/asset/link` (associate asset row with product/SKU) + - `catalog/asset/unlink` (dissociate without touching upload transport) + - `catalog/asset/reorder` (per-target position) + - files: [`packages/plugins/commerce/src/schemas.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/schemas.ts), [`packages/plugins/commerce/src/handlers/catalog.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/handlers/catalog.ts) +4. Wire routes in [`packages/plugins/commerce/src/index.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/index.ts). +5. Add tests in [`packages/plugins/commerce/src/handlers/catalog.test.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/handlers/catalog.test.ts). +6. Add explicit API contract test verifying catalog routes never trigger binary upload behavior. 
+ +```ts +// Phase-2 invariant (single primary per product) +if (role === 'primary_image' && productId) { + const primary = await productAssetLinks.query({ where: { productId, role: 'primary_image' }, limit: 1 }); + if (primary.items.length > 0) throw ... +} +``` + +### Phase 3 — Variable product model + +- **Strategy A (chosen): Add attribute tables + normalized option-mapping rows.** + - Enforces uniqueness and variant-defining rules deterministically. +- **Strategy B:** embed option JSON blobs per SKU. + - Weak for validation, indexing, and duplicate-combo checks. +- **Strategy C:** generic metadata map approach. + - Flexible but ambiguous, less reliable at compile-time and runtime. +- **Strategy D:** defer to later phase. + - Misses phase exit criteria and increases rewrite risk. + +**Why A wins:** correct constraints with manageable complexity and good long-term query behavior. + +#### Implement + +1. Storage additions: + - `productAttributes`, `productAttributeValues`, `productSkuOptionValues` in [`packages/plugins/commerce/src/storage.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/storage.ts) +2. Types in [`packages/plugins/commerce/src/types.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/types.ts) +3. Validation + handler flow in [`packages/plugins/commerce/src/handlers/catalog.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/handlers/catalog.ts): + - create variable product with attributes/values + - create SKU option map + - reject missing/extra/duplicated option values and duplicate combinations +4. Add schemas in [`packages/plugins/commerce/src/schemas.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/schemas.ts) +5. Add retrieval handler for variant matrix in catalog detail route. +6. 
Add unit tests for invariant checks in a shared helper module: + - exact variant-defining coverage + - no duplicate map rows per SKU+attribute + - no duplicate combinations for same product + +```ts +// Phase-3 deterministic signature +const signature = options.map(o => `${o.attributeId}:${o.attributeValueId}`).sort().join('|'); +if (seen.has(signature)) throw ...duplicate variant combination... +if (options.length !== variantAttributeIds.length) throw ...missing/extra option value... +if (new Set(options.map((o) => o.attributeId)).size !== options.length) throw ...duplicate attribute... +``` + +#### Notes + +- Implemented in this pass with: + - separate attribute/value metadata rows, + - `sku option map` rows for variable SKUs, + - signature-based duplicate combination rejection, + - exact variant-defining coverage checks in shared helper module + handler guardrails. + +### Phase 4 — Digital entitlement model + +- \*\*Strategy A (chosen): Separate `digital_assets` and `digital_entitlements`. + - Keeps media vs entitlement semantics explicit and composable for mixed fulfilment. +- **Strategy B:** coerce file assets into product image roles. + - Leaks concerns and breaks access policy. +- **Strategy C:** entitlement at checkout only. + - Correctness and auditability are weak. +- **Strategy D:** force all mixed products into bundles. + - Conflicts with spec behavior principle. + +**Why A wins:** explicit, portable, and aligns with anti-pattern guidance. + +#### Implement + +1. Add types/storage: + - `digitalAssets`, `digitalEntitlements` in [`packages/plugins/commerce/src/types.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/types.ts) + - corresponding storage collections in [`packages/plugins/commerce/src/storage.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/storage.ts) +2. 
Add handlers: + - `digital-assets/create` + - `digital-entitlements/create` + - `digital-entitlements/remove` + - files: [`packages/plugins/commerce/src/handlers/catalog.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/handlers/catalog.ts) +3. Add schemas in [`packages/plugins/commerce/src/schemas.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/schemas.ts) +4. Expose retrieval in product detail route: include entitlements summary. + +### Phase 5 — Bundle model + +- **Strategy A (chosen): Explicit `bundle_components` and derived pricing/availability.** + - Enforces non-owned bundle inventory and component-based computation. +- **Strategy B:** synthetic discount-only metadata on products. + - Not auditable for composition. +- **Strategy C:** reuse variable-product option model as bundle engine. + - Conflates concepts and weakens validation. +- **Strategy D:** defer bundle support. + - Fails phase exit criteria. + +**Why A wins:** spec-aligned and scalable for mixed component types. + +#### Implement + +1. Add `bundleComponents` in [`packages/plugins/commerce/src/types.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/types.ts). +2. Store v1 bundle discount config on `StoredProduct` as: + - `bundleDiscountType` + - `bundleDiscountValueMinor` (fixed) + - `bundleDiscountValueBps` (percentage) +3. Add storage collections in [`packages/plugins/commerce/src/storage.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/storage.ts) +4. Add schema + handlers: + - `bundle-components/add` + - `bundle-components/remove` + - `bundle-components/reorder` + - `bundle/compute` +5. Add utility in [`packages/plugins/commerce/src/lib`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/lib) or new helper file for deterministic discount and availability. +6. Add integration tests (price/availability, invalid component refs, recursive prevention where possible via validation). 
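The availability half of the deterministic helper in step 5 can be sketched as follows. A minimal sketch with hypothetical field names (`stockOnHand` is assumed; `qty` matches the compute snippet below): since a bundle owns no stock of its own, the sellable bundle count is bounded by the scarcest component.

```typescript
// Hypothetical sketch: bundle availability derived from component stock.
// availability = min over components of floor(stockOnHand / qty).

type BundleComponent = {
  skuId: string;
  qty: number;          // units of this component consumed per bundle
  stockOnHand: number;  // current stock of the component SKU (assumed field)
};

function deriveBundleAvailability(components: BundleComponent[]): number {
  if (components.length === 0) return 0; // an empty bundle is not sellable
  return components.reduce(
    (min, c) => Math.min(min, Math.floor(c.stockOnHand / c.qty)),
    Number.MAX_SAFE_INTEGER,
  );
}
```

This keeps the "no independent bundle stock" acceptance criterion mechanical: any component hitting zero stock drives derived availability to zero.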
+ +#### Execution status (current) + +Completed in this implementation pass with: + +- `bundleComponents` collection and indexes added in storage/types. +- bundle discount fields stored on `StoredProduct`. +- `bundle-components/*` and `bundle/compute` routes exposed in `index.ts`. +- deterministic bundle compute helper added in `src/lib/catalog-bundles.ts`. +- handler-level tests in `handlers/catalog.test.ts` covering add/reorder/remove/compute and invalid composition. + +```ts +const derived = components.reduce((sum, c) => sum + c.priceMinor * c.qty, 0); +const discountMinor = + discountType === "percentage" + ? Math.floor((derived * (discountBps ?? 0)) / 10_000) + : Math.max(0, fixedAmount ?? 0); +const finalMinor = Math.max(0, derived - discountMinor); +``` + +### Phase 6 — Catalog organization and retrieval + +- **Strategy A (chosen): Explicit category/tag entities + links + filterable retrieval.** + - Enables storefront/admin filtering without custom brittle parsing. +- **Strategy B:** metadata tags in JSON. + - cheap now, costly later for indexing and consistency. +- **Strategy C:** external search-only taxonomy. + - weak for source-of-truth reads and admin operations. +- **Strategy D:** config-coded taxonomies. + - not scalable or editable. + +**Why A wins:** durable retrieval model and direct alignment with retrieval requirements. + +#### Implement + +1. Add collections/types: + - `categories`, `productCategoryLinks`, `productTags`, `productTagLinks` in types/storage files. +2. Add schemas for slug/name + relation operations. +3. Define DTOs before final retrieval implementation (in `src/lib/catalog-dto.ts`): + - `ProductDetailDTO` + - `CatalogListingDTO` + - `ProductAdminDTO` + - `BundleSummaryDTO` + - `VariantMatrixDTO` +4. Add list/detail handlers for: + - catalog listing filters by category/tag/status/visibility + - admin retrieval shape includes lifecycle/inventory summary hints +5. 
Implement response mapping through the shared DTO builders in [`packages/plugins/commerce/src/handlers/catalog.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/handlers/catalog.ts). +6. Route additions in [`packages/plugins/commerce/src/index.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/index.ts). +7. Route-level response-shape validation and filter/list behavior for `categoryId`/`tagId` included in `ProductResponse` and listing handlers. + +#### Execution status (current) + +Completed in this implementation pass with: + +- category/tag entities and link rows added in types/storage. +- category/tag DTO members and catalog request filtering enabled in handlers. +- category/tag routes exposed through `index.ts` with list/create/link/unlink endpoints. + +#### Residual checks before phase closure + +- Ensure all schema-level route contract tests include category/tag indexes/lookup paths. + +### Phase 7 — Order snapshot integration + +- **Strategy A (chosen): Snapshot within order line payload at checkout write time.** + - Immediate immutable history guarantee with minimal storage surface change. +- **Strategy B:** separate order-line snapshot collection. + - Cleaner model but higher complexity and I/O. +- **Strategy C:** keep live references only. + - violates snapshot requirement and historical correctness. +- **Strategy D:** async post-checkout denormalization. + - eventual consistency risk for order history integrity. + +**Why A wins:** reaches required behavior quickly with smallest blast radius. + +#### Execution status (current) + +- Snapshot shape and snapshot line payload now extended in [`packages/plugins/commerce/src/types.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/types.ts). +- Snapshot utility added in [`packages/plugins/commerce/src/lib/catalog-order-snapshots.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/lib/catalog-order-snapshots.ts). 
+- Checkout now enriches and persists snapshots in [`packages/plugins/commerce/src/handlers/checkout.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/handlers/checkout.ts) and stores them in pending state for replay. +- Checkout regression coverage added in [`packages/plugins/commerce/src/handlers/checkout.test.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/handlers/checkout.test.ts). +- Snapshot coverage now includes: + - digital entitlement and image snapshot assertions, + - bundle summary assertions, + - idempotent checkout replay invariance (frozen snapshot retained on repeated replay). + +#### Implement + +1. Expand `OrderLineItem` in [`packages/plugins/commerce/src/types.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/types.ts) with a `snapshot` field. +2. Add snapshot builder utilities in [`packages/plugins/commerce/src/lib/catalog-order-snapshots.ts`](/Users/vidarbrekke/Dev/emDash/packages/plugins/commerce/src/lib/catalog-order-snapshots.ts) and domain helpers used by catalog reads as needed. +3. Update checkout handler (`packages/plugins/commerce/src/handlers/checkout.ts`) to call snapshot helper: + - product/sku titles, sku code, prices, options, image snapshot, entitlement/bundle hints +4. Ensure `checkout` stores snapshot into each `OrderLineItem` before `orders.put(...)`. +5. Add tests around historical integrity in order rendering path: + - update product title/price/sku status after checkout and assert order still renders frozen data. +6. Add immutability tests around snapshot payload: + - snapshot object is not recomputed from live catalog on read + - write path is stable under repeated checkout calls for idempotent carts + +### Dependencies and file touches (planned sequence) + +1. `packages/plugins/commerce/src/storage.ts` (collection contracts, indexes, uniqueness) +2. `packages/plugins/commerce/src/types.ts` (domain model growth) +3. 
`packages/plugins/commerce/src/schemas.ts` (input validation for each endpoint) +4. `packages/plugins/commerce/src/handlers/catalog.ts` (core catalog CRUD + media + variable + digital + bundle + classification) +5. `packages/plugins/commerce/src/index.ts` (route exposure) +6. `packages/plugins/commerce/src/lib/catalog-dto.ts` and `packages/plugins/commerce/src/lib/catalog-order-snapshots.ts` (shared helpers) +7. `packages/plugins/commerce/src/handlers/checkout.ts` (snapshot integration) +8. `packages/plugins/commerce/src/handlers/catalog.test.ts`, `packages/plugins/commerce/src/contracts/storage-index-validation.test.ts`, and any new test files per phase +9. `packages/plugins/commerce/src/services` if a dedicated `catalog-service` abstraction is introduced for shared helper extraction. +10. Docs updates (`HANDOVER.md`, `COMMERCE_EXTENSION_SURFACE.md`, `COMMERCE_DOCS_INDEX.md`) where scope/phase states changed. + +### Acceptance criteria by phase + +- **Phase 1:** create/read/update/get simple product + sku with invalid shape rejection. +- **Phase 2:** upload-link-read path works; primary image uniqueness enforced per product. +- **Phase 3:** variable attributes + option matrix works; each SKU has exactly one option for every variant-defining attribute; missing/extra/duplicate/skewed option values rejected. +- **Phase 4:** one digital SKU can be linked to one-or-more protected digital assets. +- **Phase 5:** fixed bundle pricing + availability derived from components; no independent bundle stock assumptions. +- **Phase 6:** catalog list filters by category/tag; admin can inspect basic states. +- **Phase 7:** historical order lines are snapshot-driven; later catalog edits do not alter rendered history. + +## Test emphasis additions + +- Unit invariants (always): immutable-field guards, variable SKU combination checks, primary image uniqueness, bundle availability formula. 
+- Cross-phase regression (vital): idempotent cart checkout snapshot generation and repeat snapshot payloads. +- Property-style checks (where practical): deterministic option signatures and bundle availability floor behavior. + +### Risks and mitigation + +- **Cross-cutting index discipline:** keep index coverage in `storage.ts` for every new query path (`status`, `productId`, `skuId`, `role`, `categoryId`, `tagId`) to avoid read regressions. +- **Rollback safety:** each phase can be feature-gated and merged independently. +- **Validation coupling:** avoid silent overwrites by using merge-on-write updates and explicit immutable fields where required. +- **Invariant coupling to checkout:** snapshot fields must be treated as immutable once order is created. diff --git a/.github/workflows/bonk.yml b/.github/workflows/bonk.yml index 9323a8d9a..a82a65ad7 100644 --- a/.github/workflows/bonk.yml +++ b/.github/workflows/bonk.yml @@ -1,11 +1,7 @@ name: Bonk on: - issue_comment: - types: [created] - pull_request_review_comment: - types: [created] - + workflow_dispatch: jobs: bonk: if: github.event.sender.type != 'Bot' diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index d832aad7c..b07c409d2 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -1,11 +1,7 @@ name: CI on: - push: - branches: [main] - pull_request: - branches: [main] - + workflow_dispatch: permissions: contents: read diff --git a/.github/workflows/cla.yml b/.github/workflows/cla.yml index 949d31871..68c8c40f5 100644 --- a/.github/workflows/cla.yml +++ b/.github/workflows/cla.yml @@ -1,11 +1,6 @@ name: "CLA Assistant" on: - issue_comment: - types: [created] - pull_request_target: - types: [opened, synchronize] - merge_group: - + workflow_dispatch: permissions: actions: write contents: write diff --git a/.github/workflows/deploy-marketplace.yml b/.github/workflows/deploy-marketplace.yml index 4ea423271..4d710630c 100644 --- a/.github/workflows/deploy-marketplace.yml +++ 
b/.github/workflows/deploy-marketplace.yml @@ -2,12 +2,6 @@ name: Seed Marketplace Plugins on: workflow_dispatch: - push: - branches: [main] - paths: - - "packages/plugins/**" - - ".github/workflows/deploy-marketplace.yml" - permissions: contents: read diff --git a/.github/workflows/format.yml b/.github/workflows/format.yml index 006f842dc..d44b9d97e 100644 --- a/.github/workflows/format.yml +++ b/.github/workflows/format.yml @@ -1,11 +1,7 @@ name: Format on: - push: - branches: [main] - pull_request: - branches: [main] - + workflow_dispatch: permissions: contents: read diff --git a/.github/workflows/preview-releases.yml b/.github/workflows/preview-releases.yml index edef63877..48639a45d 100644 --- a/.github/workflows/preview-releases.yml +++ b/.github/workflows/preview-releases.yml @@ -1,11 +1,7 @@ name: Preview Releases on: - push: - branches: [main] - pull_request: - branches: [main] - + workflow_dispatch: permissions: {} concurrency: diff --git a/.github/workflows/release.yml b/.github/workflows/release.yml index 30264e06e..e8a64c957 100644 --- a/.github/workflows/release.yml +++ b/.github/workflows/release.yml @@ -1,10 +1,7 @@ name: Release on: - push: - branches: - - main - + workflow_dispatch: concurrency: ${{ github.workflow }}-${{ github.ref }} jobs: diff --git a/.gitignore b/.gitignore index 157fa9ecc..0261e3169 100644 --- a/.gitignore +++ b/.gitignore @@ -65,4 +65,10 @@ __screenshots__/ .emdash-bundle-tmp # Downloaded test data (fetched on demand in CI) -examples/wp-theme-unit-test/ \ No newline at end of file +examples/wp-theme-unit-test/ + +# Local WooCommerce source copy (reference only; not part of EmDash) +woocommerce/ + +# Archives (e.g. 
review bundles); keep local only +*.zip \ No newline at end of file diff --git a/3rd-party-checklist.md b/3rd-party-checklist.md new file mode 100644 index 000000000..ebf83f11f --- /dev/null +++ b/3rd-party-checklist.md @@ -0,0 +1,102 @@ +# Third-Party Review Checklist (One-Page) + +> This is the historical Option A hardening checklist. +> For the current external reviewer flow, use: +> +> - `@THIRD_PARTY_REVIEW_PACKAGE.md` +> - `external_review.md` +> - `SHARE_WITH_REVIEWER.md` + +## Scope and review goal + +- Path reviewed: Option A finalize hardening for EmDash Commerce webhooks. +- Primary objective: validate whether the implementation is correct enough for production rollout and identify the smallest safe improvements. +- Owner roles: + - **RE** = Commerce plugin runtime engineer + - **SRE** = platform/storage operator + - **SEC** = security reviewer + - **QA** = QA/automation owner + +## Quick pass/fail criteria + +1. No finalize side effects occur without valid webhook signature. +2. Duplicate webhook deliveries do not create duplicate inventory side effects. +3. Preflight validation failures do not apply partial stock mutations. +4. Deterministic payment-attempt selection is stable across retries. +5. Remaining concurrency risk is explicitly accepted with an owner and follow-up ticket. + +## Issue-level checklist (severity + owner) + +### 1) Webhook signature gate is bypassable by malformed request + +- **Severity**: P1 (Integrity / Fraud) +- **What to verify** + - `Stripe-Signature` is parsed and validated before finalize side effects. + - Missing/invalid/malformed signatures return `WEBHOOK_SIGNATURE_INVALID`. + - `settings:stripeWebhookSecret` must be required in deployment paths that receive webhooks. 
+- **Reviewer outcome** + - `[ ]` Pass / `[ ]` Fail / `[ ]` N/A +- **Ownership**: **SEC** (validation), **RE** (fallback/edge-case handling) +- **Notes** + - Current implementation: implemented in `packages/plugins/commerce/src/handlers/webhooks-stripe.ts`. + +### 2) Replay safety on duplicate webhook events + +- **Severity**: P1 (Data integrity / Inventory) +- **What to verify** + - Duplicate event IDs return replay/error semantics via existing receipt decision path. + - Deterministic movement IDs prevent second write from creating additional ledger rows. + - Duplicate deliveries do not produce negative stock totals. +- **Reviewer outcome** + - `[ ]` Pass / `[ ]` Fail / `[ ]` N/A +- **Ownership**: **RE** (logic), **SRE** (runtime contention observations) + +### 3) Partial mutation risk during preflight failures + +- **Severity**: P1 (Inventory correctness) +- **What to verify** + - Stock validation and normalization occur before stock/ledger writes. + - Preflight failures return conflict/invalid-stock errors and preserve current stock. + - Ledger has no row written when any validation fails. +- **Reviewer outcome** + - `[ ]` Pass / `[ ]` Fail / `[ ]` N/A +- **Ownership**: **RE** + +### 4) Nondeterministic payment-attempt selection + +- **Severity**: P2 (State correctness) +- **What to verify** + - Selection uses deterministic filter/sort (`orderId + providerId + status`, ordered by stable field). + - Tests cover multiple pending attempts and earliest selection. +- **Reviewer outcome** + - `[ ]` Pass / `[ ]` Fail / `[ ]` N/A +- **Ownership**: **RE** + +### 5) Inventory movement index / replay model mismatch + +- **Severity**: P2 (Idempotency) +- **What to verify** + - Unique index definition for movement identity exists in `storage.ts`. + - No migration gap for existing deployments where index is required for full guarantee. 
+- **Reviewer outcome** + - `[ ]` Pass / `[ ]` Fail / `[ ]` N/A +- **Ownership**: **SRE** + **RE** + +### 6) Residual concurrent-race window under perfect simultaneity + +- **Severity**: P2 (Concurrency / Scaling) +- **What to verify** + - Confirm if remaining race window is acceptable for current traffic profile. + - Confirm follow-up plan if stronger guarantees are required (CAS/claim primitive). +- **Reviewer outcome** + - `[ ]` Accept as-is / `[ ]` Requires follow-up / `[ ]` N/A +- **Ownership**: **RE** (design), **SRE** (capacity/risk) + +## Final recommendation block + +- **Recommended rollout readiness**: `[ ] Ready` / `[ ] Hold until fixes` / `[ ] Require follow-up` +- **Owner**: `_____________________` +- **Review comments summary**: + - *** + - *** + - *** diff --git a/3rdparty_share_index_4.md b/3rdparty_share_index_4.md new file mode 100644 index 000000000..2793a28c8 --- /dev/null +++ b/3rdparty_share_index_4.md @@ -0,0 +1,13 @@ +# 3rd Party Share Index (v4) + +## Status + +This index is historical and refers to an earlier review zip layout. + +## Canonical review path + +- `external_review.md` (current canonical review packet) +- `@THIRD_PARTY_REVIEW_PACKAGE.md` (authoritative entrypoint) +- `SHARE_WITH_REVIEWER.md` (single-file handoff instructions) + +Use this file only for artifact history; current review work should follow the canonical packet chain above. diff --git a/3rdpary_review-4.md b/3rdpary_review-4.md new file mode 100644 index 000000000..c75b73047 --- /dev/null +++ b/3rdpary_review-4.md @@ -0,0 +1,168 @@ +# Third-Party Evaluation Brief — Commerce Finalize Hardening (Option A execution) + +> Historical review packet (Option A). Canonical current entrypoint is: +> +> - `@THIRD_PARTY_REVIEW_PACKAGE.md` +> - `external_review.md` +> - `SHARE_WITH_REVIEWER.md` + +## Executive summary + +This review package covers the Option A hardening pass for the EmDash Commerce plugin, focused on webhook-driven payment finalize integrity. 
+The current implementation improves reliability of the `stripe` webhook finalize path by making side effects deterministic, adding signature validation, and making inventory mutation behavior safer under duplicate/malformed flows. + +The guiding constraint is still your original brief: + +- keep changes narrow +- avoid over-engineering +- prioritize correctness over speculative features +- remain review-friendly for external audit before moving to Stage 2 + +## Ecosystem context (what this code lives in) + +- `packages/plugins/commerce` is a plugin package in a pnpm monorepo. +- Runtime writes are performed through EmDash plugin storage abstractions (`ctx.storage` + `StorageCollection`). +- Public plugin routes are defined in `packages/plugins/commerce/src/index.ts`. +- Route handlers are currently thin wrappers that call orchestration modules and throw API errors through existing error contracts. +- Checkout and finalize flows intentionally stay isolated from storefront/catalog concerns and do not couple recommendation/agent read paths. + +## Why this pass was needed + +Three categories of risk were addressed: + +1. **Security/inbound trust** + - Webhook traffic was entering finalize logic without cryptographic proof, creating an integrity risk. +2. **Correctness under duplicates and retries** + - `webhookReceipts` and deterministic identifiers reduce duplicate side effects but pre-existing write patterns could still expose partial mutation windows. +3. **Determinism/state consistency** + - Payment attempt updates could vary based on storage ordering, and partial stock/ledger writes were possible during failures. + +## Files changed in this implementation pass + +### Core logic + +- `packages/plugins/commerce/src/orchestration/finalize-payment.ts` + - Added deterministic inventory preflight + normalization path: + - validate required stock rows and line-item consistency before writes. + - convert intended stock adjustments into deterministic movement plans. 
+ - Added deterministic ledger IDs via `inventoryLedgerEntryId(...)`. + - Added idempotent replay-safe mutation path by skipping already-written movement IDs. + - Kept payment conflict/error mapping deterministic and explicit. + +- `packages/plugins/commerce/src/handlers/webhooks-stripe.ts` + - Added webhook signature verification: + - parses `Stripe-Signature` + - validates timestamp tolerance + - validates HMAC (`whsec` style hex signature) using settings secret + - rejects invalid/missing signature before finalize execution. + - exposes helper exports for focused unit tests. + +### Guardrails / schema tightening + +- `packages/plugins/commerce/src/storage.ts` + - Added unique index for deterministic inventory movement replay safety: + - `inventoryLedger`: `["referenceType","referenceId","productId","variantId"]` + +- `packages/plugins/commerce/src/handlers/checkout.ts` + - Added stronger input checks to reject malformed line items (`quantity`, `inventoryVersion`, `unitPriceMinor`) before order creation. + +### Tests added/updated + +- `packages/plugins/commerce/src/orchestration/finalize-payment.test.ts` + - Added scenarios: + - earliest-pending provider attempt is chosen deterministically + - duplicate SKU merge still yields one ledger movement + - preflight failure leaves stock/ledger unchanged (partial-write prevention) + - In-memory storage mock now supports `orderBy` for deterministic pending-attempt behavior. + +- `packages/plugins/commerce/src/handlers/webhooks-stripe.test.ts` _(new)_ + - Added signature helper unit coverage: + - parse format + - valid v1 signature + - bad secret rejection + - missing timestamp rejection + - stale timestamp rejection + +## Known residual risk (explicit) + +- Storage currently lacks native CAS/conditional writes or transactional locking in the orchestration contract used here. +- In a perfect simultaneous duplicate webhook delivery race, one delivery can still attempt overlapping writes before first-commit visibility. 
+- The current design is replay-bounded and recoverable through receipt ledgering and deterministic IDs, but a true CAS/receipt-lock step remains the next hardening milestone if your volume/profile requires stronger isolation. + +## Third-party evaluator checklist + +### What to validate first + +1. Confirm environment configuration includes `settings:stripeWebhookSecret` in all production and staging runtime paths used by webhook ingestion. +2. Verify raw request body consumption remains compatible with EmDash route pipeline in production workers. +3. Confirm storage guarantees around `query` sorting and unique index enforcement on `inventoryLedger`. + +### What to validate during review + +1. Security + - invalid signatures cannot reach finalize side effects + - malformed / missing signatures fail safely +2. Determinism + - one deterministic attempt is selected across multiple pending attempts + - duplicate SKU merge produces one stock movement row +3. Integrity + - preflight failures produce no stock mutation + - inventory version mismatch and insufficient stock map to stable API errors +4. Idempotency/replay behavior + - duplicate webhook deliveries of same event do not create duplicate stock side effects + +### Suggested production rollout checks + +1. Deploy to staging with production-like concurrency. +2. Send duplicate/simultaneous webhook deliveries and verify: + - one success, one replay or controlled terminal conflict path + - no negative stock from partial writes +3. Monitor for `commerce.finalize.inventory_failed` and `commerce.finalize.token_rejected` logs. + +### Clear review path for a 3rd-party evaluator + +1. **Start with context** + - `3rdpary_review-4.md` (this document) + - `COMMERCE_REVIEW_OPTION_A_PLAN.md` + - `COMMERCE_REVIEW_OPTION_A_EXECUTION_NOTES.md` +2. 
**Inspect runtime contracts** + - `packages/plugins/commerce/src/index.ts` + - `packages/plugins/commerce/src/handlers/webhooks-stripe.ts` + - `packages/plugins/commerce/src/orchestration/finalize-payment.ts` +3. **Inspect constraints and storage model** + - `packages/plugins/commerce/src/storage.ts` +4. **Validate test coverage** + - `packages/plugins/commerce/src/orchestration/finalize-payment.test.ts` + - `packages/plugins/commerce/src/handlers/webhooks-stripe.test.ts` +5. **Validate behavior against this matrix** + - `WEBHOOK_SIGNATURE_INVALID` on bad/missing signatures + - duplicate events produce replay or controlled terminal conflict semantics + - insufficient stock/version mismatch remains non-partial + - deterministic payment attempt selection + - no duplicate movement rows for duplicate SKUs +6. **Finalize decision** + - Confirm residual concurrent-race risk is acceptable for current scale + - Decide whether a stronger CAS/lock path should be phase-2 scope + +## Artifacts this review package is optimized for + +- Implementation plan and status: + - `COMMERCE_REVIEW_OPTION_A_PLAN.md` + - `COMMERCE_REVIEW_OPTION_A_EXECUTION_NOTES.md` + - `3rdpary_review-4.md` (this document) +- Core implementation/test bundle: + - `packages/plugins/commerce/src/orchestration/finalize-payment.ts` + - `packages/plugins/commerce/src/handlers/webhooks-stripe.ts` + - `packages/plugins/commerce/src/storage.ts` + - `packages/plugins/commerce/src/handlers/checkout.ts` + - `packages/plugins/commerce/src/orchestration/finalize-payment.test.ts` + - `packages/plugins/commerce/src/handlers/webhooks-stripe.test.ts` + +## Decision support for 3rd-party suggestions + +The current path intentionally avoids broad redesigns (no middleware/framework migration, no new plugin boundaries, no new schema surface area). +If reviewer confirms current delivery profile needs stronger concurrency guarantees, the recommended follow-up should be: + +1. 
introduce a storage-level claim primitive (or explicit lock emulation) for webhook receipts, then +2. fold claim + mutation into one atomic boundary where backend storage allows it, +3. keep current deterministic IDs as a second line of defense for replay safety. diff --git a/3rdpary_review.md b/3rdpary_review.md new file mode 100644 index 000000000..9832aa642 --- /dev/null +++ b/3rdpary_review.md @@ -0,0 +1,163 @@ +# Third-party technical review — EmDash-native commerce plugin + +> Historical review packet. Superseded by `3rdpary_review_2.md` for the current project state. +> Canonical current review path: +> +> - `@THIRD_PARTY_REVIEW_PACKAGE.md` +> - `external_review.md` +> - `SHARE_WITH_REVIEWER.md` + +**Document purpose:** Give an external developer enough context to judge whether the proposed **e-commerce / cart plugin for [EmDash CMS](https://github.com/emdash-cms/emdash)** is on a sound, optimal path—especially regarding extensibility, platform fit, and operational risk—**before** substantial implementation begins. + +**Status:** Historical snapshot from before `packages/plugins/commerce` was added. Keep this file only for context on how the plan evolved. + +**How to use this file:** Read this overview, then the bundled documents (see **Review bundle** below). Answer the questions in **What we want from you** with concrete suggestions, risks, and alternatives. + +--- + +## 1. Ecosystem: EmDash in one paragraph + +EmDash is an **Astro-based CMS** with a **plugin system** that extends the admin, content pipeline, and HTTP API. Plugins are **TypeScript packages**. They receive a **scoped context** (`ctx`) with: + +- **`ctx.storage`** — document collections with indexes (plugin-scoped structured data). +- **`ctx.kv`** — key-value settings and operational state. +- **`ctx.http.fetch`** — outbound HTTP when the plugin declares **`network:fetch`** and **`allowedHosts`** (enforced when sandboxed). 
+- **`ctx.email.send`**, **`ctx.content`**, **`ctx.media`**, **`ctx.users`**, **`ctx.cron`**, etc., depending on **declared capabilities**. + +Plugins run in two modes: + +- **Trusted (in-process)** — full Node access; capabilities are documentary. +- **Sandboxed (Cloudflare Workers isolate)** — strict capability enforcement, resource limits, **Block Kit** admin UI (declarative JSON), no arbitrary plugin JS in the browser for admin. + +**Native vs standard:** “Standard” plugins favor marketplace distribution and the same code path for trusted + sandboxed. **Native** plugins are the escape hatch for **React admin**, **Portable Text block types**, and **Astro site components** shipped from npm—required for rich merchant UIs and storefront components. The canonical author-facing description of this split is in the bundled **`skills/creating-plugins/SKILL.md`** (mirrors [upstream skill](https://github.com/emdash-cms/emdash/blob/main/skills/creating-plugins/SKILL.md)). + +EmDash also ships **x402**-style payment integration for **content monetization**; that is **orthogonal** to a full cart (see `high-level-plan.md`). + +--- + +## 2. Problem we are solving (why not “just use WooCommerce”?) + +The product owner’s pain is **WooCommerce-style extensibility**: child themes, template overrides, opaque PHP hooks/filters, and stacks of plugins that fight over the same global cart/order hooks. The goal is a **legacy-free** commerce layer that is: + +- **Headless-friendly** — storefront is **Astro**, not PHP templates. +- **Contract-driven** — extensions integrate through **typed boundaries**, not mutable global hooks. +- **EmDash-native** — storage, KV, routes, cron, email, capabilities—not a parallel framework inside the CMS. + +A local **WooCommerce PHP tree** was used only as a **reference** for cart/checkout _ideas_ (session tokens, route decomposition, validation); it is **not** part of the deliverable and is **gitignored** in this repo. + +--- + +## 3. 
Proposed solution (executive summary) + +### 3.1 Core deliverable + +A **first-party commerce plugin** (`@emdash-cms/plugin-commerce` or equivalent) that provides: + +- **Product catalog** — including **simple**, **variable** (many variants), **bundle**, **digital**, and **gift card** shapes via a **discriminated `type` + `typeData`** model (not class inheritance). +- **Cart** — server-side cart, totals, discounts (staged), line items. +- **Checkout & orders** — order lifecycle, payment handoff, webhooks, emails. +- **Admin** — products, orders, settings (React / native plugin trajectory). +- **Storefront** — **Astro components** + optional **Portable Text** blocks (native). + +### 3.2 Extension model (the main architectural bet) + +Instead of WordPress-style filters, **extensions register as providers** in a **registry** stored in plugin storage. The commerce core **calls provider routes over HTTP** (`ctx.http.fetch`) using **narrow, versioned contracts** (payment, shipping, tax, fulfillment). Third-party payment/shipping/tax plugins are **standard** (sandboxable, marketplace-friendly) where possible. + +### 3.3 AI / agents + +Design assumption: **merchants and operators will use LLM agents**. Therefore: + +- Admin and automation surfaces expose **structured JSON** APIs. +- Errors use **stable machine codes** + human copy. +- A future **MCP**-oriented companion plugin is planned to expose tools (list products, adjust inventory, order actions, summaries). + +Details: **`commerce-plugin-architecture.md`** (Sections 10–11). 
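To make the §3.2 registry-plus-contract model concrete, here is a minimal hypothetical sketch. All names (`PaymentProviderV1`, `CreateIntentInput`, `registerProvider`) are invented for illustration and are not the plugin's actual API.

```typescript
// Hypothetical provider contract sketch. The commerce core only ever calls
// this narrow, versioned interface, whether the implementation is an
// in-process adapter or an HTTP shim delegating to another plugin's routes.

interface CreateIntentInput {
  orderId: string;
  amountMinor: number; // integer minor units (e.g. cents), never floats
  currency: string;
}

type CreateIntentResult =
  | { ok: true; externalId: string; redirectUrl?: string }
  | { ok: false; code: "provider_unavailable" | "invalid_request" };

interface PaymentProviderV1 {
  readonly contractVersion: 1;
  readonly providerId: string;
  createIntent(input: CreateIntentInput): Promise<CreateIntentResult>;
}

// Registry of providers keyed by providerId; duplicate IDs are rejected so
// extensions cannot silently shadow each other.
const registry = new Map<string, PaymentProviderV1>();

function registerProvider(provider: PaymentProviderV1): void {
  if (registry.has(provider.providerId)) {
    throw new Error(`provider already registered: ${provider.providerId}`);
  }
  registry.set(provider.providerId, provider);
}
```

The point of the sketch is the shape, not the wiring: the same interface can back an in-process adapter or a `ctx.http.fetch` delegation, which is exactly the "same interface, different wiring" claim the review should pressure-test.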
+ +### 3.4 Locked product decisions (already chosen) + +Recorded in **`commerce-plugin-architecture.md` §15**: + +| Topic | Decision | +| --------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| Payment gateways (v1) | **Stripe** and **Authorize.net**—two real implementations early to stress-test the provider contract. | +| Inventory | **Payment-first; reserve/decrement at finalize** after successful payment. Explicit UX for **inventory changed** between cart and payment. | +| Shipping / tax | **Separate module**. Without it: **no shipping address / quote flows** in core. Multi-currency and localized tax lean toward **that module family**, not duplicated in core v1. | +| Logged-in users | **Purchase history** + **durable cart** across logout/login and devices; anonymous `cartToken` **merge/associate** on login. | + +--- + +## 4. Documents in the review bundle (what to read in order) + +The archive **`lates-code.zip`** at the repository root contains exactly these **nine** paths (read in this order): + +| Order | Path in zip | Role | +| ----- | ----------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------- | +| 1 | `3rdpary_review.md` | Framing and review questions (this file). | +| 2 | `commerce-plugin-architecture.md` | **Authoritative** architecture: data model, routes, phases, Step 1 spec, locked decisions. | +| 3 | `high-level-plan.md` | Earlier, shorter sketch; useful for diffing scope drift; superseded by the architecture doc where they conflict. | +| 4 | `skills/creating-plugins/SKILL.md` | EmDash plugin anatomy, trusted vs sandboxed, capabilities, routes—**platform ground truth** for “are we using EmDash correctly?”. 
| +| 5 | `packages/plugins/forms/src/index.ts` | Forms plugin: descriptor + `definePlugin`, routes, hooks, admin. | +| 6 | `packages/plugins/forms/src/storage.ts` | Storage collection/index declaration pattern. | +| 7 | `packages/plugins/forms/src/schemas.ts` | Zod input schemas for routes. | +| 8 | `packages/plugins/forms/src/types.ts` | Domain types stored in `ctx.storage`. | +| 9 | `packages/plugins/forms/src/handlers/submit.ts` | Public route handler: validation, media, storage, email, webhooks. | + +**Not bundled (too large or redundant):** full `packages/core/src/plugins/types.ts` — use the [repo](https://github.com/emdash-cms/emdash) or your checkout of EmDash for the complete `PluginContext` / capability types. Plugin overview docs live under `docs/src/content/docs/plugins/` in the upstream repo. + +--- + +## 5. What we want from you (review questions) + +Please be blunt. We are optimizing for **correctness, maintainability, and third-party extension ergonomics**—not for matching WooCommerce feature parity. + +### 5.1 Platform fit + +1. Is **native plugin** for commerce core + **standard plugins** for providers the right split for EmDash today? +2. Where would you **push back** on “provider registry + HTTP delegation” vs **in-process hooks** or **shared npm library** only (no runtime calls)? +3. Does the plan align with **sandboxed** constraints for extensions (CPU, subrequests, no Node in worker)? Any provider pattern that will **systematically fail** on Cloudflare? + +### 5.2 Data model and commerce semantics + +4. Is **`type` + `typeData` + separate `productVariants` / attributes** the right long-term model for bundles and variants? +5. **Payment-first inventory** reduces reservation complexity but increases **oversell risk** under concurrency. What mitigations would you require (optimistic locking, version fields, queue, last-chance validation UX)? +6. Should **orders** embed line items vs normalize to `orderLines` collection for reporting at scale? 
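On question 5 above, one common mitigation pairs payment-first finalize with an optimistic version check, so finalize refuses to decrement against stale state. A minimal sketch, with all names (`StockRow`, `decrementStock`, `inventoryVersion`) hypothetical:

```typescript
// Optimistic-concurrency sketch: a version field guards the decrement so a
// concurrent writer forces a retry instead of a silent oversell.

interface StockRow {
  productId: string;
  available: number;
  inventoryVersion: number;
}

type DecrementResult = "ok" | "version_conflict" | "insufficient_stock";

// Apply the decrement only if the caller's expected version still matches
// the row; bump the version on every successful write.
function decrementStock(
  row: StockRow,
  quantity: number,
  expectedVersion: number,
): DecrementResult {
  if (row.inventoryVersion !== expectedVersion) return "version_conflict";
  if (row.available < quantity) return "insufficient_stock";
  row.available -= quantity;
  row.inventoryVersion += 1;
  return "ok";
}
```

A storage-backed version would re-read and retry on `version_conflict`, surfacing a last-chance "inventory changed" UX only when retries cannot satisfy the requested quantity.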
+ +### 5.3 Checkout and compliance + +7. **Stripe + Authorize.net** early: does that meaningfully validate the abstraction, or would you add a **third** radically different gateway (e.g. redirect-only) in the first milestone? +8. PCI and webhook **security** (signature verification, idempotency, replay): what is **missing** from the written plan? + +### 5.4 Extensibility vs WooCommerce + +9. What WooCommerce **patterns** (if any) are we **under-weighting** that merchants still expect (e.g. fee lines, coupons, mixed carts, subscriptions)? +10. What are the top **three** ways this design could still end up as “plugin soup” like WordPress—and how to **prevent** them? + +### 5.5 AI and operations + +11. Is the **MCP / tool** strategy coherent, or would you standardize on **OpenAPI** + codegen first? +12. What **observability** (structured logs, correlation ids, order event stream) is mandatory for production? + +### 5.6 Phasing + +13. Is the **phase order** in `commerce-plugin-architecture.md` §13 sensible? What would you **reorder** or **merge**? + +--- + +## 6. Known gaps and intentional non-goals (today) + +- No **`packages/plugins/commerce`** implementation yet. +- **WooCommerce** source is excluded from version control; reviewers should not assume it is in the zip. +- **Fulfillment / shipping / tax** module is **explicitly out of core v1 scope** except as extension points. +- Diagrams in the architecture doc may name illustrative packages (e.g. PayPal in a tree); **§15** is authoritative for payment targets. + +--- + +## 7. How to return feedback + +A short written review (bullet risks + recommendations) is enough. Prefer: + +- **Severity** (blocker / major / minor / nit). +- **Concrete alternative** where you disagree with the approach. +- **References** to sections in `commerce-plugin-architecture.md` so we can trace changes. + +Thank you for the review. 
diff --git a/3rdpary_review_2.md b/3rdpary_review_2.md new file mode 100644 index 000000000..f02079a43 --- /dev/null +++ b/3rdpary_review_2.md @@ -0,0 +1,176 @@ +# Third-party technical review (round 2) — EmDash-native commerce + +> Historical review packet (round 2). Current canonical review entrypoint is: +> +> - `@THIRD_PARTY_REVIEW_PACKAGE.md` +> - `external_review.md` +> - `SHARE_WITH_REVIEWER.md` + +**Document purpose:** Give an external developer enough context to assess whether the **EmDash e-commerce / cart plugin** program is on a sound, optimal path **after** architecture hardening, a first internal review, platform alignment notes, and a small **kernel code** scaffold—not just paper design. + +**How to use this file:** Read §1–3, then the files listed in **§4 Review bundle** (inside `latest-code_2.zip`) in order. Answer **§5** with concrete risks, alternatives, and section references. + +--- + +## 1. Ecosystem: EmDash in one paragraph + +EmDash is an **Astro-based CMS** with a **TypeScript plugin** model. Plugins receive a scoped **`ctx`**: **`ctx.storage`** (indexed document collections), **`ctx.kv`**, **`ctx.http.fetch`** (with **`network:fetch`** + **`allowedHosts`** when sandboxed), **`ctx.email`**, **`ctx.content`**, **`ctx.media`**, **`ctx.users`**, **`ctx.cron`**, etc., according to **declared capabilities**. + +- **Trusted** plugins run in-process (full Node where the host allows it); capabilities are mainly documentary. +- **Sandboxed** plugins run in **Cloudflare Workers isolates** with **enforced** capabilities, **CPU/subrequest limits**, and **Block Kit** admin UI (no arbitrary admin JS from the plugin). + +**Native vs standard:** **Standard** plugins target marketplace + sandbox compatibility. 
**Native** plugins are the escape hatch for **React admin**, **Portable Text blocks**, and **Astro storefront components**—the commerce **core** is expected to be **native** for merchant UX and PT/Astro integration, while many **payment/shipping/tax** extensions remain **standard**. + +Canonical platform description: bundled **`skills/creating-plugins/SKILL.md`** (see also [upstream](https://github.com/emdash-cms/emdash/blob/main/skills/creating-plugins/SKILL.md)). + +**x402 vs cart:** EmDash ships **x402** for HTTP-native, often **per-request** content monetization. It is **not** a substitute for a product catalog, cart, and orders. Merchant-facing comparison: **`commerce-vs-x402-merchants.md`**. + +--- + +## 2. Problem we are solving + +**Pain:** WooCommerce-style **theme coupling**, **PHP hooks/filters**, and **plugins mutating global cart state**—hard to extend safely and hard to take headless. + +**Direction:** **Headless-first** (Astro storefront), **contract-driven** extensions (typed provider interfaces + registry), **explicit state machines** and **one finalization path** for payments/inventory, **EmDash primitives only** in the kernel (`ctx.*` in the plugin wrapper, not inside pure domain code). + +WooCommerce PHP source is **not** in this repository (ignored); prior analysis used Store API patterns (cart session, route decomposition, checkout validation) as **non-binding** input. + +--- + +## 3. Proposed solution (current snapshot) + +### 3.1 Core deliverable + +A **native** commerce plugin package providing: + +- **Products** — discriminated **`type` + `typeData`** (simple, variable, bundle, digital, gift card); variants and attributes in separate collections. +- **Cart** — server-side cart, line items, merge rules for logged-in users, rate limits and payload bounds (§20). 
+- **Checkout & orders** — immutable **order snapshots**, rich **order/payment** state machines, **`finalizePayment`** as the single authority for post-payment inventory decrement (aligned with **payment-first** inventory policy). +- **Providers** — payment (Stripe, Authorize.net), later shipping/tax via **separate modules** and provider contracts. +- **Admin & storefront** — React admin + Astro components (phased after the first vertical payment slice). + +Authoritative detail: **`commerce-plugin-architecture.md`** (§1–21). + +### 3.2 Extension / provider model (refined) + +- **Registry** in plugin storage for registered providers. +- **Contracts** exported from a future SDK package; **Zod**-validated route inputs where applicable. +- **Execution:** **In-process TypeScript adapters** for **first-party** gateways (fewer subrequests, simpler tests). **HTTP route delegation** to another plugin remains valid for **sandboxed** or marketplace extensions—same **interface**, different **wiring** (§4 architecture doc). + +This supersedes an earlier draft that leaned on **HTTP-only** internal delegation. + +### 3.3 Phasing (high level) + +Reflects an internal “shrink v1, prove correctness first” pass (**`emdash-commerce-final-review-plan.md`**) merged into **`commerce-plugin-architecture.md` §13**: + +1. **Phase 0** — Types, storage schema, state machines, error catalog, **no** business I/O. +2. **Phase 1** — **Kernel only** (pure domain + finalization idempotency); **no** React/Astro. +3. **Phase 2** — **One end-to-end slice: Stripe** (product → cart → checkout → webhook → finalize → email). +4. **Phase 3** — **Hardening tests** (duplicate webhook, inventory conflict, stale cart, etc.). +5. **Phase 4** — **Authorize.net** to **stress the payment abstraction** (auth/capture split). +6. Later — admin UX, storefront library, shipping/tax modules, MCP/AI tools. 
+ +**Note:** Product decision was “Stripe + Authorize.net in v1”; **implementation order** is **Stripe first**, second gateway **after** the path is proven—still satisfies “two implementations,” with lower risk. + +### 3.4 Locked product decisions + +See **`commerce-plugin-architecture.md` §15**: + +| Topic | Decision | +| -------------- | ----------------------------------------------------------------------------------------------------------------- | +| Gateways | Stripe **and** Authorize.net (implementation **sequenced**; see §3.3). | +| Inventory | **Payment-first** finalize; explicit **`inventory_changed` / `payment_conflict`** handling. | +| Shipping / tax | **Separate module**; no shipping address/quote in core without it; multi-currency/localized tax with that family. | +| Identity | Logged-in **purchase history** + **durable cart**; guest cart **merge** on login (§17). | + +### 3.5 Robustness, scale, and platform (new since round 1) + +- **§20** — Payload caps, **KV rate limits**, **client `Idempotency-Key`** + **`idempotencyKeys`** collection, **webhook** composite unique **`(providerId, externalEventId)`**, **inventory ledger**, **circuit breaker** keys, cursor pagination, lean webhook handlers. +- **§21** — Alignment with **EmDash sandbox + capabilities** and **Workers bindings / SSRF** cautions; **x402** as complementary; **no `CF-Worker`-header auth**. + +### 3.6 WooCommerce-derived backlog (optional post-v1) + +Cart **revalidate on read**, **rounding policy**, **outgoing merchant webhooks**, **email matrix**, **customer vs internal notes**, **digital download grants**, **scheduled sales**, **per-customer limits**, **multi-capture** totals—captured in chat review; not all are yet spelled out in the architecture doc. Reviewer may suggest which belong in core vs modules. 
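The client `Idempotency-Key` pattern referenced in §3.5/§20 can be sketched as below. The in-memory `Map` stands in for the `idempotencyKeys` collection and is an assumption for illustration, not the plugin's real storage API.

```typescript
// Idempotency-key sketch: the first request under a key stores its response;
// replays return the stored response instead of re-running side effects.
// A real implementation also needs a unique index or claim step so two
// concurrent requests under the same key cannot both execute.

interface StoredResponse {
  status: number;
  body: unknown;
}

const idempotencyKeys = new Map<string, StoredResponse>();

async function withIdempotency(
  key: string,
  execute: () => Promise<StoredResponse>,
): Promise<StoredResponse> {
  const prior = idempotencyKeys.get(key);
  if (prior) return prior; // replay path: no second side effect
  const response = await execute();
  idempotencyKeys.set(key, response);
  return response;
}
```

The same replay-before-execute shape underlies the webhook composite unique `(providerId, externalEventId)` guard described in §20.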
+ +### 3.7 Code that exists today + +**`packages/plugins/commerce`** — early **kernel** only: + +- Error metadata subset, **limits**, **idempotency key** validation, **rate-limit window** helper, **provider HTTP policy** constants, **`decidePaymentFinalize`** (pure idempotency / state guard) + **Vitest** tests. + +**No** `definePlugin` wiring, **no** storage adapters, **no** Stripe integration yet. + +--- + +## 4. Review bundle (`latest-code_2.zip`) + +Extract and read in this order: + +| # | Path | Role | +| --- | ----------------------------------------------- | -------------------------------------------------------------- | +| 1 | `3rdpary_review_2.md` | This briefing + questions. | +| 2 | `commerce-plugin-architecture.md` | **Authoritative** full architecture (§1–21). | +| 3 | `emdash-commerce-final-review-plan.md` | External “tighten foundation” review that influenced §13–§19. | +| 4 | `commerce-vs-x402-merchants.md` | One-page **commerce vs x402** for product positioning. | +| 5 | `high-level-plan.md` | Original short sketch; superseded where it conflicts with (2). | +| 6 | `3rdpary_review.md` | **Round 1** review packet (historical context). | +| 7 | `skills/creating-plugins/SKILL.md` | EmDash plugin model **ground truth**. | +| 8 | `packages/plugins/forms/src/index.ts` | Reference: descriptor + `definePlugin` + routes + hooks. | +| 9 | `packages/plugins/forms/src/storage.ts` | Storage index / `uniqueIndexes` pattern. | +| 10 | `packages/plugins/forms/src/schemas.ts` | Zod route inputs. | +| 11 | `packages/plugins/forms/src/types.ts` | Domain types. | +| 12 | `packages/plugins/forms/src/handlers/submit.ts` | Public handler: validation, media, storage, email, webhook. | +| 13 | `packages/plugins/commerce/package.json` | Commerce package metadata + exports. | +| 14 | `packages/plugins/commerce/tsconfig.json` | TS config. | +| 15 | `packages/plugins/commerce/vitest.config.ts` | Tests. 
| +| 16 | `packages/plugins/commerce/src/kernel/*.ts` | Kernel modules + tests. | + +**Not bundled:** `node_modules`, full `packages/core` sources, WooCommerce tree, upstream EmDash `docs/` tree (use [GitHub](https://github.com/emdash-cms/emdash) for `PluginContext` and plugin overview MDX). + +--- + +## 5. What we want from you (review questions) + +Please be direct. Prefer **severity** (blocker / major / minor / nit), **alternatives**, and **§ references** into `commerce-plugin-architecture.md`. + +### A. Architecture and phasing + +1. Is **kernel-first → Stripe vertical slice → hardening → second gateway** the right ordering for **risk** vs **time-to-feedback**? +2. Does **§20** go too far for v1, or is it about right for **production-shaped** first release? + +### B. Provider model and Cloudflare + +3. Is **in-process first-party adapters + HTTP only when sandbox requires** coherent, or would you **standardize on one** mechanism? +4. Any **systematic** failure modes on **sandboxed** provider plugins (subrequests, CPU) we still underestimate? + +### C. Data model and money + +5. **`inventoryVersion` + ledger + payment-first finalize** — sufficient **concurrency** story, or missing **compare-and-swap** / transactions explicitly? +6. Where should **tax/rounding** policy be **pinned** so totals are reproducible (per line vs per order)? + +### D. Security and abuse + +7. **Rate limits + idempotency keys + webhook composite unique** — gaps vs real **card testing** or **replay** attacks? +8. **SSRF / user-controlled URLs** — any commerce feature we should **forbid by design** (see §21)? + +### E. Extensibility and “plugin soup” + +9. Top **three** guardrails to avoid **Woo-style** accidental coupling—are **events-only** (no filters) + **provider contracts** + **layer boundaries** enough? +10. Should **outgoing merchant webhooks** be **core** earlier than post-v1? + +### F. AI / ops + +11. 
**MCP later** vs **OpenAPI-first** for agent integration—which would you prioritize after checkout works? +12. **Observability §19** — missing **must-haves** for on-call? + +### G. x402 and positioning + +13. Is **`commerce-vs-x402-merchants.md`** accurate and sufficient to avoid **internal** product confusion? + +--- + +## 6. How to return feedback + +Short written review is enough. Link suggestions to **`commerce-plugin-architecture.md` sections** where possible. + +Thank you for the review. diff --git a/3rdpary_review_3.md b/3rdpary_review_3.md new file mode 100644 index 000000000..68a980104 --- /dev/null +++ b/3rdpary_review_3.md @@ -0,0 +1,215 @@ +# 3rd Party Technical Review Request Pack + +> Historical review packet. For the current external review entrypoint, use: +> +> - `@THIRD_PARTY_REVIEW_PACKAGE.md` +> - `external_review.md` +> - `SHARE_WITH_REVIEWER.md` + +## Executive Summary + +This workspace is implementing a first-party **EmDash commerce plugin** as a correctness-first, kernel-centric slice before broader platform expansion. The objective is to avoid the complexity and fragility that comes with external CMS integrations (for example WooCommerce parity work) by owning the commerce core in EmDash with a provider-first abstraction that supports a pragmatic path to additional providers. + +This is not a full-feature commerce platform yet. It is intentionally narrowed to a **single provable end-to-end path**: + +- one canonical order lifecycle model, +- idempotent cart/checkout/payment operations, +- fixed-window rate limiting, +- strict provider execution contracts, +- deterministic finalize behavior for webhook-driven payment confirmation. + +The current edits align the code with the architectural contracts in the handover and architecture documents by tightening error semantics, clarifying rate-limit semantics, and hardening finalize decision logic. 
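The fixed-window rate limiting listed above can be sketched roughly as follows. Key naming and the in-memory counter store are assumptions for illustration; the actual kernel keeps only a window helper and backs counters with KV.

```typescript
// Fixed-window sketch: requests are bucketed by the window their timestamp
// falls into; the counter resets when a request arrives in a new window.

function windowStart(nowMs: number, windowMs: number): number {
  return Math.floor(nowMs / windowMs) * windowMs;
}

const counters = new Map<string, { window: number; count: number }>();

function allowRequest(
  key: string,
  nowMs: number,
  windowMs: number,
  limit: number,
): boolean {
  const start = windowStart(nowMs, windowMs);
  const entry = counters.get(key);
  if (!entry || entry.window !== start) {
    counters.set(key, { window: start, count: 1 });
    return true;
  }
  if (entry.count >= limit) return false;
  entry.count += 1;
  return true;
}
```

Fixed windows allow a brief burst of up to twice the limit across a window boundary; that trade-off is usually acceptable for abuse control and keeps the storage footprint to one counter per key.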
+ +--- + +## Why this approach was chosen + +### Problem framing + +- EmDash can support digital-first and traditional products in one place, but the previous path in many stacks starts with broad integration layers and only later fixes correctness issues. +- Mission-critical commerce systems fail most on correctness gaps: duplicate capture, non-idempotent checkout, replaying webhook side effects, inconsistent state transitions, and poor observability. +- The strategy here is therefore: **kernel-first, correctness-first, payment-first, then feature expansion**. + +### What makes this path robust + +- A single source of truth for commerce behavior in `packages/plugins/commerce/src/kernel`. +- Canonical enums + contracts for errors, states, and policies. +- Strongly typed provider interfaces with explicit extension boundaries. +- Storage-backed behavior for idempotency and state transitions as code evolves. + +### Why this is “phase 1” rather than full marketplace + +- Full merchant/platform features are intentionally deferred. +- The current scope is to prove one safe path in production-like conditions before adding: + - admin dashboards, + - additional providers, + - complex settlement workflows, + - multi-provider orchestration, + - advanced fraud/rate-limit controls. + +--- + +## Source documents and governing references + +You can evaluate alignment quickly by reading in this order: + +1. `HANDOVER.md` (current operating plan and open questions). +2. `commerce-plugin-architecture.md` (authoritative architecture contract). +3. `emdash-commerce-deep-evaluation.md` and `emdash-commerce-final-review-plan.md` (risk framing and recommended sequencing). +4. `3rdpary_review_2.md` and `3rdpary_review.md` (historical review context). +5. `AGENTS.md` and `skills/creating-plugins/SKILL.md` (implementation guardrails and plugin standards). 
+ +--- + +## Current target architecture + +### 1) Plugin model and execution assumptions + +- EmDash supports both native and standard plugins. +- This implementation is positioned as a **native plugin** for depth and local behavior in phase 1. +- Provider support is built on a registry + typed interface with policy controls. +- The long-term path allows provider adapters in-process (first-party) or delegated execution (worker/HTTP route) without changing the kernel’s contract. + +### 2) Commerce core principles in code + +- **Kernel owns invariants**: state transitions, checks, and decision points live in core utility + schema modules. +- **Provider is a service**: providers perform external-facing operations and return canonical events/results. +- **Persistence + idempotency are required**, not optional. +- **Finalize is single path**: one authoritative function decides whether payment finalization should proceed, become noop, or conflict. + +### 3) Domain model direction + +- Product typing follows a discriminated union pattern (`type + typeData`) to avoid null/optional ambiguity. +- Order/payment/cart models are intentionally explicit state machines with narrow allowed transitions. +- Inventory is tracked with snapshot/ledger thinking to support reconciliation and deterministic replay behavior. + +### 4) Error contract strategy + +- Error codes are canonicalized (`snake_case`) and mapped to `(httpStatus, retryable)` metadata. +- Consumers should treat error code + status as compatibility surface; message wording is secondary. + +--- + +## Changes completed in this review cycle + +The recent corrections focused on three mismatches that had direct correctness impact: + +### A. Canonical commerce errors + +File: `packages/plugins/commerce/src/kernel/errors.ts` + +- Replaced the partial internal map with the canonical `COMMERCE_ERRORS` set from `commerce-plugin-architecture.md`. 
+- This makes error handling predictable across modules and aligns code expectations with the design document. + +### B. Rate-limit semantics correction + +Files: + +- `packages/plugins/commerce/src/kernel/limits.ts` +- `packages/plugins/commerce/src/kernel/rate-limit-window.ts` +- `packages/plugins/commerce/src/kernel/rate-limit-window.test.ts` + +- Confirmed implementation is fixed-window. +- Clarified comments so docs no longer describe a sliding window. +- Added/updated tests to validate boundary behavior of fixed-window counters. + +### C. Finalization decision logic hardening + +Files: + +- `packages/plugins/commerce/src/kernel/finalize-decision.ts` +- `packages/plugins/commerce/src/kernel/finalize-decision.test.ts` + +- Expanded `OrderPaymentPhase` coverage for robust state reasoning. +- Expanded finalize outcomes for webhook receipt states (`processed`, `duplicate`, `pending`, `error`). +- Ensured explicit precedence for already-paid/cached replay conditions and non-finalizable states. +- Added unit tests for the full decision matrix. + +--- + +## Why this matters for third-party review + +This bundle is designed to let an external reviewer validate: + +1. **Specification-conformance** + - Does implementation match the architecture claims? + - Are ambiguous comments/assumptions removed? + +2. **Failure behavior** + - How the system reacts under duplicate webhook, replay, and out-of-order events. + - Whether idempotency controls produce bounded behavior. + +3. **Operational safety** + - Whether rate-limiting semantics are consistent and test-anchored. + - Whether state transitions prevent accidental double-completion. + +4. **Expansion readiness** + - Whether abstractions are sufficient for local Stripe slice now and future provider adapters later. + +--- + +## Suggested review checklist for external reviewer + +1. 
Validate the contract mapping end-to-end: + - Compare `commerce-plugin-architecture.md` vs `packages/plugins/commerce/src/kernel/errors.ts` and finalize decision behavior. +2. Validate idempotency assumptions in kernel helpers: + - `packages/plugins/commerce/src/kernel/idempotency-key.ts` and existing tests. +3. Validate rate limiting behavior under burst and window edge cases: + - `packages/plugins/commerce/src/kernel/rate-limit-window.ts` + tests. +4. Validate finalize decision precedence: + - `packages/plugins/commerce/src/kernel/finalize-decision.ts` + tests. +5. Validate provider boundary and policy behavior: + - `packages/plugins/commerce/src/kernel/provider-policy.ts`. +6. Validate integration style: + - Compare with EmDash plugin reference implementations in `packages/plugins/forms/src/*`. +7. Validate that development constraints and conventions are observed: + - `AGENTS.md` and `skills/creating-plugins/SKILL.md`. + +--- + +## Potential risk areas to watch closely + +- **Scope drift**: It is easy to add provider-agnostic abstractions before state and payload contracts are fully stable. +- **State explosion**: `OrderPaymentPhase` and webhook status unions must remain explicit; hidden values can create silent transitions. +- **Replay semantics**: Webhook handling must be deterministic across retries, including explicit memoization behavior around already-processed and duplicate signatures. +- **Operator UX coupling**: As soon as admin tooling starts writing states, they must enforce the same kernel transitions and not bypass invariants. + +--- + +## Open assumptions requiring confirmation + +- At least one external webhook/event source (likely Stripe in phase 1) will be handled via a stable reconciliation strategy that surfaces both `processed` and `error` receipt states to finalize logic. +- Inventory decrement should remain finalize-gated (not merely cart-authorized) in the first stable slice. 
+- Storage-backed idempotency and webhook receipt persistence is planned in the next coding phase as stated in the handover document. +- Analytics/financial reporting is intentionally excluded from phase 1 to avoid unverified derived state. + +--- + +## Suggested immediate next milestones (so review feedback can be verified) + +1. Implement/finish storage-backed persistence for: + - webhook receipts, + - payment attempts, + - idempotency key replay windows, + - finalized order snapshots. +2. Integrate the Stripe provider slice end-to-end with kernel contracts. +3. Implement the canonical checkout and webhook endpoints. +4. Add replay/conflict tests that assert idempotent finalization under duplicate webhook deliveries. +5. Provide minimal admin visibility for failure and reconciliation status. + +--- + +## Included files in this review package + +This package contains: + +- Architecture and directive documents: `HANDOVER.md`, `commerce-plugin-architecture.md`, `emdash-commerce-deep-evaluation.md`, `emdash-commerce-final-review-plan.md`, `high-level-plan.md`, `commerce-vs-x402-merchants.md`, `3rdpary_review.md`, `3rdpary_review_2.md`. +- Coding guardrails and plugin conventions: `AGENTS.md`, `skills/creating-plugins/SKILL.md`. +- Commerce plugin metadata and kernel code: `packages/plugins/commerce/package.json`, `packages/plugins/commerce/tsconfig.json`, `packages/plugins/commerce/vitest.config.ts`, `packages/plugins/commerce/src/kernel/errors.ts`, `packages/plugins/commerce/src/kernel/finalize-decision.ts`, `packages/plugins/commerce/src/kernel/finalize-decision.test.ts`, `packages/plugins/commerce/src/kernel/limits.ts`, `packages/plugins/commerce/src/kernel/rate-limit-window.ts`, `packages/plugins/commerce/src/kernel/rate-limit-window.test.ts`, `packages/plugins/commerce/src/kernel/idempotency-key.ts`, `packages/plugins/commerce/src/kernel/idempotency-key.test.ts`, `packages/plugins/commerce/src/kernel/provider-policy.ts`. 
+- Plugin reference implementation for pattern comparison: `packages/plugins/forms/src/index.ts`, `packages/plugins/forms/src/storage.ts`, `packages/plugins/forms/src/schemas.ts`, `packages/plugins/forms/src/handlers/submit.ts`, `packages/plugins/forms/src/types.ts`. + +--- + +## Delivery + +This document is named `3rdpary_review_3.md` and should be reviewed before `latest-code_3.zip`. diff --git a/3rdpary_review_4.md b/3rdpary_review_4.md new file mode 100644 index 000000000..21ef3ac94 --- /dev/null +++ b/3rdpary_review_4.md @@ -0,0 +1,215 @@ +# 3rd Party Technical Review Request Pack + +> Historical review packet. For the current external review entrypoint, use: +> +> - `@THIRD_PARTY_REVIEW_PACKAGE.md` +> - `external_review.md` +> - `SHARE_WITH_REVIEWER.md` + +## Executive Summary + +This workspace is implementing a first-party **EmDash commerce plugin** as a correctness-first, kernel-centric slice before broader platform expansion. The objective is to avoid the complexity and fragility that comes with external CMS integrations (for example WooCommerce parity work) by owning the commerce core in EmDash with a provider-first abstraction that supports a pragmatic path to additional providers. + +This is not a full-feature commerce platform yet. It is intentionally narrowed to a **single provable end-to-end path**: + +- one canonical order lifecycle model, +- idempotent cart/checkout/payment operations, +- fixed-window rate limiting, +- strict provider execution contracts, +- deterministic finalize behavior for webhook-driven payment confirmation. + +The current edits align the code with the architectural contracts in the handover and architecture documents by tightening error semantics, clarifying rate-limit semantics, and hardening finalize decision logic. 
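The idempotent cart/checkout/payment operations listed above follow a run-once-then-replay pattern. The sketch below is an assumption-level illustration (in-memory store, invented `run` helper); it is not the kernel's actual `idempotency-key.ts` API, and a production version must persist the record and guard the check-then-run race, which this sketch ignores.

```typescript
// Idempotent execution: the first call with a given key runs the operation
// and records its result; a replay with the same key returns the recorded
// result instead of re-running side effects. In-memory and single-threaded
// for illustration only.
class IdempotencyStore<T> {
  private completed = new Map<string, { result: T }>();

  async run(key: string, op: () => Promise<T>): Promise<{ replayed: boolean; result: T }> {
    const cached = this.completed.get(key);
    if (cached) {
      // Replay path: bounded behavior, no second execution of the operation.
      return { replayed: true, result: cached.result };
    }
    const result = await op();
    this.completed.set(key, { result });
    return { replayed: false, result };
  }
}

async function demo(): Promise<void> {
  const store = new IdempotencyStore<string>();
  let charges = 0;
  const charge = async () => { charges += 1; return "payment-intent-1"; };

  const first = await store.run("checkout:order-1", charge);
  const second = await store.run("checkout:order-1", charge);
  console.log(first.replayed, second.replayed, charges); // false true 1
}
void demo();
```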
+ +--- + +## Why this approach was chosen + +### Problem framing + +- EmDash can support digital-first and traditional products in one place, but the previous path in many stacks starts with broad integration layers and only later fixes correctness issues. +- Mission-critical commerce systems fail most on correctness gaps: duplicate capture, non-idempotent checkout, replaying webhook side effects, inconsistent state transitions, and poor observability. +- The strategy here is therefore: **kernel-first, correctness-first, payment-first, then feature expansion**. + +### What makes this path robust + +- A single source of truth for commerce behavior in `packages/plugins/commerce/src/kernel`. +- Canonical enums + contracts for errors, states, and policies. +- Strongly typed provider interfaces with explicit extension boundaries. +- Storage-backed behavior for idempotency and state transitions as code evolves. + +### Why this is “phase 1” rather than full marketplace + +- Full merchant/platform features are intentionally deferred. +- The current scope is to prove one safe path in production-like conditions before adding: + - admin dashboards, + - additional providers, + - complex settlement workflows, + - multi-provider orchestration, + - advanced fraud/rate-limit controls. + +--- + +## Source documents and governing references + +You can evaluate alignment quickly by reading in this order: + +1. `HANDOVER.md` (current operating plan and open questions). +2. `commerce-plugin-architecture.md` (authoritative architecture contract). +3. `emdash-commerce-deep-evaluation.md` and `emdash-commerce-final-review-plan.md` (risk framing and recommended sequencing). +4. `3rdpary_review_2.md` and `3rdpary_review.md` (historical review context). +5. `AGENTS.md` and `skills/creating-plugins/SKILL.md` (implementation guardrails and plugin standards). 
+ +--- + +## Current target architecture + +### 1) Plugin model and execution assumptions + +- EmDash supports both native and standard plugins. +- This implementation is positioned as a **native plugin** for depth and local behavior in phase 1. +- Provider support is built on a registry + typed interface with policy controls. +- The long-term path allows provider adapters in-process (first-party) or delegated execution (worker/HTTP route) without changing the kernel’s contract. + +### 2) Commerce core principles in code + +- **Kernel owns invariants**: state transitions, checks, and decision points live in core utility + schema modules. +- **Provider is a service**: providers perform external-facing operations and return canonical events/results. +- **Persistence + idempotency are required**, not optional. +- **Finalize is single path**: one authoritative function decides whether payment finalization should proceed, become noop, or conflict. + +### 3) Domain model direction + +- Product typing follows a discriminated union pattern (`type + typeData`) to avoid null/optional ambiguity. +- Order/payment/cart models are intentionally explicit state machines with narrow allowed transitions. +- Inventory is tracked with snapshot/ledger thinking to support reconciliation and deterministic replay behavior. + +### 4) Error contract strategy + +- Error codes are canonicalized (`snake_case`) and mapped to `(httpStatus, retryable)` metadata. +- Consumers should treat error code + status as compatibility surface; message wording is secondary. + +--- + +## Changes completed in this review cycle + +The recent corrections focused on three mismatches that had direct correctness impact: + +### A. Canonical commerce errors + +File: `packages/plugins/commerce/src/kernel/errors.ts` + +- Replaced the partial internal map with the canonical `COMMERCE_ERRORS` set from `commerce-plugin-architecture.md`. 
+- This makes error handling predictable across modules and aligns code expectations with the design document. + +### B. Rate-limit semantics correction + +Files: + +- `packages/plugins/commerce/src/kernel/limits.ts` +- `packages/plugins/commerce/src/kernel/rate-limit-window.ts` +- `packages/plugins/commerce/src/kernel/rate-limit-window.test.ts` + +- Confirmed implementation is fixed-window. +- Clarified comments so docs no longer describe a sliding window. +- Added/updated tests to validate boundary behavior of fixed-window counters. + +### C. Finalization decision logic hardening + +Files: + +- `packages/plugins/commerce/src/kernel/finalize-decision.ts` +- `packages/plugins/commerce/src/kernel/finalize-decision.test.ts` + +- Expanded `OrderPaymentPhase` coverage for robust state reasoning. +- Expanded finalize outcomes for webhook receipt states (`processed`, `duplicate`, `pending`, `error`). +- Ensured explicit precedence for already-paid/cached replay conditions and non-finalizable states. +- Added unit tests for the full decision matrix. + +--- + +## Why this matters for third-party review + +This bundle is designed to let an external reviewer validate: + +1. **Specification-conformance** + - Does implementation match the architecture claims? + - Are ambiguous comments/assumptions removed? + +2. **Failure behavior** + - How the system reacts under duplicate webhook, replay, and out-of-order events. + - Whether idempotency controls produce bounded behavior. + +3. **Operational safety** + - Whether rate-limiting semantics are consistent and test-anchored. + - Whether state transitions prevent accidental double-completion. + +4. **Expansion readiness** + - Whether abstractions are sufficient for local Stripe slice now and future provider adapters later. + +--- + +## Suggested review checklist for external reviewer + +1. 
Validate the contract mapping end-to-end: + - Compare `commerce-plugin-architecture.md` vs `packages/plugins/commerce/src/kernel/errors.ts` and finalize decision behavior. +2. Validate idempotency assumptions in kernel helpers: + - `packages/plugins/commerce/src/kernel/idempotency-key.ts` and existing tests. +3. Validate rate limiting behavior under burst and window edge cases: + - `packages/plugins/commerce/src/kernel/rate-limit-window.ts` + tests. +4. Validate finalize decision precedence: + - `packages/plugins/commerce/src/kernel/finalize-decision.ts` + tests. +5. Validate provider boundary and policy behavior: + - `packages/plugins/commerce/src/kernel/provider-policy.ts`. +6. Validate integration style: + - Compare with EmDash plugin reference implementations in `packages/plugins/forms/src/*`. +7. Validate that development constraints and conventions are observed: + - `AGENTS.md` and `skills/creating-plugins/SKILL.md`. + +--- + +## Potential risk areas to watch closely + +- **Scope drift**: It is easy to add provider-agnostic abstractions before state and payload contracts are fully stable. +- **State explosion**: `OrderPaymentPhase` and webhook status unions must remain explicit; hidden values can create silent transitions. +- **Replay semantics**: Webhook handling must be deterministic across retries, including explicit memoization behavior around already-processed and duplicate signatures. +- **Operator UX coupling**: As soon as admin tooling starts writing states, they must enforce the same kernel transitions and not bypass invariants. + +--- + +## Open assumptions requiring confirmation + +- At least one external webhook/event source (likely Stripe in phase 1) will be handled via a stable reconciliation strategy that surfaces both `processed` and `error` receipt states to finalize logic. +- Inventory decrement should remain finalize-gated (not merely cart-authorized) in the first stable slice. 
+- Storage-backed idempotency and webhook receipt persistence is planned in the next coding phase as stated in the handover document. +- Analytics/financial reporting is intentionally excluded from phase 1 to avoid unverified derived state. + +--- + +## Suggested immediate next milestones (so review feedback can be verified) + +1. Implement/finish storage-backed persistence for: + - webhook receipts, + - payment attempts, + - idempotency key replay windows, + - finalized order snapshots. +2. Integrate the Stripe provider slice end-to-end with kernel contracts. +3. Implement the canonical checkout and webhook endpoints. +4. Add replay/conflict tests that assert idempotent finalization under duplicate webhook deliveries. +5. Provide minimal admin visibility for failure and reconciliation status. + +--- + +## Included files in this review package + +This package contains: + +- Architecture and directive documents: `HANDOVER.md`, `commerce-plugin-architecture.md`, `emdash-commerce-deep-evaluation.md`, `emdash-commerce-final-review-plan.md`, `high-level-plan.md`, `commerce-vs-x402-merchants.md`, `3rdpary_review.md`, `3rdpary_review_2.md`. +- Coding guardrails and plugin conventions: `AGENTS.md`, `skills/creating-plugins/SKILL.md`. +- Commerce plugin metadata and kernel code: `packages/plugins/commerce/package.json`, `packages/plugins/commerce/tsconfig.json`, `packages/plugins/commerce/vitest.config.ts`, `packages/plugins/commerce/src/kernel/errors.ts`, `packages/plugins/commerce/src/kernel/finalize-decision.ts`, `packages/plugins/commerce/src/kernel/finalize-decision.test.ts`, `packages/plugins/commerce/src/kernel/limits.ts`, `packages/plugins/commerce/src/kernel/rate-limit-window.ts`, `packages/plugins/commerce/src/kernel/rate-limit-window.test.ts`, `packages/plugins/commerce/src/kernel/idempotency-key.ts`, `packages/plugins/commerce/src/kernel/idempotency-key.test.ts`, `packages/plugins/commerce/src/kernel/provider-policy.ts`. 
+- Plugin reference implementation for pattern comparison: `packages/plugins/forms/src/index.ts`, `packages/plugins/forms/src/storage.ts`, `packages/plugins/forms/src/schemas.ts`, `packages/plugins/forms/src/handlers/submit.ts`, `packages/plugins/forms/src/types.ts`. + +--- + +## Delivery + +This document is named `3rdpary_review_4.md` and should be reviewed before `latest-code_4.zip`. diff --git a/@THIRD_PARTY_REVIEW_PACKAGE.md b/@THIRD_PARTY_REVIEW_PACKAGE.md new file mode 100644 index 000000000..a17bbbdb8 --- /dev/null +++ b/@THIRD_PARTY_REVIEW_PACKAGE.md @@ -0,0 +1,34 @@ +# Third-Party Review Package + +Use this as the single canonical starting point for external review. + +## Share these files + +1. `@THIRD_PARTY_REVIEW_PACKAGE.md` — canonical entrypoint +2. `external_review.md` — full system/repo context +3. `HANDOVER.md` — current implementation status +4. `commerce-plugin-architecture.md` — architecture and invariants +5. `3rd-party-checklist.md` — pass/fail checklist + +For one-line onboarding: +`@THIRD_PARTY_REVIEW_PACKAGE.md` → `external_review.md` → `HANDOVER.md` → `commerce-plugin-architecture.md`. + +## Supporting evidence + +- `packages/plugins/commerce/FINALIZATION_REVIEW_AUDIT.md` +- `packages/plugins/commerce/COMMERCE_DOCS_INDEX.md` +- `packages/plugins/commerce/src/orchestration/finalize-payment.ts` +- `packages/plugins/commerce/src/orchestration/finalize-payment.test.ts` + +## Reviewer guidance + +- Treat `@THIRD_PARTY_REVIEW_PACKAGE.md` as the only current entrypoint. +- The main residual production caveat is the documented same-event concurrency limit of the underlying storage model. +- Spend most review time on the failure-heavy paths: duplicate webhook delivery, replay/resume behavior, partial inventory writes, and cart ownership checks. +- Treat receipt `pending` as a correctness boundary, not a cosmetic state: it is both the resumable claim marker and the replay control surface for finalization. 
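The `pending` receipt semantics described above can be expressed as a small decision function. The state names and outcome labels below are assumptions distilled from this packet's descriptions, not the plugin's actual finalize-decision contract:

```typescript
// Sketch of a receipt-driven finalize decision. "pending" acts as both the
// resumable claim marker (this delivery may continue in-flight work) and the
// replay control surface (a later delivery must not start a second
// finalization). State names are illustrative.
type ReceiptState = "none" | "pending" | "processed" | "duplicate" | "error";

type FinalizeDecision =
  | { action: "proceed" }  // no prior receipt: claim and finalize
  | { action: "resume" }   // pending claim exists: continue, don't restart
  | { action: "noop" }     // already processed or duplicate: replay-safe exit
  | { action: "retry" };   // prior attempt errored: safe to try again

function decideFinalize(receipt: ReceiptState): FinalizeDecision {
  switch (receipt) {
    case "none":      return { action: "proceed" };
    case "pending":   return { action: "resume" };
    case "processed":
    case "duplicate": return { action: "noop" };
    case "error":     return { action: "retry" };
  }
}

console.log(decideFinalize("pending").action); // "resume"
```

The exhaustive switch over an explicit union is the point: any new receipt state forces a compile-time decision rather than a silent fall-through.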
+ +## Scope note + +The current package is intentionally narrow: this is a Stage-1 commerce kernel, +not a generalized provider platform. Evaluate correctness, replay safety, and +boundary discipline before proposing broader architecture changes. diff --git a/CHANGELOG_REVIEW_NOTES.md b/CHANGELOG_REVIEW_NOTES.md new file mode 100644 index 000000000..c3235de12 --- /dev/null +++ b/CHANGELOG_REVIEW_NOTES.md @@ -0,0 +1,8 @@ +# Third-Party Review Changelog Notes + +- 2026-04-03: Canonicalized the review entrypoint (single canonical packet via `@THIRD_PARTY_REVIEW_PACKAGE.md`), removed lingering legacy `src/hash.ts` dependence from the review status, and recorded stage-1 runtime completion: possession enforcement + closed-loop finalize path + deterministic webhook/idempotency behavior. +- 2026-04-02: Replaced partial commerce error metadata in `packages/plugins/commerce/src/kernel/errors.ts` with canonical `COMMERCE_ERRORS` to align kernel error contracts with architecture. +- 2026-04-02: Clarified `packages/plugins/commerce/src/kernel/limits.ts` and related comments to state explicit fixed-window rate-limit semantics, matching implementation behavior. +- 2026-04-02: Added fixed-window boundary coverage in `packages/plugins/commerce/src/kernel/rate-limit-window.test.ts` to prevent ambiguity around window resets. +- 2026-04-02: Expanded finalize decision types and precedence rules in `packages/plugins/commerce/src/kernel/finalize-decision.ts` to handle paid/replay/pending/error/non-finalizable states deterministically. +- 2026-04-02: Updated `packages/plugins/commerce/src/kernel/finalize-decision.test.ts` with coverage for webhook receipt states (`processed`, `duplicate`, `pending`, `error`) and explicit already-paid/no-op precedence.
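The error-metadata changes above refer to a map from snake_case codes to `(httpStatus, retryable)` pairs. A sketch of that shape, with hypothetical example codes (the real set is the canonical `COMMERCE_ERRORS` defined in `commerce-plugin-architecture.md`):

```typescript
// Illustrative error-metadata map. Codes here are invented examples; only
// the shape -- snake_case code mapped to (httpStatus, retryable) -- mirrors
// the documented contract.
type ErrorMeta = { httpStatus: number; retryable: boolean };

const EXAMPLE_ERRORS: Record<string, ErrorMeta> = {
  cart_not_found:        { httpStatus: 404, retryable: false },
  rate_limited:          { httpStatus: 429, retryable: true },
  payment_provider_down: { httpStatus: 502, retryable: true },
  order_already_paid:    { httpStatus: 409, retryable: false },
};

// Consumers treat (code, httpStatus) as the compatibility surface; the
// human-readable message is secondary and free to change.
function toResponse(code: string, message: string) {
  const meta = EXAMPLE_ERRORS[code] ?? { httpStatus: 500, retryable: false };
  return {
    status: meta.httpStatus,
    body: { code, retryable: meta.retryable, message },
  };
}

console.log(toResponse("rate_limited", "Too many checkout attempts").status); // 429
```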
diff --git a/COMMERCE_REVIEW_OPTION_A_EXECUTION_NOTES.md b/COMMERCE_REVIEW_OPTION_A_EXECUTION_NOTES.md new file mode 100644 index 000000000..6f19064a3 --- /dev/null +++ b/COMMERCE_REVIEW_OPTION_A_EXECUTION_NOTES.md @@ -0,0 +1,10 @@ +# Commerce finalize hardening execution notes (DEPRECATED) + +This document is archived historical context for **Option A** execution. +Use the current canonical packet in: + +- `@THIRD_PARTY_REVIEW_PACKAGE.md` +- `external_review.md` +- `HANDOVER.md` +- `3rd-party-checklist.md` +- `packages/plugins/commerce/FINALIZATION_REVIEW_AUDIT.md` diff --git a/COMMERCE_REVIEW_OPTION_A_PLAN.md b/COMMERCE_REVIEW_OPTION_A_PLAN.md new file mode 100644 index 000000000..3a3629323 --- /dev/null +++ b/COMMERCE_REVIEW_OPTION_A_PLAN.md @@ -0,0 +1,10 @@ +# Commerce finalize hardening review plan (DEPRECATED) + +This document is archived historical context for **Option A** planning. +Use the current canonical packet in: + +- `@THIRD_PARTY_REVIEW_PACKAGE.md` +- `external_review.md` +- `HANDOVER.md` +- `3rd-party-checklist.md` +- `packages/plugins/commerce/FINALIZATION_REVIEW_AUDIT.md` diff --git a/HANDOVER.md b/HANDOVER.md new file mode 100644 index 000000000..a9343a86f --- /dev/null +++ b/HANDOVER.md @@ -0,0 +1,98 @@ +# HANDOVER + +## 1) Purpose and current problem statement + +This repository is an EmDash monorepo with the active work on the commerce plugin in `packages/plugins/commerce`. The current objective is to stabilize and simplify ordered-child behavior (asset links and bundle components) without changing runtime contracts, then continue external-review-driven hardening of correctness in catalog reads, inventory coupling, and checkout/finalize invariants. + +This handoff is for the next phase only: keep behavior stable, apply smallest possible patches, and avoid speculative refactors outside the requested scope. + +## 2) Completed work and outcomes + +The latest cycle completed the Strategy A lock-in pass. 
Existing ordered-child helper logic was moved from `catalog.ts` into a neutral utility module so catalog handlers now consume a shared contract rather than local duplicates. This reduced duplication and made ordering invariants easier to test while preserving behavior. + +Recent work before this handoff also includes: + +- catalog read-path batching improvements to reduce per-product query fan-out. +- `inventoryStockDocId` moved into shared library code and consumed from lib/orchestration call sites to reduce coupling. +- fixes for initial failures in collection helper usage and batching return-shape handling. +- 5F staged rollout and proof follow-through for strict claim-lease finalization: + - strict/legacy finalize test families were validated, + - strict-metadata replay behavior is documented in current strategy/regression notes, + - rollout evidence artifacts were recorded for audit and ops promotion. + +The branch was pushed at commit `ab065b3` with passing typecheck/tests/lint for the commerce package at handoff. + +## 3) Failures, open issues, and lessons learned + +Observed issues were concrete and fixed in place: + +- A tuple parsing/type-shape issue in read-path batching during an earlier stage. +- Unbound `getMany` method access in collection helpers for test doubles. +- A move-invariant edge case around ordered rows was addressed with centralized helper tests while leaving semantics unchanged. + +There are no known blocking runtime regressions at this point. + +Open issues to prioritize next: + +1. Keep catalog responsibilities manageable; `catalog.ts` remains large, so consider splitting only if new behavior adds complexity that warrants a structural refactor. +2. Periodically review CI configuration policy and reapply the temporary process changes when needed. + +Lessons: + +- Keep helpers compatible with both real storage and in-memory collections. +- Keep ordering semantics in one place and assert them through shared tests.
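The ordered-child semantics discussed above (position normalization and re-sequencing for asset links and bundle components) can be sketched as a pure helper. The row shape and function name below are illustrative assumptions, not the real `ordered-rows.ts` exports:

```typescript
// Sketch of position normalization for ordered child rows: sort by stored
// position, then re-sequence to a dense 0..n-1 run so inserts, moves, and
// deletes never leave gaps or ties. Illustrative shape only.
type OrderedRow = { id: string; position: number };

function normalizePositions<T extends OrderedRow>(rows: T[]): T[] {
  return [...rows]
    // Tie-break on id so normalization is deterministic even when two rows
    // were persisted with the same position.
    .sort((a, b) => a.position - b.position || a.id.localeCompare(b.id))
    .map((row, index) => ({ ...row, position: index }));
}

// Gaps (position 5) and ties (two rows at 2) collapse to a dense sequence.
const normalized = normalizePositions([
  { id: "b", position: 5 },
  { id: "a", position: 2 },
  { id: "c", position: 2 },
]);
console.log(normalized.map((r) => `${r.id}:${r.position}`)); // ["a:0", "c:1", "b:2"]
```

Keeping this as a single shared helper is what lets the handlers assert one ordering invariant through one test suite instead of duplicating the logic per entity.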
+ +## 4) Files changed, key insights, and gotchas + +Priority files for continuation: + +- `packages/plugins/commerce/src/handlers/catalog.ts` — shared ordered-row helpers removed from this file and replaced with imports. +- `packages/plugins/commerce/src/lib/ordered-rows.ts` — canonical ordered-row normalization/mutation/persistence logic. +- `packages/plugins/commerce/src/lib/ordered-rows.test.ts` — regression coverage for ordering/normalization/mutation behavior. +- `packages/plugins/commerce/src/handlers/catalog.test.ts` — order-related scenarios remain covered. +- `packages/plugins/commerce/src/lib/inventory-stock.ts` — shared inventory id helper. +- `packages/plugins/commerce/src/lib/catalog-order-snapshots.ts` +- `packages/plugins/commerce/src/lib/checkout-inventory-validation.ts` + +Gotchas: + +- Do not call collection methods unbound when they depend on internal `this` (`getMany`, `query`, etc.). +- Preserve ordered-child semantics exactly when extending handlers (position normalization, list re-sequencing, and updated `position` persistence). +- Keep tests aligned to behavior; do not alter finalize/checkout contracts unless explicitly required by a correctness issue. 
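The first gotcha above can be made concrete. The `Collection` class below is a stand-in with an invented shape, not the actual storage collection API:

```typescript
// Why unbound method access breaks: extracting a method from an object loses
// its `this` binding, so internal state access fails at call time. The
// Collection class here is an illustrative stand-in only.
class Collection {
  private docs = new Map<string, { id: string }>([["p1", { id: "p1" }]]);

  getMany(ids: string[]) {
    // Depends on `this.docs`; if called unbound, `this` is undefined and
    // the property access throws.
    return ids.map((id) => this.docs.get(id));
  }
}

const col = new Collection();

const unbound = col.getMany;           // detached reference: `this` is lost
// unbound(["p1"]);                    // would throw: cannot read `docs`

const bound = col.getMany.bind(col);   // fix 1: bind explicitly
const wrapped = (ids: string[]) => col.getMany(ids); // fix 2: arrow wrapper

console.log(bound(["p1"])[0]?.id);     // "p1"
console.log(wrapped(["p1"])[0]?.id);   // "p1"
```

The arrow-wrapper form is usually the safer default in handlers, since it also survives refactors that replace the real collection with an in-memory test double.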
+ +## 5) Key files and directories + +Critical paths: + +- `packages/plugins/commerce/src/handlers/` +- `packages/plugins/commerce/src/lib/` +- `packages/plugins/commerce/src/orchestration/` +- `packages/plugins/commerce/src/schema/` (if migration-level adjustments are needed) +- `packages/plugins/commerce/src/types.ts` +- `packages/plugins/commerce/src/schemas.ts` + +Documentation for onboarding and review context: + +- `HANDOVER.md` +- `external_review.md` +- `@THIRD_PARTY_REVIEW_PACKAGE.md` +- `emdash_commerce_review_update_ordered_children.md` +- `packages/plugins/commerce/COMMERCE_DOCS_INDEX.md` +- `prompts.txt` + +## 6) Baseline check before coding + +Run these commands before new changes: + +- `pnpm --silent lint:quick` +- `pnpm typecheck` +- `pnpm --filter @emdash-cms/plugin-commerce test` + +## 7) Completion checklist + +Before final handoff each batch: + +- Update `HANDOVER.md` with what changed and why. +- Record the commit hash. +- Confirm no uncommitted changes with `git status`. +- Confirm `test/lint/typecheck` status for touched package(s). diff --git a/SHARE_WITH_REVIEWER.md b/SHARE_WITH_REVIEWER.md new file mode 100644 index 000000000..0afbfccec --- /dev/null +++ b/SHARE_WITH_REVIEWER.md @@ -0,0 +1,41 @@ +# Files to share with a 3rd-party reviewer + +Use `@THIRD_PARTY_REVIEW_PACKAGE.md` as the single canonical review entrypoint. + +For a single-file handoff, share: + +- `commerce-plugin-external-review.zip` +- `SHARE_WITH_REVIEWER.md` (this file) + +`commerce-plugin-external-review.zip` is regenerated from the current repository +state via: + +```bash +./scripts/build-commerce-external-review-zip.sh +``` + +That archive contains: + +- full `packages/plugins/commerce/` source tree (excluding `node_modules` and `.vite`), +- all review packet files required for onboarding: + - `@THIRD_PARTY_REVIEW_PACKAGE.md` + - `external_review.md` + - `HANDOVER.md` + - `commerce-plugin-architecture.md` + - `3rd-party-checklist.md` +- no nested `*.zip` artifacts. 
+ +For local verification, confirm the archive metadata in your message: + +- File path: `./commerce-plugin-external-review.zip` +- Generator script: `scripts/build-commerce-external-review-zip.sh` +- Build anchor: commit `bda8b75` (generated 2026-04-03) + +`SHARE_WITH_REVIEWER.md` is intentionally shared outside the zip because it is the +single-file handoff companion and should be included directly in the reviewer message. + +Ask reviewers to focus on: + +- same-event concurrent webhook delivery as the main residual production risk, +- `pending` receipt semantics as a replay/resume correctness boundary, +- duplicate delivery, partial-write recovery, and cart ownership edge cases over broad architecture suggestions. diff --git a/commerce-plugin-architecture.md b/commerce-plugin-architecture.md new file mode 100644 index 000000000..c963d68e2 --- /dev/null +++ b/commerce-plugin-architecture.md @@ -0,0 +1,1870 @@ +# EmDash Commerce Plugin — Architecture Plan + +> This document supersedes the high-level-plan.md sketch and serves as the +> authoritative blueprint before any code is written. It defines principles, +> extension model, data model, route contracts, AI strategy, phased plan, and +> the complete specification for Step 1. + +--- + +## 1. The Core Problem We Are Solving + +WooCommerce's extensibility problems are not implementation bugs — they are +**architectural mismatches**: + +- Theme/layout coupling (Storefront theme overrides, child themes, template + hierarchy). +- Untyped PHP hook/filter system (`add_action`, `add_filter`) with no + discoverability, no contracts, and no type safety. +- Extension plugins that mutate global cart state unpredictably. +- Product types implemented via class inheritance, making new types invasive. +- Admin UI built on WordPress core, requiring deep WP-specific knowledge. 
+ +Our solution makes different foundational decisions: + +| Problem | Our answer | +| ------------------------------- | ------------------------------------------------------------------------------------------------------ | +| Layout coupling | Headless by default. Frontend is pure Astro. Plugin ships components, not themes. | +| Untyped hooks | Typed TypeScript event catalog. Hooks are observations, not filters. | +| Mutable global state | Immutable data flow. Cart/order state transitions are explicit and guarded. | +| Inheritance-based product types | Discriminated union + `typeData` blob. New types are additive, not invasive. | +| WP admin complexity | Block Kit (declarative JSON) for standard UI; React only where complexity demands it. | +| Extension plugin fragility | Provider registry contract. Extensions register themselves; core calls them via typed route contracts. | + +--- + +## 2. Design Philosophy + +**Correctness over cleverness.** Every mutation goes through an explicit state +check. No implicit side effects. + +**Contracts are the product.** The TypeScript interfaces this plugin exports to +extension developers are the API surface. They must be stable, narrow, and +well-documented. + +**EmDash-native primitives first.** `ctx.storage`, `ctx.kv`, `ctx.http`, +`ctx.email`, `ctx.cron` cover every need. No npm dependencies for core logic. + +**AI as a first-class actor.** Every operation that a human merchant performs +must also be performable by an AI agent. This shapes route design, event +structure, and error semantics. + +**YAGNI until the data model.** For the data model, think ahead — it is +expensive to migrate. For everything else, build the minimum that is correct. + +--- + +## 3. 
Plugin Architecture Hierarchy
+
+```
+EmDash CMS Core
+└── @emdash-cms/plugin-commerce              ← Native plugin (React admin, Astro, PT blocks)
+    │
+    ├── Provider extension points (Standard plugins — marketplace-publishable)
+    │   ├── @emdash-cms/plugin-commerce-stripe   Payment provider
+    │   ├── @emdash-cms/plugin-commerce-paypal   Payment provider
+    │   ├── @emdash-cms/plugin-shipping-flat     Shipping provider
+    │   ├── @emdash-cms/plugin-tax-simple        Tax provider
+    │   └── @emdash-cms/plugin-commerce-mcp      MCP server for AI agents
+    │
+    └── Storefront extensions (Standard plugins — marketplace-publishable)
+        ├── @emdash-cms/plugin-reviews           Product reviews
+        ├── @emdash-cms/plugin-wishlist          Wishlist
+        ├── @emdash-cms/plugin-loyalty           Points / loyalty
+        └── @emdash-cms/plugin-subscriptions     Recurring billing
+```
+
+### Why native for the core plugin?
+
+The commerce core requires:
+
+- Complex React admin UI (product variant editor, order management, media upload).
+- Astro components for frontend rendering (product card, cart, and similar storefront pieces).
+- Portable Text block types (embed product in a content body).
+
+These features are **native-only** per EmDash's plugin model. The plugin still
+uses `ctx.*` APIs for all data access and produces no privileged side effects —
+it is architecturally equivalent to a standard plugin in terms of isolation, but
+needs the native execution context for its UI.
+
+### Why standard for extension plugins?
+
+Extension plugins (payment gateways, shipping, tax, reviews) have simple,
+narrow concerns: implement a typed interface and expose one to three routes.
+Standard format is sufficient, allows marketplace distribution, and can be
+sandboxed — appropriate for third-party code.
+
+---
+
+## 4. Extension Framework Model
+
+WooCommerce uses PHP abstract classes and hooks to let extension plugins add
+payment gateways, shipping methods, and product types. This is powerful but
+brittle. Our model uses the **provider registry pattern**.
+
+### How it works
+
+1. 
The commerce plugin defines typed **provider interfaces** as exported TypeScript + types in a companion SDK package (`@emdash-cms/plugin-commerce-sdk`). + +2. Extension plugins import the SDK, implement the interface, and call our + `providers/register` route on `plugin:activate`. The registration record is + stored in our `providers` collection. + +3. At runtime (checkout, shipping estimate, tax calculation), our commerce + plugin reads the active provider from storage, then delegates to the + provider's route via `ctx.http.fetch`. + +4. On `plugin:deactivate`, extension plugins call `providers/unregister`. + +### Contracts (in `@emdash-cms/plugin-commerce-sdk`) + +``` +PaymentProviderContract + - routes.initiate → PaymentInitiateRequest → PaymentInitiateResponse + - routes.confirm → PaymentConfirmRequest → PaymentConfirmResponse + - routes.refund → PaymentRefundRequest → PaymentRefundResponse + - routes.webhook → raw webhook payload → void + +ShippingProviderContract + - routes.getRates → ShippingRateRequest → ShippingRate[] + +TaxProviderContract + - routes.calculate → TaxCalculationRequest → TaxCalculationResponse + +FulfillmentProviderContract + - routes.fulfill → FulfillmentRequest → FulfillmentResponse + - routes.getStatus → { fulfillmentRef } → FulfillmentStatus +``` + +### Key properties of this model + +- **No class inheritance.** Extension plugins implement a structural interface. +- **No PHP-style filters.** Extensions cannot mutate core data mid-flow. +- **Type-safe contracts.** The SDK package exports Zod schemas matching the + interfaces. Extension plugin authors get compile-time safety. +- **Multiple providers, one active.** The registry supports multiple registered + providers per type. The merchant selects the active one in admin settings. + Fallback behavior is defined per type. + +### Provider execution model — two modes, one contract + +The contract interface is identical in both modes. 
**Execution mode** depends on +how the provider plugin is installed: + +| Mode | When | How the core calls the provider | +| ---------------------- | -------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------- | +| **In-process adapter** | Plugin installed as trusted (in-process, `plugins: []`) | Direct TypeScript function call. No HTTP. No subrequest. | +| **Route delegation** | Plugin installed as sandboxed (`sandboxed: []`) or across isolate boundary | Core calls `ctx.http.fetch` to the provider's plugin route. Required by the EmDash sandbox model — the only permitted cross-isolate boundary. | + +**Default rule:** First-party provider plugins (Stripe, Authorize.net) run as +trusted in-process adapters. External API calls (to Stripe/Authorize.net APIs) +happen **inside** the provider adapter using `ctx.http.fetch` — not in the core +checkout path. Route delegation is reserved for genuinely sandboxed or +marketplace-distributed extensions. + +This preserves the contract model, removes unnecessary faux-network indirection +from the core checkout path, and keeps local dev and testing simple. + +--- + +## 5. Product Type Model + +WooCommerce implements product types as PHP class inheritance. Adding a new type +means extending `WC_Product` and registering hooks everywhere. This is the +primary source of plugin complexity for most WooCommerce stores. + +Our model uses a **discriminated union** with a `type` field and a `typeData` +JSON blob. The base product record is always the same. Type-specific fields live +in `typeData` and are validated in route handlers, not at the storage layer. 
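The `type` + `typeData` split above can be sketched as a plain validator map. The field shapes and validator names below are illustrative stand-ins, not the plugin's actual Zod schemas:

```typescript
// Illustrative sketch of per-type typeData validation in route handlers.
// The field checks are simplified stand-ins for the real Zod schemas.
type ProductType = "simple" | "variable" | "bundle" | "digital" | "gift_card";

// One validator per type; adding a product type means adding one entry here
// and one handler branch, with no change to the base product record.
const typeDataValidators: Record<ProductType, (data: unknown) => boolean> = {
  simple: (d) => {
    const t = d as { sku?: unknown; stockQty?: unknown };
    return typeof t?.sku === "string" && typeof t?.stockQty === "number";
  },
  variable: (d) => Array.isArray((d as { attributeIds?: unknown })?.attributeIds),
  bundle: (d) => Array.isArray((d as { items?: unknown })?.items),
  digital: (d) => Array.isArray((d as { downloads?: unknown })?.downloads),
  gift_card: (d) => Array.isArray((d as { denominations?: unknown })?.denominations),
};

export function validateTypeData(type: ProductType, typeData: unknown): boolean {
  return typeDataValidators[type](typeData);
}
```

A handler branches on the discriminant once, validates `typeData`, and stores the record with the shared base shape; nothing in core changes when a type is added.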
+
+### Product type taxonomy
+
+| Type        | Description                                                |
+| ----------- | ---------------------------------------------------------- |
+| `simple`    | Single SKU, fixed price, tracked inventory                 |
+| `variable`  | Parent product with variants (color × size, etc.)          |
+| `bundle`    | Composed of other products with optional pricing rules     |
+| `digital`   | Downloadable file(s), no shipping, optional license limits |
+| `gift_card` | Fixed or custom denomination, delivered by email           |
+
+New types are additive: define a new `typeData` shape, add a validator, add a
+route handler branch. Nothing in core changes.
+
+### ProductBase (all types share this)
+
+```typescript
+interface ProductBase {
+  type: "simple" | "variable" | "bundle" | "digital" | "gift_card";
+  name: string;
+  slug: string; // URL-safe, unique
+  status: "draft" | "active" | "archived";
+  publishedAt?: string; // When first made active; null = never published
+  descriptionBlocks?: unknown[]; // Portable Text
+  shortDescription?: string; // Plain text summary (for AI/search/embeddings)
+  searchText?: string; // Denormalized: name + sku + tags for full-text queries
+  basePrice: number; // Cents / smallest currency unit
+  compareAtPrice?: number; // Strike-through price
+  currency: string; // ISO 4217
+  mediaIds: string[]; // References to ctx.media
+  categoryIds: string[];
+  tags: string[];
+  requiresShipping: boolean; // false for digital, gift cards; affects checkout flow
+  taxCategory?: string; // For tax module: "standard" | "reduced" | "zero" | custom
+  defaultVariantId?: string; // For variable products: pre-selected variant on product page
+  seoTitle?: string;
+  seoDescription?: string;
+  typeData: Record<string, unknown>; // Validated per type in handlers
+  meta: Record<string, unknown>; // Extension plugins store data here; not a junk drawer
+  createdAt: string;
+  updatedAt: string;
+}
+```
+
+### Type-specific typeData shapes
+
+```typescript
+interface SimpleTypeData {
+  sku: string;
+  stockQty: number;
+  stockPolicy: "track" | "ignore" | 
"backorder"; + weight?: number; // grams + dimensions?: { length: number; width: number; height: number }; // mm + shippingClass?: string; + taxClass?: string; +} + +interface VariableTypeData { + attributeIds: string[]; // References productAttributes collection + // Variants stored separately in productVariants collection +} + +interface BundleTypeData { + items: Array<{ + productId: string; + variantId?: string; + qty: number; + priceOverride?: number; // Override individual item price in bundle + }>; + pricingMode: "fixed" | "calculated" | "discount"; + discountPercent?: number; // For pricingMode: "discount" +} + +interface DigitalTypeData { + downloads: Array<{ + fileId: string; + name: string; + downloadLimit?: number; + }>; + licenseType: "single" | "multi" | "unlimited"; + downloadExpiryDays?: number; +} + +interface GiftCardTypeData { + denominations: number[]; // Fixed amount options + allowCustomAmount: boolean; + minCustomAmount?: number; + maxCustomAmount?: number; +} +``` + +### Product variants (for type: "variable") + +Variants are stored in a separate `productVariants` collection. Each variant is +a complete purchasable unit with its own SKU, price, and stock. + +```typescript +interface ProductVariant { + productId: string; + sku: string; + attributeValues: Record; // { color: "Red", size: "L" } + price: number; + compareAtPrice?: number; + stockQty: number; + stockPolicy: "track" | "ignore" | "backorder"; + inventoryVersion: number; // Monotonic counter; used in finalize-time optimistic check + mediaIds: string[]; + active: boolean; + sortOrder: number; + meta: Record; + createdAt: string; + updatedAt: string; +} +``` + +### Product attributes (for type: "variable") + +Attributes define the axis of variation (Color, Size). Terms define the values +(Red, Blue; Small, Medium, Large). 
+
+```typescript
+interface ProductAttribute {
+  name: string;
+  slug: string;
+  displayType: "select" | "color_swatch" | "button";
+  terms: Array<{
+    label: string;
+    value: string;
+    color?: string; // For displayType: "color_swatch"
+    sortOrder: number;
+  }>;
+  sortOrder: number;
+  createdAt: string;
+}
+```
+
+---
+
+## 6. Cart and Order Data Model
+
+### Cart
+
+```typescript
+type CartStatus =
+  | "active" // In use; items can be added/removed
+  | "merged" // Anonymous cart merged into a logged-in user's cart on login
+  | "abandoned" // No activity for configured TTL; cron marks it; triggers recovery flow
+  | "converted" // Checkout completed; order created from this cart
+  | "expired"; // Past expiresAt without conversion or abandonment action
+
+interface Cart {
+  cartToken: string; // Opaque, used in Cookie / Authorization header
+  userId?: string; // Set when authenticated user is identified
+  status: CartStatus;
+  currency: string;
+  discountCode?: string;
+  discountAmount?: number;
+  shippingRateId?: string; // Selected shipping rate ID from provider
+  shippingAmount?: number;
+  taxAmount?: number;
+  note?: string;
+  expiresAt: string;
+  createdAt: string;
+  updatedAt: string;
+}
+
+interface CartItem {
+  cartId: string;
+  productId: string;
+  variantId?: string;
+  qty: number;
+  unitPrice: number; // Cents. Frozen at time of add.
+  lineTotal: number; // qty × unitPrice
+  meta: Record<string, unknown>; // Extension data (e.g., bundle composition)
+  createdAt: string;
+  updatedAt: string;
+}
+```
+
+### Order state machine
+
+Allowed transitions only. Handlers must reject any transition not in this table.
+
+```
+draft
+  ↓ checkout.create called
+payment_pending
+  ↓ gateway webhook: authorized (auth-only flow, e.g. 
Authorize.net) +authorized + ↓ gateway webhook: captured (immediate for Stripe card; delayed for bank ACH) + ↓ (from payment_pending direct, for gateways with no separate auth step) +paid + ↓ merchant/agent marks processing +processing + ↓ fulfillment webhook or manual mark +fulfilled + +From any pre-fulfilled state: + → canceled (before payment_pending: no gateway action needed) + → canceled (from authorized: void must be called on gateway first) + +From paid / fulfilled: + → refund_pending (refund initiated, awaiting gateway confirmation) + → refunded (gateway confirmed full refund) + → partial_refund (gateway confirmed partial refund) + +Exceptional: + → payment_conflict (payment succeeded at gateway but inventory finalize failed; + requires manual resolution or auto-void/refund) +``` + +```typescript +type OrderStatus = + | "draft" // Order record created; payment not yet initiated + | "payment_pending" // Payment session initiated; awaiting gateway event + | "authorized" // Payment authorized but not yet captured (auth+capture flows) + | "paid" // Payment captured; inventory decremented + | "processing" // Paid; merchant/fulfillment is preparing the shipment + | "fulfilled" // Shipped or delivered; order complete + | "canceled" // Canceled before/without successful payment + | "refund_pending" // Refund initiated; awaiting gateway confirmation + | "refunded" // Fully refunded + | "partial_refund" // Partially refunded + | "payment_conflict"; // Payment succeeded but finalization failed; needs resolution + +type PaymentStatus = + | "requires_action" // Awaiting customer action (3DS, redirect, bank confirmation) + | "pending" // Submitted to gateway; no confirmation yet + | "authorized" // Authorized but not captured + | "captured" // Funds captured (equivalent to "paid" at payment level) + | "failed" // Gateway rejected or timed out + | "voided" // Authorization canceled before capture + | "refund_pending" // Refund in flight + | "refunded" // Fully refunded + 
| "partial_refund"; // Partially refunded + +interface Order { + orderNumber: string; // Human-readable, unique: ORD-2026-00001 + cartId?: string; + userId?: string; + customer: CustomerSnapshot; // Frozen at checkout time + lineItems: OrderLineItem[]; // Frozen at checkout time + subtotal: number; + discountCode?: string; + discountAmount: number; + shippingAmount: number; + taxAmount: number; + total: number; + currency: string; + status: OrderStatus; + paymentStatus: PaymentStatus; + paymentProviderId?: string; + paymentProviderRef?: string; // Provider's transaction / charge ID + fulfillmentProviderId?: string; + fulfillmentRef?: string; + notes?: string; + meta: Record; + createdAt: string; + updatedAt: string; +} + +interface OrderLineItem { + productId: string; + variantId?: string; + productName: string; // Snapshot — survives product deletion + sku: string; // Snapshot + qty: number; + unitPrice: number; + lineTotal: number; + meta: Record; +} + +interface OrderEvent { + orderId: string; + eventType: string; // "status_changed" | "note_added" | "refund_initiated" | etc. + actor: "customer" | "merchant" | "system" | "agent"; + payload: Record; + createdAt: string; +} +``` + +### Customer snapshot + +```typescript +interface CustomerSnapshot { + email: string; + firstName: string; + lastName: string; + phone?: string; + billingAddress: Address; + shippingAddress: Address; +} + +interface Address { + line1: string; + line2?: string; + city: string; + state: string; + postalCode: string; + country: string; // ISO 3166-1 alpha-2 +} +``` + +--- + +## 7. 
Storage Schema + +```typescript +export const COMMERCE_STORAGE_CONFIG = { + products: { + indexes: [ + "status", + "type", + "createdAt", + "updatedAt", + ["status", "type"], + ["status", "createdAt"], + ] as const, + uniqueIndexes: ["slug"] as const, + }, + productVariants: { + indexes: ["productId", "active", ["productId", "active"], ["productId", "sortOrder"]] as const, + uniqueIndexes: ["sku"] as const, + }, + productAttributes: { + indexes: ["sortOrder"] as const, + uniqueIndexes: ["slug"] as const, + }, + carts: { + indexes: [ + "userId", + "status", + "expiresAt", + "createdAt", + ["status", "expiresAt"], + ["userId", "status"], + ] as const, + uniqueIndexes: ["cartToken"] as const, + }, + cartItems: { + indexes: ["cartId", "productId", ["cartId", "productId"]] as const, + }, + orders: { + indexes: [ + "status", + "paymentStatus", + "userId", + "createdAt", + ["status", "createdAt"], + ["userId", "createdAt"], + ["paymentStatus", "createdAt"], + ] as const, + uniqueIndexes: ["orderNumber"] as const, + }, + orderEvents: { + indexes: ["orderId", "createdAt", ["orderId", "createdAt"]] as const, + }, + providers: { + indexes: ["providerType", "active", "pluginId", ["providerType", "active"]] as const, + uniqueIndexes: ["providerId"] as const, + }, + + // Append-only ledger of every inventory movement. stockQty is derived from this. + // Never update or delete rows; always insert a new record. + inventoryLedger: { + indexes: [ + "productId", + "variantId", + "referenceType", + "referenceId", + "createdAt", + ["productId", "createdAt"], + ["variantId", "createdAt"], + ] as const, + }, + + // One record per payment attempt, regardless of outcome. + paymentAttempts: { + indexes: [ + "orderId", + "providerId", + "status", + "createdAt", + ["orderId", "status"], + ["providerId", "createdAt"], + ] as const, + }, + + // Deduplicated log of every inbound webhook. Used for idempotency and replay detection. 
+ // Composite unique: event IDs are only guaranteed unique per provider, not globally. + webhookReceipts: { + indexes: [ + "providerId", + "externalEventId", + "orderId", + "status", + "createdAt", + ["providerId", "externalEventId"], + ["orderId", "createdAt"], + ] as const, + uniqueIndexes: [["providerId", "externalEventId"]] as const, + }, + + // Server-side idempotency for mutating routes (e.g. checkout.create). + // Survives restarts; TTL enforced by cron deleting rows older than N hours. + idempotencyKeys: { + indexes: ["route", "createdAt", ["keyHash", "route"]] as const, + uniqueIndexes: [["keyHash", "route"]] as const, + }, +} satisfies PluginStorageConfig; +``` + +### Storage design notes + +- `lineItems` are **embedded** in the order document — immutable snapshots, never queried independently. +- `orderEvents` is a **separate collection** — append-only; supports order timeline queries. +- `inventoryLedger` is **append-only**. The `stockQty` field on `products`/`productVariants` is a materialized cache updated atomically with each ledger insert. Never mutate stock directly — always write a ledger record and derive the new count. +- `webhookReceipts`: unique on **`(providerId, externalEventId)`** — never assume event IDs are globally unique across gateways. +- `idempotencyKeys`: stores a **hash** of the client `Idempotency-Key` + route name + optional `userId` scope, plus a short JSON pointer to the prior successful response (`orderId`, etc.). Prevents duplicate orders when the client retries `checkout.create` after a network timeout. Suggested fields: `keyHash`, `route`, `userId?`, `httpStatus`, `responseRef` (e.g. `{ orderId }`), `createdAt`. +- `paymentAttempts` enables refund reconciliation, retry auditing, and support escalation without relying solely on the payment provider's dashboard. + +--- + +## 8. 
KV Key Namespace + +```typescript +export const KV_KEYS = { + // Merchant settings (set via admin, read at request time) + settings: { + currency: "settings:currency:default", // "USD" + currencySymbol: "settings:currency:symbol", // "$" + taxEnabled: "settings:tax:enabled", // boolean + taxDisplayMode: "settings:tax:displayMode", // "inclusive" | "exclusive" + shippingOriginAddress: "settings:shipping:origin", // Address JSON + orderNumberPrefix: "settings:order:prefix", // "ORD" + lowStockThreshold: "settings:inventory:lowStock", // number + storeEmail: "settings:store:email", + storeName: "settings:store:name", + }, + + // Operational state (managed by the plugin, not merchant) + state: { + cartExpiryMinutes: "state:cart:expiryMinutes", // default: 4320 (72h) + checkoutWindowMinutes: "state:checkout:windowMinutes", // default: 30 + orderNumberCounter: "state:order:numberCounter", // monotonic counter + }, + + // Optional hot-path cache only — authoritative dedupe remains `webhookReceipts` in storage. + webhookDedupe: (eventId: string) => `state:webhook:dedupe:${eventId}`, + + // Rate limits (fixed-window counters; values are JSON { count, windowStart }) + rateLimit: { + checkoutPerIp: (ipHash: string) => `state:ratelimit:checkout:ip:${ipHash}`, + cartMutatePerToken: (tokenHash: string) => `state:ratelimit:cart:token:${tokenHash}`, + webhookPerProvider: (providerId: string) => `state:ratelimit:webhook:prov:${providerId}`, + }, + + // Provider cache (invalidated when providers/register is called) + activeProviderCache: "state:providers:cache", + + // Circuit breaker: after N failures in window, short-circuit outbound provider calls + providerCircuit: (providerId: string) => `state:circuit:provider:${providerId}`, +} as const; +``` + +--- + +## 9. Route Contract Catalog + +All routes live at `/_emdash/api/plugins/emdash-commerce/`. 
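The public cart and checkout routes below are the ones guarded by the fixed-window counters stored under `KV_KEYS.rateLimit` (JSON `{ count, windowStart }`). A minimal sketch of advancing such a counter; the window length and limits here are illustrative, not shipped defaults:

```typescript
// Fixed-window rate limiting over the KV value shape { count, windowStart }.
// WINDOW_MS is an assumed example value, not a documented setting.
interface RateWindow {
  count: number;
  windowStart: number; // epoch ms
}

const WINDOW_MS = 60_000;

export function hitRateLimit(
  prev: RateWindow | null,
  nowMs: number,
  limit: number,
): { allowed: boolean; next: RateWindow } {
  // No prior window, or the current window has elapsed: start fresh.
  if (prev === null || nowMs - prev.windowStart >= WINDOW_MS) {
    return { allowed: true, next: { count: 1, windowStart: nowMs } };
  }
  const count = prev.count + 1;
  return { allowed: count <= limit, next: { ...prev, count } };
}
```

A route handler would read the key, call `hitRateLimit`, persist `next` back to KV, and reject with a structured rate-limit error when `allowed` is false.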
+ +### Public routes (no auth required) + +| Route | Input | Output | +| ---------------------- | -------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------ | +| `products/list` | `{ cursor?, limit?, status?, type?, categoryId?, tag? }` | `{ items: Product[], cursor?, hasMore }` | +| `products/get` | `{ id } \| { slug }` | `Product` | +| `products/variants` | `{ productId }` | `{ variants: ProductVariant[], attributes: ProductAttribute[] }` | +| `cart/get` | `{ cartToken }` | `CartWithTotals` | +| `cart/create` | `{ currency?, cartToken? }` | `Cart` | +| `cart/add-item` | `{ cartToken, productId, variantId?, qty, meta? }` | `CartWithTotals` | +| `cart/update-item` | `{ cartToken, itemId, qty }` | `CartWithTotals` | +| `cart/remove-item` | `{ cartToken, itemId }` | `CartWithTotals` | +| `cart/apply-discount` | `{ cartToken, code }` | `CartWithTotals` | +| `cart/remove-discount` | `{ cartToken }` | `CartWithTotals` | +| `cart/shipping-rates` | `{ cartToken, destination: Address }` | `{ rates: ShippingRate[] }` — **only when shipping module enabled** | +| `cart/select-shipping` | `{ cartToken, rateId }` | `CartWithTotals` — **only when shipping module enabled** | +| `checkout/create` | `{ cartToken, customer, shippingRateId? 
}` | `{ orderId, orderNumber, paymentSession }` — `shippingRateId` **required** only if cart contains shippable items and the shipping module is active; otherwise omit |
+| `checkout/get-order` | `{ orderNumber }` | `Order` |
+| `checkout/webhook` | raw + provider signature headers | void |
+
+### Admin routes (authenticated)
+
+| Route | Input | Output |
+| --------------------------- | ---------------------------------------- | -------------------------------------- |
+| `products/create` | `ProductCreateInput` | `Product` |
+| `products/update` | `{ id } & Partial<ProductCreateInput>` | `Product` |
+| `products/archive` | `{ id }` | `Product` |
+| `products/delete` | `{ id }` | void |
+| `products/inventory-adjust` | `{ id, variantId?, delta, reason }` | `{ newStockQty }` |
+| `variants/create` | `VariantCreateInput` | `ProductVariant` |
+| `variants/update` | `{ id } & Partial<VariantCreateInput>` | `ProductVariant` |
+| `variants/delete` | `{ id }` | void |
+| `attributes/list` | `{ cursor?, limit? }` | `{ items: ProductAttribute[] }` |
+| `attributes/create` | `AttributeCreateInput` | `ProductAttribute` |
+| `attributes/update` | `{ id } & Partial<AttributeCreateInput>` | `ProductAttribute` |
+| `orders/list` | `{ status?, cursor?, limit? }` | `{ items: Order[], cursor?, hasMore }` |
+| `orders/get` | `{ id } \| { orderNumber }` | `Order` |
+| `orders/update-status` | `{ id, status, note? }` | `Order` |
+| `orders/add-note` | `{ id, note, visibility }` | `OrderEvent` |
+| `orders/refund` | `{ id, amount, reason, lineItems? }` | `Order` |
+| `providers/register` | `ProviderRegistration` | void |
+| `providers/unregister` | `{ providerId }` | void |
+| `providers/list` | `{ providerType? }` | `ProviderRegistration[]` |
+| `settings/get` | void | `CommerceSettings` |
+| `settings/update` | `Partial<CommerceSettings>` | `CommerceSettings` |
+| `analytics/summary` | `{ from, to, currency? }` | `AnalyticsSummary` |
+| `analytics/top-products` | `{ from, to, limit? }` | `TopProductsReport` |
+| `analytics/low-stock` | `{ threshold? 
}` | `LowStockItem[]` | +| `ai/draft-product` | `{ description: string }` | `ProductCreateInput` | + +--- + +## 10. Event Catalog + +These are the lifecycle events our plugin records in `orderEvents` and will emit +when EmDash supports custom plugin-to-plugin hook namespaces. Extension plugins +can observe these by polling `orders/events` or by registering a webhook. + +``` +commerce:product:created +commerce:product:updated +commerce:product:archived +commerce:inventory:low-stock { productId, variantId?, currentQty, threshold } +commerce:inventory:out-of-stock { productId, variantId? } +commerce:cart:created { cartToken, userId? } +commerce:cart:item:added { cartToken, productId, variantId?, qty } +commerce:cart:item:updated { cartToken, itemId, previousQty, newQty } +commerce:cart:item:removed { cartToken, itemId } +commerce:cart:abandoned { cartToken, userId?, itemCount, cartValue } +commerce:cart:expired { cartToken } +commerce:checkout:started { orderId, orderNumber, cartToken } +commerce:payment:initiated { orderId, providerId, sessionId } +commerce:payment:authorized { orderId, providerId, paymentRef } +commerce:payment:captured { orderId, providerId, paymentRef, amount } +commerce:payment:failed { orderId, providerId, reason } +commerce:order:created { orderId, orderNumber, total, currency } +commerce:order:status:changed { orderId, from, to, actor } +commerce:order:fulfilled { orderId, fulfillmentRef? } +commerce:order:refunded { orderId, amount, reason } +commerce:order:canceled { orderId, reason } +``` + +Extension plugins (loyalty, email automation, analytics, fulfillment) hook into +these events. The same events power the AI agent's observability stream. + +--- + +## 11. AI and Agent Integration Strategy + +**Implemented contracts in-tree:** `packages/plugins/commerce/AI-EXTENSIBILITY.md` +summarizes vector/catalog boundaries, the stub `recommendations` route, error-code +discipline for LLMs, and MCP packaging expectations. 
`HANDOVER.md` links this
+work to the current execution stage.
+
+This is the primary competitive differentiator against WooCommerce and all
+legacy commerce platforms. AI is not bolted on — it is an **assumed actor** in
+the system design.
+
+### Design principles for AI-first commerce
+
+1. **Every route a human can call, an agent can call.** All admin routes use
+   structured JSON input/output — no form posts, no multi-step wizards.
+
+2. **Structured event log as truth.** `orderEvents` is the canonical audit trail.
+   Agents can replay or query it. Every significant state change produces a
+   structured event with `actor: "system" | "merchant" | "agent" | "customer"`.
+
+3. **`shortDescription` on every product.** Plain text field alongside the
+   Portable Text body. Embeddings, semantic search, and LLM reasoning work on
+   this. The full PT body is for human reading.
+
+4. **`meta` on every entity.** Extension data goes in `meta`. AI agents attach
+   structured reasoning artifacts (e.g., `{ demand_forecast: ..., restock_at: ... }`)
+   to products without touching core fields.
+
+5. **Consistent error semantics.** Every route error includes `code`
+   (machine-readable), `message` (human-readable), and `details` (structured
+   context). LLMs can branch on `code` without parsing `message`.
+
+6. **`ai/draft-product` route.** Accepts natural language: "A red leather
+   wallet, $49, limited to 50 units." Returns a structured `ProductCreateInput`
+   draft for merchant review and confirmation. Implemented via `ctx.http.fetch`
+   to an LLM API — provider configurable in settings.
+
+### MCP server package: `@emdash-cms/plugin-commerce-mcp`
+
+A standard plugin that registers as an MCP server exposing commerce operations
+as tools. The merchant installs it alongside the commerce plugin. 
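The consistent-error-semantics principle implies an envelope an agent can branch on without parsing prose. A minimal sketch; `insufficient_stock` appears elsewhere in this plan, while `rate_limited` and the routing choices are hypothetical examples, not the full error catalog:

```typescript
// Error envelope: machine-readable code, human-readable message,
// structured context. Codes shown here are illustrative examples.
interface CommerceError {
  code: string;                      // stable, machine-readable
  message: string;                   // human-readable, free to change
  details?: Record<string, unknown>; // structured context
}

// An agent branches on `code` alone and never parses `message`.
export function nextAgentAction(
  err: CommerceError,
): "retry" | "resolve_stock" | "escalate" {
  switch (err.code) {
    case "rate_limited":
      return "retry";
    case "insufficient_stock":
      return "resolve_stock";
    default:
      return "escalate";
  }
}
```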
+
+MCP tools exposed:
+
+```
+Product tools:
+  list_products → paginated product list
+  get_product → single product with variants
+  create_product → full product creation
+  update_product → partial update
+  archive_product → soft delete
+  draft_product_from_ai → NL description → draft ProductInput
+  adjust_inventory → delta adjustment with reason
+  get_low_stock → items below threshold
+
+Order tools:
+  list_orders → paginated with filters
+  get_order → full order with line items and events
+  update_order_status → explicit status transition
+  add_order_note → merchant/agent notes
+  process_refund → full or partial
+  cancel_order
+
+Analytics tools:
+  revenue_summary → total, AOV, unit count for period
+  top_products → by revenue or units
+  abandoned_cart_summary → count, value, recovery rate
+
+Store tools:
+  get_settings
+  update_settings
+  list_providers → active payment/shipping/tax providers
+```
+
+AI agents can use these tools to:
+
+- **Bulk import** product catalogs from CSV descriptions.
+- **Fulfillment automation**: mark orders fulfilled when tracking number arrives.
+- **Customer service**: look up order status and issue refunds.
+- **Inventory management**: restock alerts and purchase order drafts.
+- **Merchandising**: draft new product listings from brief descriptions.
+- **Reporting**: pull revenue snapshots on schedule.
+
+---
+
+## 12. Frontend Strategy
+
+The commerce plugin ships Astro components as the canonical frontend layer.
+Sites use these components directly, customize them via props, or replace them
+with custom implementations backed by our API routes.
+
+### Astro components (in `src/astro/`)
+
+The shipped set covers product display and add-to-cart pieces plus cart UI: a
+floating cart icon with item count, a slide-in cart panel, and a full cart
+page.
+
+These are intentionally simple, composable, and styled with CSS variables so
+sites can theme them without any overrides system. 
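For sites that swap the shipped components for a custom frontend backed by the public routes, a hypothetical sketch of narrowing a `products/list` response to its documented `{ items, cursor?, hasMore }` shape; the helper name and the item-field subset are assumptions:

```typescript
// Hypothetical client-side helper for custom storefronts.
// Route path and response shape come from the route contract catalog;
// the per-item fields shown are a subset of ProductBase.
interface ProductListResponse {
  items: Array<{ name: string; slug: string; basePrice: number; currency: string }>;
  cursor?: string;
  hasMore: boolean;
}

export const PRODUCTS_LIST_ROUTE =
  "/_emdash/api/plugins/emdash-commerce/products/list";

// Narrow an untyped route response into the documented shape,
// rejecting anything that does not match the contract.
export function parseProductList(raw: unknown): ProductListResponse {
  const r = raw as { items?: unknown; hasMore?: unknown };
  if (!Array.isArray(r?.items) || typeof r?.hasMore !== "boolean") {
    throw new Error("malformed products/list response");
  }
  return raw as ProductListResponse;
}
```

A custom page would POST its filter input to `PRODUCTS_LIST_ROUTE`, run the JSON body through `parseProductList`, and render from the typed result.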
+ +### Portable Text block types + +``` +product-embed ← embed a product card inline in content +product-grid ← curated product grid in content +buy-button ← standalone "Add to cart" button +``` + +--- + +## 13. Phased Implementation Plan + +The original phase plan was too broad too early. The revised plan below: + +- Freezes dangerous semantics before coding starts (Phase 0) +- Proves one complete real flow before expanding (Phases 1–3) +- Validates the provider abstraction with a second gateway before growing the ecosystem (Phase 4) +- Expands UI, AI tooling, and extensions only after correctness is proven (Phases 5–7) + +### Phase 0 — Semantic hardening + contracts (Step 1 spec, see Section 14) + +Package scaffold. TypeScript types. Storage schema. KV namespace. Route contracts +(Zod schemas). Provider interface contracts. State machine constants. Error +catalog constants. **No business logic yet.** + +**Exit criteria:** + +- `packages/plugins/commerce` builds with TypeScript; exports all types and schemas. +- State machine transition tables are in code as constants (not just docs). +- Error catalog is in code as a typed `const` object. +- Inventory ledger, payment attempt, and webhook receipt types are defined. +- No runtime logic exists yet; this milestone is purely contracts. + +### Phase 1 — Commerce kernel (Layer A, no UI) + +Pure domain logic with no admin, no Astro, no React, no MCP. Enforced by +directory structure (`src/kernel/`). All business functions are pure or take +explicit I/O dependencies via injection — no direct `ctx.*` calls inside kernel. + +Scope: + +- Simple product domain rules and validation. +- Cart service: create, add item, update qty, remove, totals, expiry. +- Inventory service: `adjustStock(delta, reason, referenceType, referenceId)` — writes ledger + updates qty atomically. +- Order snapshot creation from cart. +- `finalizePayment(orderId, paymentRef)` — the single authoritative finalization path: + 1. 
Check idempotency (`webhookReceipts.externalEventId`).
+  2. Verify order is in `payment_pending` or `authorized`.
+  3. Read variant `inventoryVersion` at time of cart snapshot vs current — if changed and stock now insufficient, transition order to `payment_conflict` and return `insufficient_stock`.
+  4. Decrement stock, insert ledger row.
+  5. Transition order to `paid`, payment to `captured`.
+  6. Emit side effects (email, events) **after** the above succeeds.
+- Error types using the catalog (Section 16).
+- Domain event records for `orderEvents`.
+
+**Exit criteria:**
+
+- All kernel functions are pure / injected; zero `ctx.*` imports inside `src/kernel/`.
+- `finalizePayment` is idempotent (calling twice with the same `externalEventId` is a no-op).
+- Tests cover: duplicate finalize, stock-change conflict, stale cart, state transition guards.
+
+### Phase 2 — One real vertical slice (Stripe + EmDash plugin wrapper)
+
+One complete purchase flow, end-to-end:
+
+- View a simple product (public `products/get` route).
+- Add to cart, view cart, update/remove items (cart routes).
+- Checkout start: create `draft` order, initiate Stripe Payment Intent.
+- Stripe webhook: verify signature → idempotency check → call `finalizePayment`.
+- Order visible in admin (Block Kit order list page).
+- Order timeline (`orderEvents`) visible in admin for debugging.
+- Order confirmation email.
+
+EmDash plugin wrapper (`src/plugin/`): descriptor, `definePlugin`, routes wiring
+into kernel, `ctx.storage` as the I/O layer, `ctx.kv`, `ctx.email`, `ctx.http`.
+
+Storefront: one minimal Astro page per step (product, cart, checkout, confirmation).
+No storefront component library yet — that is Phase 6. Goal: prove the flow,
+not ship a UI framework.
+
+**Exit criteria:**
+
+- A test customer can buy a real simple product in Stripe test mode, end to end.
+- Order finalizes correctly. Inventory decrements. Email sends.
+- Duplicate Stripe webhook does not double-decrement stock. 
+- Inventory conflict path returns structured `payment_conflict` order + initiates auto-void. + +### Phase 3 — Hardening before features + +No new features. Pressure-test Phase 2 against expected failure cases: + +Required tests added in this phase: + +- Duplicate webhook (same `externalEventId`). +- Retry after webhook timeout (second delivery after first partially processed). +- Inventory changed between cart creation and finalize. +- Cart expired before checkout.create. +- Payment success + inventory failure → `payment_conflict` → auto-void triggered. +- Order finalization idempotency (repeated callback replay). +- Cancellation and refund state transition guards (invalid transitions rejected). +- Stale cart reuse after TTL. + +If the architecture bends under these tests, fix it before Phase 4. + +**Exit criteria:** All failure cases above have passing tests. No architectural +regressions from fixing them. + +### Phase 4 — Authorize.net (validate provider abstraction) + +Add `@emdash-cms/plugin-commerce-authorize-net` as a second in-process provider +adapter. The goal is not feature breadth — it is to prove that the +`PaymentProviderContract` is truly gateway-agnostic. + +Authorize.net introduces explicit auth/capture separation, which is why `authorized` +is a required order state (and was not removed from the state machine despite the +reviewer's suggestion). + +**Exit criteria:** + +- Test-mode checkout completes with Authorize.net. +- Auth-only flow (authorize → captured later) works through the existing state machine. +- No branching in kernel code for Stripe vs Authorize.net — all differences are in adapters. +- Refund route works for both gateways. + +### Phase 5 — Admin UX expansion + +Replace Block Kit admin with React (native plugin `adminEntry`): + +- Rich product editor (variant builder, image upload, pricing). +- Order management table with status transitions, notes, refund flow. +- Merchant settings page (provider selection, store config). 
+- KPI dashboard widget (revenue, open orders, low stock). +- Logged-in user purchase history page. + +**Exit criteria:** Merchant can perform all common operations (product CRUD, +order management, refund) without touching the API directly. + +### Phase 6 — Storefront and extensions + +After correctness is proven and admin is stable: + +- Full Astro component library (``, ``, ``, etc.). +- Portable Text blocks for product embeds. +- Variable product support (variant selector). +- Shipping/tax module (separate plugin family; see §15 decisions). +- Abandoned cart cron + email recovery. +- Digital product downloads. + +### Phase 7 — AI/MCP surfaces + +`@emdash-cms/plugin-commerce-mcp` standard plugin. `ai/draft-product` route. +All MCP tools from Section 11. Merchant can use an AI agent for product import, +order management, inventory management, and reporting. + +**Do not do this before Phase 3 hardening is complete.** AI agent reliability +depends on consistent structured errors and idempotent operations — those must +be proven before surfaces are exposed. + +--- + +## 14. Step 1 — Full Specification (Ready to Code) + +This is the only step detailed to code-ready level. All subsequent steps are +specified once Step 1 is complete and reviewed. 
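The Phase 7 caveat above can be made concrete. An agent-side retry policy only works if every error carries a stable code and a `retryable` flag, as the Section 16 catalog requires; the `shouldRetry` and `backoffMs` helpers below are illustrative assumptions, not part of the spec:

```typescript
// Minimal mirror of the error fields an agent needs; Section 16 defines the
// full CommerceError shape. The code strings used here are illustrative.
interface AgentVisibleError {
  code: string;
  retryable: boolean;
}

// Retry only when the error says retrying is safe, and bound the attempts.
// Non-retryable errors (state conflicts, validation) need a decision, not a retry.
function shouldRetry(err: AgentVisibleError, attempt: number, maxAttempts = 3): boolean {
  return err.retryable && attempt < maxAttempts;
}

// Simple exponential backoff schedule: 1s, 2s, 4s, ...
function backoffMs(attempt: number): number {
  return 1000 * 2 ** attempt;
}
```

Without the `retryable` flag, an agent would have to guess from HTTP status alone, which is exactly the ambiguity the error catalog is meant to remove.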
+ +### Package structure + +``` +packages/plugins/commerce/ +├── src/ +│ ├── index.ts # Descriptor factory (Vite / build time) +│ ├── types/ +│ │ ├── product.ts # Product discriminated union types +│ │ ├── variant.ts # Variant and attribute types +│ │ ├── cart.ts # Cart and CartItem types +│ │ ├── order.ts # Order, OrderLineItem, OrderEvent types +│ │ ├── customer.ts # CustomerSnapshot, Address +│ │ ├── provider.ts # Provider registration + contract interfaces +│ │ └── index.ts # Re-export all types +│ ├── storage/ +│ │ └── schema.ts # COMMERCE_STORAGE_CONFIG + CommerceStorage type +│ ├── kv/ +│ │ └── keys.ts # KV_KEYS typed constants +│ └── routes/ +│ └── contracts.ts # Zod schemas for all route inputs +├── package.json +└── tsconfig.json +``` + +### package.json + +```json +{ + "name": "@emdash-cms/plugin-commerce", + "version": "0.1.0", + "type": "module", + "exports": { + ".": "./src/index.ts", + "./sandbox": "./src/sandbox-entry.ts", + "./admin": "./src/admin.tsx", + "./astro": "./src/astro/index.ts" + }, + "peerDependencies": { + "emdash": "^0.1.0", + "astro": "^5.0.0" + }, + "devDependencies": { + "typescript": "^5.0.0", + "zod": "^3.22.0" + } +} +``` + +### `src/index.ts` — descriptor factory (skeleton) + +At Step 1, this file only defines the descriptor. No routes, no hooks yet. 
+ +```typescript +import type { PluginDescriptor } from "emdash"; +import { COMMERCE_STORAGE_CONFIG } from "./storage/schema.js"; + +export interface CommercePluginOptions { + currency?: string; + taxIncluded?: boolean; +} + +export function commercePlugin( + options: CommercePluginOptions = {}, +): PluginDescriptor { + return { + id: "emdash-commerce", + version: "0.1.0", + entrypoint: "@emdash-cms/plugin-commerce/sandbox", + adminEntry: "@emdash-cms/plugin-commerce/admin", + componentsEntry: "@emdash-cms/plugin-commerce/astro", + options, + capabilities: [ + "network:fetch", // payment gateway, shipping, tax, fulfillment APIs + "email:send", // order confirmations, abandoned cart, notifications + "read:users", // link orders to authenticated users + "read:media", // read product media + "write:media", // upload product media + ], + allowedHosts: [ + // Narrowed at runtime via settings. Stub wildcard for dev. + // Phase 5 narrows to specific gateway hosts. + "*", + ], + storage: COMMERCE_STORAGE_CONFIG, + adminPages: [ + { path: "/products", label: "Products", icon: "tag" }, + { path: "/orders", label: "Orders", icon: "shopping-cart" }, + { path: "/settings", label: "Commerce Settings", icon: "settings" }, + ], + adminWidgets: [{ id: "commerce-kpi", title: "Store Overview", size: "full" }], + }; +} +``` + +### `src/storage/schema.ts` + +See Section 7 above — implement verbatim. + +### `src/kv/keys.ts` + +See Section 8 above — implement verbatim. + +### `src/types/product.ts` + +See Section 5 above — implement verbatim. + +### `src/types/cart.ts` + +See Section 6 (Cart) above — implement verbatim. + +### `src/types/order.ts` + +See Section 6 (Order) above — implement verbatim. 
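To show how a site would consume the factory above, here is a self-contained sketch; `PluginDescriptorLike` is a pared-down stand-in for the real `PluginDescriptor` type from `emdash`, and host-specific fields are omitted, so nothing here is the shipped contract:

```typescript
// Pared-down, illustrative stand-in for the PluginDescriptor type that the
// real factory imports from "emdash"; only the fields exercised here.
interface CommercePluginOptions {
  currency?: string;
  taxIncluded?: boolean;
}

interface PluginDescriptorLike {
  id: string;
  version: string;
  capabilities: string[];
  options: CommercePluginOptions;
}

// Same shape as the factory skeleton above, minus fields that need host types.
function commercePlugin(options: CommercePluginOptions = {}): PluginDescriptorLike {
  return {
    id: "emdash-commerce",
    version: "0.1.0",
    capabilities: ["network:fetch", "email:send", "read:users", "read:media", "write:media"],
    options,
  };
}

// A site config would call the factory once and hand the result to the host:
const descriptor = commercePlugin({ currency: "USD", taxIncluded: false });
```

The factory pattern keeps the descriptor a plain data object: options are threaded through untouched, and the host, not the plugin, decides what to do with them.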
+
+### `src/types/provider.ts`
+
+```typescript
+export type ProviderType = "payment" | "shipping" | "tax" | "fulfillment";
+
+export interface ProviderRegistration {
+  providerId: string; // e.g., "stripe-v1"
+  providerType: ProviderType;
+  displayName: string; // e.g., "Stripe"
+  pluginId: string; // e.g., "emdash-commerce-stripe"
+  routeBase: string; // e.g., "/_emdash/api/plugins/emdash-commerce-stripe"
+  active: boolean;
+  config: Record<string, unknown>; // Provider-specific (non-secret) config
+  registeredAt: string;
+}
+
+// Payment provider contract
+export interface PaymentInitiateRequest {
+  orderId: string;
+  orderNumber: string;
+  total: number; // Cents
+  currency: string;
+  customer: import("./customer.js").CustomerSnapshot;
+  lineItems: import("./order.js").OrderLineItem[];
+  successUrl: string;
+  cancelUrl: string;
+  meta?: Record<string, unknown>;
+}
+
+export interface PaymentInitiateResponse {
+  sessionId: string;
+  redirectUrl?: string; // For redirect-based flows (PayPal, etc.)
+  clientSecret?: string; // For embedded flows (Stripe Elements)
+  expiresAt: string;
+}
+
+export interface PaymentConfirmRequest {
+  sessionId: string;
+  orderId: string;
+  rawWebhookPayload: unknown;
+  rawWebhookHeaders: Record<string, string>;
+}
+
+export interface PaymentConfirmResponse {
+  success: boolean;
+  paymentRef: string;
+  amountCaptured: number;
+  currency: string;
+  failureReason?: string;
+}
+
+export interface PaymentRefundRequest {
+  orderId: string;
+  paymentRef: string;
+  amount: number;
+  reason: string;
+}
+
+export interface PaymentRefundResponse {
+  success: boolean;
+  refundRef: string;
+  amountRefunded: number;
+}
+
+// Shipping provider contract
+export interface ShippingRateRequest {
+  items: Array<{
+    productId: string;
+    variantId?: string;
+    qty: number;
+    weight?: number; // grams
+  }>;
+  origin: import("./customer.js").Address;
+  destination: import("./customer.js").Address;
+  currency: string;
+}
+
+export interface ShippingRate {
+  rateId: string;
+  carrier: string;
+  service: 
string;
+  displayName: string;
+  price: number;
+  estimatedDays?: number;
+  meta?: Record<string, unknown>;
+}
+
+// Tax provider contract
+export interface TaxCalculationRequest {
+  items: Array<{
+    productId: string;
+    variantId?: string;
+    qty: number;
+    unitPrice: number;
+    taxClass?: string;
+  }>;
+  billingAddress: import("./customer.js").Address;
+  shippingAddress: import("./customer.js").Address;
+  currency: string;
+}
+
+export interface TaxCalculationResponse {
+  totalTax: number;
+  breakdown: Array<{
+    label: string;
+    rate: number;
+    amount: number;
+  }>;
+}
+```
+
+### `src/routes/contracts.ts`
+
+Define Zod schemas for the public and admin route inputs catalogued in Section 9.
+These are used in Phase 1 and beyond. At Step 1, define them as commented
+stubs so the shapes are locked, even without handler implementations.
+
+Pattern: one Zod schema per route, named `<routeName>Schema`. One inferred type
+export per schema, named `<RouteName>Input`.
+
+```typescript
+import { z } from "astro/zod";
+import type { infer as ZInfer } from "astro/zod";
+
+// ─── Shared ────────────────────────────────────────────────────
+
+export const addressSchema = z.object({
+  line1: z.string().min(1),
+  line2: z.string().optional(),
+  city: z.string().min(1),
+  state: z.string().min(1),
+  postalCode: z.string().min(1),
+  country: z.string().length(2), // ISO 3166-1 alpha-2
+});
+
+export const paginationSchema = z.object({
+  cursor: z.string().optional(),
+  limit: z.number().int().min(1).max(100).default(50),
+});
+
+// ─── Products ──────────────────────────────────────────────────
+
+export const productListSchema = paginationSchema.extend({
+  status: z.enum(["draft", "active", "archived"]).optional(),
+  type: z.enum(["simple", "variable", "bundle", "digital", "gift_card"]).optional(),
+  categoryId: z.string().optional(),
+  tag: z.string().optional(),
+});
+
+export const productGetSchema = z.union([
+  z.object({ id: z.string().min(1) }),
+  z.object({ slug: z.string().min(1) }),
+]);
+
+export const 
productCreateSchema = z.object({ + type: z.enum(["simple", "variable", "bundle", "digital", "gift_card"]), + name: z.string().min(1).max(500), + slug: z + .string() + .min(1) + .max(200) + .regex(/^[a-z0-9-]+$/), + status: z.enum(["draft", "active", "archived"]).default("draft"), + descriptionBlocks: z.array(z.unknown()).optional(), + shortDescription: z.string().max(500).optional(), + basePrice: z.number().int().min(0), + compareAtPrice: z.number().int().min(0).optional(), + currency: z.string().length(3).default("USD"), + mediaIds: z.array(z.string()).default([]), + categoryIds: z.array(z.string()).default([]), + tags: z.array(z.string()).default([]), + seoTitle: z.string().max(200).optional(), + seoDescription: z.string().max(500).optional(), + typeData: z.record(z.unknown()), +}); + +export const inventoryAdjustSchema = z.object({ + id: z.string().min(1), + variantId: z.string().optional(), + delta: z.number().int(), // positive = restock, negative = correction + reason: z.string().min(1), +}); + +// ─── Cart ──────────────────────────────────────────────────────── + +export const cartCreateSchema = z.object({ + currency: z.string().length(3).optional(), + cartToken: z.string().optional(), // Resume existing cart +}); + +export const cartGetSchema = z.object({ + cartToken: z.string().min(1), +}); + +export const cartAddItemSchema = z.object({ + cartToken: z.string().min(1), + productId: z.string().min(1), + variantId: z.string().optional(), + qty: z.number().int().min(1).max(999), + meta: z.record(z.unknown()).optional(), +}); + +export const cartUpdateItemSchema = z.object({ + cartToken: z.string().min(1), + itemId: z.string().min(1), + qty: z.number().int().min(0).max(999), // 0 = remove +}); + +export const cartRemoveItemSchema = z.object({ + cartToken: z.string().min(1), + itemId: z.string().min(1), +}); + +export const cartApplyDiscountSchema = z.object({ + cartToken: z.string().min(1), + code: z.string().min(1).max(100), +}); + +export const 
cartShippingRatesSchema = z.object({ + cartToken: z.string().min(1), + destination: addressSchema, +}); + +export const cartSelectShippingSchema = z.object({ + cartToken: z.string().min(1), + rateId: z.string().min(1), +}); + +// ─── Checkout ──────────────────────────────────────────────────── + +const customerSnapshotSchema = z.object({ + email: z.string().email(), + firstName: z.string().min(1), + lastName: z.string().min(1), + phone: z.string().optional(), + billingAddress: addressSchema, + shippingAddress: addressSchema, +}); + +export const checkoutCreateSchema = z.object({ + cartToken: z.string().min(1), + customer: customerSnapshotSchema, + /** Required when shipping module is active and cart has shippable items */ + shippingRateId: z.string().min(1).optional(), + successUrl: z.string().url(), + cancelUrl: z.string().url(), + meta: z.record(z.unknown()).optional(), +}); + +// ─── Orders ────────────────────────────────────────────────────── + +export const orderListSchema = paginationSchema.extend({ + status: z + .enum([ + "pending", + "payment_pending", + "authorized", + "paid", + "processing", + "fulfilled", + "canceled", + "refunded", + "partial_refund", + ]) + .optional(), + userId: z.string().optional(), + from: z.string().datetime().optional(), + to: z.string().datetime().optional(), +}); + +export const orderUpdateStatusSchema = z.object({ + id: z.string().min(1), + status: z.enum(["processing", "fulfilled", "canceled", "refunded", "partial_refund"]), + note: z.string().optional(), + actor: z.enum(["merchant", "agent"]).default("merchant"), +}); + +export const orderRefundSchema = z.object({ + id: z.string().min(1), + amount: z.number().int().min(1), + reason: z.string().min(1), + lineItems: z + .array( + z.object({ + lineItemIndex: z.number().int().min(0), + qty: z.number().int().min(1), + }), + ) + .optional(), +}); + +// ─── Providers ─────────────────────────────────────────────────── + +export const providerRegisterSchema = z.object({ + 
providerId: z
+    .string()
+    .min(1)
+    .regex(/^[a-z0-9-]+$/),
+  providerType: z.enum(["payment", "shipping", "tax", "fulfillment"]),
+  displayName: z.string().min(1),
+  pluginId: z.string().min(1),
+  routeBase: z.string().url(),
+  config: z.record(z.unknown()).default({}),
+});
+
+// ─── Type Exports ──────────────────────────────────────────────
+
+export type ProductListInput = ZInfer<typeof productListSchema>;
+export type ProductCreateInput = ZInfer<typeof productCreateSchema>;
+export type InventoryAdjustInput = ZInfer<typeof inventoryAdjustSchema>;
+export type CartCreateInput = ZInfer<typeof cartCreateSchema>;
+export type CartAddItemInput = ZInfer<typeof cartAddItemSchema>;
+export type CartUpdateItemInput = ZInfer<typeof cartUpdateItemSchema>;
+export type CheckoutCreateInput = ZInfer<typeof checkoutCreateSchema>;
+export type OrderListInput = ZInfer<typeof orderListSchema>;
+export type OrderUpdateStatusInput = ZInfer<typeof orderUpdateStatusSchema>;
+export type OrderRefundInput = ZInfer<typeof orderRefundSchema>;
+export type ProviderRegisterInput = ZInfer<typeof providerRegisterSchema>;
+```
+
+### `src/types/index.ts`
+
+```typescript
+export type * from "./product.js";
+export type * from "./variant.js";
+export type * from "./cart.js";
+export type * from "./order.js";
+export type * from "./customer.js";
+export type * from "./provider.js";
+```
+
+### Step 1 exit criteria
+
+1. `packages/plugins/commerce` exists and builds without TypeScript errors.
+2. All types from Sections 5 and 6 are exported.
+3. All Zod schemas from the route contract catalog are defined and typed.
+4. The storage schema `satisfies PluginStorageConfig` without errors.
+5. The descriptor factory `commercePlugin()` returns a valid `PluginDescriptor`.
+6. No business logic exists yet — this milestone is purely contracts.
+
+---
+
+## 15. Product decisions (locked) + small defaults
+
+**Where this section lives:** Section 14 (“Step 1 — Full Specification”) is very
+long; if you only scrolled partway through Step 1, keep scrolling past the end of
+Step 1 to reach Section 15.
+
+### Locked decisions (your answers)
+
+1. 
**Payment providers (v1)**
+   Support **Stripe** and **Authorize.net** from the first shipping release of
+   payments — not a single-provider MVP. The provider registry and
+   `PaymentProviderContract` must be validated against **two** real gateways early
+   (Phase 4 adds Authorize.net, so the abstraction is never left Stripe-only for
+   long).
+
+2. **Inventory: payment-first, reserve-at-finalize**
+   Do **not** hold stock when the customer adds to cart or when checkout starts.
+   **Re-validate availability and decrement inventory only after successful
+   payment** (or at the same atomic transition that marks the order paid —
+   whichever the storage model allows without double-sell).
+   **UX implication:** Between “add to cart” and “payment succeeded”, counts can
+   change. The API must return **clear, machine-readable error codes** (e.g.
+   `inventory_changed`, `insufficient_stock`) and copy-ready **human messages** so
+   the storefront can explain: _“While you were checking out, availability for one
+   or more items changed.”_
+
+3. **Tax and shipping as a separate module**
+   Without the **fulfillment / shipping & tax** module installed and active:
+   - No **shipping address** capture and no **shipping quote** flows in core UI or
+     public API (those routes either are absent or return a consistent
+     `feature_not_enabled` / 404 — pick one policy and document it).
+   - Core checkout may assume **no shippable line items** or a merchant-configured
+     “digital / no shipping” mode; physical goods that need a quote **require** the
+     module.
+   **Multi-currency and localized tax rules** are **in scope for that same module
+   family** (not in commerce core v1), so currency display, conversion, and
+   region-specific tax live there or behind additional providers — not duplicated
+   in core.
+
+4. **Authenticated purchase history + cart across sessions and devices**
+   Logged-in users must have:
+   - **Purchase history** (orders linked to `userId`). 
+
+  - **Cart continuity** when they log out and back in, or open another client:
+    server-side cart bound to `userId` (with optional merge from anonymous
+    `cartToken` on login).
+    Anonymous browsing may still use `cartToken`; **login associates or merges**
+    into the durable user cart.
+
+### Small defaults (still open to tweak, low risk)
+
+- **Order number format:** `ORD-YYYY-NNNNN` (human-readable; separate from storage
+  document id) unless you prefer opaque IDs for customer-facing URLs.
+- **Tax display when tax module is off:** N/A — tax lines appear only when a tax
+  provider/module is active.
+
+---
+
+## 16. Error Catalog
+
+Every route error must use this structure:
+
+```typescript
+interface CommerceError {
+  code: CommerceErrorCode; // Machine-stable; safe for AI branching
+  message: string; // Human-readable; safe to display
+  httpStatus: number;
+  retryable: boolean; // Whether the client may safely retry
+  details?: Record<string, unknown>; // Structured context (e.g. which itemId, which field)
+}
+```
+
+### Canonical error codes
+
+```typescript
+export const COMMERCE_ERRORS = {
+  // Inventory
+  INVENTORY_CHANGED: { httpStatus: 409, retryable: false },
+  INSUFFICIENT_STOCK: { httpStatus: 409, retryable: false },
+
+  // Product / catalog
+  PRODUCT_UNAVAILABLE: { httpStatus: 404, retryable: false },
+  VARIANT_UNAVAILABLE: { httpStatus: 404, retryable: false },
+
+  // Cart
+  CART_NOT_FOUND: { httpStatus: 404, retryable: false },
+  CART_EXPIRED: { httpStatus: 410, retryable: false },
+  CART_EMPTY: { httpStatus: 422, retryable: false },
+
+  // Order
+  ORDER_NOT_FOUND: { httpStatus: 404, retryable: false },
+  ORDER_STATE_CONFLICT: { httpStatus: 409, retryable: false },
+  PAYMENT_CONFLICT: { httpStatus: 409, retryable: false },
+
+  // Payment
+  PAYMENT_INITIATION_FAILED: { httpStatus: 502, retryable: true },
+  PAYMENT_CONFIRMATION_FAILED: { httpStatus: 502, retryable: false },
+  PAYMENT_ALREADY_PROCESSED: { httpStatus: 409, retryable: false },
+  PROVIDER_UNAVAILABLE: { 
httpStatus: 503, retryable: true },
+
+  // Webhooks
+  WEBHOOK_SIGNATURE_INVALID: { httpStatus: 401, retryable: false },
+  WEBHOOK_REPLAY_DETECTED: { httpStatus: 200, retryable: false }, // 200 — tell provider we got it
+
+  // Discounts / coupons
+  INVALID_DISCOUNT: { httpStatus: 422, retryable: false },
+  DISCOUNT_EXPIRED: { httpStatus: 410, retryable: false },
+
+  // Features / config
+  FEATURE_NOT_ENABLED: { httpStatus: 501, retryable: false },
+  CURRENCY_MISMATCH: { httpStatus: 422, retryable: false },
+  SHIPPING_REQUIRED: { httpStatus: 422, retryable: false },
+
+  // Abuse / limits
+  RATE_LIMITED: { httpStatus: 429, retryable: true },
+  PAYLOAD_TOO_LARGE: { httpStatus: 413, retryable: false },
+} as const satisfies Record<string, { httpStatus: number; retryable: boolean }>;
+
+export type CommerceErrorCode = keyof typeof COMMERCE_ERRORS;
+```
+
+Rules:
+
+- `WEBHOOK_REPLAY_DETECTED` returns **200** (not 4xx) so that payment gateways do
+  not retry the delivery — they treat non-2xx as failures and retry aggressively.
+- `PAYMENT_CONFLICT` is used when payment was captured but inventory finalization
+  failed. It is distinct from `INSUFFICIENT_STOCK` because money has moved.
+- Wire-level / API error codes should be **snake_case strings**, stable across
+  versions; never remove a code, only add. The kernel keeps **internal** keys as
+  `UPPER_SNAKE` on `COMMERCE_ERRORS` and exports an explicit map
+  `COMMERCE_ERROR_WIRE_CODES` plus `commerceErrorCodeToWire()` in
+  `packages/plugins/commerce/src/kernel/errors.ts`. Route handlers must emit wire
+  codes in JSON; do not expose internal keys to clients.
+
+---
+
+## 17. Cart Merge Rules
+
+Applies when a user with an anonymous `cartToken` logs in and may have a
+pre-existing server-side cart linked to their `userId`.
+
+### Guest checkout policy
+
+Guest checkout (purchase without creating an account) is **supported**. Orders
+are linked to `userId: null` and the `customer.email` is the only persistent
+identifier. 
Guest orders can be associated with a new/existing account by email +match — see below. + +### Merge algorithm on login + +1. **Identify carts**: Look up the anonymous cart by `cartToken` (source) and any + `active` or `abandoned` cart owned by `userId` (target). + +2. **If no target cart exists**: Claim the anonymous cart by setting `userId` on + it. Status stays `active`. No merge needed. + +3. **If both carts exist and both have items**: + - For each item in the source cart: + - If the same `productId` + `variantId` already exists in target: **add quantities** (source qty + target qty), capped at product `maxQty` or 999. + - If the item does not exist in target: **copy item** into target. + - Validate all merged items against current availability (product `active`, variant + `active`, price not drastically changed). Items that fail validation are removed + and reported back to the caller in the merge response so the frontend can show a + notice. + - Transition source cart to `merged`. + +4. **If source cart is empty**: Discard it (transition to `expired`); use target. + +5. **If target cart is empty**: Claim the source cart (set `userId`; transition + source cart to active under the user). Discard empty target. + +### Invalid merged items + +If a merged line item references an unavailable product or variant, it is silently +removed with an entry in the merge response under `removedItems: [{ productId, reason }]`. +The frontend should display a notice. + +### Past orders ↔ account association + +If a guest places an order and later creates an account with the same email: + +- The `orders/list` route, when called by an authenticated user, also queries + for guest orders matching `customer.email`. These are returned in purchase + history with a flag `guestOrder: true`. +- **We do not automatically rewrite `order.userId`** on the historical record. + Association is read-time only, so there is no risk of corrupting audit trails. + +--- + +## 18. 
Layer Boundaries + +Code must be organized into four layers. **No layer may import from a higher +layer.** Violations should be caught by lint rules (e.g. `eslint-plugin-import` +`no-restricted-paths`). + +``` +Layer A — Commerce Kernel (src/kernel/) + ↑ no dependencies on B, C, D + Pure domain: types, state machines, error catalog, cart service, + inventory service, order service, finalization function, totals. + No ctx.*, no HTTP, no React, no Astro. + +Layer B — EmDash Plugin Wrapper (src/plugin/) + ↑ depends on A only + Plugin descriptor (index.ts), definePlugin (sandbox-entry.ts), + route handlers, ctx.* wiring, storage adapters, hook handlers. + +Layer C — Admin UI (src/admin/) + ↑ depends on B (via route calls or SDK) and A (for types) + React components, Block Kit JSON builders, admin pages, widgets. + No direct ctx.* access. + +Layer D — Storefront UI (src/astro/) + ↑ depends on A (for types), calls Layer B routes via HTTP + Astro components, page templates, checkout flow UI. + No kernel imports except shared types. +``` + +**Practical rule for v1:** A single `packages/plugins/commerce` package is +acceptable. Enforce the layers through **directory structure and enforced import +rules**, not separate npm packages (that can come later when needed). + +--- + +## 19. Observability Requirements + +Observability is not a post-launch concern. The first gateway integration must +be debuggable from day one. + +### Mandatory from Phase 2 + +- **Correlation ID**: Every request that enters the checkout flow generates a + `correlationId` (uuid). It is threaded through every `ctx.log.*` call, every + `orderEvent` record, and every `paymentAttempt` record. It is returned in error + responses under `details.correlationId`. + +- **Order timeline**: Every state transition appends a record to `orderEvents` + with: `eventType`, `fromState`, `toState`, `actor`, `correlationId`, `createdAt`, + and optional `payload` (non-sensitive context only — no card numbers, no secrets). 
+
+- **Provider call log**: Every outbound call to a payment gateway or provider route
+  appends a `paymentAttempt` record with: `providerId`, `action`
+  (initiate/confirm/refund/webhook), `status`, `durationMs`, `correlationId`,
+  `createdAt`. Sensitive fields (raw payload, response body) are **redacted** —
+  store only a hash or omit entirely.
+
+- **Webhook receipt log**: Every inbound webhook appends a `webhookReceipt` record
+  with: `providerId`, `externalEventId`, `orderId`, `status`
+  (processed/duplicate/invalid_signature/error), `createdAt`. Raw body is **not
+  stored** — only the normalized, validated facts.
+
+- **Inventory mutation log**: Every stock change is a row in `inventoryLedger`.
+  `reason` and `referenceType`/`referenceId` are mandatory — never allow
+  `reason: "unknown"`.
+
+- **Actor attribution**: Every `orderEvent` records `actor` as one of:
+  `"customer"` | `"merchant"` | `"system"` | `"agent"`. AI agent operations are
+  always tagged `"agent"` so audit trails distinguish machine from human actions.
+
+- **Structured log levels**: Use `ctx.log.info / warn / error` with a consistent
+  shape: `{ correlationId, orderId?, cartId?, event, ...context }`. Never log
+  secrets, PII beyond email, or raw payment payloads.
+
+---
+
+## 20. Robustness and scalability
+
+This section tightens production behavior without reopening locked product decisions
+(§15). Implement during Phase 2–3 alongside the first gateway.
+
+### 20.1 Bounded payloads and abuse resistance
+
+- **Cart line item cap** (e.g. 50 lines per cart) and **per-line qty cap** (e.g. 999)
+  — reject with `PAYLOAD_TOO_LARGE` (413), which is already defined in the §16
+  error catalog.
+- **Checkout.create body size** — validate JSON depth/size before parsing.
+- **Product list** — always **cursor-based** pagination; default limit capped (e.g. 50);
+  never unbounded `limit` query params. 
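A guard for the caps in §20.1 can run before any cart mutation touches storage; the limits, the `checkCartCaps` name, and the wire code `payload_too_large` below are illustrative defaults rather than locked values:

```typescript
// Illustrative caps from §20.1: 50 line items per cart, qty ≤ 999 per line.
const MAX_CART_LINES = 50;
const MAX_LINE_QTY = 999;

interface CartLine {
  productId: string;
  qty: number;
}

type CapCheck =
  | { ok: true }
  | { ok: false; code: "payload_too_large"; httpStatus: 413; detail: string };

// Reject oversized carts cheaply, before any expensive validation or storage work.
function checkCartCaps(lines: CartLine[]): CapCheck {
  if (lines.length > MAX_CART_LINES) {
    return { ok: false, code: "payload_too_large", httpStatus: 413, detail: `max ${MAX_CART_LINES} lines` };
  }
  for (const line of lines) {
    if (line.qty > MAX_LINE_QTY) {
      return { ok: false, code: "payload_too_large", httpStatus: 413, detail: `max qty ${MAX_LINE_QTY}` };
    }
  }
  return { ok: true };
}
```

Returning a structured result rather than throwing keeps the handler in control of the HTTP response shape, matching the error-catalog pattern in §16.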
+ +### 20.2 Rate limiting (KV fixed window) + +Apply before expensive work: + +| Surface | Key basis | Purpose | +| ----------------- | ------------------------------------ | ---------------------------------------------------------- | +| `checkout.create` | Hashed client IP + optional `userId` | Slow brute-force / card testing | +| Cart mutations | Hashed `cartToken` | Scraping / bot add-to-cart | +| Inbound webhooks | `providerId` + source IP hash | Flood protection (still verify signature first when cheap) | + +Return **429** with `retryAfter` seconds when exceeded. Log with `correlationId` only. + +### 20.3 Client idempotency (`Idempotency-Key`) + +- For **`checkout.create`** (and later `refund`), accept header or body field + `Idempotency-Key` (16–128 printable ASCII). +- Normalize to **hash + `(keyHash, route)` unique** row in `idempotencyKeys`. +- On duplicate key within TTL (e.g. 24h): return **the same HTTP status and body** + as the first successful completion (replay-safe for clients). +- Cron: delete `idempotencyKeys` older than TTL to bound collection growth. + +### 20.4 Provider outbound calls + +- **Timeouts** per call type (initiate vs refund): fail fast; rely on webhook for + eventual consistency where the gateway supports it. +- **Retries**: only for **idempotent** outbound reads or explicit idempotent retry + tokens from the gateway — never blind double-POST captures. +- **Circuit breaker** (KV): after `N` consecutive failures in window `W`, fail open + with `PROVIDER_UNAVAILABLE` and log; half-open probe after cool-down. Prevents + stampedes when a gateway region is down. + +### 20.5 Webhook processing + +- Verify signature **before** heavy work; reject early with 401 on bad signature. +- **Storage-first dedupe**: insert `webhookReceipts` row in `pending` → process → + mark `processed` (or rely on unique constraint + catch conflict for “already seen”). 
+- Respond **2xx** for duplicates and successful idempotent replays so gateways stop + retrying (per §16). +- For **Worker CPU wall-time** limits: keep finalize path lean; avoid unbounded + loops over line items (batch size is capped by §20.1). + +### 20.6 Inventory under concurrency + +- **`inventoryVersion`** on variant: increment on every successful stock mutation. +- **Finalize path**: compare snapshot version (stored on order line or cart line at + checkout.create) to current variant version; mismatch → `inventory_changed` / + `payment_conflict` flow already defined. +- **Single writer** per variant per finalize: storage layer should reject lost updates + if you add conditional writes later; until then, serialize via “read version → + write only if version matches” in one handler path. + +### 20.7 Hot rows and read scaling + +- **One active cart per `userId`** (or merge policy) avoids unbounded cart rows per user. +- **Product reads** are cache-friendly: public `products/list` / `get` may use + `Cache-Control` on the **site** (Astro/data layer); cart/checkout responses are + **never** cached at CDN. +- **Order admin list** uses composite indexes already declared; add **cursor** not + offset for large stores. + +### 20.8 Operational recovery + +- **`payment_conflict` queue**: admin filter + optional cron job that lists orders + in `payment_conflict` older than X minutes for human or automated void/refund + (gateway-specific adapter). +- **Metrics** (when platform allows): counters for `checkout_started`, `finalize_ok`, + `finalize_conflict`, `webhook_duplicate`, `provider_timeout` — even log-based + metrics beat nothing. + +### 20.9 API versioning + +- Plugin routes remain under `/_emdash/api/plugins/emdash-commerce/...`. When + breaking request/response shapes are needed, introduce **`v2/` route prefix** or + new route names; keep v1 stable for storefronts pinned to older Astro builds. + +--- + +## 21. 
Platform alignment (EmDash product + Cloudflare Workers) + +This section records constraints from EmDash’s public positioning and Cloudflare’s +Workers binding model. It does **not** change locked commerce semantics (§15); it +**reinforces** why several earlier choices exist. + +### 21.1 EmDash: sandbox, capabilities, marketplace + +- Third-party plugins are intended to run in **isolates** with **declared + capabilities** — matching our split: **native commerce core** + **standard + provider plugins** with narrow grants (`network:fetch` + `allowedHosts`). +- **License and distribution** are decoupled from the core repo; payment provider + packages can stay proprietary while the core stays MIT-aligned with the host + project. +- **x402** is a first-class EmDash primitive for _HTTP-native, pay-per-access + content_. It is **complementary** to cart checkout, not a replacement: use x402 + for gated content or micropayments; use commerce for SKUs, carts, and fulfillment + workflows. Avoid folding cart totals into x402 in v1. + +### 21.2 Workers: bindings, SSRF, and `fetch` + +- Workers **environment bindings** are live objects (KV, D1, service bindings), + not opaque connection strings. The commerce plan’s insistence on **`ctx.storage` + / `ctx.kv` / `ctx.http`** (and no ad-hoc DB clients in kernel code) matches that + philosophy: fewer string secrets in application code, clearer attachment of + permissions at deploy time. +- **SSRF:** User-controlled URLs must never drive `fetch()` to internal or + same-zone origins. Commerce already restricts outbound calls to **payment / + shipping / tax hosts** via capability rules; do not add “callback URL” fields that + accept arbitrary URLs without validation. +- **Legacy caveat (Worker in front of an origin):** global `fetch()` to URLs under + the site’s own zone may reach the **origin** directly. If a deployment uses that + pattern, any bug that passes user input into `fetch` is an SSRF risk against the + origin. 
Mitigation: keep using **explicit host allowlists** and never treat + `CF-Worker` (or similar) as **authorization** — Cloudflare documents it for abuse + attribution, not auth. + +### 21.3 Subrequests, CPU, and provider execution + +- Sandboxed plugins face **tight subrequest and CPU budgets**. Prefer **in-process + payment adapters** for first-party gateways (§4) so one checkout does not chain + “core → HTTP → sandbox provider → Stripe” unless marketplace isolation requires + it. +- Keep **webhook handlers short** (§20.5): validate signature, dedupe, call + `finalizePayment`, return 2xx — no unbounded fan-out inside the handler. + +### 21.4 Observability + +- Platforms that understand bindings can attribute resource use to workers; + commerce should still emit **correlation IDs and structured order events** (§19) + so merchant-visible timelines do not depend solely on host metrics. diff --git a/commerce-vs-x402-merchants.md b/commerce-vs-x402-merchants.md new file mode 100644 index 000000000..5fb4fb156 --- /dev/null +++ b/commerce-vs-x402-merchants.md @@ -0,0 +1,33 @@ +# Commerce vs x402 — quick guide for merchants + +EmDash can power **two different payment stories**. They solve different jobs. You can use **one, the other, or both** on the same site; they are not duplicates of each other. 
+ +--- + +## At a glance + +| | **EmDash Commerce** _(cart / checkout plugin)_ | **x402** _(`@emdash-cms/x402`)_ | +| ------------------------------- | --------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------ | +| **What it’s for** | Selling **products or services** with a **cart**, **checkout**, **orders**, and (when configured) **cards** via payment providers | **HTTP-native, pay-per-request** access — often for **content**, **APIs**, or **agent** traffic using **402 Payment Required** | +| **Typical buyer** | Humans shopping on your storefront | Automated clients (AI agents, bots) or any client that speaks x402; can be combined with “humans free, bots pay” | +| **Mental model** | “I run a **shop**” | “I charge **per access** to a URL or resource” | +| **Cart & line items** | Yes — multiple items, quantities, variants | No — each paid request is its own transaction | +| **Order history & fulfillment** | Yes — orders, statuses, emails, operations _(as the plugin ships)_ | No — it gates access; there is no built-in “order” object like a store | +| **Inventory & stock** | Yes — core concern for physical / limited digital goods | Not applicable — no SKU catalog | +| **Shipping & tax** | Supported via **separate modules** when you need real quotes and addresses | Not applicable | +| **How payment feels** | Familiar **checkout** (redirect, card form, wallet, depending on provider) | Client receives **402** + instructions, pays, **retries** the request with proof of payment | +| **Best fit** | T-shirts, courses, licenses, donations with amounts, anything with a **catalog** | Articles, feeds, APIs, “charge scrapers/agents,” **micropayments** per view or call | +| **Same site?** | Yes | Yes — e.g. 
**store** uses Commerce; **blog or API** uses x402 | + +--- + +## Simple decision rule + +- Choose **Commerce** when buyers pick **products**, you need **carts**, **orders**, or **inventory**. +- Choose **x402** when you want **automatic, request-level payment** (especially for **machines** or **per-access** pricing) without building a shop. + +When in doubt: **shop-shaped problem → Commerce. Gate-shaped problem → x402.** + +--- + +_This is a merchant summary. Technical architecture lives in `commerce-plugin-architecture.md` and the [x402 payments guide](docs/src/content/docs/guides/x402-payments.mdx)._ diff --git a/commerce_plugin_review.md b/commerce_plugin_review.md new file mode 100644 index 000000000..d842b4c45 --- /dev/null +++ b/commerce_plugin_review.md @@ -0,0 +1,441 @@ +# EmDash Commerce Plugin Review + +Date: 2026-04-06 +Scope reviewed: `packages/plugins/commerce` from `COMMERCE_REVIEW_HANDOFF_PLAN_5F.zip` + +## Executive summary + +This codebase is in better shape than many first-pass ecommerce plugins. The architecture is mostly coherent, storage/index definitions are thoughtful, test coverage appears broad, and the checkout/finalize path shows real discipline. + +That said, I would **not deploy this plugin yet**. + +The most important blocker is simple: **the catalog/admin mutation surface is exposed as public routes**. For a greenfield plugin that has not shipped, there is no good reason to leave privileged catalog writes publicly accessible. + +The second major issue is that the code still carries **real compatibility and rollout branches** in runtime paths. Because this plugin has not yet been deployed, those branches should now be removed rather than preserved. + +I could not run the automated tests in this container because `pnpm` is not installed here, so this is a **thorough static review**, not an execution-validated test run. + +--- + +## Overall assessment + +**Strengths** + +- Kernel-first direction is sensible. 
+- Storage declarations and uniqueness/index coverage are stronger than average. +- Catalog domain modeling is reasonably clean. +- The codebase shows evidence of tests and design discipline rather than ad hoc implementation. + +**Main risks before deployment** + +1. Access control and route exposure +2. Runtime compatibility/legacy branches that should not exist in a never-deployed release +3. A catalog handler that is becoming too large to trust easily +4. Read/query patterns that will not age well under catalog growth +5. Write-path race handling that is still friendlier than it is robust + +--- + +## Severity-ranked findings + +## Critical + +### 1) Privileged catalog and admin write routes are public + +**Why this matters** + +The route registry exposes nearly the entire catalog mutation surface as `public: true`, including product creation, updates, SKU writes, category/tag writes, asset linking, bundle writes, and digital entitlement writes. + +**Evidence** + +`src/index.ts:201-370` + +Notable examples: + +- `product-assets/register` — `src/index.ts:212-216` +- `catalog/product/create` — `src/index.ts:267-271` +- `catalog/product/update` — `src/index.ts:277-280` +- `catalog/category/create` — `src/index.ts:287-290` +- `catalog/tag/create` — `src/index.ts:307-310` +- `catalog/sku/create` — `src/index.ts:332-335` +- `digital-entitlements/create` — `src/index.ts:257-260` + +Inside `src/handlers/catalog.ts`, the mutation handlers are POST-gated with `requirePost(ctx)`, but I found no corresponding authorization enforcement in this package. The repeated calls to `requirePost(ctx)` begin at `src/handlers/catalog.ts:688` and continue through the rest of the file. + +**Risk** + +If EmDash does not inject strong auth outside this plugin, unauthenticated or low-trust callers could mutate the catalog. + +**Recommendation** + +- Default all catalog/admin mutation routes to non-public. +- Keep only clearly storefront-safe routes public. 
+- Add one explicit `require_admin_access()`-style helper and call it in every privileged mutation and privileged read. +- Treat digital entitlement creation/removal as privileged operations. + +**Suggested public set** + +Likely public: + +- `cart/upsert` +- `cart/get` +- `checkout` +- `checkout/get-order` (token-gated possession proof already exists) +- `recommendations` +- `webhooks/stripe` + +Everything else should start private unless there is a very strong reason otherwise. + +--- + +## High + +### 2) Legacy webhook compatibility mode is still in the production schema + +**Why this matters** + +The Stripe webhook schema still accepts a legacy direct body shape instead of only accepting the verified webhook event structure. + +**Evidence** + +`src/schemas.ts:162-191` + +Specifically: + +- legacy input object at `src/schemas.ts:162-171` +- union that keeps both modes alive at `src/schemas.ts:184-188` +- inline comment explicitly says this supports an old integration and some tests at `src/schemas.ts:185` + +**Risk** + +- Wider ingress contract than needed +- Larger test matrix +- Old assumptions preserved in production runtime +- Higher chance of accidental misuse by integrators + +**Recommendation** + +- Remove the legacy schema from runtime code. +- Accept only the verified Stripe event shape in production. +- Move any shortcut test payloads into test helpers or fixtures. + +--- + +### 3) Checkout replay validation still tolerates legacy cache rows + +**Why this matters** + +Completed checkout replay validation still permits cached responses without `replayIntegrity`. + +**Evidence** + +`src/handlers/checkout-state.ts:133-153` + +The comment at `src/handlers/checkout-state.ts:135` explicitly states the missing-integrity case is treated as a legacy cache path. + +**Risk** + +A greenfield release should not ship with relaxed replay validation for pre-existing cache formats that should not exist. 
+ +**Recommendation** + +- Require `replayIntegrity` on completed cached responses. +- Remove the legacy acceptance branch. +- If migration support is needed for tests, keep it in fixtures, not runtime behavior. + +--- + +### 4) Bundle finalization still supports a legacy stock fallback path + +**Why this matters** + +Bundle inventory deduction only expands bundle components when every component has a non-negative `componentInventoryVersion`. Otherwise it falls back to treating the line as a legacy bundle row keyed by the bundle product. + +**Evidence** + +- `src/lib/order-inventory-lines.ts:1-48` +- `src/types.ts:80-84` + +The docs/comments are explicit: + +- `src/lib/order-inventory-lines.ts:8` says the line is treated like a legacy bundle row +- `src/types.ts:82` says finalization falls back to legacy bundle-line stock rows + +There is also a dedicated test covering the legacy path: + +- `src/orchestration/finalize-payment-inventory.test.ts:121-122` + +**Risk** + +This is the sort of compatibility behavior that quietly survives forever and later becomes a source of stock inconsistencies. + +**Recommendation** + +- Remove the legacy fallback for first release. +- Fail fast if a bundle snapshot lacks valid component inventory versions. +- Treat missing component versions as a checkout snapshot bug, not something to silently absorb. + +--- + +### 5) Finalization behavior is still split by environment toggles + +**Why this matters** + +The finalize path still depends on environment flags for behavior selection. 
+ +**Evidence** + +`src/orchestration/finalize-payment.ts:84-86` + +- `COMMERCE_ENABLE_FINALIZE_INVARIANT_CHECKS` +- `COMMERCE_USE_LEASED_FINALIZE` + +Package-level docs also confirm this staged rollout posture: + +- `packages/plugins/commerce/COMMERCE_USE_LEASED_FINALIZE_ROLLOUT.md` +- `packages/plugins/commerce/rollout-evidence/*` +- `packages/plugins/commerce/COMMERCE_DOCS_INDEX.md` + +**Risk** + +For a not-yet-deployed plugin, rollout toggles preserve unnecessary alternate runtime paths and make the release posture ambiguous. + +**Recommendation** + +- Pick the canonical finalize path now. +- Delete the alternate runtime mode before first deployment. +- Keep invariants on by default unless there is a very strong measured reason not to. + +--- + +## Medium + +### 6) `catalog.ts` is now too large and multi-purpose + +**Why this matters** + +`src/handlers/catalog.ts` is 1,924 lines and now spans too many concerns. + +**Evidence** + +- file length: `src/handlers/catalog.ts` = 1,924 lines + +It covers: + +- product CRUD/state +- SKU CRUD/state/listing +- categories and tags +- category/tag links +- assets and asset ordering +- bundle components +- digital assets and entitlements +- read-model hydration + +**Risk** + +- Harder code review +- Higher regression risk +- More difficult onboarding +- Greater chance of hidden coupling + +**Recommendation** + +Split by domain boundary now, before more features land: + +- `catalog-products.ts` +- `catalog-skus.ts` +- `catalog-taxonomy.ts` +- `catalog-assets.ts` +- `catalog-bundles.ts` +- `catalog-digital.ts` +- shared `catalog-read-model.ts` + +This is a maintainability refactor, not an architectural rewrite. + +--- + +### 7) Product listing fetches broadly, then filters and slices in memory + +**Why this matters** + +The product list handler pulls a base result set, then applies category/tag filtering in memory, sorts in memory, and slices to the requested limit afterward. 
+ +**Evidence** + +`src/handlers/catalog.ts:1008-1028` + +Key lines: + +- broad query: `src/handlers/catalog.ts:1008-1010` +- category filter in memory: `src/handlers/catalog.ts:1013-1017` +- tag filter in memory: `src/handlers/catalog.ts:1019-1023` +- sort and slice after full filtering: `src/handlers/catalog.ts:1025-1028` + +**Risk** + +This is acceptable for a tiny catalog. It becomes less attractive as the catalog grows, especially if product media and metadata hydration remain downstream of that query. + +**Recommendation** + +- Push more filtering into indexed queries. +- When category or tag filters are present, query link tables first and drive product lookup from those IDs. +- Add cursor/pagination semantics now, before API consumers depend on whole-list behavior. + +--- + +### 8) Uniqueness checks are friendly, but race-prone + +**Why this matters** + +Create paths perform preflight query checks before writes. That is helpful for nicer error messages, but it is not sufficient under concurrency. + +**Evidence** + +- product slug precheck: `src/handlers/catalog.ts:711-717` +- category slug precheck: `src/handlers/catalog.ts:1075-1080` +- tag slug precheck: `src/handlers/catalog.ts:1183-1188` +- SKU code precheck: `src/handlers/catalog.ts:1287-1292` + +Storage does define proper unique indexes: + +- products slug: `src/storage.ts:10` +- categories slug: `src/storage.ts:34` +- tags slug: `src/storage.ts:42` +- SKU code: `src/storage.ts:70` + +**Risk** + +Two concurrent writers can both pass the query check, then race into the write. + +**Recommendation** + +- Keep the preflight checks if you want user-friendly messages. +- But also normalize storage-level unique constraint failures on `put`. +- Make the storage constraint the true source of truth. + +--- + +### 9) Route registration is getting too manual + +**Why this matters** + +`src/index.ts` is doing route registry composition by hand for everything. 
+ +**Evidence** + +`src/index.ts:201-370` + +**Risk** + +As the surface grows, this becomes a hotspot for accidental exposure, naming drift, and review fatigue. + +**Recommendation** + +Split route registration into grouped registries: + +- storefront routes +- admin/catalog routes +- webhook routes +- optional extension routes + +This change would also make access classification more obvious. + +--- + +### 10) Package documentation still signals rollout-in-progress rather than first-release posture + +**Why this matters** + +The package root still includes rollout notes, evidence logs, and compatibility-oriented documentation that imply a staged migration rather than a clean first deployment. + +**Evidence** + +Examples: + +- `COMMERCE_USE_LEASED_FINALIZE_ROLLOUT.md` +- `rollout-evidence/legacy-test-output.md` +- `rollout-evidence/strict-test-output.md` +- `rollout-evidence/strict-finalize-smoke-output.md` + +**Risk** + +Not a direct runtime bug, but it confirms the release posture is still transitional. + +**Recommendation** + +- Decide what is canonical. +- Keep only the docs that reflect the intended release state. +- Archive or move rollout artifacts out of the plugin package if they are only historical. + +--- + +## Lower-severity observations + +### 11) Some public read routes may expose more internal catalog state than intended + +This is a design concern rather than a proven bug. + +Because product, category, tag, and SKU reads/lists are public, you should verify whether storefront callers are meant to see: + +- draft products +- hidden products +- archived products +- inactive SKUs +- bundle composition details +- digital entitlement relationships + +If the storefront is only meant to expose sellable catalog data, public reads should apply storefront-safe filters by default. + +--- + +## What I did **not** find + +I did **not** see obvious signs of random dead code sprawl or rushed copy-paste architecture. This is not a messy codebase. 
The issues are more about release posture, trust boundaries, and a few places where the implementation is still carrying migration-era assumptions. + +--- + +## Recommended action plan + +## Stop-ship before deployment + +1. **Lock down route exposure** + - Make catalog/admin mutation routes private. + - Add explicit authorization checks. + +2. **Remove greenfield-inappropriate compatibility paths** + - delete legacy webhook schema branch + - require replay integrity for completed checkout cache + - remove legacy bundle stock fallback + - choose one finalize mode and delete the other runtime path + +3. **Audit public reads** + - confirm what storefront callers are allowed to see + - default to storefront-safe visibility/status filters + +## Next refactor pass + +4. **Split `catalog.ts` by concern** +5. **Push product filtering closer to indexed storage** +6. **Normalize unique-index write failures instead of relying only on prechecks** +7. **Split route registration into grouped modules** + +--- + +## Suggested developer framing + +If you want to give a developer a crisp mandate, this is the version I would use: + +> Before first deployment, treat this plugin as greenfield. Remove all runtime compatibility branches that only exist to support old integrations or phased rollouts. Tighten route exposure so only storefront-safe endpoints are public. Then do one maintainability refactor to split the catalog handler and harden query/write paths. + +--- + +## Bottom line + +This plugin is **promising and fairly disciplined**, but it is **not yet in the cleanest first-release state**. + +The two biggest corrections are: + +- **fix route exposure / authorization** +- **remove legacy and rollout-era runtime branches** + +Once those are addressed, the remaining work is mostly maintainability and scaling hygiene rather than foundational redesign. 
diff --git a/commerce_plugin_review_update_v3.md b/commerce_plugin_review_update_v3.md new file mode 100644 index 000000000..2ce6afc21 --- /dev/null +++ b/commerce_plugin_review_update_v3.md @@ -0,0 +1,509 @@ +# EmDash Commerce Plugin Review Update (Deep Dive) + +## Scope + +Static deep-dive review of the latest remediation branch/package, with emphasis on: + +- bugs and correctness risks +- opportunities for refactoring +- DRY and YAGNI alignment +- removal of legacy / rollout-era behavior +- deployment readiness for first real testing + +This review assumes the plugin has **not yet been deployed**, so the standard should be **greenfield-clean** rather than backward-compatibility tolerant. + +--- + +## Executive Summary + +This version is **meaningfully improved** over the prior one. Several real runtime legacy paths appear to be removed or neutralized. + +However, I would **not yet call the plugin storefront-safe, fully DRY, fully YAGNI, or fully legacy-free**. + +The single biggest remaining issue is now the **public read surface**: public catalog routes still appear to expose internal/admin-grade data structures and do not appear to enforce storefront-safe defaults such as `status=active` and `visibility=public`. + +So the core risk has shifted: + +- **Before:** legacy runtime compatibility paths and public admin mutation exposure +- **Now:** public read-surface design, DTO boundaries, and module structure + +--- + +## What Is Clearly Improved + +These changes look materially better than the earlier version: + +### 1) Admin mutations are no longer publicly exposed + +The route surface in `src/index.ts` is much safer than before. The prior issue where catalog/admin writes were exposed as public now appears largely resolved. + +### 2) Stripe webhook legacy compatibility appears removed + +The earlier direct-payload compatibility mode for Stripe webhook handling no longer appears to be part of the active runtime path. 
+ +### 3) Checkout replay handling is stricter + +Cached replay acceptance now appears to require `replayIntegrity`, which is the correct posture for a greenfield release. + +### 4) Bundle inventory fallback behavior is stricter + +The earlier silent fallback from component-level inventory state to bundle-level fallback stock handling appears to be removed. Failing fast is the right choice. + +### 5) Alternate finalize-path rollout behavior appears mostly neutralized + +`COMMERCE_USE_LEASED_FINALIZE` now looks more like rollout/history residue than a live runtime fork. That is much healthier than before. + +--- + +## Highest-Priority Remaining Problems + +## 1) Public catalog reads still expose internal/admin-grade data + +This is now the most important problem in the codebase. + +Public routes in `src/index.ts` still include endpoints such as: + +- `bundle/compute` +- `catalog/product/get` +- `catalog/products` +- `catalog/sku/list` + +But the handlers behind them appear to return **internal storage-grade objects**, not storefront-safe DTOs. + +### Why this is a problem + +The current shapes appear to expose far more than a public storefront should reveal, including things like: + +- `inventoryQuantity` +- `inventoryVersion` +- raw SKU state +- variant matrix internals +- bundle composition internals +- digital entitlement metadata +- inactive / hidden / draft product details + +That is both a security/data-exposure concern and a design-boundary problem. 
+ +### Why it matters + +A storefront API should only reveal what a public buyer actually needs, such as: + +- active/public products +- public pricing +- public media +- purchasable options +- availability status at a business level, if desired + +It should **not** expose: + +- stock concurrency/version tokens +- raw inventory numbers unless intentionally part of the storefront design +- admin-only product states +- internal entitlement structures +- hidden catalog metadata + +### Recommendation + +Create **separate public and admin DTOs**. + +At minimum: + +- public routes should return storefront-safe DTOs only +- admin routes should return internal/admin detail DTOs +- `catalog/sku/list` should not expose raw `StoredProductSku[]` on a public route +- `inventoryVersion` should never be exposed publicly + +This is the first issue I would fix before deployment. + +--- + +## 2) Public product listing appears to default to “everything” + +The product list input schema appears to allow optional `status` and `visibility` filters. + +Then `listProductsHandler()` appears to build the query directly from caller input, meaning that if the caller does not specify those filters, the public route may default to returning products without forcing: + +- `status = active` +- `visibility = public` + +### Why this is a problem + +That creates a likely path for exposing: + +- draft products +- hidden products +- archived products +- not-yet-ready merchandising data + +### Recommendation + +For **public storefront routes**, enforce server-side defaults: + +- `status = active` +- `visibility = public` + +If admin users need broader discovery, give them a separate admin route or admin-only handler mode. + +Do not rely on the caller to request safe filters. 
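A minimal sketch of both recommendations above: a storefront-only DTO plus server-enforced filters. All type and field names here (`StoredProduct`, `StorefrontProductDTO`, `toStorefrontDTO`) are hypothetical illustrations, not the plugin's real shapes:

```typescript
// Hypothetical stored shape; field names mirror the review's examples.
interface StoredProduct {
  id: string;
  slug: string;
  title: string;
  status: "draft" | "active" | "archived";
  visibility: "public" | "hidden";
  inventoryQuantity: number;
  inventoryVersion: number; // internal concurrency token, never public
}

// Storefront DTO: only what a public buyer needs.
interface StorefrontProductDTO {
  id: string;
  slug: string;
  title: string;
  available: boolean; // business-level availability, not a raw stock count
}

function toStorefrontDTO(p: StoredProduct): StorefrontProductDTO {
  return {
    id: p.id,
    slug: p.slug,
    title: p.title,
    available: p.inventoryQuantity > 0,
  };
}

// Public listing enforces safe filters server-side, regardless of caller input.
function listStorefrontProducts(all: StoredProduct[]): StorefrontProductDTO[] {
  return all
    .filter((p) => p.status === "active" && p.visibility === "public")
    .map(toStorefrontDTO);
}
```

An admin route would return a separate detail DTO; the design point is that the public mapper simply has no field through which internals can leak.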
+ +--- + +## 3) The “catalog split” is not a real refactor yet + +There are now multiple files such as: + +- `catalog-assets.ts` +- `catalog-bundles.ts` +- `catalog-categories.ts` +- `catalog-digital.ts` +- `catalog-products.ts` +- `catalog-tags.ts` + +But these appear to function mainly as re-export shims back into `catalog.ts`, not true implementation splits. + +### Why this is a problem + +This adds file count and indirection without actually reducing complexity. + +So the code pays the cost of a multi-file design while still living with a monolithic implementation. + +### Recommendation + +Choose one of two honest options: + +#### Option A — keep the monolith temporarily + +If you are not ready to truly split the module, keep `catalog.ts` as the canonical implementation and remove the fake split. + +#### Option B — perform a real split + +Move real implementations into domain files such as: + +- products +- SKUs +- taxonomy +- assets +- bundles +- digital +- shared read-model hydration + +Right now it is the worst of both worlds. + +--- + +## 4) Read-model helpers still appear vulnerable to truncation / scaling issues + +Several helper functions still appear to use one-shot `query()` calls where the code seems to assume a complete result set. + +Examples include read helpers for: + +- bundle components +- category DTOs +- tag DTOs +- SKU hydration +- product images by role/target +- SKU option values +- digital entitlement summaries + +Elsewhere in the same module, pagination is used more carefully when cardinality is expected to grow. + +### Why this is a problem + +If the storage adapter ever applies default limits, soft limits, or driver-level caps, these helpers could silently under-read. + +That creates brittle behavior that may remain invisible until a catalog grows. + +### Recommendation + +Create one shared helper for “query all pages until complete” and use it consistently whenever completeness is expected. 
+ +This is both a correctness improvement and a DRY improvement. + +--- + +## 5) Storefront reads and admin reads are still mixed together + +`getProductHandler()` appears to serve too many concerns at once: + +- base product detail +- taxonomy hydration +- images +- variable-product matrix detail +- bundle summary +- digital entitlement summary + +### Why this is a problem + +This makes it difficult to reason about: + +- what is safe to expose publicly +- what is necessary for storefront use +- what is admin-only detail +- what performance cost each caller is paying + +It also encourages an “everything endpoint” design. + +### Recommendation + +Split product read responsibilities into at least two clear paths: + +- `getStorefrontProduct()` +- `getAdminProductDetail()` + +That would improve: + +- safety +- clarity +- performance discipline +- future maintainability + +--- + +## Important Correctness / Robustness Issues + +## 6) Product lifecycle logic is duplicated and appears inconsistent + +Product lifecycle handling appears split between: + +- `updateProductHandler()` via shared patch logic +- `setProductStateHandler()` via hand-rolled transition logic + +### Why this is a problem + +Duplicated lifecycle logic is a correctness trap. + +One likely inconsistency is that a transition to `active` sets `publishedAt` but may not clear `archivedAt`, whereas a transition to `draft` clears `archivedAt`. + +If that reading is correct, a previously archived product moved back to active could still carry an old archived timestamp. + +### Recommendation + +Centralize lifecycle transitions into one authoritative helper used by both handlers. + +This is a classic DRY fix that also reduces subtle state bugs. + +--- + +## 7) Ordered-child mutations do not appear atomic + +Asset-link and bundle-component mutation flows appear to follow a pattern like: + +1. insert/delete child row +2. 
normalize ordering with `mutateOrderedChildren(...)` + +### Why this is a problem + +If the first step succeeds and the second fails, the system can be left with: + +- gaps in position ordering +- partially normalized children +- ordering drift after deletions +- a state that relies on repair later + +### Recommendation + +Push the full ordered-child mutation into one authoritative helper so callers do not manage the sequence manually. + +If true transactions are unavailable, then at minimum: + +- document failure semantics clearly +- provide repair/normalization guarantees +- add strong tests around partial-failure behavior + +--- + +## 8) Variable SKU validation still has avoidable N+1 query behavior + +The SKU creation path still appears to perform multiple layered fetches such as: + +- attributes +- attribute values per attribute +- option rows per existing SKU + +### Why this is a problem + +This is not a launch blocker for a modest catalog, but it is a clear opportunity to simplify and reduce query count. + +### Recommendation + +Batch-load once, then map in memory: + +- all relevant attribute values +- all relevant option rows for existing SKUs + +That keeps the logic simpler and more scalable. + +--- + +## 9) Error code precision remains weaker than it should be + +Some missing-resource situations still appear to map to overly broad codes such as `PRODUCT_UNAVAILABLE`, even when the missing thing is not actually a product. + +### Why this is a problem + +This hurts: + +- observability +- operational debugging +- client-side error handling +- API clarity + +### Recommendation + +Use narrower, resource-specific codes where possible, such as: + +- `ASSET_NOT_FOUND` +- `ENTITLEMENT_NOT_FOUND` +- `BUNDLE_COMPONENT_NOT_FOUND` +- `CATEGORY_LINK_NOT_FOUND` + +The system does not need a huge taxonomy, but it should at least distinguish major resource classes. 
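The batch-load approach suggested for variable-SKU validation above (one query per table, then in-memory maps) might look like the following sketch; the row shapes and the `validateSkuCombination` helper are hypothetical, not the plugin's actual code:

```typescript
// Hypothetical row shapes for a variable-product option model.
interface AttributeValue { attributeId: string; value: string }
interface SkuOptionRow { skuId: string; attributeId: string; value: string }

// Validate a proposed option combination against data loaded in two batch
// queries (all allowed values, all existing option rows), mapped in memory.
function validateSkuCombination(
  proposed: Record<string, string>, // attributeId -> value
  allowedValues: AttributeValue[],
  existingOptions: SkuOptionRow[],
): { ok: boolean; reason?: string } {
  // Index allowed values by attribute in one pass.
  const allowed = new Map<string, Set<string>>();
  for (const v of allowedValues) {
    if (!allowed.has(v.attributeId)) allowed.set(v.attributeId, new Set());
    allowed.get(v.attributeId)!.add(v.value);
  }
  for (const [attrId, value] of Object.entries(proposed)) {
    if (!allowed.get(attrId)?.has(value)) {
      return { ok: false, reason: `value "${value}" not allowed for ${attrId}` };
    }
  }
  // Group existing option rows by SKU and reject a duplicate combination.
  const bySku = new Map<string, Map<string, string>>();
  for (const row of existingOptions) {
    if (!bySku.has(row.skuId)) bySku.set(row.skuId, new Map());
    bySku.get(row.skuId)!.set(row.attributeId, row.value);
  }
  const keys = Object.keys(proposed);
  for (const [skuId, opts] of bySku) {
    if (opts.size === keys.length && keys.every((k) => opts.get(k) === proposed[k])) {
      return { ok: false, reason: `duplicate combination of SKU ${skuId}` };
    }
  }
  return { ok: true };
}
```

Two queries and two maps replace the per-attribute and per-existing-SKU fetch loops.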
+ +--- + +## DRY / YAGNI Opportunities + +## 10) Repeated timestamp construction should be centralized + +There appears to be repeated use of patterns like: + +- `new Date(Date.now()).toISOString()` + +throughout the module. + +### Why this matters + +This is minor, but repetitive timestamp generation: + +- adds noise +- weakens consistency +- makes tests harder to stabilize + +### Recommendation + +Use a tiny helper such as `now_iso()` or inject time where lifecycle logic matters. + +Small cleanup, worthwhile. + +--- + +## 11) `catalog.ts` still owns too many responsibilities + +Even beyond file size, the module appears to own: + +- conflict handling +- stock synchronization +- metadata hydration +- DTO building +- asset ordering +- bundle logic +- digital entitlement logic +- lifecycle logic + +### Why this is a problem + +This makes the module harder to trust, harder to test, and harder to evolve safely. + +### Recommendation + +Move toward a structure where responsibilities are clearer, for example: + +- lifecycle/state transitions +- read-model hydration +- taxonomy linking +- ordered-child mutations +- bundle business logic +- digital entitlement logic + +This does not require over-architecting. It simply means putting each concern in one home. + +--- + +## 12) The repo is cleaner, but not fully legacy-free yet + +The active runtime path looks much cleaner now. + +However, the repository still appears to carry rollout-history artifacts and documentation such as: + +- `COMMERCE_USE_LEASED_FINALIZE_ROLLOUT.md` +- `rollout-evidence/*` +- staged-rollout checklist language + +### Why this matters + +Because the package appears to publish only `src`, this is not a runtime blocker. + +But if the stated goal is “legacy-code free,” then the repo itself is not fully there yet. 
+ +### Recommendation + +After runtime cleanup is complete, do a repo-hygiene pass: + +- archive or remove rollout-era docs that are no longer useful +- keep one canonical implementation posture +- reduce historical noise in the package root + +--- + +## Recommended Next Steps (Priority Order) + +## 1) Lock down the public read surface + +Before deployment: + +- make public product routes storefront-safe +- remove raw SKU exposure from public routes +- remove inventory version exposure from public routes +- prevent hidden/draft/archived leakage +- avoid exposing admin-grade entitlement detail publicly + +## 2) Separate storefront reads from admin reads + +Create clear boundaries: + +- storefront DTOs +- admin DTOs +- storefront handlers +- admin handlers + +This is the highest-value structural improvement remaining. + +## 3) Fix the fake split + +Choose one: + +- truly split `catalog.ts`, or +- remove the shim files until you are ready + +Do not keep architectural theater in the codebase. + +## 4) Centralize lifecycle/state transitions + +Unify product state logic in one place so handlers cannot drift. + +## 5) Make full-read helpers pagination-safe + +Introduce one shared complete-query helper and remove inconsistent assumptions. + +## 6) Make ordered-child mutation flows safer + +Prefer one authoritative mutation helper with explicit guarantees. + +--- + +## Bottom Line + +This branch is **substantially better** than the earlier one. + +But it is **not yet where I would want it** if the goal is to be: + +- storefront-safe +- DRY +- YAGNI +- genuinely legacy-clean + +The biggest remaining problem is no longer webhook/finalize legacy logic. + +It is now the **design of the public catalog read surface** and the **lack of strong separation between storefront and admin representations**. + +If that is fixed well, the plugin will be in a much healthier position for first deployment and real testing. 
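One hedged sketch of the shared complete-query helper recommended in step 5 above, assuming a cursor-based list API (all names here are hypothetical, not the module's actual storage interface):

```typescript
// Hypothetical sketch: one shared helper that exhausts a paginated list
// instead of assuming the first page is the complete result set.
async function readAll<T>(
  fetchPage: (cursor?: string) => Promise<{ items: T[]; nextCursor?: string }>,
): Promise<T[]> {
  const all: T[] = [];
  let cursor: string | undefined;
  do {
    const page = await fetchPage(cursor);
    all.push(...page.items);
    cursor = page.nextCursor;
  } while (cursor !== undefined);
  return all;
}
```

Centralizing the loop means callers can no longer make inconsistent "one page is enough" assumptions; any page-size limit lives in exactly one place.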
diff --git a/docs/best-practices.md b/docs/best-practices.md
new file mode 100644
index 000000000..dabe26740
--- /dev/null
+++ b/docs/best-practices.md
@@ -0,0 +1,177 @@
# EmDash Plugin Developer Handoff

## Purpose

This document is the minimum useful briefing for a developer starting plugin work on Cloudflare's EmDash CMS. It is intentionally DRY and YAGNI: it focuses on the architectural constraints, implementation risks, and early decisions most likely to affect the success of a real plugin project, especially for e-commerce. EmDash launched as a v0.1.0 preview on March 31, 2026; it is MIT-licensed, TypeScript-native, and built on Astro 6.[1][2]

## Executive Brief

EmDash is not WordPress with a modern UI. It changes the plugin model at the runtime, security, data, and admin-extension levels. Plugins run in isolated sandboxes, must pre-declare their permissions, interact through explicit capability bindings, and cannot rely on the shared-process assumptions that underpin most WordPress plugin design.[1][2][3]

For a plugin developer, the main message is simple: do not start by porting WordPress patterns. Start by treating EmDash as a capability-constrained application platform with a CMS on top. The biggest risks are incomplete capability declarations, immature schema evolution practices, and underdesigned payment/session architecture for commerce use cases.[1][2][4]

## What EmDash Is

EmDash is Cloudflare's new CMS positioned as a "spiritual successor" to WordPress, with a strong emphasis on plugin security, AI-native tooling, and a cleaner content/data model than WordPress's general-purpose `wp_posts` structure.[1][3] Public commentary and early coverage consistently describe it as Astro-based, TypeScript-first, and tightly aligned with Cloudflare infrastructure such as Workers, D1, and R2.[2][5]

Themes are presentation-only, while plugins are where privileged logic lives.
Content is modeled structurally rather than as loose HTML blobs, and plugin code runs in isolation rather than inside a shared PHP runtime.[1][6][7] + +## Non-Negotiable Architectural Rules + +### Capability model first + +Every plugin must declare the capabilities it needs up front. If a plugin has not declared a specific permission, the corresponding binding is not available in runtime context. This applies to content access, storage, and outbound network requests.[1] + +For outbound HTTP, exact hostnames must be declared. That means a plugin cannot safely assume it can call a third-party API later just because credentials are present in config. If the hostname is missing from the manifest, the integration is effectively broken by design.[1] + +### Plugins are isolated, not co-resident + +EmDash plugins run in isolated V8 Dynamic Workers on Cloudflare, which is the core mechanism behind its security claims around plugins.[1][8] This is a deep break from WordPress, where plugins share a process and can interfere with each other or the entire application. + +This isolation improves safety, but it also means plugin developers should assume less implicit power, less cross-plugin reach, and more explicit contracts. If a feature depends on hidden side effects or direct access to internals, it is likely the wrong design for EmDash.[1][3] + +### Themes cannot own business logic + +Themes are presentational and cannot act like WordPress themes with application logic embedded in template code. Any feature that writes data, performs privileged operations, or coordinates application state should be implemented in a plugin and then consumed from the theme layer.[6] + +For commerce, this means checkout, inventory updates, order creation, and customer state all belong in plugins. 
The theme should only render views and call into explicit plugin-provided interfaces.[6] + +### Admin UI is declarative, not arbitrary app code + +Admin extensions are defined using a JSON schema comparable to Slack's Block Kit rather than arbitrary HTML/JS dropped into the admin surface.[3] This matters because many custom CMS plugins rely on rich bespoke admin apps; in EmDash, that assumption will fail unless the schema can express the workflow. + +Any plugin requiring complex operator experiences, such as product-variant matrix editing or warehouse-picking dashboards, should prototype the admin UI early before deeper backend work begins.[3] + +## WordPress Assumptions That Will Break + +The following WordPress-era assumptions should be treated as invalid in EmDash: + +- Plugins do not share a universal runtime with unrestricted application access.[1] +- There is no `$wpdb`-style direct database shortcut for arbitrary querying from anywhere.[6] +- Themes are not a backdoor for application logic.[6] +- Hook behavior is not equivalent to WordPress's mature action/filter model.[1] +- Content is not stored as raw HTML intended for direct output.[7] +- The plugin ecosystem is not mature enough to assume that a needed primitive already exists.[2][4] + +A developer who starts by trying to recreate WooCommerce idioms inside EmDash will likely waste time. A developer who starts by designing explicit services, schema boundaries, and capability manifests will move faster. + +## Hooks, Extensibility, and Missing Surface Area + +EmDash exposes lifecycle-style hooks such as `content:afterSave`, but early documentation and commentary do not show a WordPress-equivalent filter system with broad mid-pipeline mutation semantics.[1][2] That means features that depend on intercept-and-modify behavior should not be assumed to exist. + +This matters for dynamic pricing, checkout manipulation, cart mutation, tax adjustments, and workflow injection. 
If the design depends on global filters being available everywhere, the safer assumption is that the platform does not yet support that cleanly and the plugin architecture should instead center around explicit service boundaries and controlled entry points.[1][4] + +## Data Model and Content Shape + +EmDash stores content as structured portable text rather than free-form HTML content blobs.[7] This is cleaner and more future-proof, but it means rendering and migration work are more deliberate. + +Shortcode-heavy content, arbitrary embedded markup, and editor-side hacks from WordPress do not carry over naturally. Rich content should be represented as structured blocks, and any migration from legacy product descriptions or landing pages should expect transformation work rather than direct reuse.[1][7][3] + +Collection schemas are also more explicit. Instead of overloading a single generic posts table, content types map to typed collections, typically backed by D1.[6] This is an advantage for maintainability, but only if schema evolution is handled carefully. + +## E-Commerce Reality Check + +There is no WooCommerce-equivalent standard e-commerce layer in EmDash today. Early ecosystem coverage points to a very small plugin marketplace and no broad commerce foundation plugin at launch.[2][4] + +The implication is important: if the goal is e-commerce, the developer is not merely "building a plugin." The developer is likely building several foundational primitives that WordPress users take for granted, including cart state, order modeling, payment processing integration, fulfillment hooks, review systems, and potentially faceted catalog behavior.[4] + +This is both the main challenge and the biggest opportunity. The ecosystem is immature, but a well-architected commerce base could become one of the first meaningful platform-standard packages.[4][9] + +## Three Decisions To Make Before Writing Production Code + +### 1. 
Capability manifest audit + +Before implementation starts, define the full dependency graph of the plugin. This should include every external hostname, every internal binding, every read/write need, and every admin extension requirement. Because EmDash enforces capabilities at the manifest level, this is not paperwork; it is part of the application architecture.[1] + +Minimum pre-build checklist: + +- List all third-party APIs, including sandbox and production domains. +- Map each plugin action to required capabilities. +- Confirm whether the admin UI schema can express needed workflows. +- Design error messages for capability-denied failures. + +### 2. Schema and migration strategy + +EmDash's cleaner schema model is a strength, but public material around versioned migration workflows is immature at v0.1.0.[2] A plugin that expects its data model to stay fixed is unrealistic, especially in commerce. + +Minimum pre-build checklist: + +- Define versioned collection schemas for products, orders, customers, and operational metadata. +- Establish a migration convention before launch, even if first-party tooling is immature. +- Test schema changes against realistic staging data. +- Decide whether D1 is sufficient for write-heavy workflows or whether some operations should move to an external database path.[6] + +### 3. Cart, session, and payment architecture + +There is no native fiat checkout stack in EmDash. Native x402 support is real, but it is aimed at stablecoin/agent-style payment flows rather than conventional human checkout.[6] For most commerce use cases, Stripe or similar must be integrated through plugin-defined capabilities. + +Workers-style environments also force explicit state design. There is no dependable PHP-style shared session flow to lean on. Cart state, checkout progression, and order promotion must be designed deliberately.[6] + +Minimum pre-build checklist: + +- Choose where cart state lives and why. 
+- Define idempotent order creation and webhook handling. +- Separate active cart state from committed order state. +- Evaluate x402 separately as an additional monetization path for digital or agent-facing products.[6] + +## Recommended Default Architecture For A First Commerce Plugin + +If the goal is a first practical EmDash commerce plugin, the simplest sane default is: + +- **Theme**: render-only storefront and account views. +- **Plugin**: owns product logic, cart APIs, checkout APIs, order creation, inventory adjustment, and admin tools.[6] +- **D1**: source of truth for committed entities such as products, orders, and inventory snapshots.[6] +- **KV or equivalent ephemeral store**: active cart/session-style state if low-latency temporary state is needed. +- **R2**: media assets and downloadable goods where appropriate.[6] +- **Stripe or equivalent**: fiat payments via explicitly declared hostnames and verified webhooks. +- **x402**: optional second rail for agent-facing or micropayment-oriented flows.[6] + +This architecture is intentionally boring. It avoids speculative abstractions and keeps business logic in the one place EmDash is clearly designed to support: plugins. + +## Gotchas Likely To Cause Rework + +### Runtime surprises + +A missing hostname declaration, an undeclared storage binding, or an assumed runtime capability will cause failure at the point of use, not at the point of design. Treat the manifest as code, review it like code, and test failure paths like code.[1] + +### Over-ambitious admin UX + +Because the admin UI is schema-driven, not arbitrary frontend code, it is risky to promise sophisticated operator workflows before confirming the schema supports them. Prototype the hardest admin screen first.[3] + +### Assuming mature ecosystem support + +The ecosystem is too new to assume standard plugins exist for taxes, shipping, reviews, subscriptions, or faceted search. 
Build plans should assume gaps, not abundance.[2][4] + +### Ignoring infrastructure fit + +Cloudflare-native deployment is the intended path. Public commentary notes that self-hosting is possible but the strongest isolation/security characteristics depend on Cloudflare's runtime model.[6] If production will not run primarily on Cloudflare, test the operational and security differences early. + +## Opportunities Worth Paying Attention To + +The same things that make EmDash harder than WordPress for plugin authors also create opportunity: + +- The commerce ecosystem is early, so foundational plugins have first-mover upside.[4][9] +- The capability model can become a real trust/safety differentiator compared with WordPress's shared-process plugin sprawl.[1][8] +- Structured content and typed collections are a better base for headless, AI-assisted, and multichannel commerce than WordPress's older content model.[6][7] +- Native x402 support creates a path to agent-to-agent or programmable payment products that WordPress does not natively target.[6] + +These are not reasons to overbuild. They are reasons to design cleanly and leave room for future monetization and product layers once the basics are stable. + +## What To Build First + +The first production target should be a narrow, boring, testable plugin slice: + +1. Product collection schema. +2. Read-only storefront rendering. +3. Cart API with explicit state handling. +4. Checkout integration with one fiat processor. +5. Order creation with idempotency. +6. Minimal admin tooling for catalog and order inspection. + +Anything beyond that, including coupons, subscriptions, advanced search, reviews, returns, or marketplace support, should be delayed until the platform's operational edges are better understood. + +## Handoff Guidance + +The developer starting this work should treat EmDash as an early-stage application platform, not as a mature CMS ecosystem. 
The practical approach is to keep the first plugin small, capability-explicit, schema-conscious, and infrastructure-aligned with Cloudflare's intended runtime model.[1][6] + +If a decision is unclear, default toward explicit contracts, simple data models, isolated responsibilities, and fewer moving parts. That approach fits both EmDash's current reality and the platform's likely evolution path.[1][2][3] diff --git a/emdash-commerce-deep-evaluation.md b/emdash-commerce-deep-evaluation.md new file mode 100644 index 000000000..df5a35678 --- /dev/null +++ b/emdash-commerce-deep-evaluation.md @@ -0,0 +1,643 @@ +# EmDash Commerce — Deep Project Evaluation and Feature-Fit Review + +## Scope reviewed + +I reviewed the current project bundle, including: + +- `3rdpary_review_2.md` +- `commerce-plugin-architecture.md` +- `emdash-commerce-final-review-plan.md` +- `commerce-vs-x402-merchants.md` +- `high-level-plan.md` +- `skills/creating-plugins/SKILL.md` +- `packages/plugins/forms/*` reference files +- `packages/plugins/commerce/*` current kernel scaffold and tests + +## Current status update (2026-04-03) + +The codebase has now moved from architecture-only recommendation to a validated v1 kernel slice: + +- core handlers and finalize path are implemented and covered by passing package tests, +- idempotency, webhook replay, and inventory ledger behavior are in place, +- token-guarded possession is strict for cart and order access, +- zero-legacy strict token contracts are now enforced in typed domain models, +- and full suite checks are green at package and workspace level. + +Design decisions locked since the original review: + +- keep the kernel narrow and correctness-first, +- prefer local provider adapters over internal HTTP delegation for v1, +- keep one authoritative finalization path, and +- defer broad feature breadth (shipping/tax/discounts/adaptive bundles) until after first slice correctness is proven. 
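As a rough illustration of the idempotency and webhook-replay posture described in the status update above, a receipt-keyed dedupe guard might look like the following; every name here is hypothetical and the real kernel persists receipts in storage rather than in memory:

```typescript
// Hypothetical sketch of webhook dedupe: each provider event id is recorded
// once, and a replayed delivery short-circuits before any mutation runs.
class WebhookReceipts {
  private seen = new Set<string>();

  // Returns true the first time an event id is observed, false on replay.
  recordOnce(eventId: string): boolean {
    if (this.seen.has(eventId)) return false;
    this.seen.add(eventId);
    return true;
  }
}

function handleWebhook(
  receipts: WebhookReceipts,
  eventId: string,
  finalize: () => void,
): "processed" | "replay" {
  if (!receipts.recordOnce(eventId)) return "replay"; // duplicate delivery: no-op
  finalize(); // the single authoritative finalization path
  return "processed";
}
```

The shape matters more than the mechanism: duplicate deliveries resolve to an explicit "replay" outcome before the finalization path is ever entered.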
+ +--- + +## Executive verdict + +> **Note:** The material that follows reflects the historical deep review snapshot. +> The latest project posture is captured in: +> +> - `Current status update (2026-04-03)` above +> - `emdash-commerce-final-review-plan.md` +> - `@THIRD_PARTY_REVIEW_PACKAGE.md` +> - `external_review.md`. + +The project is **architecturally promising and materially better than a WooCommerce-style clone**, but it is **still not yet a validated commerce system**. Today it is best described as: + +> **a strong architecture specification plus a thin kernel scaffold, not yet a working commerce implementation.** + +That is not a criticism by itself. It is the correct stage for a risky foundational project. But it matters, because the current codebase is still too early to “prove” the design. + +My final judgment: + +- **Direction:** strong +- **Conceptual architecture:** good to very good +- **Platform alignment with EmDash:** good +- **Current implementation maturity:** early / pre-vertical-slice +- **Readiness for broad feature expansion:** not yet +- **Readiness for a focused v1 payment slice:** yes + +If the team stays disciplined, this can become an unusually clean commerce foundation. If scope expands too early, it could still become an elegant-looking but under-validated architecture exercise. + +--- + +## Overall assessment of the project as a whole + +## What is clearly good + +### 1. 
The project now has much better architectural discipline than the earlier pass + +Compared with the earlier plan, the revised codebase and documents show real improvement: + +- the architecture now centers the **kernel** +- the project explicitly prioritizes **Stripe-first vertical validation** +- it treats **payment finalization** as the one critical mutation boundary +- it separates **provider contracts** from WooCommerce-style hook mutability +- it formalizes **inventory versioning**, **ledgering**, **idempotency**, and **webhook dedupe** +- it acknowledges **EmDash native vs standard plugin constraints** +- it narrows the role of HTTP delegation and prefers local adapters first + +That is exactly the right direction. + +### 2. The data model is thoughtful in the places that matter most + +The best parts of the architecture are the parts that are hardest to retrofit later: + +- discriminated product types +- separate product variants +- explicit product attributes +- immutable order snapshots +- append-only inventory ledger +- payment attempts +- webhook receipts +- idempotency key persistence +- explicit state machines +- cart merge rules +- operational recovery paths like `payment_conflict` + +These are signs that the project is being designed by someone thinking about real commerce failure modes rather than just storefront rendering. + +### 3. 
The project is mostly aligned with how EmDash actually works

The revised direction fits EmDash’s model reasonably well:

- native plugin for the commerce core where React admin, Astro components, and Portable Text support are needed
- standard or sandboxed plugins for narrower third-party provider integrations
- `ctx.*`-oriented thinking rather than assuming a traditional monolith
- awareness of Worker constraints and the limits of sandboxed plugin execution

That platform fit is important, because EmDash’s current plugin model distinguishes sharply between trusted/native capabilities and sandboxed marketplace-style plugins.

---

## What is still weak or incomplete

## 1. The architecture is ahead of the code by a wide margin

This is the biggest truth about the current project.

The documents are detailed and increasingly mature. The actual commerce package is still a **small kernel scaffold** with:

- error metadata subset
- idempotency key validation
- rate-limit helper
- provider HTTP policy constants
- a narrow finalization decision helper
- tests around those helpers

That means the project has **not yet earned confidence through execution pressure**.

The architecture may be right. It may also still contain hidden awkwardness that only appears once the first real checkout, webhook, and finalize path are implemented.

## 2. Some important architecture-to-code mismatches already exist

These are not fatal, but they are signals.

### A. Error-code naming is inconsistent

At the time of this historical evaluation pass, the architecture document said error codes should be stable **snake_case strings**, but `src/kernel/errors.ts` exported uppercase internal keys.
+ +- `WEBHOOK_REPLAY_DETECTED` +- `PAYMENT_ALREADY_PROCESSED` +- `ORDER_STATE_CONFLICT` + +That mismatch was corrected in the later zero-legacy hardening pass; subsequent sections and current runbooks track the updated status. + +### B. Rate-limit terminology is inconsistent + +The architecture talks about **KV sliding-window** rate limits, but `rate-limit-window.ts` implements a **fixed-window counter**. + +A fixed window may be perfectly acceptable for v1. In the current pass, this behavior is treated as explicit and documented in the runtime and review notes. + +### C. Finalization logic is still narrower than the architecture promises + +`decidePaymentFinalize()` is useful, but it is still just a minimal guard. It does not yet embody the full architecture around: + +- auth vs capture flows +- payment status transitions +- inventory version mismatch handling +- duplicate-but-not-processed webhook states +- gateway event ordering +- conflict escalation path +- refund/void decision coupling + +That was normal for an early scaffold, but the hard logic path has now moved forward into the verified v1 kernel path and the zero-legacy progress updates. + +## 3. The system has not yet proven its storage mutation model + +The architecture rightly leans on: + +- inventoryVersion +- ledger writes +- unique webhook receipts +- idempotency keys +- one finalization path + +But the project has not yet shown the actual mutation choreography inside EmDash storage. + +This is where the next real risk lives. + +The key unanswered implementation question is not whether the design _sounds_ correct. It is whether the storage layer can enforce the design in a way that is: + +- deterministic +- race-safe enough for the chosen concurrency assumptions +- easy to reason about in code review +- easy to test with duplicate delivery and near-simultaneous purchase attempts + +Until that exists, the architecture remains a strong hypothesis. + +--- + +## Deep evaluation by area + +## 1. 
Architecture quality + +### Rating: 8.5/10 + +The architecture is good. + +Its strongest ideas are: + +- a real commerce kernel instead of UI-first feature assembly +- avoiding WooCommerce’s mutable extension model +- treating payments/inventory/orders as the backbone +- keeping extension points narrow +- embedding snapshots into orders +- using append-only audit surfaces where possible + +Its biggest remaining risk is not “bad architecture.” It is **too much architectural confidence before a real payment slice proves the seams**. + +That means the answer is not to simplify the architecture dramatically. The answer is to **validate it aggressively with one real flow before broadening scope**. + +--- + +## 2. Phasing and delivery strategy + +### Rating: 8.5/10 + +The revised phasing is much better than the earlier concept. + +Kernel first, then one Stripe slice, then hardening, then a second gateway is the correct order. + +The only caution I would add is this: + +> once the Stripe vertical slice begins, do not let surrounding admin/storefront polish grow faster than the finalization path and test harness. + +That is the easiest way for a commerce project to look like it is progressing while the dangerous core remains under-tested. + +--- + +## 3. Provider model + +### Rating: 8/10 + +The current provider model is coherent enough. + +The move away from HTTP-first internal delegation is correct. First-party providers should behave like local adapters unless the sandbox boundary genuinely forces route-based isolation. + +That said, the provider model will not be truly proven until the second gateway lands. + +Stripe alone can flatter an abstraction. + +Authorize.net or another auth/capture-oriented gateway is what will reveal whether the contract is really shaped correctly. + +So the current provider architecture is good, but still provisional in practice. + +--- + +## 4. Data model + +### Rating: 8.8/10 + +The data model is one of the strongest parts of the project. 
+ +The following choices are especially strong: + +- product type discrimination +- separate variants +- attribute modeling +- inventory ledger +- order snapshots +- payment attempts +- webhook receipts +- idempotency key persistence +- order events +- cart merge rules + +My main caution is that the model should resist becoming too permissive through `meta` blobs and loosely governed `typeData` growth. + +The architecture remains strong only if: + +- `typeData` is tightly validated by product type +- bundle semantics do not leak into generic line items sloppily +- extension metadata stays namespaced and non-authoritative for core logic + +--- + +## 5. Code quality of what exists today + +### Rating: 7.5/10 for the current scaffold + +For what it is, the code is clean and sane. + +Good signs: + +- pure helpers +- small, testable functions +- narrow responsibilities +- no premature framework sprawl in the kernel +- tests exist already +- constants and limits are separated + +What keeps the score lower is simply scope: the hardest code does not yet exist. + +The project is still before the phase where the true design quality becomes visible in implementation. + +--- + +## Most important project-level recommendations + +## 1. Freeze the semantics that already leaked into code + +Before broader implementation continues, normalize these: + +- canonical error code format +- final naming of order/payment/cart states +- fixed-window vs sliding-window limit policy +- idempotency response replay shape +- webhook receipt statuses +- inventory conflict result semantics +- what exactly counts as “finalizable” + +Do this now, not after Stripe lands. + +## 2. 
Treat the storage adapter as the next critical deliverable + +The next big milestone should not just be “Stripe integration.” + +It should be: + +> **a storage-backed finalization path that proves the architecture can actually enforce its own invariants** + +That means implementing and testing: + +- order creation +- payment attempt persistence +- webhook receipt insertion / dedupe +- inventory version checks +- ledger write + materialized stock update +- idempotent finalize completion +- conflict path handling + +## 3. Keep the first live product type brutally narrow + +For the first end-to-end slice, support: + +- simple product +- maybe variable product only if necessary to prove attribute/variant handling + +Do not let bundles, gift cards, subscriptions, advanced discounting, or rich addon logic creep into the first transaction slice. + +## 4. Add a “resolved purchasable unit” concept before bundles get serious + +This matters for your bundle requirement. + +At checkout/finalization time, the system should resolve every purchasable thing into a normalized unit that the inventory and order snapshot layers can reason about consistently. + +That likely means a normalized structure along the lines of: + +- productId +- variantId +- sku +- qty +- unitPrice +- inventoryMode +- bundleComponent metadata if applicable + +This can stay internal. But without a normalized resolved-unit concept, advanced bundles become messy fast. + +--- + +## Evaluation of the two WooCommerce-style features you need + +## Feature 1 — Variant swatches with uploaded visual swatches instead of only dropdowns + +## Verdict + +**The current architecture is aligned with this feature, but the current data model is only partially complete for it.** + +### Why I say that + +The architecture already has a proper concept of product attributes and explicitly includes attribute display modes such as: + +- `select` +- `color_swatch` +- `button` + +That is a very good start. 
+ +This means the architecture already understands that variant selection is not just raw dropdown data — it includes presentation metadata. That is exactly the right foundation. + +### What is missing + +Right now the model appears to support **color value swatches** via a term field like `color`, but not clearly **uploaded image swatches**. + +For the use case you described, you will likely want the attribute-term model to support something like: + +```ts +interface ProductAttributeTerm { + label: string; + value: string; + sortOrder: number; + color?: string; + swatchMediaId?: string; + swatchAlt?: string; +} +``` + +And possibly broaden `displayType` to: + +- `select` +- `button` +- `color_swatch` +- `image_swatch` + +### My recommendation + +Add image swatches as a **small, explicit extension** of the attribute model, not as generic metadata. + +That means: + +- keep swatches attached to attribute terms +- reference uploaded media via `mediaId` +- let the storefront components choose the rendering based on `displayType` +- let admin manage swatch media in the attribute editor +- make variant resolution depend on term values, not on the UI widget type + +### Complexity and risk + +- **Complexity:** low to moderate +- **Architectural risk:** low +- **Best timing:** after variable products are working in the first usable storefront/admin pass + +### Bottom line + +This feature is **well-aligned** with the current architecture and should be **easy to add cleanly**, provided the term model is extended deliberately for uploaded image swatches. + +--- + +## Feature 2 — Product bundles composed of multiple SKUs/products, with variable products inside the bundle and optional add-ons + +## Verdict + +**The current architecture is directionally aligned with bundles, but it is not yet fully modeled for the bundle behavior you actually want.** + +This is the more important and more difficult feature. 
+ +### What is already good + +The architecture already includes: + +- a `bundle` product type +- bundle `items` +- `productId` +- optional `variantId` +- quantity +- optional price override +- pricing mode concepts + +That proves the system is already thinking in the right direction. + +### Where the current model falls short + +Your real requirement is more advanced than a static bundle. + +You want all of the following: + +1. a bundle made up of multiple products/SKUs +2. some component products may be **variable products** +3. the shopper may need to **choose the variant** for those bundle components +4. some components may be **optional add-ons** +5. those add-ons may themselves have variant choices +6. the order/inventory system still needs a clean resolved snapshot at checkout + +The current bundle shape in the architecture is not yet rich enough for that. + +It currently reads more like: + +- bundle contains fixed items +- maybe one fixed variant per item +- maybe pricing adjustments + +That is fine for a simple starter bundle model, but not enough for configurable bundle composition. + +### What the data model needs instead + +I would evolve bundle modeling toward **bundle components** rather than just bundle items. + +Something more like: + +```ts +interface BundleComponent { + id: string; + productId: string; + required: boolean; + defaultIncluded: boolean; + minQty: number; + maxQty: number; + allowCustomerQtyChange: boolean; + selectionMode: "fixed_variant" | "choose_variant" | "simple_only"; + fixedVariantId?: string; + allowedVariantIds?: string[]; + addonPricingMode?: "included" | "fixed" | "delta"; + addonPrice?: number; +} +``` + +And then the shopper’s actual cart line for the bundle would need a **resolved selection payload** recording which components and variants were chosen. + +### Architectural implication + +The key is this: + +> A bundle should not remain an abstract product at finalization time. 
+ +Before pricing, inventory decrement, and order snapshotting complete, the bundle needs to be resolved into explicit component purchases. + +That does **not** mean you must expose separate visible cart lines to the shopper. It means the backend needs a normalized resolved representation. + +### How this affects inventory + +This is where the current architecture can support the feature, but only if implemented carefully. + +Inventory must be checked and finalized against the actual resolved components: + +- bundle parent may or may not have its own SKU +- component stock must be checked +- chosen component variants must be checked individually +- optional add-ons must become explicit resolved lines +- order snapshot must preserve both: + - the shopper-facing bundle structure + - the fulfillment/accounting-facing component resolution + +### My recommendation + +Treat bundle support in two levels: + +#### Level 1 — simple bundles + +- fixed components +- optional fixed add-ons +- no customer variant choice inside bundle, or very limited variant choice + +#### Level 2 — configurable bundles + +- customer chooses variants for component products +- optional add-ons +- per-component quantity rules +- full resolved-component snapshot in order data + +That lets the project land bundles incrementally without corrupting the underlying order and inventory model. + +### Complexity and risk + +- **Complexity:** moderate to high +- **Architectural risk:** moderate +- **Best timing:** after the first simple/variable product checkout path is stable + +### Bottom line + +This feature is **possible within the current architecture**, but it is **not yet fully modeled**. + +So the honest answer is: + +> **Yes, the architecture makes it possible. No, the current bundle schema is not yet sufficient for your actual requirement.** + +It needs a more explicit bundle-component design before implementation starts. 
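To make the Level 2 requirement concrete, a resolver along these lines could turn a shopper's bundle configuration into the normalized component representation described above. This is a sketch only; `ComponentRule`, `BundleSelection`, and `ResolvedComponentLine` are illustrative names, not the project's actual types.

```ts
// Illustrative shapes only; the project's real bundle types may differ.
interface ComponentRule {
  id: string;
  productId: string;
  required: boolean;
  minQty: number;
  maxQty: number;
  fixedVariantId?: string;
  allowedVariantIds?: string[];
}

interface BundleSelection {
  componentId: string;
  included: boolean;
  quantity: number;
  variantId?: string;
}

interface ResolvedComponentLine {
  componentId: string;
  productId: string;
  quantity: number;
  variantId?: string;
}

// Resolve a shopper's bundle configuration into explicit component lines,
// rejecting selections that violate the component rules. The output is the
// normalized representation that inventory checks and order snapshots use.
function resolveBundleSelection(
  rules: ComponentRule[],
  selections: BundleSelection[],
): ResolvedComponentLine[] {
  const byComponent = new Map(selections.map((s) => [s.componentId, s]));
  const lines: ResolvedComponentLine[] = [];
  for (const rule of rules) {
    const sel = byComponent.get(rule.id);
    if (!sel || !sel.included) {
      if (rule.required) throw new Error(`missing required component ${rule.id}`);
      continue; // optional add-on the shopper left out
    }
    if (sel.quantity < rule.minQty || sel.quantity > rule.maxQty) {
      throw new Error(`quantity out of range for component ${rule.id}`);
    }
    // A fixed variant always wins; otherwise the shopper's choice must be allowed.
    const variantId = rule.fixedVariantId ?? sel.variantId;
    if (rule.allowedVariantIds) {
      if (!variantId || !rule.allowedVariantIds.includes(variantId)) {
        throw new Error(`variant not allowed for component ${rule.id}`);
      }
    }
    lines.push({ componentId: rule.id, productId: rule.productId, quantity: sel.quantity, variantId });
  }
  return lines;
}
```

In this shape, checkout would run the resolver before any availability check, so stock is always validated against the resolved component SKUs rather than the bundle parent.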
+ +--- + +## Final verdict on feature-fit + +## Swatches + +- **Fit with current architecture:** strong +- **Effort to add cleanly:** low to moderate +- **Confidence:** high + +## Configurable bundles with variants and optional add-ons + +- **Fit with current architecture:** moderate to strong +- **Effort to add cleanly:** moderate to high +- **Confidence:** medium +- **Important caveat:** requires a richer bundle model before implementation + +--- + +## What I would tell the developer to do next + +## Priority 1 — prove the commerce core + +Implement the first real vertical slice: + +- simple product +- cart +- checkout +- Stripe session/payment +- webhook +- finalizePayment +- ledger write +- order snapshot +- admin order view +- replay/conflict tests + +## Priority 2 — make variable products real + +Before swatches or advanced bundles, prove: + +- product attributes +- variant selection +- variant availability +- variant snapshotting into order lines +- inventory version checks on variants + +## Priority 3 — add image swatches + +Once variable products are real: + +- extend attribute term schema with swatch media +- build attribute/admin UI for uploaded swatches +- render image swatches in storefront component library +- keep resolution logic independent of widget type + +## Priority 4 — redesign bundle schema before implementing advanced bundles + +Do not start coding advanced bundles from the current `BundleTypeData` alone. + +First write a more explicit schema for: + +- bundle components +- required vs optional +- variant selection rules +- quantity rules +- pricing behavior +- resolved component snapshot format + +Then implement simple bundles first, configurable bundles second. + +--- + +## My final judgment in plain language + +This project is **on the right path**. + +It is not done. It is not yet proven. 
But it is pointed in a much better direction than a direct WooCommerce clone, and it now has enough architectural discipline that it is worth continuing. + +For your two specific WooCommerce-driven needs: + +- **swatches:** yes, this architecture supports them well +- **advanced bundles:** yes in principle, but the model needs to be extended before implementation + +So my final position is: + +> **Proceed. Keep the current overall architecture. Do not broaden scope yet. Prove the core. Add image swatches soon after variable products. Redesign bundle modeling before implementing configurable bundles with optional add-ons.** diff --git a/emdash-commerce-external-review-update-latest.md b/emdash-commerce-external-review-update-latest.md new file mode 100644 index 000000000..6e127d7c5 --- /dev/null +++ b/emdash-commerce-external-review-update-latest.md @@ -0,0 +1,223 @@ +# EmDash Commerce External Review Update + +## Review scope + +This memo reflects a review of the latest iteration contained in: + +- `abb1d36-review.zip` + +It updates the prior external-review memo based on the most recent code changes and handoff materials. + +--- + +## Executive summary + +This is a **materially better iteration**. + +The most important prior concern — that bundles looked more complete as a catalog concept than as a transactional commerce concept — now appears **substantially addressed**. + +The current code and handoff strongly suggest that bundle behavior is now integrated much more deeply into the transaction path, including: + +- bundle-aware stock validation during cart/checkout, +- finalize-time expansion of bundle lines into component inventory mutations, +- stronger bundle-aware snapshot handling. + +That is the biggest improvement in this version. + +At the same time, this iteration is **not fully polished yet**. The handoff still points to: + +- failing catalog tests, +- lint violations, +- and some remaining domain-hardening work. 
+ +So the overall review changes from: + +> “good direction, but bundles are not yet transaction-complete” + +to: + +> **“good direction, with bundle transaction integration now materially improved; remaining concerns are mostly polish and completeness rather than architectural direction.”** + +--- + +## Overall verdict + +**Current state: stronger and more complete.** + +I do **not** see new architectural red flags in this iteration. + +Instead, I see meaningful improvements in the areas that mattered most: + +- bundle transaction semantics, +- catalog domain validation, +- snapshot/type discipline, +- small but important cleanup work. + +The remaining concerns are no longer primarily architectural. They are now more about: + +- implementation completeness, +- test cleanliness, +- lint hygiene, +- and remaining schema coverage gaps relative to the full target spec. + +--- + +## What improved materially + +### 1. Bundle transaction behavior looks much closer to correct + +This is the most important upgrade. + +The latest iteration appears to support bundles much more appropriately across the commerce flow, not just at the catalog layer. + +The review materials strongly suggest the following are now present: + +- bundle stock validation during cart/checkout, +- finalize-time expansion of bundle lines into component SKU inventory mutations, +- snapshot support that carries bundle component inventory context. + +That is the right direction and substantially closes the biggest gap from the previous review. + +The system now looks much closer to supporting bundles as both: + +- a catalog concept, and +- a transaction/inventory concept. + +That is a major improvement. + +### 2. Catalog invariants are tighter + +This version also improves domain validation in several practical ways. 
+ +The changes appear to include: + +- explicit slug uniqueness checking on product update, +- explicit SKU code uniqueness checking on SKU update, +- explicit validation that bundle discount fields only apply to bundle products, +- stronger defaulting behavior in create flows. + +These are all worthwhile improvements. They reduce reliance on storage-layer uniqueness failures and improve correctness at the domain level. + +### 3. Asset unlink normalization appears improved + +This was a smaller issue in earlier review passes. + +The latest changes suggest asset unlink operations now re-normalize sibling positions after deletion. That is a good cleanup and gives the media-link layer more predictable behavior. + +### 4. Snapshot and typing discipline improved + +There are several signs of better implementation maturity here: + +- `CheckoutResponse` now appears explicitly typed, +- replay integrity tests appear tighter, +- snapshot helpers use more deterministic ordering behavior, +- bundle snapshot handling appears more deliberate. + +Taken together, these changes make the system feel more intentional and less ad hoc. + +--- + +## Remaining concerns + +### 1. The codebase still does not look fully “clean” + +This is the biggest remaining practical concern. + +The new handoff explicitly states there are still: + +- failing catalog tests, +- lint violations, +- open domain-hardening work. + +That matters. + +Even when the architecture is improving, unresolved test failures and lint debt reduce confidence in the current implementation state. + +So while I would describe the **direction** as strong, I would **not** yet describe the current iteration as fully solid until: + +- catalog test failures are resolved, +- lint issues in touched files are cleaned up, +- remaining domain gaps are either implemented or explicitly deferred. + +### 2. 
SKU schema still appears partial versus the full target spec + +The handoff still points to missing or incomplete areas such as: + +- inventory mode, +- backorder behavior, +- weight and dimensions, +- tax class, +- archived SKU behavior. + +That means the current implementation is progressing well, but it is still best described as a **good staged implementation**, not full parity with the broader product-catalog spec. + +That is acceptable if intentional, but it should be described honestly. + +### 3. Low-stock behavior is only partly improved + +The latest iteration appears to move low-stock counting to use `COMMERCE_LIMITS.lowStockThreshold`, which is structurally better than a hardcoded check. + +However, if the threshold is currently set to `0`, then the practical behavior is still closer to “out of stock” than true “low stock.” + +That is a useful structural step, but not a finished feature. + +### 4. Bundle component ordering deserves one more careful check + +One subtle point still worth reviewing: + +`normalizeBundleComponentPositions()` appears to assign positions based on the current array order, rather than explicitly sorting first. + +That may be completely fine if every caller already passes a correctly ordered array. But if any caller passes unsorted data, position stability could become inconsistent. + +I do not see enough evidence to call this a confirmed bug, but it is worth checking before calling the bundle layer fully polished. + +--- + +## Updated practical assessment + +Here is the concise version: + +- **Architecture:** strong +- **Bundle transaction integration:** materially improved +- **Catalog domain validation:** improved +- **Snapshot/order-history direction:** strong +- **Overall polish:** still incomplete due to test failures, lint debt, and partial SKU schema coverage + +That is a much better place to be than the previous review state. 
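On point 4 above: the cheapest way to remove that ambiguity is to sort before assigning positions. A hedged sketch, assuming components carry a numeric `position` field (the real function signature and row type may differ):

```ts
// Illustrative shape; the project's actual component row type may differ.
interface BundleComponentRow {
  id: string;
  position: number;
}

// Sort by stored position (tie-break on id for determinism), then reassign
// contiguous positions. Callers can then pass rows in any order without
// destabilizing the persisted ordering.
function normalizeBundleComponentPositions(rows: BundleComponentRow[]): BundleComponentRow[] {
  return [...rows]
    .sort((a, b) => a.position - b.position || a.id.localeCompare(b.id))
    .map((row, index) => ({ ...row, position: index }));
}
```

If every caller already passes sorted arrays, this changes nothing; it simply makes the invariant independent of caller discipline.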
+ +--- + +## Recommended next steps + +The next steps should focus less on architecture and more on closure: + +1. resolve failing catalog tests, +2. clean lint issues in touched files, +3. finish or explicitly defer remaining SKU schema fields, +4. verify bundle component ordering/normalization behavior, +5. keep bundle purchase and replay paths heavily integration-tested. + +At this point, the right move is not broad redesign. It is disciplined completion and cleanup. + +--- + +## Bottom line + +**Current state: better, and credibly better.** + +The most important prior concern appears materially reduced: + +> **Bundles now look much closer to being supported as both a catalog concept and a transaction/inventory concept.** + +That is a meaningful improvement. + +The main remaining concerns are now: + +- implementation polish, +- test cleanliness, +- lint hygiene, +- and still-partial SKU schema coverage versus the full specification. + +So the updated review is: + +> **This version materially improves the prior state. The bundle integration gap is much smaller, and the remaining issues are mostly completeness and polish rather than architectural direction.** diff --git a/emdash-commerce-external-review-update.md b/emdash-commerce-external-review-update.md new file mode 100644 index 000000000..fa162278b --- /dev/null +++ b/emdash-commerce-external-review-update.md @@ -0,0 +1,242 @@ +# EmDash Commerce External Review Update + +## Review scope + +This memo reflects a review of the current iteration contained in: + +- `emDash-review-for-external-review.zip` + +It is an update to the prior external-review posture, focused on the latest state of the catalog implementation and its integration with the existing commerce kernel. + +--- + +## Executive summary + +This is a **stronger iteration** than the prior version. 
+ +The catalog layer now has real substance: + +- immutable-field rules are in place, +- variable-product invariants are materially better, +- shared domain helpers are cleaner, +- snapshot logic is better separated from handler code. + +The main remaining issue is this: + +> **Bundles appear to be more complete as a catalog concept than as a transactional commerce concept.** + +In other words, bundle creation, storage, pricing, and derived availability are advancing well, but checkout/finalization still appears too dependent on direct line-item inventory records rather than derived component inventory behavior. + +That is the most important thing that calls for an updated review. + +--- + +## Overall verdict + +**Current state: good, with meaningful architectural improvement.** + +I do **not** see new architectural chaos or obvious structural regression. + +The codebase is improving in the right ways: + +- less sloppy mutation behavior, +- better domain separation, +- stronger invariant enforcement, +- better groundwork for product snapshots and future catalog growth. + +But I would **not** yet describe the bundle implementation as fully end-to-end complete. + +--- + +## What improved materially + +### 1. Handler coupling is better + +Earlier concern around handler-to-handler coupling appears improved. + +The code now uses shared domain helpers such as: + +- `lib/catalog-domain.ts` +- `lib/catalog-variants.ts` +- `lib/catalog-bundles.ts` +- `lib/catalog-order-snapshots.ts` + +This is the right direction. + +It keeps handlers thinner and reduces the risk of circular or muddled handler responsibilities. + +### 2. Immutable-field discipline is now present + +This is a meaningful improvement. 
+ +The current catalog-domain layer protects important immutable fields such as: + +- product `id` +- product `type` +- product `createdAt` +- SKU `id` +- SKU `productId` +- SKU `createdAt` + +That is much safer than loose merge-on-write behavior and better matches a commerce-grade data model. + +### 3. Variable-product invariants are reasonably solid + +This part now looks genuinely decent. + +The variable-product validation logic appears to enforce: + +- exact option count, +- only variant-defining attributes, +- no duplicate attribute assignment, +- no missing attribute values, +- no duplicate variant combinations. + +That is one of the strongest areas of the current implementation. + +### 4. Snapshot logic was extracted into a better place + +This is also a good improvement. + +Moving snapshot assembly into a shared helper such as `lib/catalog-order-snapshots.ts` is the correct design move. It keeps checkout code narrower and makes the historical-order strategy more explicit and maintainable. + +--- + +## Main issue requiring updated review + +## Bundles appear catalog-complete before they are transaction-complete + +This is the biggest issue in the current iteration. + +The code now appears to support bundle catalog behavior reasonably well: + +- bundle entities exist, +- bundle component management exists, +- derived pricing exists, +- derived availability exists. + +That is all good. + +However, checkout/finalization still appears to validate stock in a way that assumes a direct inventory row for each line item. If bundle products do **not** own independent inventory, then the transaction path must not require bundle-owned stock rows. + +### Why this matters + +Your own stated model is: + +- bundles do **not** have independent inventory, +- bundle availability is derived from component SKUs, +- successful purchase of a bundle should decrement component inventory, not bundle inventory. 
+ +If checkout is still trying to validate line-item inventory directly against a bundle row, then one of two things is true: + +1. bundle purchases will fail incorrectly, or +2. fake bundle inventory rows are being used, which would violate the intended model. + +Either way, the model is not fully closed yet. + +### What should happen next + +Before bundle support is considered fully complete, the transaction core should explicitly support bundle lines by doing all of the following: + +- recognize bundle products in cart/checkout, +- validate stock against component SKUs, +- decrement component inventory on successful finalize, +- avoid requiring bundle-owned inventory rows. + +This is the main gap I would want fixed next. + +--- + +## Secondary concerns + +### 1. Update flows appear a little too dependent on storage-layer uniqueness + +This is not a deep flaw, but it is still worth tightening. + +Examples of what should be validated explicitly at the domain layer: + +- slug uniqueness on product update, +- SKU code uniqueness on SKU update, +- bundle discount field validity only for bundle products. + +Storage-level uniqueness is useful, but domain-level validation gives better correctness and much better admin/operator errors. + +### 2. Current SKU model still looks narrower than the full spec + +The current implementation appears staged, which is fine. But it still looks thinner than the full target schema in several areas. + +Examples that may still be missing or only partially implemented: + +- inventory mode (`tracked` vs `not_tracked`) +- backorder flag +- weight and dimensions +- tax class at SKU level +- archived SKU status beyond `active | inactive` + +That does **not** make the work bad. It just means this is best described as a **good staged implementation**, not yet full schema parity with the broader v1 catalog specification. + +### 3. Snapshot representation is ahead of some underlying bundle operations + +The snapshot system is structurally good. 
+ +But because bundle stock and finalize semantics do not yet appear fully integrated, bundle snapshot handling currently looks stronger than the underlying transactional behavior for that same product type. + +That is a sequencing issue, not a design collapse, but it is still worth calling out. + +--- + +## Smaller notes + +These are smaller observations, but still useful: + +- asset unlink/reorder behavior should keep sibling positions normalized, +- low-stock logic should not simply mean `inventoryQuantity <= 0` if the intent is truly “low stock,” +- bundle discount fields should be constrained clearly to bundle products, +- read-style operations using post-style handler semantics are acceptable internally, but still a little awkward if judged as public API design. + +--- + +## Updated practical verdict + +The current codebase is stronger than the previous iteration. + +I would describe it this way: + +**The catalog architecture is now materially more credible. Immutable-field rules, variable-option invariants, shared domain helpers, and extracted order snapshot logic all improve the structure of the system.** + +But I would also say: + +**The bundle model still appears only partially integrated into the transaction core. Catalog support is ahead of checkout/finalization support, because bundle availability and stock ownership are derived from component SKUs while the transaction path still appears too dependent on direct line-item inventory rows.** + +That is the main outstanding concern. + +--- + +## Recommended next step + +The next priority should be: + +## Make bundles transaction-complete + +Specifically: + +1. teach checkout/cart validation how to handle bundle lines using component SKU stock, +2. teach finalization how to decrement bundle component inventory, +3. ensure no bundle-owned stock rows are required, +4. add integration tests for bundle purchase success/failure paths. 
+ +Once that is done, the catalog work will feel much more end-to-end complete. + +--- + +## Bottom line + +**Current state: good, but not fully closed.** + +I do not see new architectural red flags. + +The most important update to the external review is: + +> **Bundles are implemented faster as a catalog concept than as a transactional commerce concept.** + +That is the main gap I would fix next. diff --git a/emdash-commerce-final-review-plan.md b/emdash-commerce-final-review-plan.md new file mode 100644 index 000000000..2a6ec7c9b --- /dev/null +++ b/emdash-commerce-final-review-plan.md @@ -0,0 +1,673 @@ +# EmDash Commerce Plugin — Final Review Direction and Implementation Plan + +## Purpose + +This document is the final direction for the EmDash commerce project after reviewing: + +- `3rdpary_review.md` +- `commerce-plugin-architecture.md` +- `high-level-plan.md` +- `skills/creating-plugins/SKILL.md` +- the bundled Forms plugin reference files + +It is written as a practical handoff for the current developer. The goal is not to restart the project. The goal is to sharpen the foundation now, before implementation choices calcify. + +## Progress checkpoint (2026-04-03) + +### What has been completed since this direction was defined + +- `packages/plugins/commerce/src` now includes a closed-loop payment finalization path with: + - webhook dedupe receipts + - payment attempt persistence + - inventory version checks + - ledger writes + - idempotent completion and replay responses +- Possession is enforced for cart reads/mutations (`ownerToken` + hash) and order readback (`finalizeToken` + hash). +- Legacy compatibility paths are removed from active runtime flows; token hashes are required in stored domain types. 
+- Package checks are currently green: + - `pnpm --filter @emdash-cms/plugin-commerce test` + - `pnpm --filter @emdash-cms/plugin-commerce typecheck` + - `pnpm test` for the workspace + +### What remains intentionally deferred + +- broader provider abstraction (one provider path remains v1 target), +- taxes/shipping/discount breadth, +- MCP surfaces and broader AI tooling, +- advanced bundle/product abstractions beyond minimal v1 scope. + +--- + +## Executive verdict + +The project is on a **promising path** and the current architecture shows strong judgment in several key areas: + +- EmDash-native commerce is the right framing. +- Typed contracts are the right answer to WooCommerce-style hook chaos. +- Headless Astro storefronts are the right default. +- Orders should be snapshots, not live joins into mutable catalog state. +- Inventory, payments, and order finalization should be treated as the real core. +- Designing for AI-readable and machine-usable operations is a good long-term choice. + +However, I do **not** recommend proceeding unchanged. + +The current plan is directionally strong, but it still risks being: + +- a little too abstract too early, +- slightly too HTTP-centric internally, +- too broad in surface area for v1, +- and not explicit enough yet on state machines, idempotency, and finalization correctness. + +So the correct move is: + +> **Keep the core philosophy. Tighten the boundaries. Shrink the first executable slice. Freeze the dangerous semantics now.** + +--- + +## Final recommendation in one sentence + +Build this as a **small, correctness-first commerce kernel with one brutally real end-to-end slice**, and delay formal complexity until it is justified by real pressure. + +--- + +## What should remain from the current plan + +These parts are sound and should remain in place. + +### 1. 
EmDash-native commerce, not WooCommerce mimicry + +Do not reproduce: + +- WordPress theme coupling +- mutable global hooks +- template override sprawl +- inheritance-heavy product logic +- extension-by-side-effect + +That is exactly the trap this project should avoid. + +### 2. Typed contracts over loose extensibility + +The architecture should stay contract-driven. Provider integrations should be typed, explicit, versioned, and narrow. + +### 3. Products as discriminated unions + +`type` + `typeData` is the correct direction. It is materially better than invasive inheritance trees. + +### 4. Orders as immutable snapshots + +Orders should embed commercial facts captured at checkout time. Do not make historical order integrity depend on live product rows. + +### 5. Shipping and tax outside the kernel + +Do not let shipping/tax complexity contaminate the first kernel. Keep them modular. + +### 6. Durable logged-in carts + +The logged-in durable-cart direction is correct, provided merge rules are explicitly defined. + +--- + +## Where the current plan should change + +## 1. Do not make internal HTTP delegation the default architectural boundary + +The current architecture leans toward a provider registry where the core calls provider routes over HTTP. The contract idea is good. The default execution model is not ideal. 
+ +### Why this should change + +Within EmDash, especially with sandbox and Cloudflare-style constraints, making internal extension boundaries look like network boundaries too early creates avoidable problems: + +- more failure modes +- more subrequest pressure +- more timeout and retry complexity +- harder local testing +- awkward trust/auth assumptions between plugins +- premature coupling to route mechanics instead of domain contracts + +### Recommended correction + +Keep the provider registry, but support **three execution modes** conceptually: + +- `local` — direct in-process contract implementation +- `internal` — route-mediated/internal adapter only where isolation is genuinely needed +- `external` — real provider/webhook/API boundary + +For v1, prefer this rule: + +> **All core provider integrations should behave as local adapters first.** +> External API calls should happen inside the provider adapter itself. +> Do not add route-mediated internal delegation unless a real need appears. + +This preserves the contract model without forcing faux-network architecture inside the system. + +--- + +## 2. Shrink v1 to a real vertical slice + +The strongest devil’s-advocate critique is valid: the project risks solving for year three before proving month one. + +### The v1 slice should prove only this + +A customer can: + +1. view a simple product, +2. add it to a cart, +3. start checkout, +4. pay through one real gateway, +5. create a correct order snapshot, +6. finalize inventory safely, +7. see the order in admin, +8. and recover correctly from expected failure cases. + +That is the minimum slice that proves the foundation. 
+ +### Therefore, v1 should exclude or defer + +- advanced bundle behavior +- rich analytics +- broad AI tooling +- MCP surfaces +- multiple storefront component families +- generalized fulfillment abstraction +- tax/shipping sophistication +- broad content block ecosystems +- aggressive event/platform generalization + +The right question for the first milestone is: + +> **Can this system survive a real purchase flow correctly and repeatedly?** + +If yes, then the architecture is earning its abstractions. + +--- + +## 3. Separate the architecture mentally now, even if code packaging stays simple initially + +I do recommend a conceptual split immediately, but not necessarily a heavy package split on day one. + +### Recommended conceptual layers + +#### Layer A — Commerce kernel + +Pure domain logic only: + +- product and variant domain rules +- cart logic +- pricing/totals +- order creation +- inventory transitions +- provider interfaces +- state transitions +- error codes +- idempotency model +- domain events + +No admin UI. No Astro. No React. No MCP. + +#### Layer B — EmDash plugin wrapper + +EmDash-specific glue: + +- plugin descriptor +- capabilities +- storage declarations +- routes +- config +- hook wiring + +#### Layer C — Admin UI + +Merchant-facing UI only. + +#### Layer D — Storefront UI + +Astro components and display primitives only. + +### Practical instruction + +For now, one repo and even one plugin package is acceptable if needed for speed. But the directories, imports, and tests must enforce these boundaries. + +Do **not** let kernel logic depend on admin/storefront concerns. + +--- + +## 4. Freeze the dangerous semantics before implementation expands + +There are a few areas where ambiguity is expensive. These must be explicitly written down before major coding continues. + +### A. Order state machine + +Define the allowed order states and transitions centrally. 
+ +Suggested initial order states: + +- `draft` +- `payment_pending` +- `paid` +- `processing` +- `fulfilled` +- `canceled` +- `refund_pending` +- `refunded` +- `payment_conflict` + +### B. Payment state machine + +Suggested initial payment states: + +- `requires_action` +- `pending` +- `authorized` +- `captured` +- `failed` +- `voided` +- `refund_pending` +- `refunded` +- `partial_refund` + +### C. Cart state machine + +Suggested initial cart states: + +- `active` +- `converted` +- `expired` +- `abandoned` +- `merged` + +Do not let handlers improvise transitions independently. + +--- + +## 5. Define inventory finalization precisely + +The existing payment-first inventory direction is defensible, but only if its concurrency behavior is explicit. + +### Recommended rule + +The system should not perform inventory decrement as a scattered side effect. There must be **one authoritative finalization path**. + +### Recommended flow + +1. `checkout.create` validates the cart and creates a `payment_pending` order snapshot. +2. A payment attempt record is created. +3. The gateway flow begins. +4. On confirmation/webhook/callback, the system calls a single finalization function. +5. Finalization: + - verifies idempotency, + - verifies order state, + - performs a final availability/version check, + - decrements inventory, + - marks order/payment states, + - records events, + - emits merchant/customer side effects after the transaction boundary. + +### If inventory changed before finalize + +The system must produce a specific, stable error/result path such as: + +- `inventory_changed` +- `insufficient_stock` +- `payment_conflict` + +And there must be a documented refund/void policy when payment succeeded but stock cannot be finalized. + +--- + +## 6. Add an inventory ledger now + +Do not rely only on mutating `stockQty`. + +Create an explicit inventory transaction log from the beginning. 
+ +Suggested fields: + +- `productId` +- `variantId` +- `delta` +- `reason` +- `actor` +- `referenceType` +- `referenceId` +- `createdAt` + +This will pay off later in reconciliation, debugging, reporting, and support. + +--- + +## 7. Freeze an error catalog early + +The project already values machine-readable errors. Good. Now formalize them. + +Suggested initial error catalog: + +- `inventory_changed` +- `insufficient_stock` +- `cart_expired` +- `product_unavailable` +- `variant_unavailable` +- `payment_initiation_failed` +- `payment_confirmation_failed` +- `payment_already_processed` +- `provider_unavailable` +- `shipping_required` +- `feature_not_enabled` +- `invalid_discount` +- `currency_mismatch` +- `order_state_conflict` +- `webhook_signature_invalid` +- `webhook_replay_detected` + +Every route should use a consistent structure for: + +- machine code +- human message +- HTTP status +- optional retryability flag +- optional structured details + +This is important for admin UX, storefront UX, AI tooling, and test reliability. + +--- + +## 8. Add idempotency and webhook handling as first-class design elements + +This is not a “later hardening” concern. It is part of the core. + +### Minimum required records + +- `paymentAttempts` +- `webhookReceipts` +- `idempotencyKeys` + +Suggested stored facts: + +- provider +- external request/event id +- order id +- status +- normalized payload reference or hash +- first seen timestamp +- processed timestamp + +The system must tolerate: + +- duplicate webhooks +- duplicate callbacks +- retried confirmations +- out-of-order provider events + +--- + +## 9. Be more opinionated about the product model, but keep v1 narrow + +The product model direction is good. The v1 feature set should still be narrow. 
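The `type` + `typeData` direction endorsed earlier can be sketched as a discriminated union; all field names here are illustrative, not the project's actual schema:

```ts
// Illustrative only; the project's real product schema will differ.
type Product =
  | { id: string; type: "simple"; typeData: { requiresShipping: boolean } }
  | { id: string; type: "variable"; typeData: { attributeIds: string[] } }
  | { id: string; type: "digital"; typeData: { assetId: string } };

// Switching on the discriminant keeps type-specific logic exhaustive and
// avoids inheritance-heavy product hierarchies.
function describeProduct(p: Product): string {
  switch (p.type) {
    case "simple":
      return "simple product";
    case "variable":
      return `variable product with ${p.typeData.attributeIds.length} attribute(s)`;
    case "digital":
      return `digital product delivering ${p.typeData.assetId}`;
  }
}
```

The compiler then enforces that every new product type added to the union is handled everywhere it matters.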
+ +### Recommended v1 support + +- simple products +- variable products only if truly necessary for the first slice +- digital as a small extension if trivial +- no heavy bundle semantics yet + +### Product/variant fields worth settling now + +#### Product + +- `merchantSku` optional +- `publishedAt` +- `requiresShipping` +- `taxCategory` +- `defaultVariantId` if variants exist +- denormalized `searchText` or equivalent + +#### Variant + +- normalized option values +- `active` +- `sortOrder` +- `priceOverride` +- `compareAtPriceOverride` +- `stockQty` +- `inventoryVersion` + +This is enough to avoid bad migrations later without opening too much scope now. + +--- + +## 10. Define customer identity and cart merge rules now + +Because logged-in durable carts are in scope, the merge semantics must be explicit. + +Write down: + +- whether guest checkout is allowed +- whether guest orders can later associate with a logged-in account by email +- what happens when a guest cart and user cart both exist on login +- whether line quantities merge, replace, or conflict +- what happens if merged items are no longer valid + +These rules should not emerge accidentally from implementation details. + +--- + +## 11. Promote observability to a mandatory workstream + +The commerce core needs operational clarity from the beginning. + +### Must-have observability + +- correlation id across checkout/payment/finalization flow +- order timeline or event stream +- provider call logs with redaction +- webhook receipt logging +- inventory mutation logging +- actor attribution (`customer`, `merchant`, `system`, `agent`) +- stable structured error payloads + +Do not postpone this until after the first gateway lands. It is part of making the first gateway safe to debug. 
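The error-catalog and structured-payload requirements above can be sketched as one small TypeScript contract. This is a hypothetical sketch, not an EmDash API: the codes are drawn from the suggested catalog, while the names `CommerceError`, `ERROR_DEFAULTS`, and `commerceError`, and the specific status/retryability choices, are illustrative assumptions.

```typescript
// Hypothetical sketch: one central mapping so every route emits the same
// structure (machine code, human message, HTTP status, retryability, details).
// Codes mirror the suggested catalog; all names and status choices are
// illustrative, not EmDash APIs.

type CommerceErrorCode =
  | "inventory_changed"
  | "insufficient_stock"
  | "cart_expired"
  | "product_unavailable"
  | "payment_already_processed"
  | "order_state_conflict"
  | "webhook_replay_detected";

interface CommerceError {
  code: CommerceErrorCode;            // machine code
  message: string;                    // human message
  httpStatus: number;                 // HTTP status
  retryable: boolean;                 // retryability flag
  details?: Record<string, unknown>;  // optional structured details
}

// One authoritative table instead of per-handler improvisation.
const ERROR_DEFAULTS: Record<CommerceErrorCode, { httpStatus: number; retryable: boolean }> = {
  inventory_changed:         { httpStatus: 409, retryable: true },
  insufficient_stock:        { httpStatus: 409, retryable: false },
  cart_expired:              { httpStatus: 410, retryable: false },
  product_unavailable:       { httpStatus: 404, retryable: false },
  payment_already_processed: { httpStatus: 409, retryable: false },
  order_state_conflict:      { httpStatus: 409, retryable: false },
  webhook_replay_detected:   { httpStatus: 200, retryable: false },
};

function commerceError(
  code: CommerceErrorCode,
  message: string,
  details?: Record<string, unknown>,
): CommerceError {
  return { code, message, ...ERROR_DEFAULTS[code], details };
}
```

Keeping the code-to-status mapping in one table is what makes admin UX, storefront UX, AI tooling, and tests able to branch on `code` alone rather than parsing messages.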
+ +--- + +## Final project shape I recommend + +## Principle + +**Keep the architecture strong, but prove it with the smallest real flow possible.** + +## Required approach + +- domain-first +- correctness-first +- small-scope +- explicit-state +- contract-driven +- low-magic +- test-first around dangerous transitions + +--- + +## Revised phased plan + +## Phase 0 — Architecture hardening + +This is the current highest-priority phase. + +The developer should produce or revise the architecture docs so that the following are explicit and unambiguous: + +1. order state machine +2. payment state machine +3. cart state machine +4. inventory finalization algorithm +5. provider execution model +6. idempotency model +7. webhook replay policy +8. error catalog +9. customer/cart merge rules +10. observability schema +11. compatibility/versioning policy for contracts and events + +This phase should end with a short, crisp architecture addendum. Not more sprawling prose. + +## Phase 1 — Minimal kernel implementation + +Implement only the smallest kernel required for a real purchase flow: + +- simple product model +- cart +- order snapshot creation +- totals +- payment attempt records +- inventory versioning +- inventory ledger +- idempotent finalization service +- error types +- domain event records + +No rich storefront library. No broad admin system. No AI/MCP work. + +## Phase 2 — One real vertical slice + +Build one full flow end to end: + +- product display +- add to cart +- cart view +- checkout start +- payment through one provider +- webhook/callback handling +- order finalize +- order visible in admin +- order timeline visible for debugging + +Use one gateway only in this phase. Stripe is a sensible choice. + +## Phase 3 — Hardening and test pressure + +Before expanding features, harden the first slice. 
+ +Required tests: + +- duplicate webhook +- retry after timeout +- inventory changed before finalize +- stale cart +- payment success plus inventory failure +- order finalization idempotency +- repeated callback replay +- cancellation/refund state transition guards + +If the architecture bends badly here, adjust it now. + +## Phase 4 — Second gateway to validate abstraction + +Add a second gateway only after the first path is solid. + +The point is not feature breadth. The point is testing whether the provider abstraction is actually correct. + +If Authorize.net causes awkward branching or leaky abstractions, fix the contract before adding more providers. + +## Phase 5 — Admin UX expansion + +Only after the core transaction path is stable: + +- better product editing +- order detail pages +- settings UI +- basic operational dashboards +- low-stock visibility + +## Phase 6 — Storefront and extension growth + +After correctness is proven: + +- richer Astro components +- optional content blocks +- additional product types +- shipping/tax modules +- fulfillment abstractions +- AI/MCP surfaces + +--- + +## Concrete instructions to the current developer + +### Do next + +1. Revise the architecture doc with the frozen semantics listed above. +2. Reduce the first milestone to one real end-to-end checkout path. +3. Treat provider integrations as local adapters first. +4. Implement one authoritative finalization path. +5. Add inventory ledger + payment/idempotency records immediately. +6. Keep kernel logic isolated from admin/storefront code. +7. Add tests around replay, concurrency, and state transitions before expanding features. 
+ +### Do not do yet + +- do not build wide provider ecosystems +- do not formalize marketplace/plugin breadth too early +- do not build MCP surfaces yet +- do not over-generalize analytics/events +- do not add broad bundle logic +- do not optimize prematurely for many execution paths + +### Watch for these anti-patterns + +- HTTP-shaped architecture where simple local contracts would do +- admin/storefront code importing kernel internals in uncontrolled ways +- `meta` fields turning into a junk drawer +- handler-specific state transition logic +- payment side effects happening outside the finalization boundary +- growing abstractions without a real second implementation forcing them + +--- + +## How I would rate the current project after this correction + +### Current direction + +Good. Promising. Worth continuing. + +### Current architectural maturity + +Not ready for broad implementation without one more tightening pass. + +### Overall verdict + +> **Proceed, but only after shrinking the first executable scope and freezing the risky semantics.** + +That is the best path to a durable commerce foundation on EmDash. + +--- + +## Acceptance criteria for the next review checkpoint + +Before broader implementation proceeds, the developer should be able to show: + +1. a revised architecture addendum covering the frozen semantics +2. a minimal kernel directory structure with clean boundaries +3. one implemented end-to-end simple-product checkout path +4. explicit state transition guards +5. idempotent payment finalization +6. webhook replay protection +7. inventory ledger records +8. structured errors with stable codes +9. tests covering duplicate finalize and stock-change failure cases +10. no unnecessary internal HTTP indirection in the core path + +If those are in place, the project is on a strong foundation. + +--- + +## Final note + +The existing plan has real strengths. This is not a teardown. It is a correction toward sharper execution. 
+ +The right outcome is not “more architecture.” +The right outcome is: + +- **fewer assumptions** +- **more explicit semantics** +- **one real, correct commerce flow** +- **and an architecture that earns its abstractions by surviving real pressure** diff --git a/emdash-commerce-product-catalog-v1-spec-updated.md b/emdash-commerce-product-catalog-v1-spec-updated.md new file mode 100644 index 000000000..3d1b93c2e --- /dev/null +++ b/emdash-commerce-product-catalog-v1-spec-updated.md @@ -0,0 +1,1362 @@ +# EmDash Commerce Product Catalog v1 Specification + +## Document purpose + +This document defines the **v1 product catalog schema and implementation plan** for the EmDash commerce plugin. It is written as a build-ready specification for the developer. The goal is to create a clean, durable product model that supports: + +- simple physical products, +- simple digital/downloadable products, +- variable products, +- fixed bundles composed of SKU-level components, +- mixed physical + digital fulfillment, +- product images and galleries, +- future-safe storage abstraction, +- order-line historical accuracy via snapshots. + +This spec is intentionally **practical, explicit, and staged**. It is designed to reduce ambiguity, prevent over-engineering, and give the developer a clear build order. + +--- + +## Core principles + +### 1. Sellable units must be modeled consistently + +Every product must have **one or more SKU records**. + +That means: + +- a **simple product** has exactly **one SKU** +- a **variable product** has **multiple SKU variants** +- a **bundle** is a **sellable record** whose components reference underlying SKU records + +Do **not** mix models where sometimes the product itself is purchasable and sometimes only variants are purchasable. That creates downstream complexity in inventory, pricing, order lines, and bundle composition. + +### 2. The product record is not the inventory record + +The product is the catalog/container record. 
+ +The SKU is the sellable unit and should own the fields that differ at the sellable level, such as: + +- SKU code +- price +- compare-at price +- cost (optional but recommended) +- inventory quantity +- barcode/GTIN/UPC +- weight and dimensions +- fulfillment behavior when SKU-specific +- variant option values + +### 3. Bundles must be SKU-derived, not stock-owned + +Bundles do **not** have independent inventory. + +Bundle stock must be derived from the availability of the component SKUs. When a bundle sells, inventory is decremented from the component SKU rows. + +### 4. Historical order accuracy must not depend on live catalog rows + +Orders must store **snapshots** of what was purchased at checkout time. + +Order lines may keep `product_id` or `sku_id` references for convenience, but those live references must **not** be treated as the authoritative historical record. + +### 5. Physical + digital should not always be modeled as a bundle + +A physical product may include access to one or more digital assets, such as: + +- a manual +- a PDF pattern +- setup instructions +- bonus download + +This should be supported through **digital entitlements / digital attachments** linked to the purchased SKU. Do not force every physical+digital combination into a formal bundle model. + +### 6. Storage must be abstracted + +For product images and digital files, do not bake in local filesystem assumptions. + +Store provider-neutral asset metadata so storage can move later from local disk to cloud/object storage with minimal schema churn. + +### 7. 
Align with EmDash's typed collections and media model + +This schema must align with EmDash's apparent platform model: + +- commerce entities such as products, SKUs, attributes, bundles, and category relationships should be modeled as **typed commerce collections/tables** +- images and downloadable files should be modeled as **media/file assets**, not as disguised product/content rows +- product-to-file relationships should be explicit links/references, not a WordPress-style "everything is one generic record" approach + +Practical rules for the developer: + +- do **not** model product images or downloads as generic product/content records +- do **not** make file storage paths the primary product-owned truth +- do **not** assume a WordPress-style universal `posts` table or attachment model + +Instead: + +- create explicit commerce entities for catalog data +- create explicit asset/media records for files +- link products/SKUs to assets through relation records +- keep file/storage metadata provider-neutral so local storage can later move to cloud storage with minimal redesign + +### 8. Upload flow must be asset-first, then product-linking + +The product/file flow should be designed as: + +1. create or upload media asset +2. receive asset/media identifier +3. link asset to product or SKU +4. use that relation in storefront/admin retrieval + +Do not design the catalog API around sending binary file payloads inside product create/update requests unless EmDash explicitly requires that later. The safer default is asset-first upload, then relational linking. + +--- + +## Supported v1 product capabilities + +The catalog must support the following: + +1. **Simple physical product** + - shipped to the customer + - one sellable SKU + - may have one or more product images + - may optionally include one or more digital entitlements + +2. 
**Simple digital/downloadable product** + - no shipping required + - one sellable SKU + - may reference one or more downloadable files + - may enforce download rules + +3. **Variable product** + - parent catalog/container product + - two or more SKU variants + - sellable unit is always the SKU variant + - variants may differ by options such as size, color, material + - variants may override image, price, inventory, and shipping characteristics + +4. **Fixed bundle product** + - customer purchases the bundle as one unit + - bundle is composed of one or more underlying SKU components + - components may reference: + - simple product SKUs + - variable product SKUs + - bundle price is derived from component prices + - optional bundle discount is supported: + - fixed dollar amount + - percentage + - bundle has no independent stock + - bundle stock availability is derived from component stock + +5. **Images/media** + - product-level primary image + - product-level gallery images + - variant-level image override + - image metadata stored via provider-neutral asset records + +--- + +## Non-goals for v1 + +The following are explicitly out of scope unless separately approved: + +- subscriptions +- configurable/customizable bundles chosen by customer +- marketplace / multi-vendor features +- multi-warehouse inventory +- customer-specific pricing +- advanced tax engine integration +- reviews/ratings +- coupons/promotions beyond bundle discount +- product kits with optional substitutions +- internationalized per-locale product copy +- faceted search engine design +- returns/RMA schema +- gift cards +- serial number/license-key issuance + +The schema should leave room for future growth, but these features should **not** drive v1 complexity. 
+ +--- + +## Domain model overview + +The v1 catalog should be modeled using the following primary entities: + +- `products` +- `product_skus` +- `product_attributes` +- `product_attribute_values` +- `product_sku_option_values` +- `product_assets` +- `product_asset_links` +- `digital_assets` +- `digital_entitlements` +- `bundle_components` +- `categories` +- `product_category_links` +- `product_tags` +- `product_tag_links` +- `order_line_snapshots` + +Some of these may be implemented as separate tables/collections, or as structured linked collections, depending on EmDash/D1 patterns. The important thing is that the conceptual boundaries remain intact. + +--- + +# 1. Entity specification + +## 1.1 `products` + +The `products` entity is the main catalog record. It is the storefront-facing/container record. + +### Required fields + +- `id` + - stable internal primary identifier +- `type` + - enum: + - `simple` + - `variable` + - `bundle` +- `status` + - enum: + - `draft` + - `active` + - `archived` +- `visibility` + - enum: + - `public` + - `hidden` +- `slug` + - unique storefront handle / URL key +- `title` +- `short_description` +- `long_description` +- `brand` + - nullable +- `vendor` + - nullable +- `featured` + - boolean +- `sort_order` + - integer +- `created_at` +- `updated_at` +- `published_at` + - nullable +- `archived_at` + - nullable + +### Recommended fields + +- `seo_title` +- `seo_description` +- `badge_text` + - e.g. `New`, `Limited`, `Best Seller` +- `requires_shipping_default` + - default for simple products or SKU fallback +- `tax_class_default` + - default for SKU fallback +- `metadata_json` + - tightly controlled extensibility field if needed + +### Rules + +- `slug` must be unique among non-deleted products. +- `variable` products act as catalog parents and must have 2+ SKU rows. +- `simple` products must have exactly 1 SKU row. 
+- `bundle` products should typically have 1 bundle sellable row if modeled as a purchasable product/SKU pair, but stock is derived from components. + +--- + +## 1.2 `product_skus` + +This is the most important commerce entity. Every purchasable unit must have a SKU record. + +### Required fields + +- `id` +- `product_id` +- `sku` + - unique merchant SKU code +- `status` + - enum: + - `active` + - `inactive` + - `archived` +- `title_override` + - nullable; optional label for variant/sellable display +- `currency` +- `price_minor` + - integer in minor currency unit +- `compare_at_price_minor` + - nullable +- `cost_minor` + - nullable but strongly recommended +- `inventory_mode` + - enum: + - `tracked` + - `not_tracked` +- `inventory_quantity` + - integer, nullable if `not_tracked` +- `allow_backorder` + - boolean +- `requires_shipping` + - boolean +- `is_digital` + - boolean +- `weight_grams` + - nullable +- `length_mm` + - nullable +- `width_mm` + - nullable +- `height_mm` + - nullable +- `barcode` + - nullable +- `tax_class` + - nullable +- `created_at` +- `updated_at` + +### Recommended fields + +- `position` + - sort order inside product +- `fulfillment_type` + - enum: + - `physical` + - `digital` + - `mixed` +- `hs_code` + - optional, future trade/shipping support +- `country_of_origin` + - optional +- `metadata_json` + +### Rules + +- Every `simple` product must have one SKU. +- Every `variable` product must have at least two SKUs. +- Inventory is always tracked at SKU level. +- Variant-specific price lives on the SKU, not the parent product. +- If `is_digital = true` and `requires_shipping = true`, then this is a mixed-fulfillment SKU and must be supported. +- For `not_tracked` inventory, `inventory_quantity` should be null or ignored. +- Negative inventory should be rejected unless explicitly enabled later. + +--- + +## 1.3 `product_attributes` + +Represents the attribute definitions used by variable products or descriptive metadata. 
+ +### Required fields + +- `id` +- `product_id` +- `name` + - e.g. `Color`, `Size` +- `code` + - normalized machine-safe identifier, e.g. `color`, `size` +- `kind` + - enum: + - `variant_defining` + - `descriptive` +- `position` +- `created_at` +- `updated_at` + +### Rules + +- `variant_defining` attributes determine variant combinations. +- `descriptive` attributes are display-only and should not drive SKU uniqueness. + +--- + +## 1.4 `product_attribute_values` + +Allowed values for product attributes. + +### Required fields + +- `id` +- `attribute_id` +- `value` + - e.g. `Blue`, `Large` +- `code` + - normalized, e.g. `blue`, `large` +- `position` + +### Rules + +- Values must be unique per `attribute_id`. +- Order should be stable for display purposes. + +--- + +## 1.5 `product_sku_option_values` + +Maps a SKU to its selected option values for variant-defining attributes. + +### Required fields + +- `sku_id` +- `attribute_id` +- `attribute_value_id` + +### Rules + +- Every SKU under a variable product must have exactly one value per variant-defining attribute. +- No duplicate option combinations are allowed within the same product. +- Simple-product single SKUs do not need variant option rows. + +--- + +## 1.6 `product_assets` + +Represents a storage-provider-neutral asset record. + +This is used for images and may also support downloadable file assets if desired. + +### Required fields + +- `id` +- `asset_type` + - enum: + - `image` + - `file` +- `storage_provider` + - enum: + - `local` + - `r2` + - `s3` + - `other` +- `storage_key` + - opaque storage path/key +- `original_filename` +- `mime_type` +- `file_size_bytes` +- `checksum` + - nullable but recommended +- `width_px` + - nullable +- `height_px` + - nullable +- `access_mode` + - enum: + - `public` + - `private` +- `created_at` + +### Rules + +- The schema must not assume local filesystem semantics. +- `storage_key` must be treated as opaque. 
+- Image dimensions are required when asset_type is `image` if easily available. +- Asset records should be treated as EmDash-aligned media objects, not as overloaded product/content rows. +- The commerce layer should reference assets by ID/linkage, not by assuming direct file ownership inside the product record. + +--- + +## 1.7 `product_asset_links` + +Links assets to either products or SKUs. + +### Required fields + +- `id` +- `product_id` + - nullable +- `sku_id` + - nullable +- `asset_id` +- `role` + - enum: + - `primary_image` + - `gallery_image` + - `variant_image` +- `alt_text` + - nullable +- `position` +- `created_at` + +### Rules + +- Exactly one of `product_id` or `sku_id` must be set. +- Product-level galleries belong to product. +- Variant image overrides belong to SKU. +- A product should have at most one `primary_image`. + +--- + +## 1.8 `digital_assets` + +Represents downloadable or protected digital content made available to purchasers. + +This may share storage with `product_assets`, but logical separation is encouraged. + +### Required fields + +- `id` +- `asset_id` + - reference to file asset +- `label` + - display name for customer/admin +- `download_limit` + - nullable +- `download_expiry_days` + - nullable +- `is_manual_only` + - boolean +- `created_at` +- `updated_at` + +### Rules + +- These assets are for customer entitlements, not just product media. +- Protected/private access should be the default unless there is a strong reason otherwise. + +--- + +## 1.9 `digital_entitlements` + +Maps which digital assets are granted by purchasing a SKU. + +### Required fields + +- `id` +- `sku_id` +- `digital_asset_id` +- `granted_quantity` + - usually 1 +- `created_at` + +### Rules + +- This supports: + - simple digital products + - mixed physical+digital products + - bundle-derived digital access via component SKUs or bundle-level explicit entitlements +- Use this instead of forcing physical+digital combinations into a formal bundle model. 
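The `product_asset_links` rules above (exactly one owner, SKU-owned variant images, single primary image) are easy to get wrong in ad hoc route code, so they are worth one shared check. The sketch below is illustrative: `AssetLink` and `validateAssetLink` are hypothetical names, and a real implementation would back the primary-image check with a database constraint rather than an in-memory scan.

```typescript
// Hypothetical validation sketch for asset-link rules from section 1.7.
// Names are illustrative; "existing" stands in for the product's current links.

interface AssetLink {
  id: string;
  product_id: string | null;
  sku_id: string | null;
  asset_id: string;
  role: "primary_image" | "gallery_image" | "variant_image";
  position: number;
}

function validateAssetLink(link: AssetLink, existing: AssetLink[]): string[] {
  const errors: string[] = [];
  // Exactly one of product_id or sku_id must be set.
  const hasProduct = link.product_id !== null;
  const hasSku = link.sku_id !== null;
  if (hasProduct === hasSku) {
    errors.push("exactly one of product_id or sku_id must be set");
  }
  // Variant image overrides belong to the SKU, not the product.
  if (link.role === "variant_image" && !hasSku) {
    errors.push("variant_image links must reference a sku_id");
  }
  // A product should have at most one primary_image.
  if (link.role === "primary_image" && hasProduct) {
    const duplicate = existing.some(
      (e) => e.role === "primary_image" && e.product_id === link.product_id && e.id !== link.id,
    );
    if (duplicate) errors.push("product already has a primary_image");
  }
  return errors;
}
```

Returning a list of stable error strings (or machine codes) rather than throwing keeps the check usable from both create and update handlers.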
+ +--- + +## 1.10 `bundle_components` + +Defines which SKUs make up a fixed bundle. + +### Required fields + +- `id` +- `bundle_product_id` +- `component_sku_id` +- `quantity` +- `position` +- `created_at` +- `updated_at` + +### Bundle pricing fields (on bundle product or separate bundle pricing record) + +The bundle must also support: + +- `discount_type` + - enum: + - `none` + - `fixed_amount` + - `percentage` +- `discount_value_minor` + - nullable for fixed amount +- `discount_value_bps` + - nullable for percentage, e.g. basis points or percentage integer +- `rounding_mode` + - enum: + - `currency_standard` + +### Rules + +- Bundles are fixed composition only in v1. +- A component must reference a SKU, never a parent product alone. +- Bundle subtotal is derived from component SKUs × quantity. +- Final price = subtotal − bundle discount. +- Bundle inventory is derived from component availability. +- Bundle has no inventory row of its own. +- Bundle should support both: + - simple-product SKUs + - variant-product SKUs + +### Inventory availability rule + +Bundle sellable quantity should be computed as the minimum whole-bundle count supported by component stock: + +`min(floor(component_inventory / component_quantity))` + +ignoring `not_tracked` SKUs as unlimited for bundle availability purposes. + +--- + +## 1.11 `categories` + +### Required fields + +- `id` +- `name` +- `slug` +- `parent_id` + - nullable +- `position` +- `created_at` +- `updated_at` + +--- + +## 1.12 `product_category_links` + +### Required fields + +- `product_id` +- `category_id` + +--- + +## 1.13 `product_tags` + +### Required fields + +- `id` +- `name` +- `slug` +- `created_at` + +--- + +## 1.14 `product_tag_links` + +### Required fields + +- `product_id` +- `tag_id` + +--- + +## 1.15 `order_line_snapshots` + +This is a logical entity. It may live inside order line storage or in a dedicated snapshot structure. What matters is the semantics. 
+ +### Required snapshot fields per order line + +- `product_id` + - nullable convenience reference +- `sku_id` + - nullable convenience reference +- `product_type` +- `product_title` +- `product_slug` + - nullable +- `sku` +- `sku_title` + - nullable +- `selected_options` + - structured map/list of attribute name + value +- `currency` +- `unit_price_minor` +- `quantity` +- `line_subtotal_minor` +- `line_discount_minor` +- `line_total_minor` +- `compare_at_price_minor` + - nullable +- `tax_class` + - nullable +- `requires_shipping` +- `is_digital` +- `weight_grams` + - nullable +- `image_snapshot` + - nullable representative image info +- `bundle_snapshot` + - nullable, but required for bundle lines: + - component SKU list + - quantities + - derived subtotal at purchase + - bundle discount type/value +- `digital_entitlement_snapshot` + - nullable, but recommended when digital access is granted + +### Rules + +- Snapshot data is the historical truth. +- Live catalog references are optional conveniences only. +- Snapshot must be written at checkout/order creation time. +- Editing the catalog later must not change historical order rendering. + +--- + +# 2. Product type behavior + +## 2.1 Simple physical product + +### Characteristics + +- product type = `simple` +- exactly one SKU +- SKU: + - `requires_shipping = true` + - `is_digital = false` unless mixed +- may have: + - product images + - digital entitlements attached to the SKU + +### Example + +A yarn kit sold as one physical shipped item, with an included PDF guide. + +--- + +## 2.2 Simple digital/downloadable product + +### Characteristics + +- product type = `simple` +- exactly one SKU +- SKU: + - `requires_shipping = false` + - `is_digital = true` +- one or more digital entitlements linked to SKU +- no shipping dimensions required + +### Example + +A downloadable knitting pattern PDF. 
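The bundle availability rule from section 1.10 — `min(floor(component_inventory / component_quantity))` over components, ignoring `not_tracked` SKUs — can be sketched directly. This is a minimal sketch under the spec's assumptions; `BundleComponent` and `bundleAvailability` are hypothetical names, and the choice to also skip backorder-allowed components follows section 4.3's backorder rule.

```typescript
// Hypothetical sketch of derived bundle availability (section 1.10):
// the minimum whole-bundle count supported by component stock.

interface BundleComponent {
  quantity: number;                          // units of this SKU per bundle
  inventory_mode: "tracked" | "not_tracked";
  inventory_quantity: number | null;         // null when not_tracked
  allow_backorder: boolean;
}

// Returns the max whole-bundle quantity, or Infinity when no component
// constrains it (all components untracked or backorderable).
function bundleAvailability(components: BundleComponent[]): number {
  let available = Infinity;
  for (const c of components) {
    // not_tracked SKUs are treated as unlimited for bundle availability;
    // backorderable components also do not constrain (per section 4.3).
    if (c.inventory_mode === "not_tracked" || c.allow_backorder) continue;
    const stock = c.inventory_quantity ?? 0;
    available = Math.min(available, Math.floor(stock / c.quantity));
  }
  return available;
}
```

Because no bundle inventory row exists, this derivation is the only source of bundle stock and should be recomputed at retrieval and re-checked at finalization.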
+ +--- + +## 2.3 Variable product + +### Characteristics + +- product type = `variable` +- 2+ SKU variants +- parent product contains: + - descriptions + - merchandising + - shared image gallery + - attribute definitions +- SKU variants contain: + - SKU code + - option combination + - price + - inventory + - barcode + - shipping characteristics + - optional variant image override + +### Example + +A sweater sold in sizes S/M/L and colors Blue/Red. + +--- + +## 2.4 Bundle product + +### Characteristics + +- product type = `bundle` +- fixed set of component SKUs +- derived bundle subtotal from components +- optional bundle discount +- no independent stock +- bundle availability derived from component SKU stock +- may include mixed components: + - physical only + - digital only + - physical + digital + +### Example + +A knitting starter bundle containing: + +- one yarn SKU +- one needle SKU +- one pattern PDF SKU + +--- + +# 3. Pricing rules + +## 3.1 SKU pricing + +Each SKU must support: + +- `price_minor` +- `compare_at_price_minor` (optional) +- `currency` + +The price on the SKU is the sellable price before cart/order-level promotions. + +## 3.2 Bundle pricing + +Bundle pricing must be derived from component prices. + +### Formula + +- Component subtotal = sum(component SKU price × quantity) +- Bundle discount: + - none + - fixed amount + - percentage +- Final bundle price = derived subtotal − discount + +### Required decisions + +- All bundle component SKUs must share currency. +- Rounding must be deterministic. +- Fixed discount must not reduce final price below zero. +- Percentage discount must be validated within sane bounds. + +## 3.3 Sale pricing + +v1 may support sale pricing via `compare_at_price_minor`, but a fully scheduled promotions engine is out of scope. + +--- + +# 4. Inventory rules + +## 4.1 Inventory belongs to SKU rows + +Inventory must never belong to the parent variable product. 
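The bundle pricing formula in section 3.2 (component subtotal, then fixed or percentage discount, clamped at zero) can be sketched as follows. The names and shapes are illustrative, and round-half-up is shown as one deterministic rounding choice; the spec only requires that the choice be deterministic.

```typescript
// Hypothetical sketch of section 3.2 bundle pricing. All amounts are
// integers in minor currency units; components are assumed to share currency.

interface BundleDiscount {
  discount_type: "none" | "fixed_amount" | "percentage";
  discount_value_minor: number | null; // used for fixed_amount
  discount_value_bps: number | null;   // basis points, used for percentage
}

function bundlePriceMinor(
  components: { price_minor: number; quantity: number }[],
  discount: BundleDiscount,
): number {
  // Component subtotal = sum(component SKU price × quantity).
  const subtotal = components.reduce((sum, c) => sum + c.price_minor * c.quantity, 0);

  let off = 0;
  if (discount.discount_type === "fixed_amount") {
    off = discount.discount_value_minor ?? 0;
  } else if (discount.discount_type === "percentage") {
    // One deterministic rounding choice (round half up) on the minor-unit discount.
    off = Math.round((subtotal * (discount.discount_value_bps ?? 0)) / 10_000);
  }
  // Discount must not reduce the final price below zero.
  return Math.max(0, subtotal - off);
}
```

Doing the arithmetic entirely in minor units avoids floating-point drift, and clamping at zero enforces the "must not create negative final price" validation in one place.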
+ +## 4.2 Bundle inventory is derived + +Bundle availability must be calculated from component SKU stock. + +## 4.3 Backorder behavior + +A tracked SKU may allow backorders if `allow_backorder = true`. + +If a bundle contains any tracked component with insufficient inventory and backorders are not allowed for that component, the bundle should be unavailable beyond supported quantity. + +## 4.4 Inventory tracking modes + +v1 inventory modes: + +- `tracked` +- `not_tracked` + +No multi-location or reserved-stock complexity in v1 unless already present elsewhere. + +--- + +# 5. Media and file handling + +## 5.1 Product images + +The catalog must support: + +- one product primary image +- multiple product gallery images +- optional variant image override + +## 5.2 Asset abstraction + +All media/file records must use provider-neutral storage fields: + +- storage provider +- storage key +- MIME type +- size +- checksum +- filename + +Do not store hardcoded local absolute paths in the schema. + +## 5.3 Digital downloads + +Digital files should be modeled as protected assets with entitlement rules. Even if local storage is used initially, schema should remain portable. + +--- + +# 6. Status, visibility, and lifecycle + +## 6.1 Product lifecycle states + +Required product statuses: + +- `draft` +- `active` +- `archived` + +Required visibility states: + +- `public` +- `hidden` + +## 6.2 SKU lifecycle states + +Required SKU statuses: + +- `active` +- `inactive` +- `archived` + +### Rules + +- Archived products should remain renderable in historical/admin order contexts. +- Archived SKUs must not break old order displays. +- Do not hard-delete products casually. + +--- + +# 7. Validation rules + +The following validations are required. 
+ +## 7.1 Product validations + +- `simple` product must have exactly one SKU +- `variable` product must have at least two SKUs +- `bundle` product must have at least one bundle component +- `slug` must be unique +- `status` and `visibility` must be valid enums + +## 7.2 SKU validations + +- `sku` must be unique +- `price_minor` must be non-negative +- `compare_at_price_minor` must be null or >= `price_minor` +- if `inventory_mode = tracked`, inventory quantity must be integer +- if `requires_shipping = false`, dimensions/weight may be null +- if `is_digital = true`, at least one digital entitlement should exist for digital-only products + +## 7.3 Variable product validations + +- each variant-defining attribute must have allowed values +- each SKU must map one value for each variant-defining attribute +- no duplicate attribute combinations + +## 7.4 Bundle validations + +- each component must reference a valid SKU +- quantity must be positive integer +- bundle must not reference itself recursively +- all component SKUs must use same currency +- discount must not create negative final price + +## 7.5 Asset validations + +- primary image uniqueness per product +- only image assets can be linked with image roles +- digital entitlement files should be `private` by default + +--- + +# 8. Retrieval requirements + +The developer must support the following retrieval/use cases. 
+ +## 8.1 Product detail retrieval + +Retrieve one product with: + +- core product fields +- active SKU rows +- attributes and values +- primary image + gallery +- variant image overrides +- category/tag associations +- bundle composition if bundle +- digital entitlement summary if needed for admin + +## 8.2 Catalog listing retrieval + +List products with: + +- primary image +- product title +- status/visibility +- price range summary +- inventory summary +- type +- featured flag +- category/tag filters later + +## 8.3 Bundle availability retrieval + +Given a bundle product, compute: + +- component list +- derived subtotal +- discount +- final bundle price +- max available whole-bundle quantity from stock + +## 8.4 Variant selection retrieval + +Given a variable product, return: + +- attributes/options +- allowed combinations +- per-SKU: + - price + - inventory + - image override + - status + +## 8.5 Admin retrieval + +Admin views must support: + +- draft/inactive products +- archived products +- hidden products +- low stock SKUs +- asset references +- digital entitlement associations + +--- + +# 9. Write/update requirements + +## 9.1 Product creation + +The developer must support creating: + +- simple product + one SKU +- variable product + attribute definitions + multiple SKUs +- bundle product + bundle components + bundle discount config + +## 9.2 Product update + +Must support updating: + +- core product copy and visibility +- SKU price/inventory fields +- product/variant images +- bundle composition +- digital entitlements +- category/tag assignments + +## 9.3 Soft lifecycle updates + +Must support: + +- publish/unpublish +- archive/unarchive +- activate/deactivate SKU + +## 9.4 Order snapshot compatibility + +When orders are created, the checkout/order flow must be able to consume product/SKU data and write snapshot-compatible line data without requiring schema redesign later. + +--- + +# 10. 
Recommended implementation order + +This section defines the build order. The developer should follow this order unless there is a very strong reason not to. + +## Phase 1 — Foundation schema and invariants + +Build first: + +1. `products` +2. `product_skus` +3. status/visibility enums +4. unique constraints: + - product slug + - SKU code +5. base validation layer for product type rules + +### Exit criteria + +- can create a simple product with one SKU +- can retrieve it +- can update it +- invalid shapes are rejected + +--- + +## Phase 2 — Media/assets abstraction + +Build next: + +1. `product_assets` +2. `product_asset_links` +3. image roles: + - primary + - gallery + - variant +4. local storage adapter using provider-neutral schema + +### Exit criteria + +- can upload/link one or more product images +- can assign primary image +- can assign variant image override +- schema does not depend on local-only path assumptions + +--- + +## Phase 3 — Variable product model + +Build next: + +1. `product_attributes` +2. `product_attribute_values` +3. `product_sku_option_values` +4. validation for duplicate variant combinations + +### Exit criteria + +- can create variable product +- can define attributes and allowed values +- can create multiple variant SKUs +- can retrieve variant matrix +- duplicate combinations rejected + +--- + +## Phase 4 — Digital entitlement model + +Build next: + +1. `digital_assets` +2. `digital_entitlements` +3. download metadata and access rules + +### Exit criteria + +- can create simple digital product +- can attach downloadable assets to SKU +- can attach digital entitlement to physical SKU +- schema remains storage-provider-neutral + +--- + +## Phase 5 — Bundle model + +Build next: + +1. `bundle_components` +2. bundle pricing fields +3. derived subtotal computation +4. bundle discount computation +5. 
bundle inventory availability computation + +### Exit criteria + +- can create fixed bundle from SKU components +- components can be simple or variable SKUs +- final derived price is correct +- bundle quantity availability is computed from component stock +- no independent bundle inventory is stored + +--- + +## Phase 6 — Catalog organization and retrieval + +Build next: + +1. `categories` +2. `product_category_links` +3. `product_tags` +4. `product_tag_links` +5. catalog-list retrieval shapes +6. admin retrieval shapes + +### Exit criteria + +- can list products for storefront/admin +- can retrieve products by category/tag +- admin can inspect type/status/basic inventory state + +--- + +## Phase 7 — Order snapshot integration + +Build next, before broad launch: + +1. order-line snapshot mapping +2. bundle snapshot rules +3. digital entitlement snapshot rules +4. representative image snapshot rules + +### Exit criteria + +- order creation can store frozen catalog snapshot data +- historical order rendering no longer depends on mutable live catalog rows +- bundles and digital entitlements are represented safely in order history + +--- + +# 11. API/handler recommendations + +The exact route names may change, but the following conceptual operations should exist. 
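Before handler shapes, the Phase 5 semantics above (derived subtotal, discount, derived whole-bundle availability) can be sketched as one pure computation. All names here are assumptions for illustration; the plugin's real shapes and helpers may differ.

```typescript
// Illustrative component shape; names are assumed, not the plugin's schema.
interface BundleComponent {
  priceMinor: number; // component SKU unit price
  quantity: number;   // units of this SKU per bundle
  stock: number;      // available component stock
}

type BundleDiscount =
  | { kind: "fixed"; amountMinor: number }
  | { kind: "percent"; basisPoints: number }; // 1000 = 10%

// Derives subtotal, discount, final price, and max whole-bundle quantity.
function computeBundleSummary(
  components: BundleComponent[],
  discount: BundleDiscount,
) {
  const subtotalMinor = components.reduce(
    (sum, c) => sum + c.priceMinor * c.quantity,
    0,
  );
  const discountMinor =
    discount.kind === "fixed"
      ? discount.amountMinor
      : Math.floor((subtotalMinor * discount.basisPoints) / 10_000);
  // Invariant from 7.4: discount must not create a negative final price.
  const finalMinor = Math.max(0, subtotalMinor - discountMinor);
  // Availability is bounded by the scarcest component; no independent
  // bundle inventory is stored.
  const available =
    components.length === 0
      ? 0
      : Math.min(...components.map((c) => Math.floor(c.stock / c.quantity)));
  return { subtotalMinor, discountMinor, finalMinor, available };
}
```

Because no bundle stock row exists, availability changes automatically whenever component stock changes, which is the intended Phase 5 behavior.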
+ +## Product operations + +- create simple product +- create variable product +- create bundle product +- update product +- archive product +- list products +- get product detail + +## SKU operations + +- create SKU +- update SKU +- set inventory +- set price +- activate/deactivate SKU + +## Asset operations + +- upload asset +- link asset to product +- link asset to SKU +- reorder gallery +- set primary image + +## Digital operations + +- create digital asset +- attach entitlement to SKU +- remove entitlement from SKU + +## Bundle operations + +- add bundle component +- remove bundle component +- reorder bundle components +- set bundle discount +- compute bundle summary + +These may be implemented as explicit handlers or internal service methods depending on EmDash plugin patterns, but the domain boundaries should remain clear. + +--- + +# 12. Order snapshot recommendation explained + +The chosen recommendation is: + +## **Use order snapshots plus optional live references** + +That means: + +- keep `product_id` / `sku_id` references if useful +- but always store frozen line-item purchase data at checkout time + +### Why this is required + +If live product rows change later, the order must still show exactly what the customer bought. + +Without snapshots, old orders can become incorrect when: + +- titles change +- prices change +- variants are archived +- bundle composition changes +- downloadable assets change + +This is not acceptable for real commerce. + +--- + +# 13. Must-pass scenario checklist + +The developer should treat the following as must-pass scenarios. 
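The snapshot-plus-optional-live-reference recommendation in section 12 can be sketched as a line-item shape. The field names here are hypothetical illustrations, not the actual order schema.

```typescript
// Hypothetical frozen line-item shape; field names are illustrative only.
interface OrderLineSnapshot {
  // Optional live references: useful for admin links, never required
  // to render the historical order.
  productId?: string;
  skuId?: string;
  // Frozen at checkout time; later catalog edits must not change these.
  title: string;
  skuCode: string;
  unitPriceMinor: number;
  quantity: number;
  currency: string;
  imageUrl?: string; // representative image at purchase time
}

// Rendering an old order reads only the frozen fields, so renaming or
// archiving live catalog rows cannot alter order history.
function renderLine(line: OrderLineSnapshot): string {
  return `${line.quantity} x ${line.title} @ ${line.unitPriceMinor} ${line.currency}`;
}
```

The key property is that every field needed to display the line exists on the snapshot itself; the live references are strictly additive.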
+ +## Simple product scenarios + +- create a simple physical product with one SKU +- attach gallery images +- mark draft, publish, archive +- update inventory and price + +## Digital product scenarios + +- create digital-only simple product +- attach downloadable file +- retrieve entitlement metadata +- confirm no shipping required + +## Variable product scenarios + +- create parent product with attributes `Color` and `Size` +- create multiple SKU combinations +- assign variant image override +- reject duplicate option combination + +## Bundle scenarios + +- create fixed bundle from three SKU components +- derive subtotal correctly +- apply fixed discount correctly +- apply percentage discount correctly +- compute bundle availability from component stock +- reject invalid component SKU references + +## Mixed physical + digital scenarios + +- create physical SKU with attached digital manual/PDF +- ensure shipping still required +- ensure digital entitlement is still granted + +## Snapshot scenarios + +- place order +- then rename product +- then change price +- then archive SKU +- historical order must still show original purchased data + +--- + +# 14. Developer guidance / anti-patterns + +The following are important constraints. 
+ +## Do not: + +- store inventory on parent variable product rows +- let bundles own independent stock in v1 +- force every physical+digital combination into a bundle +- store raw absolute local file paths as canonical schema data +- model files the WordPress way as generic product/content rows +- rely only on live product references for order history +- make product type behavior ambiguous +- support customer-configurable bundles in v1 +- over-generalize with speculative plugin extension points before core catalog paths are solid + +## Do: + +- keep schema explicit +- keep sellable-unit logic on SKU rows +- keep bundle composition at SKU level +- keep storage provider abstracted +- build retrieval shapes that match real storefront/admin needs +- protect invariants with validation at write time + +--- + +# 15. Final recommended v1 minimum deliverable + +The minimum acceptable deliverable for this product catalog project is: + +1. simple physical product with one SKU +2. simple digital product with one SKU and downloadable entitlement +3. variable product with attributes and SKU variants +4. fixed bundle product composed of SKU-level components +5. product gallery plus variant image override +6. product status and visibility controls +7. SKU-level inventory and price fields +8. digital entitlement support for mixed physical+digital sales +9. category/tag assignment +10. order-line snapshot compatibility + +If these ten areas are implemented cleanly, the catalog foundation will be strong enough to support real commerce evolution without immediate redesign. + +--- + +## Final instruction to developer + +Build this in phases, keep the catalog kernel narrow, and protect invariants early. Do not skip the sellable-unit model, bundle rules, or order snapshot compatibility. Those are the structural decisions most likely to prevent painful rework later. 
diff --git a/emdash-commerce-third-party-review-memo.md b/emdash-commerce-third-party-review-memo.md new file mode 100644 index 000000000..1f4c14421 --- /dev/null +++ b/emdash-commerce-third-party-review-memo.md @@ -0,0 +1,171 @@ +# Third-Party Review Memo: EmDash Commerce Plugin Current State + +## Review scope + +This memo reflects a code and package review of the current `commerce-plugin-external-review.zip` archive and its associated reviewer-facing handoff files. + +Confirmed package metadata: + +- File path: `./commerce-plugin-external-review.zip` +- Generator script: `scripts/build-commerce-external-review-zip.sh` + +## Executive summary + +The current codebase is in **good shape**. + +This is now a **credible stage-1 EmDash commerce core** with disciplined route boundaries, a coherent possession model, sensible replay and recovery semantics, improved runtime portability, and stronger reviewer-facing documentation than earlier iterations. + +I do **not** see new architectural red flags. + +The main remaining production caveat is still the same one documented in earlier reviews: **perfectly concurrent duplicate webhook delivery remains the primary residual risk**, due to storage and claim limitations rather than an obvious design flaw in the application logic. + +## Overall assessment + +The project now reads like a deliberate and controlled commerce kernel rather than an experimental plugin. + +The implementation shows good judgment in the places that matter most for a first commerce foundation: + +- keeping the money path narrow, +- enforcing explicit possession and ownership semantics, +- designing for replay and partial recovery, +- avoiding premature feature sprawl, +- and packaging the code for serious outside review. + +In practical terms, this looks like a strong stage-1 base for controlled forward progress. + +## Key strengths + +### 1. 
Scope discipline is strong + +The core HTTP surface remains narrow and sane: + +- `cart/upsert` +- `cart/get` +- `checkout` +- `checkout/get-order` +- `webhooks/stripe` +- `recommendations` + +That is the right shape for an early commerce kernel. The codebase does not appear to be diluting critical checkout/finalization logic with premature secondary features. + +### 2. Possession and ownership semantics are coherent + +One of the strongest aspects of the design is the possession model: + +- carts use `ownerToken` / `ownerTokenHash` +- orders use `finalizeToken` / `finalizeTokenHash` + +This model appears consistent across cart access, mutation, checkout, and order retrieval. That gives the system a clear ownership story and reduces ambiguity around public access patterns. + +### 3. API semantics are materially improved + +`checkout/get-order` now reads as intentional API design rather than an evolving patch. + +Its behavior is appropriately tight: + +- token required for token-protected orders, +- invalid token rejected with order-scoped errors, +- legacy rows without token hash hidden behind `ORDER_NOT_FOUND`, +- token-hash values excluded from the public response. + +That is a meaningful improvement and increases both clarity and long-term maintainability. + +### 4. Replay and recovery thinking is strong + +The code continues to show good commerce instincts around failure handling: + +- explicit idempotency behavior in `checkout`, +- deterministic order and payment-attempt IDs, +- webhook verification before finalization, +- replay and resume semantics in finalization, +- documented handling of partial progress and `pending` states. + +That is one of the strongest parts of the codebase. The implementation appears to assume that failure, duplication, and partial progress will happen and is designed accordingly. + +### 5. 
Runtime portability is better than before + +The crypto/runtime story appears improved: + +- hot paths now use `crypto-adapter.ts`, +- the adapter fallback uses dynamic import rather than `require(...)`, +- the general runtime direction is better aligned with modern ESM and Worker-style environments. + +That does not make the portability story perfect, but it is notably cleaner than earlier iterations. + +### 6. Third-party review readiness is better + +The external handoff is stronger and easier to navigate: + +- `@THIRD_PARTY_REVIEW_PACKAGE.md` functions as a canonical reviewer entrypoint, +- `SHARE_WITH_REVIEWER.md` aligns with that entrypoint, +- the archive is easier for an outside reviewer to inspect without guessing where to start. + +That increases confidence not only in the code, but in the team’s ability to present it coherently to a third party. + +### 7. Extension seams look intentional, not accidental + +The current package suggests that extension points are being shaped deliberately: + +- `COMMERCE_EXTENSION_SURFACE.md` +- `AI-EXTENSIBILITY.md` +- `services/commerce-extension-seams.*` +- `services/commerce-provider-contracts.*` + +At present, this still looks controlled rather than overbuilt. The abstraction level appears acceptable for the current scope. + +## Main caveat + +### Same-event concurrency remains the primary residual production risk + +This is still the most important caution I would raise to a third-party reviewer. + +The apparent limitation is not in the overall architecture, but in the storage/claim model available to the system: + +- no true compare-and-set or insert-if-not-exists claim primitive, +- no transaction boundary across receipt, order, and inventory writes, +- perfectly concurrent duplicate webhook deliveries can still race. + +That means the system appears **well-designed within current storage limits**, but not fully hardened against simultaneous duplicate-event processing across workers. 
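To make the gap concrete: the missing primitive is an atomic insert-if-not-exists claim keyed by event ID. Below is a minimal in-memory sketch of the *semantics* only — hypothetical code, since the whole point of the caveat is that the current storage layer does not provide this check-and-insert atomically across workers.

```typescript
// In-memory stand-in for a claim store. A real fix needs the storage backend
// to perform this check-and-insert atomically; a Map only models the semantics.
const claims = new Map<string, "pending" | "done">();

// Returns true if this caller won the claim for the event; false if a
// duplicate delivery already holds or finished it.
function tryClaimEvent(eventId: string): boolean {
  if (claims.has(eventId)) return false;
  claims.set(eventId, "pending");
  return true;
}
```

In single-threaded JavaScript this is trivially atomic in-process; across concurrent workers the same check-then-set sequence races unless the backend exposes a compare-and-set or insert-if-not-exists operation, which is exactly the residual risk described above.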
+ +This caveat should remain explicit in any serious external review. + +## Secondary caution + +### `pending` remains the sharpest semantic area + +The current `pending` behavior appears defensible and much better documented than before. Even so, it is still the area most likely to be damaged by future refactors. + +That is because `pending` appears to serve two purposes: + +- claim/in-progress marker, +- resumable recovery state. + +That dual meaning is workable, but it should remain heavily test-protected and carefully documented. Any future cleanup in this area should be treated as high-risk. + +## Minor polish observations + +These are not architectural blockers, but they remain worth noting: + +- the repository/package could still benefit from a little less root-level review-document clutter, +- the crypto path should remain singular to avoid future drift, +- future changes should continue to prioritize failure-path tests over feature expansion. + +## Recommended near-term posture + +My recommendation would be: + +1. keep checkout and finalization narrow, +2. avoid broadening the money path prematurely, +3. continue adding tests only around duplicate delivery, partial writes, replay from `pending`, and ownership failures, +4. preserve a single runtime-portable crypto path, +5. keep the third-party review packet canonical and tidy. + +## Final verdict + +**This is a solid stage-1 EmDash commerce core.** + +It has disciplined boundaries, coherent possession and replay semantics, improved runtime portability, and stronger operational/reviewer documentation than earlier versions. + +I do **not** see new architectural red flags. + +The one meaningful remaining caveat is still the documented concurrency limitation around perfectly concurrent duplicate webhook delivery. That appears to be a platform/storage constraint issue, not evidence of careless application design. 
diff --git a/emdash_commerce_review_update_ordered_children.md b/emdash_commerce_review_update_ordered_children.md new file mode 100644 index 000000000..9d7337f91 --- /dev/null +++ b/emdash_commerce_review_update_ordered_children.md @@ -0,0 +1,123 @@ +# EmDash Commerce Review Update — Ordered Child Mutation Refactor Progress + +## Summary + +This stage is a good step. + +It directly addresses the next refactor pressure point from the prior review: **ordered child mutation logic is now more deliberate, more shared, and less fragile**. + +## What Improved + +### 1) Ordered-row logic is materially cleaner + +The new helpers are a real win: + +- `normalizeOrderedPosition` +- `normalizeOrderedChildren` +- `addOrderedRow` +- `removeOrderedRow` +- `moveOrderedRow` +- `persistOrderedRows` + +This is the right abstraction level. It removes repeated hand-written position math from multiple handlers without introducing a large new framework. + +This is the kind of refactor that pays for itself: + +- local +- validated +- low-risk +- clearly useful + +### 2) Bundle ordering is now deterministic + +This is the most important part of this stage. + +Bundle components are now sorted by: + +- `position` +- then `createdAt` as a tiebreaker + +and positions are normalized before persistence. + +That matters because earlier reorder/remove behavior could become unstable if storage returned equal-position rows in inconsistent order. This update closes that gap in a grounded, practical way. + +### 3) Asset and bundle paths now follow the same pattern + +This is a meaningful DRY improvement. + +The asset-link and bundle-component handlers now both follow the same shape: + +1. load rows +2. apply ordered-row mutation +3. normalize +4. persist + +That reduces cognitive load and lowers the chance that one path quietly drifts from the other. + +## Why This Is a Strong Refactor + +From a pragmatic engineering perspective, this is a good example of fixing what is actually costing the codebase. 
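As a rough illustration of the pattern (not the package's actual signatures), the deterministic sort, normalization, and move behavior described above amount to something like:

```typescript
// Illustrative ordered-row shape; the real helpers in the package may differ.
interface OrderedRow {
  id: string;
  position: number;
  createdAt: number;
}

// Deterministic sort: position first, createdAt as tiebreaker, mirroring
// the bundle-component ordering described above.
function sortOrderedRows(rows: OrderedRow[]): OrderedRow[] {
  return [...rows].sort(
    (a, b) => a.position - b.position || a.createdAt - b.createdAt,
  );
}

// Rewrites positions to a dense 0..n-1 sequence before persistence.
function normalizeOrderedChildrenSketch(rows: OrderedRow[]): OrderedRow[] {
  return sortOrderedRows(rows).map((row, index) => ({ ...row, position: index }));
}

// Moves one row to a target index, then renormalizes positions.
function moveOrderedRowSketch(
  rows: OrderedRow[],
  id: string,
  to: number,
): OrderedRow[] {
  const sorted = sortOrderedRows(rows);
  const from = sorted.findIndex((r) => r.id === id);
  if (from === -1) return sorted;
  const [row] = sorted.splice(from, 1);
  sorted.splice(Math.max(0, Math.min(to, sorted.length)), 0, row);
  return sorted.map((r, index) => ({ ...r, position: index }));
}
```

The createdAt tiebreaker is what closes the equal-position instability: storage can return tied rows in any order, and the helper still produces one canonical sequence.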
+ +It improves: + +- **Cognitive load:** less repeated position logic +- **Correctness:** more deterministic reorder/remove behavior +- **DRY:** clearly better +- **YAGNI:** still disciplined, not speculative +- **Scalability:** modestly better because future ordered-child features now have a reusable pattern + +Importantly, it does **not** disturb the kernel or broaden scope. + +## What Was Validated + +These are the specific signs that this is not just stylistic cleanup: + +- a deterministic sort helper for bundle components was added +- bundle queries now normalize ordering before downstream use +- reorder/remove/create handlers for assets and bundles now share the same ordered-row mutation model +- tests cover: + - asset reordering + - bundle component reordering + - bundle component removal with position normalization + +That is enough evidence to say this change is supported by real code and tests, not just preference. + +## Minor Caveat + +There is one small note: + +`normalizeBundleComponentPositions(...)` now conceptually overlaps with `normalizeOrderedChildren(...)`. + +This is not a bug. But it is a small sign that the ordered-row abstraction is **almost** fully consolidated, not quite fully consolidated. + +This is not worth changing right now unless that area is already being touched again. + +## Recommendation + +**Accept this stage.** + +This is a practical refactor with good judgment: + +- it fixes a real stability issue +- it reduces duplication +- it stays disciplined + +## Current Overall State + +Compared with the earlier reviews, the codebase now looks materially healthier: + +- inventory consistency improved +- simple-product SKU capacity is guarded +- ordered-child mutation logic is cleaner and more deterministic + +The remaining concerns are no longer correctness-fire issues. 
They are more typical foundational-project concerns: + +- `catalog.ts` still carries a lot of responsibility +- read assembly is still somewhat heavy +- partial-write and transactional integrity are still only partially hardened + +None of those look like mandatory next-stage fixes unless new evidence shows they are causing trouble. + +## Bottom Line + +This stage is a good, appropriately scoped improvement. It strengthens correctness, reduces duplication, and keeps the project aligned with a disciplined, non-overengineered path. diff --git a/emdash_commerce_sanity_check_review.md b/emdash_commerce_sanity_check_review.md new file mode 100644 index 000000000..d17a00821 --- /dev/null +++ b/emdash_commerce_sanity_check_review.md @@ -0,0 +1,444 @@ +# EmDash Commerce Plugin — Fresh-Eyes Sanity Check Review + +Date: April 5, 2026 +Scope: Current state of the foundational ecommerce plugin framework in `packages/plugins/commerce` +Reviewer stance: Validate real issues only. Do not over-engineer. Do not fix what is not broken. + +--- + +## Executive Summary + +The current foundation is generally strong. The codebase shows a clear kernel-oriented structure, a disciplined checkout/finalization path, sensible schema and storage separation, and meaningful test coverage across catalog, checkout, and webhook flows. + +This is **not** a case of broad over-engineering. + +The main concerns are narrower and concrete: + +1. **Inventory currently has split authority** between SKU rows and `inventoryStock`, and those two paths are not being kept in sync. +2. **Catalog read assembly is already N+1 heavy** in several places. +3. **Ordered child mutations** for assets and bundle components use repeated multi-write normalization loops that increase correctness risk and duplication. +4. **`catalog.ts` is becoming a monolith**, which is not yet a failure, but is the main technical-debt pressure point. 
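Concern 1 can be made concrete with a small write-through sketch: creating a SKU also creates its operational stock row in the same flow. The types and function here are hypothetical illustrations, not the plugin's actual storage API.

```typescript
// Hypothetical records; real shapes live in packages/plugins/commerce/src/types.ts.
interface SkuRecord {
  id: string;
  inventoryQuantity: number;
}
interface StockRecord {
  skuId: string;
  available: number;
}

// Write-through guard: the SKU row and its operational stock row are
// written together, so checkout (which trusts the stock table) and
// listing (which trusts the SKU row) cannot disagree at creation time.
function createSkuWithStock(
  sku: SkuRecord,
  skus: Map<string, SkuRecord>,
  stock: Map<string, StockRecord>,
): void {
  skus.set(sku.id, sku);
  stock.set(sku.id, { skuId: sku.id, available: sku.inventoryQuantity });
}
```

Without the paired write, a SKU can exist with display inventory but no operational stock row, which is the split-brain condition detailed below.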
+ +The only issue I would classify as a genuine correctness risk right now is **inventory split-brain**. The others are maintainability and scaling issues, not immediate architectural failures. + +--- + +## Objectives Review + +This review specifically looked for: + +- logic flaws +- edge cases +- performance issues +- technical debt +- duplicated or semi-duplicated data/processes that could be consolidated +- refactoring opportunities that respect EmDash best practices + +All recommendations below are based on validated code behavior, not assumptions. + +--- + +## Validated Findings + +### 1) Inventory has two sources of truth + +This is the highest-risk issue in the current codebase. + +### What the code shows + +`StoredInventoryStock` is defined as the materialized inventory record: + +- `packages/plugins/commerce/src/types.ts:191-207` + +`StoredProductSku` also stores inventory state directly on the SKU: + +- `packages/plugins/commerce/src/types.ts:241-254` + +Checkout validation reads from `inventoryStock`, not SKU inventory fields: + +- `packages/plugins/commerce/src/lib/checkout-inventory-validation.ts:33-99` + +Finalization also reads and writes `inventoryStock` only: + +- `packages/plugins/commerce/src/orchestration/finalize-payment-inventory.ts:42-44` +- `packages/plugins/commerce/src/orchestration/finalize-payment-inventory.ts:72-107` +- `packages/plugins/commerce/src/orchestration/finalize-payment-inventory.ts:157` + +SKU create/update handlers write SKU inventory fields, but do not create or synchronize a corresponding `inventoryStock` row in the same flow: + +- `packages/plugins/commerce/src/handlers/catalog.ts:996-1035` +- `packages/plugins/commerce/src/handlers/catalog.ts:1037-1061` + +Catalog listing derives inventory summaries and low-stock counts from SKU inventory fields: + +- `packages/plugins/commerce/src/handlers/catalog.ts:714-722` + +### Why this is a real problem + +This creates a validated split-brain condition: + +- A SKU can be created 
with inventory on the SKU document but **without** a matching `inventoryStock` row. +- Checkout can reject a purchasable SKU because it only trusts `inventoryStock`. +- Product listing can show stock and low-stock status based on SKU values that may not match the operational stock actually used by checkout/finalization. + +This is not hypothetical. The code paths are materially divergent. + +### Severity + +**High** — correctness and operational consistency. + +--- + +### 2) Catalog read assembly is already N+1 heavy + +This is a validated scaling and maintainability concern. + +### What the code shows + +`getProductHandler` performs multiple nested follow-up reads: + +- load product +- query SKUs +- query categories/tags/images +- for variable products, query option rows and images per SKU +- for bundles, load component SKUs individually +- for digital products, query entitlements per SKU and then load digital assets individually + +See: + +- `packages/plugins/commerce/src/handlers/catalog.ts:538-662` + +`listProductsHandler` queries products once, then per product performs additional queries for SKUs, images, categories, and tags: + +- `packages/plugins/commerce/src/handlers/catalog.ts:665-729` + +`buildOrderLineSnapshots` and related helpers perform repeated per-line and per-component lookups: + +- `packages/plugins/commerce/src/lib/catalog-order-snapshots.ts:53-62` +- `packages/plugins/commerce/src/lib/catalog-order-snapshots.ts:83-167` +- `packages/plugins/commerce/src/lib/catalog-order-snapshots.ts:195-255` + +### Why this matters + +At current scope, this may be acceptable. But the pattern is already repeated enough that it will become expensive and harder to reason about as: + +- product count grows +- variable products increase +- bundle composition grows +- digital entitlement usage expands + +This is not an emergency rewrite trigger. It is a good candidate for targeted refactoring before the catalog grows much larger. 
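As a hedged sketch of the batched alternative (names assumed, not the plugin's storage API), the per-product SKU follow-up reads could collapse into one grouped query per page:

```typescript
// Illustrative shapes; the real storage API in the plugin may differ.
interface Sku {
  id: string;
  productId: string;
}

// One child query for the whole product page, grouped in memory, replacing
// the one-query-per-product pattern. `querySkus` stands in for a storage call.
function loadSkusForProducts(
  productIds: string[],
  querySkus: (ids: string[]) => Sku[],
): Map<string, Sku[]> {
  const grouped = new Map<string, Sku[]>(
    productIds.map((id): [string, Sku[]] => [id, []]),
  );
  for (const sku of querySkus(productIds)) {
    grouped.get(sku.productId)?.push(sku);
  }
  return grouped;
}
```

The same grouping shape would apply to images, categories, and tags, turning N+1 reads into a fixed number of queries per listing page regardless of product count.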
+ +### Severity + +**Medium** — performance and maintainability. + +--- + +### 3) Ordered child mutation logic is duplicated and multi-write fragile + +This is a validated technical-debt and consistency risk. + +### What the code shows + +Asset-link creation, unlink, and reorder all rebuild ordered collections and then rewrite rows in loops: + +- `packages/plugins/commerce/src/handlers/catalog.ts:1201-1231` +- `packages/plugins/commerce/src/handlers/catalog.ts:1234-1256` +- `packages/plugins/commerce/src/handlers/catalog.ts:1259-1300` + +Bundle-component add, remove, and reorder do the same: + +- `packages/plugins/commerce/src/handlers/catalog.ts:1302-1371` +- `packages/plugins/commerce/src/handlers/catalog.ts:1373-1395` +- `packages/plugins/commerce/src/handlers/catalog.ts:1398-1440` + +### Why this matters + +The logic is reasonable, but it is duplicated and relies on repeated write loops. + +Risks: + +- partial failure can leave positions half-normalized +- bug fixes must be applied in multiple similar paths +- cognitive overhead increases because the same mutation shape exists in more than one domain area + +This is a good consolidation candidate because the duplication is concrete and local. + +### Severity + +**Medium** — maintainability and mutation safety. + +--- + +### 4) `catalog.ts` is becoming the technical-debt concentration point + +This is validated, but it is not yet a correctness issue. 
+ +### What the code shows + +`catalog.ts` is currently 1,588 lines and mixes: + +- product CRUD +- SKU CRUD +- media and asset linking +- category/tag linkage +- bundle management +- digital assets and entitlements +- product read assembly + +See file length and content concentration: + +- `packages/plugins/commerce/src/handlers/catalog.ts` + +### Why this matters + +This raises: + +- change risk +- review complexity +- missed-path bugs in similar flows +- difficulty onboarding future contributors + +However, this should be treated as a **controlled refactor opportunity**, not a sign that the architecture is broken. + +### Severity + +**Low to Medium** — maintainability. + +--- + +## What Looks Good and Should Not Be Disturbed + +These areas look intentional and appropriately structured for the current stage: + +- kernel/finalization architecture +- inventory ledger + stock separation as a concept +- idempotency/finalization discipline +- schema-driven input validation +- extension seam direction +- broad route contract shape +- presence of meaningful tests across catalog and checkout flows + +I would **not** recommend broad architectural changes in these areas right now. + +--- + +## Refactoring Options + +The goal here is to improve the current solution without changing its overall shape. + +## Strategy 1 — Harden inventory source-of-truth rules + +### Description + +Keep the existing solution, but make `inventoryStock` the clearly authoritative operational stock record and eliminate drift between SKU-level inventory fields and `inventoryStock`. + +### What this would involve + +- Ensure SKU creation also creates the matching `inventoryStock` row. +- Ensure SKU inventory updates either: + - update both records consistently, or + - stop treating SKU inventory fields as authoritative in reads. +- Update catalog listing/detail inventory summaries so they read from operational stock or from a dedicated stock read-model assembler. 
+- Add tests proving SKU create/update cannot leave stock missing or stale. + +### Analysis + +**Cognitive load:** Low +**Performance:** Neutral to slightly better +**DRY:** Moderate improvement +**YAGNI:** Strong +**Scalability:** Strong for current stage +**EmDash fit:** Excellent — clear boundaries, minimal scope, high correctness value + +### Verdict + +This is the most important refactor. + +--- + +## Strategy 2 — Extract a catalog read assembler layer + +### Description + +Without changing route contracts, move catalog response composition into dedicated internal read builders/services. + +### What this would involve + +Create internal helpers for: + +- product list assembly +- product detail assembly +- order line snapshot assembly +- shared DTO builders for categories, tags, images, digital entitlements, and bundle summaries + +Batch related reads where possible and reuse shared assembly paths. + +### Analysis + +**Cognitive load:** Medium +**Performance:** Good improvement potential +**DRY:** High improvement +**YAGNI:** Reasonable +**Scalability:** Materially better +**EmDash fit:** Good — respects route contracts and modular service boundaries + +### Verdict + +Good second-phase refactor once correctness issues are stabilized. + +--- + +## Strategy 3 — Consolidate ordered-child mutation flows + +### Description + +Create one internal mutation helper for ordered child collections and use it for asset links and bundle components. + +### What this would involve + +Unify the pattern: + +1. load ordered rows +2. apply mutation +3. normalize positions +4. persist updated rows +5. 
assert invariants in tests + +Apply this helper to: + +- asset add/unlink/reorder +- bundle component add/remove/reorder + +### Analysis + +**Cognitive load:** Medium-low +**Performance:** Neutral +**DRY:** High improvement +**YAGNI:** Strong +**Scalability:** Indirectly strong because bug surface shrinks +**EmDash fit:** Very good — this is a clean internal consolidation + +### Verdict + +Very worthwhile. High signal, low scope expansion. + +--- + +## Strategy 4 — Split `catalog.ts` into bounded modules + +### Description + +Keep behavior the same, but split the handler file into narrower domain modules. + +### Suggested split + +- `catalog-products.ts` +- `catalog-skus.ts` +- `catalog-media.ts` +- `catalog-taxonomy.ts` +- `catalog-bundles.ts` +- `catalog-digital.ts` +- `catalog-read.ts` + +### Analysis + +**Cognitive load:** Best long-term +**Performance:** Neutral +**DRY:** Moderate unless combined with Strategies 2 or 3 +**YAGNI:** Acceptable only if done mechanically +**Scalability:** Strong for future contributor velocity +**EmDash fit:** Good, but less urgent than correctness consolidation + +### Verdict + +Useful, but not first. + +--- + +## Recommendation + +## Recommended sequence + +### First: Strategy 1 + +Fix inventory consistency first. + +Why: + +- It addresses the only clearly validated correctness flaw. +- It removes split authority between display-level and operational inventory. +- It reduces the chance of shipping a catalog that looks correct but fails at checkout. + +### Second: Strategy 3 + +Consolidate ordered-child mutation logic. + +Why: + +- The duplication is real. +- The consolidation is local and low-risk. +- It improves DRY and reduces maintenance burden without widening scope. + +### Third: Strategy 2, only if needed soon + +Extract read assembly if catalog complexity is actively growing. + +Why: + +- It is valuable, but not as urgent as correctness and duplication reduction. 
+- It should be done based on real pressure, not speculative elegance. + +### Fourth: Strategy 4, only as a mechanical cleanup + +Split `catalog.ts` after the higher-value refactors are done. + +Why: + +- This is about maintainability, not rescuing a broken design. +- Done too early, it risks generating churn without enough payoff. + +--- + +## Best Single Recommendation + +If choosing only one refactor right now: + +# Choose Strategy 1 — inventory source-of-truth hardening + +This is the best 10x-engineer recommendation because it solves the highest-risk issue with the least architectural disruption. + +It is: + +- validated by the current code +- high leverage +- not over-engineered +- fully aligned with the instruction to avoid fixing what is not broken + +--- + +## Concrete "Do Not Over-Engineer" Guidance + +To stay disciplined, avoid these moves for now: + +- do not redesign the storage model +- do not introduce a generalized repository abstraction everywhere +- do not rewrite checkout/finalize flow +- do not add broad caching infrastructure prematurely +- do not split files just for aesthetics +- do not replace working route contracts + +The right move is targeted improvement, not reinvention. + +--- + +## Final Bottom Line + +This project is in good shape as a foundational commerce plugin. + +It does **not** need a major architectural reset. + +The best next step is to correct the validated inventory consistency issue, then consolidate the repeated ordered-child mutation logic. After that, reassess whether catalog read assembly is large enough to justify extraction. + +That path gives the strongest improvement in correctness, maintainability, and future safety while remaining DRY, YAGNI-compliant, and faithful to EmDash best practices. 
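
The ordered-child mutation pattern recommended in Strategy 3 (load ordered rows, apply the mutation, normalize positions, persist) can be sketched as a single generic helper. This is a minimal illustration, not the actual EmDash implementation; the `OrderedRow` shape and `mutateOrdered` name are hypothetical:

```typescript
// Hypothetical shape for an ordered child row (asset link, bundle component).
interface OrderedRow {
  id: string;
  position: number;
}

// Apply a mutation to an ordered collection, then renumber positions so
// they are contiguous starting at 0 — the invariant both asset links and
// bundle components need after add/remove/reorder.
function mutateOrdered<T extends OrderedRow>(
  rows: T[],
  mutate: (sorted: T[]) => T[],
): T[] {
  const sorted = [...rows].sort((a, b) => a.position - b.position);
  return mutate(sorted).map((row, index) => ({ ...row, position: index }));
}

// Example: removing the middle component keeps positions dense.
const components = [
  { id: "a", position: 0 },
  { id: "b", position: 1 },
  { id: "c", position: 2 },
];
const afterRemove = mutateOrdered(components, (rows) =>
  rows.filter((row) => row.id !== "b"),
);
// afterRemove: [{ id: "a", position: 0 }, { id: "c", position: 1 }]
```

Because every mutation funnels through one renumbering step, the "positions are dense and gap-free" invariant only has to be tested once.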
diff --git a/external_review.md b/external_review.md new file mode 100644 index 000000000..780bf5197 --- /dev/null +++ b/external_review.md @@ -0,0 +1,14 @@ +# External developer review — pointer + +The full briefing for reviewers is in **[`@THIRD_PARTY_REVIEW_PACKAGE.md`](./@THIRD_PARTY_REVIEW_PACKAGE.md)**, then `HANDOVER.md`, `commerce-plugin-architecture.md`, and `3rd-party-checklist.md`. + +Use `@THIRD_PARTY_REVIEW_PACKAGE.md` as the canonical entrypoint. + +Regenerating **`commerce-plugin-external-review.zip`** copies the canonical review +packets plus the commerce plugin sources. Zip files are not included in the bundle. + +Priority review areas: + +- same-event concurrent webhook delivery remains the primary residual production risk, +- receipt `pending` semantics must remain replay-safe and resumable, +- concentrate on duplicate delivery, partial writes, and ownership/possession boundaries before suggesting broader architecture changes. diff --git a/high-level-plan.md b/high-level-plan.md new file mode 100644 index 000000000..71c25cbcf --- /dev/null +++ b/high-level-plan.md @@ -0,0 +1,169 @@ +# EmDash Ecommerce/Cart Plugin — High-Level Plan + +## Current status (2026-04-03) + +### Implemented and validated + +- Stage-1 kernel route set exists for carts, checkout, secure order readback, and Stripe webhook entry (`packages/plugins/commerce`). +- Token-based possession and idempotency semantics are enforced and covered by tests. +- Inventory ledgering, payment finalization bookkeeping, and webhook replay/conflict behavior are implemented. +- Core contract surfaces and route handlers are in place and passing full package test + typecheck. + +### In progress / deferred + +- EmDash-native storefront/admin extensions are the next growth area after kernel hardening. +- taxes/shipping/discounts, fulfillment abstractions, and broad storefront feature coverage remain out-of-scope for v1. 
+- multiple gateway comparison remains intentionally deferred until the first vertical slice is stable. + +## 1) Recommended architecture + +Implement this as a **trusted plugin** initially. + +`trusted` is the practical choice because: + +- custom API routes are required for cart/checkout flows +- rich admin pages/widgets are needed for order and product operations +- optional Portable Text blocks with custom rendering are required for editor insertion of product actions + +`packages/plugins/forms` demonstrates the trusted pattern and `docs/src/content/docs/plugins/sandbox.mdx` documents these constraints. + +## 2) Plugin capabilities and security + +Use explicit capability declarations: + +- `read:content`, `write:content` (if products are also represented in core content) +- `network:fetch` (payment gateway, shipping, fulfillment APIs) +- `email:send` (order email notifications) +- `read:users` (optional, for registered customers) +- `read:media`, `write:media` (optional, for product media workflows) + +Set `allowedHosts` narrowly to gateway and external service endpoints only (avoid `*` unless required for local dev). 
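
Narrow `allowedHosts` enforcement from the capability list above can be sketched with a small host-matching helper. This is an illustrative check only — the real EmDash sandbox enforces this internally, and the `isAllowedHost` helper is hypothetical:

```typescript
// Returns true when the URL's hostname is one of the allowed hosts or a
// subdomain of one — the narrow matching a commerce plugin wants for
// gateway endpoints, instead of a wildcard.
function isAllowedHost(url: string, allowedHosts: string[]): boolean {
  const host = new URL(url).hostname;
  return allowedHosts.some(
    (allowed) => allowed === host || host.endsWith(`.${allowed}`),
  );
}

const allowed = ["api.stripe.com"];
const okFetch = isAllowedHost("https://api.stripe.com/v1/checkout/sessions", allowed);
const blockedFetch = isAllowedHost("https://evil.example.com/steal", allowed);
// okFetch: true, blockedFetch: false
```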
+ +## 3) Data model in plugin storage + +Use `ctx.storage` as the canonical structured commerce store: + +### Collections + +- `products` + - fields: `sku`, `slug`, `name`, `basePrice`, `currency`, `active`, `stockQty`, `images`, `metadata` + - indexes: `sku`, `slug`, `active`, `category`, `createdAt` +- `carts` + - fields: `cartId`, `userId`/`visitorId`, `status`, `expiresAt`, `currency`, `discountCode`, `updatedAt` + - indexes: `userId`, `status`, `expiresAt` +- `cartItems` + - fields: `cartId`, `productId`, `variantId`, `qty`, `unitPrice`, `lineTotal` + - indexes: `cartId`, `productId` +- `orders` + - fields: `orderNumber`, `cartId`, `userId`, `customerSnapshot`, `subtotal`, `tax`, `shipping`, `total`, `status`, `paymentStatus`, `paymentProviderRef`, `createdAt`, `updatedAt` + - indexes: `status`, `paymentStatus`, `userId`, `createdAt` +- `orderEvents` (optional audit trail) + - fields: `orderId`, `event`, `actor`, `payload`, `createdAt` + - indexes: `orderId`, `createdAt` + +If available, use `uniqueIndexes` for stable identifiers such as `orderNumber`/`sku` and enforce uniqueness in handlers. + +## 4) KV keys (`ctx.kv`) + +Use KV for operational config/state: + +- `settings:commerce:provider` (gateway choice, region config) +- `settings:commerce:taxRates` (tax profiles/rules) +- `state:cart:expiryMinutes` +- `state:webhook:dedupe:` (idempotency/replay protection) + +Prefixing by `settings:` and `state:` helps avoid collisions and keeps maintenance simple. 
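
The `state:webhook:dedupe:` key described above can back a simple claim-once check. The `KvLike` interface below is an assumed minimal shape for illustration — the actual `ctx.kv` API may differ:

```typescript
// Minimal KV shape assumed for this sketch; not the real ctx.kv contract.
interface KvLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

// Namespaced dedupe key, following the `state:` prefix convention.
function webhookDedupeKey(provider: string, eventId: string): string {
  return `state:webhook:dedupe:${provider}:${eventId}`;
}

// Returns true exactly once per event id; replayed deliveries return false,
// letting the webhook handler acknowledge without re-running side effects.
async function claimWebhookEvent(
  kv: KvLike,
  provider: string,
  eventId: string,
): Promise<boolean> {
  const key = webhookDedupeKey(provider, eventId);
  if ((await kv.get(key)) !== null) return false;
  await kv.put(key, new Date().toISOString());
  return true;
}
```

Note this read-then-write sketch is not atomic; under truly concurrent delivery the storage layer needs a conditional put or equivalent guard.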
+ +## 5) Public API routes (trusted plugin routes) + +Implement REST-style plugin routes under `/_emdash/api/plugins/emdash-commerce/...`: + +### Cart + +- `products.list` / `products.get` +- `cart.createOrResume` +- `cart.addItem` +- `cart.updateItem` +- `cart.removeItem` +- `cart.get` + +### Checkout + +- `checkout.create` + - validate cart state and inventory + - freeze price snapshot + - create order with status `pending` + - call payment provider session/intent endpoint via `ctx.http.fetch` +- `checkout.confirm` + - webhook handler + - verify signature and idempotency + - finalize order status and payment status + - decrement inventory and send notifications + +### Optional support endpoints + +- `shipping.estimate` +- `discount.apply` +- `coupon.validate` + +## 6) Admin UI + +Use `admin.pages` and `admin.widgets` for merchant workflows: + +- Product management page (create/edit/archive products) +- Order management page (status transitions, refunds, notes) +- Dashboard widget (today’s revenue, open carts, low stock, payout health) + +If using blocks for editor insertion, include plugin block metadata; rendering belongs to site-side Astro component integration in trusted mode. + +## 7) Payment model + +`@emdash-cms/x402` is a good EmDash-native primitive, useful for content-paywall styles or simple pay-per-content use-cases. + +For full cart checkout, start with direct gateway integration (one provider first), with a provider abstraction behind plugin settings to allow later expansion. 
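
The provider abstraction mentioned above can stay deliberately narrow. The interface and names below are hypothetical — a sketch of the seam behind plugin settings, not a real gateway SDK:

```typescript
// Narrow seam: only what checkout needs today. A second gateway later
// means one more entry in the registry, not a rewrite.
interface PaymentProvider {
  createSession(input: {
    orderId: string;
    amount: number;
    currency: string;
  }): Promise<{ sessionId: string; redirectUrl: string }>;
  verifyWebhook(payload: string, signature: string): boolean;
}

// Resolve the configured provider (e.g. from `settings:commerce:provider`).
function resolveProvider(
  setting: string,
  providers: Record<string, PaymentProvider>,
): PaymentProvider {
  const provider = providers[setting];
  if (!provider) throw new Error(`Unknown payment provider: ${setting}`);
  return provider;
}

// Stub provider standing in for a concrete gateway integration.
const stubProvider: PaymentProvider = {
  createSession: async ({ orderId }) => ({
    sessionId: `sess_${orderId}`,
    redirectUrl: `https://pay.example.test/${orderId}`,
  }),
  verifyWebhook: (_payload, signature) => signature === "valid",
};
```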
+ +## 8) Lifecycle and operational hooks + +- `plugin:install` / `plugin:activate` + - bootstrap default indexes/seed any required config references +- `plugin:deactivate` / `plugin:uninstall` + - clean up job state and optional temp data +- `cron` hook + - clear expired carts + - emit abandoned-cart reminders (email optional) +- `content`/`email` hooks +- `beforeSave/afterSave` hooks if inventory or order snapshots rely on content updates + +## 9) Transactional and reliability safeguards + +- EmDash plugin storage does not expose low-level DB transaction docs as primary contract, so use deterministic state guards: + - validate and lock inventory before order creation + - move orders through explicit states (`pending` → `authorized` → `paid` → `fulfilled`) + - keep webhook handlers idempotent using dedupe keys + - avoid double-charging and double-reserve by re-checking stock/status transitions + +## 10) Implementation phases (iterative, low risk) + +1. **Phase 1 (MVP)**: plugin descriptor, product/cart storage, public cart API routes. +2. **Phase 2**: checkout + payment session + webhook verification + order creation lifecycle. +3. **Phase 3**: admin pages/widgets, email confirmations, basic reporting metrics. +4. **Phase 4**: taxes/shipping/discounts, provider abstraction, abandoned cart automation. +5. **Phase 5**: polish (validation, logging, test coverage, docs, observability). + +## 11) Practical next steps + +From here: + +1. Scaffold plugin package (`packages/plugins/commerce`) with `definePlugin` and typed route handlers. +2. Implement `products`, `carts`, `orders` storage and minimal route handlers for adding/removing/reading cart. +3. Add checkout creation + basic payment provider integration. +4. Add admin list pages and KPI widget. 
+ +## Reference files to mirror while implementing + +- `packages/core/src/plugins/types.ts` for plugin contracts +- `docs/src/content/docs/plugins/overview.mdx` +- `docs/src/content/docs/plugins/sandbox.mdx` +- `docs/src/content/docs/plugins/storage.mdx` +- `packages/plugins/forms/src/index.ts` and `packages/plugins/forms/src/handlers/submit.ts` for full-featured route/hook/admin patterns +- `docs/src/content/docs/guides/x402-payments.mdx` for payment strategy context diff --git a/latest-code_3_review_instructions.md b/latest-code_3_review_instructions.md new file mode 100644 index 000000000..7c2142380 --- /dev/null +++ b/latest-code_3_review_instructions.md @@ -0,0 +1,12 @@ +# Third-Party Review Instructions for latest-code_3 + +## Status + +This document is a historical review instruction packet for an earlier snapshot of the project. + +## Canonical current packet + +- Use `@THIRD_PARTY_REVIEW_PACKAGE.md` and `external_review.md` as the current review entrypoint. +- `SHARE_WITH_REVIEWER.md` describes the current single-file handoff flow for external reviewers. + +For archival context, this packet remains in the repo to preserve the original review progression. diff --git a/latest-code_4_review_instructions.md b/latest-code_4_review_instructions.md new file mode 100644 index 000000000..aebbb53ea --- /dev/null +++ b/latest-code_4_review_instructions.md @@ -0,0 +1,12 @@ +# Third-Party Review Instructions for latest-code_4 + +## Status + +This document is a historical review instruction packet for an earlier snapshot of the project. + +## Canonical current packet + +- Use `@THIRD_PARTY_REVIEW_PACKAGE.md` and `external_review.md` as the current review entrypoint. +- `SHARE_WITH_REVIEWER.md` describes the current single-file handoff flow for external reviewers. + +For archival context, this packet remains in the repo to preserve the original review progression. 
diff --git a/packages/admin/tests/editor/toolbar.test.tsx b/packages/admin/tests/editor/toolbar.test.tsx index beca90966..e9d0c5c7c 100644 --- a/packages/admin/tests/editor/toolbar.test.tsx +++ b/packages/admin/tests/editor/toolbar.test.tsx @@ -123,6 +123,12 @@ async function focusAndSelectAll(screen: Awaited>) { await userEvent.keyboard(`${mod}{a}${modUp}`); } +function getBoldButton(screen: Awaited>) { + return screen + .getByRole("toolbar", { name: "Text formatting" }) + .getByRole("button", { name: "Bold" }); +} + // ============================================================================= // 1. Toolbar Presence and Structure // ============================================================================= @@ -136,7 +142,7 @@ describe("Toolbar Presence and Structure", () => { it("has all formatting buttons", async () => { const { screen } = await renderEditor(); - await expect.element(screen.getByRole("button", { name: "Bold" })).toBeVisible(); + await expect.element(getBoldButton(screen)).toBeVisible(); await expect.element(screen.getByRole("button", { name: "Italic" })).toBeVisible(); await expect.element(screen.getByRole("button", { name: "Underline" })).toBeVisible(); await expect.element(screen.getByRole("button", { name: "Strikethrough" })).toBeVisible(); @@ -205,7 +211,7 @@ describe("Formatting Button Toggle States", () => { const { screen } = await renderEditor(); await focusAndSelectAll(screen); - const btn = screen.getByRole("button", { name: "Bold" }); + const btn = getBoldButton(screen); await expect.element(btn).toHaveAttribute("aria-pressed", "false"); btn.element().click(); @@ -357,7 +363,7 @@ describe("Formatting Button Toggle States", () => { const { screen } = await renderEditor(); await focusAndSelectAll(screen); - const btn = screen.getByRole("button", { name: "Bold" }); + const btn = getBoldButton(screen); // First click: on btn.element().click(); @@ -452,7 +458,7 @@ describe("Undo/Redo", () => { await focusAndSelectAll(screen); // Make 
a change - toggle bold - screen.getByRole("button", { name: "Bold" }).element().click(); + getBoldButton(screen).element().click(); const undo = screen.getByRole("button", { name: "Undo" }); await vi.waitFor( @@ -468,7 +474,7 @@ describe("Undo/Redo", () => { await focusAndSelectAll(screen); // Make a change - screen.getByRole("button", { name: "Bold" }).element().click(); + getBoldButton(screen).element().click(); const undo = screen.getByRole("button", { name: "Undo" }); const redo = screen.getByRole("button", { name: "Redo" }); @@ -495,7 +501,7 @@ describe("Undo/Redo", () => { await focusAndSelectAll(screen); // Make a change - screen.getByRole("button", { name: "Bold" }).element().click(); + getBoldButton(screen).element().click(); const undo = screen.getByRole("button", { name: "Undo" }); const redo = screen.getByRole("button", { name: "Redo" }); @@ -706,7 +712,7 @@ describe("WAI-ARIA Keyboard Navigation", () => { it("ArrowRight from Bold moves focus to Italic", async () => { const { screen } = await renderEditor(); - const bold = screen.getByRole("button", { name: "Bold" }); + const bold = getBoldButton(screen); const italic = screen.getByRole("button", { name: "Italic" }); // Focus the Bold button @@ -724,7 +730,7 @@ describe("WAI-ARIA Keyboard Navigation", () => { it("ArrowLeft from Italic moves focus to Bold", async () => { const { screen } = await renderEditor(); - const bold = screen.getByRole("button", { name: "Bold" }); + const bold = getBoldButton(screen); const italic = screen.getByRole("button", { name: "Italic" }); // Focus the Italic button @@ -742,7 +748,7 @@ describe("WAI-ARIA Keyboard Navigation", () => { it("Home moves focus to first button", async () => { const { screen } = await renderEditor(); - const bold = screen.getByRole("button", { name: "Bold" }); + const bold = getBoldButton(screen); const alignCenter = screen.getByRole("button", { name: "Align Center" }); // Focus a button in the middle @@ -759,7 +765,7 @@ describe("WAI-ARIA Keyboard 
Navigation", () => { it("End moves focus to last button", async () => { const { screen } = await renderEditor(); - const bold = screen.getByRole("button", { name: "Bold" }); + const bold = getBoldButton(screen); // Focus the first button bold.element().focus(); @@ -778,7 +784,7 @@ describe("WAI-ARIA Keyboard Navigation", () => { const { screen } = await renderEditor(); const spotlightBtn = screen.getByRole("button", { name: "Spotlight Mode" }); - const bold = screen.getByRole("button", { name: "Bold" }); + const bold = getBoldButton(screen); // Focus the last button spotlightBtn.element().focus(); @@ -794,7 +800,7 @@ describe("WAI-ARIA Keyboard Navigation", () => { it("ArrowLeft wraps from first to last button", async () => { const { screen } = await renderEditor(); - const bold = screen.getByRole("button", { name: "Bold" }); + const bold = getBoldButton(screen); // Focus the first button bold.element().focus(); diff --git a/packages/auth/src/adapters/kysely.ts b/packages/auth/src/adapters/kysely.ts index 24d3207a0..b5c5857c7 100644 --- a/packages/auth/src/adapters/kysely.ts +++ b/packages/auth/src/adapters/kysely.ts @@ -93,9 +93,8 @@ interface AllowedDomainTable { // ============================================================================ export function createKyselyAdapter(db: Kysely): AuthAdapter { - // Type cast to work with generic Kysely instance - // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- generic Kysely narrowed to concrete AuthTables for internal queries - const kdb = db as unknown as Kysely; + // `Kysely` is structurally compatible at runtime with the subset this adapter reads/writes. 
+ const kdb = db as Kysely; return { // ======================================================================== diff --git a/packages/cloudflare/src/db/d1-introspector.ts b/packages/cloudflare/src/db/d1-introspector.ts index 60ebf41db..01e5d8ccc 100644 --- a/packages/cloudflare/src/db/d1-introspector.ts +++ b/packages/cloudflare/src/db/d1-introspector.ts @@ -7,29 +7,31 @@ * This introspector queries tables individually instead. */ -import type { DatabaseIntrospector, DatabaseMetadata, SchemaMetadata, TableMetadata } from "kysely"; +import type { + DatabaseIntrospector, + DatabaseMetadata, + Kysely, + SchemaMetadata, + TableMetadata, +} from "kysely"; import { sql } from "kysely"; // Kysely's default migration table names const DEFAULT_MIGRATION_TABLE = "kysely_migration"; const DEFAULT_MIGRATION_LOCK_TABLE = "kysely_migration_lock"; -// Kysely's DatabaseIntrospector.createIntrospector receives Kysely. -// We must use `any` here to match Kysely's own interface contract — -// it needs untyped schema access to query sqlite_master dynamically. 
-// eslint-disable-next-line @typescript-eslint/no-explicit-any -type AnyKysely = any; +type IntrospectorShape = Record>; // Regex patterns for parsing CREATE TABLE statements const SPLIT_PARENS_PATTERN = /[(),]/; const WHITESPACE_PATTERN = /\s+/; const QUOTES_PATTERN = /["`]/g; -export class D1Introspector implements DatabaseIntrospector { - readonly #db: AnyKysely; +export class D1Introspector implements DatabaseIntrospector { + readonly #db: Kysely; - constructor(db: AnyKysely) { - this.#db = db; + constructor(db: Kysely) { + this.#db = db as Kysely; } async getSchemas(): Promise { diff --git a/packages/cloudflare/src/db/d1.ts b/packages/cloudflare/src/db/d1.ts index 4ef0e8962..cdbca5bf4 100644 --- a/packages/cloudflare/src/db/d1.ts +++ b/packages/cloudflare/src/db/d1.ts @@ -9,6 +9,7 @@ */ import { env } from "cloudflare:workers"; +import type { Database } from "emdash"; import type { DatabaseIntrospector, Dialect, Kysely } from "kysely"; import { D1Dialect } from "kysely-d1"; @@ -30,7 +31,7 @@ interface D1Config { * cross-join with pragma_table_info() that D1 doesn't allow. */ class EmDashD1Dialect extends D1Dialect { - override createIntrospector(db: Kysely): DatabaseIntrospector { + override createIntrospector(db: Kysely): DatabaseIntrospector { return new D1Introspector(db); } } diff --git a/packages/cloudflare/src/db/do-dialect.ts b/packages/cloudflare/src/db/do-dialect.ts index 391b15f3e..81acfa378 100644 --- a/packages/cloudflare/src/db/do-dialect.ts +++ b/packages/cloudflare/src/db/do-dialect.ts @@ -5,6 +5,7 @@ * Preview mode is read-only — no transaction support needed. 
*/ +import type { Database } from "emdash"; import type { CompiledQuery, DatabaseConnection, @@ -62,7 +63,7 @@ export class PreviewDODialect implements Dialect { return new SqliteQueryCompiler(); } - createIntrospector(db: Kysely): DatabaseIntrospector { + createIntrospector(db: Kysely): DatabaseIntrospector { return new D1Introspector(db); } } diff --git a/packages/cloudflare/src/db/do-preview.ts b/packages/cloudflare/src/db/do-preview.ts index 0f1feb968..82be0d50a 100644 --- a/packages/cloudflare/src/db/do-preview.ts +++ b/packages/cloudflare/src/db/do-preview.ts @@ -22,6 +22,7 @@ import type { MiddlewareHandler } from "astro"; import { env } from "cloudflare:workers"; +import type { Database } from "emdash"; import { runWithContext } from "emdash/request-context"; import { Kysely } from "kysely"; import { ulid } from "ulidx"; @@ -220,13 +221,12 @@ export function createPreviewMiddleware(config: PreviewMiddlewareConfig): Middle // --- 4. Create Kysely dialect pointing at the DO --- const getStub = (): PreviewDBStub => { // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- RPC type limitation - return stub as unknown as PreviewDBStub; + return stub as PreviewDBStub; }; const dialect = new PreviewDODialect({ getStub }); // --- 5. 
Create Kysely instance and override request-context DB --- - // eslint-disable-next-line @typescript-eslint/no-explicit-any - const previewDb = new Kysely({ dialect }); + const previewDb = new Kysely({ dialect }); return runWithContext( { diff --git a/packages/cloudflare/src/db/do.ts b/packages/cloudflare/src/db/do.ts index c07ee3014..e2b5a34ea 100644 --- a/packages/cloudflare/src/db/do.ts +++ b/packages/cloudflare/src/db/do.ts @@ -48,7 +48,7 @@ export function createDialect(config: PreviewDOConfig & { name: string }): Diale const getStub = (): PreviewDBStub => { const stub = namespace.get(id); // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- Rpc type limitation with unknown in return types - return stub as unknown as PreviewDBStub; + return stub as PreviewDBStub; }; return new PreviewDODialect({ getStub }); diff --git a/packages/cloudflare/src/db/playground-middleware.ts b/packages/cloudflare/src/db/playground-middleware.ts index e56b55237..72ac8888a 100644 --- a/packages/cloudflare/src/db/playground-middleware.ts +++ b/packages/cloudflare/src/db/playground-middleware.ts @@ -15,6 +15,7 @@ import { defineMiddleware } from "astro:middleware"; import { env } from "cloudflare:workers"; +import type { Database } from "emdash"; import { Kysely, sql } from "kysely"; import { ulid } from "ulidx"; // @ts-ignore - virtual module populated by EmDash integration at build time @@ -79,7 +80,7 @@ function getStub(binding: string, token: string): PreviewDBStub { const doId = namespace.idFromName(token); const stub = namespace.get(doId); // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- RPC type limitation - return stub as unknown as PreviewDBStub; + return stub as PreviewDBStub; } /** @@ -117,11 +118,7 @@ function getSessionCreatedAt(token: string): string { /** * Initialize a playground DO: run migrations, apply seed, create admin user. 
*/ -async function initializePlayground( - // eslint-disable-next-line @typescript-eslint/no-explicit-any - db: Kysely, - token: string, -): Promise { +async function initializePlayground(db: Kysely, token: string): Promise { // Check if already initialized (persisted in the DO) try { const { rows } = await sql<{ value: string }>` @@ -259,7 +256,7 @@ export const onRequest = defineMiddleware(async (context, next) => { const stub = getStub(binding, token); const dialect = new PreviewDODialect({ getStub: () => stub }); // eslint-disable-next-line @typescript-eslint/no-explicit-any - const db = new Kysely({ dialect }); + const db = new Kysely({ dialect }); if (!initializedSessions.has(token)) { await initializePlayground(db, token); @@ -300,7 +297,7 @@ export const onRequest = defineMiddleware(async (context, next) => { const stub = getStub(binding, token); const dialect = new PreviewDODialect({ getStub: () => stub }); // eslint-disable-next-line @typescript-eslint/no-explicit-any - const db = new Kysely({ dialect }); + const db = new Kysely({ dialect }); // Ensure initialized if (!initializedSessions.has(token)) { diff --git a/packages/cloudflare/src/plugins/vectorize-search.ts b/packages/cloudflare/src/plugins/vectorize-search.ts index 586981dea..a28a79b5f 100644 --- a/packages/cloudflare/src/plugins/vectorize-search.ts +++ b/packages/cloudflare/src/plugins/vectorize-search.ts @@ -45,6 +45,8 @@ import type { PluginDefinition, PluginContext, RouteContext, ContentHookEvent } from "emdash"; import { extractPlainText } from "emdash"; +const ASTRO_LOCALS_SYMBOL = Symbol.for("astro.locals"); + /** Safely extract a string from an unknown value */ function toString(value: unknown): string { return typeof value === "string" ? 
value : ""; @@ -55,6 +57,27 @@ function isRecord(value: unknown): value is Record { return value != null && typeof value === "object" && !Array.isArray(value); } +interface AstroRequestLocals { + runtime?: { + env?: CloudflareEnv; + }; +} + +interface PortableTextLikeBlock { + _type: string; + [key: string]: unknown; +} + +function isPortableTextLikeArray(value: unknown[]): value is PortableTextLikeBlock[] { + return value.every( + (item) => + item !== null && + typeof item === "object" && + "_type" in item && + typeof (item as { _type?: unknown })._type === "string", + ); +} + /** * Vectorize Search Plugin Configuration */ @@ -84,8 +107,7 @@ export interface VectorizeSearchConfig { function getCloudflareEnv(request: Request): CloudflareEnv | null { // Access runtime.env from Astro's Cloudflare adapter // This is available when running on Cloudflare Workers - // eslint-disable-next-line @typescript-eslint/no-explicit-any, typescript-eslint(no-unsafe-type-assertion) -- Astro locals accessed via internal symbol; no typed API available - const locals = (request as any)[Symbol.for("astro.locals")]; + const locals = (request as { [ASTRO_LOCALS_SYMBOL]?: AstroRequestLocals })[ASTRO_LOCALS_SYMBOL]; if (locals?.runtime?.env) { return locals.runtime.env; } @@ -112,9 +134,7 @@ function extractSearchableText(content: Record): string { const text = extractPlainText(value); if (text) parts.push(text); } else if (Array.isArray(value)) { - // Assume Portable Text array - // eslint-disable-next-line @typescript-eslint/no-explicit-any, typescript-eslint(no-unsafe-type-assertion) -- Portable Text arrays are untyped at this point; extractPlainText handles validation - const text = extractPlainText(value as any); + const text = isPortableTextLikeArray(value) ? 
extractPlainText(value) : JSON.stringify(value); if (text) parts.push(text); } } diff --git a/packages/cloudflare/tests/db/playground-dialect.test.ts b/packages/cloudflare/tests/db/playground-dialect.test.ts index 7c776828d..255c4cdd5 100644 --- a/packages/cloudflare/tests/db/playground-dialect.test.ts +++ b/packages/cloudflare/tests/db/playground-dialect.test.ts @@ -1,3 +1,4 @@ +import type { Database } from "emdash"; import { Kysely } from "kysely"; import { describe, it, expect } from "vitest"; @@ -32,7 +33,7 @@ describe("playground dummy dialect", () => { it("throws when a query is executed (no middleware ALS override)", async () => { const dialect = createTestDialect(); - const db = new Kysely({ dialect }); + const db = new Kysely({ dialect }); await expect( db diff --git a/packages/core/src/api/handlers/marketplace.ts b/packages/core/src/api/handlers/marketplace.ts index 6dcadb9f7..bf73937a8 100644 --- a/packages/core/src/api/handlers/marketplace.ts +++ b/packages/core/src/api/handlers/marketplace.ts @@ -241,10 +241,7 @@ export async function loadBundleFromR2( const parsed: unknown = JSON.parse(manifestText); const result = pluginManifestSchema.safeParse(parsed); if (!result.success) return null; - // Elements are validated as unknown[] by Zod; cast to PluginManifest - // for the Element[] type (Block Kit validation happens at render time). 
- // eslint-disable-next-line @typescript-eslint/no-unsafe-type-assertion -- Zod types elements as unknown[]; Element type validated at render time - const manifest = result.data as unknown as PluginManifest; + const manifest = result.data; // Try to load admin code (optional) let adminCode: string | undefined; diff --git a/packages/core/src/api/openapi/document.ts b/packages/core/src/api/openapi/document.ts index 35e7290bd..ca0bdfd5f 100644 --- a/packages/core/src/api/openapi/document.ts +++ b/packages/core/src/api/openapi/document.ts @@ -2249,7 +2249,7 @@ const userPaths = { // Merge all paths // --------------------------------------------------------------------------- -const allPaths = { +const allPaths: ZodOpenApiPathsObject = { ...contentPaths, ...mediaPaths, ...schemaPaths, @@ -2362,7 +2362,6 @@ export function generateOpenApiDocument(): oas31.OpenAPIObject { }, }, security: [{ session: [] }, { bearer: [] }], - // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- readonly const paths are compatible at runtime - paths: allPaths as unknown as ZodOpenApiPathsObject, + paths: allPaths, }); } diff --git a/packages/core/src/astro/routes/api/auth/oauth/[provider].ts b/packages/core/src/astro/routes/api/auth/oauth/[provider].ts index d8150d5c6..eaccaedde 100644 --- a/packages/core/src/astro/routes/api/auth/oauth/[provider].ts +++ b/packages/core/src/astro/routes/api/auth/oauth/[provider].ts @@ -66,6 +66,12 @@ function getOAuthConfig(env: Record): OAuthConsumerConfig["prov return providers; } +type RuntimeLocals = { + runtime?: { + env?: Record; + }; +}; + export const GET: APIRoute = async ({ params, request, locals, redirect }) => { const { emdash } = locals; const provider = params.provider; @@ -88,8 +94,7 @@ export const GET: APIRoute = async ({ params, request, locals, redirect }) => { // Get OAuth providers from environment // Access via locals.runtime for Cloudflare, or import.meta.env for Node - // eslint-disable-next-line 
typescript-eslint(no-unsafe-type-assertion) -- locals.runtime is injected by the Cloudflare adapter at runtime; not declared on App.Locals since the adapter is optional - const runtimeLocals = locals as unknown as { runtime?: { env?: Record } }; + const runtimeLocals = locals as RuntimeLocals; // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- import.meta.env is typed as ImportMetaEnv but we need Record for getOAuthConfig const env = runtimeLocals.runtime?.env ?? (import.meta.env as Record); const providers = getOAuthConfig(env); diff --git a/packages/core/src/astro/routes/api/auth/oauth/[provider]/callback.ts b/packages/core/src/astro/routes/api/auth/oauth/[provider]/callback.ts index 7c69cd613..4a6059693 100644 --- a/packages/core/src/astro/routes/api/auth/oauth/[provider]/callback.ts +++ b/packages/core/src/astro/routes/api/auth/oauth/[provider]/callback.ts @@ -21,6 +21,12 @@ import { createOAuthStateStore } from "#auth/oauth-state-store.js"; type ProviderName = "github" | "google"; +type RuntimeLocals = { + runtime?: { + env?: Record; + }; +}; + const VALID_PROVIDERS = new Set(["github", "google"]); function isValidProvider(provider: string): provider is ProviderName { @@ -113,8 +119,7 @@ export const GET: APIRoute = async ({ params, request, locals, session, redirect try { // Get OAuth providers from environment - // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- locals.runtime is injected by the Cloudflare adapter at runtime; not declared on App.Locals since the adapter is optional - const runtimeLocals = locals as unknown as { runtime?: { env?: Record } }; + const runtimeLocals = locals as RuntimeLocals; // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- import.meta.env is typed as ImportMetaEnv but we need Record for getOAuthConfig const env = runtimeLocals.runtime?.env ?? 
(import.meta.env as Record); const providers = getOAuthConfig(env); diff --git a/packages/core/src/astro/routes/api/import/wordpress-plugin/execute.ts b/packages/core/src/astro/routes/api/import/wordpress-plugin/execute.ts index 54fb42923..84f5972ab 100644 --- a/packages/core/src/astro/routes/api/import/wordpress-plugin/execute.ts +++ b/packages/core/src/astro/routes/api/import/wordpress-plugin/execute.ts @@ -16,7 +16,7 @@ import { wpPluginExecuteBody } from "#api/schemas.js"; import { BylineRepository } from "#db/repositories/byline.js"; import { getSource } from "#import/index.js"; import { validateExternalUrl, SsrfError } from "#import/ssrf.js"; -import type { ImportConfig, ImportResult, NormalizedItem } from "#import/types.js"; +import type { ImportConfig, ImportResult, NormalizedItem, PostTypeMapping } from "#import/types.js"; import { resolveImportByline } from "#import/utils.js"; import type { FieldType } from "#schema/types.js"; import type { EmDashHandlers, EmDashManifest } from "#types"; @@ -35,6 +35,54 @@ export interface WpPluginImportResponse { error?: { message: string }; } +function isRecord(value: unknown): value is Record { + return typeof value === "object" && value !== null && !Array.isArray(value); +} + +function isPostTypeMapping(value: unknown): value is PostTypeMapping { + if (!isRecord(value)) return false; + return typeof value.collection === "string" && typeof value.enabled === "boolean"; +} + +function parseWpPluginImportConfig( + rawConfig: Record, +): WpPluginImportConfig | null { + if (!isRecord(rawConfig.postTypeMappings)) return null; + const postTypeMappings: Record = {}; + for (const [postType, rawMapping] of Object.entries(rawConfig.postTypeMappings)) { + if (!isPostTypeMapping(rawMapping)) return null; + postTypeMappings[postType] = rawMapping; + } + + if (Object.keys(postTypeMappings).length === 0) return null; + + const config: WpPluginImportConfig = { + postTypeMappings, + }; + + if (rawConfig.skipExisting !== undefined && 
rawConfig.skipExisting !== null) { + if (typeof rawConfig.skipExisting !== "boolean") return null; + config.skipExisting = rawConfig.skipExisting; + } + + if (rawConfig.authorMappings !== undefined && rawConfig.authorMappings !== null) { + if (!isRecord(rawConfig.authorMappings)) return null; + const authorMappings: Record = {}; + for (const [login, userId] of Object.entries(rawConfig.authorMappings)) { + if (typeof userId === "string" || userId === null) { + authorMappings[login] = userId; + } else { + return null; + } + } + if (Object.keys(authorMappings).length > 0) { + config.authorMappings = authorMappings; + } + } + + return config; +} + export const POST: APIRoute = async ({ request, locals }) => { const { emdash, emdashManifest, user } = locals; @@ -57,8 +105,10 @@ export const POST: APIRoute = async ({ request, locals }) => { return apiError("SSRF_BLOCKED", msg, 400); } - // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- Zod schema output narrowed to WpPluginImportConfig - const config = body.config as unknown as WpPluginImportConfig; + const config = parseWpPluginImportConfig(body.config); + if (!config) { + return apiError("VALIDATION_ERROR", `Invalid import config`, 400); + } // Get the WordPress plugin source const source = getSource("wordpress-plugin"); diff --git a/packages/core/src/astro/routes/api/import/wordpress/rewrite-urls.ts b/packages/core/src/astro/routes/api/import/wordpress/rewrite-urls.ts index 1002a4019..63b923dc5 100644 --- a/packages/core/src/astro/routes/api/import/wordpress/rewrite-urls.ts +++ b/packages/core/src/astro/routes/api/import/wordpress/rewrite-urls.ts @@ -357,17 +357,18 @@ async function rewriteUrls( if (rowUpdated) { try { - // Build update query dynamically - // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- Kysely dynamic table requires type assertion - let query = db.updateTable(tableName as any).where("id", "=", row.id); - - for (const [key, value] of 
Object.entries(updates)) { - // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- Kysely dynamic column update requires type assertion - query = query.set({ [key]: value } as any); + const setClauses = Object.entries(updates).map( + ([key, value]) => sql`${sql.ref(key)} = ${value}`, + ); + + if (setClauses.length > 0) { + await sql` + UPDATE ${sql.ref(tableName)} + SET ${sql.join(setClauses, sql`, `)} + WHERE id = ${row.id} + `.execute(db); } - await query.execute(); - result.updated++; result.urlsRewritten += rowUrlsRewritten; result.byCollection[collection.slug] = (result.byCollection[collection.slug] || 0) + 1; diff --git a/packages/core/src/auth/rate-limit.ts b/packages/core/src/auth/rate-limit.ts index 2710be0e3..7127f5635 100644 --- a/packages/core/src/auth/rate-limit.ts +++ b/packages/core/src/auth/rate-limit.ts @@ -112,8 +112,7 @@ export function rateLimitResponse(retryAfterSeconds: number): Response { */ export function getClientIp(request: Request): string | null { const headers = request.headers; - // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- CF Workers runtime shape - const cf = (request as unknown as { cf?: Record }).cf; + const cf = (request as { cf?: Record }).cf; if (!cf) { // Not on Cloudflare — no trusted source of client IP diff --git a/packages/core/src/cleanup.ts b/packages/core/src/cleanup.ts index ef1014e0a..8f339334d 100644 --- a/packages/core/src/cleanup.ts +++ b/packages/core/src/cleanup.ts @@ -68,10 +68,8 @@ export async function runSystemCleanup( // 2. Magic link / invite / signup tokens try { - // Cast needed: Database extends AuthTables but uses Generated<> wrappers - // that confuse structural checks. The adapter casts internally anyway. 
-    // eslint-disable-next-line @typescript-eslint/no-unsafe-type-assertion -- Database uses Generated<> wrappers incompatible with AuthTables structurally; safe at runtime
-    const authAdapter = createKyselyAdapter(db as unknown as Kysely<AuthTables>);
+    // `db` includes all core tables; we only need AuthTables for this adapter.
+    const authAdapter = createKyselyAdapter(db as Kysely<AuthTables>);
     await authAdapter.deleteExpiredTokens();
     result.expiredTokens = 0; // deleteExpiredTokens returns void
   } catch (error) {
diff --git a/packages/core/src/cli/commands/bundle.ts b/packages/core/src/cli/commands/bundle.ts
index 2f10a8aec..7c1af67da 100644
--- a/packages/core/src/cli/commands/bundle.ts
+++ b/packages/core/src/cli/commands/bundle.ts
@@ -212,7 +212,7 @@ export const bundleCommand = defineCommand({
   } else if (typeof pluginModule.default === "object" && pluginModule.default !== null) {
     const defaultExport = pluginModule.default as Record<string, unknown>;
     if ("id" in defaultExport && "version" in defaultExport) {
-      resolvedPlugin = defaultExport as unknown as ResolvedPlugin;
+      resolvedPlugin = defaultExport as ResolvedPlugin;
     }
   }
diff --git a/packages/core/src/database/dialect-helpers.ts b/packages/core/src/database/dialect-helpers.ts
index a3fe6f408..a87f88dd7 100644
--- a/packages/core/src/database/dialect-helpers.ts
+++ b/packages/core/src/database/dialect-helpers.ts
@@ -14,26 +14,24 @@ import type { ColumnDataType, Kysely, RawBuilder } from "kysely";
 import { sql } from "kysely";
 import type { DatabaseDialectType } from "../db/adapters.js";
+import type { Database } from "./types.js";

 export type { DatabaseDialectType };

 /**
  * Detect dialect type from a Kysely instance via the adapter class name.
 */
-// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance
-export function detectDialect(db: Kysely<any>): DatabaseDialectType {
+export function detectDialect(db: Kysely<Database>): DatabaseDialectType {
   const name = db.getExecutor().adapter.constructor.name;
   if (name === "PostgresAdapter") return "postgres";
   return "sqlite";
 }

-// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance
-export function isSqlite(db: Kysely<any>): boolean {
+export function isSqlite(db: Kysely<Database>): boolean {
   return detectDialect(db) === "sqlite";
 }

-// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance
-export function isPostgres(db: Kysely<any>): boolean {
+export function isPostgres(db: Kysely<Database>): boolean {
   return detectDialect(db) === "postgres";
 }

@@ -44,8 +42,7 @@ export function isPostgres(db: Kysely<any>): boolean {
  * sqlite: (datetime('now'))
  * postgres: CURRENT_TIMESTAMP
  */
-// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance
-export function currentTimestamp(db: Kysely<any>): RawBuilder {
+export function currentTimestamp(db: Kysely<Database>): RawBuilder {
   if (isPostgres(db)) {
     return sql`CURRENT_TIMESTAMP`;
   }
@@ -59,8 +56,7 @@ export function currentTimestamp(db: Kysely<any>): RawBuilder {
  * sqlite: datetime('now')
  * postgres: CURRENT_TIMESTAMP
  */
-// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance
-export function currentTimestampValue(db: Kysely<any>): RawBuilder {
+export function currentTimestampValue(db: Kysely<Database>): RawBuilder {
   if (isPostgres(db)) {
     return sql`CURRENT_TIMESTAMP`;
   }
@@ -70,8 +66,7 @@ export function currentTimestampValue(db: Kysely<any>): RawBuilder {

 /**
  * Check if a table exists in the database.
*/ -// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance -export async function tableExists(db: Kysely, tableName: string): Promise { +export async function tableExists(db: Kysely, tableName: string): Promise { if (isPostgres(db)) { const result = await sql<{ exists: boolean }>` SELECT EXISTS( @@ -92,8 +87,7 @@ export async function tableExists(db: Kysely, tableName: string): Promise, pattern: string): Promise { +export async function listTablesLike(db: Kysely, pattern: string): Promise { if (isPostgres(db)) { const result = await sql<{ table_name: string }>` SELECT table_name FROM information_schema.tables @@ -115,8 +109,7 @@ export async function listTablesLike(db: Kysely, pattern: string): Promise< * sqlite: blob * postgres: bytea */ -// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance -export function binaryType(db: Kysely): ColumnDataType { +export function binaryType(db: Kysely): ColumnDataType { if (isPostgres(db)) { return "bytea"; } @@ -129,8 +122,7 @@ export function binaryType(db: Kysely): ColumnDataType { * sqlite: json_extract(column, '$.path') * postgres: column->>'path' */ -// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance -export function jsonExtractExpr(db: Kysely, column: string, path: string): string { +export function jsonExtractExpr(db: Kysely, column: string, path: string): string { if (isPostgres(db)) { return `${column}->>'${path}'`; } diff --git a/packages/core/src/database/repositories/content.ts b/packages/core/src/database/repositories/content.ts index 1016d7e25..927d57767 100644 --- a/packages/core/src/database/repositories/content.ts +++ b/packages/core/src/database/repositories/content.ts @@ -464,6 +464,7 @@ export class ContentRepository { // Validate order direction to prevent injection const safeOrderDirection = orderDirection.toLowerCase() === "asc" ? 
"ASC" : "DESC"; + const orderColumn = sql.ref(dbField); // Build query with parameterized values (no string interpolation) // Note: Dynamic content tables have deleted_at column, cast needed for Kysely @@ -482,7 +483,7 @@ export class ContentRepository { } if (options.where?.locale) { - query = query.where("locale" as any, "=", options.where.locale); + query = query.where(sql`locale = ${options.where.locale}`); } // Handle cursor pagination @@ -492,18 +493,12 @@ export class ContentRepository { const { orderValue, id: cursorId } = decoded; if (safeOrderDirection === "DESC") { - query = query.where((eb) => - eb.or([ - eb(dbField as any, "<", orderValue), - eb.and([eb(dbField as any, "=", orderValue), eb("id", "<", cursorId)]), - ]), + query = query.where( + sql`(${orderColumn} < ${orderValue} OR (${orderColumn} = ${orderValue} AND "id" < ${cursorId}))`, ); } else { - query = query.where((eb) => - eb.or([ - eb(dbField as any, ">", orderValue), - eb.and([eb(dbField as any, "=", orderValue), eb("id", ">", cursorId)]), - ]), + query = query.where( + sql`(${orderColumn} > ${orderValue} OR (${orderColumn} = ${orderValue} AND "id" > ${cursorId}))`, ); } } @@ -511,7 +506,7 @@ export class ContentRepository { // Apply ordering and limit query = query - .orderBy(dbField as any, safeOrderDirection === "ASC" ? "asc" : "desc") + .orderBy(orderColumn, safeOrderDirection === "ASC" ? "asc" : "desc") .orderBy("id", safeOrderDirection === "ASC" ? "asc" : "desc") .limit(limit + 1); @@ -660,6 +655,7 @@ export class ContentRepository { const dbField = this.mapOrderField(orderField); const safeOrderDirection = orderDirection.toLowerCase() === "asc" ? 
"ASC" : "DESC"; + const orderColumn = sql.ref(dbField); let query = this.db .selectFrom(tableName as keyof Database) @@ -673,25 +669,19 @@ export class ContentRepository { const { orderValue, id: cursorId } = decoded; if (safeOrderDirection === "DESC") { - query = query.where((eb) => - eb.or([ - eb(dbField as any, "<", orderValue), - eb.and([eb(dbField as any, "=", orderValue), eb("id", "<", cursorId)]), - ]), + query = query.where( + sql`(${orderColumn} < ${orderValue} OR (${orderColumn} = ${orderValue} AND "id" < ${cursorId}))`, ); } else { - query = query.where((eb) => - eb.or([ - eb(dbField as any, ">", orderValue), - eb.and([eb(dbField as any, "=", orderValue), eb("id", ">", cursorId)]), - ]), + query = query.where( + sql`(${orderColumn} > ${orderValue} OR (${orderColumn} = ${orderValue} AND "id" > ${cursorId}))`, ); } } } query = query - .orderBy(dbField as any, safeOrderDirection === "ASC" ? "asc" : "desc") + .orderBy(orderColumn, safeOrderDirection === "ASC" ? "asc" : "desc") .orderBy("id", safeOrderDirection === "ASC" ? 
"asc" : "desc") .limit(limit + 1); @@ -760,7 +750,7 @@ export class ContentRepository { } if (where?.locale) { - query = query.where("locale" as any, "=", where.locale); + query = query.where(sql`locale = ${where.locale}`); } const result = await query.executeTakeFirst(); diff --git a/packages/core/src/database/repositories/plugin-storage.ts b/packages/core/src/database/repositories/plugin-storage.ts index bd115f1de..dc3b22a6f 100644 --- a/packages/core/src/database/repositories/plugin-storage.ts +++ b/packages/core/src/database/repositories/plugin-storage.ts @@ -25,6 +25,7 @@ import type { } from "../../plugins/types.js"; import { withTransaction } from "../transaction.js"; import type { Database } from "../types.js"; +import { isUniqueConstraintViolation } from "../unique-constraint.js"; import { encodeCursor, decodeCursor } from "./types.js"; /** @@ -87,6 +88,57 @@ export class PluginStorageRepository implements StorageCollection { + const now = new Date().toISOString(); + const jsonData = JSON.stringify(data); + + try { + await this.db + .insertInto("_plugin_storage") + .values({ + plugin_id: this.pluginId, + collection: this.collection, + id, + data: jsonData, + created_at: now, + updated_at: now, + }) + .execute(); + return true; + } catch (error) { + if (isUniqueConstraintViolation(error)) return false; + throw error; + } + } + + /** + * Replace a document only when the row version matches the expected value. + * Returns true when the document was updated. + */ + async compareAndSwap(id: string, expectedVersion: string, data: T): Promise { + const now = new Date().toISOString(); + const jsonData = JSON.stringify(data); + + const result = await this.db + .updateTable("_plugin_storage") + .set({ + data: jsonData, + updated_at: now, + }) + .where("plugin_id", "=", this.pluginId) + .where("collection", "=", this.collection) + .where("id", "=", id) + .where("updated_at", "=", expectedVersion) + .executeTakeFirst(); + + return Number(result.numUpdatedRows ?? 
0) > 0;
+  }
+
   /**
    * Delete a document
    */
diff --git a/packages/core/src/database/unique-constraint.ts b/packages/core/src/database/unique-constraint.ts
new file mode 100644
index 000000000..399aa7eb5
--- /dev/null
+++ b/packages/core/src/database/unique-constraint.ts
@@ -0,0 +1,58 @@
+/**
+ * Detect duplicate-key / unique constraint failures across SQL drivers.
+ * Used by insert-only paths (e.g. `putIfAbsent`) where conflict must map to `false`, not throw.
+ */
+
+function messageLooksLikeUniqueViolation(message: string): boolean {
+  const m = message.toLowerCase();
+  return (
+    m.includes("unique constraint failed") ||
+    m.includes("uniqueness violation") ||
+    m.includes("duplicate key value violates unique constraint") ||
+    m.includes("duplicate entry")
+  );
+}
+
+function readPgCode(err: unknown): string | undefined {
+  if (!err || typeof err !== "object") return undefined;
+  const o = err as Record<string, unknown>;
+  const code = o.code;
+  if (typeof code === "string" && code.length > 0) return code;
+  const cause = o.cause;
+  if (cause && typeof cause === "object") {
+    const c = cause as Record<string, unknown>;
+    if (typeof c.code === "string") return c.code;
+  }
+  return undefined;
+}
+
+/**
+ * Returns true when `error` represents a primary/unique constraint violation on insert.
+ */
+export function isUniqueConstraintViolation(error: unknown): boolean {
+  if (error == null) return false;
+
+  const pg = readPgCode(error);
+  if (pg === "23505") return true;
+
+  let current: unknown = error;
+  const seen = new Set<unknown>();
+  for (let depth = 0; depth < 6 && current != null && !seen.has(current); depth++) {
+    seen.add(current);
+    if (current instanceof Error) {
+      if (messageLooksLikeUniqueViolation(current.message)) return true;
+      current = (current as Error & { cause?: unknown }).cause;
+      continue;
+    }
+    if (typeof current === "object") {
+      const o = current as Record<string, unknown>;
+      const msg = o.message;
+      if (typeof msg === "string" && messageLooksLikeUniqueViolation(msg)) return true;
+      current = o.cause;
+      continue;
+    }
+    break;
+  }
+
+  return false;
+}
diff --git a/packages/core/src/loader.ts b/packages/core/src/loader.ts
index 99f0dadd0..0bc8a5dc1 100644
--- a/packages/core/src/loader.ts
+++ b/packages/core/src/loader.ts
@@ -218,9 +218,8 @@ export type OrderBySpec = Record;
 * When filtering for 'published' status, also include scheduled content
 * whose scheduled_at time has passed (treating it as effectively published).
*/ -// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance function buildStatusCondition( - db: Kysely, + db: Kysely, status: string, tablePrefix?: string, ): ReturnType { diff --git a/packages/core/src/plugins/adapt-sandbox-entry.ts b/packages/core/src/plugins/adapt-sandbox-entry.ts index 5a2da9475..5133a1485 100644 --- a/packages/core/src/plugins/adapt-sandbox-entry.ts +++ b/packages/core/src/plugins/adapt-sandbox-entry.ts @@ -15,6 +15,7 @@ import { PLUGIN_CAPABILITIES, HOOK_NAMES } from "./manifest-schema.js"; import type { StandardPluginDefinition, StandardHookEntry, + StandardRouteEntry, StandardHookHandler, ResolvedPlugin, ResolvedPluginHooks, @@ -104,6 +105,7 @@ export function adaptSandboxEntry( const resolvedHooks: ResolvedPluginHooks = {}; if (definition.hooks) { for (const [hookName, entry] of Object.entries(definition.hooks)) { + const standardHook = entry as StandardHookEntry; if (!VALID_HOOK_NAMES_SET.has(hookName)) { throw new Error( `Plugin "${pluginId}" declares unknown hook "${hookName}". ` + @@ -114,7 +116,10 @@ export function adaptSandboxEntry( // We store it as the generic type and let HookPipeline's typed dispatch // methods handle the type narrowing at call time. 
// eslint-disable-next-line typescript-eslint/no-unsafe-type-assertion -- bridging untyped map to typed interface - (resolvedHooks as Record)[hookName] = resolveStandardHook(entry, pluginId); + (resolvedHooks as Record)[hookName] = resolveStandardHook( + standardHook, + pluginId, + ); } } @@ -125,11 +130,12 @@ export function adaptSandboxEntry( const resolvedRoutes: Record = {}; if (definition.routes) { for (const [routeName, routeEntry] of Object.entries(definition.routes)) { - const standardHandler = routeEntry.handler; + const standardRoute = routeEntry as StandardRouteEntry; + const standardHandler = standardRoute.handler; resolvedRoutes[routeName] = { // eslint-disable-next-line @typescript-eslint/no-unsafe-type-assertion -- StandardRouteEntry.input is intentionally loosely typed; callers validate at runtime - input: routeEntry.input as PluginRoute["input"], - public: routeEntry.public, + input: standardRoute.input as PluginRoute["input"], + public: standardRoute.public, handler: async (ctx) => { // Build the routeCtx shape that standard handlers expect const routeCtx = { diff --git a/packages/core/src/plugins/marketplace.ts b/packages/core/src/plugins/marketplace.ts index 58b4b93cb..93c67face 100644 --- a/packages/core/src/plugins/marketplace.ts +++ b/packages/core/src/plugins/marketplace.ts @@ -8,7 +8,7 @@ import { createGzipDecoder, unpackTar } from "modern-tar"; -import { pluginManifestSchema } from "./manifest-schema.js"; +import { pluginManifestSchema, type ValidatedPluginManifest } from "./manifest-schema.js"; import type { PluginManifest } from "./types.js"; // ── Module-level regex patterns ─────────────────────────────────── @@ -393,7 +393,7 @@ async function extractBundle(tarballBytes: Uint8Array): Promise { throw new MarketplaceError("Invalid bundle: missing backend.js", undefined, "INVALID_BUNDLE"); } - let manifest: PluginManifest; + let manifest: ValidatedPluginManifest; try { const parsed: unknown = JSON.parse(manifestJson); const result = 
pluginManifestSchema.safeParse(parsed); @@ -406,8 +406,7 @@ async function extractBundle(tarballBytes: Uint8Array): Promise { } // Elements are validated as unknown[] by Zod; cast to PluginManifest // for the Element[] type (Block Kit validation happens at render time). - // eslint-disable-next-line @typescript-eslint/no-unsafe-type-assertion -- Zod types elements as unknown[]; Element type validated at render time - manifest = result.data as unknown as PluginManifest; + manifest = result.data; } catch (err) { if (err instanceof MarketplaceError) throw err; throw new MarketplaceError( @@ -418,8 +417,7 @@ async function extractBundle(tarballBytes: Uint8Array): Promise { } // Compute SHA-256 checksum of the tarball for verification - // eslint-disable-next-line typescript-eslint(no-unsafe-type-assertion) -- Uint8Array is a valid BufferSource at runtime; TS lib mismatch - const hashBuffer = await crypto.subtle.digest("SHA-256", tarballBytes as unknown as BufferSource); + const hashBuffer = await crypto.subtle.digest("SHA-256", tarballBytes); const hashArray = new Uint8Array(hashBuffer); const checksum = Array.from(hashArray, (b) => b.toString(16).padStart(2, "0")).join(""); diff --git a/packages/core/src/plugins/request-meta.ts b/packages/core/src/plugins/request-meta.ts index dca7af271..06de9f81c 100644 --- a/packages/core/src/plugins/request-meta.ts +++ b/packages/core/src/plugins/request-meta.ts @@ -45,7 +45,7 @@ function parseFirstForwardedIp(header: string): string | null { * Returns undefined when not running on Cloudflare Workers. 
*/ function getCfObject(request: Request): CfProperties | undefined { - return (request as unknown as { cf?: CfProperties }).cf; + return (request as { cf?: CfProperties }).cf; } /** diff --git a/packages/core/src/plugins/storage-indexes.ts b/packages/core/src/plugins/storage-indexes.ts index bfb32ea87..404bed0db 100644 --- a/packages/core/src/plugins/storage-indexes.ts +++ b/packages/core/src/plugins/storage-indexes.ts @@ -39,9 +39,8 @@ export function generateIndexName( * Validates all identifiers before interpolation to prevent SQL injection. * Plugin ID and collection values are parameterized in the WHERE clause. */ -// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance export function generateCreateIndexSql( - db: Kysely, + db: Kysely, pluginId: string, collection: string, fields: string[], diff --git a/packages/core/src/plugins/storage-query.ts b/packages/core/src/plugins/storage-query.ts index ccbfc84ff..d2fc19939 100644 --- a/packages/core/src/plugins/storage-query.ts +++ b/packages/core/src/plugins/storage-query.ts @@ -9,6 +9,7 @@ import type { Kysely } from "kysely"; import { jsonExtractExpr } from "../database/dialect-helpers.js"; +import type { Database } from "../database/types.js"; import { validateJsonFieldName } from "../database/validate.js"; import type { WhereClause, WhereValue, RangeFilter, InFilter, StartsWithFilter } from "./types.js"; @@ -113,8 +114,7 @@ export function validateOrderByClause( * Validates the field name before interpolation to prevent SQL injection * via crafted JSON path expressions. 
 */
-// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance
-export function jsonExtract(db: Kysely<any>, field: string): string {
+export function jsonExtract(db: Kysely<Database>, field: string): string {
   validateJsonFieldName(field, "query field name");
   return jsonExtractExpr(db, "data", field);
 }

@@ -122,9 +122,8 @@ export function jsonExtract(db: Kysely<any>, field: string): string {
 /**
  * Build a WHERE clause condition for a single field
  */
-// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance
 export function buildCondition(
-  db: Kysely<any>,
+  db: Kysely<Database>,
   field: string,
   value: WhereValue,
 ): { sql: string; params: unknown[] } {
@@ -191,9 +190,8 @@
 /**
  * Build a complete WHERE clause from a WhereClause object
  */
-// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance
 export function buildWhereClause(
-  db: Kysely<any>,
+  db: Kysely<Database>,
   where: WhereClause,
 ): {
   sql: string;
@@ -221,9 +219,8 @@
 /**
  * Build ORDER BY clause
  */
-// eslint-disable-next-line @typescript-eslint/no-explicit-any -- accepts any Kysely instance
 export function buildOrderByClause(
-  db: Kysely<any>,
+  db: Kysely<Database>,
   orderBy: Record,
 ): string {
   const clauses: string[] = [];
diff --git a/packages/core/src/plugins/types.ts b/packages/core/src/plugins/types.ts
index 76a262d1e..040ad302c 100644
--- a/packages/core/src/plugins/types.ts
+++ b/packages/core/src/plugins/types.ts
@@ -123,6 +123,17 @@ export interface StorageCollection {
   // Basic CRUD
   get(id: string): Promise<T | null>;
   put(id: string, data: T): Promise<void>;
+  /**
+   * Insert only if the document does not exist. Returns `true` when the row is created.
+   * This is an optional capability used for optimistic "claim" workflows.
+   */
+  putIfAbsent?(id: string, data: T): Promise<boolean>;
+  /**
+   * Atomically replace a document only when the current row version matches.
+ * The version token comes from a storage-stable value (currently the row's + * `updated_at` timestamp). + */ + compareAndSwap?(id: string, expectedVersion: string, data: T): Promise; delete(id: string): Promise; exists(id: string): Promise; @@ -1182,8 +1193,10 @@ export interface ResolvedPluginHooks { * Plugin authors annotate their event parameters with specific types for IDE * support. At the type level, we accept any function with compatible arity. */ -// eslint-disable-next-line typescript-eslint/no-explicit-any -- must accept handlers with specific event types -export type StandardHookHandler = (...args: any[]) => Promise; +export type StandardHookHandler< + TEvent = unknown, + TContext extends PluginContext = PluginContext, +> = (event: TEvent, ctx: TContext) => Promise; /** * Standard plugin hook entry -- either a bare handler or a config object. @@ -1202,13 +1215,22 @@ export type StandardHookEntry = /** * Standard plugin route handler -- takes (routeCtx, pluginCtx) like sandbox entries. * The routeCtx contains input and request info; pluginCtx is the full plugin context. - * - * Uses `any` for routeCtx to allow plugins to access properties like - * `routeCtx.request.url` without needing exact type matches across - * trusted (Request object) and sandboxed (plain object) modes. - */ -// eslint-disable-next-line typescript-eslint/no-explicit-any -- see above -export type StandardRouteHandler = (routeCtx: any, ctx: PluginContext) => Promise; + * Route context fields are intentionally narrow so sandbox and trusted handlers can + * share a single signature while remaining explicit in intent. + */ +export type StandardRouteContext = Pick< + RouteContext, + "input" | "request" | "requestMeta" +> & { + // Compatibility fallback for handlers that still expect optional PluginContext-like + // fields in the first argument (legacy standard-route shape). 
+ [K in keyof Partial]?: PluginContext[K]; +}; + +export type StandardRouteHandler = ( + routeCtx: StandardRouteContext, + pluginCtx: PluginContext, +) => Promise; /** * Standard plugin route entry -- either a config object with handler, or just a handler. @@ -1226,15 +1248,13 @@ export interface StandardRouteEntry { * * This is the input to definePlugin() for standard-format plugins. * - * The hooks and routes use permissive types (Record) so that + * The hooks and routes use permissive types (Record) so that * plugin authors can annotate their handlers with specific event types * without type errors from strictFunctionTypes contravariance. */ export interface StandardPluginDefinition { - // eslint-disable-next-line typescript-eslint/no-explicit-any -- must accept handlers with specific event/route types - hooks?: Record; - // eslint-disable-next-line typescript-eslint/no-explicit-any -- must accept handlers with specific event/route types - routes?: Record; + hooks?: Record; + routes?: Record; } /** diff --git a/packages/core/tests/unit/database/unique-constraint.test.ts b/packages/core/tests/unit/database/unique-constraint.test.ts new file mode 100644 index 000000000..6f3cd6095 --- /dev/null +++ b/packages/core/tests/unit/database/unique-constraint.test.ts @@ -0,0 +1,34 @@ +import { describe, expect, it } from "vitest"; + +import { isUniqueConstraintViolation } from "../../../src/database/unique-constraint.js"; + +describe("isUniqueConstraintViolation", () => { + it("returns true for SQLite-style messages", () => { + expect( + isUniqueConstraintViolation(new Error("UNIQUE constraint failed: _plugin_storage.id")), + ).toBe(true); + expect(isUniqueConstraintViolation(new Error("unique constraint failed"))).toBe(true); + }); + + it("returns true for PostgreSQL code 23505", () => { + expect(isUniqueConstraintViolation({ code: "23505", message: "duplicate key" })).toBe(true); + }); + + it("returns true for nested cause with PG code", () => { + const inner = { code: 
"23505" }; + expect(isUniqueConstraintViolation({ cause: inner })).toBe(true); + }); + + it("returns true for Error with cause chain carrying message", () => { + const inner = new Error('duplicate key value violates unique constraint "pk"'); + const outer = new Error("wrap"); + (outer as Error & { cause?: unknown }).cause = inner; + expect(isUniqueConstraintViolation(outer)).toBe(true); + }); + + it("returns false for unrelated errors", () => { + expect(isUniqueConstraintViolation(new Error("connection refused"))).toBe(false); + expect(isUniqueConstraintViolation(null)).toBe(false); + expect(isUniqueConstraintViolation(undefined)).toBe(false); + }); +}); diff --git a/packages/core/tests/unit/plugins/plugin-storage.test.ts b/packages/core/tests/unit/plugins/plugin-storage.test.ts index 8981de6d6..bccd95845 100644 --- a/packages/core/tests/unit/plugins/plugin-storage.test.ts +++ b/packages/core/tests/unit/plugins/plugin-storage.test.ts @@ -85,6 +85,117 @@ describe("PluginStorageRepository", () => { }); }); + describe("putIfAbsent()", () => { + it("should insert a new document and return true", async () => { + const doc: TestDocument = { + title: "Test", + status: "active", + count: 5, + createdAt: "2024-01-01", + }; + + const inserted = await repo.putIfAbsent("doc1", doc); + expect(inserted).toBe(true); + + const result = await repo.get("doc1"); + expect(result).toEqual(doc); + }); + + it("should return false without overwriting an existing document", async () => { + const doc: TestDocument = { + title: "Original", + status: "active", + count: 1, + createdAt: "2024-01-01", + }; + const replacement: TestDocument = { + ...doc, + title: "Replacement", + count: 2, + }; + + await repo.put("doc1", doc); + const inserted = await repo.putIfAbsent("doc1", replacement); + expect(inserted).toBe(false); + + const result = await repo.get("doc1"); + expect(result).toEqual(doc); + }); + }); + + describe("compareAndSwap()", () => { + it("should replace the document when version 
matches", async () => { + const doc: TestDocument = { + title: "Original", + status: "active", + count: 1, + createdAt: "2024-01-01", + }; + const next = { + ...doc, + title: "Replaced", + count: 2, + }; + + await repo.put("doc1", doc); + const { updated_at: version } = await db + .selectFrom("_plugin_storage") + .select("updated_at") + .where("plugin_id", "=", "test-plugin") + .where("collection", "=", "items") + .where("id", "=", "doc1") + .executeTakeFirstOrThrow(); + + const replaced = await repo.compareAndSwap("doc1", version, next); + expect(replaced).toBe(true); + + const result = await repo.get("doc1"); + expect(result).toEqual(next); + }); + + it("should not replace the document when version mismatches", async () => { + const doc: TestDocument = { + title: "Original", + status: "active", + count: 1, + createdAt: "2024-01-01", + }; + const replacement: TestDocument = { + ...doc, + title: "Replacement", + count: 2, + }; + const stale = { + ...doc, + title: "Stale Attempt", + count: 99, + }; + + await repo.put("doc1", doc); + await repo.put("doc1", replacement); + const staleVersion = "1970-01-01T00:00:00.000Z"; + + const replaced = await repo.compareAndSwap("doc1", staleVersion, stale); + expect(replaced).toBe(false); + + const result = await repo.get("doc1"); + expect(result).toEqual(replacement); + }); + + it("should return false for a missing document", async () => { + const next: TestDocument = { + title: "Next", + status: "active", + count: 1, + createdAt: "2024-01-01", + }; + + const swapped = await repo.compareAndSwap("does-not-exist", "1970-01-01T00:00:00.000Z", next); + expect(swapped).toBe(false); + expect(await repo.get("does-not-exist")).toBeNull(); + }); + }); + describe("delete()", () => { it("should return false for non-existent document", async () => { const result = await repo.delete("non-existent"); diff --git a/packages/plugins/atproto/src/index.ts b/packages/plugins/atproto/src/index.ts index 1904a4fa9..3768e17bd 100644 --- 
a/packages/plugins/atproto/src/index.ts +++ b/packages/plugins/atproto/src/index.ts @@ -1,42 +1,87 @@ /** * AT Protocol / standard.site Plugin for EmDash CMS * - * Syndicates published content to the AT Protocol network using the - * standard.site lexicons, with optional cross-posting to Bluesky. + * This package supports both descriptor + native entrypoint usage. * - * Features: - * - Creates site.standard.publication record (one per site) - * - Creates site.standard.document records on publish - * - Optional Bluesky cross-post with link card - * - Automatic injection via page:metadata - * - Sync status tracking in plugin storage + * Descriptor mode: + * - `atprotoPlugin()` returns a `PluginDescriptor` for config. + * - Runtime uses standard format + sandbox/inline adaptation. * - * Designed for sandboxed execution: - * - All HTTP via ctx.http.fetch() - * - Block Kit admin UI (no React components) - * - Capabilities: read:content, network:fetch:any + * Native mode: + * - `createPlugin()` returns a resolved plugin via `definePlugin`. */ -import type { PluginDescriptor } from "emdash"; +import type { PluginDefinition, PluginDescriptor, ResolvedPlugin } from "emdash"; +import { definePlugin } from "emdash"; -// ── Descriptor ────────────────────────────────────────────────── +import sandboxPlugin from "./sandbox-entry.js"; + +const ATPROTO_PLUGIN_ID = "atproto"; +const ATPROTO_PLUGIN_VERSION = "0.1.0"; + +interface AtprotoPluginOptions { + // Placeholder for future options to preserve constructor signature. + [key: string]: unknown; +} /** * Create the AT Protocol plugin descriptor. * Import this in your astro.config.mjs / live.config.ts. 
*/ -export function atprotoPlugin(): PluginDescriptor { +export function atprotoPlugin( + options: AtprotoPluginOptions = {}, +): PluginDescriptor { return { - id: "atproto", - version: "0.1.0", + id: ATPROTO_PLUGIN_ID, + version: ATPROTO_PLUGIN_VERSION, format: "standard", - entrypoint: "@emdash-cms/plugin-atproto/sandbox", + entrypoint: "@emdash-cms/plugin-atproto", + options, capabilities: ["read:content", "network:fetch:any"], storage: { - publications: { indexes: ["contentId", "platform", "publishedAt"] }, + records: { indexes: ["contentId", "status"] }, }, // Block Kit admin pages (no adminEntry needed -- sandboxed) adminPages: [{ path: "/status", label: "AT Protocol", icon: "globe" }], adminWidgets: [{ id: "sync-status", title: "AT Protocol", size: "third" }], }; } + +/** + * Native plugin factory. + * + * Uses the sandbox implementation as the source of hook/route behavior + * and adapts it into a fully resolved plugin. + */ +export function createPlugin(_options: AtprotoPluginOptions = {}): ResolvedPlugin { + const hooks = { + ...(sandboxPlugin.hooks as Record<string, unknown>), + "content:afterSave": { + ...(sandboxPlugin.hooks?.["content:afterSave"] as Record<string, unknown>), + errorPolicy: "continue", + }, + } as Record<string, unknown>; + + return definePlugin({ + id: ATPROTO_PLUGIN_ID, + version: ATPROTO_PLUGIN_VERSION, + capabilities: ["read:content", "network:fetch:any"], + storage: { + records: { indexes: ["contentId", "status"] }, + }, + hooks, + routes: sandboxPlugin.routes as Record<string, unknown>, + admin: { + settingsSchema: { + handle: { type: "string", label: "Handle" }, + appPassword: { type: "secret", label: "App Password" }, + siteUrl: { type: "string", label: "Site URL" }, + enableBskyCrosspost: { type: "boolean", label: "Enable Bluesky crosspost" }, + crosspostTemplate: { type: "string", label: "Crosspost template" }, + langs: { type: "string", label: "Languages" }, + }, + }, + } as PluginDefinition); +} + +export default sandboxPlugin; diff --git a/packages/plugins/commerce/AI-EXTENSIBILITY.md
b/packages/plugins/commerce/AI-EXTENSIBILITY.md new file mode 100644 index 000000000..b8934ac80 --- /dev/null +++ b/packages/plugins/commerce/AI-EXTENSIBILITY.md @@ -0,0 +1,88 @@ +# Commerce plugin — AI, vectors, and MCP readiness + +This document aligns the **stage-1 commerce kernel** with future **LLM**, **vector search**, and **MCP** work. It is the operational companion to `COMMERCE_EXTENSION_SURFACE.md`. + +## Vectors and catalog + +- **Embeddings target catalog**, not transactional commerce storage. Product copy, `shortDescription`, and searchable facets live on **content / catalog documents** (or a future core vector index). +- **Orders and carts** keep **stable `productId` / `variantId`** and numeric snapshots (`unitPriceMinor`, `quantity`, `inventoryVersion`). Do not store duplicate canonical product text on line items for embedding purposes. +- Type-level contract for optional catalog fields: `CommerceCatalogProductSearchFields` in `src/catalog-extensibility.ts`. + +## Checkout and agents + +- **Checkout, webhooks, and finalize** remain **deterministic** and **mutation-authoritative**. Agents must not replace those flows with fuzzy reasoning. +- **Recommendation** and **search** are **read-only** surfaces. The `recommendations` plugin route is currently **disabled** (`strategy: "disabled"`, `reason: "no_recommender_configured"`) until vector search or an external recommender is wired; storefronts should hide the block when `enabled` is false. + +Implementation guardrails: + +- `src/index.ts` route table is the source of truth for shipped HTTP capabilities. +- `COMMERCE_EXTENSION_SURFACE.md` tracks stable extension seams and kernel closure rules. +- `src/catalog-extensibility.ts` defines export-level contracts for third-party providers. +- `commerce-extension-seams` helpers (`createRecommendationsRoute`, + `createPaymentWebhookRoute`, `queryFinalizationState`) are the only MCP-facing + extension surfaces for this stage. 
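The "storefronts should hide the block when `enabled` is false" rule above can be sketched as a small storefront-side guard. This is a sketch only: the payload field names (`enabled`, `strategy`, `reason`, `items`) are assumptions pieced together from the values this document names, not a confirmed route contract.

```typescript
// Assumed payload shape; only "disabled" / "no_recommender_configured"
// are named in the document itself.
interface RecommendationsPayload {
  enabled: boolean;
  strategy: string; // e.g. "disabled" while no recommender is configured
  reason?: string; // e.g. "no_recommender_configured"
  items?: unknown[];
}

// Hide the recommendations block unless the route is enabled,
// a real strategy is active, and items were actually returned.
function shouldRenderRecommendations(payload: RecommendationsPayload): boolean {
  if (!payload.enabled || payload.strategy === "disabled") return false;
  return (payload.items?.length ?? 0) > 0;
}
```

Keeping this check on the storefront keeps the route itself read-only and lets the server flip `strategy` without a frontend deploy.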
+ +## Current hardening status (next-pass gate) + +- This branch ships regression-only updates for 5A (same-event duplicate webhook + finalization convergence), 5B (pending-state contract visibility and non-terminal + resume transitions), 5C (possession checks on order/cart entrypoints), + 5D (scope lock reaffirmation), 5E (deterministic claim lease policy), and + 5F (rollout/docs proof completed for strict lease mode with staged promotion controls). +- Post-5F optional AI roadmap items are tracked in `COMMERCE_AI_ROADMAP.md` and remain + non-blocking to Stage-1 money-path behavior. + Runtime behavior for checkout/finalize/routing remains unchanged. The scope lock for + provider topology (`webhooks/stripe` only) stays in force until strict claim-lease mode + (`COMMERCE_USE_LEASED_FINALIZE=1`) is promoted through the operational checks in the + strategy and regression documentation. + +### Strategy A acceptance guidance (contract hardening only) + +**Strategy A metadata** + +- Last updated: 2026-04-03 +- Owner: emDash Commerce/AI integration owner +- Scope owner: contract hardening only (no AI/MCP command expansion) + +- This stage is intentionally limited to **contract hardening**: keep all payment path runtime semantics unchanged. +- Contract consolidation and shape consistency are owned in `src/services/commerce-provider-contracts.ts` with matching tests in `src/services/commerce-provider-contracts.test.ts`. +- No provider registry routing, provider switching UI, or MCP command surface is introduced yet. +- Runtime gateway path remains `webhooks/stripe` until a second provider is actively enabled. +- Defer broader AI/MCP command expansions until: + - the provider ecosystem reaches a second active payment adapter, and + - a scoped commerce MCP command package is deployed.
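The "canonical default source" pattern Strategy A calls for can be sketched as follows. The constant name and the `"stripe"` default are assumptions; the checklist only requires that one shared contract module own the default and that checkout-path resolution (`resolvePaymentProviderId`) delegate to it.

```typescript
// Assumed canonical default; Stage-1 runs a single provider ("stripe").
const DEFAULT_PAYMENT_PROVIDER_ID = "stripe";

// Missing or blank input falls back to the shared default; explicit
// provider ids pass through unchanged.
function resolvePaymentProviderId(requested?: string | null): string {
  const trimmed = requested?.trim();
  return trimmed ? trimmed : DEFAULT_PAYMENT_PROVIDER_ID;
}
```

Because every caller delegates here, changing the default later is a one-line contract edit rather than a scattered runtime change.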
+ +## Errors and observability + +- Public errors should continue to expose **machine-readable `code`** values (see kernel `COMMERCE_ERROR_WIRE_CODES` and `toCommerceApiError()`). LLMs and MCP tools should branch on `code`, not on free-form `message` text. +- Future `orderEvents`-style logs should record an **`actor`** (`system` | `merchant` | `agent` | `customer`) for audit trails; see `COMMERCE_EXTENSION_SURFACE.md`. +- For this stage, replay diagnostics should consume the enriched `queryFinalizationStatus` + state shape (`receiptStatus` + `resumeState`) rather than inspecting storage manually. + +### Stage-1 limits and Stage-2 roadmap + +This stage intentionally excludes adjustment-event lifecycle automation: + +- one active payment provider (`stripe`) through `webhooks/stripe`; +- no automatic refund/chargeback event replay for inventory restoration; +- no stage-2 “admin finalize transition” command surface; +- storefronts receive read-only finalization visibility only (`queryFinalizationState`). + +Out-of-band stage-2 work should introduce provider-independent event adapter hooks +for credits/adjustments and define an explicit recovery tool path with audit controls. + +## MCP + +- **EmDash MCP** today targets **content** tooling. A dedicated **`@emdash-cms/plugin-commerce-mcp`** package is **planned** (`COMMERCE_EXTENSION_SURFACE.md`) for scoped tools: product read/write, order lookup for customer service (prefer **short-lived tokens** over guessable order ids), refunds, etc. +- MCP tools must respect the same invariants as HTTP routes: **no bypass** of finalize/idempotency rules for payments. +- MCP tools should be read/write-safe by design: reads use `queryFinalizationStatus`/order APIs, writes use service seams that enforce kernel checks.
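The "branch on `code`, not on `message`" rule above can be sketched as a small classifier. Illustrative only: the `{ code, message }` wire shape is an assumption about `toCommerceApiError()` output, and the mapping of codes to actions is a sketch; the code values themselves (`PRODUCT_UNAVAILABLE`, `INSUFFICIENT_STOCK`, `rate_limited`) are taken from this repo's checklists.

```typescript
// Assumed public error shape.
interface CommerceApiError {
  code: string;
  message: string;
}

// Branch on the machine-readable code, never on free-form message text.
function classifyCheckoutFailure(
  err: CommerceApiError,
): "fix_cart" | "retryable" | "escalate" {
  switch (err.code) {
    case "PRODUCT_UNAVAILABLE":
    case "INSUFFICIENT_STOCK":
      return "fix_cart"; // cart contents need customer action
    case "rate_limited":
      return "retryable"; // safe to back off and retry
    default:
      return "escalate"; // unknown codes go to an operator
  }
}
```

An agent or MCP tool built on this contract stays stable even if `message` wording changes between releases.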
+ +## Related files + +| Item | Location | +| ---------------------------------------- | --------------------------------- | +| Disabled recommendations route | `src/handlers/recommendations.ts` | +| Catalog/search field contract | `src/catalog-extensibility.ts` | +| Extension seams and invariants | `COMMERCE_EXTENSION_SURFACE.md` | +| Architecture (MCP tool list, principles) | `COMMERCE_EXTENSION_SURFACE.md` | +| Execution handoff | `HANDOVER.md` | diff --git a/packages/plugins/commerce/CI_REGRESSION_CHECKLIST.md b/packages/plugins/commerce/CI_REGRESSION_CHECKLIST.md new file mode 100644 index 000000000..d59fecbcb --- /dev/null +++ b/packages/plugins/commerce/CI_REGRESSION_CHECKLIST.md @@ -0,0 +1,204 @@ +# Minimal required regression checks for commerce plugin tickets + +Use this as a ticket-ready acceptance gate for follow-on work. + +## Reusable ticket template (copy/paste) + +### Ticket: Strategy A — Provider Contract Hardening + +**Summary** + +- Scope: Strategy A only (contract drift hardening, no topology changes). +- Goal: centralize provider defaults/contracts/adapters without changing runtime behavior. + +**Acceptance checklist** + +- [ ] Scope lock verified (see section 0). +- [ ] T1 canonical provider contract source in place. +- [ ] T2 seam exports consolidated. +- [ ] T3 tests added/updated and passing. +- [ ] T4 regression proof executed. +- [ ] DoD (section 0) complete. + +**Blocking assumptions** + +- Do not include second-provider routing until a second provider is active. +- Do not include MCP command surfaces unless commerce MCP command package is actively scoped. + +## 0) Strategy A (contract hardening, no topology change) — ticket checklist + +### Scope lock (hard stop) + +- [ ] Runtime behavior unchanged (`checkout`, `webhook`, `finalize`, diagnostics flow). +- [ ] No provider routing/registry introduced in this ticket. +- [ ] No MCP command surface added in this ticket. +- [ ] No runtime gateway branching changes. 
+ +### Contract hardening tasks (must complete in order) + +- [ ] **T1 — Canonicalize payment default source** + - [ ] Confirm shared default/payment provider constant is in `src/services/commerce-provider-contracts.ts`. + - [ ] Confirm checkout-path resolution delegates to that shared contract. + - [ ] Confirm webhook adapter input contract type is the shared contract. + +- [ ] **T2 — Consolidate seam exports** + - [ ] Ensure `commerce-extension-seams.ts` re-exports actor constants/types from the shared contract module. + - [ ] Ensure `webhook-handler.ts` references shared adapter contracts for seam entry types. + - [ ] Ensure plugin public exports surface contract symbols for integrations (`index.ts`). + +- [ ] **T3 — Update acceptance tests** + - [ ] `src/services/commerce-provider-contracts.test.ts` + - [ ] `undefined`/blank provider input resolves to default. + - [ ] explicit provider input is preserved. + - [ ] actor map keys/values are stable (`system`, `merchant`, `agent`, `customer`). + - [ ] `src/handlers/checkout-state.test.ts` + - [ ] `resolvePaymentProviderId` behavior remains unchanged for missing/blank ids. + - [ ] `src/handlers/webhook-handler.test.ts` and `src/services/commerce-extension-seams.test.ts` + - [ ] adapter type/wiring contracts remain behavior-compatible. + - [ ] contract refactor does not alter `createPaymentWebhookRoute` semantics. + +- [ ] **T4 — Regression proof** + - [ ] Execute targeted and package-level test passes documented below: + - [ ] `pnpm --filter @emdash-cms/plugin-commerce test services/commerce-provider-contracts.test.ts` + - [ ] `pnpm --filter @emdash-cms/plugin-commerce test` + - [ ] Ensure existing baseline suite count is unchanged and no unrelated tests are newly required to pass.
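The T3 actor-map stability expectation can be sketched without a test runner. The four actor values come from this checklist; exporting them as a readonly tuple plus a type guard is an assumption about the shared contract module's shape, not its confirmed API.

```typescript
// Assumed shape of the shared actor contract.
const COMMERCE_ACTORS = ["system", "merchant", "agent", "customer"] as const;
type CommerceActor = (typeof COMMERCE_ACTORS)[number];

// Narrowing guard so audit-trail writers reject unknown actor strings.
function isKnownActor(value: string): value is CommerceActor {
  return (COMMERCE_ACTORS as readonly string[]).includes(value);
}
```

A `const` tuple makes "keys/values are stable" a compile-time property: adding or renaming an actor is a visible contract change rather than silent drift.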
+ +### Definition of done + +- [ ] Strategy A docs updated with scope/deferral statements in: + - `COMMERCE_DOCS_INDEX.md` + - `COMMERCE_EXTENSION_SURFACE.md` + - `AI-EXTENSIBILITY.md` + - `HANDOVER.md` +- [ ] No production logic change in payment, finalize, webhook ordering, or token/idempotency rules. +- [ ] Changes are additive and isolated to contract layering. +- [ ] Ticket is blocked for broader architecture changes unless one of the hard gates below is true: + - a second payment provider is live, or + - `@emdash-cms/plugin-commerce-mcp` command surface is actively in scope. + +## 1) Finalization diagnostics (queryFinalizationState) + +- Assert rate-limit rejection (`rate_limited`) when `consumeKvRateLimit` denies. +- Assert cache or in-flight coalescing: repeated or concurrent identical keys do not + multiply `orders.get` / storage reads beyond one pass per cache window. + +## 2) Concurrency / replay regression + +- Add/extend a test that replays the same webhook event from two callers with shared + `providerId` + `externalEventId` and asserts: + - Exactly one settlement side-effect is recorded (`order` reaches paid once). + - `queryFinalizationState` transitions to `replay_processed` or `replay_duplicate`. + - No uncontrolled exceptions are emitted for second-flight calls. +- Ensure logs include `commerce.finalize.inventory_reconcile`, `payment_attempt_update_attempt`, + and terminal `commerce.finalize.completed` / replay signal. + +## 3) Inventory preflight regression + +- Add/extend a test where cart inventory is stale/out-of-stock and checkout is rejected + with one of: + - `PRODUCT_UNAVAILABLE` + - `INSUFFICIENT_STOCK` +- Verify preflight happens before order creation and idempotency recording. +- Verify stock/version snapshots (`inventoryVersion`) are checked by finalize before decrement. 
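The inventory-preflight checks above can be sketched as a pure per-line check: finalize compares the cart line's captured `inventoryVersion` against current stock before any decrement. Field names other than `inventoryVersion`, and the mapping of version drift to `PRODUCT_UNAVAILABLE`, are assumptions for illustration.

```typescript
// Assumed row shapes around the documented `inventoryVersion` snapshot.
interface StockRow {
  available: number;
  inventoryVersion: number;
}
interface LineSnapshot {
  quantity: number;
  inventoryVersion: number;
}

function preflightLine(
  stock: StockRow,
  line: LineSnapshot,
): "ok" | "PRODUCT_UNAVAILABLE" | "INSUFFICIENT_STOCK" {
  // Stale snapshot: the catalog moved since the cart captured its view.
  if (stock.inventoryVersion !== line.inventoryVersion) return "PRODUCT_UNAVAILABLE";
  // Not enough units to satisfy the line.
  if (stock.available < line.quantity) return "INSUFFICIENT_STOCK";
  return "ok";
}
```

Running this before order creation and idempotency recording is what keeps a rejected checkout from leaving ledger residue behind.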
+ +## 4) Idempotency edge regression + +- Add/extend a test for each new mutation path that verifies: + - Same logical idempotency key replays return stable response when request payload hash + is unchanged. + - Payload hash drift (header/body mismatch or changed request body) is rejected. + - Duplicate storage writes in an error/retry path do not create duplicate ledger rows. +- Ensure replay states still preserve all required idempotency metadata (`route`, `attemptCount`, + `result`). + +## 5) External-review memo action roadmap (next phase) + +Use this section when continuing from the latest external review memo. Tickets are +narrow, high-signal, and ordered by failure risk. + +### 5A) Concurrency and duplicate delivery safety + +- [ ] Add/extend a race-focused test that drives same-event concurrent `webhooks/stripe` + handlers with identical `providerId` + `externalEventId`. +- [ ] Assert exactly one terminal side-effect set is produced for the event: + - one order-payment success + - one ledger movement set at most +- [ ] Assert follow-up flights return replay-safe statuses (`replay_processed` or + `replay_duplicate`) without duplicate stock/ledger side effects. +- [ ] Preserve diagnostic visibility for replay transitions and finalization completion log points. + +### 5B) Pending-state contract safety + +- [ ] Add/extend tests proving `pending` is a claim marker + resumable state boundary: + - resume from `pending` with missing/late finalize token, + - resume transition when order is already paid, + - nonterminal writes are not forced into `error` unless expected terminal inventory condition is met. +- [ ] Assert each finalize branch keeps `resumeState` and `inventoryState` coherent for operator visibility. 
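The 5A same-event invariant can be sketched as a tiny dedup ledger: the first flight for a (`providerId`, `externalEventId`) pair settles, and every later flight gets a replay-safe status instead of a second side effect. The in-memory `Set` here stands in for the stored webhook receipt rows; the class and status names are illustrative, not the kernel's API.

```typescript
type FlightOutcome = "processed" | "replay_duplicate";

// Stand-in for webhook receipt storage keyed on provider + event id.
class ReceiptDedup {
  private seen = new Set<string>();

  record(providerId: string, externalEventId: string): FlightOutcome {
    const key = `${providerId}:${externalEventId}`;
    if (this.seen.has(key)) return "replay_duplicate"; // no second side effect
    this.seen.add(key);
    return "processed"; // exactly one settlement side-effect set
  }
}
```

The real implementation must make the "check then insert" step atomic in storage (for example via a unique constraint), since two concurrent flights can otherwise both see an empty set.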
+ +### 5C) Ownership/possession boundary hardening + +- [ ] Add/extend tests for possession failures at all relevant entrypoints: + - `cart/get` with wrong/missing owner token, + - `checkout` when cart ownership hash state is inconsistent, + - `checkout/get-order` with missing/wrong finalize token. +- [ ] Assert unauthorized paths keep response shape stable and do not expose token-derived internals. + +### 5D) Roadmap gate before money-path expansion + +- [ ] Re-affirm the "narrow kernel first" guardrail in `HANDOVER.md` and + `COMMERCE_DOCS_INDEX.md` before any new provider runtime expansion. +- [ ] Keep Scope lock active: no provider routing/MCP command surface expansion until a second + provider or active `@emdash-cms/plugin-commerce-mcp` scope request. +- [ ] Keep ticket order: + 1. 5A + 2. 5B + 3. 5C + 4. 5D + +### 5E) Deterministic lease/expiry policy for claim reuse + +- [ ] Document claim lease semantics (`claimOwner`/`claimToken`/`claimVersion`/`claimExpiresAt`) in + `COMMERCE_EXTENSION_SURFACE.md` and `FINALIZATION_REVIEW_AUDIT.md`. +- [ ] Ensure `assertClaimStillActive()` checks lease ownership + lease expiry at every mutable finalize + boundary before performing: + - inventory writes, + - order settlement, + - payment-attempt transition, + - final receipt write. +- [ ] Verify behavior for malformed or missing claim state metadata returns safe replay semantics instead of + partial mutation. +- [ ] Keep race-focused replay tests passing for: + - stale claim reclamation, + - in-flight claim steal, + - stale lease preventing unsafe writes. + +### 5F) Rollout and documentation follow-up + +- [x] Confirm `HANDOVER.md`, `COMMERCE_DOCS_INDEX.md`, and `AI-EXTENSIBILITY.md` reflect finalized 5E status. +- [x] Prepare a staged rollout switch plan (`COMMERCE_USE_LEASED_FINALIZE`) so strict lease enforcement can + be toggled predictably in staged environments. 
+- [x] Run and archive both rollout-mode command families before enabling strict mode broadly: + - [x] Legacy behavior check (flag off): `pnpm --filter @emdash-cms/plugin-commerce test`. + - [x] Strict lease check mode: `COMMERCE_USE_LEASED_FINALIZE=1 pnpm --filter @emdash-cms/plugin-commerce test`. + - [x] Focused smoke on strict finalize regression: + `COMMERCE_USE_LEASED_FINALIZE=1 pnpm --filter @emdash-cms/plugin-commerce test src/orchestration/finalize-payment.test.ts`. + - [x] Proof artifacts are archived in CI artifacts tied to each executed command and test matrix. +- [x] Record proof artifacts for: + - command outputs for both modes, + - `src/orchestration/finalize-payment.test.ts` passing in both modes, + - docs updates in `COMMERCE_DOCS_INDEX.md`, `COMMERCE_EXTENSION_SURFACE.md`, and `FINALIZATION_REVIEW_AUDIT.md`. +- [x] Confirm environment promotion plan for `COMMERCE_USE_LEASED_FINALIZE` is written and that operations approval state is recorded before routing production-like webhook traffic through strict mode. + - [x] Approval evidence is recorded in the strategy/runbook notes. + - [x] Broad webhook traffic remains blocked in this branch until explicit production operations clearance is attached. + +### 6) Optional AI/LLM roadmap backlog (post-MVP) + +- [ ] Treat `COMMERCE_AI_ROADMAP.md` as the source of truth for the next optional 5-item backlog: + - Finalization incident forensics copilot. + - Webhook semantic drift guardrail. + - Reconciliation copilot for paid-but-wrong-stock events. + - Customer-incident communication templates. + - Catalog copy/type QA. +- [ ] Keep all five features advisory/read-only in initial implementation until evidence gates are added. +- [ ] Add execution tickets only after `Scope lock` and `Strategy A` obligations remain fully intact. +- [ ] Ensure every item includes an explicit review mode and explicit operator approval path before any write action. 
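The 5F staged-rollout toggle above can be sketched as a single predicate. Only the env var name `COMMERCE_USE_LEASED_FINALIZE` comes from this checklist; the parsing helper (strict mode only when the value is exactly `"1"`) is an assumption about how the flag is read.

```typescript
// Strict lease enforcement is opt-in: anything other than the exact
// string "1" keeps legacy finalize behavior.
function isStrictLeaseModeEnabled(
  env: Record<string, string | undefined>,
): boolean {
  return env["COMMERCE_USE_LEASED_FINALIZE"] === "1";
}
```

An exact-match check keeps promotion predictable across environments: unset, empty, and `"0"` all behave identically, so a partially rolled-out config cannot half-enable strict mode.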
diff --git a/packages/plugins/commerce/COMMERCE_AI_ROADMAP.md b/packages/plugins/commerce/COMMERCE_AI_ROADMAP.md new file mode 100644 index 000000000..78b9d3f6d --- /dev/null +++ b/packages/plugins/commerce/COMMERCE_AI_ROADMAP.md @@ -0,0 +1,544 @@ +# Commerce plugin — AI/LLM Roadmap (post-MVP, 5 practical features) + +## Why this exists + +The core money path is already stable and deterministic (`cart` → `checkout` → +`webhook` → `finalize`). These features are intentionally scoped as +**post-MVP, nice-to-have enhancements** that add operational leverage and +customer-facing safeguards without replacing the deterministic kernel. + +This roadmap tracks 5 specific ideas, including the two selected features: + +- #8 (customer-facing incident communication) +- #9 (catalog/metadata quality guardrails) + +and the three must-have reliability extensions proposed next: + +- finalization incident forensics copilot +- webhook event semantic drift guardrail +- paid-but-wrong-stock reconciliation copilot + +--- + +## Global design constraints (applies to all 5) + +1. **Kernel-first behavior never changes** + - No mutation path in checkout/finalization is delegated to LLM output. + - LLM artifacts are advisory unless explicitly approved by an operator. + +2. **Deterministic core, observable LLM assist** + - Use existing structured state (`queryFinalizationStatus`, `StoredWebhookReceipt`, + payment attempt rows, order/stock snapshots) as input. + - Keep suggestions side-effect free by default. + +3. **Environment-gated rollout** + - Keep every feature behind explicit feature flags/env toggles initially. + - Start in shadow/dry-run mode and collect evidence before write/enactment. + +4. **Evidence-first** + - Every recommendation should include: + - exact IDs (`orderId`, `externalEventId`, `paymentAttemptId`) + - confidence score + - what changed/what is read-only + - precise rollback/undo path + +5.
**No external dependencies in core path** + - LLM calls happen in separate operator workflows (MCP command, admin endpoint, + cron/scheduled job, or support assistant), not inside webhook finalization handlers. + +--- + +## Priority list (likely to be needed first) + +| Rank | Feature | Category | Why this is near-term likely needed | Primary owner | +| ---- | ------------------------------------------ | ------------------------------- | --------------------------------------------------------------------------------------- | ------------------------------- | +| 1 | Finalization Incident Forensics Copilot | Reliability / Ops | Prevents long manual debugging loops on webhook replay/claim edge cases. | Platform/ops tooling | +| 2 | Webhook Semantic Drift Guardrail | Security / Integrity | Stops semantically unusual events from becoming silent recovery incidents. | Platform security + finance ops | +| 3 | Paid-vs-Wrong-Stock Reconciliation Copilot | Operations / CX trust | Directly protects fulfilled orders and support costs on inventory desync. | Ops + customer support | +| 4 | Customer Incident Communication Copilot | Support / UX / Merchant ops | Improves merchant and customer confidence during delayed/edge-case finalization states. | Growth + support tooling | +| 5 | LLM Catalog Intent QA | Content quality / Merchandising | Improves catalog quality and reduces merchant support on listing confusion. | Merchandising/content | + +--- + +## 1) Finalization Incident Forensics Copilot + +### Problem + +When claims/retries behave unexpectedly (e.g., `claim_in_flight` / `claim_retry_failed` +with mixed side effects), operators currently read logs manually and reconstruct a timeline. + +### Proposed behavior + +- Consume structured finalize telemetry: + - `resumeState`, `receiptStatus`, `isOrderPaid`, `isInventoryApplied` + - `isPaymentAttemptSucceeded`, `isReceiptProcessed` + - error kinds from `receiptErrorCode` / `errorDetails` + - finalize timeline markers from logs. 
+- Produce a short incident report: + - likely root cause class, + - likely next action (`retry`, `inspect`, `escalate`, `no-op`) + - exact proof commands. +- Include a machine-readable playbook step sequence (copy/paste) for operators. + +### Inputs + +- `queryFinalizationStatus` and storage reads from finalize collections. +- Correlation fields: `orderId`, `providerId`, `externalEventId`, `claimToken`. + +### Non-functional constraints + +- Never auto-finalizes in advisory mode. +- Supports replay: running the same query twice should return the same explanation given same input. +- Response includes redaction of sensitive order/customer context. + +### Acceptance criteria + +- Given representative edge-case fixture data, explanation includes one likely cause and one + safe action. +- Includes command snippet proving required proof artifacts. +- Can be run for merchant-facing support queue triage with bounded latency. + +### Proposed rollout + +1. Shadow mode (`/api` assistant returns analysis only, no actions). +2. Add audit logging for every suggestion. +3. Optional one-click follow-up tasks behind auth + permission checks. + +--- + +## 2) Webhook Semantic Drift Guardrail + +### Problem + +Webhook signature verification and schema validation can pass while payload semantics drift +or look inconsistent with internal invariants. + +### Proposed behavior + +- Compare incoming event semantics against order/payment expectations: + - provider metadata coherence (`orderId`, `externalEventId`, finalize binding) + - impossible or suspicious transition markers + - frequency anomalies for same event IDs / provider IDs + - malformed/ambiguous actor/context combinations +- Classify as: + - `ok` + - `warn` (monitor) + - `suspect` (quarantine for manual review) +- Emit a `suspect` advisory event (non-blocking default), then escalate to hard block only + if governance policy enables stricter mode. + +### Inputs + +- Raw event payload + metadata from webhook adapter input. 
+- Current payment/order state + existing receipt rows. + +### Non-functional constraints + +- Must not reject valid events silently in default compatibility mode. +- Policy toggle controls enforcement (observe, warn, block). + +### Acceptance criteria + +- Deterministic flags for known synthetic suspicious patterns. +- No change to existing finalized orders in non-blocking mode. +- When strict mode is enabled, flagged cases become auditable and traceable in logs. + +### Suggested implementation strategy + +- Separate "evidence extractor" and "judge" functions for testability. +- Keep in a read/write-guarded service seam so the kernel can still enforce exact semantics. + +--- + +## 3) Reconciliation Copilot for Paid-but-Wrong-Stock + +### Problem + +Complex partial-write/retry states can still produce merchant-visible mismatch where one +side of stock/payment state progressed and another did not. + +### Proposed behavior + +- Detect candidate mismatch classes by correlating: + - stock movements from `inventoryLedger` + - `inventoryStock` quantity/version + - finalize resume state (`pending_inventory`, `pending_order`, etc.) + - payment attempt outcome + receipt status. +- Produce ranked corrective plan: + - no-op/confirm + - ledger+stock correction + - controlled re-run (single replay) with prerequisites +- For each recommendation, include: + - idempotent SQL-style operations + - expected resulting invariants + - reversibility checklist. + +### Inputs + +- `inventoryStock`, `inventoryLedger`, `orders`, `paymentAttempts`, `webhookReceipts`. + +### Non-functional constraints + +- No direct stock updates by default. +- Recommendations always include audit fingerprint (ticket-ready evidence). +- Actions require explicit operator confirmation and actor tagging. + +### Acceptance criteria + +- For known mismatches, report at least one repair plan with safe guardrails. +- Never suggests blind auto-correct without constraints check. 
+- Supports dry-run mode that proves invariants before commit. + +### Suggested rollout + +- Start as support-tool integration only (view + copy suggestions). +- Promote to workflow assistant command after 2 release cycles with no false positives. + +--- + +## 4) Customer Incident Communication Copilot (#8) + +### Problem + +After delay, replay, or partial finalization visibility, merchants need high-quality, +policy-safe language quickly. + +### Proposed behavior + +- Generate localized message drafts for: + - delayed/under-review payments, + - resumed finalization success, + - escalation-required states. +- Templates use state-safe branching based on `isOrderPaid`, `receiptStatus`, + `resumeState`, and payment method context. +- Output two channels: + - merchant internal summary (support-ready) + - customer-facing tone with policy-safe wording (if configured). + +### Inputs + +- Finalization state + recent event history + resume state. +- Route-level locale and merchant communication style config. + +### Non-functional constraints + +- Must only compose from normalized state symbols (no free-text inference). +- Compliance-safe defaults (no speculative legal or payment claims). +- No automatic outbound communication initially. + +### Acceptance criteria + +- For each edge-case state, generated copy is non-empty and does not contradict kernel status. +- No path can generate a customer message while order/receipt state is inconsistent. + +--- + +## 5) LLM Catalog Intent QA (#9) + +### Problem + +Catalog copy/metadata drift often causes support tickets, poor search results, and poor +conversion; this is hard to police with rule-only checks. + +### Proposed behavior + +- Analyze product/copy against structured constraints: + - price/variant consistency with product type data + - shipping/stock policy conflicts + - obvious mislabels (e.g., "in stock" vs zero stock policy text) + - SEO and description quality signals for downstream search/embedding. 
+- Emit structured QA findings: + - severity + - exact field diffs + - suggested minimal edits. + +### Inputs + +- `shortDescription`, product/variant copy, tags, attributes, and pricing snapshots. + +### Non-functional constraints + +- Must never mutate product data. +- Suggestion output is structured and versioned by model/call timestamp. +- Optional "apply suggestions" flow only with explicit review and version bump. + +### Acceptance criteria + +- In QA report, each finding maps back to a field-level anchor. +- Low false-positive threshold from a small validation set before rollout. +- No edits are committed without explicit approval. + +--- + +## Suggested execution order + +1. Finalization Incident Forensics Copilot +2. Webhook Semantic Drift Guardrail +3. Reconciliation Copilot +4. Customer Incident Communication Copilot +5. LLM Catalog Intent QA + +That order keeps the first three on the same operational reliability spine, with the +customer/merchant enhancements following. + +## Concrete ticket sequence (recommended) + +### Legend + +- Effort: `XS` = 0.5–1 day, `S` = 1–2 days, `M` = 3–5 days, `L` = 1 week+ +- Owner: primary team responsible +- Dependencies: required completion before start + +### Epic A — Finalization Incident forensics + +- `AI-1`: Finalization Incident Forensics Copilot core (Owner: Platform/ops tooling; Effort: M) + - Build advisory analyzer that summarizes claim/retry failures and maps to safe next action. + - Inputs: `queryFinalizationStatus`, webhook receipt rows, payment/order rows. + - DoD: deterministic incident output, command snippets included, replay-safe and side-effect free. +- `AI-1a`: Forensics schema + policy switches (Owner: Platform core; Effort: XS) + - Add typed artifact schema + strict mode/env toggles. +- `AI-1b`: Forensics delivery endpoint/command (Owner: Platform/ops tooling; Effort: S) + - Add structured API/command output for support dashboards. 
+ - DoD: same input always returns same output + redaction rules in place. +- `AI-1c`: Playbook mapping (Owner: Support enablement; Effort: S) + - Attach existing runbook steps by root cause class. + +### Epic B — Webhook semantic drift guardrail + +- `AI-2`: Webhook Semantic Drift Guardrail (Owner: Platform security + finance ops; Effort: M) + - Add advisory drift classifier (`ok` / `warn` / `suspect`) for event-to-state inconsistencies. +- `AI-2a`: Evidence extractor (Owner: Platform security; Effort: S) + - Build deterministic extraction from raw webhook payload + receipt state. +- `AI-2b`: Rule set + scoring (Owner: Platform security; Effort: M) + - Add conflict checks and suspicious-pattern scoring with tests. +- `AI-2c`: Policy routing (Owner: Finance ops; Effort: M) + - Route to observe/warn/block with explicit audit records. + +### Epic C — Reconciliation copilot + +- `AI-3`: Paid-vs-stock reconciliation copilot (Owner: Ops + support; Effort: M) + - Correlate inventory ledger/stock and finalize resume states to rank candidate repairs. +- `AI-3a`: Reconciliation classifier (Owner: Ops; Effort: M) + - Detect at least five mismatch classes deterministically. +- `AI-3b`: Safe repair plan builder (Owner: Ops tooling; Effort: M) + - Provide dry-run plan with invariants and rollback notes. +- `AI-3c`: Operator approval surface (Owner: Ops tooling; Effort: S) + - Add explicit confirmation/actor tagging before any mutable action. + +### Epic D — Customer incident communication + +- `AI-4`: Customer Incident Communication Copilot (#8) (Owner: Growth + support tooling; Effort: S) + - Generate state-safe incident messaging for merchant and customer channels. +- `AI-4a`: State-to-copy matrix (Owner: Support tooling; Effort: S) + - Map each resume/error state to approved template language. +- `AI-4b`: Safety gating (Owner: Product + Growth; Effort: XS) + - Enforce no autopush messaging and policy-safe language constraints. 
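To make the Epic B deliverables concrete, here is a minimal sketch of a deterministic drift verdict (`AI-2b`) routed through an `observe`/`warn`/`block` policy (`AI-2c`). All names and rules below are illustrative assumptions, not the shipped implementation:

```typescript
// Deterministic, rule-based drift classification: identical evidence
// always yields the same verdict, so outputs are replay-stable.
type DriftVerdict = "ok" | "warn" | "suspect";
type PolicyMode = "observe" | "warn" | "block";

interface DriftEvidence {
  orderIdMatchesMetadata: boolean; // payload metadata vs stored order id
  amountMatchesAttempt: boolean;   // event amount vs payment attempt amount
  seenBefore: boolean;             // replay of an already-processed event id
}

function classifyDrift(e: DriftEvidence): DriftVerdict {
  if (!e.orderIdMatchesMetadata) return "suspect"; // conflicting identifiers
  if (!e.amountMatchesAttempt) return "warn";      // needs review, not a hard conflict
  if (e.seenBefore) return "warn";                 // replay anomaly; idempotent path
  return "ok";
}

// Policy routing: `observe` never changes runtime behavior, and `block`
// hard-stops only the configured suspicious class.
function routeDrift(verdict: DriftVerdict, mode: PolicyMode): "pass" | "flag" | "stop" {
  if (mode === "observe") return "pass";
  if (mode === "warn") return verdict === "ok" ? "pass" : "flag";
  return verdict === "suspect" ? "stop" : verdict === "warn" ? "flag" : "pass";
}
```

Keeping the classifier pure makes the `ok`/`warn`/`suspect` test matrix trivially repeatable, which is the acceptance bar for `AI-2b`.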
+ +### Epic E — Catalog/metadata quality QA + +- `AI-5`: LLM Catalog Intent QA (#9) (Owner: Merchandising/content; Effort: M) + - Build advisory QA findings for copy/type consistency and metadata contradictions. +- `AI-5a`: Rule pack + scoring (Owner: Merchandising/content; Effort: M) + - Add structured finding schema with severity and field anchors. +- `AI-5b`: Suggestion review workflow (Owner: Content ops; Effort: M) + - Add reviewed "apply suggestion" path with explicit confirmation. +- `AI-5c`: Quality gates (Owner: QA; Effort: S) + - Add validation corpus and false-positive threshold before rollout. + +### Suggested release order + +1. `AI-1` + `AI-2` (observability and safety foundation) +2. `AI-3` (direct support-time operations value) +3. `AI-4` (support and merchant communication) +4. `AI-5` (quality pass, non-critical dependency-safe) + +### Exit criteria for this roadmap band + +- All advisory outputs are deterministic and idempotent for identical inputs. +- No ticket in this band directly mutates checkout/finalize core state. +- Any future write path requires explicit operator approval and evidence bundle. +- Rollout starts in observe mode; strict/auto paths enabled only after sign-off. + +--- + +## PR-ready ticket stubs + +Use this section to seed execution tickets directly. + +- Ticket: `AI-1` — `feat(commerce): add finalize incident forensics analyzer` + - **User story**: As a support engineer, I need an advisory incident analysis so replay/claim edge cases are recoverable faster. + - **Scope** + - Build deterministic analysis from finalize telemetry, receipt state, payment/order rows, and recent event history. + - Return root cause class + safe next action (`retry`, `inspect`, `escalate`, `no-op`) + evidence references. + - **Acceptance** + - Deterministic output for identical input. + - Includes `orderId`, `externalEventId`, `correlationId`, `recommendation`, `commandHint`. + - No mutation in this ticket. 
+ - **Dependencies**: none + +- Ticket: `AI-1a` — `feat(commerce): add forensics schema and policy switches` + - **User story**: As platform owner, I need typed policy controls so analysis mode can be governed safely. + - **Scope** + - Add typed output schema + mode config (`observe`/`warn`/`manual`) and safe defaults. + - Add config docs and validation. + - **Acceptance** + - Unknown mode defaults to safe behavior (`observe`). + - Tests cover mode validation. + - **Dependencies**: `AI-1` + +- Ticket: `AI-1b` — `feat(commerce): expose finalize-forensics read endpoint` + - **User story**: As an operator, I want a read surface for one-click incident analysis. + - **Scope** + - Add read-only endpoint/command returning one analysis artifact per order/event. + - **Acceptance** + - Deterministic output + redaction for sensitive fields. + - Correct handling for missing receipts/events. + - **Dependencies**: `AI-1a` + +- Ticket: `AI-1c` — `feat(commerce): bind forensics to support playbooks` + - **User story**: As support, I need direct linkage from analysis results to remediation steps. + - **Scope** + - Map analysis classes to playbook actions and escalate paths. + - **Acceptance** + - Every emitted class maps to either concrete playbook or explicit escalation. + - **Dependencies**: `AI-1b` + +- Ticket: `AI-2` — `feat(commerce): add webhook semantic drift guardrail` + - **User story**: As finance/security, I need early alerting on suspicious event-to-state mismatch. + - **Scope** + - Add advisory drift classifier for webhook + state inconsistencies (`ok`/`warn`/`suspect`). + - **Acceptance** + - Known suspicious synthetic patterns deterministically classified. + - No behavior change in `observe` mode. + - **Dependencies**: `AI-1` + +- Ticket: `AI-2a` — `feat(commerce): extract webhook drift evidence` + - **User story**: As security reviewer, I need normalized drift evidence for reliable scoring. 
+ - **Scope** + - Build typed evidence extractor from raw webhook payload, order state, and receipts. + - **Acceptance** + - Explicit evidence representation for malformed metadata, conflicting identifiers, replay anomalies. + - **Dependencies**: `AI-2` + +- Ticket: `AI-2b` — `feat(commerce): add drift scoring and rule matrix` + - **User story**: As maintainer, I need consistent scoring for suspicious events. + - **Scope** + - Add rule-based scorer with confidence values and deterministic outputs. + - **Acceptance** + - `ok`/`warn`/`suspect` test matrix passes repeatably. + - Score is replay-stable for identical input. + - **Dependencies**: `AI-2a` + +- Ticket: `AI-2c` — `feat(commerce): route drift signals by policy` + - **User story**: As operations, I need policy-based action on suspicious signals. + - **Scope** + - Implement `observe`/`warn`/`block` policy switch with explicit audit records. + - **Acceptance** + - `observe`: no runtime mutation. + - `warn`: advisory flag + log. + - `block`: explicit hard-stop behavior only for configured suspicious classes. + - **Dependencies**: `AI-2b` + +- Ticket: `AI-3` — `feat(commerce): build paid-vs-stock reconciliation analyzer` + - **User story**: As support, I need ranked reconciliation candidates for paid-but-wrong-stock incidents. + - **Scope** + - Correlate `inventoryLedger`, `inventoryStock`, receipt, and payment attempt state. + - Produce ranked candidate mismatch classes. + - **Acceptance** + - Detect at least five deterministic mismatch classes. + - Advisory output only for initial rollout. + - **Dependencies**: `AI-1`, `AI-2` + +- Ticket: `AI-3a` — `feat(commerce): add reconciliation class classifier` + - **User story**: As operator, I need confidence-labeled mismatch reasons with standardized names. + - **Scope** + - Add deterministic classifier and evidence output for candidate classes. + - **Acceptance** + - Fixture coverage for successful/resumption/error-recovery paths. 
+ - **Dependencies**: `AI-3` + +- Ticket: `AI-3b` — `feat(commerce): add dry-run repair plan builder` + - **User story**: As operator, I want dry-run-safe repair plans before taking action. + - **Scope** + - Generate repair instructions with invariant checks and rollback notes. + - **Acceptance** + - Plans include preconditions + expected target state. + - **Dependencies**: `AI-3a` + +- Ticket: `AI-3c` — `feat(commerce): require explicit approval for reconciliation actions` + - **User story**: As security owner, I need human approval for any stock/order write. + - **Scope** + - Add explicit confirmation gating and actor tagging for each write action. + - **Acceptance** + - No mutable action without confirmation. + - **Dependencies**: `AI-3b` + +- Ticket: `AI-4` — `feat(commerce): add customer incident communication copilot` + - **User story**: As support, I want state-safe draft messaging to reduce manual support lag. + - **Scope** + - Add state-safe template output for internal + optional customer channels. + - **Acceptance** + - Coverage for delayed/recovering/escalation states. + - No contradiction with kernel state. + - **Dependencies**: `AI-1` + +- Ticket: `AI-4a` — `feat(commerce): map finalize states to communication templates` + - **User story**: As support enablement, I need explicit copy by state. + - **Scope** + - Build state->template matrix for `resumeState`, `receiptStatus`, error classes. + - **Acceptance** + - Complete matrix for all incident-facing states. + - **Dependencies**: `AI-4` + +- Ticket: `AI-4b` — `feat(commerce): add messaging safety gates` + - **User story**: As compliance owner, I need strict limits on draft messaging. + - **Scope** + - Redaction, no-auto-send default, locale-safe placeholder strategy. + - **Acceptance** + - Customer-facing output requires explicit allowlisted mode. 
+ - **Dependencies**: `AI-4a` + +- Ticket: `AI-5` — `feat(commerce): add catalog intent QA analyzer` + - **User story**: As merchandiser, I want advisory catalog consistency findings. + - **Scope** + - Add advisory checks for copy/type/metadata alignment and stock/policy mismatches. + - **Acceptance** + - Findings include severity and field-level anchors. + - No mutation in initial release. + - **Dependencies**: `AI-1` + +- Ticket: `AI-5a` — `feat(commerce): add catalog QA rule pack` + - **User story**: As content lead, I need structured QA rules for reliable recommendations. + - **Scope** + - Add deterministic rule set with confidence and anchor mapping. + - **Acceptance** + - Rule suite returns stable outputs for same product snapshot. + - **Dependencies**: `AI-5` + +- Ticket: `AI-5b` — `feat(commerce): build reviewed suggestion application` + - **User story**: As editor, I need explicit review for catalog recommendations before apply. + - **Scope** + - Add approval flow and version increment on apply. + - **Acceptance** + - No edits without explicit operator confirmation and audit trail. + - **Dependencies**: `AI-5a` + +- Ticket: `AI-5c` — `feat(commerce): add catalog QA false-positive control` + - **User story**: As QA, I need noise controls before enabling this surface. + - **Scope** + - Add validation corpus and release threshold checks. + - **Acceptance** + - Rollout blocked automatically if false-positive threshold is exceeded. + - **Dependencies**: `AI-5b` + +## Dependencies and readiness gates + +- Feature-safe foundation: `queryFinalizationStatus` and finalize resume-state telemetry + remain authoritative. +- Delivery sequence should include: + - structured output schemas + - audit logs + - dry-run evidence bundles + - operator approval and rollback behavior. + +No core checkout/finalize semantics should be changed for any of these 5 features. 
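As a sketch of the "deterministic and idempotent advisory outputs" exit criterion: a forensics-style analyzer can be written as a pure function of its evidence, so identical inputs always produce identical artifacts. The input/output shapes here are hypothetical stand-ins, not the real plugin types:

```typescript
// Advisory-only analyzer sketch: no side effects, no storage writes,
// deterministic artifact for identical evidence.
type Recommendation = "retry" | "inspect" | "escalate" | "no-op";

interface ForensicsInput {
  receiptStatus: "missing" | "pending" | "processed" | "error" | "duplicate";
  isOrderPaid: boolean;
}

interface AdvisoryArtifact {
  recommendation: Recommendation;
  commandHint: string;
}

function analyzeFinalization(input: ForensicsInput): AdvisoryArtifact {
  if (input.receiptStatus === "pending") {
    return { recommendation: "retry", commandHint: "re-invoke finalize for the same event id" };
  }
  if (input.receiptStatus === "error") {
    return { recommendation: "escalate", commandHint: "manual triage per runbook before retrying" };
  }
  if (input.receiptStatus === "missing" && input.isOrderPaid) {
    return { recommendation: "inspect", commandHint: "check event routing; order is already paid" };
  }
  return { recommendation: "no-op", commandHint: "state is consistent; nothing to do" };
}
```

Because the artifact is a pure projection of its input, idempotency falls out for free: re-running the analyzer on the same incident cannot drift.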
diff --git a/packages/plugins/commerce/COMMERCE_DOCS_INDEX.md b/packages/plugins/commerce/COMMERCE_DOCS_INDEX.md new file mode 100644 index 000000000..78fa371ef --- /dev/null +++ b/packages/plugins/commerce/COMMERCE_DOCS_INDEX.md @@ -0,0 +1,111 @@ +# Commerce plugin documentation index + +## Operations and support + +For a quick reviewer entrypoint: `@THIRD_PARTY_REVIEW_PACKAGE.md` → `external_review.md` → `SHARE_WITH_REVIEWER.md`. + +### Pre-merge release gates (review-response work) + +- `pnpm --silent lint:quick` +- `pnpm typecheck` +- `pnpm --filter @emdash-cms/plugin-commerce test` (or `pnpm test` from `packages/plugins/commerce`) +- `HANDOVER.md` + external feedback checklist updated for each completed item +- Capture commit hash + summary before handoff + +- [Paid order but stock is wrong (technical)](./PAID_BUT_WRONG_STOCK_RUNBOOK.md) +- [Paid order but stock is wrong (support playbook)](./PAID_BUT_WRONG_STOCK_RUNBOOK_SUPPORT.md) + +## Architecture and implementation + +- `AI-EXTENSIBILITY.md` — future vector/LLM/MCP design notes +- `COMMERCE_AI_ROADMAP.md` — post-MVP LLM/AI feature roadmap (5 scoped items) +- `HANDOVER.md` — current execution handoff and stage context +- `COMMERCE_EXTENSION_SURFACE.md` — architecture contracts and extension rules +- `FINALIZATION_REVIEW_AUDIT.md` — pending receipt state transitions and replay safety audit +- `CI_REGRESSION_CHECKLIST.md` — regression gates for follow-on tickets + +### Strategy A (Contract Drift Hardening) status + +**Strategy A metadata** + +- Last updated: 2026-04-03 +- Owner: emDash Commerce plugin lead (handoff-ready docs update) +- Current phase owner: Strategy A follow-up only +- Status in this branch: 5A (same-event duplicate-flight concurrency assertions), 5B (pending-state resume-state visibility and non-terminal branch behavior), 5C (possession boundary assertions), 5D (scope lock reaffirmed), 5E (deterministic claim lease/expiry policy), and 5F (strict claim-lease proof artifacts captured). 
+ +- Scope: **active for this iteration only** and **testable without new provider runtime**. +- Goal: keep `checkout`/`webhook` behavior unchanged while reducing contract drift across payment adapters. +- Constraint: no broader provider runtime refactor yet. +- Activation guardrail: defer provider- and MCP-command architecture work until either: + - a second payment provider is actively onboarded, or + - an `@emdash-cms/plugin-commerce-mcp` command surface is shipped. +- Relevant files: + - `src/services/commerce-provider-contracts.ts` + - `src/services/commerce-provider-contracts.test.ts` + +## Plugin code references + +- `package.json` — package scripts and dependencies +- `tsconfig.json` — TypeScript config +- `src/services/` and `src/orchestration/` — extension seams and finalize logic +- `src/handlers/` — route handlers (cart, checkout, webhooks, catalog) +- `src/orchestration/` — finalize orchestration and inventory/attempt updates +- `src/catalog-extensibility.ts` — kernel rules + extension seam contracts +- `src/storage.ts` — storage collection declarations (products/skus added for v1 catalog) +- `src/types.ts` and `src/schemas.ts` — catalog domain models and validation + +### Ticket starter: Strategy A (contract hardening) + +Use this when opening follow-up work: + +1. Set scope to Strategy A only (contract drift hardening, no topology change). +2. Execute the Strategy A checklist in `CI_REGRESSION_CHECKLIST.md` sections 0–5, with optional 5F follow-through. +3. Confirm docs updates are in scope: + - `COMMERCE_DOCS_INDEX.md` + - `COMMERCE_EXTENSION_SURFACE.md` + - `AI-EXTENSIBILITY.md` + - `HANDOVER.md` + - `FINALIZATION_REVIEW_AUDIT.md` +4. Run proof commands: + - `pnpm --filter @emdash-cms/plugin-commerce test services/commerce-provider-contracts.test.ts` + - `pnpm --filter @emdash-cms/plugin-commerce test` +5. 
Proof artifacts for strict lease rollout: + +- `COMMERCE_USE_LEASED_FINALIZE` is retained for replay parity and evidence reruns when needed; strict claim-lease checks are otherwise canonical. +- Runbooks and proof outputs are now captured directly in this repo’s regression log trail. + +## External review continuation roadmap + +After the latest third-party memo, continue systematically with +`CI_REGRESSION_CHECKLIST.md` sections 5A–5F (in order) before broadening +provider topology. +5A/5B/5C/5D/5E/5F have been implemented in this branch. +Strict lease behavior is now canonical and evidence is maintained in current strategy and regression docs. + +For post-5F planning, follow `COMMERCE_AI_ROADMAP.md` as the optional +reliability-support-catalog extension backlog. + +## Plugin HTTP routes + +| Route | Role | +| ------------------------ | ------------------------------------------------------------------------------------------------ | +| `cart/upsert` | Create or update a `StoredCart`; issues `ownerToken` on first creation | +| `cart/get` | Read-only cart snapshot; `ownerToken` when cart has `ownerTokenHash` | +| `checkout` | Create `payment_pending` order + attempt; idempotency; `ownerToken` if cart has `ownerTokenHash` | +| `checkout/get-order` | Read-only order snapshot; always requires matching `finalizeToken` | +| `webhooks/stripe` | Verify signature → finalize | +| `recommendations` | Disabled contract for UIs | +| `catalog/product/create` | Validate and create a product row | +| `catalog/product/get` | Retrieve one product by id | +| `catalog/products` | List products by type/status/visibility | +| `catalog/sku/create` | Create a SKU for an existing product | +| `catalog/sku/list` | List SKUs for one product | + +## Diagnostics and runbook surfaces + +- `queryFinalizationState` (via `src/services/commerce-extension-seams.ts`) for runbook and MCP reads — applies per-IP rate limit, ~10s KV cache, and in-isolate in-flight coalescing (see `COMMERCE_LIMITS` / 
`finalization-diagnostics-readthrough.ts`).
+- `queryFinalizationStatus` (via `src/orchestration/finalize-payment.ts`) returns the same shape but **without** those guards; prefer `queryFinalizationState` for HTTP/MCP polling unless you are in a controlled test or internal path.
+
+All routes mount under `/_emdash/api/plugins/emdash-commerce/`.
+
+Implementation note: `src/index.ts` is the active source of truth for what the plugin exposes over HTTP today.
diff --git a/packages/plugins/commerce/COMMERCE_EXTENSION_SURFACE.md b/packages/plugins/commerce/COMMERCE_EXTENSION_SURFACE.md
new file mode 100644
index 000000000..a3fce0066
--- /dev/null
+++ b/packages/plugins/commerce/COMMERCE_EXTENSION_SURFACE.md
@@ -0,0 +1,203 @@
+# Commerce kernel rules and extension surface
+
+## Source of truth
+
+- Runtime route surface: `src/index.ts` routes (authoritative for what is currently exposed).
+- Platform status: `HANDOVER.md`.
+- Stability contracts: this file and `src/catalog-extensibility.ts` for extension types.
+
+## Closed-kernel rules
+
+The money path is intentionally closed:
+
+- `checkout` must remain the only place that creates order/payment attempt state in this stage.
+- `webhooks/stripe` must be the only route that transitions payment state in production.
+- `finalizePaymentFromWebhook` is the sole internal mutation entry for payment-success
+  and inventory-write side effects.
+- `queryFinalizationStatus` / `queryFinalizationState` are read-only observability views.
+- Order/token authorization and idempotency checks must remain unchanged unless a proven
+  bug justifies a narrow patch and regression test.
+
+These rules are captured in `COMMERCE_KERNEL_RULES` in `src/catalog-extensibility.ts`.
+
+## Approved extension seams
+
+### Recommendation seam (read-only)
+
+- `recommendations` route accepts an optional `CommerceRecommendationResolver`.
+- Resolver contracts are defined in `CommerceRecommendationInput` / `CommerceRecommendationResult`.
+- Resolver implementations must only return candidate `productIds` and must not mutate + commerce collections. +- `createRecommendationsRoute()` exports a route constructor for this seam. + +### Webhook adapter seam (provider integration) + +- `CommerceWebhookAdapter` and `handlePaymentWebhook` in + `src/handlers/webhook-handler.ts` define the only supported adapter seam for + third-party gateway integrations. +- Providers are responsible for request verification and input extraction. +- Core writes still happen in the shared finalize orchestration. +- `createPaymentWebhookRoute()` wraps an adapter into a route-level entry point. + +#### Webhook adapter contract requirements + +- verify authenticity and freshness before returning finalize inputs, +- return a stable `correlationId`, +- return a rate-limit suffix suitable for request burst protection. + +Adapters MUST NOT perform commerce writes (`orders`, `paymentAttempts`, +`webhookReceipts`, `inventoryLedger`, `inventoryStock`). All mutation decisions +must pass through `finalizePaymentFromWebhook`. + +#### Strategy A (contract drift hardening): active scope only + +**Strategy A metadata** + +- Last updated: 2026-04-03 +- Owner: emDash Commerce platform/core team +- Scope owner: contract layer only, no behavior change + +- Keep all checkout/webhook runtime behavior unchanged. +- Consolidate provider defaults, adapter shape, and MCP actor constants in a shared contract module (`src/services/commerce-provider-contracts.ts`). +- Do not introduce provider registry/routing multiplexing yet. +- Do not introduce an MCP command surface yet. +- Leave runtime gateway behavior on `webhooks/stripe` until a second provider is enabled. 
+- Hardening checkpoint in this branch: added regression assertions for same-event duplicate + webhook finalization convergence (5A), pending-state resume-status visibility (5B), + possession-guard coverage (5C), and deterministic claim lease/expiry behavior (5E) + with active ownership revalidation on all critical finalize-write stages. +- 5F strict lease proof artifacts were specified and validated in docs+tests. +- Optional post-5F operational/AI work is tracked in `COMMERCE_AI_ROADMAP.md` and remains + advisory until explicitly staged. +- Continue to enforce read-only rules for diagnostics via `queryFinalizationState`. + +### Canonical claim lease enforcement + +- Strict claim lease checks (ownership revalidation and malformed-lease replay behavior) are the active finalize path. +- `COMMERCE_USE_LEASED_FINALIZE` is retained only for temporary parity checks and + for re-running command families during verification when needed. +- `COMMERCE_USE_LEASED_FINALIZE` does **not** represent an alternative runtime mode in this branch; strict lease behavior remains canonical and should stay in production. +- Historical rollout steps and rollback criteria are retained for context in current + operational runbooks, but operational controls should treat strict behavior as baseline. + +### Read-only MCP service seam + +- `queryFinalizationState()` exposes a read-only status query path for MCP tooling. +- MCP tools should call this helper (or package HTTP route equivalents) rather than + touching storage collections directly. 
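A read-only MCP tool built on this seam might do nothing more than format the returned view. The types below are simplified stand-ins for the real seam contract, shown only to illustrate that no storage handle ever reaches the tool layer:

```typescript
// Hypothetical MCP tool shape: consumes the seam's read model and formats
// it for display. It receives plain data, never a storage client, so it
// cannot issue commerce writes even by accident.
interface FinalizationStateView {
  isOrderPaid: boolean;
  receiptStatus: "missing" | "pending" | "processed" | "error" | "duplicate";
}

function summarizeForSupport(orderId: string, state: FinalizationStateView): string {
  // Formatting only: any mutation decision stays inside kernel services.
  return `order ${orderId}: paid=${state.isOrderPaid}, receipt=${state.receiptStatus}`;
}
```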
+
+`queryFinalizationState` returns:
+
+- `isInventoryApplied`
+- `isOrderPaid`
+- `isPaymentAttemptSucceeded`
+- `isReceiptProcessed`
+- `receiptStatus` (`missing|pending|processed|error|duplicate`)
+- `resumeState` (`not_started`, `pending_inventory`, `pending_order`,
+  `pending_attempt`, `pending_receipt`, `replay_processed`,
+  `replay_duplicate`, `error`, `event_unknown`)
+
+### Read-only validator and optional finalize-time invariants
+
+Operators can combine:
+
+- `queryFinalizationState` read model (order/receipt/attempt/ledger state), and
+- read-only inventory/stock checks during incident review.
+
+For deeper drift detection, set `COMMERCE_ENABLE_FINALIZE_INVARIANT_CHECKS=1` so
+completed finalize calls also log warning-level invariant signals when order paid state,
+attempt success, and ledger/stock application are unexpectedly out of sync.
+This flag should be used as a temporary safety net during incident response only,
+not as part of normal fast-path processing.
+
+### Paid-vs-receipt semantics for storefront and support tooling
+
+`isOrderPaid` is the order-facing signal. It should drive user-visible “payment
+completed” messaging.
+
+`receiptStatus` is the event-facing signal. It should drive retry/recovery visibility:
+
+- `missing`: there is no event receipt row yet.
+- `pending`: event is in partial-finalization recovery and can be retried through safe re-invocation.
+- `processed`: event has been handled once; duplicates should be treated as idempotent replay.
+- `error`: explicit finalization failure; manual triage before more retries.
+- `duplicate`: duplicate event replay path after idempotent precondition short-circuit.
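The paid-vs-receipt separation above can be sketched as two independent helpers. This is an illustrative encoding, not code shipped by the plugin:

```typescript
// Storefront copy keys off order state only; receipt state stays an
// operator-facing signal. Event-level retries never leak into
// customer-visible messaging.
type ReceiptStatus = "missing" | "pending" | "processed" | "error" | "duplicate";

function storefrontPaymentMessage(isOrderPaid: boolean): string {
  // Deliberately ignorant of receiptStatus.
  return isOrderPaid ? "Payment completed." : "Payment is being processed.";
}

function supportDetail(status: ReceiptStatus): string {
  switch (status) {
    case "pending": return "partial finalization; safe to re-invoke for the same event";
    case "error": return "terminal failure; manual triage before retrying";
    case "duplicate": return "idempotent replay; no action needed";
    case "processed": return "finalized once; duplicates replay safely";
    default: return "no receipt row recorded yet";
  }
}
```

Keeping the two helpers independent is what lets an order read as paid to the customer while the event is still in operator-visible recovery.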
+ +Optional storefront-safe fields to show in support dashboards: + +- `isReceiptProcessed` (boolean) +- `isPaymentAttemptSucceeded` (boolean) +- `resumeState` (action hint for support runbooks) +- `receiptErrorCode` when `receiptStatus === "error"` (operation-classified terminal error) + +For Stage-1, `receiptStatus === "error"` is intentionally treated as a runbook-only recovery +signal (no built-in admin transition API yet). Recovery tooling should require an explicit +human operator decision using `receiptErrorCode` and related checkpoints. + +This keeps storefront user messaging tied to order state while preserving webhook +forensics for operators. + +**Option B (moderate polling):** this helper applies a per-client-IP KV rate limit +(`COMMERCE_LIMITS.defaultFinalizationDiagnosticsPerIpPerWindow` per +`defaultRateWindowMs`), a short KV read-through cache +(`finalizationDiagnosticsCacheTtlMs`, default 10s), and in-isolate in-flight +coalescing for identical `(orderId, providerId, externalEventId)` keys. Direct +`queryFinalizationStatus` calls bypass these guards and are intended for tests +or tightly controlled internal use only. + +### How to tune Option B (when call volume grows) + +Use this as a practical playbook before scaling to precomputed status projections: + +- **Support/Admin polling (low frequency):** + - Keep defaults. + - Cache TTL: `10_000ms`. + - IP diagnostics limit: `60 / 60s`. +- **Team dashboard with moderate polling:** + - Raise `defaultFinalizationDiagnosticsPerIpPerWindow` in controlled increments (e.g. `120` or `180`) if rate-limit rejections appear in healthy workflows. + - Keep cache at `10_000ms` first; increase only if read spikes remain after rate-limit tuning. +- **Agent-driven batch checks (multiple operators/tools):** + - Increase cache TTL gradually (`15_000`–`30_000ms`) to flatten read spikes. + - Prefer caller-side jitter/backoff over unlimited polling loops. 
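Caller-side jitter/backoff can be as simple as a bounded exponential delay. This sketch is assumed client-side tooling code, not part of the plugin, and the default numbers are illustrative:

```typescript
// Exponential backoff with bounded jitter for diagnostics polling,
// so many operators watching the same order do not synchronize their
// requests into read spikes.
function nextPollDelayMs(
  attempt: number,
  baseMs = 10_000,
  capMs = 60_000,
  jitter: () => number = Math.random, // injectable for deterministic tests
): number {
  // 10s, 20s, 40s, ... capped at capMs.
  const exp = Math.min(capMs, baseMs * Math.pow(2, attempt));
  const spread = exp * 0.2; // up to +/-10% jitter around the exponential delay
  return Math.round(exp - spread / 2 + jitter() * spread);
}
```

Injecting the jitter source keeps the helper testable while production callers fall back to `Math.random`.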
+ +If you regularly see sustained saturation even after these knobs: + +- move diagnostics calls to larger `finalizationDiagnosticsCacheTtlMs` window, +- or adopt the next step (snapshot projection) for high-throughput, always-on polling. + +### Environment adapter checklist for `queryFinalizationState` + +For EmDash-native integrations (HTTP routes, cron workers, and any EmDash-hosted +tooling surface), adapter code should preserve the shared semantics by passing a +single coherent `RouteContext` into the seam: + +- Build a stable `Request` object and set `request.method` explicitly (the seam + expects standard handler semantics). +- Populate `requestMeta.ip` from the platform edge/request context. +- Bind `ctx.kv` to the plugin KV access layer (same key namespace across + environments). +- Keep `ctx.storage`, `ctx.log`, and `ctx.requestMeta` present and consistent. +- Forward auth/session context only as needed for route-level gates outside this seam; + the seam itself is read-only and does not mutate commerce storage. +- Keep per-environment wrappers thin: all diagnostics caching, rate limiting, and + coalescing live in `queryFinalizationState`. + +This keeps `queryFinalizationState` portable: one kernel path, many transport +adapters. + +### MCP-ready service entry point policy + +- MCP integrations are expected to call the same service paths and error codes as HTTP + route entry points. +- MCP-facing tools must not issue storage writes directly into commerce collections. +- Any future MCP command surface should treat this file’s rules as non-negotiable. + +## Failure behavior expectations + +- Receipt states remain: + - `pending`: resumable/finalize-retry path. + - `processed` or `error`: terminal and explicit. +- A finalized order must never be produced by third-party code; all finalize side effects + come from kernel services. +- Extension errors should be observable but must not degrade kernel invariants. 
+- Read-only seams are the only extension path for payment-state inspection. diff --git a/packages/plugins/commerce/FINALIZATION_REVIEW_AUDIT.md b/packages/plugins/commerce/FINALIZATION_REVIEW_AUDIT.md new file mode 100644 index 000000000..43e63c913 --- /dev/null +++ b/packages/plugins/commerce/FINALIZATION_REVIEW_AUDIT.md @@ -0,0 +1,67 @@ +# Finalization Receipt State and Replay Audit + +Use this as the single audit artifact for recovery-path behavior in +`finalizePaymentFromWebhook()`. + +## 1) Receipt-state exits after pending write + +After `webhookReceipts.put(receiptId, { status: "pending", ... })`, every branch +must resolve to one of three outcomes: + +- **`TERMINAL_ERROR`**: do not auto-retry on operator-triggered follow-up. +- **`RESUMABLE_PENDING`**: keep `pending` and retrying the same event should + continue safely. +- **`COMPLETED`**: write `processed` and return success. + +| Branch after pending write | Receipt status | Why this outcome | +| ------------------------------------------------------------------------------- | ------------------ | -------------------------------------------------------------------------------------------------------------------------------------- | +| Re-read of order fails (`post_pending_lookup`) | `error` | The order row is gone; this is a terminal integrity signal for investigation. | +| Order no longer finalizable (`paymentPhase` not `payment_pending`/`authorized`) | `error` | Concurrency or external mutation moved state; retrying blindly is unsafe. | +| Inventory preflight fails (version mismatch, insufficient stock, etc.) | `pending` | Side effects were intentionally not applied; retry can safely retry from scratch using same event context. | +| Order persistence fails (`orders.put` failure during finalization) | `pending` | Inventory may be applied, but payment-phase transition is incomplete; retry is expected. 
| +| Payment attempt persistence fails (`paymentAttempts.put` failure) | `pending` | Order may be paid, but attempt state is incomplete; retry is expected. | +| Finalization writes succeed, but `webhookReceipts.put(processed)` fails | `pending` (throws) | Caller receives a transport error; a retried call continues from the same idempotent state and should now complete receipt processing. | +| Full success path | `processed` | Terminal success; subsequent replay returns `replay` semantics where appropriate. | + +## 1b) Log events for recovery tooling + +Preferred operational events: + +- `commerce.finalize.receipt_pending` +- `commerce.finalize.order_not_found` +- `commerce.finalize.order_not_finalizable` +- `commerce.finalize.inventory_reconcile` +- `commerce.finalize.inventory_applied` +- `commerce.finalize.inventory_failed` +- `commerce.finalize.order_settlement_attempt` +- `commerce.finalize.payment_attempt_update_attempt` +- `commerce.finalize.receipt_processed` +- `commerce.finalize.completed` + +## 1c) 5E/5F lease enforcement follow-through + +`finalizePaymentFromWebhook()` applies strict lease boundary checks as the default behavior: + +- malformed or missing `claimExpiresAt` is treated as replay-safe (`claim_retry_failed`) instead of silently continuing side-effect writes, +- finalization remains bounded by live claim validation before each mutable write stage (`inventory`, `order`, `attempt`, `receipt`), +- strict mode still allows reclaim of valid stale claims (`now > claimExpiresAt`) and preserves in-flight lock semantics. + +Operational evidence for this stage is recorded in the current strategy and regression +checklists as active proof trails. 
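The strict lease boundary rule above can be sketched as a single pure check applied before each mutable write stage. Names here are illustrative; the real logic lives in `finalize-payment.ts`:

```typescript
// Strict lease boundary sketch: malformed lease metadata never allows
// side-effect writes, live claims proceed only for their owner, and
// stale claims (now past claimExpiresAt) may be reclaimed.
type LeaseDecision = "proceed" | "claim_retry_failed" | "reclaim";

function checkClaimLease(
  claimExpiresAt: unknown, // value re-read from the receipt row before each write stage
  ownedByThisFlight: boolean,
  nowMs: number,
): LeaseDecision {
  // Malformed or missing lease metadata: replay-safe refusal, not a silent continue.
  if (typeof claimExpiresAt !== "number" || !Number.isFinite(claimExpiresAt)) {
    return "claim_retry_failed";
  }
  if (ownedByThisFlight && nowMs <= claimExpiresAt) return "proceed"; // live owned claim
  if (!ownedByThisFlight && nowMs > claimExpiresAt) return "reclaim"; // stale foreign claim
  return "claim_retry_failed"; // live claim held elsewhere, or own claim expired
}
```

Re-running this check before the `inventory`, `order`, `attempt`, and `receipt` stages is what keeps finalization bounded by live claim validation.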
+ +## 2) Duplicate delivery & partial-failure replay matrix + +| Scenario | Expected outcome | Why it is safe today | +| ------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | +| Duplicate webhook event with same `(providerId, externalEventId)` in a shared runtime | Idempotent or replay-like behavior (status transitions + deterministic IDs). | Existing receipt key (`webhookReceiptDocId`) is stable; ledger/order writes are deterministic. | +| Same event replay while previous attempt is still `pending` | Resume from `pending` state; side effects remain bounded. | Decision/receipt/query logic is deterministic and keyed by the same event id. | +| Partial failure after some side effects (inventory/order/attempt) | Receipt stays `pending` unless missing/non-finalizable order case. | In-progress state is preserved and documented for safe retry. | +| Perfectly concurrent cross-worker delivery | Residual risk remains bounded. | Claim ownership now uses lease metadata plus ownership-version checks; safe revalidation points can short-circuit writes before side effects, but platform-specific timing around concurrent updates is still a residual watchpoint. 
| + +## 3) Operational references + +- Primary contract for this path: `packages/plugins/commerce/src/orchestration/finalize-payment.ts` +- Receipt state query helper: `queryFinalizationStatus` +- Current proof points: + - `src/orchestration/finalize-payment.test.ts` (pending branches, retry, and duplicate delivery) + - `src/services/commerce-extension-seams.test.ts` (status query contract) diff --git a/packages/plugins/commerce/PAID_BUT_WRONG_STOCK_RUNBOOK.md b/packages/plugins/commerce/PAID_BUT_WRONG_STOCK_RUNBOOK.md new file mode 100644 index 000000000..63b880541 --- /dev/null +++ b/packages/plugins/commerce/PAID_BUT_WRONG_STOCK_RUNBOOK.md @@ -0,0 +1,108 @@ +# Runbook: Paid order but inventory appears wrong + +Use this if a merchant reports: **“customer is marked paid, but stock is wrong.”** + +## 1) What we want to confirm first + +- Customer order ID +- Payment external event ID (from the payment provider/webhook) +- Approximate incident time (UTC) +- Logs for this order/event in the last 24h: + - `commerce.finalize.order_update_failed` + - `commerce.finalize.attempt_update_failed` + - `commerce.finalize.inventory_failed` + - `commerce.finalize.completed` + +## 2) Check order and webhook state + +- Open the order: + - If `paymentPhase = paid`, treat as “possibly finalized.” + - If `paymentPhase` is still `payment_pending`/`authorized`, a finalization retry may still be needed. +- Open webhook receipt row for the event: + - `processed` = finalize already completed for this event. + - `pending` = partial finalization happened and retry may continue safely. + - `error`/missing = inspect logs before retrying. + - `receiptErrorCode` (new): + - `ORDER_NOT_FOUND` = order record disappeared while finalizing; do not auto-retry. + - `ORDER_STATE_CONFLICT` = payment state shifted between reads; verify intent before retrying. 
- `INVENTORY_CHANGED`, `INSUFFICIENT_STOCK`, `PRODUCT_UNAVAILABLE` = inventory is terminally mismatched; do not auto-retry without manual correction.
+- Open payment attempt rows for this order/provider:
+  - `succeeded` means payment attempt did finalize.
+  - `pending` means finalization likely interrupted.
+
+## 2b) Optional status helper check (if tooling is available)
+
+- Use `queryFinalizationState` and map:
+  - `resumeState: not_started` → no event-side writes yet.
+  - `pending_inventory` → inventory application missing/partial.
+  - `pending_order` → order has not been set to `paid`.
+  - `pending_attempt` → payment attempt still `pending`.
+  - `pending_receipt` → order+attempt done, waiting on receipt write.
+  - `event_unknown` → order/payment/attempt indicators are already complete, but this event id has no receipt row.
+  - `replay_processed`/`replay_duplicate` → terminal replay paths.
+  - `error` → investigate before retrying.
+
+## 3) Check stock/ledger consistency
+
+- Open inventory ledger rows with:
+  - `referenceType = "order"`
+  - `referenceId = <orderId>`
+- Open current stock rows for SKUs in the order.
+
+## 4) Decision tree (do only one path)
+
+### A. Ledger has order entry **and** stock looks decremented correctly
+
+- If order is not yet `paid` (or attempt still `pending`) and receipt is `pending`:
+  - Retry finalize once.
+  - Re-check that order is `paid`, attempt is `succeeded`, receipt is `processed`.
+- If order is already `paid` and receipt is `processed`:
+  - Do **not** force state changes.
+  - Report as successful reconciliation.
+
+### B. Ledger exists but stock did not move
+
+- If receipt is `pending`, retry finalize once. `pending` captures partial-write cases such as ledger write success before stock write.
+- Re-check that receipt moves to `processed` and stock/attempt are corrected.
+- If the single retry does not resolve, escalate to engineering with all captured evidence.
+
+### C. Ledger missing and stock not moved, but order is `paid`
+
+- Do **not** force stock edits in product admin on your own.
+- Escalate immediately for manual reconciliation.
+
+## 5) Safe retry notes
+
+Retries should be run only when evidence says the order was likely in partial-write state.
+
+- Run a single retry.
+- Re-check after it completes:
+  - order becomes `paid`
+  - payment attempt becomes `succeeded`
+  - receipt becomes `processed`
+- If it fails again, stop and escalate.
+
+## 6) Escalation checklist
+
+- Create/attach a ticket with:
+  - orderId, payment event id, timestamps
+  - order state before/after
+  - receipt state (`processed`/`pending`/`error`)
+  - `receiptErrorCode` (if status is `error`)
+  - stock and ledger IDs involved
+  - whether retry was attempted and result code/message
+- Assign to: on-call engineer + merchant support lead.
+
+## 7) Alerting recommendation
+
+Enable alerting if the same order/retry pattern happens repeatedly:
+
+- 2+ finalize retries in 10 minutes for the same order, or
+- Same event ID repeatedly ending in `order_update_failed` / `attempt_update_failed`.
+
+## 8) Final communication to merchant
+
+Use this template:
+
+> We verified partial finalization behavior for this order.
+> Current state is [paid | not paid], receipt state is [state], stock/ledger are [in-sync | out-of-sync].
+> Action taken: [retry / escalated].
+> If unresolved, next step is manual ledger-stock reconciliation with engineering.
diff --git a/packages/plugins/commerce/PAID_BUT_WRONG_STOCK_RUNBOOK_SUPPORT.md b/packages/plugins/commerce/PAID_BUT_WRONG_STOCK_RUNBOOK_SUPPORT.md
new file mode 100644
index 000000000..e5c67d594
--- /dev/null
+++ b/packages/plugins/commerce/PAID_BUT_WRONG_STOCK_RUNBOOK_SUPPORT.md
@@ -0,0 +1,75 @@
+# Support Playbook: Customer paid but stock looks wrong
+
+Use this quick checklist if a merchant or customer support agent reports, “The customer paid but the inventory is wrong.”
+
+## What to check first
+
+- Get the **Order ID** from support chat.
+- Get the **payment event ID** from the webhook logs (if shown).
+- Ask when the issue was first noticed (time and timezone).
+
+## Quick checks in the system
+
+1. Open the order.
+   - If order is already `paid`, we usually only need to confirm consistency.
+   - If order is not `paid`, it may be a failed retry and needs one finalize attempt.
+
+2. Open webhook receipt status for that event.
+   - `processed` = this event was already handled.
+   - `pending` = event is in partial-finalization recovery and may be safely retried once.
+   - `error` or missing = do not retry blindly; escalate.
+   - `receiptErrorCode` (new) guides escalation:
+     - `ORDER_NOT_FOUND` = order row disappeared during finalization; do not auto-retry.
+     - `ORDER_STATE_CONFLICT` = state changed between reads; investigate before manual intervention.
+     - `INVENTORY_CHANGED`, `INSUFFICIENT_STOCK`, `PRODUCT_UNAVAILABLE` = terminal inventory mismatch; manual correction required before retrying.
+
+3. Open payment attempt rows for the order.
+   - `succeeded` means finalize reached payment-attempt stage.
+   - `pending` means we may have hit a partial-write failure.
+
+4. Open inventory movement log for that order.
+   - Ledger rows should exist if stock was already decremented.
+   - Compare with current stock quantity.
+
+### Optional status helper path
+
+- Open `queryFinalizationState` when available and map `resumeState`:
+  - `pending_inventory` → retry begins by resolving inventory application.
+  - `pending_order` → retry continues with order transition.
+  - `pending_attempt` → retry continues with payment-attempt transition.
+  - `pending_receipt` → retry should finalize the receipt only.
+  - `event_unknown` → no event row exists; confirm order/payment/attempt are already consistent and do not retry.
+  - `replay_processed` / `replay_duplicate` → no retry; treat as already handled.
+  - `error` → investigate and escalate before retrying.
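The `resumeState` values above map directly onto the support actions this playbook prescribes. A minimal sketch of that mapping (the `ResumeAction` labels are illustrative, not part of the commerce API):

```typescript
// Illustrative only: the state names come from this playbook; the action
// labels are assumptions chosen for the sketch.
type ResumeAction = "retry_once" | "no_retry" | "verify_consistency" | "escalate";

function resumeActionFor(state: string): ResumeAction {
  switch (state) {
    case "pending_inventory":
    case "pending_order":
    case "pending_attempt":
    case "pending_receipt":
      return "retry_once"; // partial finalization: one controlled retry
    case "replay_processed":
    case "replay_duplicate":
      return "no_retry"; // already handled; do not retry
    case "event_unknown":
      return "verify_consistency"; // confirm order/payment/attempt state, no retry
    default:
      return "escalate"; // `error` and anything unexpected: investigate first
  }
}
```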
+
+## Decision: what to do
+
+### Case A: Ledger and stock look correct, order already paid
+
+- Do **not** change stock.
+- Send confirmation back: this is a reconciliation pass with no manual change needed.
+
+### Case B: Receipt is pending and order is not fully finalized
+
+- Retry finalization **once**.
+- Re-check:
+  - order now says `paid`
+  - payment attempt says `succeeded`
+  - receipt now says `processed`
+
+### Case C: Ledger shows a stock movement but current stock is unchanged, or data looks inconsistent
+
+- Retry once if the receipt is `pending` and the order is not fully final.
+- If retry does not complete or state remains inconsistent, do **not** keep retrying; escalate to engineering for manual investigation.
+
+## When to escalate immediately
+
+- Same order retries more than twice in 10 minutes.
+- Repeated failures with:
+  - `commerce.finalize.order_update_failed`
+  - `commerce.finalize.attempt_update_failed`
+- A paid order has no matching stock/ledger movement.
+
+## What to write back to the merchant
+
+“We confirmed the order/payment state and inventory records.
We’re either good after one controlled retry, or we’ve escalated a data consistency issue to engineering.” diff --git a/packages/plugins/commerce/package.json b/packages/plugins/commerce/package.json new file mode 100644 index 000000000..3c5f69f85 --- /dev/null +++ b/packages/plugins/commerce/package.json @@ -0,0 +1,36 @@ +{ + "name": "@emdash-cms/plugin-commerce", + "version": "0.1.0", + "description": "EmDash commerce — checkout + idempotent webhook finalize (kernel-first)", + "type": "module", + "main": "src/index.ts", + "exports": { + ".": "./src/index.ts", + "./kernel/errors": "./src/kernel/errors.ts", + "./kernel/api-errors": "./src/kernel/api-errors.ts", + "./kernel/finalize-decision": "./src/kernel/finalize-decision.ts", + "./kernel/limits": "./src/kernel/limits.ts", + "./kernel/idempotency-key": "./src/kernel/idempotency-key.ts", + "./kernel/provider-policy": "./src/kernel/provider-policy.ts", + "./kernel/rate-limit-window": "./src/kernel/rate-limit-window.ts" + }, + "files": ["src"], + "scripts": { + "test": "vitest run", + "typecheck": "tsc --noEmit" + }, + "dependencies": { + "ulidx": "^2.4.1" + }, + "peerDependencies": { + "astro": ">=6.0.0-beta.0", + "emdash": "workspace:*" + }, + "devDependencies": { + "astro": "catalog:", + "emdash": "workspace:*", + "typescript": "catalog:", + "vitest": "catalog:" + }, + "private": true +} diff --git a/packages/plugins/commerce/src/catalog-extensibility.ts b/packages/plugins/commerce/src/catalog-extensibility.ts new file mode 100644 index 000000000..ac673a02e --- /dev/null +++ b/packages/plugins/commerce/src/catalog-extensibility.ts @@ -0,0 +1,126 @@ +/** + * Contracts for catalog / content integration — vector search, LLM context, MCP. + * + * Commerce storage holds **IDs and numeric snapshots** on line items (`productId`, + * `variantId`, `unitPriceMinor`, `quantity`). 
Rich text, `shortDescription`, and
+ * embedding payloads belong on **catalog documents** (EmDash content or a future
+ * core vector index), not duplicated on orders.
+ *
+ * @see ../AI-EXTENSIBILITY.md
+ */
+
+/** Optional fields a catalog product document may expose for search and agents. */
+export interface CommerceCatalogProductSearchFields {
+  /** Plain text for embeddings, snippets, and LLM grounding (alongside PT body for humans). */
+  shortDescription?: string;
+  /** Stable id of the content node or blob used when generating embeddings. */
+  searchDocumentId?: string;
+}
+
+/**
+ * Read-only recommendation contract used by storefront features and read-only MCP
+ * tooling. The commerce kernel remains authoritative for checkout/finalization
+ * and inventory writes.
+ *
+ * Third-party recommender implementations must be side-effect free with respect
+ * to commerce documents.
+ */
+export type CommerceRecommendationInput = {
+  productId?: string;
+  variantId?: string;
+  cartId?: string;
+  limit?: number;
+};
+
+export type CommerceRecommendationResult = {
+  productIds: readonly string[];
+  providerId?: string;
+  reason?: string;
+};
+
+export interface CommerceRecommendationResolver {
+  (ctx: CommerceRecommendationInput): Promise<CommerceRecommendationResult>;
+}
+
+/**
+ * Closed-kernel service boundary for recommendation providers.
+ *
+ * Providers are intentionally read-only and should only surface candidate product
+ * identifiers. They must not mutate carts, orders, attempts, or receipts.
+ */
+export interface CommerceRecommendationContract extends CommerceRecommendationResolver {
+  readonly providerId: string;
+  readonly readOnly: true;
+}
+
+/**
+ * Reserved hook names for future event fan-out (loyalty, analytics, MCP).
+ * Not registered by the commerce kernel until those slices exist.
+ */
+export const COMMERCE_EXTENSION_HOOKS = {
+  /** After a read-only recommendation response is produced.
*/ + recommendationsResolved: "commerce:recommendations-resolved", +} as const; + +/** + * Reserved hook names for future event fan-out (loyalty, analytics, MCP). + * Not registered by the commerce kernel until those slices exist. + */ +export const COMMERCE_RECOMMENDATION_HOOKS = { + ...COMMERCE_EXTENSION_HOOKS, +} as const; + +export const COMMERCE_EXTENSION_SEAM_DOCS = { + recommendations: { + name: "createRecommendationsRoute", + role: "read-only", + readonlyInputs: ["productId", "variantId", "cartId", "limit"], + mutability: "No commerce writes are allowed.", + }, + webhooks: { + name: "createPaymentWebhookRoute", + role: "provider-adapter", + requiredAdapterMethods: [ + "verifyRequest", + "buildFinalizeInput", + "buildCorrelationId", + "buildRateLimitSuffix", + ], + mutability: + "Implementations may only emit events; only finalizePaymentFromWebhook writes payment state.", + }, + diagnostics: { + name: "queryFinalizationState", + role: "read-model", + output: [ + "isInventoryApplied", + "isOrderPaid", + "isPaymentAttemptSucceeded", + "receiptStatus", + "resumeState", + ], + mutability: "Read-only query surface for MCP/operations tooling.", + }, +} as const; + +/** + * Kernel invariants exposed to third-party integrators. + * + * The values are not meant as runtime policy controls; they are explicit API + * guarantees for integrators and MCP tool authors. + */ +export const COMMERCE_KERNEL_RULES = { + /** Checkout, webhook verification, and finalize are closed to extension bypass. */ + no_kernel_bypass: "commerce:kernel-no-bypass", + /** + * Third-party recommendation/catalog integrations are post-derivation only and + * cannot mutate commerce state. + */ + read_only_extensions: "commerce:read-only-extensions", + /** + * All external calls for order read and payment state must pass through stable + * exported services (`queryFinalizationStatus`, `finalizePaymentFromWebhook`, + * `queryFinalizationState`). 
+ */
+  service_entry_points_only: "commerce:service-entry-points-only",
+} as const;
diff --git a/packages/plugins/commerce/src/contracts/commerce-kernel-invariants.test.ts b/packages/plugins/commerce/src/contracts/commerce-kernel-invariants.test.ts
new file mode 100644
index 000000000..5241e1527
--- /dev/null
+++ b/packages/plugins/commerce/src/contracts/commerce-kernel-invariants.test.ts
@@ -0,0 +1,190 @@
+import type { RouteContext } from "emdash";
+import { describe, expect, it, vi } from "vitest";
+
+import { COMMERCE_EXTENSION_SEAM_DOCS, COMMERCE_KERNEL_RULES } from "../catalog-extensibility.js";
+import { webhookReceiptDocId } from "../orchestration/finalize-payment.js";
+import {
+  createRecommendationsRoute,
+  queryFinalizationState,
+} from "../services/commerce-extension-seams.js";
+import type {
+  StoredInventoryLedgerEntry,
+  StoredInventoryStock,
+  StoredOrder,
+  StoredPaymentAttempt,
+  StoredWebhookReceipt,
+} from "../types.js";
+
+type QueryCollection<T> = {
+  get(id: string): Promise<T | null>;
+  query(options?: { where?: Record<string, unknown>; limit?: number }): Promise<{
+    items: Array<{ id: string; data: T }>;
+    hasMore: boolean;
+  }>;
+};
+
+function makeCollections() {
+  const baseOrder: StoredOrder = {
+    cartId: "cart_1",
+    paymentPhase: "paid",
+    currency: "USD",
+    lineItems: [],
+    finalizeTokenHash: "placeholder-finalize-token-hash",
+    totalMinor: 1000,
+    createdAt: "2026-04-03T12:00:00.000Z",
+    updatedAt: "2026-04-03T12:00:00.000Z",
+  };
+  const paymentAttempt: StoredPaymentAttempt = {
+    orderId: "order_1",
+    providerId: "stripe",
+    status: "succeeded",
+    createdAt: "2026-04-03T12:00:00.000Z",
+    updatedAt: "2026-04-03T12:00:00.000Z",
+  };
+  const ledgerRow: StoredInventoryLedgerEntry = {
+    productId: "prod_1",
+    variantId: "",
+    delta: -1,
+    referenceType: "order",
+    referenceId: "order_1",
+    createdAt: "2026-04-03T12:00:00.000Z",
+  };
+  const stock: StoredInventoryStock = {
+    productId: "prod_1",
+    variantId: "",
+    version: 1,
+    quantity: 1,
+    updatedAt: "2026-04-03T12:00:00.000Z",
+  };
+  const receipt: StoredWebhookReceipt = {
+    providerId: "stripe",
+    externalEventId: "evt_1",
+    orderId: "order_1",
+    status: "processed",
+    createdAt: "2026-04-03T12:00:00.000Z",
+    updatedAt: "2026-04-03T12:00:00.000Z",
+  };
+  return {
+    orders: new Map([["order_1", baseOrder]]),
+    paymentAttempts: new Map([["attempt_1", paymentAttempt]]),
+    inventoryLedger: new Map([["ledger_1", ledgerRow]]),
+    inventoryStock: new Map([["stock_1", stock]]),
+    webhookReceipts: new Map([
+      [webhookReceiptDocId("stripe", "evt_1"), receipt],
+    ]),
+  };
+}
+
+function asCollection<T>(map: Map<string, T>): QueryCollection<T> {
+  return {
+    async get(id: string): Promise<T | null> {
+      const row = map.get(id);
+      return row ? structuredClone(row) : null;
+    },
+    async query(options?: { where?: Record<string, unknown>; limit?: number }) {
+      const where = options?.where ?? {};
+      const values = [...map.entries()].filter(([, row]) =>
+        Object.entries(where).every(
+          ([field, value]) => (row as Record<string, unknown>)[field] === value,
+        ),
+      );
+      const items = values.slice(0, options?.limit ?? 50).map(([id, data]) => ({
+        id,
+        data: structuredClone(data),
+      }));
+      return { items, hasMore: false };
+    },
+  };
+}
+
+function toCollections() {
+  const raw = makeCollections();
+  return {
+    orders: asCollection(raw.orders),
+    paymentAttempts: asCollection(raw.paymentAttempts),
+    inventoryLedger: asCollection(raw.inventoryLedger),
+    inventoryStock: asCollection(raw.inventoryStock),
+    webhookReceipts: asCollection(raw.webhookReceipts),
+  };
+}
+
+class MemKv {
+  store = new Map<string, unknown>();
+
+  async get<T>(key: string): Promise<T | null> {
+    const row = this.store.get(key);
+    return row === undefined ? null : (row as T);
+  }
+
+  async set(key: string, value: unknown): Promise<void> {
+    this.store.set(key, value);
+  }
+
+  async delete(key: string): Promise<boolean> {
+    return this.store.delete(key);
+  }
+
+  async list(): Promise<Array<{ key: string; value: unknown }>> {
+    return Array.from(this.store.entries(), ([key, value]) => ({ key, value }));
+  }
+}
+
+describe("commerce kernel invariants", () => {
+  it("exports the kernel closure and read-only extension rules", () => {
+    expect(COMMERCE_KERNEL_RULES).toEqual({
+      no_kernel_bypass: "commerce:kernel-no-bypass",
+      read_only_extensions: "commerce:read-only-extensions",
+      service_entry_points_only: "commerce:service-entry-points-only",
+    });
+    expect(COMMERCE_EXTENSION_SEAM_DOCS.webhooks.mutability).toContain(
+      "finalizePaymentFromWebhook",
+    );
+    expect(COMMERCE_EXTENSION_SEAM_DOCS.recommendations.mutability).toContain("No commerce writes");
+  });
+
+  it("keeps diagnostic helper read-only by construction", async () => {
+    const ctx = {
+      request: new Request("https://example.test/diagnostics", { method: "POST" }),
+      storage: toCollections(),
+      requestMeta: { ip: "127.0.0.1" },
+      kv: new MemKv(),
+      log: {
+        info: vi.fn(),
+        warn: vi.fn(),
+        error: vi.fn(),
+        debug: vi.fn(),
+      },
+    } as unknown as RouteContext;
+
+    const status = await queryFinalizationState(ctx, {
+      orderId: "order_1",
+      providerId: "stripe",
+      externalEventId: "evt_1",
+    });
+    expect(status.receiptStatus).toBe("processed");
+    expect(status.resumeState).toBe("replay_processed");
+  });

+  it("replays recommendation seam as read-only response surface", async () => {
+    const route = createRecommendationsRoute({
+      providerId: "acme-recs",
+      resolver: async () => ({ productIds: ["p1", "p2"] }),
+    });
+    const out = await route({
+      request: new Request("https://example.test/recommendations", {
+        method: "POST",
+        body: JSON.stringify({ limit: 3 }),
+      }),
+      input: { limit: 3 },
+    } as never);
+
+    expect(out).toEqual({
+      ok: true,
+      enabled: true,
+      strategy: "provider",
+      productIds: ["p1", "p2"],
providerId: "acme-recs", + reason: "provider_result", + }); + }); +}); diff --git a/packages/plugins/commerce/src/contracts/storage-index-validation.test.ts b/packages/plugins/commerce/src/contracts/storage-index-validation.test.ts new file mode 100644 index 000000000..7dcc5269a --- /dev/null +++ b/packages/plugins/commerce/src/contracts/storage-index-validation.test.ts @@ -0,0 +1,139 @@ +import { describe, expect, it } from "vitest"; + +import { COMMERCE_STORAGE_CONFIG } from "../storage.js"; + +type IndexKind = string | readonly string[]; + +function includesIndex( + collection: + | "orders" + | "carts" + | "paymentAttempts" + | "productAssets" + | "productAssetLinks" + | "webhookReceipts" + | "idempotencyKeys" + | "products" + | "productSkus" + | "productAttributes" + | "productAttributeValues" + | "productSkuOptionValues" + | "digitalAssets" + | "digitalEntitlements" + | "categories" + | "productCategoryLinks" + | "productTags" + | "productTagLinks" + | "bundleComponents" + | "inventoryLedger" + | "inventoryStock", + index: readonly string[], + unique = false, +): boolean { + const cfg = COMMERCE_STORAGE_CONFIG[collection]; + const bucket = unique + ? "uniqueIndexes" in cfg + ? ((cfg as { uniqueIndexes?: readonly IndexKind[] }).uniqueIndexes ?? 
[]) + : [] + : cfg.indexes; + return bucket.some((entry: IndexKind) => { + if (typeof entry === "string") { + return index.length === 1 && entry === index[0]; + } + return entry.length === index.length && entry.every((part, i) => part === index[i]); + }); +} + +describe("storage index contracts", () => { + it("supports payment attempt lookup path used by finalize/idempotency", () => { + expect(includesIndex("paymentAttempts", ["orderId", "providerId", "status"])).toBe(true); + }); + + it("supports inventory reconciliation lookup path for finalize", () => { + expect(includesIndex("inventoryLedger", ["referenceType", "referenceId"])).toBe(true); + }); + + it("contains required unique constraints for duplicate-safe writes", () => { + expect(includesIndex("webhookReceipts", ["providerId", "externalEventId"], true)).toBe(true); + expect(includesIndex("idempotencyKeys", ["keyHash", "route"], true)).toBe(true); + expect( + includesIndex( + "inventoryLedger", + ["referenceType", "referenceId", "productId", "variantId"], + true, + ), + ).toBe(true); + }); + + it("keeps deterministic index coverage for status-read diagnostics path", () => { + expect(includesIndex("inventoryStock", ["productId", "variantId"], true)).toBe(true); + expect(includesIndex("paymentAttempts", ["orderId", "providerId", "status"])).toBe(true); + }); + + it("supports catalog product lookup and uniqueness invariants", () => { + expect(includesIndex("products", ["slug"])).toBe(true); + expect(includesIndex("products", ["slug"], true)).toBe(true); + expect(includesIndex("products", ["status"])).toBe(true); + }); + + it("supports catalog SKU lookup and sku-code uniqueness invariants", () => { + expect(includesIndex("productSkus", ["productId"])).toBe(true); + expect(includesIndex("productSkus", ["skuCode"], true)).toBe(true); + }); + + it("supports catalog asset records and lookup invariants", () => { + expect(includesIndex("productAssets", ["provider", "externalAssetId"])).toBe(true); + 
expect(includesIndex("productAssets", ["provider", "externalAssetId"], true)).toBe(true); + }); + + it("supports catalog asset link lookup and idempotent linking", () => { + expect(includesIndex("productAssetLinks", ["targetType", "targetId"])).toBe(true); + expect(includesIndex("productAssetLinks", ["targetType", "targetId", "assetId"], true)).toBe( + true, + ); + }); + + it("supports variable attribute metadata lookups", () => { + expect(includesIndex("productAttributes", ["productId"])).toBe(true); + expect(includesIndex("productAttributes", ["productId", "kind"])).toBe(true); + expect(includesIndex("productAttributes", ["productId", "code"], true)).toBe(true); + expect(includesIndex("productAttributeValues", ["attributeId"])).toBe(true); + expect(includesIndex("productAttributeValues", ["attributeId", "code"], true)).toBe(true); + }); + + it("supports SKU option mapping invariants", () => { + expect(includesIndex("productSkuOptionValues", ["skuId"])).toBe(true); + expect(includesIndex("productSkuOptionValues", ["attributeId"])).toBe(true); + expect(includesIndex("productSkuOptionValues", ["skuId", "attributeId"], true)).toBe(true); + }); + + it("supports digital asset records and entitlements", () => { + expect(includesIndex("digitalAssets", ["provider", "externalAssetId"])).toBe(true); + expect(includesIndex("digitalAssets", ["provider", "externalAssetId"], true)).toBe(true); + expect(includesIndex("digitalEntitlements", ["skuId"])).toBe(true); + expect(includesIndex("digitalEntitlements", ["digitalAssetId"])).toBe(true); + expect(includesIndex("digitalEntitlements", ["skuId", "digitalAssetId"], true)).toBe(true); + }); + + it("supports bundle components and composition lookups", () => { + expect(includesIndex("bundleComponents", ["bundleProductId"])).toBe(true); + expect(includesIndex("bundleComponents", ["bundleProductId", "componentSkuId"], true)).toBe( + true, + ); + expect(includesIndex("bundleComponents", ["bundleProductId", "position"])).toBe(true); + 
}); + + it("supports catalog organization lookup indexes", () => { + expect(includesIndex("categories", ["slug"])).toBe(true); + expect(includesIndex("categories", ["slug"], true)).toBe(true); + expect(includesIndex("categories", ["parentId"])).toBe(true); + expect(includesIndex("productCategoryLinks", ["productId"])).toBe(true); + expect(includesIndex("productCategoryLinks", ["categoryId"])).toBe(true); + expect(includesIndex("productCategoryLinks", ["productId", "categoryId"], true)).toBe(true); + expect(includesIndex("productTags", ["slug"])).toBe(true); + expect(includesIndex("productTags", ["slug"], true)).toBe(true); + expect(includesIndex("productTagLinks", ["productId"])).toBe(true); + expect(includesIndex("productTagLinks", ["tagId"])).toBe(true); + expect(includesIndex("productTagLinks", ["productId", "tagId"], true)).toBe(true); + }); +}); diff --git a/packages/plugins/commerce/src/handlers/cart.test.ts b/packages/plugins/commerce/src/handlers/cart.test.ts new file mode 100644 index 000000000..17e5e700c --- /dev/null +++ b/packages/plugins/commerce/src/handlers/cart.test.ts @@ -0,0 +1,637 @@ +/** + * Tests for cart/upsert and cart/get handlers, plus the end-to-end + * chain: cart/upsert → checkout → payment_pending order. 
+ */
+
+import type { RouteContext } from "emdash";
+import { beforeEach, describe, expect, it, vi } from "vitest";
+
+import { COMMERCE_LIMITS } from "../kernel/limits.js";
+import { sha256HexAsync } from "../lib/crypto-adapter.js";
+import { inventoryStockDocId } from "../orchestration/finalize-payment.js";
+import type { CartGetInput, CartUpsertInput, CheckoutInput } from "../schemas.js";
+import type {
+  StoredBundleComponent,
+  StoredCart,
+  StoredDigitalAsset,
+  StoredDigitalEntitlement,
+  StoredIdempotencyKey,
+  StoredInventoryStock,
+  StoredOrder,
+  StoredPaymentAttempt,
+  StoredProduct,
+  StoredProductAsset,
+  StoredProductAssetLink,
+  StoredProductSku,
+  StoredProductSkuOptionValue,
+} from "../types.js";
+import { cartGetHandler, cartUpsertHandler } from "./cart.js";
+import { checkoutHandler } from "./checkout.js";
+
+const consumeKvRateLimit = vi.fn(async (_opts?: unknown) => true);
+vi.mock("../lib/rate-limit-kv.js", () => ({
+  __esModule: true,
+  consumeKvRateLimit: (opts: unknown) => consumeKvRateLimit(opts),
+}));
+
+// ---------------------------------------------------------------------------
+// Shared test infrastructure (mirrors checkout.test.ts pattern)
+// ---------------------------------------------------------------------------
+
+class MemColl<T> {
+  constructor(public readonly rows = new Map<string, T>()) {}
+
+  async get(id: string): Promise<T | null> {
+    const row = this.rows.get(id);
+    return row ?
structuredClone(row) : null;
+  }
+
+  async put(id: string, data: T): Promise<void> {
+    this.rows.set(id, structuredClone(data));
+  }
+}
+
+function decodeStockDocId(id: string): { productId: string; variantId: string } | null {
+  const prefix = "stock:";
+  if (!id.startsWith(prefix)) return null;
+  const rest = id.slice(prefix.length);
+  const idx = rest.indexOf(":");
+  if (idx === -1) return null;
+  return {
+    productId: decodeURIComponent(rest.slice(0, idx)),
+    variantId: decodeURIComponent(rest.slice(idx + 1)),
+  };
+}
+
+/**
+ * Serves generous default stock for any `stock:product:variant` id so cart upsert
+ * tests do not need per-SKU seed rows.
+ */
+class PermissiveInventoryStockColl {
+  constructor(public readonly rows = new Map<string, StoredInventoryStock>()) {}
+
+  async get(id: string): Promise<StoredInventoryStock | null> {
+    const row = this.rows.get(id);
+    if (row) return structuredClone(row);
+    const parsed = decodeStockDocId(id);
+    if (!parsed) return null;
+    return {
+      productId: parsed.productId,
+      variantId: parsed.variantId,
+      version: 1,
+      quantity: 50_000,
+      updatedAt: "2026-01-01T00:00:00.000Z",
+    };
+  }
+
+  async put(id: string, data: StoredInventoryStock): Promise<void> {
+    this.rows.set(id, structuredClone(data));
+  }
+}
+
+class DefaultProductsColl extends MemColl<StoredProduct> {
+  override async get(id: string): Promise<StoredProduct | null> {
+    const row = this.rows.get(id);
+    if (row) return structuredClone(row);
+    const ts = "2026-01-01T00:00:00.000Z";
+    return {
+      id,
+      type: "simple",
+      status: "active",
+      visibility: "public",
+      slug: id,
+      title: id,
+      shortDescription: "",
+      longDescription: "",
+      featured: false,
+      sortOrder: 0,
+      requiresShippingDefault: true,
+      createdAt: ts,
+      updatedAt: ts,
+    };
+  }
+}
+
+class MemKv {
+  store = new Map<string, unknown>();
+
+  async get<T>(key: string): Promise<T | null> {
+    const row = this.store.get(key);
+    return row === undefined ?
null : (row as T);
+  }
+
+  async set<T>(key: string, value: T): Promise<void> {
+    this.store.set(key, value);
+  }
+}
+
+type CartGetInputForTest = Omit<CartGetInput, "ownerToken"> & { ownerToken?: string };
+type CheckoutInputForTest = Omit<CheckoutInput, "ownerToken"> & { ownerToken?: string };
+
+function asRouteContext(context: unknown): RouteContext {
+  return context as RouteContext;
+}
+
+function upsertCtx(
+  input: CartUpsertInput,
+  carts: MemColl<StoredCart>,
+  kv: MemKv,
+): RouteContext {
+  return asRouteContext({
+    request: new Request("https://example.test/cart/upsert", { method: "POST" }),
+    input,
+    storage: {
+      carts,
+      products: new DefaultProductsColl(),
+      bundleComponents: new MemColl<StoredBundleComponent>(),
+      productSkus: new MemColl<StoredProductSku>(),
+      inventoryStock: new PermissiveInventoryStockColl(),
+    },
+    requestMeta: { ip: "127.0.0.1" },
+    kv,
+  });
+}
+
+function getCtx(
+  input: CartGetInputForTest,
+  carts: MemColl<StoredCart>,
+): RouteContext {
+  return asRouteContext({
+    request: new Request("https://example.test/cart/get", { method: "POST" }),
+    input: {
+      cartId: input.cartId,
+      ...(input.ownerToken !== undefined ? { ownerToken: input.ownerToken } : {}),
+    },
+    storage: { carts },
+    requestMeta: { ip: "127.0.0.1" },
+    kv: new MemKv(),
+  });
+}
+
+function checkoutCtx(
+  input: CheckoutInputForTest,
+  carts: MemColl<StoredCart>,
+  orders: MemColl<StoredOrder>,
+  paymentAttempts: MemColl<StoredPaymentAttempt>,
+  idempotencyKeys: MemColl<StoredIdempotencyKey>,
+  inventoryStock: MemColl<StoredInventoryStock>,
+  kv: MemKv,
+): RouteContext {
+  return asRouteContext({
+    request: new Request("https://example.test/checkout", {
+      method: "POST",
+      headers: new Headers({ "Idempotency-Key": input.idempotencyKey ?? "" }),
+    }),
+    input: {
+      cartId: input.cartId,
+      idempotencyKey: input.idempotencyKey,
+      ...(input.ownerToken !== undefined ?
{ ownerToken: input.ownerToken } : {}),
+    },
+    storage: {
+      carts,
+      orders,
+      paymentAttempts,
+      idempotencyKeys,
+      inventoryStock,
+      products: new DefaultProductsColl(),
+      bundleComponents: new MemColl<StoredBundleComponent>(),
+      productSkus: new MemColl<StoredProductSku>(),
+      productSkuOptionValues: new MemColl<StoredProductSkuOptionValue>(),
+      digitalAssets: new MemColl<StoredDigitalAsset>(),
+      digitalEntitlements: new MemColl<StoredDigitalEntitlement>(),
+      productAssetLinks: new MemColl<StoredProductAssetLink>(),
+      productAssets: new MemColl<StoredProductAsset>(),
+    },
+    requestMeta: { ip: "127.0.0.1" },
+    kv,
+  });
+}
+
+const LINE = {
+  productId: "p1",
+  quantity: 1,
+  inventoryVersion: 1,
+  unitPriceMinor: 1000,
+} as const;
+
+// ---------------------------------------------------------------------------
+// cart/upsert
+// ---------------------------------------------------------------------------
+
+describe("cartUpsertHandler", () => {
+  beforeEach(() => {
+    consumeKvRateLimit.mockResolvedValue(true);
+  });
+
+  it("requires POST method", async () => {
+    const carts = new MemColl<StoredCart>();
+    const kv = new MemKv();
+    const ctx = {
+      request: new Request("https://example.test/cart/upsert", { method: "GET" }),
+      input: { cartId: "c_method", currency: "USD", lineItems: [LINE] },
+      storage: { carts },
+      requestMeta: { ip: "127.0.0.1" },
+      kv,
+    } as unknown as RouteContext;
+    await expect(cartUpsertHandler(ctx)).rejects.toMatchObject({ code: "METHOD_NOT_ALLOWED" });
+  });
+
+  it("enforces cart line item cap", async () => {
+    const carts = new MemColl<StoredCart>();
+    const kv = new MemKv();
+    const tooMany = Array.from({ length: COMMERCE_LIMITS.maxCartLineItems + 1 }, (_, i) => ({
+      ...LINE,
+      productId: `p-${i}`,
+    }));
+    await expect(
+      cartUpsertHandler(
+        upsertCtx({ cartId: "c_caps", currency: "USD", lineItems: tooMany }, carts, kv),
+      ),
+    ).rejects.toMatchObject({ code: "payload_too_large" });
+  });
+
+  it("rate-limits cart mutation bursts", async () => {
+    const carts = new MemColl<StoredCart>();
+    const kv = new MemKv();
+    consumeKvRateLimit.mockResolvedValueOnce(false);
+
+    await expect(
+      cartUpsertHandler(
+        upsertCtx({ cartId:
"c_rate", currency: "USD", lineItems: [LINE] }, carts, kv), + ), + ).rejects.toMatchObject({ code: "rate_limited" }); + }); + + it("creates a cart and returns an ownerToken on first upsert", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + const result = await cartUpsertHandler( + upsertCtx({ cartId: "c1", currency: "USD", lineItems: [LINE] }, carts, kv), + ); + + expect(result.cartId).toBe("c1"); + expect(result.currency).toBe("USD"); + expect(result.lineItemCount).toBe(1); + expect(result.ownerToken).toBeDefined(); + expect(typeof result.ownerToken).toBe("string"); + expect((result.ownerToken ?? "").length).toBeGreaterThan(0); + expect(carts.rows.size).toBe(1); + }); + + it("does not return ownerToken on subsequent upserts", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + const first = await cartUpsertHandler( + upsertCtx({ cartId: "c2", currency: "USD", lineItems: [LINE] }, carts, kv), + ); + const token = first.ownerToken!; + + const second = await cartUpsertHandler( + upsertCtx( + { + cartId: "c2", + currency: "USD", + lineItems: [LINE, { ...LINE, productId: "p2" }], + ownerToken: token, + }, + carts, + kv, + ), + ); + + expect(second.ownerToken).toBeUndefined(); + expect(second.lineItemCount).toBe(2); + }); + + it("rejects mutation without ownerToken when cart has one", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + await cartUpsertHandler( + upsertCtx({ cartId: "c3", currency: "USD", lineItems: [LINE] }, carts, kv), + ); + + // PluginRouteError stores the wire code (snake_case), not the internal code. 
+ await expect( + cartUpsertHandler(upsertCtx({ cartId: "c3", currency: "USD", lineItems: [] }, carts, kv)), + ).rejects.toMatchObject({ code: "cart_token_required" }); + }); + + it("rejects mutation with wrong ownerToken", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + await cartUpsertHandler( + upsertCtx({ cartId: "c4", currency: "USD", lineItems: [LINE] }, carts, kv), + ); + + await expect( + cartUpsertHandler( + upsertCtx( + { cartId: "c4", currency: "USD", lineItems: [], ownerToken: "a".repeat(48) }, + carts, + kv, + ), + ), + ).rejects.toMatchObject({ code: "cart_token_invalid" }); + }); + + it("preserves createdAt across updates", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + const first = await cartUpsertHandler( + upsertCtx({ cartId: "c5", currency: "USD", lineItems: [LINE] }, carts, kv), + ); + const storedAfterCreate = await carts.get("c5"); + const createdAt = storedAfterCreate!.createdAt; + + await cartUpsertHandler( + upsertCtx( + { cartId: "c5", currency: "USD", lineItems: [LINE], ownerToken: first.ownerToken }, + carts, + kv, + ), + ); + const storedAfterUpdate = await carts.get("c5"); + + expect(storedAfterUpdate!.createdAt).toBe(createdAt); + }); + + it("rejects invalid line item quantity", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + await expect( + cartUpsertHandler( + upsertCtx( + { cartId: "c6", currency: "USD", lineItems: [{ ...LINE, quantity: 0 }] }, + carts, + kv, + ), + ), + ).rejects.toThrow(); + }); + + it("rejects negative unit price", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + await expect( + cartUpsertHandler( + upsertCtx( + { cartId: "c7", currency: "USD", lineItems: [{ ...LINE, unitPriceMinor: -1 }] }, + carts, + kv, + ), + ), + ).rejects.toThrow(); + }); + + it("stores ownerTokenHash not the raw token", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + const result = await cartUpsertHandler( + 
upsertCtx({ cartId: "c8", currency: "USD", lineItems: [LINE] }, carts, kv), + ); + const stored = await carts.get("c8"); + expect(stored!.ownerTokenHash).toBe(await sha256HexAsync(result.ownerToken!)); + expect(stored!.ownerTokenHash).not.toBe(result.ownerToken); + }); +}); + +// --------------------------------------------------------------------------- +// cart/get +// --------------------------------------------------------------------------- + +describe("cartGetHandler", () => { + beforeEach(() => { + consumeKvRateLimit.mockResolvedValue(true); + }); + + it("requires POST method", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + await carts.put("g_method", { + currency: "USD", + lineItems: [LINE], + ownerTokenHash: "owner-hash-method", + createdAt: "2026-04-03T12:00:00.000Z", + updatedAt: "2026-04-03T12:00:00.000Z", + }); + const ctx = { + request: new Request("https://example.test/cart/get", { method: "GET" }), + input: { cartId: "g_method" }, + storage: { carts }, + requestMeta: { ip: "127.0.0.1" }, + kv, + } as unknown as RouteContext; + await expect(cartGetHandler(ctx)).rejects.toMatchObject({ code: "METHOD_NOT_ALLOWED" }); + }); + + it("returns cart contents for a known cartId when ownerToken matches", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + const created = await cartUpsertHandler( + upsertCtx({ cartId: "g1", currency: "EUR", lineItems: [LINE] }, carts, kv), + ); + + const result = await cartGetHandler( + getCtx({ cartId: "g1", ownerToken: created.ownerToken }, carts), + ); + + expect(result.cartId).toBe("g1"); + expect(result.currency).toBe("EUR"); + expect(result.lineItems).toHaveLength(1); + expect(result.lineItems[0]?.productId).toBe("p1"); + }); + + it("returns CART_NOT_FOUND for unknown cartId", async () => { + const carts = new MemColl(); + // PluginRouteError stores the wire code (snake_case). 
+ await expect(cartGetHandler(getCtx({ cartId: "missing" }, carts))).rejects.toMatchObject({ + code: "cart_not_found", + }); + }); + + it("does not expose ownerTokenHash in the response", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + const created = await cartUpsertHandler( + upsertCtx({ cartId: "g2", currency: "USD", lineItems: [LINE] }, carts, kv), + ); + + const result = await cartGetHandler( + getCtx({ cartId: "g2", ownerToken: created.ownerToken }, carts), + ); + + expect(result).not.toHaveProperty("ownerTokenHash"); + }); + + it("rejects read without ownerToken when cart has ownerTokenHash", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + await cartUpsertHandler( + upsertCtx({ cartId: "g3", currency: "USD", lineItems: [LINE] }, carts, kv), + ); + + await expect(cartGetHandler(getCtx({ cartId: "g3" }, carts))).rejects.toMatchObject({ + code: "cart_token_required", + }); + }); + + it("rejects read with wrong ownerToken", async () => { + const carts = new MemColl(); + const kv = new MemKv(); + await cartUpsertHandler( + upsertCtx({ cartId: "g4", currency: "USD", lineItems: [LINE] }, carts, kv), + ); + + await expect( + cartGetHandler(getCtx({ cartId: "g4", ownerToken: "b".repeat(32) }, carts)), + ).rejects.toMatchObject({ code: "cart_token_invalid" }); + }); +}); + +// --------------------------------------------------------------------------- +// Integration chain: cart/upsert → checkout → payment_pending +// --------------------------------------------------------------------------- + +describe("cart → checkout integration chain", () => { + it("creates a payment_pending order from a cart upserted via the handler", async () => { + const cartId = "chain-cart-1"; + const idempotencyKey = "chain-idemp-key-strong-1"; + const now = "2026-04-03T12:00:00.000Z"; + + const carts = new MemColl(); + const orders = new MemColl(); + const paymentAttempts = new MemColl(); + const idempotencyKeys = new MemColl(); + const 
inventoryStock = new MemColl( + new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 1, + quantity: 10, + updatedAt: now, + }, + ], + ]), + ); + const kv = new MemKv(); + + // Step 1: upsert cart via handler (no manual storage poke) + const upsertResult = await cartUpsertHandler( + upsertCtx({ cartId, currency: "USD", lineItems: [LINE] }, carts, kv), + ); + expect(upsertResult.ownerToken).toBeDefined(); + + // Step 2: checkout against the upserted cart (possession proof matches cart/get/upsert) + const checkoutResult = await checkoutHandler( + checkoutCtx( + { cartId, idempotencyKey, ownerToken: upsertResult.ownerToken }, + carts, + orders, + paymentAttempts, + idempotencyKeys, + inventoryStock, + kv, + ), + ); + + expect(checkoutResult.paymentPhase).toBe("payment_pending"); + expect(checkoutResult.currency).toBe("USD"); + expect(checkoutResult.totalMinor).toBe(1000); + expect(typeof checkoutResult.orderId).toBe("string"); + expect(typeof checkoutResult.finalizeToken).toBe("string"); + expect(orders.rows.size).toBe(1); + expect(paymentAttempts.rows.size).toBe(1); + }); + + it("rejects checkout without ownerToken after cart upsert established possession", async () => { + const cartId = "chain-cart-no-token"; + const idempotencyKey = "chain-idemp-key-no-tok-1"; + const now = "2026-04-03T12:00:00.000Z"; + + const carts = new MemColl(); + const orders = new MemColl(); + const paymentAttempts = new MemColl(); + const idempotencyKeys = new MemColl(); + const inventoryStock = new MemColl( + new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 1, + quantity: 10, + updatedAt: now, + }, + ], + ]), + ); + const kv = new MemKv(); + + await cartUpsertHandler(upsertCtx({ cartId, currency: "USD", lineItems: [LINE] }, carts, kv)); + + await expect( + checkoutHandler( + checkoutCtx( + { cartId, idempotencyKey }, + carts, + orders, + paymentAttempts, + idempotencyKeys, + inventoryStock, + kv, + 
), + ), + ).rejects.toMatchObject({ code: "cart_token_required" }); + }); + + it("checkout is idempotent for the same cart and key", async () => { + const cartId = "chain-cart-2"; + const idempotencyKey = "chain-idemp-key-strong-2"; + const now = "2026-04-03T12:00:00.000Z"; + + const carts = new MemColl(); + const orders = new MemColl(); + const paymentAttempts = new MemColl(); + const idempotencyKeys = new MemColl(); + const inventoryStock = new MemColl( + new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 1, + quantity: 10, + updatedAt: now, + }, + ], + ]), + ); + const kv = new MemKv(); + + const upserted = await cartUpsertHandler( + upsertCtx({ cartId, currency: "USD", lineItems: [LINE] }, carts, kv), + ); + + const ctx = checkoutCtx( + { cartId, idempotencyKey, ownerToken: upserted.ownerToken }, + carts, + orders, + paymentAttempts, + idempotencyKeys, + inventoryStock, + kv, + ); + + const first = await checkoutHandler(ctx); + const second = await checkoutHandler(ctx); + + expect(second).toEqual(first); + expect(orders.rows.size).toBe(1); + expect(paymentAttempts.rows.size).toBe(1); + }); +}); diff --git a/packages/plugins/commerce/src/handlers/cart.ts b/packages/plugins/commerce/src/handlers/cart.ts new file mode 100644 index 000000000..630b8807e --- /dev/null +++ b/packages/plugins/commerce/src/handlers/cart.ts @@ -0,0 +1,176 @@ +/** + * Cart handlers: upsert and get. + * + * Ownership model + * --------------- + * On first creation the server issues an opaque `ownerToken` (random hex, 24 bytes). + * Only the SHA-256 hash is stored on the cart document (`ownerTokenHash`). + * The raw token is returned once in the creation response and must be presented + * by the caller on all subsequent reads (`cart/get`) and mutations (`cart/upsert`). 
+ * + * This is intentionally the same pattern as `finalizeToken`/`finalizeTokenHash` + * on orders — it gives us a future-proof ownership surface without requiring a + * full auth session, and without any breaking API changes when sessions arrive. + * + * Rate limiting + * ------------- + * Mutations are rate-limited per cart token hash (not IP) so that a shared + * storefront origin does not exhaust a single IP bucket. + */ + +import type { RouteContext } from "emdash"; +import { PluginRouteError } from "emdash"; + +import { COMMERCE_LIMITS } from "../kernel/limits.js"; +import { projectCartLineItemsForStorage } from "../lib/cart-lines.js"; +import { assertCartOwnerToken } from "../lib/cart-owner-token.js"; +import { validateCartLineItems } from "../lib/cart-validation.js"; +import { validateLineItemsStockForCheckout } from "../lib/checkout-inventory-validation.js"; +import { randomHex, sha256HexAsync } from "../lib/crypto-adapter.js"; +import { consumeKvRateLimit } from "../lib/rate-limit-kv.js"; +import { requirePost } from "../lib/require-post.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { CartGetInput, CartUpsertInput } from "../schemas.js"; +import type { + StoredBundleComponent, + StoredCart, + StoredInventoryStock, + StoredProduct, + StoredProductSku, +} from "../types.js"; +import { asCollection } from "./catalog-conflict.js"; + +// --------------------------------------------------------------------------- +// cart/upsert +// --------------------------------------------------------------------------- + +export type CartUpsertResponse = { + cartId: string; + currency: string; + lineItemCount: number; + updatedAt: string; + /** + * Present on first creation for newly provisioned carts. + * The caller must store this token — it is never returned again. + * Required for all subsequent mutations. 
+   */
+  ownerToken?: string;
+};
+
+export async function cartUpsertHandler(
+  ctx: RouteContext<CartUpsertInput>,
+): Promise<CartUpsertResponse> {
+  requirePost(ctx);
+
+  const nowMs = Date.now();
+  const nowIso = new Date(nowMs).toISOString();
+
+  const carts = asCollection<StoredCart>(ctx.storage.carts);
+  const existing = await carts.get(ctx.input.cartId);
+  let ownerToken: string | undefined;
+  let ownerTokenHash: string;
+
+  if (existing) {
+    await assertCartOwnerToken(existing, ctx.input.ownerToken, "mutate");
+    ownerTokenHash = existing.ownerTokenHash;
+  } else {
+    ownerToken = await randomHex(24);
+    ownerTokenHash = await sha256HexAsync(ownerToken);
+  }
+
+  // --- Rate limit: keyed by cartId for first-time/new carts, token hash thereafter ---
+  const rateLimitByCartId = !existing;
+  const cartIdHash = await sha256HexAsync(ctx.input.cartId);
+  const rateLimitKey = rateLimitByCartId
+    ? `cart:id:${cartIdHash.slice(0, 32)}`
+    : `cart:token:${ownerTokenHash.slice(0, 32)}`;
+
+  const allowed = await consumeKvRateLimit({
+    kv: ctx.kv,
+    keySuffix: rateLimitKey,
+    limit: COMMERCE_LIMITS.defaultCartMutationsPerTokenPerWindow,
+    windowMs: COMMERCE_LIMITS.defaultRateWindowMs,
+    nowMs,
+  });
+  if (!allowed) {
+    throwCommerceApiError({
+      code: "RATE_LIMITED",
+      message: "Too many cart mutations; try again shortly",
+    });
+  }
+
+  // --- Validate line items ---
+  if (ctx.input.lineItems.length > COMMERCE_LIMITS.maxCartLineItems) {
+    throwCommerceApiError({
+      code: "PAYLOAD_TOO_LARGE",
+      message: `Cart must not exceed ${COMMERCE_LIMITS.maxCartLineItems} line items`,
+    });
+  }
+  const lineItemValidationMessage = validateCartLineItems(ctx.input.lineItems);
+  if (lineItemValidationMessage) {
+    throw PluginRouteError.badRequest(lineItemValidationMessage);
+  }
+
+  const inventoryStock = asCollection<StoredInventoryStock>(ctx.storage.inventoryStock);
+  await validateLineItemsStockForCheckout(ctx.input.lineItems, {
+    products: asCollection<StoredProduct>(ctx.storage.products),
+    bundleComponents: asCollection<StoredBundleComponent>(ctx.storage.bundleComponents),
+    productSkus:
asCollection<StoredProductSku>(ctx.storage.productSkus),
+    inventoryStock,
+  });
+
+  // --- Persist ---
+  const cart: StoredCart = {
+    currency: ctx.input.currency,
+    lineItems: projectCartLineItemsForStorage(ctx.input.lineItems),
+    ownerTokenHash,
+    createdAt: existing?.createdAt ?? nowIso,
+    updatedAt: nowIso,
+  };
+
+  await carts.put(ctx.input.cartId, cart);
+
+  const response: CartUpsertResponse = {
+    cartId: ctx.input.cartId,
+    currency: cart.currency,
+    lineItemCount: cart.lineItems.length,
+    updatedAt: cart.updatedAt,
+  };
+  if (ownerToken) {
+    response.ownerToken = ownerToken;
+  }
+  return response;
+}
+
+// ---------------------------------------------------------------------------
+// cart/get
+// ---------------------------------------------------------------------------
+
+export type CartGetResponse = {
+  cartId: string;
+  currency: string;
+  lineItems: StoredCart["lineItems"];
+  createdAt: string;
+  updatedAt: string;
+};
+
+export async function cartGetHandler(ctx: RouteContext<CartGetInput>): Promise<CartGetResponse> {
+  requirePost(ctx);
+
+  const carts = asCollection<StoredCart>(ctx.storage.carts);
+  const cart = await carts.get(ctx.input.cartId);
+
+  if (!cart) {
+    throwCommerceApiError({ code: "CART_NOT_FOUND", message: "Cart not found" });
+  }
+
+  await assertCartOwnerToken(cart, ctx.input.ownerToken, "read");
+
+  return {
+    cartId: ctx.input.cartId,
+    currency: cart.currency,
+    lineItems: cart.lineItems,
+    createdAt: cart.createdAt,
+    updatedAt: cart.updatedAt,
+  };
+}
diff --git a/packages/plugins/commerce/src/handlers/catalog-asset.ts b/packages/plugins/commerce/src/handlers/catalog-asset.ts
new file mode 100644
index 000000000..730ca693a
--- /dev/null
+++ b/packages/plugins/commerce/src/handlers/catalog-asset.ts
@@ -0,0 +1,239 @@
+import type { RouteContext } from "emdash";
+import { PluginRouteError } from "emdash";
+
+import { randomHex } from "../lib/crypto-adapter.js";
+import {
+  mutateOrderedChildren,
+  normalizeOrderedChildren,
+  normalizeOrderedPosition,
+  sortOrderedRowsByPosition,
+} from
"../lib/ordered-rows.js";
+import { requirePost } from "../lib/require-post.js";
+import { throwCommerceApiError } from "../route-errors.js";
+import type {
+  ProductAssetLinkInput,
+  ProductAssetReorderInput,
+  ProductAssetRegisterInput,
+  ProductAssetUnlinkInput,
+} from "../schemas.js";
+import type {
+  ProductAssetLinkTarget,
+  StoredProduct,
+  StoredProductAsset,
+  StoredProductAssetLink,
+  StoredProductSku,
+} from "../types.js";
+import type { Collection } from "./catalog-conflict.js";
+import { asCollection, getNowIso, putWithConflictHandling } from "./catalog-conflict.js";
+import { queryAllPages } from "./catalog-read-model.js";
+import type {
+  ProductAssetResponse,
+  ProductAssetLinkResponse,
+  ProductAssetUnlinkResponse,
+} from "./catalog.js";
+
+async function queryAssetLinksForTarget(
+  productAssetLinks: Collection<StoredProductAssetLink>,
+  targetType: ProductAssetLinkTarget,
+  targetId: string,
+): Promise<StoredProductAssetLink[]> {
+  const rows = await queryAllPages((cursor) =>
+    productAssetLinks.query({
+      where: { targetType, targetId },
+      cursor,
+      limit: 100,
+    }),
+  );
+  return normalizeOrderedChildren(sortOrderedRowsByPosition(rows.map((row) => row.data)));
+}
+
+async function loadCatalogTargetExists(
+  products: Collection<StoredProduct>,
+  productSkus: Collection<StoredProductSku>,
+  targetType: ProductAssetLinkTarget,
+  targetId: string,
+) {
+  if (targetType === "product") {
+    const product = await products.get(targetId);
+    if (!product) {
+      throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Product not found" });
+    }
+    return;
+  }
+
+  const sku = await productSkus.get(targetId);
+  if (!sku) {
+    throwCommerceApiError({ code: "VARIANT_UNAVAILABLE", message: "SKU not found" });
+  }
+}
+
+export async function handleRegisterProductAsset(
+  ctx: RouteContext<ProductAssetRegisterInput>,
+): Promise<ProductAssetResponse> {
+  requirePost(ctx);
+  const productAssets = asCollection<StoredProductAsset>(ctx.storage.productAssets);
+  const nowIso = getNowIso();
+
+  const id = `asset_${await randomHex(6)}`;
+  const asset: StoredProductAsset = {
+    id,
+    provider: ctx.input.provider,
+
externalAssetId: ctx.input.externalAssetId, + fileName: ctx.input.fileName, + altText: ctx.input.altText, + mimeType: ctx.input.mimeType, + byteSize: ctx.input.byteSize, + width: ctx.input.width, + height: ctx.input.height, + metadata: ctx.input.metadata, + createdAt: nowIso, + updatedAt: nowIso, + }; + + await putWithConflictHandling(productAssets, id, asset, { + where: { + provider: ctx.input.provider, + externalAssetId: ctx.input.externalAssetId, + }, + message: "Asset metadata already registered for provider asset key", + }); + return { asset }; +} + +export async function handleLinkCatalogAsset( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const role = ctx.input.role ?? "gallery_image"; + const position = ctx.input.position ?? 0; + const nowIso = getNowIso(); + const productAssets = asCollection(ctx.storage.productAssets); + const productAssetLinks = asCollection(ctx.storage.productAssetLinks); + const products = asCollection(ctx.storage.products); + const skus = asCollection(ctx.storage.productSkus); + + const targetType = ctx.input.targetType; + const targetId = ctx.input.targetId; + + const asset = await productAssets.get(ctx.input.assetId); + if (!asset) { + throwCommerceApiError({ code: "ASSET_NOT_FOUND", message: "Asset not found" }); + } + + await loadCatalogTargetExists(products, skus, targetType, targetId); + + const links = await queryAssetLinksForTarget(productAssetLinks, targetType, targetId); + if (role === "primary_image") { + const hasPrimary = links.some((link) => link.role === "primary_image"); + if (hasPrimary) { + throw PluginRouteError.badRequest("Target already has a primary image"); + } + } + + const linkId = `asset_link_${await randomHex(6)}`; + const requestedPosition = normalizeOrderedPosition(position); + + const link: StoredProductAssetLink = { + id: linkId, + targetType, + targetId, + assetId: ctx.input.assetId, + role, + position: requestedPosition, + createdAt: nowIso, + updatedAt: nowIso, + }; + await 
putWithConflictHandling(productAssetLinks, linkId, link, { + where: { + targetType, + targetId, + assetId: ctx.input.assetId, + }, + message: "Asset already linked to this target", + }); + + let normalized: StoredProductAssetLink[]; + try { + normalized = await mutateOrderedChildren({ + collection: productAssetLinks, + rows: links, + mutation: { + kind: "add", + row: link, + requestedPosition, + }, + nowIso, + }); + } catch (error) { + await productAssetLinks.delete(linkId); + throw error; + } + + const created = normalized.find((candidate) => candidate.id === linkId); + if (!created) { + throw PluginRouteError.badRequest("Asset link not found after create"); + } + return { link: created }; +} + +export async function handleUnlinkCatalogAsset( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const nowIso = getNowIso(); + const productAssetLinks = asCollection(ctx.storage.productAssetLinks); + const existing = await productAssetLinks.get(ctx.input.linkId); + if (!existing) { + throwCommerceApiError({ code: "ASSET_LINK_NOT_FOUND", message: "Asset link not found" }); + } + const links = await queryAssetLinksForTarget( + productAssetLinks, + existing.targetType, + existing.targetId, + ); + + await mutateOrderedChildren({ + collection: productAssetLinks, + rows: links, + mutation: { + kind: "remove", + removedRowId: ctx.input.linkId, + }, + nowIso, + }); + + return { deleted: true }; +} + +export async function handleReorderCatalogAsset( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const productAssetLinks = asCollection(ctx.storage.productAssetLinks); + const nowIso = getNowIso(); + + const link = await productAssetLinks.get(ctx.input.linkId); + if (!link) { + throwCommerceApiError({ code: "ASSET_LINK_NOT_FOUND", message: "Asset link not found" }); + } + + const links = await queryAssetLinksForTarget(productAssetLinks, link.targetType, link.targetId); + const requestedPosition = normalizeOrderedPosition(ctx.input.position); + const normalized = await 
mutateOrderedChildren({ + collection: productAssetLinks, + rows: links, + mutation: { + kind: "move", + rowId: ctx.input.linkId, + requestedPosition, + notFoundMessage: "Asset link not found in target links", + }, + nowIso, + }); + + const updated = normalized.find((candidate) => candidate.id === ctx.input.linkId); + if (!updated) { + throw PluginRouteError.badRequest("Asset link not found after reorder"); + } + return { link: updated }; +} diff --git a/packages/plugins/commerce/src/handlers/catalog-association.ts b/packages/plugins/commerce/src/handlers/catalog-association.ts new file mode 100644 index 000000000..4199e4aff --- /dev/null +++ b/packages/plugins/commerce/src/handlers/catalog-association.ts @@ -0,0 +1,228 @@ +import type { RouteContext } from "emdash"; +import { PluginRouteError } from "emdash"; + +import { randomHex } from "../lib/crypto-adapter.js"; +import { requirePost } from "../lib/require-post.js"; +import { sortedImmutable } from "../lib/sort-immutable.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { + CategoryCreateInput, + CategoryListInput, + ProductCategoryLinkInput, + ProductCategoryUnlinkInput, + TagCreateInput, + TagListInput, + ProductTagLinkInput, + ProductTagUnlinkInput, +} from "../schemas.js"; +import type { + StoredCategory, + StoredProduct, + StoredProductCategoryLink, + StoredProductTag, + StoredProductTagLink, +} from "../types.js"; +import type { Collection } from "./catalog-conflict.js"; +import { asCollection, getNowIso, putWithConflictHandling } from "./catalog-conflict.js"; +import type { + CategoryResponse, + CategoryListResponse, + ProductCategoryLinkResponse, + ProductCategoryLinkUnlinkResponse, + TagResponse, + TagListResponse, + ProductTagLinkResponse, + ProductTagLinkUnlinkResponse, +} from "./catalog.js"; + +export async function handleCreateCategory( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const categories = asCollection(ctx.storage.categories); + const nowIso = 
getNowIso();
+
+  if (ctx.input.parentId) {
+    const parent = await categories.get(ctx.input.parentId);
+    if (!parent) {
+      throw PluginRouteError.badRequest(`Category parent not found: ${ctx.input.parentId}`);
+    }
+  }
+
+  const id = `cat_${await randomHex(6)}`;
+  const category: StoredCategory = {
+    id,
+    name: ctx.input.name,
+    slug: ctx.input.slug,
+    parentId: ctx.input.parentId,
+    position: ctx.input.position,
+    createdAt: nowIso,
+    updatedAt: nowIso,
+  };
+  await putWithConflictHandling(categories, id, category, {
+    where: { slug: ctx.input.slug },
+    message: `Category slug already exists: ${ctx.input.slug}`,
+  });
+  return { category };
+}
+
+export async function handleListCategories(
+  ctx: RouteContext<CategoryListInput>,
+): Promise<CategoryListResponse> {
+  requirePost(ctx);
+  const categories = asCollection<StoredCategory>(ctx.storage.categories);
+
+  const where: Record<string, unknown> = {};
+  if (ctx.input.parentId) {
+    where.parentId = ctx.input.parentId;
+  }
+
+  const result = await categories.query({
+    where,
+    limit: ctx.input.limit,
+  });
+  const items = sortedImmutable(
+    result.items.map((row) => row.data),
+    (left, right) => left.position - right.position || left.slug.localeCompare(right.slug),
+  );
+  return { items };
+}
+
+export async function handleCreateProductCategoryLink(
+  ctx: RouteContext<ProductCategoryLinkInput>,
+): Promise<ProductCategoryLinkResponse> {
+  requirePost(ctx);
+  const products = asCollection<StoredProduct>(ctx.storage.products);
+  const categories = asCollection<StoredCategory>(ctx.storage.categories);
+  const productCategoryLinks = asCollection<StoredProductCategoryLink>(
+    ctx.storage.productCategoryLinks,
+  );
+  const nowIso = getNowIso();
+
+  const product = await products.get(ctx.input.productId);
+  if (!product) {
+    throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Product not found" });
+  }
+  const category = await categories.get(ctx.input.categoryId);
+  if (!category) {
+    throw PluginRouteError.badRequest(`Category not found: ${ctx.input.categoryId}`);
+  }
+
+  const id = `prod_cat_link_${await randomHex(6)}`;
+  const link: StoredProductCategoryLink = {
+    id,
+    productId:
ctx.input.productId, + categoryId: ctx.input.categoryId, + createdAt: nowIso, + updatedAt: nowIso, + }; + await putWithConflictHandling(productCategoryLinks, id, link, { + where: { + productId: ctx.input.productId, + categoryId: ctx.input.categoryId, + }, + message: "Product-category link already exists", + }); + return { link }; +} + +export async function handleRemoveProductCategoryLink( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const productCategoryLinks = asCollection( + ctx.storage.productCategoryLinks, + ); + const link = await productCategoryLinks.get(ctx.input.linkId); + if (!link) { + throwCommerceApiError({ + code: "CATEGORY_LINK_NOT_FOUND", + message: "Product-category link not found", + }); + } + + await productCategoryLinks.delete(ctx.input.linkId); + return { deleted: true }; +} + +export async function handleCreateTag(ctx: RouteContext): Promise { + requirePost(ctx); + const tags = asCollection(ctx.storage.productTags); + const nowIso = getNowIso(); + + const id = `tag_${await randomHex(6)}`; + const tag: StoredProductTag = { + id, + name: ctx.input.name, + slug: ctx.input.slug, + createdAt: nowIso, + updatedAt: nowIso, + }; + await putWithConflictHandling(tags, id, tag, { + where: { slug: ctx.input.slug }, + message: `Tag slug already exists: ${ctx.input.slug}`, + }); + return { tag }; +} + +export async function handleListTags(ctx: RouteContext): Promise { + requirePost(ctx); + const tags = asCollection(ctx.storage.productTags); + const result = await tags.query({ + limit: ctx.input.limit, + }); + const items = sortedImmutable( + result.items.map((row) => row.data), + (left, right) => left.slug.localeCompare(right.slug), + ); + return { items }; +} + +export async function handleCreateProductTagLink( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const products = asCollection(ctx.storage.products); + const tags = asCollection(ctx.storage.productTags); + const productTagLinks = asCollection(ctx.storage.productTagLinks); + 
const nowIso = getNowIso(); + + const product = await products.get(ctx.input.productId); + if (!product) { + throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Product not found" }); + } + const tag = await tags.get(ctx.input.tagId); + if (!tag) { + throw PluginRouteError.badRequest(`Tag not found: ${ctx.input.tagId}`); + } + + const id = `prod_tag_link_${await randomHex(6)}`; + const link: StoredProductTagLink = { + id, + productId: ctx.input.productId, + tagId: ctx.input.tagId, + createdAt: nowIso, + updatedAt: nowIso, + }; + await putWithConflictHandling(productTagLinks, id, link, { + where: { + productId: ctx.input.productId, + tagId: ctx.input.tagId, + }, + message: "Product-tag link already exists", + }); + return { link }; +} + +export async function handleRemoveProductTagLink( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const productTagLinks = asCollection(ctx.storage.productTagLinks); + const link = await productTagLinks.get(ctx.input.linkId); + if (!link) { + throwCommerceApiError({ code: "TAG_LINK_NOT_FOUND", message: "Product-tag link not found" }); + } + await productTagLinks.delete(ctx.input.linkId); + return { deleted: true }; +} diff --git a/packages/plugins/commerce/src/handlers/catalog-bundle.ts b/packages/plugins/commerce/src/handlers/catalog-bundle.ts new file mode 100644 index 000000000..cfa844ac7 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/catalog-bundle.ts @@ -0,0 +1,252 @@ +import type { RouteContext } from "emdash"; +import { PluginRouteError } from "emdash"; + +import { computeBundleSummary } from "../lib/catalog-bundles.js"; +import { randomHex } from "../lib/crypto-adapter.js"; +import { + normalizeOrderedChildren, + normalizeOrderedPosition, + mutateOrderedChildren, + sortOrderedRowsByPosition, +} from "../lib/ordered-rows.js"; +import { requirePost } from "../lib/require-post.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { + BundleComponentAddInput, + 
BundleComponentRemoveInput, + BundleComponentReorderInput, + BundleComputeInput, +} from "../schemas.js"; +import type { + StoredBundleComponent, + StoredInventoryStock, + StoredProduct, + StoredProductSku, +} from "../types.js"; +import type { Collection } from "./catalog-conflict.js"; +import { + asCollection, + asOptionalCollection, + getNowIso, + putWithConflictHandling, +} from "./catalog-conflict.js"; +import { hydrateSkusWithInventoryStock } from "./catalog-read-model.js"; +import { queryAllPages } from "./catalog-read-model.js"; +import type { + BundleComponentResponse, + BundleComponentUnlinkResponse, + BundleComputeResponse, +} from "./catalog.js"; + +export async function queryBundleComponentsForProduct( + bundleComponents: Collection, + bundleProductId: string, +): Promise { + const links = await queryAllPages((cursor) => + bundleComponents.query({ + where: { bundleProductId }, + cursor, + limit: 100, + }), + ); + const rows = sortOrderedRowsByPosition(links.map((row) => row.data)); + return normalizeOrderedChildren(rows); +} + +export async function handleAddBundleComponent( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const products = asCollection(ctx.storage.products); + const productSkus = asCollection(ctx.storage.productSkus); + const bundleComponents = asCollection(ctx.storage.bundleComponents); + const nowIso = getNowIso(); + + const bundleProduct = await products.get(ctx.input.bundleProductId); + if (!bundleProduct) { + throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Bundle product not found" }); + } + if (bundleProduct.type !== "bundle") { + throw PluginRouteError.badRequest("Target product is not a bundle"); + } + + const componentSku = await productSkus.get(ctx.input.componentSkuId); + if (!componentSku) { + throwCommerceApiError({ code: "VARIANT_UNAVAILABLE", message: "Component SKU not found" }); + } + if (componentSku.productId === bundleProduct.id) { + throw PluginRouteError.badRequest("Bundle cannot include 
component from itself"); + } + const componentProduct = await products.get(componentSku.productId); + if (!componentProduct) { + throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Component product not found" }); + } + if (componentProduct.type === "bundle") { + throw PluginRouteError.badRequest( + "Bundle cannot include component products that are themselves bundles", + ); + } + + const existingComponents = await queryBundleComponentsForProduct( + bundleComponents, + bundleProduct.id, + ); + const requestedPosition = normalizeOrderedPosition(ctx.input.position); + const componentId = `bundle_comp_${await randomHex(6)}`; + const component: StoredBundleComponent = { + id: componentId, + bundleProductId: bundleProduct.id, + componentSkuId: componentSku.id, + quantity: ctx.input.quantity, + position: requestedPosition, + createdAt: nowIso, + updatedAt: nowIso, + }; + await putWithConflictHandling(bundleComponents, componentId, component, { + where: { bundleProductId: bundleProduct.id, componentSkuId: ctx.input.componentSkuId }, + message: "Bundle already contains this component SKU", + }); + + let normalized: StoredBundleComponent[]; + try { + normalized = await mutateOrderedChildren({ + collection: bundleComponents, + rows: existingComponents, + mutation: { + kind: "add", + row: component, + requestedPosition, + }, + nowIso, + }); + } catch (error) { + await bundleComponents.delete(componentId); + throw error; + } + + const added = normalized.find((candidate) => candidate.id === componentId); + if (!added) { + throw PluginRouteError.badRequest("Bundle component not found after add"); + } + return { component: added }; +} + +export async function handleRemoveBundleComponent( + ctx: RouteContext<BundleComponentRemoveInput>, +): Promise<BundleComponentUnlinkResponse> { + requirePost(ctx); + const bundleComponents = asCollection<StoredBundleComponent>(ctx.storage.bundleComponents); + const nowIso = getNowIso(); + + const existing = await bundleComponents.get(ctx.input.bundleComponentId); + if (!existing) { + throwCommerceApiError({ + code: 
"BUNDLE_COMPONENT_NOT_FOUND", + message: "Bundle component not found", + }); + } + const components = await queryBundleComponentsForProduct( + bundleComponents, + existing.bundleProductId, + ); + await mutateOrderedChildren({ + collection: bundleComponents, + rows: components, + mutation: { + kind: "remove", + removedRowId: ctx.input.bundleComponentId, + }, + nowIso, + }); + return { deleted: true }; +} + +export async function handleReorderBundleComponent( + ctx: RouteContext<BundleComponentReorderInput>, +): Promise<BundleComponentResponse> { + requirePost(ctx); + const bundleComponents = asCollection<StoredBundleComponent>(ctx.storage.bundleComponents); + const nowIso = getNowIso(); + + const component = await bundleComponents.get(ctx.input.bundleComponentId); + if (!component) { + throwCommerceApiError({ + code: "BUNDLE_COMPONENT_NOT_FOUND", + message: "Bundle component not found", + }); + } + + const components = await queryBundleComponentsForProduct( + bundleComponents, + component.bundleProductId, + ); + const requestedPosition = normalizeOrderedPosition(ctx.input.position); + const normalized = await mutateOrderedChildren({ + collection: bundleComponents, + rows: components, + mutation: { + kind: "move", + rowId: ctx.input.bundleComponentId, + requestedPosition, + notFoundMessage: "Bundle component not found in target bundle", + }, + nowIso, + }); + + const updated = normalized.find((row) => row.id === ctx.input.bundleComponentId); + if (!updated) { + throw PluginRouteError.badRequest("Bundle component not found after reorder"); + } + return { component: updated }; +} + +export async function handleBundleCompute( + ctx: RouteContext<BundleComputeInput>, +): Promise<BundleComputeResponse> { + requirePost(ctx); + const products = asCollection<StoredProduct>(ctx.storage.products); + const productSkus = asCollection<StoredProductSku>(ctx.storage.productSkus); + const inventoryStock = asOptionalCollection<StoredInventoryStock>(ctx.storage.inventoryStock); + const bundleComponents = asCollection<StoredBundleComponent>(ctx.storage.bundleComponents); + + const product = await products.get(ctx.input.productId); + if (!product) { + throwCommerceApiError({ code: 
"PRODUCT_UNAVAILABLE", message: "Product not found" }); + } + if (product.type !== "bundle") { + throw PluginRouteError.badRequest("Product is not a bundle"); + } + + const components = await queryBundleComponentsForProduct(bundleComponents, product.id); + const lines: Array<{ component: StoredBundleComponent; sku: StoredProductSku }> = []; + for (const component of components) { + const sku = await productSkus.get(component.componentSkuId); + if (!sku) { + throwCommerceApiError({ + code: "VARIANT_UNAVAILABLE", + message: "Bundle component SKU not found", + }); + } + const componentProduct = await products.get(sku.productId); + if (!componentProduct) { + throwCommerceApiError({ + code: "PRODUCT_UNAVAILABLE", + message: "Bundle component product not found", + }); + } + const hydratedSkus = await hydrateSkusWithInventoryStock( + componentProduct, + [sku], + inventoryStock, + ); + lines.push({ component, sku: hydratedSkus[0] ?? sku }); + } + + return computeBundleSummary( + product.id, + product.bundleDiscountType, + product.bundleDiscountValueMinor, + product.bundleDiscountValueBps, + lines, + ); +} diff --git a/packages/plugins/commerce/src/handlers/catalog-conflict.ts b/packages/plugins/commerce/src/handlers/catalog-conflict.ts new file mode 100644 index 000000000..b9e14b759 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/catalog-conflict.ts @@ -0,0 +1,143 @@ +import type { StorageCollection } from "emdash"; +import { PluginRouteError } from "emdash"; + +export type Collection<T> = StorageCollection<T>; + +export type CollectionWithUniqueInsert<T> = Collection<T> & { + putIfAbsent?: (id: string, data: T) => Promise<boolean>; +}; + +export type ConflictHint = { + where: Record<string, string>; + message: string; +}; + +export const getNowIso = (): string => { + return new Date(Date.now()).toISOString(); +}; + +export const asCollection = <T>(raw: unknown): Collection<T> => { + return raw as Collection<T>; +}; + +export const asOptionalCollection = <T>(raw: unknown): Collection<T> | null => { + return raw ? (raw as Collection<T>) : null; +}; + +function looksLikeUniqueConstraintMessage(message: string): boolean { + const normalized = message.toLowerCase(); + return ( + normalized.includes("unique constraint failed") || + normalized.includes("uniqueness violation") || + normalized.includes("duplicate key value violates unique constraint") || + normalized.includes("duplicate entry") || + normalized.includes("constraint failed:") || + normalized.includes("sqlerrorcode=primarykey") + ); +} + +export function readErrorCode(error: unknown): string | undefined { + if (!error || typeof error !== "object") return undefined; + const maybeCode = (error as Record<string, unknown>).code; + if (typeof maybeCode === "string" && maybeCode.length > 0) { + return maybeCode; + } + if (typeof maybeCode === "number") { + return String(maybeCode); + } + const maybeCause = (error as Record<string, unknown>).cause; + return typeof maybeCause === "object" ? readErrorCode(maybeCause) : undefined; +} + +export const isUniqueConstraintViolation = (error: unknown, seen = new Set<unknown>()): boolean => { + if (error == null || seen.has(error)) return false; + seen.add(error); + + if (readErrorCode(error) === "23505") return true; + + if (error instanceof Error) { + if (looksLikeUniqueConstraintMessage(error.message)) return true; + return isUniqueConstraintViolation((error as Error & { cause?: unknown }).cause, seen); + } + + if (typeof error === "object") { + const record = error as Record<string, unknown>; + const message = record.message; + if (typeof message === "string" && looksLikeUniqueConstraintMessage(message)) return true; + const cause = record.cause; + if (cause) { + return isUniqueConstraintViolation(cause, seen); + } + } + + return false; +}; + +const throwConflict = (message: string): never => { + throw PluginRouteError.badRequest(message); +}; + +export async function assertNoConflict<T>( + collection: Collection<T>, + where: Record<string, string>, + excludeId?: string, + message = "Resource already exists", +): Promise<void> { + const result = await collection.query({ where, limit: 2 } as Parameters< + Collection<T>["query"] + >[0]); + for (const item of result.items) { + if (item.id !== excludeId) { + throwConflict(message); + } + } +} + +export async function putWithConflictHandling<T>( + collection: CollectionWithUniqueInsert<T>, + id: string, + data: T, + conflict?: ConflictHint, +): Promise<void> { + if (collection.putIfAbsent) { + try { + const inserted = await collection.putIfAbsent(id, data); + if (!inserted) { + throwConflict(conflict?.message ?? "Resource already exists"); + } + return; + } catch (error) { + if (isUniqueConstraintViolation(error) && conflict) { + throwConflict(conflict.message); + } + throw error; + } + } + + if (conflict) { + await assertNoConflict(collection, conflict.where, undefined, conflict.message); + } + + await collection.put(id, data); +} + +export async function putWithUpdateConflictHandling<T>( + collection: CollectionWithUniqueInsert<T>, + id: string, + data: T, + conflict?: ConflictHint, +): Promise<void> { + if (conflict && !collection.putIfAbsent) { + await assertNoConflict(collection, conflict.where, id, conflict.message); + } + + try { + await collection.put(id, data); + return; + } catch (error) { + if (isUniqueConstraintViolation(error) && conflict) { + throwConflict(conflict.message); + } + throw error; + } +} diff --git a/packages/plugins/commerce/src/handlers/catalog-digital.ts b/packages/plugins/commerce/src/handlers/catalog-digital.ts new file mode 100644 index 000000000..9980831b8 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/catalog-digital.ts @@ -0,0 +1,112 @@ +import { PluginRouteError } from "emdash"; +import type { RouteContext } from "emdash"; + +import { randomHex } from "../lib/crypto-adapter.js"; +import { requirePost } from "../lib/require-post.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { + DigitalAssetCreateInput, + DigitalEntitlementCreateInput, + DigitalEntitlementRemoveInput, +} from "../schemas.js"; +import type { StoredDigitalAsset, 
StoredDigitalEntitlement, StoredProductSku } from "../types.js"; +import type { Collection } from "./catalog-conflict.js"; +import { asCollection, getNowIso, putWithConflictHandling } from "./catalog-conflict.js"; +import type { + DigitalAssetResponse, + DigitalEntitlementResponse, + DigitalEntitlementUnlinkResponse, +} from "./catalog.js"; + +export async function handleCreateDigitalAsset( + ctx: RouteContext<DigitalAssetCreateInput>, +): Promise<DigitalAssetResponse> { + requirePost(ctx); + const provider = ctx.input.provider ?? "media"; + const isManualOnly = ctx.input.isManualOnly ?? false; + const isPrivate = ctx.input.isPrivate ?? true; + const productDigitalAssets = asCollection<StoredDigitalAsset>(ctx.storage.digitalAssets); + const nowIso = getNowIso(); + + const id = `digital_asset_${await randomHex(6)}`; + const asset: StoredDigitalAsset = { + id, + provider, + externalAssetId: ctx.input.externalAssetId, + label: ctx.input.label, + downloadLimit: ctx.input.downloadLimit, + downloadExpiryDays: ctx.input.downloadExpiryDays, + isManualOnly, + isPrivate, + metadata: ctx.input.metadata, + createdAt: nowIso, + updatedAt: nowIso, + }; + + await putWithConflictHandling(productDigitalAssets, id, asset, { + where: { provider, externalAssetId: ctx.input.externalAssetId }, + message: "Digital asset already registered for provider key", + }); + return { asset }; +} + +export async function handleCreateDigitalEntitlement( + ctx: RouteContext<DigitalEntitlementCreateInput>, +): Promise<DigitalEntitlementResponse> { + requirePost(ctx); + const productSkus = asCollection<StoredProductSku>(ctx.storage.productSkus); + const productDigitalAssets = asCollection<StoredDigitalAsset>(ctx.storage.digitalAssets); + const productDigitalEntitlements = asCollection<StoredDigitalEntitlement>( + ctx.storage.digitalEntitlements, + ); + const nowIso = getNowIso(); + + const sku = await productSkus.get(ctx.input.skuId); + if (!sku) { + throwCommerceApiError({ code: "VARIANT_UNAVAILABLE", message: "SKU not found" }); + } + if (sku.status !== "active") { + throw PluginRouteError.badRequest( + `Cannot attach entitlement to inactive SKU ${ctx.input.skuId}`, + ); + } + + const 
digitalAsset = await productDigitalAssets.get(ctx.input.digitalAssetId); + if (!digitalAsset) { + throwCommerceApiError({ code: "DIGITAL_ASSET_NOT_FOUND", message: "Digital asset not found" }); + } + + const id = `entitlement_${await randomHex(6)}`; + const entitlement: StoredDigitalEntitlement = { + id, + skuId: ctx.input.skuId, + digitalAssetId: ctx.input.digitalAssetId, + grantedQuantity: ctx.input.grantedQuantity, + createdAt: nowIso, + updatedAt: nowIso, + }; + await putWithConflictHandling(productDigitalEntitlements, id, entitlement, { + where: { skuId: ctx.input.skuId, digitalAssetId: ctx.input.digitalAssetId }, + message: "SKU already has this digital entitlement", + }); + return { entitlement }; +} + +export async function handleRemoveDigitalEntitlement( + ctx: RouteContext<DigitalEntitlementRemoveInput>, +): Promise<DigitalEntitlementUnlinkResponse> { + requirePost(ctx); + const productDigitalEntitlements = asCollection<StoredDigitalEntitlement>( + ctx.storage.digitalEntitlements, + ); + + const existing = await productDigitalEntitlements.get(ctx.input.entitlementId); + if (!existing) { + throwCommerceApiError({ + code: "DIGITAL_ENTITLEMENT_NOT_FOUND", + message: "Digital entitlement not found", + }); + } + await productDigitalEntitlements.delete(ctx.input.entitlementId); + return { deleted: true }; +} diff --git a/packages/plugins/commerce/src/handlers/catalog-product.ts b/packages/plugins/commerce/src/handlers/catalog-product.ts new file mode 100644 index 000000000..16aeef186 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/catalog-product.ts @@ -0,0 +1,1003 @@ +import type { RouteContext } from "emdash"; +import { PluginRouteError } from "emdash"; + +import { COMMERCE_LIMITS } from "../kernel/limits.js"; +import { computeBundleSummary, type BundleComputeSummary } from "../lib/catalog-bundles.js"; +import { + applyProductSkuUpdatePatch, + applyProductStatusTransition, + applyProductUpdatePatch, +} from "../lib/catalog-domain.js"; +import type { VariantMatrixDTO } from "../lib/catalog-dto.js"; +import { + 
collectVariantDefiningAttributes, + normalizeSkuOptionSignature, + validateVariableSkuOptions, +} from "../lib/catalog-variants.js"; +import { randomHex } from "../lib/crypto-adapter.js"; +import { inventoryStockDocId } from "../lib/inventory-stock.js"; +import { requirePost } from "../lib/require-post.js"; +import { sortedImmutable } from "../lib/sort-immutable.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { + ProductCreateInput, + ProductGetInput, + ProductListInput, + ProductSkuCreateInput, + ProductSkuStateInput, + ProductSkuUpdateInput, + ProductSkuListInput, + ProductStateInput, + ProductUpdateInput, +} from "../schemas.js"; +import type { + StoredBundleComponent, + StoredCategory, + StoredDigitalAsset, + StoredDigitalEntitlement, + StoredInventoryStock, + StoredProduct, + StoredProductAsset, + StoredProductAttribute, + StoredProductAttributeValue, + StoredProductCategoryLink, + StoredProductAssetLink, + StoredProductSku, + StoredProductSkuOptionValue, + StoredProductTag, + StoredProductTagLink, + StoredProductTagLink as StoredProductTagLinkType, +} from "../types.js"; +import { queryBundleComponentsForProduct } from "./catalog-bundle.js"; +import type { Collection } from "./catalog-conflict.js"; +import { + assertNoConflict, + asCollection, + asOptionalCollection, + getNowIso, + putWithConflictHandling, + putWithUpdateConflictHandling, +} from "./catalog-conflict.js"; +import { + queryAllPages, + getManyByIds, + hydrateSkusWithInventoryStock, + loadProductReadMetadata, + loadProductsReadMetadata, + queryDigitalEntitlementSummariesBySkuIds, + queryProductImagesByRoleForTargets, + querySkuOptionValuesBySkuIds, + summarizeInventory, + summarizeSkuPricing, + toUniqueStringList, +} from "./catalog-read-model.js"; +import type { + ProductResponse, + ProductSkuListResponse, + ProductSkuResponse, + StorefrontProductAvailability, + StorefrontProductDetail, + StorefrontProductListResponse, + StorefrontSkuListResponse, + 
ProductListResponse, +} from "./catalog.js"; + +type ProductCategoryIdFilter = { categoryId: string }; +type ProductTagIdFilter = { tagId: string }; + +async function syncInventoryStockForSku( + inventoryStock: Collection<StoredInventoryStock> | null, + product: StoredProduct, + sku: StoredProductSku, + nowIso: string, + includeProductLevelStock: boolean, +): Promise<void> { + if (!inventoryStock) { + return; + } + + await inventoryStock.put(inventoryStockDocId(product.id, sku.id), { + productId: product.id, + variantId: sku.id, + quantity: sku.inventoryQuantity, + version: sku.inventoryVersion, + updatedAt: nowIso, + }); + + if (!includeProductLevelStock) { + return; + } + + await inventoryStock.put(inventoryStockDocId(product.id, ""), { + productId: product.id, + variantId: "", + quantity: sku.inventoryQuantity, + version: sku.inventoryVersion, + updatedAt: nowIso, + }); +} + +type BundleDiscountPatchInput = { + bundleDiscountType?: "none" | "fixed_amount" | "percentage"; + bundleDiscountValueMinor?: number; + bundleDiscountValueBps?: number; +}; + +function assertBundleDiscountPatchForProduct( + product: StoredProduct, + patch: BundleDiscountPatchInput, +): void { + const hasType = patch.bundleDiscountType !== undefined; + const hasMinorValue = patch.bundleDiscountValueMinor !== undefined; + const hasBpsValue = patch.bundleDiscountValueBps !== undefined; + const effectiveType = patch.bundleDiscountType ?? product.bundleDiscountType ?? 
"none"; + + if (product.type !== "bundle" && (hasType || hasMinorValue || hasBpsValue)) { + throw PluginRouteError.badRequest( + "Bundle discount fields are only supported for bundle products", + ); + } + + if (product.type !== "bundle") { + return; + } + + if (hasMinorValue && effectiveType !== "fixed_amount") { + throw PluginRouteError.badRequest( + "bundleDiscountValueMinor can only be used with fixed_amount bundles", + ); + } + if (hasBpsValue && effectiveType !== "percentage") { + throw PluginRouteError.badRequest( + "bundleDiscountValueBps can only be used with percentage bundles", + ); + } +} + +function assertSimpleProductSkuCapacity(product: StoredProduct, existingSkuCount: number): void { + if (product.type !== "simple") { + return; + } + if (existingSkuCount > 0) { + throw PluginRouteError.badRequest("Simple products can have at most one SKU"); + } +} + +function toWhere(input: { type?: string; status?: string; visibility?: string }) { + const where: Record<string, string> = {}; + if (input.type) where.type = input.type; + if (input.status) where.status = input.status; + if (input.visibility) where.visibility = input.visibility; + return where; +} + +function toStorefrontProductRecord(product: StoredProduct) { + return { + id: product.id, + type: product.type, + status: product.status, + visibility: product.visibility, + slug: product.slug, + title: product.title, + shortDescription: product.shortDescription, + brand: product.brand, + vendor: product.vendor, + featured: product.featured, + sortOrder: product.sortOrder, + requiresShippingDefault: product.requiresShippingDefault, + taxClassDefault: product.taxClassDefault, + bundleDiscountType: product.bundleDiscountType, + bundleDiscountValueMinor: product.bundleDiscountValueMinor, + bundleDiscountValueBps: product.bundleDiscountValueBps, + createdAt: product.createdAt, + updatedAt: product.updatedAt, + }; +} + +function resolveProductAvailability(quantity: number): StorefrontProductAvailability { + if (quantity <= 0) { 
+ return "out_of_stock"; + } + if (quantity <= COMMERCE_LIMITS.lowStockThreshold) { + return "low_stock"; + } + return "in_stock"; +} + +function assertStorefrontProductVisible(product: StoredProduct): void { + if (product.status !== "active" || product.visibility !== "public") { + throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Product not available" }); + } +} + +function normalizeStorefrontProductListInput(input: ProductListInput): ProductListInput { + return { + ...input, + status: "active", + visibility: "public", + }; +} + +function toStorefrontSkuSummary(sku: StoredProductSku) { + return { + id: sku.id, + productId: sku.productId, + skuCode: sku.skuCode, + status: sku.status, + unitPriceMinor: sku.unitPriceMinor, + compareAtPriceMinor: sku.compareAtPriceMinor, + requiresShipping: sku.requiresShipping, + isDigital: sku.isDigital, + availability: resolveProductAvailability(sku.inventoryQuantity), + }; +} + +function toStorefrontVariantMatrixRow(row: VariantMatrixDTO) { + const { inventoryQuantity } = row; + const { inventoryVersion, ...sanitized } = row; + return { + ...sanitized, + availability: resolveProductAvailability(inventoryQuantity), + }; +} + +function toStorefrontProductDetail(response: ProductResponse): StorefrontProductDetail { + return { + product: toStorefrontProductRecord(response.product), + skus: response.skus?.map(toStorefrontSkuSummary), + attributes: response.attributes, + variantMatrix: response.variantMatrix?.map(toStorefrontVariantMatrixRow), + categories: response.categories ?? [], + tags: response.tags ?? 
[], + primaryImage: response.primaryImage, + galleryImages: response.galleryImages, + }; +} + +function toStorefrontProductListResponse( + response: ProductListResponse, +): StorefrontProductListResponse { + return { + items: response.items.map((item) => ({ + product: toStorefrontProductRecord(item.product), + priceRange: item.priceRange, + availability: resolveProductAvailability(item.inventorySummary.totalInventoryQuantity), + primaryImage: item.primaryImage, + galleryImages: item.galleryImages, + lowStockSkuCount: item.lowStockSkuCount, + categories: item.categories, + tags: item.tags, + })), + }; +} + +function intersectProductIdSets(left: Set<string>, right: Set<string>): Set<string> { + if (left.size > right.size) { + const swapped = left; + left = right; + right = swapped; + } + const result = new Set<string>(); + for (const value of left) { + if (right.has(value)) { + result.add(value); + } + } + return result; +} + +async function collectLinkedProductIds( + links: Collection<{ productId: string }>, + where: ProductCategoryIdFilter | ProductTagIdFilter, +): Promise<Set<string>> { + const ids = new Set<string>(); + let cursor: string | undefined; + while (true) { + const result = await links.query({ where, cursor, limit: 100 }); + for (const row of result.items) { + ids.add(row.data.productId); + } + if (!result.hasMore || !result.cursor) { + break; + } + cursor = result.cursor; + } + return ids; +} + +export async function handleCreateProduct( + ctx: RouteContext<ProductCreateInput>, +): Promise<ProductResponse> { + requirePost(ctx); + + const products = asCollection<StoredProduct>(ctx.storage.products); + const productAttributes = asCollection<StoredProductAttribute>(ctx.storage.productAttributes); + const productAttributeValues = asCollection<StoredProductAttributeValue>( + ctx.storage.productAttributeValues, + ); + const type = ctx.input.type ?? "simple"; + const status = ctx.input.status ?? "draft"; + const visibility = ctx.input.visibility ?? "hidden"; + const shortDescription = ctx.input.shortDescription ?? ""; + const longDescription = ctx.input.longDescription ?? 
""; + const featured = ctx.input.featured ?? false; + const sortOrder = ctx.input.sortOrder ?? 0; + const requiresShippingDefault = ctx.input.requiresShippingDefault ?? true; + const bundleDiscountType = ctx.input.bundleDiscountType ?? "none"; + const inputAttributes = (ctx.input.attributes ?? []).map((attributeInput) => ({ + ...attributeInput, + kind: attributeInput.kind ?? "descriptive", + position: attributeInput.position ?? 0, + values: attributeInput.values ?? [], + })); + const nowMs = Date.now(); + const nowIso = new Date(nowMs).toISOString(); + + const id = `prod_${await randomHex(6)}`; + + if (type !== "variable" && inputAttributes.length > 0) { + throw PluginRouteError.badRequest("Only variable products can define attributes"); + } + + if (type === "variable" && inputAttributes.length === 0) { + throw PluginRouteError.badRequest("Variable products must define at least one attribute"); + } + + const variantAttributeCount = inputAttributes.filter( + (attribute) => attribute.kind === "variant_defining", + ).length; + if (type === "variable" && variantAttributeCount === 0) { + throw PluginRouteError.badRequest( + "Variable products must include at least one variant-defining attribute", + ); + } + + const attributeCodes = new Set<string>(); + for (const attribute of inputAttributes) { + if (attributeCodes.has(attribute.code)) { + throw PluginRouteError.badRequest(`Duplicate attribute code: ${attribute.code}`); + } + attributeCodes.add(attribute.code); + + const valueCodes = new Set<string>(); + for (const value of attribute.values) { + if (valueCodes.has(value.code)) { + throw PluginRouteError.badRequest( + `Duplicate value code ${value.code} for attribute ${attribute.code}`, + ); + } + valueCodes.add(value.code); + } + } + + const product: StoredProduct = { + id, + type, + status, + visibility, + slug: ctx.input.slug, + title: ctx.input.title, + shortDescription, + longDescription, + brand: ctx.input.brand, + vendor: ctx.input.vendor, + featured, + sortOrder, + 
requiresShippingDefault, + taxClassDefault: ctx.input.taxClassDefault, + bundleDiscountType, + bundleDiscountValueMinor: ctx.input.bundleDiscountValueMinor, + bundleDiscountValueBps: ctx.input.bundleDiscountValueBps, + metadataJson: {}, + createdAt: nowIso, + updatedAt: nowIso, + publishedAt: status === "active" ? nowIso : undefined, + archivedAt: status === "archived" ? nowIso : undefined, + }; + + await putWithConflictHandling(products, id, product, { + where: { slug: ctx.input.slug }, + message: `Product slug already exists: ${ctx.input.slug}`, + }); + + for (const attributeInput of inputAttributes) { + const attributeId = `${id}_attr_${await randomHex(6)}`; + const nowAttribute: StoredProductAttribute = { + id: attributeId, + productId: id, + name: attributeInput.name, + code: attributeInput.code, + kind: attributeInput.kind, + position: attributeInput.position, + createdAt: nowIso, + updatedAt: nowIso, + }; + await productAttributes.put(attributeId, nowAttribute); + + for (const valueInput of attributeInput.values) { + const valueId = `${attributeId}_val_${await randomHex(6)}`; + await productAttributeValues.put(valueId, { + id: valueId, + attributeId, + value: valueInput.value, + code: valueInput.code, + position: valueInput.position ?? 0, + createdAt: nowIso, + updatedAt: nowIso, + }); + } + } + + return { product }; +} + +export async function handleUpdateProduct( + ctx: RouteContext<ProductUpdateInput>, +): Promise<ProductResponse> { + requirePost(ctx); + const products = asCollection<StoredProduct>(ctx.storage.products); + const nowIso = getNowIso(); + + const existing = await products.get(ctx.input.productId); + if (!existing) { + throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Product not found" }); + } + const { productId, ...patch } = ctx.input; + assertBundleDiscountPatchForProduct(existing, patch); + + const product = applyProductUpdatePatch(existing, patch, nowIso); + const conflict = + patch.slug !== undefined + ? 
{ + where: { slug: patch.slug }, + message: `Product slug already exists: ${patch.slug}`, + } + : undefined; + await putWithUpdateConflictHandling(products, productId, product, conflict); + return { product }; +} + +export async function handleSetProductState( + ctx: RouteContext<ProductStateInput>, +): Promise<ProductResponse> { + requirePost(ctx); + const products = asCollection<StoredProduct>(ctx.storage.products); + const nowIso = getNowIso(); + + const product = await products.get(ctx.input.productId); + if (!product) { + throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Product not found" }); + } + + const updated = applyProductStatusTransition(product, ctx.input.status, nowIso); + await products.put(ctx.input.productId, updated); + return { product: updated }; +} + +export async function handleGetProduct( + ctx: RouteContext<ProductGetInput>, +): Promise<ProductResponse> { + requirePost(ctx); + const products = asCollection<StoredProduct>(ctx.storage.products); + const productSkus = asCollection<StoredProductSku>(ctx.storage.productSkus); + const inventoryStock = asOptionalCollection<StoredInventoryStock>(ctx.storage.inventoryStock); + const productAttributes = asCollection<StoredProductAttribute>(ctx.storage.productAttributes); + const productSkuOptionValues = asCollection<StoredProductSkuOptionValue>( + ctx.storage.productSkuOptionValues, + ); + const productAssets = asCollection<StoredProductAsset>(ctx.storage.productAssets); + const productAssetLinks = asCollection<StoredProductAssetLink>(ctx.storage.productAssetLinks); + const productCategories = asCollection<StoredCategory>(ctx.storage.categories); + const productCategoryLinks = asCollection<StoredProductCategoryLink>( + ctx.storage.productCategoryLinks, + ); + const productTags = asCollection<StoredProductTag>(ctx.storage.productTags); + const productTagLinks = asCollection<StoredProductTagLink>(ctx.storage.productTagLinks); + const productDigitalAssets = asCollection<StoredDigitalAsset>(ctx.storage.digitalAssets); + const productDigitalEntitlements = asCollection<StoredDigitalEntitlement>( + ctx.storage.digitalEntitlements, + ); + const bundleComponents = asCollection<StoredBundleComponent>(ctx.storage.bundleComponents); + + const product = await products.get(ctx.input.productId); + if (!product) { + throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: 
"Product not found" }); + } + const { + skus: skuRows, + categories, + tags, + primaryImage, + galleryImages, + } = await loadProductReadMetadata( + { + productCategoryLinks, + productCategories, + productTagLinks, + productTags, + productAssets, + productAssetLinks, + productSkus, + inventoryStock, + }, + { + product, + includeGalleryImages: true, + }, + ); + const response: ProductResponse = { product, skus: skuRows, categories, tags }; + if (primaryImage) response.primaryImage = primaryImage; + if (galleryImages.length > 0) response.galleryImages = galleryImages; + + if (product.type === "variable") { + const attributes = ( + await productAttributes.query({ where: { productId: product.id } }) + ).items.map((row) => row.data); + const skuOptionValuesBySku = await querySkuOptionValuesBySkuIds( + productSkuOptionValues, + skuRows.map((sku) => sku.id), + ); + const variantImageBySku = await queryProductImagesByRoleForTargets( + productAssetLinks, + productAssets, + "sku", + skuRows.map((sku) => sku.id), + ["variant_image"], + ); + const variantMatrix: VariantMatrixDTO[] = []; + for (const skuRow of skuRows) { + const variantImage = variantImageBySku.get(skuRow.id)?.[0]; + const options = skuOptionValuesBySku.get(skuRow.id) ?? 
[]; + variantMatrix.push({ + skuId: skuRow.id, + skuCode: skuRow.skuCode, + status: skuRow.status, + unitPriceMinor: skuRow.unitPriceMinor, + compareAtPriceMinor: skuRow.compareAtPriceMinor, + inventoryQuantity: skuRow.inventoryQuantity, + inventoryVersion: skuRow.inventoryVersion, + requiresShipping: skuRow.requiresShipping, + isDigital: skuRow.isDigital, + image: variantImage, + options, + }); + } + response.attributes = attributes; + response.variantMatrix = variantMatrix; + } + + if (product.type === "bundle") { + const components = await queryBundleComponentsForProduct(bundleComponents, product.id); + const componentSkus = await getManyByIds( + productSkus, + components.map((component) => component.componentSkuId), + ); + const componentProductIds = toUniqueStringList( + components + .map((component) => componentSkus.get(component.componentSkuId)?.productId) + .filter((value): value is string => Boolean(value)), + ); + const componentProducts = await getManyByIds(products, componentProductIds); + + const componentLines = await Promise.all( + components.map(async (component) => { + const componentSku = componentSkus.get(component.componentSkuId); + if (!componentSku) { + throwCommerceApiError({ + code: "VARIANT_UNAVAILABLE", + message: "Bundle component SKU not found", + }); + } + const componentProduct = componentProducts.get(componentSku.productId); + if (!componentProduct) { + throwCommerceApiError({ + code: "PRODUCT_UNAVAILABLE", + message: "Bundle component product not found", + }); + } + const hydratedComponentSkus = await hydrateSkusWithInventoryStock( + componentProduct, + [componentSku], + inventoryStock, + ); + return { component, sku: hydratedComponentSkus[0] ?? 
componentSku }; + }), + ); + response.bundleSummary = computeBundleSummary( + product.id, + product.bundleDiscountType, + product.bundleDiscountValueMinor, + product.bundleDiscountValueBps, + componentLines, + ); + } + + const digitalEntitlements: ProductResponse["digitalEntitlements"] = []; + const entitlementsBySku = await queryDigitalEntitlementSummariesBySkuIds( + productDigitalEntitlements, + productDigitalAssets, + skuRows.map((sku) => sku.id), + ); + for (const sku of skuRows) { + const entitlements = entitlementsBySku.get(sku.id); + if (!entitlements || entitlements.length === 0) { + continue; + } + digitalEntitlements.push({ + skuId: sku.id, + entitlements, + }); + } + if (digitalEntitlements.length > 0) { + response.digitalEntitlements = digitalEntitlements; + } + return response; +} + +export async function handleListProducts( + ctx: RouteContext<ProductListInput>, +): Promise<ProductListResponse> { + requirePost(ctx); + const products = asCollection<StoredProduct>(ctx.storage.products); + const productSkus = asCollection<StoredProductSku>(ctx.storage.productSkus); + const inventoryStock = asOptionalCollection<StoredInventoryStock>(ctx.storage.inventoryStock); + const productAssets = asCollection<StoredProductAsset>(ctx.storage.productAssets); + const productAssetLinks = asCollection<StoredProductAssetLink>(ctx.storage.productAssetLinks); + const productCategories = asCollection<StoredCategory>(ctx.storage.categories); + const productCategoryLinks = asCollection<StoredProductCategoryLink>( + ctx.storage.productCategoryLinks, + ); + const productTags = asCollection<StoredProductTag>(ctx.storage.productTags); + const productTagLinks = asCollection<StoredProductTagLink>(ctx.storage.productTagLinks); + const where = toWhere(ctx.input); + const includeCategoryId = ctx.input.categoryId; + const includeTagId = ctx.input.tagId; + const hasProductAttributeFilter = Object.keys(where).length > 0; + + let rows: StoredProduct[] = []; + if (includeCategoryId || includeTagId) { + let filteredProductIds: Set<string> | null = null; + if (includeCategoryId) { + filteredProductIds = await collectLinkedProductIds(productCategoryLinks, { + categoryId: includeCategoryId, + }); + } + if (includeTagId) 
{ + const tagProductIds = await collectLinkedProductIds(productTagLinks, { tagId: includeTagId }); + filteredProductIds = filteredProductIds + ? intersectProductIdSets(filteredProductIds, tagProductIds) + : tagProductIds; + } + if (!filteredProductIds || filteredProductIds.size === 0) { + return { items: [] }; + } + + if (!hasProductAttributeFilter) { + const rowsById = await getManyByIds(products, [...filteredProductIds]); + rows = [...rowsById.values()]; + } else { + let cursor: string | undefined; + while (true) { + const result = await products.query({ where, cursor, limit: 100 }); + for (const row of result.items) { + if (filteredProductIds.has(row.id)) { + rows.push(row.data); + } + } + if (!result.hasMore || !result.cursor) { + break; + } + cursor = result.cursor; + } + } + } else { + const result = await queryAllPages((cursor) => + products.query({ + where, + cursor, + limit: 100, + }), + ); + rows = result.map((row) => row.data); + } + + const sortedRows = sortedImmutable( + rows, + (left, right) => left.sortOrder - right.sortOrder || left.slug.localeCompare(right.slug), + ).slice(0, ctx.input.limit); + const metadataByProduct = await loadProductsReadMetadata( + { + productCategoryLinks, + productCategories, + productTagLinks, + productTags, + productAssets, + productAssetLinks, + productSkus, + inventoryStock, + }, + { + products: sortedRows, + includeGalleryImages: true, + }, + ); + const items: ProductListResponse["items"] = []; + for (const row of sortedRows) { + const { + skus: skuRows, + categories, + tags, + primaryImage, + galleryImages, + } = metadataByProduct.get(row.id) ?? { + skus: [], + categories: [], + tags: [], + galleryImages: [], + }; + + items.push({ + product: row, + priceRange: summarizeSkuPricing(skuRows), + inventorySummary: summarizeInventory(skuRows), + primaryImage, + galleryImages: galleryImages.length > 0 ? 
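The branch above combines category and tag filters by intersecting the product-ID sets gathered from the link tables before any product rows are loaded. A standalone sketch of that narrowing step (helper names mirror the diff's `intersectProductIdSets`; the cursor-paged walk over link rows is simplified here to an in-memory array):

```typescript
// Keep only product IDs present in both sets, as intersectProductIdSets does.
function intersectProductIdSets(left: Set<string>, right: Set<string>): Set<string> {
  const out = new Set<string>();
  for (const id of left) {
    if (right.has(id)) out.add(id);
  }
  return out;
}

// Collect product IDs from link rows matching a predicate. The real
// collectLinkedProductIds pages through storage with a cursor; an array
// stand-in keeps the shape visible.
function collectLinkedIds<T extends { productId: string }>(
  links: T[],
  matches: (link: T) => boolean,
): Set<string> {
  const ids = new Set<string>();
  for (const link of links) {
    if (matches(link)) ids.add(link.productId);
  }
  return ids;
}

const categoryLinks = [
  { productId: "prod_1", categoryId: "cat_a" },
  { productId: "prod_2", categoryId: "cat_a" },
  { productId: "prod_3", categoryId: "cat_b" },
];
const tagLinks = [
  { productId: "prod_2", tagId: "tag_x" },
  { productId: "prod_3", tagId: "tag_x" },
];

const byCategory = collectLinkedIds(categoryLinks, (l) => l.categoryId === "cat_a");
const byTag = collectLinkedIds(tagLinks, (l) => l.tagId === "tag_x");
// Only prod_2 carries both the category and the tag.
const filtered = intersectProductIdSets(byCategory, byTag);
```

An empty intersection short-circuits to `{ items: [] }` in the handler, which is why the product collection is never queried when the link filters already exclude everything.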
galleryImages : undefined, + lowStockSkuCount: skuRows.filter( + (sku) => + sku.status === "active" && sku.inventoryQuantity <= COMMERCE_LIMITS.lowStockThreshold, + ).length, + categories, + tags, + }); + } + + return { items }; +} + +export async function handleCreateProductSku( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const products = asCollection(ctx.storage.products); + const productSkus = asCollection(ctx.storage.productSkus); + const inventoryStock = asOptionalCollection(ctx.storage.inventoryStock); + const productAttributes = asCollection(ctx.storage.productAttributes); + const productAttributeValues = asCollection( + ctx.storage.productAttributeValues, + ); + const productSkuOptionValues = asCollection( + ctx.storage.productSkuOptionValues, + ); + const inputOptionValues = ctx.input.optionValues ?? []; + + const product = await products.get(ctx.input.productId); + if (!product) { + throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Product not found" }); + } + if (product.status === "archived") { + throw PluginRouteError.badRequest("Cannot add SKUs to an archived product"); + } + + const existingSkuCount = (await productSkus.query({ where: { productId: product.id } })).items + .length; + assertSimpleProductSkuCapacity(product, existingSkuCount); + + if (product.type !== "variable" && inputOptionValues.length > 0) { + throw PluginRouteError.badRequest("Option values are only allowed for variable products"); + } + + if (product.type === "variable") { + const attributesResult = await productAttributes.query({ where: { productId: product.id } }); + const variantAttributes = collectVariantDefiningAttributes( + attributesResult.items.map((row) => row.data), + ); + if (variantAttributes.length === 0) { + throw PluginRouteError.badRequest(`Product ${product.id} has no variant-defining attributes`); + } + + const attributeIds = variantAttributes.map((attribute) => attribute.id); + const attributeValueRows = + attributeIds.length === 0 + ? 
[] + : ( + await productAttributeValues.query({ + where: { attributeId: { in: attributeIds } }, + }) + ).items.map((row) => row.data); + + const existingSkuResult = await productSkus.query({ where: { productId: product.id } }); + const existingSkuIds = existingSkuResult.items.map((row) => row.data.id); + const optionValueRows = + existingSkuIds.length === 0 + ? [] + : ( + await productSkuOptionValues.query({ + where: { skuId: { in: existingSkuIds } }, + }) + ).items.map((row) => row.data); + const optionValuesBySku = new Map< + string, + Array<{ attributeId: string; attributeValueId: string }> + >(); + for (const option of optionValueRows) { + const current = optionValuesBySku.get(option.skuId) ?? []; + current.push({ attributeId: option.attributeId, attributeValueId: option.attributeValueId }); + optionValuesBySku.set(option.skuId, current); + } + + const existingSignatures = new Set(); + for (const row of existingSkuResult.items) { + const options = optionValuesBySku.get(row.data.id) ?? []; + const signature = normalizeSkuOptionSignature(options); + if (options.length > 0) { + existingSignatures.add(signature); + } + } + + validateVariableSkuOptions({ + productId: product.id, + variantAttributes, + attributeValues: attributeValueRows, + optionValues: inputOptionValues, + existingSignatures, + }); + } + + const nowIso = getNowIso(); + const id = `sku_${ctx.input.productId}_${await randomHex(6)}`; + const status = ctx.input.status ?? "active"; + const requiresShipping = ctx.input.requiresShipping ?? true; + const isDigital = ctx.input.isDigital ?? false; + const inventoryVersion = ctx.input.inventoryVersion ?? 
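Duplicate-combination enforcement for variable products hinges on an order-independent signature per SKU option set: two SKUs selecting the same attribute values in a different order must produce the same signature. A minimal sketch of the idea, assuming a sort-and-join encoding (the exact string format produced by `normalizeSkuOptionSignature` is not shown in this hunk):

```typescript
type SkuOption = { attributeId: string; attributeValueId: string };

// Build an order-independent signature for a SKU's option combination.
// Sorting by attributeId before joining makes the encoding canonical.
function optionSignature(options: SkuOption[]): string {
  return [...options]
    .sort((a, b) => a.attributeId.localeCompare(b.attributeId))
    .map((o) => `${o.attributeId}=${o.attributeValueId}`)
    .join("|");
}

// Signatures of SKUs already persisted for the product.
const existingSignatures = new Set([
  optionSignature([
    { attributeId: "attr_color", attributeValueId: "val_red" },
    { attributeId: "attr_size", attributeValueId: "val_s" },
  ]),
]);

// The same combination listed in a different order must still collide.
const candidate = optionSignature([
  { attributeId: "attr_size", attributeValueId: "val_s" },
  { attributeId: "attr_color", attributeValueId: "val_red" },
]);
const isDuplicate = existingSignatures.has(candidate);
```

The handler builds `existingSignatures` once from the persisted option rows and rejects the create when the candidate signature is already present.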
1; + const sku: StoredProductSku = { + id, + productId: ctx.input.productId, + skuCode: ctx.input.skuCode, + status, + unitPriceMinor: ctx.input.unitPriceMinor, + compareAtPriceMinor: ctx.input.compareAtPriceMinor, + inventoryQuantity: ctx.input.inventoryQuantity, + inventoryVersion, + requiresShipping, + isDigital, + createdAt: nowIso, + updatedAt: nowIso, + }; + + await putWithConflictHandling(productSkus, id, sku, { + where: { skuCode: ctx.input.skuCode }, + message: `SKU code already exists: ${ctx.input.skuCode}`, + }); + await syncInventoryStockForSku( + inventoryStock, + product, + sku, + nowIso, + product.type !== "variable" && existingSkuCount === 0, + ); + + if (product.type === "variable") { + for (const optionInput of inputOptionValues) { + const optionId = `${id}_opt_${await randomHex(6)}`; + const optionRow: StoredProductSkuOptionValue = { + id: optionId, + skuId: id, + attributeId: optionInput.attributeId, + attributeValueId: optionInput.attributeValueId, + createdAt: nowIso, + updatedAt: nowIso, + }; + await productSkuOptionValues.put(optionId, optionRow); + } + } + return { sku }; +} + +export async function handleUpdateProductSku( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const products = asCollection(ctx.storage.products); + const productSkus = asCollection(ctx.storage.productSkus); + const inventoryStock = asOptionalCollection(ctx.storage.inventoryStock); + const nowIso = getNowIso(); + + const existing = await productSkus.get(ctx.input.skuId); + if (!existing) { + throwCommerceApiError({ code: "VARIANT_UNAVAILABLE", message: "SKU not found" }); + } + + const { skuId, ...patch } = ctx.input; + const sku = applyProductSkuUpdatePatch(existing, patch, nowIso); + const conflict = + patch.skuCode !== undefined + ? 
{ + where: { skuCode: patch.skuCode }, + message: `SKU code already exists: ${patch.skuCode}`, + } + : undefined; + await putWithUpdateConflictHandling(productSkus, skuId, sku, conflict); + const shouldSyncInventoryStock = + patch.inventoryQuantity !== undefined || patch.inventoryVersion !== undefined; + if (shouldSyncInventoryStock) { + const product = await products.get(existing.productId); + if (!product) { + throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Product not found" }); + } + const productSkusForProduct = await productSkus.query({ where: { productId: product.id } }); + const includeProductLevelStock = + product.type !== "variable" && productSkusForProduct.items.length === 1; + await syncInventoryStockForSku(inventoryStock, product, sku, nowIso, includeProductLevelStock); + } + + return { sku }; +} + +export async function handleSetSkuStatus( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const productSkus = asCollection(ctx.storage.productSkus); + + const existing = await productSkus.get(ctx.input.skuId); + if (!existing) { + throwCommerceApiError({ code: "VARIANT_UNAVAILABLE", message: "SKU not found" }); + } + + const updated: StoredProductSku = { + ...existing, + status: ctx.input.status, + updatedAt: getNowIso(), + }; + await productSkus.put(ctx.input.skuId, updated); + return { sku: updated }; +} + +export async function handleListProductSkus( + ctx: RouteContext, +): Promise { + requirePost(ctx); + const productSkus = asCollection(ctx.storage.productSkus); + + const result = await productSkus.query({ + where: { productId: ctx.input.productId }, + limit: ctx.input.limit, + }); + const items = result.items.map((row) => row.data); + + return { items }; +} + +export async function handleGetStorefrontProduct( + ctx: RouteContext, +): Promise { + const internal = await handleGetProduct(ctx); + assertStorefrontProductVisible(internal.product); + return toStorefrontProductDetail(internal); +} + +export async function 
handleListStorefrontProducts( + ctx: RouteContext, +): Promise { + const storefrontCtx = { + ...ctx, + input: normalizeStorefrontProductListInput(ctx.input), + } as RouteContext; + const internal = await handleListProducts(storefrontCtx); + return toStorefrontProductListResponse(internal); +} + +export async function handleListStorefrontProductSkus( + ctx: RouteContext, +): Promise { + const products = asCollection(ctx.storage.products); + const product = await products.get(ctx.input.productId); + if (!product) { + throwCommerceApiError({ code: "PRODUCT_UNAVAILABLE", message: "Product not found" }); + } + assertStorefrontProductVisible(product); + const internal = await handleListProductSkus(ctx); + return { + items: internal.items.filter((sku) => sku.status === "active").map(toStorefrontSkuSummary), + }; +} diff --git a/packages/plugins/commerce/src/handlers/catalog-read-model.ts b/packages/plugins/commerce/src/handlers/catalog-read-model.ts new file mode 100644 index 000000000..2eaa4951a --- /dev/null +++ b/packages/plugins/commerce/src/handlers/catalog-read-model.ts @@ -0,0 +1,503 @@ +import type { + ProductCategoryDTO, + ProductPrimaryImageDTO, + ProductTagDTO, +} from "../lib/catalog-dto.js"; +import { inventoryStockDocId } from "../lib/inventory-stock.js"; +import { sortOrderedRowsByPosition } from "../lib/ordered-rows.js"; +import type { + ProductAssetRole, + ProductAssetLinkTarget, + StoredCategory, + StoredDigitalAsset, + StoredDigitalEntitlement, + StoredInventoryStock, + StoredProduct, + StoredProductAsset, + StoredProductAssetLink, + StoredProductCategoryLink, + StoredProductSku, + StoredProductSkuOptionValue, + StoredProductTag, + StoredProductTagLink, +} from "../types.js"; + +export type StorageQueryResult = { + items: Array<{ id: string; data: T }>; + hasMore: boolean; + cursor?: string; +}; + +type ProductDigitalEntitlementSummaryRow = { + entitlementId: string; + digitalAssetId: string; + digitalAssetLabel?: string; + grantedQuantity: number; + 
downloadLimit?: number;
+  downloadExpiryDays?: number;
+  isManualOnly: boolean;
+  isPrivate: boolean;
+};
+
+type InFilter = { in: string[] };
+
+export async function queryAllPages<T>(
+  queryPage: (cursor?: string) => Promise<StorageQueryResult<T>>,
+): Promise<Array<{ id: string; data: T }>> {
+  const all: Array<{ id: string; data: T }> = [];
+  let cursor: string | undefined;
+  while (true) {
+    const page = await queryPage(cursor);
+    all.push(...page.items);
+    if (!page.hasMore || !page.cursor) {
+      break;
+    }
+    cursor = page.cursor;
+  }
+  return all;
+}
+
+export function toUniqueStringList(values: string[]): string[] {
+  return [...new Set(values)];
+}
+
+export async function getManyByIds<T>(
+  collection: Collection<T>,
+  ids: string[],
+): Promise<Map<string, T>> {
+  const uniqueIds = toUniqueStringList(ids);
+  const getMany = (collection as { getMany?: (ids: string[]) => Promise<Map<string, T>> }).getMany;
+  if (getMany) {
+    return getMany.call(collection, uniqueIds);
+  }
+
+  const rows = await Promise.all(uniqueIds.map((id) => collection.get(id)));
+  const map = new Map<string, T>();
+  for (const [index, id] of uniqueIds.entries()) {
+    const row = rows[index];
+    if (row) {
+      map.set(id, row);
+    }
+  }
+  return map;
+}
+
+type ProductReadMetadata = {
+  skus: StoredProductSku[];
+  categories: ProductCategoryDTO[];
+  tags: ProductTagDTO[];
+  primaryImage?: ProductPrimaryImageDTO;
+  galleryImages: ProductPrimaryImageDTO[];
+};
+
+type ProductReadContext = {
+  product: StoredProduct;
+  includeGalleryImages?: boolean;
+};
+
+export type ProductReadCollections = {
+  productCategoryLinks: Collection<StoredProductCategoryLink>;
+  productCategories: Collection<StoredCategory>;
+  productTagLinks: Collection<StoredProductTagLink>;
+  productTags: Collection<StoredProductTag>;
+  productAssets: Collection<StoredProductAsset>;
+  productAssetLinks: Collection<StoredProductAssetLink>;
+  productSkus: Collection<StoredProductSku>;
+  inventoryStock: Collection<StoredInventoryStock> | null;
+};
+
+export async function loadProductReadMetadata(
+  collections: ProductReadCollections,
+  context: ProductReadContext,
+): Promise<ProductReadMetadata> {
+  const { product, includeGalleryImages = false } = context;
+  const metadataByProduct = await
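`queryAllPages` is the workhorse behind every read-model query: it keeps requesting pages until the store reports no more data or stops returning a cursor. The termination rule can be exercised in isolation; the real helper awaits storage, so this sketch uses a synchronous page source to keep the control flow visible:

```typescript
type Page<T> = { items: Array<{ id: string; data: T }>; hasMore: boolean; cursor?: string };

// Drain a cursor-paged query into one array, with the same termination rule
// as the diff's queryAllPages: stop when hasMore is false or cursor is absent.
function drainPages<T>(queryPage: (cursor?: string) => Page<T>): Array<{ id: string; data: T }> {
  const all: Array<{ id: string; data: T }> = [];
  let cursor: string | undefined;
  while (true) {
    const page = queryPage(cursor);
    all.push(...page.items);
    if (!page.hasMore || !page.cursor) {
      break;
    }
    cursor = page.cursor;
  }
  return all;
}

// Fake three-page source keyed by cursor; an undefined cursor means page one.
const pages = new Map<string, Page<number>>([
  ["first", { items: [{ id: "a", data: 1 }], hasMore: true, cursor: "c1" }],
  ["c1", { items: [{ id: "b", data: 2 }], hasMore: true, cursor: "c2" }],
  ["c2", { items: [{ id: "c", data: 3 }], hasMore: false }],
]);
const rows = drainPages<number>(
  (cursor) => pages.get(cursor ?? "first") ?? { items: [], hasMore: false },
);
```

Checking both `hasMore` and `cursor` guards against stores that report `hasMore: true` without supplying a continuation token, which would otherwise loop forever.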
loadProductsReadMetadata(collections, { + products: [product], + includeGalleryImages, + }); + return ( + metadataByProduct.get(product.id) ?? { + skus: [], + categories: [], + tags: [], + galleryImages: [], + } + ); +} + +export async function loadProductsReadMetadata( + collections: ProductReadCollections, + context: { + products: StoredProduct[]; + includeGalleryImages?: boolean; + }, +): Promise> { + const productIds = toUniqueStringList(context.products.map((product) => product.id)); + const includeGalleryImages = context.includeGalleryImages ?? false; + if (productIds.length === 0) { + return new Map(); + } + + const productsById = new Map( + context.products.map((product) => [product.id, product]), + ); + const skusResult = await queryAllPages((cursor) => + collections.productSkus.query({ + where: { productId: { in: productIds } }, + cursor, + limit: 100, + }), + ); + const skusByProduct = new Map(); + for (const row of skusResult) { + const current = skusByProduct.get(row.data.productId) ?? []; + current.push(row.data); + skusByProduct.set(row.data.productId, current); + } + + const hydratedSkusByProductEntries = await Promise.all( + productIds.map(async (productId) => { + const product = productsById.get(productId); + const skus = skusByProduct.get(productId) ?? []; + const hydratedSkus = product + ? 
await hydrateSkusWithInventoryStock(product, skus, collections.inventoryStock) + : []; + return [productId, hydratedSkus] as const; + }), + ); + const hydratedSkusByProduct = new Map(hydratedSkusByProductEntries); + + const categoriesByProduct = await queryCategoryDtosForProducts( + collections.productCategoryLinks, + collections.productCategories, + productIds, + ); + const tagsByProduct = await queryTagDtosForProducts( + collections.productTagLinks, + collections.productTags, + productIds, + ); + const primaryImageByProduct = await queryProductImagesByRoleForTargets( + collections.productAssetLinks, + collections.productAssets, + "product", + productIds, + ["primary_image"], + ); + const galleryImageByProduct = includeGalleryImages + ? await queryProductImagesByRoleForTargets( + collections.productAssetLinks, + collections.productAssets, + "product", + productIds, + ["gallery_image"], + ) + : new Map(); + + const metadataByProduct = new Map(); + for (const productId of productIds) { + metadataByProduct.set(productId, { + skus: hydratedSkusByProduct.get(productId) ?? [], + categories: categoriesByProduct.get(productId) ?? [], + tags: tagsByProduct.get(productId) ?? [], + primaryImage: primaryImageByProduct.get(productId)?.[0], + galleryImages: galleryImageByProduct.get(productId) ?? 
[], + }); + } + return metadataByProduct; +} + +export function summarizeInventory(skus: StoredProductSku[]) { + const skuCount = skus.length; + const activeSkus = skus.filter((sku) => sku.status === "active"); + const activeSkuCount = activeSkus.length; + const totalInventoryQuantity = skus.reduce((total, sku) => total + sku.inventoryQuantity, 0); + return { skuCount, activeSkuCount, totalInventoryQuantity }; +} + +export function summarizeSkuPricing(skus: StoredProductSku[]) { + if (skus.length === 0) return { minUnitPriceMinor: undefined, maxUnitPriceMinor: undefined }; + const prices = skus.filter((sku) => sku.status === "active").map((sku) => sku.unitPriceMinor); + if (prices.length === 0) { + return { minUnitPriceMinor: undefined, maxUnitPriceMinor: undefined }; + } + const min = Math.min(...prices); + const max = Math.max(...prices); + return { minUnitPriceMinor: min, maxUnitPriceMinor: max }; +} + +export async function collectLinkedProductIds( + links: Collection<{ productId: string }>, + where: Record, +): Promise> { + const ids = new Set(); + let cursor: string | undefined; + while (true) { + const result = await links.query({ where, cursor, limit: 100 }); + for (const row of result.items) { + ids.add(row.data.productId); + } + if (!result.hasMore || !result.cursor) { + break; + } + cursor = result.cursor; + } + return ids; +} + +export async function queryCategoryDtosForProducts( + productCategoryLinks: Collection, + categories: Collection, + productIds: string[], +): Promise> { + const normalizedProductIds = toUniqueStringList(productIds); + if (normalizedProductIds.length === 0) { + return new Map(); + } + + const links = await queryAllPages((cursor) => + productCategoryLinks.query({ + where: { productId: { in: normalizedProductIds } }, + cursor, + limit: 100, + }), + ); + const categoryRows = await getManyByIds( + categories, + toUniqueStringList(links.map((link) => link.data.categoryId)), + ); + const rowsByProduct = new Map(); + + for (const link 
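`summarizeInventory` and `summarizeSkuPricing` above encode two different scoping rules: the price range looks only at active SKUs (and yields undefined bounds rather than `Infinity` when none exist), while the inventory totals count every SKU regardless of status. A standalone sketch of both, with the field set reduced to what the summaries actually read:

```typescript
type SkuRow = { status: "active" | "inactive"; unitPriceMinor: number; inventoryQuantity: number };

// Price range over *active* SKUs only; an empty active set yields undefined
// bounds instead of Math.min/max of an empty list (which would be +/-Infinity).
function summarizeSkuPricing(skus: SkuRow[]) {
  const prices = skus.filter((s) => s.status === "active").map((s) => s.unitPriceMinor);
  if (prices.length === 0) {
    return { minUnitPriceMinor: undefined, maxUnitPriceMinor: undefined };
  }
  return { minUnitPriceMinor: Math.min(...prices), maxUnitPriceMinor: Math.max(...prices) };
}

// Inventory counts include every SKU; only activeSkuCount is status-scoped.
function summarizeInventory(skus: SkuRow[]) {
  return {
    skuCount: skus.length,
    activeSkuCount: skus.filter((s) => s.status === "active").length,
    totalInventoryQuantity: skus.reduce((total, s) => total + s.inventoryQuantity, 0),
  };
}

const skus: SkuRow[] = [
  { status: "active", unitPriceMinor: 1999, inventoryQuantity: 4 },
  { status: "active", unitPriceMinor: 2499, inventoryQuantity: 0 },
  { status: "inactive", unitPriceMinor: 999, inventoryQuantity: 7 },
];
const pricing = summarizeSkuPricing(skus);
const inventory = summarizeInventory(skus);
```

Note how the inactive SKU's 999 price is excluded from the range while its quantity of 7 still counts toward the inventory total.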
of links) { + const category = categoryRows.get(link.data.categoryId); + if (!category) { + continue; + } + const current = rowsByProduct.get(link.data.productId) ?? []; + current.push(toProductCategoryDTO(category)); + rowsByProduct.set(link.data.productId, current); + } + return rowsByProduct; +} + +export async function queryTagDtosForProducts( + productTagLinks: Collection, + tags: Collection, + productIds: string[], +): Promise> { + const normalizedProductIds = toUniqueStringList(productIds); + if (normalizedProductIds.length === 0) { + return new Map(); + } + + const links = await queryAllPages((cursor) => + productTagLinks.query({ + where: { productId: { in: normalizedProductIds } }, + cursor, + limit: 100, + }), + ); + const tagRows = await getManyByIds( + tags, + toUniqueStringList(links.map((link) => link.data.tagId)), + ); + const rowsByProduct = new Map(); + + for (const link of links) { + const tag = tagRows.get(link.data.tagId); + if (!tag) { + continue; + } + const current = rowsByProduct.get(link.data.productId) ?? []; + current.push(toProductTagDTO(tag)); + rowsByProduct.set(link.data.productId, current); + } + return rowsByProduct; +} + +export async function queryProductImagesByRoleForTargets( + productAssetLinks: Collection, + productAssets: Collection, + targetType: ProductAssetLinkTarget, + targetIds: string[], + roles: ProductAssetRole[], +): Promise> { + const normalizedTargetIds = toUniqueStringList(targetIds); + const normalizedRoles = toUniqueStringList(roles); + if (normalizedTargetIds.length === 0 || normalizedRoles.length === 0) { + return new Map(); + } + + const targetIdFilter: string | InFilter = + normalizedTargetIds.length === 1 ? normalizedTargetIds[0]! : { in: normalizedTargetIds }; + const roleFilter: string | InFilter = + normalizedRoles.length === 1 ? normalizedRoles[0]! 
: { in: normalizedRoles }; + + const query: { where: Record } = { + where: { + targetType, + targetId: targetIdFilter, + role: roleFilter, + }, + }; + const links = await queryAllPages((cursor) => + productAssetLinks.query({ + ...query, + cursor, + limit: 100, + }), + ); + const assetIds = toUniqueStringList(links.map((link) => link.data.assetId)); + const assetsById = await getManyByIds(productAssets, assetIds); + const linksByTarget = new Map(); + for (const link of links) { + const normalized = linksByTarget.get(link.data.targetId) ?? []; + normalized.push(link.data); + linksByTarget.set(link.data.targetId, normalized); + } + + const imagesByTarget = new Map(); + for (const [targetId, targetLinks] of linksByTarget) { + const sortedLinks = sortOrderedRowsByPosition(targetLinks); + const rows: ProductPrimaryImageDTO[] = []; + for (const link of sortedLinks) { + const asset = assetsById.get(link.assetId); + if (!asset) { + continue; + } + rows.push({ + linkId: link.id, + assetId: asset.id, + provider: asset.provider, + externalAssetId: asset.externalAssetId, + fileName: asset.fileName, + altText: asset.altText, + }); + } + imagesByTarget.set(targetId, rows); + } + return imagesByTarget; +} + +export async function querySkuOptionValuesBySkuIds( + productSkuOptionValues: Collection, + skuIds: string[], +): Promise>> { + const normalizedSkuIds = toUniqueStringList(skuIds); + if (normalizedSkuIds.length === 0) { + return new Map(); + } + + const rows = await queryAllPages((cursor) => + productSkuOptionValues.query({ + where: { skuId: { in: normalizedSkuIds } }, + cursor, + limit: 100, + }), + ); + const bySkuId = new Map>(); + for (const row of rows) { + const current = bySkuId.get(row.data.skuId) ?? 
[];
+    current.push({
+      attributeId: row.data.attributeId,
+      attributeValueId: row.data.attributeValueId,
+    });
+    bySkuId.set(row.data.skuId, current);
+  }
+  return bySkuId;
+}
+
+export async function queryDigitalEntitlementSummariesBySkuIds(
+  productDigitalEntitlements: Collection<StoredDigitalEntitlement>,
+  productDigitalAssets: Collection<StoredDigitalAsset>,
+  skuIds: string[],
+): Promise<Map<string, ProductDigitalEntitlementSummaryRow[]>> {
+  const normalizedSkuIds = toUniqueStringList(skuIds);
+  if (normalizedSkuIds.length === 0) {
+    return new Map<string, ProductDigitalEntitlementSummaryRow[]>();
+  }
+
+  const entitlementRows = await queryAllPages((cursor) =>
+    productDigitalEntitlements.query({
+      where: { skuId: { in: normalizedSkuIds } },
+      cursor,
+      limit: 100,
+    }),
+  );
+  const assetIds = toUniqueStringList(entitlementRows.map((row) => row.data.digitalAssetId));
+  const assetsById = await getManyByIds(productDigitalAssets, assetIds);
+  const summariesBySku = new Map<string, ProductDigitalEntitlementSummaryRow[]>();
+  for (const entitlement of entitlementRows) {
+    const asset = assetsById.get(entitlement.data.digitalAssetId);
+    if (!asset) {
+      continue;
+    }
+    const current = summariesBySku.get(entitlement.data.skuId) ?? [];
+    current.push({
+      entitlementId: entitlement.data.id,
+      digitalAssetId: entitlement.data.digitalAssetId,
+      digitalAssetLabel: asset.label,
+      grantedQuantity: entitlement.data.grantedQuantity,
+      downloadLimit: asset.downloadLimit,
+      downloadExpiryDays: asset.downloadExpiryDays,
+      isManualOnly: asset.isManualOnly,
+      isPrivate: asset.isPrivate,
+    });
+    summariesBySku.set(entitlement.data.skuId, current);
+  }
+  return summariesBySku;
+}
+
+export function hydrateSkusWithInventoryStock(
+  product: StoredProduct,
+  skuRows: StoredProductSku[],
+  inventoryStock: Collection<StoredInventoryStock> | null,
+): Promise<StoredProductSku[]> {
+  if (!inventoryStock) {
+    return Promise.resolve(skuRows);
+  }
+
+  return Promise.all(
+    skuRows.map(async (sku) => {
+      const variantStock = await inventoryStock.get(inventoryStockDocId(product.id, sku.id));
+      const productLevelStock =
+        product.type === "simple" && skuRows.length === 1
+          ?
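`hydrateSkusWithInventoryStock` overlays the authoritative `inventory_stock` document onto each SKU, falling back to the SKU's own persisted fields when no stock row exists. A reduced sketch of that overlay (lookup keyed directly by SKU id; the real helper derives document IDs via `inventoryStockDocId` and also consults product-level stock for single-SKU simple products):

```typescript
type Sku = { id: string; inventoryQuantity: number; inventoryVersion: number };
type Stock = { quantity: number; version: number };

// Overlay stock rows onto SKU rows; SKUs without a stock doc pass through
// unchanged, mirroring the fallback in hydrateSkusWithInventoryStock.
function hydrateWithStock(skus: Sku[], stockBySkuId: Map<string, Stock>): Sku[] {
  return skus.map((sku) => {
    const stock = stockBySkuId.get(sku.id);
    if (!stock) {
      return sku;
    }
    return { ...sku, inventoryQuantity: stock.quantity, inventoryVersion: stock.version };
  });
}

const hydrated = hydrateWithStock(
  [
    { id: "sku_1", inventoryQuantity: 2, inventoryVersion: 1 },
    { id: "sku_2", inventoryQuantity: 5, inventoryVersion: 1 },
  ],
  new Map([["sku_1", { quantity: 9, version: 3 }]]),
);
```

Keeping the stock document authoritative means the SKU row's denormalized quantity can lag without affecting reads, since every read path hydrates through this overlay.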
await inventoryStock.get(inventoryStockDocId(product.id, "")) + : null; + const stock = variantStock ?? productLevelStock; + if (!stock) { + return sku; + } + return { + ...sku, + inventoryQuantity: stock.quantity, + inventoryVersion: stock.version, + }; + }), + ); +} + +function toProductCategoryDTO(row: StoredCategory): ProductCategoryDTO { + return { + id: row.id, + name: row.name, + slug: row.slug, + parentId: row.parentId, + position: row.position, + }; +} + +function toProductTagDTO(row: StoredProductTag): ProductTagDTO { + return { + id: row.id, + name: row.name, + slug: row.slug, + }; +} + +interface Collection { + get: (id: string) => Promise; + query: ( + options: Record, + ) => Promise<{ items: Array<{ id: string; data: T }>; hasMore: boolean; cursor?: string }>; +} diff --git a/packages/plugins/commerce/src/handlers/catalog.test.ts b/packages/plugins/commerce/src/handlers/catalog.test.ts new file mode 100644 index 000000000..3fba5eb8d --- /dev/null +++ b/packages/plugins/commerce/src/handlers/catalog.test.ts @@ -0,0 +1,4298 @@ +import type { RouteContext } from "emdash"; +import { describe, expect, it } from "vitest"; + +import { COMMERCE_LIMITS } from "../kernel/limits.js"; +import { inventoryStockDocId } from "../lib/inventory-stock.js"; +import { sortedImmutable } from "../lib/sort-immutable.js"; +import type { + ProductAssetLinkInput, + ProductAssetReorderInput, + ProductAssetRegisterInput, + ProductAssetUnlinkInput, + ProductSkuCreateInput, + ProductSkuUpdateInput, + ProductCreateInput, + DigitalAssetCreateInput, + DigitalEntitlementCreateInput, + BundleComponentAddInput, + BundleComponentRemoveInput, + BundleComponentReorderInput, + BundleComputeInput, + CategoryCreateInput, + ProductCategoryLinkInput, + ProductCategoryUnlinkInput, + ProductListInput, + TagCreateInput, + ProductTagLinkInput, + ProductTagUnlinkInput, +} from "../schemas.js"; +import { + productAssetLinkInputSchema, + productAssetReorderInputSchema, + 
productAssetRegisterInputSchema, + productAssetUnlinkInputSchema, + productCreateInputSchema, + digitalAssetCreateInputSchema, + digitalEntitlementCreateInputSchema, + categoryCreateInputSchema, + categoryListInputSchema, + productCategoryLinkInputSchema, + productCategoryUnlinkInputSchema, + tagCreateInputSchema, + tagListInputSchema, + productTagLinkInputSchema, + productTagUnlinkInputSchema, + bundleComponentAddInputSchema, + productUpdateInputSchema, +} from "../schemas.js"; +import type { + StoredProduct, + StoredProductAsset, + StoredProductAssetLink, + StoredProductAttribute, + StoredProductAttributeValue, + StoredBundleComponent, + StoredDigitalAsset, + StoredDigitalEntitlement, + StoredCategory, + StoredProductCategoryLink, + StoredProductTag, + StoredProductTagLink, + StoredProductSku, + StoredProductSkuOptionValue, + StoredInventoryStock, +} from "../types.js"; +import { + createProductHandler, + setProductStateHandler, + createProductSkuHandler, + getProductHandler, + getStorefrontProductHandler, + setSkuStatusHandler, + updateProductHandler, + updateProductSkuHandler, + linkCatalogAssetHandler, + reorderCatalogAssetHandler, + registerProductAssetHandler, + unlinkCatalogAssetHandler, + listProductsHandler, + listStorefrontProductsHandler, + listProductSkusHandler, + listStorefrontProductSkusHandler, + createCategoryHandler, + listCategoriesHandler, + createProductCategoryLinkHandler, + removeProductCategoryLinkHandler, + createTagHandler, + listTagsHandler, + createProductTagLinkHandler, + removeProductTagLinkHandler, + addBundleComponentHandler, + reorderBundleComponentHandler, + removeBundleComponentHandler, + bundleComputeHandler, + bundleComputeStorefrontHandler, + createDigitalAssetHandler, + createDigitalEntitlementHandler, + removeDigitalEntitlementHandler, +} from "./catalog.js"; + +const PRODUCT_ID_PREFIX = /^prod_/; +const SKU_ID_PREFIX = /^sku_/; +const ASSET_ID_PREFIX = /^asset_/; + +class MemColl { + constructor(public readonly rows = new 
Map<string, T>()) {}
+
+  async get(id: string): Promise<T | null> {
+    const row = this.rows.get(id);
+    return row ? structuredClone(row) : null;
+  }
+
+  async put(id: string, data: T): Promise<void> {
+    this.rows.set(id, structuredClone(data));
+  }
+
+  async getMany(ids: string[]): Promise<Map<string, T>> {
+    const rows = new Map<string, T>();
+    for (const id of ids) {
+      const row = this.rows.get(id);
+      if (row) {
+        rows.set(id, structuredClone(row));
+      }
+    }
+    return rows;
+  }
+
+  async delete(id: string): Promise<boolean> {
+    return this.rows.delete(id);
+  }
+
+  async deleteMany(ids: string[]): Promise<number> {
+    let count = 0;
+    for (const id of ids) {
+      if (this.rows.delete(id)) {
+        count++;
+      }
+    }
+    return count;
+  }
+
+  async query(options?: {
+    where?: Record<string, unknown>;
+    limit?: number;
+  }): Promise<{ items: Array<{ id: string; data: T }>; hasMore: boolean }> {
+    const where = options?.where ?? {};
+    const values = [...this.rows.entries()].filter(([, row]) =>
+      Object.entries(where).every(([field, expected]) => {
+        const rowValue = (row as Record<string, unknown>)[field];
+        if (expected && typeof expected === "object" && !Array.isArray(expected)) {
+          const maybeInFilter = expected as { in?: unknown[] };
+          if (Array.isArray(maybeInFilter.in)) {
+            return maybeInFilter.in.includes(rowValue);
+          }
+        }
+        return rowValue === expected;
+      }),
+    );
+    const items = values
+      .slice(0, options?.limit ??
50) + .map(([id, row]) => ({ id, data: structuredClone(row) })); + return { items, hasMore: false }; + } + + async putMany(items: Array<{ id: string; data: T }>): Promise { + for (const item of items) { + await this.put(item.id, structuredClone(item.data)); + } + } +} + +class ConstraintConflictMemColl extends MemColl { + constructor( + private readonly conflicts: (existing: T, next: T) => boolean, + rows: Map = new Map(), + ) { + super(rows); + } + + async putIfAbsent(id: string, data: T): Promise { + for (const existing of this.rows.values()) { + if (this.conflicts(existing, data)) { + return false; + } + } + await this.put(id, structuredClone(data)); + return true; + } + + override async query(_options?: { + [key: string]: unknown; + }): Promise<{ items: Array<{ id: string; data: T }>; hasMore: boolean }> { + return { items: [], hasMore: false }; + } +} + +class QueryCountingMemColl extends MemColl { + queryCount = 0; + + override async query(options?: { + where?: Record; + limit?: number; + }): Promise<{ items: Array<{ id: string; data: T }>; hasMore: boolean }> { + this.queryCount += 1; + return super.query(options); + } +} + +function catalogCtx( + input: TInput, + products: MemColl, + productSkus = new MemColl(), + productAssets = new MemColl(), + productAssetLinks = new MemColl(), + productAttributes = new MemColl(), + productAttributeValues = new MemColl(), + productSkuOptionValues = new MemColl(), + bundleComponents = new MemColl(), + categories = new MemColl(), + productCategoryLinks = new MemColl(), + productTags = new MemColl(), + productTagLinks = new MemColl(), + digitalAssets = new MemColl(), + digitalEntitlements = new MemColl(), + inventoryStock = new MemColl(), +): RouteContext { + return { + request: new Request("https://example.test/catalog", { method: "POST" }), + input, + storage: { + products, + productSkus, + productAssets, + productAssetLinks, + productAttributes, + productAttributeValues, + productSkuOptionValues, + bundleComponents, + 
categories, + productCategoryLinks, + productTags, + productTagLinks, + digitalAssets, + digitalEntitlements, + inventoryStock, + }, + requestMeta: { ip: "127.0.0.1" }, + kv: {}, + } as unknown as RouteContext; +} + +describe("catalog product handlers", () => { + it("creates a product and persists it in storage", async () => { + const products = new MemColl(); + const out = await createProductHandler( + catalogCtx( + { + type: "simple", + status: "draft", + visibility: "hidden", + slug: "simple-runner", + title: "Simple Runner", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + }, + products, + ), + ); + + expect(out.product.id).toMatch(PRODUCT_ID_PREFIX); + expect(products.rows.size).toBe(1); + }); + + it("rejects duplicate product slugs", async () => { + const products = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "dup", + title: "Existing", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const ctx = catalogCtx( + { + type: "simple", + status: "draft", + visibility: "hidden", + slug: "dup", + title: "Duplicate", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 1, + requiresShippingDefault: true, + }, + products, + ); + await expect(createProductHandler(ctx)).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("uses storage conflict on duplicate product slug insert", async () => { + const products = new ConstraintConflictMemColl((existing, next) => { + return existing.slug === next.slug; + }); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "dup", + title: "Existing", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 
0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const ctx = catalogCtx( + { + type: "simple", + status: "draft", + visibility: "hidden", + slug: "dup", + title: "Duplicate", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 1, + requiresShippingDefault: true, + }, + products, + ); + + await expect(createProductHandler(ctx)).rejects.toMatchObject({ + code: "BAD_REQUEST", + message: "Product slug already exists: dup", + }); + }); + + it("rejects duplicate slugs on product update", async () => { + const products = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "first", + title: "Existing One", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await products.put("prod_2", { + id: "prod_2", + type: "simple", + status: "active", + visibility: "public", + slug: "second", + title: "Existing Two", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const duplicate = updateProductHandler( + catalogCtx( + { + productId: "prod_1", + slug: "second", + }, + products, + ), + ); + await expect(duplicate).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("creates variable products with variant attributes and values", async () => { + const products = new MemColl(); + const productAttributes = new MemColl(); + const productAttributeValues = new MemColl(); + + const out = await createProductHandler( + catalogCtx( + { + type: "variable", + status: "draft", + visibility: "hidden", + slug: "tee-shirt", + title: "Tee Shirt", + shortDescription: "", + longDescription: "", + 
featured: false, + sortOrder: 0, + requiresShippingDefault: true, + attributes: [ + { + name: "Color", + code: "color", + kind: "variant_defining", + position: 0, + values: [ + { value: "Red", code: "red", position: 0 }, + { value: "Blue", code: "blue", position: 1 }, + ], + }, + { + name: "Size", + code: "size", + kind: "variant_defining", + position: 1, + values: [ + { value: "Small", code: "s", position: 0 }, + { value: "Large", code: "l", position: 1 }, + ], + }, + ], + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + ), + ); + + expect(out.product.type).toBe("variable"); + expect(productAttributes.rows.size).toBe(2); + expect(productAttributeValues.rows.size).toBe(4); + }); + + it("rejects variable products without variant-defining attributes", async () => { + const products = new MemColl(); + const out = createProductHandler( + catalogCtx( + { + type: "variable", + status: "draft", + visibility: "hidden", + slug: "bad-variable", + title: "Bad Variable", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + attributes: [ + { + name: "Material", + code: "material", + kind: "descriptive", + position: 0, + values: [{ value: "Cotton", code: "cotton", position: 0 }], + }, + ], + }, + products, + ), + ); + await expect(out).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("rejects variable products with duplicate attribute codes", async () => { + const products = new MemColl(); + const out = createProductHandler( + catalogCtx( + { + type: "variable", + status: "draft", + visibility: "hidden", + slug: "dup-attr", + title: "Dup Attr", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + attributes: [ + { + name: "Color", + code: "color", + kind: "variant_defining", + position: 0, + values: [{ value: "Red", code: "red", position: 0 }], + }, + { + name: "Color Alt", + 
code: "color", + kind: "variant_defining", + position: 1, + values: [{ value: "Blue", code: "blue", position: 0 }], + }, + ], + }, + products, + ), + ); + await expect(out).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("rejects duplicate value codes within a variable attribute", async () => { + const products = new MemColl(); + const out = createProductHandler( + catalogCtx( + { + type: "variable", + status: "draft", + visibility: "hidden", + slug: "dup-value", + title: "Dup Value", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + attributes: [ + { + name: "Color", + code: "color", + kind: "variant_defining", + position: 0, + values: [ + { value: "Red", code: "red", position: 0 }, + { value: "Maroon", code: "red", position: 1 }, + ], + }, + ], + }, + products, + ), + ); + await expect(out).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("updates mutable product fields and preserves immutable fields", async () => { + const products = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "draft", + visibility: "hidden", + slug: "editable", + title: "Original", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + archivedAt: undefined, + publishedAt: undefined, + }); + + const out = await updateProductHandler( + catalogCtx( + { + productId: "prod_1", + title: "Updated Title", + featured: true, + }, + products, + ), + ); + + expect(out.product.title).toBe("Updated Title"); + expect(out.product.featured).toBe(true); + expect(out.product.id).toBe("prod_1"); + expect(out.product.type).toBe("simple"); + expect(out.product.createdAt).toBe("2026-01-01T00:00:00.000Z"); + }); + + it("rejects immutable product field updates", async () => { + const products = new MemColl(); + await products.put("prod_1", 
{ + id: "prod_1", + type: "simple", + status: "draft", + visibility: "hidden", + slug: "immutable", + title: "Original", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const out = updateProductHandler( + catalogCtx( + { + productId: "prod_1", + type: "bundle", + }, + products, + ), + ); + await expect(out).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("rejects bundle discount fields on non-bundle product updates", async () => { + const products = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "simple", + title: "Simple Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const out = updateProductHandler( + catalogCtx( + { + productId: "prod_1", + bundleDiscountType: "fixed_amount", + bundleDiscountValueMinor: 100, + }, + products, + ), + ); + await expect(out).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("sets product status transitions", async () => { + const products = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "draft", + visibility: "hidden", + slug: "stateful", + title: "State", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const archived = await setProductStateHandler( + catalogCtx( + { + productId: "prod_1", + status: "archived", + }, + products, + ), + ); + expect(archived.product.status).toBe("archived"); + expect(archived.product.archivedAt).toBeTypeOf("string"); + + const draft = await setProductStateHandler( + 
catalogCtx( + { + productId: "prod_1", + status: "draft", + }, + products, + ), + ); + expect(draft.product.status).toBe("draft"); + expect(draft.product.archivedAt).toBeUndefined(); + }); + + it("lists products filtered by status and type", async () => { + const products = new MemColl(); + await products.put("p1", { + id: "p1", + type: "simple", + status: "active", + visibility: "public", + slug: "alpha", + title: "Alpha", + shortDescription: "", + longDescription: "", + featured: true, + sortOrder: 10, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await products.put("p2", { + id: "p2", + type: "simple", + status: "draft", + visibility: "hidden", + slug: "beta", + title: "Beta", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 5, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const out = await listProductsHandler( + catalogCtx( + { + type: "simple", + status: "active", + visibility: undefined, + limit: 20, + }, + products, + ), + ); + expect(out.items).toHaveLength(1); + expect(out.items[0]!.product.id).toBe("p1"); + }); + + it("counts low-stock SKUs using COMMERCE_LIMITS.lowStockThreshold", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const threshold = COMMERCE_LIMITS.lowStockThreshold; + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "low-stock-product", + title: "Low Stock Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_low", { + id: "sku_low", + productId: "prod_1", + skuCode: "LOW", + status: "active", + unitPriceMinor: 1000, + inventoryQuantity: threshold, + inventoryVersion: 1, + 
requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_safe", { + id: "sku_safe", + productId: "prod_1", + skuCode: "SAFE", + status: "active", + unitPriceMinor: 1000, + inventoryQuantity: threshold + 1, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const out = await listProductsHandler( + catalogCtx( + { + type: "simple", + visibility: "public", + limit: 10, + }, + products, + skus, + ), + ); + expect(out.items).toHaveLength(1); + expect(out.items[0]!.lowStockSkuCount).toBe(1); + }); + + it("uses inventory stock rows for list inventory summary calculations", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "low-stock-product", + title: "Low Stock Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_low", { + id: "sku_low", + productId: "prod_1", + skuCode: "LOW", + status: "active", + unitPriceMinor: 1000, + inventoryQuantity: 100, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const listCtx = catalogCtx( + { type: "simple", visibility: "public", limit: 10 }, + products, + skus, + ); + const inventoryStock = ( + listCtx.storage as unknown as { inventoryStock: MemColl } + ).inventoryStock; + await inventoryStock.put(inventoryStockDocId("prod_1", ""), { + productId: "prod_1", + variantId: "", + version: 3, + quantity: 0, + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const out = await listProductsHandler(listCtx); + 
expect(out.items).toHaveLength(1); + expect(out.items[0]!.inventorySummary.totalInventoryQuantity).toBe(0); + expect(out.items[0]!.lowStockSkuCount).toBe(1); + }); + + it("returns storefront list products with active/public defaults and safe payload", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "active-product", + title: "Active Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await products.put("prod_2", { + id: "prod_2", + type: "simple", + status: "active", + visibility: "hidden", + slug: "hidden-product", + title: "Hidden Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 1, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await products.put("prod_3", { + id: "prod_3", + type: "simple", + status: "draft", + visibility: "public", + slug: "draft-product", + title: "Draft Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 2, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_1", { + id: "sku_1", + productId: "prod_1", + skuCode: "SKU1", + status: "active", + unitPriceMinor: 1000, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const out = await listStorefrontProductsHandler( + catalogCtx( + { + type: "simple", + limit: 10, + }, + products, + skus, + ), + ); + expect(out.items).toHaveLength(1); + expect(out.items[0]).toMatchObject({ + product: { id: "prod_1", status: "active", 
visibility: "public" }, + }); + expect("inventorySummary" in out.items[0]!).toBe(false); + expect("longDescription" in out.items[0]!.product).toBe(false); + }); + + it("returns storefront product detail without raw inventory fields", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "safe-product", + title: "Safe Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_1", { + id: "sku_1", + productId: "prod_1", + skuCode: "SKU1", + status: "active", + unitPriceMinor: 500, + inventoryQuantity: 100, + inventoryVersion: 4, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const detail = await getStorefrontProductHandler( + catalogCtx({ productId: "prod_1" }, products, skus), + ); + expect(detail.product).toMatchObject({ id: "prod_1", title: "Safe Product" }); + expect("longDescription" in detail.product).toBe(false); + expect(detail.skus?.[0]).toMatchObject({ id: "sku_1", availability: "in_stock" }); + expect("inventoryQuantity" in (detail.skus?.[0] as object)).toBe(false); + expect("inventoryVersion" in (detail.skus?.[0] as object)).toBe(false); + }); + + it("hides storefront product detail for non-public products", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("prod_hidden", { + id: "prod_hidden", + type: "simple", + status: "active", + visibility: "hidden", + slug: "hidden-product", + title: "Hidden Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + 
}); + await skus.put("sku_hidden", { + id: "sku_hidden", + productId: "prod_hidden", + skuCode: "HID", + status: "active", + unitPriceMinor: 100, + inventoryQuantity: 10, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await expect( + getStorefrontProductHandler(catalogCtx({ productId: "prod_hidden" }, products, skus)), + ).rejects.toThrow("Product not available"); + }); + + it("returns storefront sku list without raw inventory fields", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "stock-product", + title: "Stock Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_1", { + id: "sku_1", + productId: "prod_1", + skuCode: "SKU1", + status: "active", + unitPriceMinor: 500, + inventoryQuantity: 100, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_2", { + id: "sku_2", + productId: "prod_1", + skuCode: "SKU2", + status: "inactive", + unitPriceMinor: 600, + inventoryQuantity: 0, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const out = await listStorefrontProductSkusHandler( + catalogCtx({ productId: "prod_1", limit: 100 }, products, skus), + ); + expect(out.items).toHaveLength(1); + expect(out.items[0]).toMatchObject({ id: "sku_1", availability: "in_stock" }); + expect("inventoryQuantity" in (out.items[0] as object)).toBe(false); + expect("inventoryVersion" in (out.items[0] as 
object)).toBe(false); + }); + + it("hides storefront SKU lists for non-public products", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("prod_hidden", { + id: "prod_hidden", + type: "simple", + status: "active", + visibility: "hidden", + slug: "hidden-product", + title: "Hidden Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_hidden", { + id: "sku_hidden", + productId: "prod_hidden", + skuCode: "HID", + status: "active", + unitPriceMinor: 100, + inventoryQuantity: 10, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await expect( + listStorefrontProductSkusHandler( + catalogCtx({ productId: "prod_hidden", limit: 100 }, products, skus), + ), + ).rejects.toThrow("Product not available"); + }); + + it("reads simple product SKU inventory from inventoryStock in product detail", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "stock-product", + title: "Stock Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_1", { + id: "sku_1", + productId: "prod_1", + skuCode: "STOCK", + status: "active", + unitPriceMinor: 500, + inventoryQuantity: 100, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const getCtx = catalogCtx({ productId: "prod_1" }, products, skus); + const inventoryStock = ( + 
getCtx.storage as unknown as { inventoryStock: MemColl } + ).inventoryStock; + await inventoryStock.put(inventoryStockDocId("prod_1", "sku_1"), { + productId: "prod_1", + variantId: "sku_1", + version: 6, + quantity: 6, + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const detail = await getProductHandler(getCtx); + expect(detail.skus?.[0]).toMatchObject({ + id: "sku_1", + inventoryQuantity: 6, + inventoryVersion: 6, + }); + }); + + it("falls back to product-level inventory stock when a simple SKU stock row is missing", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const createCtx = catalogCtx( + { + productId: "prod_1", + skuCode: "STOCK", + status: "active", + unitPriceMinor: 500, + inventoryQuantity: 6, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + }, + products, + skus, + ); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "stock-product", + title: "Stock Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const created = await createProductSkuHandler(createCtx); + const inventoryStock = ( + createCtx.storage as unknown as { inventoryStock: MemColl } + ).inventoryStock; + await inventoryStock.delete(inventoryStockDocId(created.sku.productId, created.sku.id)); + + const readCtx = { ...createCtx, input: { productId: "prod_1" } } as unknown as RouteContext<{ + productId: string; + }>; + const detail = await getProductHandler(readCtx); + expect(detail.skus?.[0]).toMatchObject({ + id: created.sku.id, + inventoryQuantity: created.sku.inventoryQuantity, + inventoryVersion: 1, + }); + expect(await inventoryStock.get(inventoryStockDocId("prod_1", ""))).toMatchObject({ + productId: "prod_1", + variantId: "", + quantity: created.sku.inventoryQuantity, + version: 1, + }); + }); + + 
it("returns the same category/tag/image metadata from product detail and listing", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const productAssets = new MemColl(); + const productAssetLinks = new MemColl(); + const productCategories = new MemColl(); + const productCategoryLinks = new MemColl(); + const productTags = new MemColl(); + const productTagLinks = new MemColl(); + + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "seeded-product", + title: "Seeded Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await skus.put("sku_1", { + id: "sku_1", + productId: "prod_1", + skuCode: "INV-1", + status: "active", + unitPriceMinor: 1200, + inventoryQuantity: 4, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await productCategories.put("cat_1", { + id: "cat_1", + name: "Featured", + slug: "featured", + parentId: undefined, + position: 0, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await productCategoryLinks.put("pcat_1", { + id: "pcat_1", + productId: "prod_1", + categoryId: "cat_1", + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await productTags.put("tag_1", { + id: "tag_1", + name: "Sale", + slug: "sale", + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await productTagLinks.put("ptag_1", { + id: "ptag_1", + productId: "prod_1", + tagId: "tag_1", + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await productAssets.put("asset_primary", { + id: "asset_primary", + provider: "media", + externalAssetId: "media-primary", + 
fileName: "primary.jpg", + altText: "Primary image", + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await productAssets.put("asset_gallery", { + id: "asset_gallery", + provider: "media", + externalAssetId: "media-gallery", + fileName: "gallery.jpg", + altText: "Gallery image", + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await linkCatalogAssetHandler( + catalogCtx( + { + assetId: "asset_primary", + targetType: "product", + targetId: "prod_1", + role: "primary_image", + position: 0, + }, + products, + skus, + productAssets, + productAssetLinks, + ), + ); + await linkCatalogAssetHandler( + catalogCtx( + { + assetId: "asset_gallery", + targetType: "product", + targetId: "prod_1", + role: "gallery_image", + position: 0, + }, + products, + skus, + productAssets, + productAssetLinks, + ), + ); + + const detail = await getProductHandler( + catalogCtx( + { productId: "prod_1" }, + products, + skus, + productAssets, + productAssetLinks, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + productCategories, + productCategoryLinks, + productTags, + productTagLinks, + ), + ); + + const list = await listProductsHandler( + catalogCtx( + { + type: "simple", + status: "active", + visibility: "public", + limit: 10, + }, + products, + skus, + productAssets, + productAssetLinks, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + productCategories, + productCategoryLinks, + productTags, + productTagLinks, + ), + ); + + expect(list.items).toHaveLength(1); + const listed = list.items[0]!; + expect(listed.product.id).toBe("prod_1"); + expect(listed.categories).toEqual(detail.categories); + expect(listed.tags).toEqual(detail.tags); + expect(listed.primaryImage).toEqual(detail.primaryImage); + expect(listed.galleryImages).toEqual(detail.galleryImages); + expect(listed.inventorySummary.totalInventoryQuantity).toBe( + detail.skus?.[0]?.inventoryQuantity, + ); + 
expect(listed.lowStockSkuCount).toBe( + detail.skus?.filter( + (sku) => + sku.status === "active" && sku.inventoryQuantity <= COMMERCE_LIMITS.lowStockThreshold, + ).length ?? 0, + ); + }); + + it("returns product_unavailable when productId does not exist", async () => { + const out = getProductHandler(catalogCtx({ productId: "missing" }, new MemColl())); + await expect(out).rejects.toMatchObject({ code: "product_unavailable" }); + }); + + it("returns entitlement summaries in product detail view", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const digitalAssets = new MemColl(); + const digitalEntitlements = new MemColl(); + + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "digital-product", + title: "Digital Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_1", { + id: "sku_1", + productId: "prod_1", + skuCode: "DIGI", + status: "active", + unitPriceMinor: 199, + inventoryQuantity: 100, + inventoryVersion: 1, + requiresShipping: false, + isDigital: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const asset = await createDigitalAssetHandler( + catalogCtx( + { + externalAssetId: "media-101", + provider: "media", + label: "Product Manual", + downloadLimit: 1, + downloadExpiryDays: 30, + isManualOnly: true, + isPrivate: true, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + digitalAssets, + digitalEntitlements, + ), + ); + + await createDigitalEntitlementHandler( + catalogCtx( + { + skuId: "sku_1", + digitalAssetId: asset.asset.id, + grantedQuantity: 2, + }, + products, + skus, + new 
MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + digitalAssets, + digitalEntitlements, + ), + ); + + const out = await getProductHandler( + catalogCtx( + { productId: "prod_1" }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + digitalAssets, + digitalEntitlements, + ), + ); + + expect(out.digitalEntitlements).toEqual([ + { + skuId: "sku_1", + entitlements: [ + { + entitlementId: expect.any(String), + digitalAssetId: asset.asset.id, + digitalAssetLabel: "Product Manual", + grantedQuantity: 2, + downloadLimit: 1, + downloadExpiryDays: 30, + isManualOnly: true, + isPrivate: true, + }, + ], + }, + ]); + }); +}); + +describe("catalog SKU handlers", () => { + it("creates SKU rows and lists them by productId", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("parent", { + id: "parent", + type: "simple", + status: "active", + visibility: "public", + slug: "parent", + title: "Parent", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const createSkuCtx = catalogCtx( + { + productId: "parent", + skuCode: "SIMPLE-A", + status: "active", + unitPriceMinor: 1299, + inventoryQuantity: 10, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + }, + products, + skus, + ); + const created = await createProductSkuHandler(createSkuCtx); + expect(created.sku.skuCode).toBe("SIMPLE-A"); + expect(created.sku.id).toMatch(SKU_ID_PREFIX); + + const listCtx = catalogCtx({ productId: "parent", limit: 10 }, products, skus); + const listed = await listProductSkusHandler(listCtx); + expect(listed.items).toHaveLength(1); + 
expect(listed.items[0]!.id).toBe(created.sku.id); + }); + + it("rejects creating more than one SKU for simple products", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("parent", { + id: "parent", + type: "simple", + status: "active", + visibility: "public", + slug: "parent", + title: "Parent", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await createProductSkuHandler( + catalogCtx( + { + productId: "parent", + skuCode: "SIMPLE-A", + status: "active", + unitPriceMinor: 1299, + inventoryQuantity: 10, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + }, + products, + skus, + ), + ); + + const second = createProductSkuHandler( + catalogCtx( + { + productId: "parent", + skuCode: "SIMPLE-B", + status: "active", + unitPriceMinor: 1299, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + }, + products, + skus, + ), + ); + + await expect(second).rejects.toMatchObject({ code: "BAD_REQUEST" }); + expect(skus.rows.size).toBe(1); + }); + + it("stores variant option mappings and returns a variable matrix on get", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const productAttributes = new MemColl(); + const productAttributeValues = new MemColl(); + const productSkuOptionValues = new MemColl(); + + const product = await createProductHandler( + catalogCtx( + { + type: "variable", + status: "active", + visibility: "public", + slug: "variable-shirt", + title: "Variable Shirt", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + attributes: [ + { + name: "Color", + code: "color", + kind: "variant_defining", + position: 0, + values: [ + { value: "Red", code: "red", position: 0 }, + { value: "Blue", code: "blue", 
position: 1 }, + ], + }, + { + name: "Size", + code: "size", + kind: "variant_defining", + position: 1, + values: [ + { value: "Small", code: "s", position: 0 }, + { value: "Large", code: "l", position: 1 }, + ], + }, + ], + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + ), + ); + + const colorAttribute = [...productAttributes.rows.values()].find( + (attribute) => attribute.code === "color", + ); + expect(colorAttribute).toBeDefined(); + const sizeAttribute = [...productAttributes.rows.values()].find( + (attribute) => attribute.code === "size", + ); + expect(sizeAttribute).toBeDefined(); + const valueByCode = new Map( + Array.from(productAttributeValues.rows.values(), (row) => [row.code, row.id]), + ); + + const skuA = await createProductSkuHandler( + catalogCtx( + { + productId: product.product.id, + skuCode: "VSHIRT-RS", + status: "active", + unitPriceMinor: 2100, + inventoryQuantity: 15, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + optionValues: [ + { + attributeId: colorAttribute!.id, + attributeValueId: valueByCode.get("red")!, + }, + { + attributeId: sizeAttribute!.id, + attributeValueId: valueByCode.get("s")!, + }, + ], + }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + ), + ); + + const skuB = await createProductSkuHandler( + catalogCtx( + { + productId: product.product.id, + skuCode: "VSHIRT-BL", + status: "active", + unitPriceMinor: 2200, + inventoryQuantity: 10, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + optionValues: [ + { + attributeId: colorAttribute!.id, + attributeValueId: valueByCode.get("blue")!, + }, + { + attributeId: sizeAttribute!.id, + attributeValueId: valueByCode.get("l")!, + }, + ], + }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + ), + ); + + 
expect(skuA.sku.skuCode).toBe("VSHIRT-RS"); + expect(skuB.sku.skuCode).toBe("VSHIRT-BL"); + expect(productSkuOptionValues.rows.size).toBe(4); + + const detail = await getProductHandler( + catalogCtx( + { productId: product.product.id }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + ), + ); + expect(detail.attributes).toHaveLength(2); + expect(detail.variantMatrix).toHaveLength(2); + expect(detail.variantMatrix?.every((row) => row.options.length === 2)).toBe(true); + }); + + it("creates matching inventoryStock rows when creating a simple SKU", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("parent", { + id: "parent", + type: "simple", + status: "active", + visibility: "public", + slug: "parent", + title: "Parent", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const createCtx = catalogCtx( + { + productId: "parent", + skuCode: "SIMPLE-STOCK", + status: "active", + unitPriceMinor: 1299, + inventoryQuantity: 12, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + }, + products, + skus, + ); + const created = await createProductSkuHandler(createCtx); + const inventoryStock = ( + createCtx.storage as unknown as { inventoryStock: MemColl } + ).inventoryStock; + + const variantStock = await inventoryStock.get( + inventoryStockDocId(created.sku.productId, created.sku.id), + ); + const productStock = await inventoryStock.get(inventoryStockDocId(created.sku.productId, "")); + expect(inventoryStock.rows.size).toBe(2); + expect(variantStock).toMatchObject({ + productId: "parent", + variantId: created.sku.id, + quantity: 12, + version: 1, + }); + expect(productStock).toMatchObject({ + productId: "parent", + variantId: "", + quantity: 12, + version: 1, + }); + }); + + 
it("updates matching inventoryStock rows when SKU inventory fields change", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const inventoryStock = new MemColl(); + const productSkuCtx = (input: ProductSkuCreateInput | ProductSkuUpdateInput) => + catalogCtx( + input as Parameters<typeof catalogCtx>[0], + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + inventoryStock, + ); + + await products.put("parent", { + id: "parent", + type: "simple", + status: "active", + visibility: "public", + slug: "parent", + title: "Parent", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + const created = await createProductSkuHandler( + productSkuCtx({ + productId: "parent", + skuCode: "SIMPLE-STOCK", + status: "active", + unitPriceMinor: 1299, + inventoryQuantity: 12, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + }) as Parameters<typeof createProductSkuHandler>[0], + ); + + await updateProductSkuHandler( + productSkuCtx({ + skuId: created.sku.id, + inventoryQuantity: 3, + inventoryVersion: 4, + }) as Parameters<typeof updateProductSkuHandler>[0], + ); + + const variantStock = await inventoryStock.get( + inventoryStockDocId(created.sku.productId, created.sku.id), + ); + const productStock = await inventoryStock.get(inventoryStockDocId(created.sku.productId, "")); + expect(variantStock).toMatchObject({ + productId: "parent", + variantId: created.sku.id, + quantity: 3, + version: 4, + }); + expect(productStock).toMatchObject({ + productId: "parent", + variantId: "", + quantity: 3, + version: 4, + }); + }); + + it("creates only variant-level inventoryStock for variable SKUs", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const productAttributes = new 
MemColl(); + const productAttributeValues = new MemColl(); + const productSkuOptionValues = new MemColl(); + + const product = await createProductHandler( + catalogCtx( + { + type: "variable", + status: "active", + visibility: "public", + slug: "variable-stock-product", + title: "Variable stock product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + attributes: [ + { + name: "Color", + code: "color", + kind: "variant_defining", + position: 0, + values: [{ value: "Red", code: "red", position: 0 }], + }, + ], + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + ), + ); + const colorAttribute = [...productAttributes.rows.values()].find( + (attribute) => attribute.productId === product.product.id, + ); + const colorValue = [...productAttributeValues.rows.values()].find( + (value) => value.attributeId === colorAttribute!.id, + ); + const createCtx = catalogCtx( + { + productId: product.product.id, + skuCode: "VAR-1", + status: "active", + unitPriceMinor: 1099, + inventoryQuantity: 7, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + optionValues: [{ attributeId: colorAttribute!.id, attributeValueId: colorValue!.id }], + }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + ); + const created = await createProductSkuHandler(createCtx); + const inventoryStock = ( + createCtx.storage as unknown as { inventoryStock: MemColl } + ).inventoryStock; + + const variantStock = await inventoryStock.get( + inventoryStockDocId(created.sku.productId, created.sku.id), + ); + const productLevelStock = await inventoryStock.get( + inventoryStockDocId(created.sku.productId, ""), + ); + expect(inventoryStock.rows.size).toBe(1); + expect(variantStock).toMatchObject({ + productId: product.product.id, + variantId: created.sku.id, + quantity: 7, + version: 1, + }); 
+ expect(productLevelStock).toBeNull(); + }); + + it("rejects variable SKU creation when option coverage is incomplete", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const productAttributes = new MemColl(); + const productAttributeValues = new MemColl(); + const productSkuOptionValues = new MemColl(); + + const product = await createProductHandler( + catalogCtx( + { + type: "variable", + status: "active", + visibility: "public", + slug: "incomplete-variable", + title: "Incomplete variable", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + attributes: [ + { + name: "Color", + code: "color", + kind: "variant_defining", + position: 0, + values: [ + { value: "Red", code: "red", position: 0 }, + { value: "Blue", code: "blue", position: 1 }, + ], + }, + { + name: "Size", + code: "size", + kind: "variant_defining", + position: 1, + values: [{ value: "Small", code: "s", position: 0 }], + }, + ], + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + ), + ); + + const missing = createProductSkuHandler( + catalogCtx( + { + productId: product.product.id, + skuCode: "MISS-1", + status: "active", + unitPriceMinor: 1000, + inventoryQuantity: 1, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + optionValues: [ + { + attributeId: [...productAttributes.rows.values()][0]!.id, + attributeValueId: [...productAttributeValues.rows.values()][0]!.id, + }, + ], + }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + ), + ); + await expect(missing).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("rejects option mappings on non-variable products", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("parent", { + id: "parent", + type: "simple", + status: "active", + 
visibility: "public", + slug: "simple-parent", + title: "Simple Parent", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const out = createProductSkuHandler( + catalogCtx( + { + productId: "parent", + skuCode: "BAD-MAP", + status: "active", + unitPriceMinor: 1000, + inventoryQuantity: 1, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + optionValues: [{ attributeId: "attr_1", attributeValueId: "val_1" }], + }, + products, + skus, + ), + ); + + await expect(out).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("rejects duplicate and duplicate-combination SKU option mappings for variable products", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const productAttributes = new MemColl(); + const productAttributeValues = new MemColl(); + const productSkuOptionValues = new MemColl(); + + const product = await createProductHandler( + catalogCtx( + { + type: "variable", + status: "active", + visibility: "public", + slug: "combo-variable", + title: "Combo variable", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + attributes: [ + { + name: "Color", + code: "color", + kind: "variant_defining", + position: 0, + values: [{ value: "Red", code: "red", position: 0 }], + }, + ], + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + ), + ); + + const colorAttribute = [...productAttributes.rows.values()][0]!; + const colorValue = [...productAttributeValues.rows.values()][0]!; + + const duplicateAttributeValue = createProductSkuHandler( + catalogCtx( + { + productId: product.product.id, + skuCode: "DUP-1", + status: "active", + unitPriceMinor: 1000, + inventoryQuantity: 1, + inventoryVersion: 1, + requiresShipping: true, + isDigital: 
false, + optionValues: [ + { attributeId: colorAttribute.id, attributeValueId: colorValue.id }, + { attributeId: colorAttribute.id, attributeValueId: colorValue.id }, + ], + }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + ), + ); + await expect(duplicateAttributeValue).rejects.toMatchObject({ code: "BAD_REQUEST" }); + + await createProductSkuHandler( + catalogCtx( + { + productId: product.product.id, + skuCode: "V1", + status: "active", + unitPriceMinor: 1100, + inventoryQuantity: 2, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + optionValues: [{ attributeId: colorAttribute.id, attributeValueId: colorValue.id }], + }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + ), + ); + + const duplicateCombination = createProductSkuHandler( + catalogCtx( + { + productId: product.product.id, + skuCode: "V2", + status: "active", + unitPriceMinor: 1150, + inventoryQuantity: 2, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + optionValues: [{ attributeId: colorAttribute.id, attributeValueId: colorValue.id }], + }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + ), + ); + await expect(duplicateCombination).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("batches variable SKU validation reads for better scalability", async () => { + const products = new QueryCountingMemColl(); + const skus = new (class extends QueryCountingMemColl { + async putIfAbsent(id: string, data: StoredProductSku): Promise<boolean> { + await this.put(id, data); + return true; + } + })(); + const productAttributes = new QueryCountingMemColl(); + const productAttributeValues = new QueryCountingMemColl(); + const productSkuOptionValues = new QueryCountingMemColl(); + + const product = await createProductHandler( + 
catalogCtx( + { + type: "variable", + status: "active", + visibility: "public", + slug: "scalable-variable", + title: "Scalable variable", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + attributes: [ + { + name: "Color", + code: "color", + kind: "variant_defining", + position: 0, + values: [ + { value: "Red", code: "red", position: 0 }, + { value: "Blue", code: "blue", position: 1 }, + ], + }, + { + name: "Size", + code: "size", + kind: "variant_defining", + position: 1, + values: [ + { value: "Small", code: "s", position: 0 }, + { value: "Large", code: "l", position: 1 }, + ], + }, + ], + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + ), + ); + + const colorAttribute = [...productAttributes.rows.values()].find( + (attribute) => attribute.code === "color", + ); + const sizeAttribute = [...productAttributes.rows.values()].find( + (attribute) => attribute.code === "size", + ); + const colorValues = [...productAttributeValues.rows.values()].filter( + (value) => value.attributeId === colorAttribute?.id, + ); + const sizeValues = [...productAttributeValues.rows.values()].filter( + (value) => value.attributeId === sizeAttribute?.id, + ); + if (!colorAttribute || !sizeAttribute || colorValues.length < 2 || sizeValues.length < 2) { + throw new Error("Test fixture missing required attributes"); + } + + await createProductSkuHandler( + catalogCtx( + { + productId: product.product.id, + skuCode: "V-ONE", + status: "active", + unitPriceMinor: 1100, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + optionValues: [ + { attributeId: colorAttribute.id, attributeValueId: colorValues[0]!.id }, + { attributeId: sizeAttribute.id, attributeValueId: sizeValues[0]!.id }, + ], + }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + 
), + ); + + await createProductSkuHandler( + catalogCtx( + { + productId: product.product.id, + skuCode: "V-TWO", + status: "active", + unitPriceMinor: 1200, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + optionValues: [ + { attributeId: colorAttribute.id, attributeValueId: colorValues[1]!.id }, + { attributeId: sizeAttribute.id, attributeValueId: sizeValues[1]!.id }, + ], + }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + ), + ); + + products.queryCount = 0; + skus.queryCount = 0; + productAttributes.queryCount = 0; + productAttributeValues.queryCount = 0; + productSkuOptionValues.queryCount = 0; + + await createProductSkuHandler( + catalogCtx( + { + productId: product.product.id, + skuCode: "V-THREE", + status: "active", + unitPriceMinor: 1300, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + optionValues: [ + { attributeId: colorAttribute.id, attributeValueId: colorValues[0]!.id }, + { attributeId: sizeAttribute.id, attributeValueId: sizeValues[1]!.id }, + ], + }, + products, + skus, + new MemColl(), + new MemColl(), + productAttributes, + productAttributeValues, + productSkuOptionValues, + ), + ); + + expect(skus.queryCount).toBe(2); + expect(productAttributes.queryCount).toBe(1); + expect(productAttributeValues.queryCount).toBe(1); + expect(productSkuOptionValues.queryCount).toBe(1); + }); + + it("updates SKU fields without changing immutable identifiers", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("parent", { + id: "parent", + type: "simple", + status: "active", + visibility: "public", + slug: "parent", + title: "Parent", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const 
created = await createProductSkuHandler( + catalogCtx( + { + productId: "parent", + skuCode: "SIMPLE-A", + status: "active", + unitPriceMinor: 1299, + inventoryQuantity: 10, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + }, + products, + skus, + ), + ); + + const updated = await updateProductSkuHandler( + catalogCtx( + { + skuId: created.sku.id, + unitPriceMinor: 1499, + }, + products, + skus, + ), + ); + expect(updated.sku.unitPriceMinor).toBe(1499); + expect(updated.sku.productId).toBe("parent"); + }); + + it("rejects duplicate sku code on SKU update", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("parent", { + id: "parent", + type: "simple", + status: "active", + visibility: "public", + slug: "parent", + title: "Parent", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await products.put("parent_two", { + id: "parent_two", + type: "simple", + status: "active", + visibility: "public", + slug: "parent-two", + title: "Parent Two", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_1", { + id: "sku_1", + productId: "parent", + skuCode: "SKU-ONE", + status: "active", + unitPriceMinor: 1200, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_2", { + id: "sku_2", + productId: "parent_two", + skuCode: "SKU-TWO", + status: "active", + unitPriceMinor: 1500, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: 
"2026-01-01T00:00:00.000Z", + }); + + const duplicate = updateProductSkuHandler( + catalogCtx( + { + skuId: "sku_2", + skuCode: "SKU-ONE", + }, + products, + skus, + ), + ); + await expect(duplicate).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("sets SKU active/inactive state", async () => { + const products = new MemColl(); + const skus = new MemColl(); + await products.put("parent", { + id: "parent", + type: "simple", + status: "active", + visibility: "public", + slug: "parent", + title: "Parent", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const created = await createProductSkuHandler( + catalogCtx( + { + productId: "parent", + skuCode: "SIMPLE-A", + status: "active", + unitPriceMinor: 1299, + inventoryQuantity: 10, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + }, + products, + skus, + ), + ); + + const archived = await setSkuStatusHandler( + catalogCtx( + { + skuId: created.sku.id, + status: "inactive", + }, + products, + skus, + ), + ); + expect(archived.sku.status).toBe("inactive"); + }); +}); + +describe("catalog asset handlers", () => { + it("rejects binary-upload payload keys at the contract boundary", () => { + expect( + productAssetRegisterInputSchema.safeParse({ + externalAssetId: "media-1", + provider: "media", + file: "should-not-be-uploaded", + }).success, + ).toBe(false); + + expect( + productAssetLinkInputSchema.safeParse({ + assetId: "asset_1", + targetType: "product", + targetId: "prod_1", + role: "gallery_image", + position: 0, + stream: "binary", + }).success, + ).toBe(false); + + expect( + productAssetUnlinkInputSchema.safeParse({ + linkId: "link_1", + file: "should-not-be-uploaded", + }).success, + ).toBe(false); + + expect( + productAssetReorderInputSchema.safeParse({ + linkId: "link_1", + position: 0, + body: "not-expected", + }).success, + 
).toBe(false); + }); + + it("returns asset_not_found when linking an unknown asset", async () => { + const products = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "base", + title: "Base", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const missingAsset = linkCatalogAssetHandler( + catalogCtx( + { + assetId: "asset_missing", + targetType: "product", + targetId: "prod_1", + role: "gallery_image", + position: 0, + }, + products, + ), + ); + await expect(missingAsset).rejects.toMatchObject({ code: "asset_not_found" }); + }); + + it("registers provider-agnostic asset metadata without binary payload", async () => { + const productAssets = new MemColl(); + + const out = await registerProductAssetHandler( + catalogCtx( + { + externalAssetId: "media-123", + provider: "media", + fileName: "hero.jpg", + mimeType: "image/jpeg", + byteSize: 123_456, + }, + new MemColl(), + new MemColl(), + productAssets, + ), + ); + + expect(out.asset.id).toMatch(ASSET_ID_PREFIX); + expect(out.asset.provider).toBe("media"); + expect(out.asset.externalAssetId).toBe("media-123"); + expect(out.asset.mimeType).toBe("image/jpeg"); + }); + + it("links media metadata rows to a product and enforces one primary image per target", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const productAssets = new MemColl(); + const productAssetLinks = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "base", + title: "Base", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await 
productAssets.put("asset_1", { + id: "asset_1", + provider: "media", + externalAssetId: "media-1", + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await productAssets.put("asset_2", { + id: "asset_2", + provider: "media", + externalAssetId: "media-2", + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const first = await linkCatalogAssetHandler( + catalogCtx( + { + assetId: "asset_1", + targetType: "product", + targetId: "prod_1", + role: "primary_image", + position: 0, + }, + products, + skus, + productAssets, + productAssetLinks, + ), + ); + expect(first.link.role).toBe("primary_image"); + expect(first.link.targetType).toBe("product"); + expect(first.link.targetId).toBe("prod_1"); + + const duplicatePrimary = linkCatalogAssetHandler( + catalogCtx( + { + assetId: "asset_2", + targetType: "product", + targetId: "prod_1", + role: "primary_image", + position: 1, + }, + products, + skus, + productAssets, + productAssetLinks, + ), + ); + await expect(duplicatePrimary).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("links asset rows to SKU targets and supports reordering", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const productAssets = new MemColl(); + const productAssetLinks = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "base", + title: "Base", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_1", { + id: "sku_1", + productId: "prod_1", + skuCode: "SKU-1", + status: "active", + unitPriceMinor: 1299, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); 
+ + for (let index = 0; index < 2; index++) { + await productAssets.put(`asset_${index + 1}`, { + id: `asset_${index + 1}`, + provider: "media", + externalAssetId: `media-${index + 1}`, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + } + + const first = await linkCatalogAssetHandler( + catalogCtx( + { + assetId: "asset_1", + targetType: "sku", + targetId: "sku_1", + role: "gallery_image", + position: 0, + }, + products, + skus, + productAssets, + productAssetLinks, + ), + ); + const second = await linkCatalogAssetHandler( + catalogCtx( + { + assetId: "asset_2", + targetType: "sku", + targetId: "sku_1", + role: "gallery_image", + position: 1, + }, + products, + skus, + productAssets, + productAssetLinks, + ), + ); + + const reordered = await reorderCatalogAssetHandler( + catalogCtx( + { linkId: second.link.id, position: 0 }, + products, + skus, + productAssets, + productAssetLinks, + ), + ); + expect(reordered.link.position).toBe(0); + + const byTarget = await productAssetLinks.query({ + where: { targetType: "sku", targetId: "sku_1" }, + }); + const inOrder = byTarget.items.map((item) => item.data); + const ordered = sortedImmutable(inOrder, (left, right) => left.position - right.position); + expect(ordered[0]?.id).toBe(second.link.id); + expect(ordered[1]?.id).toBe(first.link.id); + }); + + it("unlinks an asset and removes its link row", async () => { + const products = new MemColl(); + const productAssets = new MemColl(); + const productAssetLinks = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "base", + title: "Base", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await productAssets.put("asset_1", { + id: "asset_1", + provider: "media", + externalAssetId: "media-1", + createdAt: 
"2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const linked = await linkCatalogAssetHandler( + catalogCtx( + { + assetId: "asset_1", + targetType: "product", + targetId: "prod_1", + role: "gallery_image", + }, + products, + new MemColl(), + productAssets, + productAssetLinks, + ), + ); + + const out = await unlinkCatalogAssetHandler( + catalogCtx( + { + linkId: linked.link.id, + }, + products, + new MemColl(), + productAssets, + productAssetLinks, + ), + ); + expect(out.deleted).toBe(true); + + const removed = await productAssetLinks.get(linked.link.id); + expect(removed).toBeNull(); + }); + + it("returns asset_link_not_found when unlinking an unknown link", async () => { + const out = unlinkCatalogAssetHandler( + catalogCtx( + { linkId: "missing-link" }, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + ), + ); + await expect(out).rejects.toMatchObject({ code: "asset_link_not_found" }); + }); + + it("normalizes remaining asset link positions after unlink", async () => { + const products = new MemColl(); + const productAssets = new MemColl(); + const productAssetLinks = new MemColl(); + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "base", + title: "Base", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await productAssets.put("asset_1", { + id: "asset_1", + provider: "media", + externalAssetId: "media-1", + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await productAssets.put("asset_2", { + id: "asset_2", + provider: "media", + externalAssetId: "media-2", + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const firstLink = await linkCatalogAssetHandler( + catalogCtx( + { + assetId: "asset_1", + targetType: 
"product", + targetId: "prod_1", + role: "gallery_image", + }, + products, + new MemColl(), + productAssets, + productAssetLinks, + ), + ); + const secondLink = await linkCatalogAssetHandler( + catalogCtx( + { + assetId: "asset_2", + targetType: "product", + targetId: "prod_1", + role: "gallery_image", + }, + products, + new MemColl(), + productAssets, + productAssetLinks, + ), + ); + + const removed = await unlinkCatalogAssetHandler( + catalogCtx( + { + linkId: firstLink.link.id, + }, + products, + new MemColl(), + productAssets, + productAssetLinks, + ), + ); + expect(removed.deleted).toBe(true); + + const remaining = await productAssetLinks.query({ + where: { targetType: "product", targetId: "prod_1" }, + }); + expect(remaining.items).toHaveLength(1); + expect(remaining.items[0]!.data.id).toBe(secondLink.link.id); + expect(remaining.items[0]!.data.position).toBe(0); + }); +}); + +describe("catalog digital entitlement handlers", () => { + it("rejects binary-upload payload keys at the contract boundary", () => { + expect( + digitalAssetCreateInputSchema.safeParse({ + externalAssetId: "media-1", + provider: "media", + file: "should-not-be-uploaded", + }).success, + ).toBe(false); + expect( + digitalEntitlementCreateInputSchema.safeParse({ + skuId: "sku_1", + digitalAssetId: "asset_1", + grantedQuantity: 1, + file: "should-not-be-uploaded", + }).success, + ).toBe(false); + }); + + it("creates digital assets and entitlements, and enforces unique mapping per SKU+asset", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const digitalAssets = new MemColl(); + const digitalEntitlements = new MemColl(); + + await products.put("prod_1", { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "digital-product", + title: "Digital Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: 
"2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_1", { + id: "sku_1", + productId: "prod_1", + skuCode: "DIGI", + status: "active", + unitPriceMinor: 199, + inventoryQuantity: 100, + inventoryVersion: 1, + requiresShipping: false, + isDigital: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const asset = await createDigitalAssetHandler( + catalogCtx( + { + externalAssetId: "media-101", + provider: "media", + label: "Product Manual", + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + digitalAssets, + digitalEntitlements, + ), + ); + + const first = await createDigitalEntitlementHandler( + catalogCtx( + { + skuId: "sku_1", + digitalAssetId: asset.asset.id, + grantedQuantity: 1, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + digitalAssets, + digitalEntitlements, + ), + ); + expect(first.entitlement.skuId).toBe("sku_1"); + expect(first.entitlement.digitalAssetId).toBe(asset.asset.id); + await expect( + createDigitalEntitlementHandler( + catalogCtx( + { + skuId: "sku_1", + digitalAssetId: asset.asset.id, + grantedQuantity: 1, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + digitalAssets, + digitalEntitlements, + ), + ), + ).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); + + it("returns digital_asset_not_found when creating entitlements for missing digital asset", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const digitalAssets = new MemColl(); + const digitalEntitlements = new MemColl(); + + await products.put("prod_1", { + id: 
"prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "digital-product", + title: "Digital Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_1", { + id: "sku_1", + productId: "prod_1", + skuCode: "DIGI", + status: "active", + unitPriceMinor: 199, + inventoryQuantity: 100, + inventoryVersion: 1, + requiresShipping: false, + isDigital: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const missing = createDigitalEntitlementHandler( + catalogCtx( + { + skuId: "sku_1", + digitalAssetId: "asset_missing", + grantedQuantity: 1, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + digitalAssets, + digitalEntitlements, + ), + ); + await expect(missing).rejects.toMatchObject({ code: "digital_asset_not_found" }); + }); + + it("removes entitlement assignments", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const digitalAssets = new MemColl(); + const digitalEntitlements = new MemColl(); + + await digitalEntitlements.put("ent_1", { + id: "ent_1", + skuId: "sku_1", + digitalAssetId: "asset_1", + grantedQuantity: 1, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const out = await removeDigitalEntitlementHandler( + catalogCtx( + { entitlementId: "ent_1" }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + digitalAssets, + digitalEntitlements, + ), + ); + expect(out.deleted).toBe(true); + + const missing = await digitalEntitlements.get("ent_1"); + 
expect(missing).toBeNull(); + }); + + it("returns digital_entitlement_not_found when removing a missing entitlement", async () => { + const out = removeDigitalEntitlementHandler( + catalogCtx( + { entitlementId: "missing-entitlement" }, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + ), + ); + await expect(out).rejects.toMatchObject({ code: "digital_entitlement_not_found" }); + }); +}); + +describe("catalog bundle handlers", () => { + it("adds components and computes discount-aware bundle summary", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const bundleComponents = new MemColl(); + + await products.put("prod_bundle", { + id: "prod_bundle", + type: "bundle", + status: "active", + visibility: "public", + slug: "starter-bundle", + title: "Starter Bundle", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + bundleDiscountType: "fixed_amount", + bundleDiscountValueMinor: 50, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await products.put("prod_component_1", { + id: "prod_component_1", + type: "simple", + status: "active", + visibility: "public", + slug: "sock", + title: "Sock", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await products.put("prod_component_2", { + id: "prod_component_2", + type: "simple", + status: "active", + visibility: "public", + slug: "blanket", + title: "Blanket", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_sock", 
{ + id: "sku_sock", + productId: "prod_component_1", + skuCode: "SOCK", + status: "active", + unitPriceMinor: 100, + compareAtPriceMinor: 120, + inventoryQuantity: 6, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_blanket", { + id: "sku_blanket", + productId: "prod_component_2", + skuCode: "BLNK", + status: "active", + unitPriceMinor: 75, + inventoryQuantity: 4, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await addBundleComponentHandler( + catalogCtx( + { + bundleProductId: "prod_bundle", + componentSkuId: "sku_sock", + quantity: 2, + position: 0, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ); + await addBundleComponentHandler( + catalogCtx( + { + bundleProductId: "prod_bundle", + componentSkuId: "sku_blanket", + quantity: 1, + position: 1, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ); + + const summary = await bundleComputeHandler( + catalogCtx( + { + productId: "prod_bundle", + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ); + + expect(summary.subtotalMinor).toBe(275); + expect(summary.discountAmountMinor).toBe(50); + expect(summary.finalPriceMinor).toBe(225); + expect(summary.availability).toBe(3); + expect(summary.components).toHaveLength(2); + }); + + it("sanitizes storefront bundle compute response", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const inventoryStock = new MemColl(); + const bundleComponents = new MemColl(); + + await products.put("prod_bundle", { + id: "prod_bundle", + type: "bundle", + 
status: "active", + visibility: "public", + slug: "winter-bundle", + title: "Winter Bundle", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await products.put("prod_component", { + id: "prod_component", + type: "simple", + status: "active", + visibility: "public", + slug: "component", + title: "Component", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_component", { + id: "sku_component", + productId: "prod_component", + skuCode: "CMP", + status: "active", + unitPriceMinor: 50, + inventoryQuantity: 10, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await addBundleComponentHandler( + catalogCtx( + { + bundleProductId: "prod_bundle", + componentSkuId: "sku_component", + quantity: 2, + position: 0, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + inventoryStock, + ), + ); + + await inventoryStock.put("stock_component", { + productId: "prod_component", + variantId: "sku_component", + quantity: 10, + version: 1, + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const summary = await bundleComputeStorefrontHandler( + catalogCtx( + { + productId: "prod_bundle", + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + inventoryStock, + ), + ); + + 
expect(summary.components).toHaveLength(1);
+    const component = summary.components[0];
+    expect((component as unknown as Record<string, unknown>).componentSkuId).toBeUndefined();
+    expect((component as unknown as Record<string, unknown>).componentProductId).toBeUndefined();
+  });
+
+  it("supports component reorder and removal with position normalization", async () => {
+    const products = new MemColl();
+    const skus = new MemColl();
+    const bundleComponents = new MemColl();
+
+    await products.put("prod_bundle", {
+      id: "prod_bundle",
+      type: "bundle",
+      status: "active",
+      visibility: "public",
+      slug: "winter-bundle",
+      title: "Winter Bundle",
+      shortDescription: "",
+      longDescription: "",
+      featured: false,
+      sortOrder: 0,
+      requiresShippingDefault: true,
+      createdAt: "2026-01-01T00:00:00.000Z",
+      updatedAt: "2026-01-01T00:00:00.000Z",
+    });
+    await products.put("prod_component_1", {
+      id: "prod_component_1",
+      type: "simple",
+      status: "active",
+      visibility: "public",
+      slug: "boot",
+      title: "Boot",
+      shortDescription: "",
+      longDescription: "",
+      featured: false,
+      sortOrder: 0,
+      requiresShippingDefault: true,
+      createdAt: "2026-01-01T00:00:00.000Z",
+      updatedAt: "2026-01-01T00:00:00.000Z",
+    });
+    await products.put("prod_component_2", {
+      id: "prod_component_2",
+      type: "simple",
+      status: "active",
+      visibility: "public",
+      slug: "cap",
+      title: "Cap",
+      shortDescription: "",
+      longDescription: "",
+      featured: false,
+      sortOrder: 0,
+      requiresShippingDefault: true,
+      createdAt: "2026-01-01T00:00:00.000Z",
+      updatedAt: "2026-01-01T00:00:00.000Z",
+    });
+    await products.put("prod_component_3", {
+      id: "prod_component_3",
+      type: "simple",
+      status: "active",
+      visibility: "public",
+      slug: "mitt",
+      title: "Mittens",
+      shortDescription: "",
+      longDescription: "",
+      featured: false,
+      sortOrder: 0,
+      requiresShippingDefault: true,
+      createdAt: "2026-01-01T00:00:00.000Z",
+      updatedAt: "2026-01-01T00:00:00.000Z",
+    });
+
+    await skus.put("sku_boot", {
+      id: "sku_boot",
+      productId:
"prod_component_1", + skuCode: "BOOT", + status: "active", + unitPriceMinor: 120, + compareAtPriceMinor: 150, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_cap", { + id: "sku_cap", + productId: "prod_component_2", + skuCode: "CAP", + status: "active", + unitPriceMinor: 40, + inventoryQuantity: 4, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_mitt", { + id: "sku_mitt", + productId: "prod_component_3", + skuCode: "MITT", + status: "active", + unitPriceMinor: 10, + inventoryQuantity: 8, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + const addedFirst = await addBundleComponentHandler( + catalogCtx( + { + bundleProductId: "prod_bundle", + componentSkuId: "sku_boot", + quantity: 1, + position: 0, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ); + const addedSecond = await addBundleComponentHandler( + catalogCtx( + { + bundleProductId: "prod_bundle", + componentSkuId: "sku_cap", + quantity: 1, + position: 1, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ); + const addedThird = await addBundleComponentHandler( + catalogCtx( + { + bundleProductId: "prod_bundle", + componentSkuId: "sku_mitt", + quantity: 1, + position: 2, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ); + + const reordered = await reorderBundleComponentHandler( + catalogCtx( + { + bundleComponentId: addedThird.component.id, + position: 0, + }, 
+ products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ); + expect(reordered.component.position).toBe(0); + + const removed = await removeBundleComponentHandler( + catalogCtx( + { bundleComponentId: addedSecond.component.id }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ); + expect(removed.deleted).toBe(true); + + const list = await bundleComponents.query({ + where: { bundleProductId: "prod_bundle" }, + }); + expect(list.items.find((row) => row.id === addedFirst.component.id)?.data.position).toBe(1); + expect(list.items.find((row) => row.id === addedThird.component.id)?.data.position).toBe(0); + }); + + it("returns bundle_component_not_found when removing an unknown bundle component", async () => { + const out = removeBundleComponentHandler( + catalogCtx( + { bundleComponentId: "missing-component" }, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + ), + ); + await expect(out).rejects.toMatchObject({ code: "bundle_component_not_found" }); + }); + + it("rejects invalid bundle component composition", async () => { + const products = new MemColl(); + const skus = new MemColl(); + const bundleComponents = new MemColl(); + + await products.put("prod_bundle", { + id: "prod_bundle", + type: "bundle", + status: "active", + visibility: "public", + slug: "nested-bundle", + title: "Nested Bundle", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await products.put("prod_bundle_invalid", { + id: "prod_bundle_invalid", + type: "bundle", + status: "active", + visibility: "public", + slug: "nested-bundle-invalid", + title: "Nested Bundle 
Invalid", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("bundle_invalid_sku", { + id: "bundle_invalid_sku", + productId: "prod_bundle_invalid", + skuCode: "BUNDLE-SKU", + status: "active", + unitPriceMinor: 50, + inventoryQuantity: 10, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + + await expect( + addBundleComponentHandler( + catalogCtx( + { + bundleProductId: "prod_bundle", + componentSkuId: "bundle_invalid_sku", + quantity: 1, + position: 0, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ), + ).rejects.toMatchObject({ code: "BAD_REQUEST" }); + + await products.put("prod_simple", { + id: "prod_simple", + type: "simple", + status: "active", + visibility: "public", + slug: "simple-component", + title: "Simple Component", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await skus.put("sku_simple", { + id: "sku_simple", + productId: "prod_simple", + skuCode: "SIMPLE", + status: "active", + unitPriceMinor: 30, + inventoryQuantity: 20, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }); + await addBundleComponentHandler( + catalogCtx( + { + bundleProductId: "prod_bundle", + componentSkuId: "sku_simple", + quantity: 1, + position: 0, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ); + await expect( + addBundleComponentHandler( + catalogCtx( + { + 
bundleProductId: "prod_bundle", + componentSkuId: "sku_simple", + quantity: 2, + position: 1, + }, + products, + skus, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + bundleComponents, + ), + ), + ).rejects.toMatchObject({ code: "BAD_REQUEST" }); + expect( + bundleComponentAddInputSchema.safeParse({ + bundleProductId: "prod_bundle", + componentSkuId: "sku_simple", + quantity: 0, + position: 0, + }).success, + ).toBe(false); + }); +}); + +describe("catalog organization", () => { + it("creates categories and filters listing by category", async () => { + const products = new MemColl(); + const categories = new MemColl(); + const productCategoryLinks = new MemColl(); + + const category = await createCategoryHandler( + catalogCtx( + { + name: "Electronics", + slug: "electronics", + position: 0, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + categories, + productCategoryLinks, + ), + ); + + const listedCategories = await listCategoriesHandler( + catalogCtx( + { + limit: 10, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + categories, + productCategoryLinks, + ), + ); + expect(listedCategories.items.map((item) => item.slug)).toEqual(["electronics"]); + + const cameraProduct = await createProductHandler( + catalogCtx( + { + type: "simple", + status: "active", + visibility: "public", + slug: "camera", + title: "Camera", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + }, + products, + ), + ); + await createProductHandler( + catalogCtx( + { + type: "simple", + status: "active", + visibility: "public", + slug: "lamp", + title: "Lamp", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 1, + requiresShippingDefault: true, + }, + products, + ), + ); + + const first = 
await listProductsHandler( + catalogCtx( + { + type: "simple", + limit: 10, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + categories, + productCategoryLinks, + ), + ); + expect(first.items.map((item) => item.product.slug)).toEqual(["camera", "lamp"]); + + await createProductCategoryLinkHandler( + catalogCtx( + { + productId: cameraProduct.product.id, + categoryId: category.category.id, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + categories, + productCategoryLinks, + ), + ); + + const filtered = await listProductsHandler( + catalogCtx( + { + type: "simple", + categoryId: category.category.id, + limit: 10, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + categories, + productCategoryLinks, + ), + ); + expect(filtered.items.map((item) => item.product.slug)).toEqual(["camera"]); + }); + + it("includes paged category members even when matched outside the product query default window", async () => { + const products = new MemColl(); + const categories = new MemColl(); + const productCategoryLinks = new MemColl(); + + const category = await createCategoryHandler( + catalogCtx( + { + name: "Catalog", + slug: "catalog", + position: 0, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + categories, + productCategoryLinks, + ), + ); + + let tailProductId = ""; + let tailProductSlug = ""; + for (let index = 0; index < 60; index += 1) { + const response = await createProductHandler( + catalogCtx( + { + type: "simple", + status: "active", + visibility: "public", + slug: `product-${String(index).padStart(2, "0")}`, + title: `Product ${index}`, + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 
index, + requiresShippingDefault: true, + }, + products, + ), + ); + if (index === 59) { + tailProductId = response.product.id; + tailProductSlug = response.product.slug; + } + } + + await createProductCategoryLinkHandler( + catalogCtx( + { + productId: tailProductId, + categoryId: category.category.id, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + categories, + productCategoryLinks, + ), + ); + + const filtered = await listProductsHandler( + catalogCtx( + { + type: "simple", + status: "active", + visibility: "public", + categoryId: category.category.id, + limit: 50, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + categories, + productCategoryLinks, + ), + ); + + expect(filtered.items.map((item) => item.product.slug)).toEqual([tailProductSlug]); + }); + + it("creates tags and filters listing by tag", async () => { + const products = new MemColl(); + const tags = new MemColl(); + const productTagLinks = new MemColl(); + + const tag = await createTagHandler( + catalogCtx( + { + name: "Featured", + slug: "featured", + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + tags, + productTagLinks, + ), + ); + + const listedTags = await listTagsHandler( + catalogCtx( + { + limit: 10, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + tags, + productTagLinks, + ), + ); + expect(listedTags.items.map((item) => item.slug)).toEqual(["featured"]); + + const tumblerProduct = await createProductHandler( + catalogCtx( + { + type: "simple", + status: "active", + visibility: "public", + slug: "tumbler", + title: "Tumbler", + shortDescription: "", + longDescription: "", + 
featured: false, + sortOrder: 0, + requiresShippingDefault: true, + }, + products, + ), + ); + await createProductHandler( + catalogCtx( + { + type: "simple", + status: "active", + visibility: "public", + slug: "matte", + title: "Matte", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 1, + requiresShippingDefault: true, + }, + products, + ), + ); + + await createProductTagLinkHandler( + catalogCtx( + { + productId: tumblerProduct.product.id, + tagId: tag.tag.id, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + tags, + productTagLinks, + ), + ); + + const filtered = await listProductsHandler( + catalogCtx( + { + type: "simple", + tagId: tag.tag.id, + limit: 10, + }, + products, + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + new MemColl(), + productTagLinks, + ), + ); + expect(filtered.items.map((item) => item.product.slug)).toEqual(["tumbler"]); + }); + + it("returns category_link_not_found when unlinking a missing product-category link", async () => { + const out = removeProductCategoryLinkHandler( + catalogCtx({ linkId: "missing-link" }, new MemColl()), + ); + await expect(out).rejects.toMatchObject({ code: "category_link_not_found" }); + }); + + it("returns tag_link_not_found when unlinking a missing product-tag link", async () => { + const out = removeProductTagLinkHandler( + catalogCtx({ linkId: "missing-link" }, new MemColl()), + ); + await expect(out).rejects.toMatchObject({ code: "tag_link_not_found" }); + }); + + it("validates category and tag schema helpers", () => { + expect( + productCreateInputSchema.safeParse({ + type: "simple", + status: "draft", + visibility: "public", + slug: "simple-with-bundle-discount", + title: "Simple with discount", + bundleDiscountType: "fixed_amount", + 
bundleDiscountValueMinor: 100, + }).success, + ).toBe(false); + expect( + productCreateInputSchema.safeParse({ + type: "bundle", + status: "draft", + visibility: "public", + slug: "bundle-with-discount", + title: "Bundle with discount", + bundleDiscountType: "fixed_amount", + bundleDiscountValueMinor: 100, + }).success, + ).toBe(true); + expect( + categoryCreateInputSchema.safeParse({ name: "Tools", slug: "tools", position: 0 }).success, + ).toBe(true); + expect(categoryListInputSchema.safeParse({}).success).toBe(true); + expect( + productCategoryLinkInputSchema.safeParse({ productId: "p", categoryId: "c" }).success, + ).toBe(true); + expect(productCategoryUnlinkInputSchema.safeParse({ linkId: "link_1" }).success).toBe(true); + expect(tagCreateInputSchema.safeParse({ name: "Gift", slug: "gift" }).success).toBe(true); + expect(tagListInputSchema.safeParse({}).success).toBe(true); + expect(productTagLinkInputSchema.safeParse({ productId: "p", tagId: "t" }).success).toBe(true); + expect(productTagUnlinkInputSchema.safeParse({ linkId: "link_1" }).success).toBe(true); + expect( + productUpdateInputSchema.safeParse({ + productId: "p", + bundleDiscountType: "percentage", + bundleDiscountValueMinor: 500, + }).success, + ).toBe(false); + expect( + productUpdateInputSchema.safeParse({ + productId: "p", + bundleDiscountType: "fixed_amount", + bundleDiscountValueBps: 100, + }).success, + ).toBe(false); + }); +}); diff --git a/packages/plugins/commerce/src/handlers/catalog.ts b/packages/plugins/commerce/src/handlers/catalog.ts new file mode 100644 index 000000000..c5b8e5f83 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/catalog.ts @@ -0,0 +1,480 @@ +/** + * Catalog management handlers for commerce plugin v1 foundation. + * + * This file implements the Phase 1 foundation slice from the catalog + * specification: products and product SKUs with basic write/read paths and + * invariant checks for catalog mutability and uniqueness constraints. 
+ */ + +import type { RouteContext } from "emdash"; + +import { type BundleComputeSummary } from "../lib/catalog-bundles.js"; +import type { + CatalogListingDTO, + ProductCategoryDTO, + ProductDetailDTO, + ProductDigitalEntitlementSummary, + ProductInventorySummaryDTO, + ProductPrimaryImageDTO, + ProductPriceRangeDTO, + ProductTagDTO, + VariantMatrixDTO, +} from "../lib/catalog-dto.js"; +import type { + ProductCreateInput, + ProductAssetLinkInput, + ProductAssetReorderInput, + ProductAssetRegisterInput, + ProductAssetUnlinkInput, + ProductSkuStateInput, + ProductSkuUpdateInput, + ProductGetInput, + ProductListInput, + ProductSkuCreateInput, + DigitalAssetCreateInput, + DigitalEntitlementCreateInput, + DigitalEntitlementRemoveInput, + BundleComponentAddInput, + BundleComponentRemoveInput, + BundleComponentReorderInput, + BundleComputeInput, + ProductStateInput, + ProductUpdateInput, + ProductSkuListInput, + CategoryCreateInput, + CategoryListInput, + ProductCategoryLinkInput, + ProductCategoryUnlinkInput, + TagCreateInput, + TagListInput, + ProductTagLinkInput, + ProductTagUnlinkInput, +} from "../schemas.js"; +import type { + StoredProduct, + StoredProductAsset, + StoredProductAssetLink, + StoredProductAttribute, + StoredProductAttributeValue, + StoredCategory, + StoredProductCategoryLink, + StoredDigitalAsset, + StoredDigitalEntitlement, + StoredProductTag, + StoredProductTagLink, + StoredBundleComponent, + StoredInventoryStock, + StoredProductSkuOptionValue, + StoredProductSku, +} from "../types.js"; +import { + handleLinkCatalogAsset, + handleReorderCatalogAsset, + handleRegisterProductAsset, + handleUnlinkCatalogAsset, +} from "./catalog-asset.js"; +import { + handleCreateCategory, + handleCreateTag, + handleListCategories, + handleCreateProductCategoryLink, + handleCreateProductTagLink, + handleRemoveProductCategoryLink, + handleRemoveProductTagLink, + handleListTags, +} from "./catalog-association.js"; +import { + handleAddBundleComponent, + 
handleBundleCompute,
+  handleRemoveBundleComponent,
+  handleReorderBundleComponent,
+} from "./catalog-bundle.js";
+import {
+  handleCreateDigitalAsset,
+  handleCreateDigitalEntitlement,
+  handleRemoveDigitalEntitlement,
+} from "./catalog-digital.js";
+import {
+  handleCreateProduct,
+  handleGetProduct,
+  handleListProducts,
+  handleSetProductState,
+  handleUpdateProduct,
+  handleCreateProductSku,
+  handleUpdateProductSku,
+  handleSetSkuStatus,
+  handleListProductSkus,
+  handleGetStorefrontProduct,
+  handleListStorefrontProducts,
+  handleListStorefrontProductSkus,
+} from "./catalog-product.js";
+function toStorefrontBundleComputeResponse(
+  response: BundleComputeSummary,
+): StorefrontBundleComputeResponse {
+  return {
+    productId: response.productId,
+    subtotalMinor: response.subtotalMinor,
+    discountType: response.discountType,
+    discountValueMinor: response.discountValueMinor,
+    discountValueBps: response.discountValueBps,
+    discountAmountMinor: response.discountAmountMinor,
+    finalPriceMinor: response.finalPriceMinor,
+    availability: response.availability,
+    components: response.components.map((component) => ({
+      componentId: component.componentId,
+      componentSkuCode: component.componentSkuCode,
+      componentPriceMinor: component.componentPriceMinor,
+      quantityPerBundle: component.quantityPerBundle,
+      subtotalContributionMinor: component.subtotalContributionMinor,
+      availableBundleQuantity: component.availableBundleQuantity,
+    })),
+  };
+}
+
+export type ProductSkuResponse = {
+  sku: StoredProductSku;
+};
+
+export type ProductSkuListResponse = {
+  items: StoredProductSku[];
+};
+
+export type ProductResponse = Omit<ProductDetailDTO, "skus" | "categories" | "tags"> & {
+  skus?: ProductDetailDTO["skus"];
+  categories?: ProductDetailDTO["categories"];
+  tags?: ProductDetailDTO["tags"];
+};
+
+export type ProductAssetResponse = {
+  asset: StoredProductAsset;
+};
+
+export type ProductAssetLinkResponse = {
+  link: StoredProductAssetLink;
+};
+
+export type ProductAssetUnlinkResponse = {
+  deleted: boolean;
+};
+
+export type DigitalAssetResponse = {
+  asset: StoredDigitalAsset;
+};
+
+export type DigitalEntitlementResponse = {
+  entitlement: StoredDigitalEntitlement;
+};
+
+export type DigitalEntitlementUnlinkResponse = {
+  deleted: boolean;
+};
+
+export type BundleComponentResponse = {
+  component: StoredBundleComponent;
+};
+
+export type BundleComponentUnlinkResponse = {
+  deleted: boolean;
+};
+
+export type BundleComputeResponse = BundleComputeSummary;
+
+export type StorefrontBundleComputeComponentSummary = Omit<
+  BundleComputeSummary["components"][number],
+  "componentSkuId" | "componentProductId"
+>;
+
+export type StorefrontBundleComputeResponse = Omit<BundleComputeSummary, "components"> & {
+  components: StorefrontBundleComputeComponentSummary[];
+};
+
+export type ProductListResponse = {
+  items: CatalogListingDTO[];
+};
+
+export type StorefrontProductAvailability = "in_stock" | "low_stock" | "out_of_stock";
+
+export type StorefrontProductRecord = {
+  id: string;
+  type: StoredProduct["type"];
+  status: StoredProduct["status"];
+  visibility: StoredProduct["visibility"];
+  slug: string;
+  title: string;
+  shortDescription: string;
+  brand?: string;
+  vendor?: string;
+  featured: boolean;
+  sortOrder: number;
+  requiresShippingDefault: boolean;
+  taxClassDefault?: string;
+  bundleDiscountType?: StoredProduct["bundleDiscountType"];
+  bundleDiscountValueMinor?: number;
+  bundleDiscountValueBps?: number;
+  createdAt: string;
+  updatedAt: string;
+};
+
+export type StorefrontVariantMatrixRow = Omit<
+  VariantMatrixDTO,
+  "inventoryQuantity" | "inventoryVersion"
+> & {
+  availability: StorefrontProductAvailability;
+};
+
+export type StorefrontSkuSummary = {
+  id: string;
+  productId: string;
+  skuCode: string;
+  status: StoredProductSku["status"];
+  unitPriceMinor: number;
+  compareAtPriceMinor?: number;
+  requiresShipping: boolean;
+  isDigital: boolean;
+  availability: StorefrontProductAvailability;
+};
+
+export type StorefrontProductDetail = {
+  product: StorefrontProductRecord;
+  skus?:
StorefrontSkuSummary[];
+  attributes?: StoredProductAttribute[];
+  variantMatrix?: StorefrontVariantMatrixRow[];
+  categories: ProductCategoryDTO[];
+  tags: ProductTagDTO[];
+  primaryImage?: ProductPrimaryImageDTO;
+  galleryImages?: ProductPrimaryImageDTO[];
+};
+
+export type StorefrontProductListResponse = {
+  items: Array<
+    Omit<CatalogListingDTO, "product"> & {
+      product: StorefrontProductRecord;
+      availability?: StorefrontProductAvailability;
+    }
+  >;
+};
+
+export type StorefrontSkuListResponse = {
+  items: StorefrontSkuSummary[];
+};
+
+export type CategoryResponse = {
+  category: StoredCategory;
+};
+
+export type CategoryListResponse = {
+  items: StoredCategory[];
+};
+
+export type ProductCategoryLinkResponse = {
+  link: StoredProductCategoryLink;
+};
+
+export type ProductCategoryLinkUnlinkResponse = {
+  deleted: boolean;
+};
+
+export type TagResponse = {
+  tag: StoredProductTag;
+};
+
+export type TagListResponse = {
+  items: StoredProductTag[];
+};
+
+export type ProductTagLinkResponse = {
+  link: StoredProductTagLink;
+};
+
+export type ProductTagLinkUnlinkResponse = {
+  deleted: boolean;
+};
+
+export async function createProductHandler(
+  ctx: RouteContext<ProductCreateInput>,
+): Promise<ProductResponse> {
+  return handleCreateProduct(ctx);
+}
+
+export async function updateProductHandler(
+  ctx: RouteContext<ProductUpdateInput>,
+): Promise<ProductResponse> {
+  return handleUpdateProduct(ctx);
+}
+
+export async function setProductStateHandler(
+  ctx: RouteContext<ProductStateInput>,
+): Promise<ProductResponse> {
+  return handleSetProductState(ctx);
+}
+
+export async function getProductHandler(
+  ctx: RouteContext<ProductGetInput>,
+): Promise<ProductResponse> {
+  return handleGetProduct(ctx);
+}
+
+export async function listProductsHandler(
+  ctx: RouteContext<ProductListInput>,
+): Promise<ProductListResponse> {
+  return handleListProducts(ctx);
+}
+
+export async function createCategoryHandler(
+  ctx: RouteContext<CategoryCreateInput>,
+): Promise<CategoryResponse> {
+  return handleCreateCategory(ctx);
+}
+
+export async function listCategoriesHandler(
+  ctx: RouteContext<CategoryListInput>,
+): Promise<CategoryListResponse> {
+  return handleListCategories(ctx);
+}
+
+export async function createProductCategoryLinkHandler(
+
ctx: RouteContext<ProductCategoryLinkInput>,
+): Promise<ProductCategoryLinkResponse> {
+  return handleCreateProductCategoryLink(ctx);
+}
+
+export async function removeProductCategoryLinkHandler(
+  ctx: RouteContext<ProductCategoryUnlinkInput>,
+): Promise<ProductCategoryLinkUnlinkResponse> {
+  return handleRemoveProductCategoryLink(ctx);
+}
+
+export async function createTagHandler(ctx: RouteContext<TagCreateInput>): Promise<TagResponse> {
+  return handleCreateTag(ctx);
+}
+
+export async function listTagsHandler(ctx: RouteContext<TagListInput>): Promise<TagListResponse> {
+  return handleListTags(ctx);
+}
+
+export async function createProductTagLinkHandler(
+  ctx: RouteContext<ProductTagLinkInput>,
+): Promise<ProductTagLinkResponse> {
+  return handleCreateProductTagLink(ctx);
+}
+
+export async function removeProductTagLinkHandler(
+  ctx: RouteContext<ProductTagUnlinkInput>,
+): Promise<ProductTagLinkUnlinkResponse> {
+  return handleRemoveProductTagLink(ctx);
+}
+
+export async function createProductSkuHandler(
+  ctx: RouteContext<ProductSkuCreateInput>,
+): Promise<ProductSkuResponse> {
+  return handleCreateProductSku(ctx);
+}
+
+export async function updateProductSkuHandler(
+  ctx: RouteContext<ProductSkuUpdateInput>,
+): Promise<ProductSkuResponse> {
+  return handleUpdateProductSku(ctx);
+}
+
+export async function setSkuStatusHandler(
+  ctx: RouteContext<ProductSkuStateInput>,
+): Promise<ProductSkuResponse> {
+  return handleSetSkuStatus(ctx);
+}
+
+export async function listProductSkusHandler(
+  ctx: RouteContext<ProductSkuListInput>,
+): Promise<ProductSkuListResponse> {
+  return handleListProductSkus(ctx);
+}
+
+export async function getStorefrontProductHandler(
+  ctx: RouteContext<ProductGetInput>,
+): Promise<StorefrontProductDetail> {
+  return handleGetStorefrontProduct(ctx);
+}
+
+export async function listStorefrontProductsHandler(
+  ctx: RouteContext<ProductListInput>,
+): Promise<StorefrontProductListResponse> {
+  return handleListStorefrontProducts(ctx);
+}
+
+export async function listStorefrontProductSkusHandler(
+  ctx: RouteContext<ProductSkuListInput>,
+): Promise<StorefrontSkuListResponse> {
+  return handleListStorefrontProductSkus(ctx);
+}
+
+export async function registerProductAssetHandler(
+  ctx: RouteContext<ProductAssetRegisterInput>,
+): Promise<ProductAssetResponse> {
+  return handleRegisterProductAsset(ctx);
+}
+
+export async function linkCatalogAssetHandler(
+  ctx: RouteContext<ProductAssetLinkInput>,
+): Promise<ProductAssetLinkResponse> {
+  return handleLinkCatalogAsset(ctx);
+}
+
+export async function unlinkCatalogAssetHandler(
+  ctx: RouteContext<ProductAssetUnlinkInput>,
+): Promise<ProductAssetUnlinkResponse> {
+  return
handleUnlinkCatalogAsset(ctx); +} + +export async function reorderCatalogAssetHandler( + ctx: RouteContext, +): Promise { + return handleReorderCatalogAsset(ctx); +} + +export async function addBundleComponentHandler( + ctx: RouteContext, +): Promise { + return handleAddBundleComponent(ctx); +} + +export async function removeBundleComponentHandler( + ctx: RouteContext, +): Promise { + return handleRemoveBundleComponent(ctx); +} + +export async function reorderBundleComponentHandler( + ctx: RouteContext, +): Promise { + return handleReorderBundleComponent(ctx); +} + +export async function bundleComputeHandler( + ctx: RouteContext, +): Promise { + return handleBundleCompute(ctx); +} + +export async function bundleComputeStorefrontHandler( + ctx: RouteContext, +): Promise { + const internal = await bundleComputeHandler(ctx); + return toStorefrontBundleComputeResponse(internal); +} + +export async function createDigitalAssetHandler( + ctx: RouteContext, +): Promise { + return handleCreateDigitalAsset(ctx); +} + +export async function createDigitalEntitlementHandler( + ctx: RouteContext, +): Promise { + return handleCreateDigitalEntitlement(ctx); +} + +export async function removeDigitalEntitlementHandler( + ctx: RouteContext, +): Promise { + return handleRemoveDigitalEntitlement(ctx); +} diff --git a/packages/plugins/commerce/src/handlers/checkout-get-order.test.ts b/packages/plugins/commerce/src/handlers/checkout-get-order.test.ts new file mode 100644 index 000000000..e1326a304 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/checkout-get-order.test.ts @@ -0,0 +1,104 @@ +import type { RouteContext } from "emdash"; +import { describe, expect, it } from "vitest"; + +import { sha256HexAsync } from "../lib/crypto-adapter.js"; +import type { CheckoutGetOrderInput } from "../schemas.js"; +import type { StoredOrder } from "../types.js"; +import { checkoutGetOrderHandler } from "./checkout-get-order.js"; + +type MemColl = { + get(id: string): Promise; + put(id: 
string, data: T): Promise<void>; + rows: Map<string, T>; +}; + +class MemCollImpl<T> implements MemColl<T> { + constructor(public readonly rows = new Map<string, T>()) {} + + async get(id: string): Promise<T | null> { + const row = this.rows.get(id); + return row ? structuredClone(row) : null; + } + + async put(id: string, data: T): Promise<void> { + this.rows.set(id, structuredClone(data)); + } +} + +function ctxFor(orderId: string, finalizeToken?: string): RouteContext<CheckoutGetOrderInput> { + return { + request: new Request("https://example.test/checkout/get-order", { method: "POST" }), + input: { orderId, finalizeToken }, + storage: { orders: new MemCollImpl<StoredOrder>() }, + } as unknown as RouteContext<CheckoutGetOrderInput>; +} + +describe("checkoutGetOrderHandler", () => { + const now = "2026-04-03T12:00:00.000Z"; + const token = "a".repeat(32); + const orderBase: StoredOrder = { + cartId: "cart_1", + paymentPhase: "payment_pending", + currency: "USD", + finalizeTokenHash: "placeholder-finalize-token-hash", + lineItems: [ + { + productId: "p1", + quantity: 1, + inventoryVersion: 1, + unitPriceMinor: 100, + }, + ], + totalMinor: 100, + createdAt: now, + updatedAt: now, + }; + + it("returns a public order snapshot when finalize token matches", async () => { + const orderId = "ord_1"; + const order: StoredOrder = { + ...orderBase, + finalizeTokenHash: await sha256HexAsync(token), + }; + const mem = new MemCollImpl<StoredOrder>(new Map([[orderId, order]])); + const out = await checkoutGetOrderHandler({ + ...ctxFor(orderId, token), + storage: { orders: mem }, + } as unknown as RouteContext<CheckoutGetOrderInput>); + + expect(out.order).toEqual({ + cartId: order.cartId, + paymentPhase: order.paymentPhase, + currency: order.currency, + lineItems: order.lineItems, + totalMinor: order.totalMinor, + createdAt: order.createdAt, + updatedAt: order.updatedAt, + }); + expect("finalizeTokenHash" in out.order).toBe(false); + }); + + it("rejects missing token when order requires one", async () => { + const orderId = "ord_2"; + const order: StoredOrder = { ...orderBase, finalizeTokenHash: await
sha256HexAsync(token) }; + const mem = new MemCollImpl(new Map([[orderId, order]])); + await expect( + checkoutGetOrderHandler({ + ...ctxFor(orderId), + storage: { orders: mem }, + } as unknown as RouteContext), + ).rejects.toMatchObject({ code: "order_token_required" }); + }); + + it("rejects wrong finalize token when order requires one", async () => { + const orderId = "ord_3"; + const order: StoredOrder = { ...orderBase, finalizeTokenHash: await sha256HexAsync(token) }; + const mem = new MemCollImpl(new Map([[orderId, order]])); + await expect( + checkoutGetOrderHandler({ + ...ctxFor(orderId, "wrong_finalization_token_1234567890"), + storage: { orders: mem }, + } as unknown as RouteContext), + ).rejects.toMatchObject({ code: "order_token_invalid" }); + }); +}); diff --git a/packages/plugins/commerce/src/handlers/checkout-get-order.ts b/packages/plugins/commerce/src/handlers/checkout-get-order.ts new file mode 100644 index 000000000..2d6ec11b8 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/checkout-get-order.ts @@ -0,0 +1,57 @@ +/** + * Read-only order snapshot for storefront SSR (Astro) and form posts. + * Every order carries `finalizeTokenHash` (checkout always sets it), and callers + * must present the raw `finalizeToken` to read it. 
+ */ + +import type { RouteContext } from "emdash"; + +import { equalSha256HexDigestAsync, sha256HexAsync } from "../lib/crypto-adapter.js"; +import { requirePost } from "../lib/require-post.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { CheckoutGetOrderInput } from "../schemas.js"; +import type { StoredOrder } from "../types.js"; +import { asCollection } from "./catalog-conflict.js"; + +export type CheckoutGetOrderResponse = { + order: Omit<StoredOrder, "finalizeTokenHash">; +}; + +function toPublicOrder(order: StoredOrder): CheckoutGetOrderResponse["order"] { + const { finalizeTokenHash: _omit, ...rest } = order; + return rest; +} + +export async function checkoutGetOrderHandler( + ctx: RouteContext<CheckoutGetOrderInput>, +): Promise<CheckoutGetOrderResponse> { + requirePost(ctx); + + const orders = asCollection<StoredOrder>(ctx.storage.orders); + const order = await orders.get(ctx.input.orderId); + if (!order) { + throwCommerceApiError({ code: "ORDER_NOT_FOUND", message: "Order not found" }); + } + + const expectedHash = order.finalizeTokenHash; + if (!expectedHash) { + throwCommerceApiError({ code: "ORDER_NOT_FOUND", message: "Order token missing from storage" }); + } + + const token = ctx.input.finalizeToken?.trim(); + if (!token) { + throwCommerceApiError({ + code: "ORDER_TOKEN_REQUIRED", + message: "finalizeToken is required to read this order", + }); + } + const digest = await sha256HexAsync(token); + if (!(await equalSha256HexDigestAsync(digest, expectedHash))) { + throwCommerceApiError({ + code: "ORDER_TOKEN_INVALID", + message: "Invalid finalize token for this order", + }); + } + + return { order: toPublicOrder(order) }; +} diff --git a/packages/plugins/commerce/src/handlers/checkout-state.test.ts b/packages/plugins/commerce/src/handlers/checkout-state.test.ts new file mode 100644 index 000000000..bf52ada66 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/checkout-state.test.ts @@ -0,0 +1,494 @@ +import type { StorageCollection } from "emdash"; +import { describe, expect, it } from "vitest"; + +import {
sha256HexAsync } from "../lib/crypto-adapter.js"; +import type { StoredIdempotencyKey, StoredOrder, StoredPaymentAttempt } from "../types.js"; +import { + CHECKOUT_PENDING_KIND, + CHECKOUT_ROUTE, + type CheckoutPendingState, + computeCheckoutReplayIntegrity, + deterministicOrderId, + deterministicPaymentAttemptId, + decideCheckoutReplayState, + restorePendingCheckout, + resolvePaymentProviderId, + validateCachedCheckoutCompleted, +} from "./checkout-state.js"; + +type MemCollection = { + get(id: string): Promise; + put(id: string, data: T): Promise; + rows: Map; +}; +function asStorageCollection(collection: MemCollection): StorageCollection { + return collection as unknown as StorageCollection; +} + +class MemColl implements MemCollection { + constructor(public readonly rows = new Map()) {} + + async get(id: string): Promise { + const row = this.rows.get(id); + return row ? structuredClone(row) : null; + } + + async put(id: string, data: T): Promise { + this.rows.set(id, structuredClone(data)); + } +} + +const NOW = "2026-04-02T12:00:00.000Z"; +const REPLAY_INTEGRITY_HEX64 = /^[a-f0-9]{64}$/; + +function checkoutPendingFixture( + overrides: Partial = {}, +): CheckoutPendingState { + return { + kind: CHECKOUT_PENDING_KIND, + orderId: "order-1", + paymentAttemptId: "attempt-1", + providerId: "stripe", + cartId: "cart-1", + paymentPhase: "payment_pending", + finalizeToken: "pending-token-123", + totalMinor: 1500, + currency: "USD", + lineItems: [ + { + productId: "p-1", + variantId: "v-1", + quantity: 2, + inventoryVersion: 4, + unitPriceMinor: 750, + }, + ], + createdAt: NOW, + ...overrides, + }; +} + +describe("decideCheckoutReplayState", () => { + it("returns not_cached when there is no idempotency row", () => { + expect(decideCheckoutReplayState(null)).toEqual({ kind: "not_cached" }); + }); + + it("returns not_cached when cached body is not a recognized response", () => { + const cached = { + route: CHECKOUT_ROUTE, + keyHash: "k1", + httpStatus: 200, + 
responseBody: { random: "payload" }, + createdAt: NOW, + } as unknown as StoredIdempotencyKey; + expect(decideCheckoutReplayState(cached)).toEqual({ kind: "not_cached" }); + }); + + it("returns cached_completed for finalized idempotency payload", () => { + const cachedResponse = { + orderId: "order-1", + paymentPhase: "payment_pending" as const, + paymentAttemptId: "attempt-1", + totalMinor: 1500, + currency: "USD", + finalizeToken: "pending-token-123", + replayIntegrity: "1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef", + }; + const cached = { + route: CHECKOUT_ROUTE, + keyHash: "k2", + httpStatus: 200, + responseBody: cachedResponse, + createdAt: NOW, + } as StoredIdempotencyKey; + + expect(decideCheckoutReplayState(cached)).toMatchObject({ + kind: "cached_completed", + response: { + orderId: "order-1", + paymentPhase: "payment_pending", + paymentAttemptId: "attempt-1", + totalMinor: 1500, + currency: "USD", + finalizeToken: "pending-token-123", + replayIntegrity: cachedResponse.replayIntegrity, + }, + }); + }); + + it("returns not_cached when replayIntegrity is missing from completed payload", () => { + const cached = { + route: CHECKOUT_ROUTE, + keyHash: "k2", + httpStatus: 200, + responseBody: { + orderId: "order-1", + paymentPhase: "payment_pending", + paymentAttemptId: "attempt-1", + totalMinor: 1500, + currency: "USD", + finalizeToken: "pending-token-123", + }, + createdAt: NOW, + } as unknown as StoredIdempotencyKey; + + expect(decideCheckoutReplayState(cached)).toEqual({ kind: "not_cached" }); + }); + + it("returns cached_pending for pending checkout recovery payload", () => { + const pending = checkoutPendingFixture(); + const cached = { + route: CHECKOUT_ROUTE, + keyHash: "k3", + httpStatus: 202, + responseBody: pending, + createdAt: NOW, + } as StoredIdempotencyKey; + + const decision = decideCheckoutReplayState(cached); + expect(decision).toMatchObject({ + kind: "cached_pending", + pending: pending, + }); + }); +}); + 
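The tests above treat `replayIntegrity` as a 64-hex-character seal over the completed-checkout payload. As a minimal, self-contained sketch of that idea (assuming Node's `node:crypto`; `replayIntegritySketch` is a hypothetical stand-in, not the plugin's `computeCheckoutReplayIntegrity`, though it mirrors its pipe-joined input format), any drift in a sealed field produces a different digest:

```typescript
import { createHash } from "node:crypto";

// Hypothetical stand-in: SHA-256 over the idempotency key hash plus the
// pipe-joined fields of a completed checkout response.
function replayIntegritySketch(
  keyHash: string,
  r: {
    orderId: string;
    paymentAttemptId: string;
    totalMinor: number;
    currency: string;
    paymentPhase: string;
    finalizeToken: string;
  },
): string {
  return createHash("sha256")
    .update(
      `${keyHash}|${r.orderId}|${r.paymentAttemptId}|${r.totalMinor}|${r.currency}|${r.paymentPhase}|${r.finalizeToken}`,
    )
    .digest("hex");
}

const base = {
  orderId: "order-1",
  paymentAttemptId: "attempt-1",
  totalMinor: 1500,
  currency: "USD",
  paymentPhase: "payment_pending",
  finalizeToken: "pending-token-123",
};
// Same inputs always produce the same 64-hex-char seal; a single changed
// field (here totalMinor) yields a different one.
const seal = replayIntegritySketch("k1", base);
const tampered = replayIntegritySketch("k1", { ...base, totalMinor: 1501 });
```

Because the seal also folds in `keyHash`, a cached response cannot be replayed under a different idempotency key even if every order field matches.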
+describe("restorePendingCheckout", () => { + it("reconstructs missing order + attempt, then promotes cache response to completed", async () => { + const pending = checkoutPendingFixture(); + const cached: StoredIdempotencyKey = { + route: CHECKOUT_ROUTE, + keyHash: "k4", + httpStatus: 202, + responseBody: pending, + createdAt: NOW, + }; + const orders = new MemColl(); + const attempts = new MemColl(); + const idempotencyKeys = new MemColl(); + + const response = await restorePendingCheckout( + "idemp:abc", + cached, + pending, + NOW, + asStorageCollection(idempotencyKeys), + asStorageCollection(orders), + asStorageCollection(attempts), + ); + + expect(response).toMatchObject({ + orderId: pending.orderId, + paymentPhase: "payment_pending", + paymentAttemptId: pending.paymentAttemptId, + totalMinor: pending.totalMinor, + currency: pending.currency, + finalizeToken: pending.finalizeToken, + }); + expect(response.replayIntegrity).toMatch(REPLAY_INTEGRITY_HEX64); + const order = await orders.get(pending.orderId); + expect(order).toEqual({ + cartId: pending.cartId, + paymentPhase: pending.paymentPhase, + currency: pending.currency, + lineItems: pending.lineItems, + totalMinor: pending.totalMinor, + finalizeTokenHash: expect.any(String), + createdAt: pending.createdAt, + updatedAt: NOW, + }); + const attempt = await attempts.get(pending.paymentAttemptId); + expect(attempt).toEqual({ + orderId: pending.orderId, + providerId: "stripe", + status: "pending", + createdAt: pending.createdAt, + updatedAt: NOW, + }); + const completedRow = await idempotencyKeys.get("idemp:abc"); + expect(completedRow?.httpStatus).toBe(200); + expect(completedRow?.responseBody).toMatchObject({ + orderId: pending.orderId, + paymentAttemptId: pending.paymentAttemptId, + paymentPhase: "payment_pending", + currency: "USD", + }); + }); + + it("keeps existing order and attempt when they already exist", async () => { + const pending = checkoutPendingFixture(); + const cached: StoredIdempotencyKey = { + 
route: CHECKOUT_ROUTE, + keyHash: "k5", + httpStatus: 202, + responseBody: pending, + createdAt: NOW, + }; + const existingOrder: StoredOrder = { + cartId: pending.cartId, + paymentPhase: "payment_pending", + currency: "USD", + lineItems: pending.lineItems, + totalMinor: 1500, + finalizeTokenHash: await sha256HexAsync(pending.finalizeToken), + createdAt: "2026-04-01T00:00:00.000Z", + updatedAt: "2026-04-01T00:00:00.000Z", + }; + const existingAttempt: StoredPaymentAttempt = { + orderId: pending.orderId, + providerId: "stripe", + status: "pending", + createdAt: "2026-04-01T00:00:00.000Z", + updatedAt: "2026-04-01T00:00:00.000Z", + }; + const orders = new MemColl(new Map([[pending.orderId, existingOrder]])); + const attempts = new MemColl( + new Map([[pending.paymentAttemptId, existingAttempt]]), + ); + const idempotencyKeys = new MemColl(); + + const response = await restorePendingCheckout( + "idemp:existing", + cached, + pending, + NOW, + asStorageCollection(idempotencyKeys), + asStorageCollection(orders), + asStorageCollection(attempts), + ); + + expect(response).toMatchObject({ + orderId: pending.orderId, + paymentAttemptId: pending.paymentAttemptId, + }); + expect(response.replayIntegrity).toMatch(REPLAY_INTEGRITY_HEX64); + expect(await orders.get(pending.orderId)).toEqual(existingOrder); + expect(await attempts.get(pending.paymentAttemptId)).toEqual(existingAttempt); + }); + + it("fails replay restore if existing order no longer matches pending payload", async () => { + const pending = checkoutPendingFixture(); + const cached: StoredIdempotencyKey = { + route: CHECKOUT_ROUTE, + keyHash: "k6", + httpStatus: 202, + responseBody: pending, + createdAt: NOW, + }; + const existingOrder: StoredOrder = { + cartId: "other-cart", + paymentPhase: "payment_pending", + currency: "USD", + lineItems: pending.lineItems, + totalMinor: pending.totalMinor, + finalizeTokenHash: await sha256HexAsync(pending.finalizeToken), + createdAt: NOW, + updatedAt: NOW, + }; + const 
existingAttempt: StoredPaymentAttempt = { + orderId: pending.orderId, + providerId: resolvePaymentProviderId(pending.providerId), + status: "pending", + createdAt: NOW, + updatedAt: NOW, + }; + const orders = new MemColl(new Map([[pending.orderId, existingOrder]])); + const attempts = new MemColl( + new Map([[pending.paymentAttemptId, existingAttempt]]), + ); + const idempotencyKeys = new MemColl(); + + await expect( + restorePendingCheckout( + "idemp:order-mismatch", + cached, + pending, + NOW, + asStorageCollection(idempotencyKeys), + asStorageCollection(orders), + asStorageCollection(attempts), + ), + ).rejects.toMatchObject({ code: "order_state_conflict" }); + expect(await idempotencyKeys.get("idemp:order-mismatch")).toBeNull(); + expect(await orders.get(pending.orderId)).toEqual(existingOrder); + expect(await attempts.get(pending.paymentAttemptId)).toEqual(existingAttempt); + }); + + it("fails replay restore if existing attempt no longer matches pending payload", async () => { + const pending = checkoutPendingFixture(); + const cached: StoredIdempotencyKey = { + route: CHECKOUT_ROUTE, + keyHash: "k7", + httpStatus: 202, + responseBody: pending, + createdAt: NOW, + }; + const existingOrder: StoredOrder = { + cartId: pending.cartId, + paymentPhase: pending.paymentPhase, + currency: pending.currency, + lineItems: pending.lineItems, + totalMinor: pending.totalMinor, + finalizeTokenHash: await sha256HexAsync(pending.finalizeToken), + createdAt: NOW, + updatedAt: NOW, + }; + const existingAttempt: StoredPaymentAttempt = { + orderId: pending.orderId, + providerId: resolvePaymentProviderId(pending.providerId), + status: "succeeded", + createdAt: NOW, + updatedAt: NOW, + }; + const orders = new MemColl(new Map([[pending.orderId, existingOrder]])); + const attempts = new MemColl( + new Map([[pending.paymentAttemptId, existingAttempt]]), + ); + const idempotencyKeys = new MemColl(); + + await expect( + restorePendingCheckout( + "idemp:attempt-mismatch", + cached, + 
pending, + NOW, + asStorageCollection(idempotencyKeys), + asStorageCollection(orders), + asStorageCollection(attempts), + ), + ).rejects.toMatchObject({ code: "order_state_conflict" }); + expect(await idempotencyKeys.get("idemp:attempt-mismatch")).toBeNull(); + expect(await orders.get(pending.orderId)).toEqual(existingOrder); + expect(await attempts.get(pending.paymentAttemptId)).toEqual(existingAttempt); + }); +}); + +describe("validateCachedCheckoutCompleted", () => { + it("returns false when order or attempt is missing", async () => { + const cached = { + orderId: "o1", + paymentPhase: "payment_pending" as const, + paymentAttemptId: "a1", + totalMinor: 100, + currency: "USD", + finalizeToken: "tok_______________________________", + replayIntegrity: "1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef", + }; + expect(await validateCachedCheckoutCompleted("kh", cached, null, null)).toBe(false); + }); + + it("returns false when replayIntegrity is missing", async () => { + const token = "tok_______________________________"; + const order: StoredOrder = { + cartId: "c1", + paymentPhase: "payment_pending", + currency: "USD", + lineItems: [], + totalMinor: 100, + finalizeTokenHash: await sha256HexAsync(token), + createdAt: NOW, + updatedAt: NOW, + }; + const attempt: StoredPaymentAttempt = { + orderId: "o1", + providerId: "stripe", + status: "pending", + createdAt: NOW, + updatedAt: NOW, + }; + const cached = { + orderId: "o1", + paymentPhase: "payment_pending" as const, + paymentAttemptId: "a1", + totalMinor: 100, + currency: "USD", + finalizeToken: token, + }; + expect(await validateCachedCheckoutCompleted("kh", cached as never, order, attempt)).toBe( + false, + ); + }); + + it("returns false when replayIntegrity does not match payload", async () => { + const token = "tok_______________________________"; + const order: StoredOrder = { + cartId: "c1", + paymentPhase: "payment_pending", + currency: "USD", + lineItems: [], + totalMinor: 100, + 
finalizeTokenHash: await sha256HexAsync(token), + createdAt: NOW, + updatedAt: NOW, + }; + const attempt: StoredPaymentAttempt = { + orderId: "o1", + providerId: "stripe", + status: "pending", + createdAt: NOW, + updatedAt: NOW, + }; + const cached = { + orderId: "o1", + paymentPhase: "payment_pending" as const, + paymentAttemptId: "a1", + totalMinor: 100, + currency: "USD", + finalizeToken: token, + replayIntegrity: "deadbeef", + }; + expect(await validateCachedCheckoutCompleted("kh", cached, order, attempt)).toBe(false); + }); + + it("returns true when replayIntegrity matches and rows align", async () => { + const token = "tok_______________________________"; + const keyHash = "keyh"; + const order: StoredOrder = { + cartId: "c1", + paymentPhase: "payment_pending", + currency: "USD", + lineItems: [], + totalMinor: 100, + finalizeTokenHash: await sha256HexAsync(token), + createdAt: NOW, + updatedAt: NOW, + }; + const attempt: StoredPaymentAttempt = { + orderId: "o1", + providerId: "stripe", + status: "pending", + createdAt: NOW, + updatedAt: NOW, + }; + const cached = { + orderId: "o1", + paymentPhase: "payment_pending" as const, + paymentAttemptId: "a1", + totalMinor: 100, + currency: "USD", + finalizeToken: token, + }; + const replayIntegrity = await computeCheckoutReplayIntegrity(keyHash, cached); + expect( + await validateCachedCheckoutCompleted( + keyHash, + { ...cached, replayIntegrity }, + order, + attempt, + ), + ).toBe(true); + }); +}); + +describe("checkout id helpers", () => { + it("normalizes payment provider ids", () => { + expect(resolvePaymentProviderId(undefined)).toBe("stripe"); + expect(resolvePaymentProviderId(" ")).toBe("stripe"); + expect(resolvePaymentProviderId("paypal")).toBe("paypal"); + }); + + it("builds deterministic ids from checkout hash keys", () => { + expect(deterministicOrderId("abc123")).toBe("checkout-order:abc123"); + expect(deterministicPaymentAttemptId("abc123")).toBe("checkout-attempt:abc123"); + }); +}); diff --git 
a/packages/plugins/commerce/src/handlers/checkout-state.ts b/packages/plugins/commerce/src/handlers/checkout-state.ts new file mode 100644 index 000000000..3a25213fd --- /dev/null +++ b/packages/plugins/commerce/src/handlers/checkout-state.ts @@ -0,0 +1,268 @@ +import type { StorageCollection } from "emdash"; + +import { sha256HexAsync } from "../lib/crypto-adapter.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { CheckoutInput } from "../schemas.js"; +import { resolvePaymentProviderId as resolvePaymentProviderIdFromContracts } from "../services/commerce-provider-contracts.js"; +import type { + StoredIdempotencyKey, + StoredOrder, + StoredPaymentAttempt, + OrderLineItem, +} from "../types.js"; + +export const CHECKOUT_ROUTE = "checkout"; +export const CHECKOUT_PENDING_KIND = "checkout_pending"; + +export type CheckoutPendingState = { + kind: typeof CHECKOUT_PENDING_KIND; + orderId: string; + paymentAttemptId: string; + providerId?: string; + cartId: string; + paymentPhase: "payment_pending"; + finalizeToken: string; + totalMinor: number; + currency: string; + lineItems: OrderLineItem[]; + createdAt: string; +}; + +export type CheckoutResponse = { + orderId: string; + paymentPhase: "payment_pending"; + paymentAttemptId: string; + totalMinor: number; + currency: string; + finalizeToken: string; + /** + * Replay seal persisted on completed cache entries. Required for replay validation + * and reconstructed checkpoints, but omitted from client wire responses. + */ + replayIntegrity?: string; +}; + +/** Wire shape returned to clients (no internal replay seal). 
*/ +export type CheckoutClientResponse = Omit<CheckoutResponse, "replayIntegrity">; + +export function toCheckoutClientResponse(response: CheckoutResponse): CheckoutClientResponse { + const { replayIntegrity: _replayIntegrity, ...out } = response; + return out; +} + +export type CheckoutReplayDecision = + | { kind: "cached_completed"; response: CheckoutResponse } + | { kind: "cached_pending"; pending: CheckoutPendingState } + | { kind: "not_cached" }; + +export const resolvePaymentProviderId = resolvePaymentProviderIdFromContracts; + +export function isObjectLike(value: unknown): value is Record<string, unknown> { + return !!value && typeof value === "object" && !Array.isArray(value); +} + +export function isCheckoutCompletedResponse(value: unknown): value is CheckoutResponse { + if (!isObjectLike(value)) return false; + const candidate = value as Record<string, unknown>; + return ( + candidate.kind !== CHECKOUT_PENDING_KIND && + typeof candidate.orderId === "string" && + candidate.paymentPhase === "payment_pending" && + typeof candidate.paymentAttemptId === "string" && + typeof candidate.totalMinor === "number" && + typeof candidate.currency === "string" && + typeof candidate.finalizeToken === "string" && + candidate.cartId === undefined && + candidate.lineItems === undefined && + typeof candidate.replayIntegrity === "string" && + candidate.replayIntegrity.length > 0 + ); +} + +export function isCheckoutPendingState(value: unknown): value is CheckoutPendingState { + if (!isObjectLike(value)) return false; + const candidate = value as Record<string, unknown>; + return ( + candidate.kind === CHECKOUT_PENDING_KIND && + typeof candidate.orderId === "string" && + typeof candidate.paymentAttemptId === "string" && + typeof candidate.cartId === "string" && + candidate.paymentPhase === "payment_pending" && + typeof candidate.finalizeToken === "string" && + typeof candidate.totalMinor === "number" && + typeof candidate.currency === "string" && + Array.isArray(candidate.lineItems) + ); +} +
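The three-way classification these guards drive (a pending recovery payload carries an explicit `kind` discriminant; a completed response is only trusted when its replay seal is present) can be sketched in isolation with simplified local types. This is an illustrative reduction under those assumptions, not the plugin's actual guard code:

```typescript
// Simplified stand-in for the cached-idempotency-row classification.
type ReplayKind = "not_cached" | "cached_pending" | "cached_completed";

function classify(body: unknown): ReplayKind {
  // Anything non-object-like is ignored and recomputed from scratch.
  if (typeof body !== "object" || body === null || Array.isArray(body)) {
    return "not_cached";
  }
  const c = body as Record<string, unknown>;
  // Pending recovery payloads carry an explicit discriminant.
  if (c.kind === "checkout_pending" && typeof c.orderId === "string") {
    return "cached_pending";
  }
  // Completed responses are only trusted when the replay seal is present.
  if (
    typeof c.orderId === "string" &&
    typeof c.replayIntegrity === "string" &&
    c.replayIntegrity.length > 0
  ) {
    return "cached_completed";
  }
  return "not_cached";
}
```

A completed-looking payload without the seal deliberately falls through to `not_cached`, which is the behavior the "returns not_cached when replayIntegrity is missing" test pins down.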
+export function decideCheckoutReplayState( + response: StoredIdempotencyKey | null, +): CheckoutReplayDecision { + if (!response) return { kind: "not_cached" }; + if (isCheckoutCompletedResponse(response.responseBody)) { + return { kind: "cached_completed", response: response.responseBody }; + } + if (isCheckoutPendingState(response.responseBody)) { + return { kind: "cached_pending", pending: response.responseBody }; + } + return { kind: "not_cached" }; +} + +function checkoutResponseFromPendingState(state: CheckoutPendingState): CheckoutResponse { + return { + orderId: state.orderId, + paymentPhase: "payment_pending", + paymentAttemptId: state.paymentAttemptId, + totalMinor: state.totalMinor, + currency: state.currency, + finalizeToken: state.finalizeToken, + }; +} + +type CheckoutReplayIntegrityInput = Pick< + CheckoutResponse, + "orderId" | "paymentAttemptId" | "totalMinor" | "currency" | "paymentPhase" | "finalizeToken" +>; + +/** Deterministic seal for completed-checkout idempotency replay validation. */ +export async function computeCheckoutReplayIntegrity( + keyHash: string, + response: CheckoutReplayIntegrityInput, +): Promise { + return sha256HexAsync( + `${keyHash}|${response.orderId}|${response.paymentAttemptId}|${response.totalMinor}|${response.currency}|${response.paymentPhase}|${response.finalizeToken}`, + ); +} + +/** + * Returns true when cached completed response matches live order + attempt rows. + * `replayIntegrity` must be present for a completed response to be accepted. 
+ */ +export async function validateCachedCheckoutCompleted( + keyHash: string, + cached: CheckoutResponse, + order: StoredOrder | null, + attempt: StoredPaymentAttempt | null, +): Promise { + if (!order || !attempt) return false; + if (attempt.orderId !== cached.orderId) return false; + if (order.paymentPhase !== cached.paymentPhase) return false; + if (order.totalMinor !== cached.totalMinor) return false; + if (order.currency !== cached.currency) return false; + if ((await sha256HexAsync(cached.finalizeToken)) !== order.finalizeTokenHash) return false; + if (!cached.replayIntegrity || cached.replayIntegrity.length === 0) return false; + + const expected = await computeCheckoutReplayIntegrity(keyHash, cached); + if (expected !== cached.replayIntegrity) return false; + return true; +} + +export async function restorePendingCheckout( + idempotencyDocId: string, + cached: StoredIdempotencyKey, + pending: CheckoutPendingState, + nowIso: string, + idempotencyKeys: StorageCollection, + orders: StorageCollection, + attempts: StorageCollection, +): Promise { + const expectedProviderId = resolvePaymentProviderId(pending.providerId); + const finalizeTokenHash = await sha256HexAsync(pending.finalizeToken); + + const existingOrder = await orders.get(pending.orderId); + if (!existingOrder) { + await orders.put(pending.orderId, { + cartId: pending.cartId, + paymentPhase: pending.paymentPhase, + currency: pending.currency, + lineItems: pending.lineItems, + totalMinor: pending.totalMinor, + finalizeTokenHash, + createdAt: pending.createdAt, + updatedAt: nowIso, + }); + } else { + const orderLineItemsMatch = + existingOrder.lineItems.length === pending.lineItems.length && + existingOrder.lineItems.every((existingItem, index) => { + const pendingItem = pending.lineItems[index]; + if (!pendingItem) return false; + return ( + existingItem.productId === pendingItem.productId && + existingItem.variantId === pendingItem.variantId && + existingItem.quantity === pendingItem.quantity && + 
existingItem.inventoryVersion === pendingItem.inventoryVersion && + existingItem.unitPriceMinor === pendingItem.unitPriceMinor + ); + }); + + if ( + existingOrder.cartId !== pending.cartId || + existingOrder.paymentPhase !== pending.paymentPhase || + existingOrder.currency !== pending.currency || + existingOrder.totalMinor !== pending.totalMinor || + existingOrder.finalizeTokenHash !== finalizeTokenHash || + !orderLineItemsMatch + ) { + throwCommerceApiError({ + code: "ORDER_STATE_CONFLICT", + message: "Cached checkout recovery state no longer matches current order", + details: { + idempotencyKey: idempotencyDocId, + orderId: pending.orderId, + }, + }); + } + } + + const existingAttempt = await attempts.get(pending.paymentAttemptId); + if (!existingAttempt) { + await attempts.put(pending.paymentAttemptId, { + orderId: pending.orderId, + providerId: expectedProviderId, + status: "pending", + createdAt: pending.createdAt, + updatedAt: nowIso, + }); + } else if ( + existingAttempt.orderId !== pending.orderId || + existingAttempt.providerId !== expectedProviderId || + existingAttempt.status !== "pending" + ) { + throwCommerceApiError({ + code: "ORDER_STATE_CONFLICT", + message: "Cached checkout recovery state no longer matches current payment attempt", + details: { + idempotencyKey: idempotencyDocId, + paymentAttemptId: pending.paymentAttemptId, + }, + }); + } + + const base = checkoutResponseFromPendingState(pending); + const replayIntegrity = await computeCheckoutReplayIntegrity(cached.keyHash, base); + const response: CheckoutResponse = { ...base, replayIntegrity }; + await idempotencyKeys.put(idempotencyDocId, { + ...cached, + httpStatus: 200, + responseBody: response, + }); + return response; +} + +export function deterministicOrderId(keyHash: string): string { + return `checkout-order:${keyHash}`; +} + +export function deterministicPaymentAttemptId(keyHash: string): string { + return `checkout-attempt:${keyHash}`; +} + +export type CheckoutStateInput = 
CheckoutInput & { + idempotencyRouteKey: string; + cartFingerprint: string; + cartUpdatedAt: string; + nowIso: string; +}; diff --git a/packages/plugins/commerce/src/handlers/checkout.test.ts b/packages/plugins/commerce/src/handlers/checkout.test.ts new file mode 100644 index 000000000..9e56c0a11 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/checkout.test.ts @@ -0,0 +1,1515 @@ +import type { RouteContext } from "emdash"; +import { beforeEach, describe, expect, it, vi } from "vitest"; + +import { COMMERCE_LIMITS } from "../kernel/limits.js"; +import { cartContentFingerprint } from "../lib/cart-fingerprint.js"; +import { sha256HexAsync } from "../lib/crypto-adapter.js"; +import { inventoryStockDocId } from "../orchestration/finalize-payment.js"; +import type { CheckoutInput } from "../schemas.js"; +import type { + StoredCart, + StoredIdempotencyKey, + StoredDigitalAsset, + StoredDigitalEntitlement, + StoredProduct, + StoredProductAsset, + StoredProductAssetLink, + StoredBundleComponent, + StoredProductSku, + StoredProductSkuOptionValue, + StoredInventoryStock, + StoredOrder, + StoredPaymentAttempt, +} from "../types.js"; +import { + CHECKOUT_ROUTE, + deterministicOrderId, + deterministicPaymentAttemptId, +} from "./checkout-state.js"; +import { checkoutHandler } from "./checkout.js"; + +function asRouteContext(context: unknown): RouteContext { + return context as RouteContext; +} + +function asMemCollection<T>(collection: unknown): MemColl<T> { + return collection as MemColl<T>; +} + +const consumeKvRateLimit = vi.fn(async (_opts?: unknown) => true); +vi.mock("../lib/rate-limit-kv.js", () => ({ + __esModule: true, + consumeKvRateLimit: (opts: unknown) => consumeKvRateLimit(opts), +})); + +type MemCollection<T> = { + get(id: string): Promise<T | null>; + put(id: string, data: T): Promise<void>; + query?(options?: { where?: Record<string, unknown>; limit?: number }): Promise<{ + items: Array<{ id: string; data: T }>; + hasMore: boolean; + }>; + rows: Map<string, T>; +}; + +class MemColl<T = any> implements MemCollection<T>
{ + constructor(public readonly rows = new Map<string, T>()) {} + + async get(id: string): Promise<T | null> { + const row = this.rows.get(id); + return row ? structuredClone(row) : null; + } + + async put(id: string, data: T): Promise<void> { + this.rows.set(id, structuredClone(data)); + } + + async query( + options: { where?: Record<string, unknown>; limit?: number } = {}, + ): Promise<{ items: Array<{ id: string; data: T }>; hasMore: boolean }> { + const where = options.where ?? {}; + const limit = options.limit; + let items = Array.from(this.rows.entries(), ([id, data]) => ({ id, data })); + for (const [field, value] of Object.entries(where)) { + items = items.filter((item) => (item.data as Record<string, unknown>)[field] === value); + } + if (typeof limit === "number") { + items = items.slice(0, limit); + } + return { items, hasMore: false }; + } +} + +/** Default catalog product for checkout tests that do not seed `products`. */ +class DefaultProductsColl extends MemColl<StoredProduct> { + override async get(id: string): Promise<StoredProduct | null> { + const row = this.rows.get(id); + if (row) return structuredClone(row); + const now = "2026-01-01T00:00:00.000Z"; + return { + id, + type: "simple", + status: "active", + visibility: "public", + slug: id, + title: id, + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: now, + updatedAt: now, + }; + } +} + +class MemKv { + store = new Map<string, unknown>(); + + async get<T>(key: string): Promise<T | null> { + const row = this.store.get(key); + return row === undefined ?
null : (row as T); + } + + async set<T>(key: string, value: T): Promise<void> { + this.store.set(key, value); + } +} + +function oneTimePutFailure<T>( + collection: MemColl<T>, + failCallNumber = 2, +): MemColl<T> { + let callCount = 0; + return { + get rows() { + return collection.rows; + }, + get: (id: string) => collection.get(id), + put: async (id: string, data: T): Promise<void> => { + callCount += 1; + if (callCount === failCallNumber) { + throw new Error("simulated idempotency persistence failure"); + } + await collection.put(id, data); + }, + } as MemColl<T>; +} + +function contextFor({ + idempotencyKeys, + orders, + paymentAttempts, + carts, + inventoryStock, + kv, + idempotencyKey, + cartId, + ownerToken, + requestMethod = "POST", + ip = "127.0.0.1", + extras, +}: { + idempotencyKeys: MemCollection<StoredIdempotencyKey>; + orders: MemCollection<StoredOrder>; + paymentAttempts: MemCollection<StoredPaymentAttempt>; + carts: MemCollection<StoredCart>; + inventoryStock: MemCollection<StoredInventoryStock>; + kv: MemKv; + idempotencyKey: string; + cartId: string; + ownerToken?: string; + requestMethod?: string; + ip?: string; + extras?: { + products?: MemCollection<StoredProduct>; + productSkus?: MemCollection<StoredProductSku>; + productSkuOptionValues?: MemCollection<StoredProductSkuOptionValue>; + digitalAssets?: MemCollection<StoredDigitalAsset>; + digitalEntitlements?: MemCollection<StoredDigitalEntitlement>; + productAssetLinks?: MemCollection<StoredProductAssetLink>; + productAssets?: MemCollection<StoredProductAsset>; + bundleComponents?: MemCollection<StoredBundleComponent>; + }; +}): RouteContext { + const req = new Request("https://example.local/checkout", { + method: requestMethod, + headers: new Headers({ "Idempotency-Key": idempotencyKey }), + }); + const catalogDefaults = { + products: new DefaultProductsColl(), + productSkus: new MemColl<StoredProductSku>(), + productSkuOptionValues: new MemColl<StoredProductSkuOptionValue>(), + digitalAssets: new MemColl<StoredDigitalAsset>(), + digitalEntitlements: new MemColl<StoredDigitalEntitlement>(), + productAssetLinks: new MemColl<StoredProductAssetLink>(), + productAssets: new MemColl<StoredProductAsset>(), + bundleComponents: new MemColl<StoredBundleComponent>(), + }; + return asRouteContext({ + request: req as Request & { headers: Headers }, + input: { + cartId, + idempotencyKey, + ...(ownerToken !== undefined ?
{ ownerToken } : {}), + }, + storage: { + idempotencyKeys, + orders, + paymentAttempts, + carts, + inventoryStock, + ...catalogDefaults, + ...extras, + }, + requestMeta: { + ip, + }, + kv, + }); +} + +describe("checkout idempotency persistence recovery", () => { + it("retries without duplicate orders when idempotency persistence fails after partial success", async () => { + const cartId = "cart_1"; + const idempotencyKey = "idem-key-strong-16"; + const now = "2026-04-02T12:00:00.000Z"; + const ownerToken = "owner-token-for-idempotent-retry"; + const cart: StoredCart = { + currency: "USD", + lineItems: [ + { + productId: "p1", + quantity: 1, + inventoryVersion: 3, + unitPriceMinor: 500, + }, + ], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + + const idempotencyRows = new Map(); + const idempotencyBase = new MemColl(idempotencyRows); + const orders = new MemColl(); + const paymentAttempts = new MemColl(); + const carts = new MemColl(new Map([[cartId, cart]])); + const inventoryStock = new MemColl( + new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 3, + quantity: 10, + updatedAt: now, + }, + ], + ]), + ); + const kv = new MemKv(); + + // Pending 202 then completed 200 — fail the second idempotency write after order/attempt exist. 
+ const failingIdempotency = oneTimePutFailure(idempotencyBase, 2); + const failingCtx = contextFor({ + idempotencyKeys: failingIdempotency, + orders, + paymentAttempts, + carts, + inventoryStock, + kv, + idempotencyKey, + cartId, + ownerToken, + }); + + await expect(checkoutHandler(failingCtx)).rejects.toThrow( + "simulated idempotency persistence failure", + ); + + expect(orders.rows.size).toBe(1); + expect(paymentAttempts.rows.size).toBe(1); + const firstOrderId = orders.rows.keys().next().value; + const firstAttemptId = paymentAttempts.rows.keys().next().value; + + const retryCtx = contextFor({ + idempotencyKeys: idempotencyBase, + orders, + paymentAttempts, + carts, + inventoryStock, + kv, + idempotencyKey, + cartId, + ownerToken, + }); + const secondResult = await checkoutHandler(retryCtx); + + expect(secondResult).toMatchObject({ + orderId: firstOrderId, + paymentAttemptId: firstAttemptId, + currency: "USD", + paymentPhase: "payment_pending", + }); + expect(orders.rows.size).toBe(1); + expect(paymentAttempts.rows.size).toBe(1); + }); + + it("falls back to storage-backed checkout when cached completed response has no matching rows", async () => { + const cartId = "cart_stale_cache"; + const idempotencyKey = "idem-key-stale-cache"; + const now = "2026-04-02T12:00:00.000Z"; + const ownerToken = "owner-token-for-stale-cache"; + const cart: StoredCart = { + currency: "USD", + lineItems: [ + { + productId: "p1", + quantity: 1, + inventoryVersion: 2, + unitPriceMinor: 650, + }, + ], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + + const orders = new MemColl(); + const paymentAttempts = new MemColl(); + const carts = new MemColl(new Map([[cartId, cart]])); + const inventoryStock = new MemColl( + new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 2, + quantity: 10, + updatedAt: now, + }, + ], + ]), + ); + const kv = new MemKv(); + const idempotencyRows = new Map(); + const 
idempotency = new MemColl(idempotencyRows); + + const fingerprint = cartContentFingerprint(cart.lineItems); + const keyHash = await sha256HexAsync( + `${CHECKOUT_ROUTE}|${cartId}|${cart.updatedAt}|${fingerprint}|${idempotencyKey}`, + ); + const idempotencyDocId = `idemp:${keyHash}`; + await idempotency.put(idempotencyDocId, { + route: CHECKOUT_ROUTE, + keyHash, + httpStatus: 200, + responseBody: { + orderId: "stale_order_1", + paymentPhase: "payment_pending", + paymentAttemptId: "stale_attempt_1", + currency: "USD", + totalMinor: 650, + finalizeToken: "cached-token", + }, + createdAt: now, + }); + + const result = await checkoutHandler( + contextFor({ + idempotencyKeys: idempotency, + orders, + paymentAttempts, + carts, + inventoryStock, + kv, + idempotencyKey, + cartId, + ownerToken, + }), + ); + + const expectedOrderId = deterministicOrderId(keyHash); + const expectedAttemptId = deterministicPaymentAttemptId(keyHash); + expect(result.orderId).toBe(expectedOrderId); + expect(result.paymentAttemptId).toBe(expectedAttemptId); + expect(orders.rows.size).toBe(1); + expect(paymentAttempts.rows.size).toBe(1); + expect(orders.rows.has(expectedOrderId)).toBe(true); + expect(paymentAttempts.rows.has(expectedAttemptId)).toBe(true); + }); + + it("serves fresh idempotent replay on repeated successful checkout calls", async () => { + const cartId = "cart_2"; + const idempotencyKey = "idem-key-strong-2"; + const now = "2026-04-02T12:00:00.000Z"; + const ownerToken = "owner-token-for-idempotent-replay"; + const cart: StoredCart = { + currency: "USD", + lineItems: [ + { + productId: "p2", + quantity: 2, + inventoryVersion: 1, + unitPriceMinor: 200, + }, + ], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + + const idempotency = new MemColl(); + const orders = new MemColl(); + const paymentAttempts = new MemColl(); + const carts = new MemColl(new Map([[cartId, cart]])); + const inventoryStock = new MemColl( + new Map([ + [ + 
inventoryStockDocId("p2", ""), + { + productId: "p2", + variantId: "", + version: 1, + quantity: 5, + updatedAt: now, + }, + ], + ]), + ); + const kv = new MemKv(); + const baseCtx = contextFor({ + idempotencyKeys: idempotency, + orders, + paymentAttempts, + carts, + inventoryStock, + kv, + idempotencyKey, + cartId, + ownerToken, + }); + + const first = await checkoutHandler(baseCtx); + const second = await checkoutHandler(baseCtx); + + expect(second).toEqual(first); + expect(orders.rows.size).toBe(1); + expect(paymentAttempts.rows.size).toBe(1); + }); + + it("requires ownerToken when cart has ownerTokenHash", async () => { + const cartId = "cart_owned"; + const idempotencyKey = "idem-key-owned-16ch"; + const now = "2026-04-02T12:00:00.000Z"; + const ownerSecret = "owner-secret-for-checkout-1"; + const cart: StoredCart = { + currency: "USD", + lineItems: [{ productId: "p1", quantity: 1, inventoryVersion: 1, unitPriceMinor: 100 }], + ownerTokenHash: await sha256HexAsync(ownerSecret), + createdAt: now, + updatedAt: now, + }; + + const ctx = contextFor({ + idempotencyKeys: new MemColl(), + orders: new MemColl(), + paymentAttempts: new MemColl(), + carts: new MemColl(new Map([[cartId, cart]])), + inventoryStock: new MemColl( + new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 1, + quantity: 10, + updatedAt: now, + }, + ], + ]), + ), + kv: new MemKv(), + idempotencyKey, + cartId, + }); + + await expect(checkoutHandler(ctx)).rejects.toMatchObject({ code: "cart_token_required" }); + }); + + it("completes checkout when ownerToken matches cart ownerTokenHash", async () => { + const cartId = "cart_owned_ok"; + const idempotencyKey = "idem-key-owned-ok16"; + const now = "2026-04-02T12:00:00.000Z"; + const ownerSecret = "correct-owner-token-12345"; + const cart: StoredCart = { + currency: "USD", + lineItems: [{ productId: "p1", quantity: 1, inventoryVersion: 1, unitPriceMinor: 100 }], + ownerTokenHash: await 
sha256HexAsync(ownerSecret), + createdAt: now, + updatedAt: now, + }; + + const ctx = contextFor({ + idempotencyKeys: new MemColl(), + orders: new MemColl(), + paymentAttempts: new MemColl(), + carts: new MemColl(new Map([[cartId, cart]])), + inventoryStock: new MemColl( + new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 1, + quantity: 10, + updatedAt: now, + }, + ], + ]), + ), + kv: new MemKv(), + idempotencyKey, + cartId, + ownerToken: ownerSecret, + }); + + const out = await checkoutHandler(ctx); + expect(out.paymentPhase).toBe("payment_pending"); + expect(out.totalMinor).toBe(100); + }); + + it("rejects checkout with wrong ownerToken when cart has ownerTokenHash", async () => { + const cartId = "cart_owned_2"; + const idempotencyKey = "idem-key-owned-16c2"; + const now = "2026-04-02T12:00:00.000Z"; + const cart: StoredCart = { + currency: "USD", + lineItems: [{ productId: "p1", quantity: 1, inventoryVersion: 1, unitPriceMinor: 100 }], + ownerTokenHash: await sha256HexAsync("correct-owner-token-12345"), + createdAt: now, + updatedAt: now, + }; + + const ctx = contextFor({ + idempotencyKeys: new MemColl(), + orders: new MemColl(), + paymentAttempts: new MemColl(), + carts: new MemColl(new Map([[cartId, cart]])), + inventoryStock: new MemColl( + new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 1, + quantity: 10, + updatedAt: now, + }, + ], + ]), + ), + kv: new MemKv(), + idempotencyKey, + cartId, + ownerToken: "wrong-owner-token-123456789012", + }); + + await expect(checkoutHandler(ctx)).rejects.toMatchObject({ code: "cart_token_invalid" }); + }); +}); + +describe("checkout route guardrails", () => { + beforeEach(() => { + consumeKvRateLimit.mockClear(); + consumeKvRateLimit.mockResolvedValue(true); + }); + + it("requires POST method", async () => { + const cartId = "cart_method"; + const now = "2026-04-02T12:00:00.000Z"; + const ownerToken = 
"owner-token-method-123456"; + const cart: StoredCart = { + currency: "USD", + lineItems: [{ productId: "p1", quantity: 1, inventoryVersion: 1, unitPriceMinor: 100 }], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + + const ctx = contextFor({ + idempotencyKeys: new MemColl(), + orders: new MemColl(), + paymentAttempts: new MemColl(), + carts: new MemColl(new Map([[cartId, cart]])), + inventoryStock: new MemColl(), + kv: new MemKv(), + idempotencyKey: "idem-key-strong-16", + cartId, + requestMethod: "GET", + ownerToken, + }); + await expect(checkoutHandler(ctx)).rejects.toMatchObject({ code: "METHOD_NOT_ALLOWED" }); + }); + + it("validates cart content bounds before processing", async () => { + const cartId = "cart_caps"; + const now = "2026-04-02T12:00:00.000Z"; + const ownerToken = "owner-token-bounds"; + const tooMany = Array.from({ length: COMMERCE_LIMITS.maxCartLineItems + 1 }, (_, i) => ({ + productId: `p-${i}`, + quantity: 1, + inventoryVersion: 1, + unitPriceMinor: 100, + })); + + const ctx = contextFor({ + idempotencyKeys: new MemColl(), + orders: new MemColl(), + paymentAttempts: new MemColl(), + carts: new MemColl( + new Map([ + [ + cartId, + { + currency: "USD", + lineItems: tooMany, + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }, + ], + ]), + ), + inventoryStock: new MemColl(), + kv: new MemKv(), + idempotencyKey: "idem-key-strong-17", + cartId, + ownerToken, + }); + await expect(checkoutHandler(ctx)).rejects.toMatchObject({ code: "payload_too_large" }); + }); + + it("blocks checkout when rate limit is exceeded", async () => { + const cartId = "cart_rate"; + const now = "2026-04-02T12:00:00.000Z"; + const ownerToken = "owner-token-rate-limit"; + const cart: StoredCart = { + currency: "USD", + lineItems: [{ productId: "p1", quantity: 1, inventoryVersion: 1, unitPriceMinor: 100 }], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: 
now, + }; + const idempotencyKey = "idem-key-strong-r8"; + + const ctx = contextFor({ + idempotencyKeys: new MemColl(), + orders: new MemColl(), + paymentAttempts: new MemColl(), + carts: new MemColl(new Map([[cartId, cart]])), + inventoryStock: new MemColl(), + kv: new MemKv(), + idempotencyKey, + cartId, + ownerToken, + }); + + consumeKvRateLimit.mockResolvedValueOnce(false); + await expect(checkoutHandler(ctx)).rejects.toMatchObject({ code: "rate_limited" }); + expect(consumeKvRateLimit).toHaveBeenCalledTimes(1); + }); + + it("rejects checkout when simple-item product-level stock row is missing", async () => { + const cartId = "cart_authority"; + const now = "2026-04-02T12:00:00.000Z"; + const ownerToken = "owner-token-inventory-16"; + const product: StoredProduct = { + id: "authority-product", + type: "simple", + status: "active", + visibility: "public", + slug: "authority-product", + title: "Authority Product", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: now, + updatedAt: now, + publishedAt: now, + }; + const sku: StoredProductSku = { + id: "sku_authority", + productId: product.id, + skuCode: "AUTH-SKU", + status: "active", + unitPriceMinor: 1000, + inventoryQuantity: 50, + inventoryVersion: 4, + requiresShipping: true, + isDigital: false, + createdAt: now, + updatedAt: now, + }; + const cart: StoredCart = { + currency: "USD", + lineItems: [ + { productId: product.id, quantity: 1, inventoryVersion: 4, unitPriceMinor: 1000 }, + ], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + const idempotencyKey = "idem-key-strong-18"; + const ctx = contextFor({ + idempotencyKeys: new MemColl(), + orders: new MemColl(), + paymentAttempts: new MemColl(), + carts: new MemColl(new Map([[cartId, cart]])), + // Missing product-level stock row for simple item checkout path on purpose. 
+ inventoryStock: new MemColl( + new Map([ + [ + inventoryStockDocId(product.id, sku.id), + { + productId: product.id, + variantId: sku.id, + version: 4, + quantity: 100, + updatedAt: now, + }, + ], + ]), + ), + kv: new MemKv(), + idempotencyKey, + cartId, + ownerToken, + extras: { + products: new MemColl(new Map([[product.id, product]])), + productSkus: new MemColl(new Map([[sku.id, sku]])), + productSkuOptionValues: new MemColl(), + digitalAssets: new MemColl(), + digitalEntitlements: new MemColl(), + productAssetLinks: new MemColl(), + productAssets: new MemColl(), + bundleComponents: new MemColl(), + }, + }); + + await expect(checkoutHandler(ctx)).rejects.toMatchObject({ code: "product_unavailable" }); + }); + + it("rejects mismatched header/body idempotency input", async () => { + const cartId = "cart_conflict"; + const now = "2026-04-02T12:00:00.000Z"; + const ownerToken = "owner-token-conflict"; + const cart: StoredCart = { + currency: "USD", + lineItems: [{ productId: "p1", quantity: 1, inventoryVersion: 1, unitPriceMinor: 100 }], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + const req = new Request("https://example.local/checkout", { + method: "POST", + headers: new Headers({ "Idempotency-Key": "header-key-16chars" }), + }); + const ctx = { + request: req as Request & { headers: Headers }, + input: { + cartId, + idempotencyKey: "body-key-16chars", + }, + storage: { + idempotencyKeys: new MemColl(), + orders: new MemColl(), + paymentAttempts: new MemColl(), + carts: new MemColl(new Map([[cartId, cart]])), + inventoryStock: new MemColl(), + }, + requestMeta: { ip: "127.0.0.1" }, + kv: new MemKv(), + } as unknown as RouteContext; + await expect(checkoutHandler(ctx)).rejects.toMatchObject({ code: "BAD_REQUEST" }); + }); +}); + +describe("checkout order snapshot capture", () => { + it("stores catalog snapshot fields on order line items", async () => { + const now = "2026-04-04T12:00:00.000Z"; + const cartId = 
"snapshot-cart"; + const idempotencyKey = "idem-key-snapshot-16"; + const ownerToken = "owner-token-snapshot"; + + const product: StoredProduct = { + id: "product_snapshot_1", + type: "simple", + status: "active", + visibility: "public", + slug: "snapshot-product", + title: "Snapshot Product", + shortDescription: "Snap short", + longDescription: "Snap long", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: now, + updatedAt: now, + publishedAt: now, + }; + const sku: StoredProductSku = { + id: "sku_snapshot_1", + productId: product.id, + skuCode: "SNAP-SKU", + status: "active", + unitPriceMinor: 1200, + compareAtPriceMinor: 1500, + inventoryQuantity: 20, + inventoryVersion: 4, + requiresShipping: true, + isDigital: false, + createdAt: now, + updatedAt: now, + }; + const cart: StoredCart = { + currency: "USD", + lineItems: [ + { + productId: product.id, + variantId: sku.id, + quantity: 2, + inventoryVersion: 4, + unitPriceMinor: 1200, + }, + ], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + + const idempotencyKeys = new MemColl(); + const orders = new MemColl(); + const paymentAttempts = new MemColl(); + const carts = new MemColl(new Map([[cartId, cart]])); + const inventoryStock = new MemColl( + new Map([ + [ + inventoryStockDocId(product.id, sku.id), + { + productId: product.id, + variantId: sku.id, + version: 4, + quantity: 5, + updatedAt: now, + }, + ], + ]), + ); + const products = new MemColl(new Map([[product.id, product]])); + const productSkus = new MemColl(new Map([[sku.id, sku]])); + + const out = await checkoutHandler( + contextFor({ + idempotencyKeys, + orders, + paymentAttempts, + carts, + inventoryStock, + kv: new MemKv(), + idempotencyKey, + cartId, + ownerToken, + extras: { + products, + productSkus, + productSkuOptionValues: new MemColl(), + digitalAssets: new MemColl(), + digitalEntitlements: new MemColl(), + productAssetLinks: new MemColl(), + productAssets: new 
MemColl(), + bundleComponents: new MemColl(), + }, + }), + ); + + expect(out.totalMinor).toBe(2400); + const orderId = deterministicOrderId( + await sha256HexAsync( + `${CHECKOUT_ROUTE}|${cartId}|${cart.updatedAt}|${cartContentFingerprint(cart.lineItems)}|${idempotencyKey}`, + ), + ); + const order = await orders.get(orderId); + expect(order).toBeTruthy(); + expect(order?.lineItems[0]?.snapshot?.productTitle).toBe("Snapshot Product"); + expect(order?.lineItems[0]?.snapshot?.skuCode).toBe("SNAP-SKU"); + expect(order?.lineItems[0]?.snapshot?.lineSubtotalMinor).toBe(2400); + expect(order?.lineItems[0]?.snapshot?.lineDiscountMinor).toBe(0); + expect(order?.lineItems[0]?.snapshot?.lineTotalMinor).toBe(2400); + + product.title = "Updated Title"; + sku.unitPriceMinor = 3000; + await products.put(product.id, product); + await productSkus.put(sku.id, sku); + + const cachedOrder = await orders.get(orderId); + expect(cachedOrder?.lineItems[0]?.snapshot?.productTitle).toBe("Snapshot Product"); + }); + + it("captures digital entitlement and image snapshot data", async () => { + const now = "2026-04-04T12:00:00.000Z"; + const cartId = "snapshot-digital-cart"; + const idempotencyKey = "idem-digital-16chars"; + const ownerToken = "owner-token-digital"; + + const product: StoredProduct = { + id: "product_digital_1", + type: "simple", + status: "active", + visibility: "public", + slug: "snapshot-digital", + title: "Snapshot Digital", + shortDescription: "Snapshot digital short", + longDescription: "Snapshot digital long", + featured: false, + sortOrder: 0, + requiresShippingDefault: false, + createdAt: now, + updatedAt: now, + publishedAt: now, + }; + const sku: StoredProductSku = { + id: "sku_digital_1", + productId: product.id, + skuCode: "DIGI-SKU", + status: "active", + unitPriceMinor: 900, + compareAtPriceMinor: 1200, + inventoryQuantity: 30, + inventoryVersion: 2, + requiresShipping: false, + isDigital: true, + createdAt: now, + updatedAt: now, + }; + const image: 
StoredProductAsset = { + id: "asset_image_1", + provider: "cloudinary", + externalAssetId: "image-001", + fileName: "snapshot.jpg", + altText: "Snapshot cover", + createdAt: now, + updatedAt: now, + }; + const imageLink: StoredProductAssetLink = { + id: "asset_link_image_1", + targetType: "product", + targetId: product.id, + assetId: image.id, + role: "primary_image", + position: 0, + createdAt: now, + updatedAt: now, + }; + const asset: StoredDigitalAsset = { + id: "digital_asset_1", + provider: "s3", + externalAssetId: "asset-pdf", + label: "Guide PDF", + downloadLimit: 2, + downloadExpiryDays: 60, + isManualOnly: false, + isPrivate: false, + createdAt: now, + updatedAt: now, + }; + const entitlement: StoredDigitalEntitlement = { + id: "entitlement_1", + skuId: sku.id, + digitalAssetId: asset.id, + grantedQuantity: 1, + createdAt: now, + updatedAt: now, + }; + const cart: StoredCart = { + currency: "USD", + lineItems: [ + { + productId: product.id, + variantId: sku.id, + quantity: 1, + inventoryVersion: 2, + unitPriceMinor: 900, + }, + ], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + + const idempotencyKeys = new MemColl(); + const orders = new MemColl(); + const paymentAttempts = new MemColl(); + const carts = new MemColl(new Map([[cartId, cart]])); + const inventoryStock = new MemColl( + new Map([ + [ + inventoryStockDocId(product.id, sku.id), + { + productId: product.id, + variantId: sku.id, + version: 2, + quantity: 20, + updatedAt: now, + }, + ], + ]), + ); + const products = new MemColl(new Map([[product.id, product]])); + const productSkus = new MemColl(new Map([[sku.id, sku]])); + const productAssets = new MemColl(new Map([[image.id, image]])); + const productAssetLinks = new MemColl(new Map([[imageLink.id, imageLink]])); + const digitalAssets = new MemColl(new Map([[asset.id, asset]])); + const digitalEntitlements = new MemColl(new Map([[entitlement.id, entitlement]])); + + await checkoutHandler( + 
contextFor({ + idempotencyKeys, + orders, + paymentAttempts, + carts, + inventoryStock, + kv: new MemKv(), + idempotencyKey, + cartId, + ownerToken, + extras: { + products, + productSkus, + productSkuOptionValues: new MemColl(), + digitalAssets, + digitalEntitlements, + productAssetLinks, + productAssets, + bundleComponents: new MemColl(), + }, + }), + ); + + const orderId = deterministicOrderId( + await sha256HexAsync( + `${CHECKOUT_ROUTE}|${cartId}|${cart.updatedAt}|${cartContentFingerprint(cart.lineItems)}|${idempotencyKey}`, + ), + ); + const order = await orders.get(orderId); + const snapshot = order?.lineItems[0]?.snapshot; + expect(snapshot?.digitalEntitlements).toEqual([ + { + entitlementId: entitlement.id, + digitalAssetId: asset.id, + digitalAssetLabel: asset.label, + grantedQuantity: entitlement.grantedQuantity, + downloadLimit: asset.downloadLimit, + downloadExpiryDays: asset.downloadExpiryDays, + isManualOnly: asset.isManualOnly, + isPrivate: asset.isPrivate, + }, + ]); + expect(snapshot?.image).toMatchObject({ + assetId: image.id, + provider: image.provider, + externalAssetId: image.externalAssetId, + }); + }); + + it("persists frozen snapshot during idempotent checkout replay", async () => { + const now = "2026-04-05T12:00:00.000Z"; + const cartId = "snapshot-replay-cart"; + const idempotencyKey = "idem-key-replay-16"; + const ownerToken = "owner-token-replay"; + + const product: StoredProduct = { + id: "product_replay_1", + type: "simple", + status: "active", + visibility: "public", + slug: "snapshot-replay", + title: "Replay Product", + shortDescription: "Replay short", + longDescription: "Replay long", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: now, + updatedAt: now, + publishedAt: now, + }; + const sku: StoredProductSku = { + id: "sku_replay_1", + productId: product.id, + skuCode: "REPLAY-SKU", + status: "active", + unitPriceMinor: 1500, + inventoryQuantity: 12, + inventoryVersion: 1, + requiresShipping: 
true, + isDigital: false, + createdAt: now, + updatedAt: now, + }; + const cart: StoredCart = { + currency: "USD", + lineItems: [ + { + productId: product.id, + variantId: sku.id, + quantity: 1, + inventoryVersion: 1, + unitPriceMinor: 1500, + }, + ], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + + const idempotencyKeys = new MemColl(); + const orders = new MemColl(); + const paymentAttempts = new MemColl(); + const carts = new MemColl(new Map([[cartId, cart]])); + const inventoryStock = new MemColl( + new Map([ + [ + inventoryStockDocId(product.id, sku.id), + { + productId: product.id, + variantId: sku.id, + version: 1, + quantity: 6, + updatedAt: now, + }, + ], + ]), + ); + const ctx = contextFor({ + idempotencyKeys, + orders, + paymentAttempts, + carts, + inventoryStock, + kv: new MemKv(), + idempotencyKey, + cartId, + ownerToken, + extras: { + products: new MemColl(new Map([[product.id, product]])), + productSkus: new MemColl(new Map([[sku.id, sku]])), + productSkuOptionValues: new MemColl(), + digitalAssets: new MemColl(), + digitalEntitlements: new MemColl(), + productAssetLinks: new MemColl(), + productAssets: new MemColl(), + bundleComponents: new MemColl(), + }, + }); + + const first = await checkoutHandler(ctx); + product.title = "Mutated Replay Product"; + sku.unitPriceMinor = 9999; + await asMemCollection(ctx.storage.products).put(product.id, product); + await asMemCollection(ctx.storage.productSkus).put(sku.id, sku); + const second = await checkoutHandler(ctx); + expect(second.orderId).toBe(first.orderId); + + const orderId = deterministicOrderId( + await sha256HexAsync( + `${CHECKOUT_ROUTE}|${cartId}|${cart.updatedAt}|${cartContentFingerprint(cart.lineItems)}|${idempotencyKey}`, + ), + ); + const order = await orders.get(orderId); + expect(order?.lineItems[0]?.snapshot?.productTitle).toBe("Replay Product"); + expect(second.totalMinor).toBe(first.totalMinor); + }); + + it("captures bundle summary in 
snapshot", async () => { + const now = "2026-04-06T12:00:00.000Z"; + const cartId = "snapshot-bundle-cart"; + const idempotencyKey = "idem-key-bundle-16"; + const ownerToken = "owner-token-bundle"; + + const componentProductA: StoredProduct = { + id: "bundle_component_product_a", + type: "simple", + status: "active", + visibility: "public", + slug: "component-a", + title: "Component A", + shortDescription: "Component A short", + longDescription: "Component A long", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: now, + updatedAt: now, + publishedAt: now, + }; + const componentProductB: StoredProduct = { + id: "bundle_component_product_b", + type: "simple", + status: "active", + visibility: "public", + slug: "component-b", + title: "Component B", + shortDescription: "Component B short", + longDescription: "Component B long", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: now, + updatedAt: now, + publishedAt: now, + }; + const bundle: StoredProduct = { + id: "bundle_product_1", + type: "bundle", + status: "active", + visibility: "public", + slug: "snapshot-bundle", + title: "Snapshot Bundle", + shortDescription: "Bundle short", + longDescription: "Bundle long", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + bundleDiscountType: "percentage", + bundleDiscountValueBps: 10_000, + createdAt: now, + updatedAt: now, + publishedAt: now, + }; + const componentSkuA: StoredProductSku = { + id: "bundle_component_sku_a", + productId: componentProductA.id, + skuCode: "COMP-A", + status: "active", + unitPriceMinor: 1000, + inventoryQuantity: 20, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: now, + updatedAt: now, + }; + const componentSkuB: StoredProductSku = { + id: "bundle_component_sku_b", + productId: componentProductB.id, + skuCode: "COMP-B", + status: "active", + unitPriceMinor: 500, + inventoryQuantity: 9, + inventoryVersion: 1, + requiresShipping: 
true, + isDigital: false, + createdAt: now, + updatedAt: now, + }; + const componentA: StoredBundleComponent = { + id: "bundle_comp_link_a", + bundleProductId: bundle.id, + componentSkuId: componentSkuA.id, + quantity: 2, + position: 0, + createdAt: now, + updatedAt: now, + }; + const componentB: StoredBundleComponent = { + id: "bundle_comp_link_b", + bundleProductId: bundle.id, + componentSkuId: componentSkuB.id, + quantity: 1, + position: 1, + createdAt: now, + updatedAt: now, + }; + + const cart: StoredCart = { + currency: "USD", + lineItems: [ + { + productId: bundle.id, + quantity: 2, + inventoryVersion: 1, + unitPriceMinor: 0, + }, + ], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + + const idempotencyKeys = new MemColl(); + const orders = new MemColl(); + const paymentAttempts = new MemColl(); + const carts = new MemColl(new Map([[cartId, cart]])); + const inventoryStock = new MemColl( + new Map([ + [ + inventoryStockDocId(componentProductA.id, componentSkuA.id), + { + productId: componentProductA.id, + variantId: componentSkuA.id, + version: 5, + quantity: 50, + updatedAt: now, + }, + ], + [ + inventoryStockDocId(componentProductB.id, componentSkuB.id), + { + productId: componentProductB.id, + variantId: componentSkuB.id, + version: 7, + quantity: 30, + updatedAt: now, + }, + ], + ]), + ); + const products = new MemColl( + new Map([ + [componentProductA.id, componentProductA], + [componentProductB.id, componentProductB], + [bundle.id, bundle], + ]), + ); + const productSkus = new MemColl( + new Map([ + [componentSkuA.id, componentSkuA], + [componentSkuB.id, componentSkuB], + ]), + ); + const bundleComponents = new MemColl( + new Map([ + [componentA.id, componentA], + [componentB.id, componentB], + ]), + ); + + await checkoutHandler( + contextFor({ + idempotencyKeys, + orders, + paymentAttempts, + carts, + inventoryStock, + kv: new MemKv(), + idempotencyKey, + cartId, + ownerToken, + extras: { + products, + 
productSkus, + productSkuOptionValues: new MemColl(), + digitalAssets: new MemColl(), + digitalEntitlements: new MemColl(), + productAssetLinks: new MemColl(), + productAssets: new MemColl(), + bundleComponents, + }, + }), + ); + + const orderId = deterministicOrderId( + await sha256HexAsync( + `${CHECKOUT_ROUTE}|${cartId}|${cart.updatedAt}|${cartContentFingerprint(cart.lineItems)}|${idempotencyKey}`, + ), + ); + const order = await orders.get(orderId); + const snapshot = order?.lineItems[0]?.snapshot; + expect(snapshot?.bundleSummary).toMatchObject({ + subtotalMinor: 2500, + discountType: "percentage", + discountValueBps: 10_000, + discountAmountMinor: 2500, + finalPriceMinor: 0, + availability: 9, + }); + expect(snapshot?.lineSubtotalMinor).toBe(5000); + expect(snapshot?.lineDiscountMinor).toBe(5000); + expect(snapshot?.lineTotalMinor).toBe(0); + expect(order?.lineItems[0]?.unitPriceMinor).toBe(0); + expect(snapshot?.bundleSummary?.components.every((c) => c.componentInventoryVersion >= 0)).toBe( + true, + ); + }); + + it("rejects checkout when a bundle component has insufficient stock", async () => { + const now = "2026-04-07T12:00:00.000Z"; + const cartId = "snapshot-bundle-low-stock"; + const idempotencyKey = "idem-bundle-lowstk16"; + const ownerToken = "owner-token-bndl-low"; + + const componentProduct: StoredProduct = { + id: "low_stock_comp_prod", + type: "simple", + status: "active", + visibility: "public", + slug: "low-comp", + title: "Component", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: now, + updatedAt: now, + publishedAt: now, + }; + const bundle: StoredProduct = { + id: "bundle_low_stock", + type: "bundle", + status: "active", + visibility: "public", + slug: "bundle-low", + title: "Low bundle", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + bundleDiscountType: "none", + createdAt: now, + updatedAt: 
now, + publishedAt: now, + }; + const componentSku: StoredProductSku = { + id: "low_stock_comp_sku", + productId: componentProduct.id, + skuCode: "LOW-COMP", + status: "active", + unitPriceMinor: 100, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: now, + updatedAt: now, + }; + const componentLink: StoredBundleComponent = { + id: "low_bc_1", + bundleProductId: bundle.id, + componentSkuId: componentSku.id, + quantity: 10, + position: 0, + createdAt: now, + updatedAt: now, + }; + const cart: StoredCart = { + currency: "USD", + lineItems: [ + { + productId: bundle.id, + quantity: 1, + inventoryVersion: 1, + unitPriceMinor: 100, + }, + ], + ownerTokenHash: await sha256HexAsync(ownerToken), + createdAt: now, + updatedAt: now, + }; + + await expect( + checkoutHandler( + contextFor({ + idempotencyKeys: new MemColl(), + orders: new MemColl(), + paymentAttempts: new MemColl(), + carts: new MemColl(new Map([[cartId, cart]])), + inventoryStock: new MemColl( + new Map([ + [ + inventoryStockDocId(componentProduct.id, componentSku.id), + { + productId: componentProduct.id, + variantId: componentSku.id, + version: 1, + quantity: 3, + updatedAt: now, + }, + ], + ]), + ), + kv: new MemKv(), + idempotencyKey, + cartId, + ownerToken, + extras: { + products: new MemColl( + new Map([ + [componentProduct.id, componentProduct], + [bundle.id, bundle], + ]), + ), + productSkus: new MemColl(new Map([[componentSku.id, componentSku]])), + productSkuOptionValues: new MemColl(), + digitalAssets: new MemColl(), + digitalEntitlements: new MemColl(), + productAssetLinks: new MemColl(), + productAssets: new MemColl(), + bundleComponents: new MemColl(new Map([[componentLink.id, componentLink]])), + }, + }), + ), + ).rejects.toMatchObject({ code: "insufficient_stock" }); + }); +}); diff --git a/packages/plugins/commerce/src/handlers/checkout.ts b/packages/plugins/commerce/src/handlers/checkout.ts new file mode 100644 index 000000000..51ce838a4 
--- /dev/null +++ b/packages/plugins/commerce/src/handlers/checkout.ts @@ -0,0 +1,320 @@ +/** + * Checkout: cart → `payment_pending` order + `pending` payment attempt (Stripe session in a later slice). + * When the cart has `ownerTokenHash`, `ownerToken` must match (same possession proof as `cart/get`). + */ + +import type { RouteContext, StorageCollection } from "emdash"; +import { PluginRouteError } from "emdash"; + +import { validateIdempotencyKey } from "../kernel/idempotency-key.js"; +import { COMMERCE_LIMITS } from "../kernel/limits.js"; +import { cartContentFingerprint } from "../lib/cart-fingerprint.js"; +import { projectCartLineItemsForStorage } from "../lib/cart-lines.js"; +import { assertCartOwnerToken } from "../lib/cart-owner-token.js"; +import { validateCartLineItems } from "../lib/cart-validation.js"; +import { buildOrderLineSnapshots } from "../lib/catalog-order-snapshots.js"; +import { validateLineItemsStockForCheckout } from "../lib/checkout-inventory-validation.js"; +import { randomHex, sha256HexAsync } from "../lib/crypto-adapter.js"; +import { isIdempotencyRecordFresh } from "../lib/idempotency-ttl.js"; +import { LineConflictError, mergeLineItemsBySku } from "../lib/merge-line-items.js"; +import { buildRateLimitActorKey } from "../lib/rate-limit-identity.js"; +import { consumeKvRateLimit } from "../lib/rate-limit-kv.js"; +import { requirePost } from "../lib/require-post.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { CheckoutInput } from "../schemas.js"; +import type { + StoredCart, + StoredIdempotencyKey, + StoredOrder, + StoredProduct, + StoredProductAsset, + StoredProductAssetLink, + StoredProductSku, + StoredProductSkuOptionValue, + StoredDigitalAsset, + StoredDigitalEntitlement, + StoredBundleComponent, + StoredPaymentAttempt, + StoredInventoryStock, + OrderLineItem, +} from "../types.js"; +import { asCollection } from "./catalog-conflict.js"; +import type { CheckoutPendingState, CheckoutResponse } from 
"./checkout-state.js"; +import { + CHECKOUT_PENDING_KIND, + CHECKOUT_ROUTE, + computeCheckoutReplayIntegrity, + decideCheckoutReplayState, + deterministicOrderId, + deterministicPaymentAttemptId, + restorePendingCheckout, + resolvePaymentProviderId, + toCheckoutClientResponse, + validateCachedCheckoutCompleted, +} from "./checkout-state.js"; + +type SnapshotQueryCollection = { + get(id: string): Promise; + query(options?: { + where?: Record; + limit?: number; + }): Promise<{ items: Array<{ id: string; data: T }>; hasMore: boolean }>; +}; + +function asSnapshotCollection(raw: unknown): SnapshotQueryCollection { + if (raw) { + const collection = raw as { + get: (id: string) => Promise; + query?: SnapshotQueryCollection["query"]; + }; + return { + get: collection.get.bind(collection), + query: collection.query + ? collection.query.bind(collection) + : async () => ({ items: [], hasMore: false }), + }; + } + return { + async get() { + return null; + }, + async query() { + return { items: [], hasMore: false }; + }, + }; +} + +export async function checkoutHandler( + ctx: RouteContext, + paymentProviderId?: string, +) { + requirePost(ctx); + const resolvedPaymentProviderId = resolvePaymentProviderId(paymentProviderId); + + const nowMs = Date.now(); + const nowIso = new Date(nowMs).toISOString(); + + const headerKey = ctx.request.headers.get("Idempotency-Key")?.trim() || undefined; + const bodyKey = ctx.input.idempotencyKey?.trim() || undefined; + + if (headerKey && bodyKey && headerKey !== bodyKey) { + throw PluginRouteError.badRequest( + "Idempotency-Key conflict: header and body values must match when both are supplied", + ); + } + + const idempotencyKey = bodyKey ?? 
headerKey; + + if (!validateIdempotencyKey(idempotencyKey)) { + throw PluginRouteError.badRequest( + "Idempotency-Key is required (header or body) and must be 16–128 printable ASCII characters", + ); + } + + const ipHash = await buildRateLimitActorKey(ctx, "checkout"); + const allowed = await consumeKvRateLimit({ + kv: ctx.kv, + keySuffix: `checkout:ip:${ipHash}`, + limit: COMMERCE_LIMITS.defaultCheckoutPerIpPerWindow, + windowMs: COMMERCE_LIMITS.defaultRateWindowMs, + nowMs, + }); + if (!allowed) { + throwCommerceApiError({ + code: "RATE_LIMITED", + message: "Too many checkout attempts; try again shortly", + }); + } + + const carts = asCollection<StoredCart>(ctx.storage.carts); + const orders = asCollection<StoredOrder>(ctx.storage.orders); + const attempts = asCollection<StoredPaymentAttempt>(ctx.storage.paymentAttempts); + const cart = await carts.get(ctx.input.cartId); + if (!cart) { + throwCommerceApiError({ code: "CART_NOT_FOUND", message: "Cart not found" }); + } + await assertCartOwnerToken(cart, ctx.input.ownerToken, "checkout"); + if (cart.lineItems.length === 0) { + throwCommerceApiError({ code: "CART_EMPTY", message: "Cart has no line items" }); + } + if (cart.lineItems.length > COMMERCE_LIMITS.maxCartLineItems) { + throwCommerceApiError({ + code: "PAYLOAD_TOO_LARGE", + message: `Cart exceeds maximum of ${COMMERCE_LIMITS.maxCartLineItems} line items`, + }); + } + const lineItemValidationMessage = validateCartLineItems(cart.lineItems); + if (lineItemValidationMessage) { + throw PluginRouteError.badRequest(lineItemValidationMessage); + } + + const fingerprint = cartContentFingerprint(cart.lineItems); + const keyHash = await sha256HexAsync( + `${CHECKOUT_ROUTE}|${ctx.input.cartId}|${cart.updatedAt}|${fingerprint}|${idempotencyKey}`, + ); + const idempotencyDocId = `idemp:${keyHash}`; + + const idempotencyKeys = asCollection<StoredIdempotencyKey>(ctx.storage.idempotencyKeys); + const cached = await idempotencyKeys.get(idempotencyDocId); + if (cached && isIdempotencyRecordFresh(cached.createdAt, nowMs)) { + const decision =
decideCheckoutReplayState(cached); + switch (decision.kind) { + case "cached_completed": { + const cachedOrder = await orders.get(decision.response.orderId); + const cachedAttempt = await attempts.get(decision.response.paymentAttemptId); + if ( + !(await validateCachedCheckoutCompleted( + keyHash, + decision.response, + cachedOrder, + cachedAttempt, + )) + ) { + break; + } + return toCheckoutClientResponse(decision.response); + } + case "cached_pending": + return toCheckoutClientResponse( + await restorePendingCheckout( + idempotencyDocId, + cached, + decision.pending, + nowIso, + idempotencyKeys, + orders, + attempts, + ), + ); + case "not_cached": + default: + break; + } + } + + const inventoryStock = asCollection<StoredInventoryStock>(ctx.storage.inventoryStock); + await validateLineItemsStockForCheckout(cart.lineItems, { + products: asCollection<StoredProduct>(ctx.storage.products), + bundleComponents: asCollection<StoredBundleComponent>(ctx.storage.bundleComponents), + productSkus: asCollection<StoredProductSku>(ctx.storage.productSkus), + inventoryStock, + }); + + let orderLineItems: OrderLineItem[]; + try { + orderLineItems = mergeLineItemsBySku(projectCartLineItemsForStorage(cart.lineItems)); + } catch (error) { + if (error instanceof LineConflictError) { + throwCommerceApiError({ + code: "ORDER_STATE_CONFLICT", + message: error.message, + details: { + reason: "line_conflict", + productId: error.productId, + variantId: error.variantId ??
null, + expected: error.expected, + actual: error.actual, + }, + }); + } + throw PluginRouteError.badRequest( + "Cart has duplicate SKUs with conflicting price or inventory version snapshots", + ); + } + + const productSnapshots = await buildOrderLineSnapshots(orderLineItems, cart.currency, { + products: asSnapshotCollection<StoredProduct>(ctx.storage.products), + productSkus: asSnapshotCollection<StoredProductSku>(ctx.storage.productSkus), + productSkuOptionValues: asSnapshotCollection<StoredProductSkuOptionValue>( + ctx.storage.productSkuOptionValues, + ), + productDigitalAssets: asSnapshotCollection<StoredDigitalAsset>(ctx.storage.digitalAssets), + productDigitalEntitlements: asSnapshotCollection<StoredDigitalEntitlement>( + ctx.storage.digitalEntitlements, + ), + productAssetLinks: asSnapshotCollection<StoredProductAssetLink>(ctx.storage.productAssetLinks), + productAssets: asSnapshotCollection<StoredProductAsset>(ctx.storage.productAssets), + bundleComponents: asSnapshotCollection<StoredBundleComponent>(ctx.storage.bundleComponents), + inventoryStock: { + get: (id: string) => inventoryStock.get(id), + }, + }); + const orderLineItemsWithSnapshots = orderLineItems.map((line, index) => ({ + ...line, + snapshot: productSnapshots[index], + unitPriceMinor: productSnapshots[index]?.unitPriceMinor ??
line.unitPriceMinor, + })); + + const totalMinor = orderLineItemsWithSnapshots.reduce( + (sum, l) => sum + l.unitPriceMinor * l.quantity, + 0, + ); + const orderId = deterministicOrderId(keyHash); + + const finalizeToken = await randomHex(24); + const finalizeTokenHash = await sha256HexAsync(finalizeToken); + + const order: StoredOrder = { + cartId: ctx.input.cartId, + paymentPhase: "payment_pending", + currency: cart.currency, + lineItems: orderLineItemsWithSnapshots, + totalMinor, + finalizeTokenHash, + createdAt: nowIso, + updatedAt: nowIso, + }; + + const paymentAttemptId = deterministicPaymentAttemptId(keyHash); + const attempt: StoredPaymentAttempt = { + orderId, + providerId: resolvedPaymentProviderId, + status: "pending", + createdAt: nowIso, + updatedAt: nowIso, + }; + + const pendingState: CheckoutPendingState = { + kind: CHECKOUT_PENDING_KIND, + orderId, + paymentAttemptId, + providerId: resolvedPaymentProviderId, + cartId: ctx.input.cartId, + paymentPhase: "payment_pending", + finalizeToken, + totalMinor, + currency: cart.currency, + lineItems: orderLineItemsWithSnapshots, + createdAt: nowIso, + }; + + await idempotencyKeys.put(idempotencyDocId, { + route: CHECKOUT_ROUTE, + keyHash, + httpStatus: 202, + responseBody: pendingState, + createdAt: nowIso, + }); + + await orders.put(orderId, order); + await attempts.put(paymentAttemptId, attempt); + + const responseBody: CheckoutResponse = { + orderId, + paymentPhase: "payment_pending", + paymentAttemptId, + totalMinor, + currency: cart.currency, + finalizeToken, + }; + const replayIntegrity = await computeCheckoutReplayIntegrity(keyHash, responseBody); + + await idempotencyKeys.put(idempotencyDocId, { + route: CHECKOUT_ROUTE, + keyHash, + httpStatus: 200, + responseBody: { ...responseBody, replayIntegrity }, + createdAt: nowIso, + }); + + return responseBody; +} diff --git a/packages/plugins/commerce/src/handlers/cron.test.ts b/packages/plugins/commerce/src/handlers/cron.test.ts new file mode 100644 index 
000000000..d252484d2 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/cron.test.ts @@ -0,0 +1,75 @@ +import type { PluginContext } from "emdash"; +import { describe, expect, it, vi } from "vitest"; + +import { COMMERCE_LIMITS } from "../kernel/limits.js"; +import type { StoredIdempotencyKey } from "../types.js"; +import { handleIdempotencyCleanup } from "./cron.js"; + +class MemIdemp { + constructor(public readonly rows = new Map<string, StoredIdempotencyKey>()) {} + + async query(opts: { + where?: Record<string, unknown>; + limit?: number; + cursor?: string; + orderBy?: Record<string, "asc" | "desc">; + }) { + const where = opts.where ?? {}; + const lt = (where.createdAt as { lt?: string } | undefined)?.lt; + const items: Array<{ id: string; data: StoredIdempotencyKey }> = []; + for (const [id, data] of this.rows) { + if (lt !== undefined && typeof data.createdAt === "string" && !(data.createdAt < lt)) + continue; + items.push({ id, data: { ...data } }); + if (items.length >= (opts.limit ?? 100)) break; + } + return { items, hasMore: false, cursor: undefined as string | undefined }; + } + + async deleteMany(ids: string[]): Promise<number> { + let n = 0; + for (const id of ids) { + if (this.rows.delete(id)) n++; + } + return n; + } +} + +describe("handleIdempotencyCleanup", () => { + it("deletes rows older than TTL", async () => { + const old = new Date( + Date.now() - COMMERCE_LIMITS.idempotencyRecordTtlMs - 86_400_000, + ).toISOString(); + const recent = new Date().toISOString(); + const mem = new MemIdemp(); + mem.rows.set("a", { + route: "checkout", + keyHash: "h1", + httpStatus: 200, + responseBody: {}, + createdAt: old, + }); + mem.rows.set("b", { + route: "checkout", + keyHash: "h2", + httpStatus: 200, + responseBody: {}, + createdAt: recent, + }); + + const log = { info: vi.fn() }; + const ctx = { + storage: { idempotencyKeys: mem }, + log, + } as unknown as PluginContext; + + await handleIdempotencyCleanup(ctx); + + expect(mem.rows.has("a")).toBe(false); + expect(mem.rows.has("b")).toBe(true); +
expect(log.info).toHaveBeenCalledWith( + "commerce.cron.idempotency_cleanup", + expect.objectContaining({ deleted: 1 }), + ); + }); +}); diff --git a/packages/plugins/commerce/src/handlers/cron.ts b/packages/plugins/commerce/src/handlers/cron.ts new file mode 100644 index 000000000..642d501b4 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/cron.ts @@ -0,0 +1,40 @@ +/** + * Scheduled maintenance (idempotency TTL, future retention jobs). + */ + +import type { PluginContext } from "emdash"; + +import { COMMERCE_LIMITS } from "../kernel/limits.js"; +import type { StoredIdempotencyKey } from "../types.js"; +import { asCollection } from "./catalog-conflict.js"; + +/** + * Delete idempotency records older than {@link COMMERCE_LIMITS.idempotencyRecordTtlMs} + * (same window used for replay; expired rows are safe to remove). + */ +export async function handleIdempotencyCleanup(ctx: PluginContext): Promise<void> { + const coll = asCollection<StoredIdempotencyKey>(ctx.storage.idempotencyKeys); + const cutoffIso = new Date(Date.now() - COMMERCE_LIMITS.idempotencyRecordTtlMs).toISOString(); + let cursor: string | undefined; + let deleted = 0; + + do { + const batch = await coll.query({ + where: { createdAt: { lt: cutoffIso } }, + limit: 100, + cursor, + orderBy: { createdAt: "asc" }, + }); + + const ids = batch.items.map((row) => row.id); + if (ids.length > 0) { + deleted += await coll.deleteMany(ids); + } + + cursor = batch.cursor; + } while (cursor); + + if (deleted > 0) { + ctx.log.info("commerce.cron.idempotency_cleanup", { deleted }); + } +} diff --git a/packages/plugins/commerce/src/handlers/recommendations.test.ts b/packages/plugins/commerce/src/handlers/recommendations.test.ts new file mode 100644 index 000000000..412797918 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/recommendations.test.ts @@ -0,0 +1,32 @@ +import { PluginRouteError } from "emdash"; +import { describe, expect, it } from "vitest"; + +import type { RecommendationsInput } from "../schemas.js"; +import {
recommendationsHandler } from "./recommendations.js"; + +function ctx( + method: string, + input: RecommendationsInput = {}, +): Parameters<typeof recommendationsHandler>[0] { + return { + request: new Request("https://example.test/api", { method }), + input, + } as never; +} + +describe("recommendationsHandler", () => { + it("returns disabled payload on POST", async () => { + const out = await recommendationsHandler(ctx("POST", { limit: 5 })); + expect(out).toEqual({ + ok: true, + enabled: false, + strategy: "disabled", + productIds: [], + reason: "no_recommender_configured", + }); + }); + + it("rejects non-POST", async () => { + await expect(recommendationsHandler(ctx("GET"))).rejects.toThrow(PluginRouteError); + }); +}); diff --git a/packages/plugins/commerce/src/handlers/recommendations.ts b/packages/plugins/commerce/src/handlers/recommendations.ts new file mode 100644 index 000000000..937683be4 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/recommendations.ts @@ -0,0 +1,137 @@ +/** + * Read-only recommendation contract and route seam for third-party providers. + * + * Checkout and finalize are closed kernel paths. Recommendation hooks are + * explicitly read-only and may only post-derive product IDs.
+ */ + +import type { RouteContext } from "emdash"; + +import type { CommerceRecommendationResolver } from "../catalog-extensibility.js"; +import type { + CommerceRecommendationResult, + CommerceRecommendationInput, +} from "../catalog-extensibility.js"; +import { requirePost } from "../lib/require-post.js"; +import type { RecommendationsInput } from "../schemas.js"; + +export interface RecommendationsResponseBase { + ok: true; + productIds: readonly string[]; + reason: string; +} + +export interface RecommendationsDisabledResponse extends RecommendationsResponseBase { + enabled: false; + strategy: "disabled"; + productIds: []; + reason: "no_recommender_configured" | "provider_error" | "provider_empty" | "provider_invalid"; +} + +export interface RecommendationsEnabledResponse extends RecommendationsResponseBase { + enabled: true; + strategy: "provider"; + providerId?: string; +} + +export type RecommendationsResponse = + | RecommendationsDisabledResponse + | RecommendationsEnabledResponse; + +export type RecommendationsHandlerOptions = { + resolver?: CommerceRecommendationResolver; + providerId?: string; +}; + +const DISABLED_PROVIDER_RESPONSE: RecommendationsDisabledResponse = { + ok: true, + enabled: false, + strategy: "disabled", + productIds: [], + reason: "no_recommender_configured", +}; + +function normalizeLimit(limit: number | undefined): number { + if (typeof limit !== "number" || !Number.isFinite(limit)) return 10; + if (limit < 1) return 1; + return limit; +} + +function toInput(input: RecommendationsInput): CommerceRecommendationInput { + return { + productId: input.productId, + variantId: input.variantId, + cartId: input.cartId, + limit: normalizeLimit(input.limit), + }; +} + +function buildProviderResponse( + result: CommerceRecommendationResult | null | undefined, + inputLimit: number, + fallbackProviderId?: string, +): RecommendationsResponse { + if (!result) { + return { + ...DISABLED_PROVIDER_RESPONSE, + reason: "no_recommender_configured", + }; 
+ } + const productIds = (result.productIds ?? []) + .filter((value): value is string => typeof value === "string" && value.length > 0) + .filter((value, index, list) => index === list.indexOf(value)) + .slice(0, inputLimit); + if (productIds.length === 0) { + return { + ...DISABLED_PROVIDER_RESPONSE, + reason: "provider_empty", + }; + } + return { + ok: true, + enabled: true, + strategy: "provider", + productIds, + providerId: result.providerId ?? fallbackProviderId, + reason: result.reason ?? "provider_result", + }; +} + +export function createRecommendationsHandler( + options: RecommendationsHandlerOptions = {}, +): (ctx: RouteContext<RecommendationsInput>) => Promise<RecommendationsResponse> { + return async function handleRecommendations( + ctx: RouteContext<RecommendationsInput>, + ): Promise<RecommendationsResponse> { + requirePost(ctx); + const input = toInput(ctx.input); + if (!options.resolver) { + return DISABLED_PROVIDER_RESPONSE; + } + + try { + const resolved = await options.resolver(input); + return buildProviderResponse(resolved, input.limit ?? 10, options.providerId); + } catch { + return { + ok: true, + enabled: false, + strategy: "disabled", + productIds: [], + reason: "provider_error", + }; + } + }; +} + +export async function recommendationsHandler( + ctx: RouteContext<RecommendationsInput>, +): Promise<RecommendationsResponse> { + return createRecommendationsHandler()(ctx); +} + +/** + * Type-level contract to make the read-only recommendation seam obvious to + * external plugins and MCP tooling.
+ */ +export type { CommerceRecommendationResult, CommerceRecommendationInput }; diff --git a/packages/plugins/commerce/src/handlers/webhook-handler.test.ts b/packages/plugins/commerce/src/handlers/webhook-handler.test.ts new file mode 100644 index 000000000..d4dfaa6ea --- /dev/null +++ b/packages/plugins/commerce/src/handlers/webhook-handler.test.ts @@ -0,0 +1,152 @@ +import { beforeEach, describe, expect, it, vi } from "vitest"; + +const finalizePaymentFromWebhook = vi.fn(); +const consumeKvRateLimit = vi.fn(async (_opts?: unknown) => true); + +vi.mock("../orchestration/finalize-payment.js", () => ({ + __esModule: true, + finalizePaymentFromWebhook: (...args: unknown[]) => finalizePaymentFromWebhook(...args), +})); +vi.mock("../lib/rate-limit-kv.js", () => ({ + __esModule: true, + consumeKvRateLimit: (opts: unknown) => consumeKvRateLimit(opts), +})); + +import { createPaymentWebhookRoute } from "../services/commerce-extension-seams.js"; +import type { handlePaymentWebhook } from "./webhook-handler.js"; + +describe("payment webhook seam", () => { + beforeEach(() => { + finalizePaymentFromWebhook.mockReset(); + consumeKvRateLimit.mockReset(); + consumeKvRateLimit.mockResolvedValue(true); + }); + + function ctx(): Parameters<typeof handlePaymentWebhook>[0] { + return { + request: new Request("https://example.test/webhooks/stripe", { + method: "POST", + body: JSON.stringify({ + orderId: "order_1", + externalEventId: "evt_1", + finalizeToken: "tok", + }), + headers: { "content-length": "57" }, + }), + input: { orderId: "order_1", externalEventId: "evt_1", finalizeToken: "tok" }, + storage: { + orders: {} as never, + webhookReceipts: {} as never, + paymentAttempts: {} as never, + inventoryLedger: {} as never, + inventoryStock: {} as never, + }, + kv: {} as never, + requestMeta: { ip: "127.0.0.1" }, + log: { + info: () => undefined, + warn: () => undefined, + error: () => undefined, + debug: () => undefined, + }, + } as never; + } + + const adapter = { + providerId: "stripe", + verifyRequest:
vi.fn(async () => undefined), + buildFinalizeInput: vi.fn(() => ({ + orderId: "order_1", + externalEventId: "evt_1", + finalizeToken: "tok", + })), + buildCorrelationId: vi.fn(() => "corr:evt_1"), + buildRateLimitSuffix: vi.fn(() => "stripe:ip"), + }; + + it("adapts provider input and delegates to finalize-payment", async () => { + finalizePaymentFromWebhook.mockResolvedValue({ + kind: "completed", + orderId: "order_1", + }); + + const out = await createPaymentWebhookRoute(adapter)(ctx()); + + expect(adapter.verifyRequest).toHaveBeenCalledTimes(1); + expect(adapter.buildFinalizeInput).toHaveBeenCalledTimes(1); + expect(finalizePaymentFromWebhook).toHaveBeenCalledTimes(1); + expect(finalizePaymentFromWebhook).toHaveBeenCalledWith( + expect.anything(), + expect.objectContaining({ + orderId: "order_1", + externalEventId: "evt_1", + finalizeToken: "tok", + providerId: "stripe", + correlationId: "corr:evt_1", + }), + ); + expect(out).toEqual({ ok: true, replay: false, orderId: "order_1" }); + }); + + it("rejects non-POST webhook requests", async () => { + await expect( + createPaymentWebhookRoute(adapter)({ + ...(ctx() as ReturnType<typeof ctx>), + request: new Request("https://example.test/webhooks/stripe", { method: "GET" }), + } as never), + ).rejects.toMatchObject({ code: "METHOD_NOT_ALLOWED" }); + }); + + it("rejects oversized webhook payload by header cap", async () => { + await expect( + createPaymentWebhookRoute(adapter)({ + ...(ctx() as ReturnType<typeof ctx>), + request: new Request("https://example.test/webhooks/stripe", { + method: "POST", + body: "{}", + headers: { "content-length": `${Number.MAX_SAFE_INTEGER}` }, + }), + } as never), + ).rejects.toMatchObject({ code: "payload_too_large" }); + }); + + it("rejects oversized webhook payload when content-length is missing or malformed", async () => { + const bigBody = "x".repeat(65_537); + await expect( + createPaymentWebhookRoute(adapter)({ + ...(ctx() as ReturnType<typeof ctx>), + request: new Request("https://example.test/webhooks/stripe", {
+ method: "POST", + body: bigBody, + headers: { "content-length": "not-a-number" }, + }), + } as never), + ).rejects.toMatchObject({ code: "payload_too_large" }); + }); + + it("enforces webhook rate limit", async () => { + consumeKvRateLimit.mockResolvedValueOnce(false); + await expect(createPaymentWebhookRoute(adapter)(ctx())).rejects.toMatchObject({ + code: "rate_limited", + }); + expect(consumeKvRateLimit).toHaveBeenCalledTimes(1); + }); + + it("dedupes concurrent duplicate webhook deliveries", async () => { + let resolveFinalize!: () => void; + const finalizePromise = new Promise<{ kind: "completed"; orderId: string }>((resolve) => { + resolveFinalize = () => resolve({ kind: "completed", orderId: "order_1" }); + }); + finalizePaymentFromWebhook.mockReturnValue(finalizePromise); + + const first = createPaymentWebhookRoute(adapter)(ctx()); + const second = createPaymentWebhookRoute(adapter)(ctx()); + const all = Promise.all([first, second]); + + resolveFinalize(); + const [firstResult, secondResult] = await all; + expect(finalizePaymentFromWebhook).toHaveBeenCalledTimes(1); + expect(firstResult).toEqual({ ok: true, replay: false, orderId: "order_1" }); + expect(secondResult).toEqual({ ok: true, replay: false, orderId: "order_1" }); + }); +}); diff --git a/packages/plugins/commerce/src/handlers/webhook-handler.ts b/packages/plugins/commerce/src/handlers/webhook-handler.ts new file mode 100644 index 000000000..bd4267142 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/webhook-handler.ts @@ -0,0 +1,133 @@ +/** + * Shared payment-webhook orchestration entrypoint for gateway providers. + * + * The commerce kernel stays the only place that writes orders, payment attempts, + * webhook receipts, and inventory. Providers/third-party modules should adapt to + * this contract instead of writing storage directly. 
+ */ + +import type { RouteContext } from "emdash"; + +import { COMMERCE_LIMITS } from "../kernel/limits.js"; +import { buildRateLimitActorKey } from "../lib/rate-limit-identity.js"; +import { consumeKvRateLimit } from "../lib/rate-limit-kv.js"; +import { requirePost } from "../lib/require-post.js"; +import { + finalizePaymentFromWebhook, + type FinalizeWebhookInput, + type FinalizeWebhookResult, + type FinalizePaymentPorts, +} from "../orchestration/finalize-payment.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { + CommerceWebhookAdapter, + CommerceWebhookFinalizeResponse, + CommerceWebhookInput, +} from "../services/commerce-provider-contracts.js"; +import type { + StoredInventoryLedgerEntry, + StoredInventoryStock, + StoredOrder, + StoredPaymentAttempt, + StoredWebhookReceipt, +} from "../types.js"; +import { asCollection } from "./catalog-conflict.js"; +const inFlightWebhookFinalizeByKey = new Map<string, Promise<WebhookFinalizeResponse>>(); + +export type WebhookProviderInput = CommerceWebhookInput; + +export type WebhookFinalizeResponse = CommerceWebhookFinalizeResponse; + +export type { CommerceWebhookAdapter } from "../services/commerce-provider-contracts.js"; + +function buildFinalizePorts(ctx: RouteContext): FinalizePaymentPorts { + return { + orders: asCollection<StoredOrder>(ctx.storage.orders), + webhookReceipts: asCollection<StoredWebhookReceipt>(ctx.storage.webhookReceipts), + paymentAttempts: asCollection<StoredPaymentAttempt>(ctx.storage.paymentAttempts), + inventoryLedger: asCollection<StoredInventoryLedgerEntry>(ctx.storage.inventoryLedger), + inventoryStock: asCollection<StoredInventoryStock>(ctx.storage.inventoryStock), + log: ctx.log, + }; +} + +function toWebhookResult(result: FinalizeWebhookResult): WebhookFinalizeResponse { + if (result.kind === "replay") { + return { ok: true, replay: true, reason: result.reason }; + } + if (result.kind === "completed") { + return { ok: true, replay: false, orderId: result.orderId }; + } + // api_error + throwCommerceApiError(result.error); +} + +export async function handlePaymentWebhook( + ctx: RouteContext, +
adapter: CommerceWebhookAdapter, +): Promise<WebhookFinalizeResponse> { + requirePost(ctx); + + const contentLength = ctx.request.headers.get("content-length"); + const n = contentLength !== null && contentLength !== "" ? Number(contentLength) : Number.NaN; + if (Number.isFinite(n)) { + if (n > COMMERCE_LIMITS.maxWebhookBodyBytes) { + throwCommerceApiError({ + code: "PAYLOAD_TOO_LARGE", + message: "Webhook body is too large", + }); + } + } else { + const bodyText = await ctx.request.clone().text(); + const bodyBytes = new TextEncoder().encode(bodyText).byteLength; + if (bodyBytes > COMMERCE_LIMITS.maxWebhookBodyBytes) { + throwCommerceApiError({ + code: "PAYLOAD_TOO_LARGE", + message: "Webhook body is too large", + }); + } + } + + await adapter.verifyRequest(ctx); + + const input = adapter.buildFinalizeInput(ctx); + const inFlightKey = `${adapter.providerId}\0${input.orderId}\0${input.externalEventId}\0${input.finalizeToken}`; + + let pending = inFlightWebhookFinalizeByKey.get(inFlightKey); + if (!pending) { + pending = (async () => { + try { + const nowMs = Date.now(); + const ipHash = await buildRateLimitActorKey( + ctx, + `webhook:${adapter.buildRateLimitSuffix(ctx)}`, + ); + const allowed = await consumeKvRateLimit({ + kv: ctx.kv, + keySuffix: `webhook:${adapter.buildRateLimitSuffix(ctx)}:${ipHash}`, + limit: COMMERCE_LIMITS.defaultWebhookPerIpPerWindow, + windowMs: COMMERCE_LIMITS.defaultRateWindowMs, + nowMs, + }); + if (!allowed) { + throwCommerceApiError({ + code: "RATE_LIMITED", + message: "Too many webhook deliveries from this network path", + }); + } + + const finalInput: FinalizeWebhookInput = { + ...input, + providerId: adapter.providerId, + correlationId: adapter.buildCorrelationId(ctx), + }; + const result = await finalizePaymentFromWebhook(buildFinalizePorts(ctx), finalInput); + return toWebhookResult(result); + } finally { + inFlightWebhookFinalizeByKey.delete(inFlightKey); + } + })(); + inFlightWebhookFinalizeByKey.set(inFlightKey, pending); + } + return await pending; +}
diff --git a/packages/plugins/commerce/src/handlers/webhooks-stripe.test.ts b/packages/plugins/commerce/src/handlers/webhooks-stripe.test.ts new file mode 100644 index 000000000..a123ddc17 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/webhooks-stripe.test.ts @@ -0,0 +1,325 @@ +import { beforeEach, describe, expect, it, vi } from "vitest"; + +import { STRIPE_WEBHOOK_SIGNATURE } from "../services/commerce-provider-contracts.js"; +import { + clampStripeTolerance, + extractStripeFinalizeMetadata, + hashWithSecret, + isWebhookBodyWithinSizeLimit, + isWebhookSignatureValid, + parseStripeSignatureHeader, + resolveWebhookSignatureToleranceSeconds, + stripeWebhookHandler, +} from "./webhooks-stripe.js"; + +const finalizePaymentFromWebhook = vi.fn<(ports: unknown, input: unknown) => Promise>(); +const consumeKvRateLimit = vi.fn< + (input: { + kv: unknown; + keySuffix: string; + limit: number; + windowMs: number; + nowMs: number; + }) => Promise +>(async () => true); + +vi.mock("../orchestration/finalize-payment.js", () => ({ + __esModule: true, + finalizePaymentFromWebhook: (...args: Parameters) => + finalizePaymentFromWebhook(...args), +})); +vi.mock("../lib/rate-limit-kv.js", () => ({ + __esModule: true, + consumeKvRateLimit: (...args: Parameters) => + consumeKvRateLimit(...args), +})); + +describe("stripe webhook signature helpers", () => { + const secret = "whsec_test_secret"; + const rawBody = JSON.stringify({ orderId: "o1", externalEventId: "evt_1" }); + const rawStripeEventBody = JSON.stringify({ + id: "evt_live_test", + type: "payment_intent.succeeded", + data: { + object: { + id: "pi_live_test", + metadata: { + emdashOrderId: "order_1", + emdashFinalizeToken: "token_12345678901234", + }, + }, + }, + }); + const timestamp = 1_760_000_000; + + beforeEach(() => { + finalizePaymentFromWebhook.mockReset(); + consumeKvRateLimit.mockReset(); + consumeKvRateLimit.mockResolvedValue(true); + }); + + it("parses stripe signature header", async () => { + const hash 
= await hashWithSecret(secret, timestamp, rawBody); + const sig = `t=${timestamp},v1=${hash},v1=ignored`; + const parsed = parseStripeSignatureHeader(sig); + expect(parsed).toEqual({ + timestamp, + signatures: [hash, "ignored"], + }); + }); + + it("validates a matching v1 signature", async () => { + const hash = await hashWithSecret(secret, timestamp, rawBody); + const sig = `t=${timestamp},v1=${hash}`; + const restore = vi.spyOn(Date, "now").mockReturnValue(timestamp * 1000); + expect(await isWebhookSignatureValid(secret, rawBody, sig, 300)).toBe(true); + restore.mockRestore(); + }); + + it("rejects mismatched secret", async () => { + const hash = await hashWithSecret(secret, timestamp, rawBody); + const sig = `t=${timestamp},v1=${hash}`; + expect(await isWebhookSignatureValid("whsec_other_secret", rawBody, sig, 300)).toBe(false); + }); + + it("rejects missing timestamp", async () => { + const hash = await hashWithSecret(secret, timestamp, rawBody); + const sig = `v1=${hash}`; + expect(await isWebhookSignatureValid(secret, rawBody, sig, 300)).toBe(false); + }); + + it("rejects stale signatures", async () => { + const oldTimestamp = timestamp - 360; + const hash = await hashWithSecret(secret, oldTimestamp, rawBody); + const sig = `t=${oldTimestamp},v1=${hash}`; + // Tolerance is 300s; advance wall clock well beyond that vs signature timestamp. 
+ const mockNowSeconds = oldTimestamp + 400; + const restore = vi.spyOn(Date, "now").mockReturnValue(mockNowSeconds * 1000); + expect(await isWebhookSignatureValid(secret, rawBody, sig, 300)).toBe(false); + restore.mockRestore(); + }); + + it("accepts raw webhook bodies inside byte-size limit", () => { + expect(isWebhookBodyWithinSizeLimit("a".repeat(65_536))).toBe(true); + }); + + it("rejects raw webhook bodies over byte-size limit", () => { + expect(isWebhookBodyWithinSizeLimit("a".repeat(65_537))).toBe(false); + }); + + it("extracts Stripe finalize metadata from verified event payload", () => { + const metadata = extractStripeFinalizeMetadata(JSON.parse(rawStripeEventBody)); + expect(metadata).toEqual({ + externalEventId: "evt_live_test", + orderId: "order_1", + finalizeToken: "token_12345678901234", + }); + }); + + it("rejects event payload without required metadata", () => { + const metadata = extractStripeFinalizeMetadata({ + id: "evt_missing", + type: "payment_intent.succeeded", + data: { object: { id: "pi_1", metadata: {} } }, + }); + + expect(metadata).toBeNull(); + }); + + it("clamps webhook tolerance setting to configured bounds", () => { + expect(clampStripeTolerance(0)).toBe(STRIPE_WEBHOOK_SIGNATURE.minToleranceSeconds); + expect(clampStripeTolerance(9_999_999)).toBe(STRIPE_WEBHOOK_SIGNATURE.maxToleranceSeconds); + expect(clampStripeTolerance("150")).toBe(150); + }); + + it("resolves webhook tolerance from KV settings", async () => { + const ctx = { + kv: { + get: vi.fn(async (key: string) => { + return key === "settings:stripeWebhookToleranceSeconds" ? "7200" : null; + }), + }, + } as never; + + await expect(resolveWebhookSignatureToleranceSeconds(ctx)).resolves.toBe(7_200); + }); + + it("falls back to default tolerance for malformed settings", async () => { + const ctx = { + kv: { + get: vi.fn(async (key: string) => { + return key === "settings:stripeWebhookToleranceSeconds" ? 
"not-a-number" : null; + }), + }, + } as never; + + await expect(resolveWebhookSignatureToleranceSeconds(ctx)).resolves.toBe(300); + }); + + it("builds finalization input from verified Stripe event metadata", async () => { + finalizePaymentFromWebhook.mockResolvedValue({ + kind: "completed", + orderId: "order_1", + }); + + const webhookSecret = "whsec_live_test"; + const body = rawStripeEventBody; + const testTimestamp = 1_760_000_999; + const sig = `t=${testTimestamp},v1=${await hashWithSecret(webhookSecret, testTimestamp, body)}`; + const clock = vi.spyOn(Date, "now").mockReturnValue(testTimestamp * 1000); + + const ctx = { + request: new Request("https://example.test/webhooks/stripe", { + method: "POST", + body, + headers: { + "content-length": String(body.length), + "Stripe-Signature": sig, + }, + }), + input: JSON.parse(rawStripeEventBody), + storage: { + orders: {}, + webhookReceipts: {}, + paymentAttempts: {}, + inventoryLedger: {}, + inventoryStock: {}, + }, + kv: { + get: vi.fn(async (key: string) => { + if (key === "settings:stripeWebhookSecret") return webhookSecret; + if (key === "settings:stripeWebhookToleranceSeconds") return "300"; + return null; + }), + }, + requestMeta: { ip: "127.0.0.1" }, + log: { + info: () => undefined, + warn: () => undefined, + error: () => undefined, + debug: () => undefined, + }, + } as never; + + try { + await stripeWebhookHandler(ctx); + } finally { + clock.mockRestore(); + } + + expect(finalizePaymentFromWebhook).toHaveBeenCalledWith( + expect.anything(), + expect.objectContaining({ + orderId: "order_1", + externalEventId: "evt_live_test", + finalizeToken: "token_12345678901234", + providerId: "stripe", + correlationId: "evt_live_test", + }), + ); + }); + + it("rejects legacy direct payload shape now that webhook compatibility mode is removed", async () => { + const webhookSecret = "whsec_live_test"; + const legacyBody = JSON.stringify({ + orderId: "order_1", + externalEventId: "evt_legacy", + finalizeToken: 
"token_legacy_12345678901234", + }); + const testTimestamp = 1_760_001_001; + const sig = `t=${testTimestamp},v1=${await hashWithSecret(webhookSecret, testTimestamp, legacyBody)}`; + const clock = vi.spyOn(Date, "now").mockReturnValue(testTimestamp * 1000); + + const ctx = { + request: new Request("https://example.test/webhooks/stripe", { + method: "POST", + body: legacyBody, + headers: { + "content-length": String(legacyBody.length), + "Stripe-Signature": sig, + }, + }), + input: JSON.parse(legacyBody), + storage: { + orders: {}, + webhookReceipts: {}, + paymentAttempts: {}, + inventoryLedger: {}, + inventoryStock: {}, + }, + kv: { + get: vi.fn(async (key: string) => { + if (key === "settings:stripeWebhookSecret") return webhookSecret; + if (key === "settings:stripeWebhookToleranceSeconds") return "300"; + return null; + }), + }, + requestMeta: { ip: "127.0.0.1" }, + log: { + info: () => undefined, + warn: () => undefined, + error: () => undefined, + debug: () => undefined, + }, + } as never; + + try { + await expect(stripeWebhookHandler(ctx)).rejects.toMatchObject({ + code: "order_state_conflict", + }); + } finally { + clock.mockRestore(); + } + }); + + it("rejects Stripe event payloads missing metadata", async () => { + const webhookSecret = "whsec_live_test"; + const body = JSON.stringify({ + id: "evt_invalid", + type: "payment_intent.succeeded", + data: { object: { id: "pi_1", metadata: {} } }, + }); + const testTimestamp = 1_760_000_999; + const sig = `t=${testTimestamp},v1=${await hashWithSecret(webhookSecret, testTimestamp, body)}`; + const clock = vi.spyOn(Date, "now").mockReturnValue(testTimestamp * 1000); + + try { + await expect( + stripeWebhookHandler({ + request: new Request("https://example.test/webhooks/stripe", { + method: "POST", + body, + headers: { + "content-length": String(body.length), + "Stripe-Signature": sig, + }, + }), + input: JSON.parse(body), + storage: { + orders: {}, + webhookReceipts: {}, + paymentAttempts: {}, + inventoryLedger: 
{}, + inventoryStock: {}, + }, + kv: { + get: vi.fn(async (key: string) => { + if (key === "settings:stripeWebhookSecret") return webhookSecret; + if (key === "settings:stripeWebhookToleranceSeconds") return "300"; + return null; + }), + }, + requestMeta: { ip: "127.0.0.1" }, + log: { + info: () => undefined, + warn: () => undefined, + error: () => undefined, + debug: () => undefined, + }, + } as never), + ).rejects.toMatchObject({ code: "order_state_conflict" }); + } finally { + clock.mockRestore(); + } + }); +}); diff --git a/packages/plugins/commerce/src/handlers/webhooks-stripe.ts b/packages/plugins/commerce/src/handlers/webhooks-stripe.ts new file mode 100644 index 000000000..569f2d6e5 --- /dev/null +++ b/packages/plugins/commerce/src/handlers/webhooks-stripe.ts @@ -0,0 +1,226 @@ +/** + * Stripe webhook entrypoint with in-route signature verification. + * The route still accepts the typed JSON body for deterministic plugin tests. + */ + +import type { RouteContext } from "emdash"; + +import { COMMERCE_LIMITS } from "../kernel/limits.js"; +import { hmacSha256HexAsync, constantTimeEqualHexAsync } from "../lib/crypto-adapter.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { StripeWebhookEventInput, StripeWebhookInput } from "../schemas.js"; +import { STRIPE_WEBHOOK_SIGNATURE } from "../services/commerce-provider-contracts.js"; +import { handlePaymentWebhook, type CommerceWebhookAdapter } from "./webhook-handler.js"; + +const MAX_WEBHOOK_BODY_BYTES = COMMERCE_LIMITS.maxWebhookBodyBytes; +const STRIPE_SIGNATURE_HEADER = "Stripe-Signature"; +const STRIPE_SIGNATURE_TOLERANCE_SECONDS = STRIPE_WEBHOOK_SIGNATURE.defaultToleranceSeconds; +const STRIPE_SIGNATURE_TOLERANCE_MIN_SECONDS = STRIPE_WEBHOOK_SIGNATURE.minToleranceSeconds; +const STRIPE_SIGNATURE_TOLERANCE_MAX_SECONDS = STRIPE_WEBHOOK_SIGNATURE.maxToleranceSeconds; +const STRIPE_PROVIDER_ID = "stripe"; +const STRIPE_METADATA_ORDER_ID_KEYS = ["orderId", "emdashOrderId", 
"emdash_order_id"] as const; +const STRIPE_METADATA_FINALIZE_TOKEN_KEYS = [ + "finalizeToken", + "emdashFinalizeToken", + "emdash_finalize_token", +] as const; + +type ParsedStripeSignature = { + timestamp: number; + signatures: string[]; +}; + +type StripeMetadataInput = { + orderId: string; + finalizeToken: string; + externalEventId: string; +}; + +function normalizeHeaderKeyValue(raw: string): [string, string] | null { + const [key, value] = raw.split("=").map((entry) => entry.trim()); + if (!key || !value) return null; + return [key, value]; +} + +function clampStripeTolerance(raw: unknown): number { + const parsed = typeof raw === "number" ? raw : Number.parseInt(String(raw), 10); + if (!Number.isFinite(parsed) || Number.isNaN(parsed)) return STRIPE_SIGNATURE_TOLERANCE_SECONDS; + if (parsed < STRIPE_SIGNATURE_TOLERANCE_MIN_SECONDS) + return STRIPE_SIGNATURE_TOLERANCE_MIN_SECONDS; + if (parsed > STRIPE_SIGNATURE_TOLERANCE_MAX_SECONDS) + return STRIPE_SIGNATURE_TOLERANCE_MAX_SECONDS; + return parsed; +} + +function selectFromMetadata( + input: Record | undefined, + keys: readonly string[], +): string | undefined { + for (const key of keys) { + const value = input?.[key]; + if (typeof value === "string" && value.length > 0) return value; + } + return undefined; +} + +function extractStripeFinalizeMetadata(event: unknown): StripeMetadataInput | null { + if (!event || typeof event !== "object") return null; + const payload = event as StripeWebhookEventInput; + if (!("id" in payload) || typeof payload.id !== "string") return null; + if ( + !payload.data || + typeof payload.data !== "object" || + !payload.data.object || + typeof payload.data.object !== "object" + ) { + return null; + } + + const metadata = payload.data.object.metadata; + if (!metadata || typeof metadata !== "object") return null; + const objectMetadata = metadata as Record; + + const orderId = selectFromMetadata(objectMetadata, STRIPE_METADATA_ORDER_ID_KEYS); + const finalizeToken = 
selectFromMetadata(objectMetadata, STRIPE_METADATA_FINALIZE_TOKEN_KEYS);
+  if (!orderId || !finalizeToken) return null;
+
+  return {
+    orderId,
+    finalizeToken,
+    externalEventId: payload.id,
+  };
+}
+
+function parseStripeSignatureHeader(raw: string | null): ParsedStripeSignature | null {
+  if (!raw) return null;
+  const sigParts = raw.split(",");
+  let timestamp: number | null = null;
+  const signatures: string[] = [];
+
+  for (const part of sigParts) {
+    const pair = normalizeHeaderKeyValue(part);
+    if (!pair) continue;
+    const [key, value] = pair;
+    if (!key || !value) continue;
+    if (key === "t") {
+      const parsed = Number.parseInt(value, 10);
+      if (Number.isNaN(parsed)) return null;
+      timestamp = parsed;
+      continue;
+    }
+    if (key === "v1") {
+      signatures.push(value);
+    }
+  }
+  if (timestamp === null || signatures.length === 0) return null;
+  return { timestamp, signatures };
+}
+
+async function hashWithSecret(secret: string, timestamp: number, rawBody: string): Promise<string> {
+  return hmacSha256HexAsync(secret, `${timestamp}.${rawBody}`);
+}
+
+function isWebhookBodyWithinSizeLimit(rawBody: string): boolean {
+  return new TextEncoder().encode(rawBody).byteLength <= MAX_WEBHOOK_BODY_BYTES;
+}
+
+async function isWebhookSignatureValid(
+  secret: string,
+  rawBody: string,
+  rawSignature: string | null,
+  toleranceSeconds: number,
+): Promise<boolean> {
+  const parsed = parseStripeSignatureHeader(rawSignature);
+  if (!parsed) return false;
+  const now = Date.now() / 1000;
+  if (Math.abs(now - parsed.timestamp) > toleranceSeconds) return false;
+
+  const expected = await hashWithSecret(secret, parsed.timestamp, rawBody);
+  for (const sig of parsed.signatures) {
+    if (await constantTimeEqualHexAsync(sig, expected)) return true;
+  }
+  return false;
+}
+
+async function ensureValidStripeWebhookSignature(
+  ctx: RouteContext,
+): Promise<void> {
+  const secret = await ctx.kv.get("settings:stripeWebhookSecret");
+  if (typeof secret !== "string" || secret.length === 0) {
throwCommerceApiError({
+      code: "PROVIDER_UNAVAILABLE",
+      message: "Missing Stripe webhook signature secret",
+    });
+  }
+
+  const rawBody = await ctx.request.clone().text();
+  const tolerance = await resolveWebhookSignatureToleranceSeconds(ctx);
+  if (!isWebhookBodyWithinSizeLimit(rawBody)) {
+    throwCommerceApiError({
+      code: "PAYLOAD_TOO_LARGE",
+      message: "Webhook body is too large",
+    });
+  }
+  const rawSig = ctx.request.headers.get(STRIPE_SIGNATURE_HEADER);
+  if (!(await isWebhookSignatureValid(secret, rawBody, rawSig, tolerance))) {
+    throwCommerceApiError({
+      code: "WEBHOOK_SIGNATURE_INVALID",
+      message: "Invalid Stripe webhook signature",
+    });
+  }
+}
+
+async function resolveWebhookSignatureToleranceSeconds(
+  ctx: RouteContext,
+): Promise<number> {
+  const setting = await ctx.kv.get("settings:stripeWebhookToleranceSeconds");
+  if (typeof setting === "number") {
+    return clampStripeTolerance(setting);
+  }
+  return clampStripeTolerance(typeof setting === "string" ? setting : undefined);
+}
+
+const stripeWebhookAdapter: CommerceWebhookAdapter = {
+  providerId: STRIPE_PROVIDER_ID,
+  verifyRequest: ensureValidStripeWebhookSignature,
+  buildFinalizeInput(ctx) {
+    const parsedMetadata = extractStripeFinalizeMetadata(ctx.input);
+    if (!parsedMetadata) {
+      throwCommerceApiError({
+        code: "ORDER_STATE_CONFLICT",
+        message: "Missing required emDash webhook metadata",
+      });
+    }
+
+    return {
+      orderId: parsedMetadata.orderId,
+      externalEventId: parsedMetadata.externalEventId,
+      providerId: STRIPE_PROVIDER_ID,
+      finalizeToken: parsedMetadata.finalizeToken,
+    };
+  },
+  buildCorrelationId(ctx) {
+    const parsedMetadata = extractStripeFinalizeMetadata(ctx.input);
+    if (parsedMetadata) {
+      return parsedMetadata.externalEventId;
+    }
+    return "unknown-event";
+  },
+  buildRateLimitSuffix() {
+    return "stripe:ip";
+  },
+};
+
+export async function stripeWebhookHandler(ctx: RouteContext) {
+  return handlePaymentWebhook(ctx, stripeWebhookAdapter);
+}
+
+export {
+  hashWithSecret,
isWebhookBodyWithinSizeLimit, + resolveWebhookSignatureToleranceSeconds, + isWebhookSignatureValid, + clampStripeTolerance, + extractStripeFinalizeMetadata, + parseStripeSignatureHeader, +}; diff --git a/packages/plugins/commerce/src/index.ts b/packages/plugins/commerce/src/index.ts new file mode 100644 index 000000000..1c3acaec7 --- /dev/null +++ b/packages/plugins/commerce/src/index.ts @@ -0,0 +1,373 @@ +/** + * EmDash commerce plugin — kernel-first checkout + webhook finalize (Stripe wiring follows). + * + * Persistence: checkout writes the order and payment attempt as separate `put` calls; + * cron cleanup uses `deleteMany` on idempotency keys. Finalize uses interleaved + * ledger + stock `put`s per SKU to avoid inconsistent partial batches. + * + * @example + * ```ts + * // live.config.ts + * import { createPlugin } from "@emdash-cms/plugin-commerce"; + * export default defineConfig({ plugins: [createPlugin()] }); + * ``` + */ + +import type { PluginDefinition, PluginDescriptor, PluginRoute, RouteContext } from "emdash"; +import { definePlugin } from "emdash"; + +import { + COMMERCE_EXTENSION_HOOKS, + COMMERCE_KERNEL_RULES, + COMMERCE_RECOMMENDATION_HOOKS, + type CommerceRecommendationResolver, +} from "./catalog-extensibility.js"; +import { cartGetHandler, cartUpsertHandler } from "./handlers/cart.js"; +import { + addBundleComponentHandler, + removeBundleComponentHandler, + reorderBundleComponentHandler, + bundleComputeStorefrontHandler, +} from "./handlers/catalog.js"; +import { + createCategoryHandler, + listCategoriesHandler, + createProductCategoryLinkHandler, + removeProductCategoryLinkHandler, +} from "./handlers/catalog.js"; +import { + createDigitalAssetHandler, + createDigitalEntitlementHandler, + removeDigitalEntitlementHandler, +} from "./handlers/catalog.js"; +import { + reorderCatalogAssetHandler, + linkCatalogAssetHandler, + registerProductAssetHandler, + unlinkCatalogAssetHandler, +} from "./handlers/catalog.js"; +import { + 
createProductHandler, + updateProductHandler, + setProductStateHandler, + getStorefrontProductHandler, + createProductSkuHandler, + updateProductSkuHandler, + setSkuStatusHandler, + listStorefrontProductsHandler, + listStorefrontProductSkusHandler, +} from "./handlers/catalog.js"; +import { + createTagHandler, + listTagsHandler, + createProductTagLinkHandler, + removeProductTagLinkHandler, +} from "./handlers/catalog.js"; +import { checkoutGetOrderHandler } from "./handlers/checkout-get-order.js"; +import { checkoutHandler } from "./handlers/checkout.js"; +import { handleIdempotencyCleanup } from "./handlers/cron.js"; +import { stripeWebhookHandler } from "./handlers/webhooks-stripe.js"; +import { + cartGetInputSchema, + cartUpsertInputSchema, + productAssetLinkInputSchema, + productAssetReorderInputSchema, + productAssetRegisterInputSchema, + productAssetUnlinkInputSchema, + bundleComputeInputSchema, + bundleComponentAddInputSchema, + bundleComponentRemoveInputSchema, + bundleComponentReorderInputSchema, + categoryCreateInputSchema, + categoryListInputSchema, + digitalAssetCreateInputSchema, + digitalEntitlementCreateInputSchema, + digitalEntitlementRemoveInputSchema, + productCategoryLinkInputSchema, + productCategoryUnlinkInputSchema, + productCreateInputSchema, + productGetInputSchema, + productSkuStateInputSchema, + productListInputSchema, + productSkuCreateInputSchema, + productSkuUpdateInputSchema, + productSkuListInputSchema, + productStateInputSchema, + productUpdateInputSchema, + checkoutGetOrderInputSchema, + checkoutInputSchema, + tagCreateInputSchema, + tagListInputSchema, + productTagLinkInputSchema, + productTagUnlinkInputSchema, + recommendationsInputSchema, + stripeWebhookInputSchema, +} from "./schemas.js"; +import { createRecommendationsRoute } from "./services/commerce-extension-seams.js"; +import { COMMERCE_STORAGE_CONFIG } from "./storage.js"; + +/** + * The EmDash `definePlugin` route surface is bound to the commerce input contract + * per 
route. Route helpers keep public/admin registration centralized. + */ +type CommerceRouteHandler = (ctx: RouteContext) => Promise; + +/** + * Route helper constructors to keep public/private registration explicit and avoid + * accidental exposure of mutation endpoints. + */ +function adminRoute( + input: PluginRoute["input"], + handler: CommerceRouteHandler, +): PluginRoute { + return { + input, + handler, + } as PluginRoute; +} + +function publicRoute( + input: PluginRoute["input"], + handler: CommerceRouteHandler, +): PluginRoute { + return { + public: true, + input, + handler, + } as PluginRoute; +} + +/** Outbound Stripe API (`api.stripe.com`, `connect.stripe.com`, etc.). */ +const STRIPE_ALLOWED_HOSTS = ["*.stripe.com"] as const; + +/** + * Manifest-style descriptor uses the same storage declaration as {@link createPlugin}. + * Composite indexes are mirrored from runtime config for consistency. + */ +export function commercePlugin(): PluginDescriptor { + const storage = COMMERCE_STORAGE_CONFIG; + return { + id: "emdash-commerce", + version: "0.1.0", + entrypoint: "@emdash-cms/plugin-commerce", + capabilities: ["network:fetch"], + allowedHosts: [...STRIPE_ALLOWED_HOSTS], + storage, + }; +} + +export interface CommercePluginOptions { + extensions?: { + /** + * Optional read-only recommendation provider adapter for storefront features. + * The provider must only return product IDs and must not mutate commerce data. + */ + recommendationResolver?: CommerceRecommendationResolver; + /** + * Optional provider identifier for diagnostic/correlation output from recommender. 
+ */ + recommendationProviderId?: string; + }; +} + +export function createPlugin(options: CommercePluginOptions = {}) { + const recommendationsRouteHandler = createRecommendationsRoute({ + resolver: options.extensions?.recommendationResolver, + providerId: options.extensions?.recommendationProviderId, + }); + const pluginDefinition: PluginDefinition = { + id: "emdash-commerce", + version: "0.1.0", + capabilities: ["network:fetch"], + allowedHosts: [...STRIPE_ALLOWED_HOSTS], + storage: COMMERCE_STORAGE_CONFIG, + admin: { + settingsSchema: { + stripePublishableKey: { + type: "string", + label: "Stripe publishable key", + description: "Used by the storefront / Elements (pk_…).", + default: "", + }, + stripeSecretKey: { + type: "secret", + label: "Stripe secret key", + description: "Server-side API key (sk_…). Required for PaymentIntents and refunds.", + }, + stripeWebhookSecret: { + type: "secret", + label: "Stripe webhook signing secret", + description: "whsec_… from the Stripe Dashboard; used to verify webhook signatures.", + }, + defaultCurrency: { + type: "string", + label: "Default currency (ISO 4217)", + description: "Fallback when cart currency is absent (e.g. USD).", + default: "USD", + }, + }, + }, + + hooks: { + "plugin:activate": { + handler: async (_event, ctx) => { + if (ctx.cron) { + await ctx.cron.schedule("idempotency-cleanup", { schedule: "@weekly" }); + } + }, + }, + cron: { + handler: async (event, ctx) => { + if (event.name === "idempotency-cleanup") { + await handleIdempotencyCleanup(ctx); + } + }, + }, + }, + + routes: { + // Storefront-safe read and action routes (public API surface). 
+ "cart/upsert": publicRoute(cartUpsertInputSchema, cartUpsertHandler), + "cart/get": publicRoute(cartGetInputSchema, cartGetHandler), + "bundle/compute": publicRoute(bundleComputeInputSchema, bundleComputeStorefrontHandler), + "catalog/product/get": publicRoute(productGetInputSchema, getStorefrontProductHandler), + "catalog/category/list": publicRoute(categoryListInputSchema, listCategoriesHandler), + "catalog/tag/list": publicRoute(tagListInputSchema, listTagsHandler), + "catalog/products": publicRoute(productListInputSchema, listStorefrontProductsHandler), + "catalog/sku/list": publicRoute(productSkuListInputSchema, listStorefrontProductSkusHandler), + checkout: publicRoute(checkoutInputSchema, checkoutHandler), + "checkout/get-order": publicRoute(checkoutGetOrderInputSchema, checkoutGetOrderHandler), + recommendations: publicRoute(recommendationsInputSchema, recommendationsRouteHandler), + "webhooks/stripe": publicRoute(stripeWebhookInputSchema, stripeWebhookHandler), + + // Admin/auth-required catalog and commerce-admin mutation routes. 
+ "product-assets/register": adminRoute( + productAssetRegisterInputSchema, + registerProductAssetHandler, + ), + "catalog/asset/link": adminRoute(productAssetLinkInputSchema, linkCatalogAssetHandler), + "catalog/asset/unlink": adminRoute(productAssetUnlinkInputSchema, unlinkCatalogAssetHandler), + "catalog/asset/reorder": adminRoute( + productAssetReorderInputSchema, + reorderCatalogAssetHandler, + ), + "bundle-components/add": adminRoute(bundleComponentAddInputSchema, addBundleComponentHandler), + "bundle-components/remove": adminRoute( + bundleComponentRemoveInputSchema, + removeBundleComponentHandler, + ), + "bundle-components/reorder": adminRoute( + bundleComponentReorderInputSchema, + reorderBundleComponentHandler, + ), + "digital-assets/create": adminRoute(digitalAssetCreateInputSchema, createDigitalAssetHandler), + "digital-entitlements/create": adminRoute( + digitalEntitlementCreateInputSchema, + createDigitalEntitlementHandler, + ), + "digital-entitlements/remove": adminRoute( + digitalEntitlementRemoveInputSchema, + removeDigitalEntitlementHandler, + ), + "catalog/product/create": adminRoute(productCreateInputSchema, createProductHandler), + "catalog/product/update": adminRoute(productUpdateInputSchema, updateProductHandler), + "catalog/product/state": adminRoute(productStateInputSchema, setProductStateHandler), + "catalog/category/create": adminRoute(categoryCreateInputSchema, createCategoryHandler), + "catalog/category/link": adminRoute( + productCategoryLinkInputSchema, + createProductCategoryLinkHandler, + ), + "catalog/category/unlink": adminRoute( + productCategoryUnlinkInputSchema, + removeProductCategoryLinkHandler, + ), + "catalog/tag/create": adminRoute(tagCreateInputSchema, createTagHandler), + "catalog/tag/link": adminRoute(productTagLinkInputSchema, createProductTagLinkHandler), + "catalog/tag/unlink": adminRoute(productTagUnlinkInputSchema, removeProductTagLinkHandler), + "catalog/sku/create": adminRoute(productSkuCreateInputSchema, 
createProductSkuHandler), + "catalog/sku/update": adminRoute(productSkuUpdateInputSchema, updateProductSkuHandler), + "catalog/sku/state": adminRoute(productSkuStateInputSchema, setSkuStatusHandler), + }, + }; + return definePlugin(pluginDefinition); +} + +export default createPlugin; + +export type * from "./types.js"; +export type { CommerceStorage } from "./storage.js"; +export { COMMERCE_STORAGE_CONFIG } from "./storage.js"; +export { COMMERCE_SETTINGS_KEYS } from "./settings-keys.js"; +export { + COMMERCE_EXTENSION_HOOKS, + COMMERCE_RECOMMENDATION_HOOKS, + COMMERCE_KERNEL_RULES, +} from "./catalog-extensibility.js"; +export { + finalizePaymentFromWebhook, + webhookReceiptDocId, + receiptToView, + inventoryStockDocId, +} from "./orchestration/finalize-payment.js"; +export { throwCommerceApiError } from "./route-errors.js"; +export type { CommerceCatalogProductSearchFields } from "./catalog-extensibility.js"; +export { + createRecommendationsRoute, + createPaymentWebhookRoute, + queryFinalizationState, + COMMERCE_MCP_ACTORS, + type CommerceMcpActor, + type CommerceMcpOperationContext, +} from "./services/commerce-extension-seams.js"; +export { PAYMENT_DEFAULTS } from "./services/commerce-provider-contracts.js"; +export type { + CommerceProviderDescriptor, + CommerceProviderType, + CommerceWebhookInput, + CommerceWebhookFinalizeResponse, +} from "./services/commerce-provider-contracts.js"; +export type { RecommendationsHandlerOptions } from "./handlers/recommendations.js"; +export type { + CommerceWebhookAdapter, + WebhookFinalizeResponse, +} from "./handlers/webhook-handler.js"; +export type { RecommendationsResponse } from "./handlers/recommendations.js"; +export type { CheckoutGetOrderResponse } from "./handlers/checkout-get-order.js"; +export type { CartUpsertResponse, CartGetResponse } from "./handlers/cart.js"; +export type { + ProductResponse, + ProductListResponse, + ProductSkuResponse, + ProductSkuListResponse, + StorefrontProductDetail, + 
StorefrontProductListResponse, + StorefrontSkuListResponse, +} from "./handlers/catalog.js"; +export type { + CategoryResponse, + CategoryListResponse, + ProductCategoryLinkResponse, + ProductCategoryLinkUnlinkResponse, +} from "./handlers/catalog.js"; +export type { + TagResponse, + TagListResponse, + ProductTagLinkResponse, + ProductTagLinkUnlinkResponse, +} from "./handlers/catalog.js"; +export type { + ProductAssetResponse, + ProductAssetLinkResponse, + ProductAssetUnlinkResponse, +} from "./handlers/catalog.js"; +export type { + BundleComponentResponse, + BundleComponentUnlinkResponse, + BundleComputeResponse, + StorefrontBundleComputeResponse, +} from "./handlers/catalog.js"; +export type { + DigitalAssetResponse, + DigitalEntitlementResponse, + DigitalEntitlementUnlinkResponse, +} from "./handlers/catalog.js"; diff --git a/packages/plugins/commerce/src/kernel/api-errors.test.ts b/packages/plugins/commerce/src/kernel/api-errors.test.ts new file mode 100644 index 000000000..8223c40f5 --- /dev/null +++ b/packages/plugins/commerce/src/kernel/api-errors.test.ts @@ -0,0 +1,52 @@ +import { describe, expect, it } from "vitest"; + +import { toCommerceApiError } from "./api-errors.js"; +import { COMMERCE_ERRORS } from "./errors.js"; + +describe("toCommerceApiError", () => { + it("maps internal error code to wire code and metadata", () => { + const error = toCommerceApiError({ + code: "PAYMENT_ALREADY_PROCESSED", + message: "Payment already captured", + }); + + expect(error.code).toBe("payment_already_processed"); + expect(error.httpStatus).toBe(COMMERCE_ERRORS.PAYMENT_ALREADY_PROCESSED.httpStatus); + expect(error.retryable).toBe(COMMERCE_ERRORS.PAYMENT_ALREADY_PROCESSED.retryable); + expect(error.message).toBe("Payment already captured"); + expect(error.details).toBeUndefined(); + }); + + it("preserves optional details", () => { + const error = toCommerceApiError({ + code: "ORDER_STATE_CONFLICT", + message: "Order is not in finalizable state", + details: { orderId: 
"ord_123", phase: "canceled" }, + }); + + expect(error.details).toEqual({ orderId: "ord_123", phase: "canceled" }); + expect(error.code).toBe("order_state_conflict"); + }); + + it("preserves retryable metadata for retryable errors", () => { + const error = toCommerceApiError({ + code: "PROVIDER_UNAVAILABLE", + message: "Provider timed out", + }); + + expect(error.code).toBe("provider_unavailable"); + expect(error.retryable).toBe(true); + expect(error.httpStatus).toBe(COMMERCE_ERRORS.PROVIDER_UNAVAILABLE.httpStatus); + }); + + it("preserves retryable metadata for non-retryable errors", () => { + const error = toCommerceApiError({ + code: "WEBHOOK_SIGNATURE_INVALID", + message: "Invalid webhook signature", + }); + + expect(error.code).toBe("webhook_signature_invalid"); + expect(error.retryable).toBe(false); + expect(error.httpStatus).toBe(COMMERCE_ERRORS.WEBHOOK_SIGNATURE_INVALID.httpStatus); + }); +}); diff --git a/packages/plugins/commerce/src/kernel/api-errors.ts b/packages/plugins/commerce/src/kernel/api-errors.ts new file mode 100644 index 000000000..ed24d57db --- /dev/null +++ b/packages/plugins/commerce/src/kernel/api-errors.ts @@ -0,0 +1,38 @@ +import { + COMMERCE_ERRORS, + type CommerceErrorCode, + type CommerceWireErrorCode, + commerceErrorCodeToWire, +} from "./errors.js"; + +export type CommerceApiError = { + code: CommerceWireErrorCode; + message: string; + httpStatus: number; + retryable: boolean; + details?: Record; +}; + +export type CommerceApiErrorInput = { + code: CommerceErrorCode; + message: string; + details?: Record; +}; + +export function toCommerceApiError(input: CommerceApiErrorInput): CommerceApiError { + const { code, message, details } = input; + const meta = COMMERCE_ERRORS[code]; + + const payload: CommerceApiError = { + code: commerceErrorCodeToWire(code), + message, + httpStatus: meta.httpStatus, + retryable: meta.retryable, + }; + + if (details !== undefined) { + payload.details = details; + } + + return payload; +} diff --git 
a/packages/plugins/commerce/src/kernel/errors.test.ts b/packages/plugins/commerce/src/kernel/errors.test.ts new file mode 100644 index 000000000..5fb1df874 --- /dev/null +++ b/packages/plugins/commerce/src/kernel/errors.test.ts @@ -0,0 +1,35 @@ +import { describe, expect, it } from "vitest"; + +import { + COMMERCE_ERRORS, + COMMERCE_ERROR_WIRE_CODES, + commerceErrorCodeToWire, + type CommerceErrorCode, +} from "./errors.js"; + +const WIRE_PATTERN = /^[a-z][a-z0-9_]*$/; + +describe("commerceErrorCodeToWire", () => { + it("maps every internal code to a non-empty snake_case wire code", () => { + for (const key of Object.keys(COMMERCE_ERRORS) as CommerceErrorCode[]) { + const wire = commerceErrorCodeToWire(key); + expect(wire).toMatch(WIRE_PATTERN); + expect(wire.length).toBeGreaterThan(0); + } + }); + + it("COMMERCE_ERROR_WIRE_CODES has exactly the same keys as COMMERCE_ERRORS", () => { + type ToSortedStrings = string[] & { + toSorted: (compareFn?: (left: string, right: string) => number) => string[]; + }; + + const sortedWireKeys = (Object.keys(COMMERCE_ERROR_WIRE_CODES) as ToSortedStrings).toSorted(); + const sortedErrorKeys = (Object.keys(COMMERCE_ERRORS) as ToSortedStrings).toSorted(); + expect(sortedWireKeys).toEqual(sortedErrorKeys); + }); + + it("returns known mappings for representative codes", () => { + expect(commerceErrorCodeToWire("WEBHOOK_REPLAY_DETECTED")).toBe("webhook_replay_detected"); + expect(commerceErrorCodeToWire("ORDER_STATE_CONFLICT")).toBe("order_state_conflict"); + }); +}); diff --git a/packages/plugins/commerce/src/kernel/errors.ts b/packages/plugins/commerce/src/kernel/errors.ts new file mode 100644 index 000000000..824f673a4 --- /dev/null +++ b/packages/plugins/commerce/src/kernel/errors.ts @@ -0,0 +1,112 @@ +/** + * Canonical error metadata for commerce (kernel). + * + * **Internal vs wire:** `COMMERCE_ERRORS` keys are **internal** identifiers + * (`UPPER_SNAKE`, stable for TypeScript and kernel branches). 
Public HTTP/API + * payloads must use **wire** codes: `snake_case` strings from + * `COMMERCE_ERROR_WIRE_CODES` / `commerceErrorCodeToWire()`. Route handlers are + * responsible for that mapping; the kernel does not emit HTTP. + */ +export const COMMERCE_ERRORS = { + // Inventory + INVENTORY_CHANGED: { httpStatus: 409, retryable: false }, + INSUFFICIENT_STOCK: { httpStatus: 409, retryable: false }, + + // Product / catalog + ASSET_LINK_NOT_FOUND: { httpStatus: 404, retryable: false }, + ASSET_NOT_FOUND: { httpStatus: 404, retryable: false }, + BUNDLE_COMPONENT_NOT_FOUND: { httpStatus: 404, retryable: false }, + CATEGORY_LINK_NOT_FOUND: { httpStatus: 404, retryable: false }, + PRODUCT_UNAVAILABLE: { httpStatus: 404, retryable: false }, + DIGITAL_ASSET_NOT_FOUND: { httpStatus: 404, retryable: false }, + DIGITAL_ENTITLEMENT_NOT_FOUND: { httpStatus: 404, retryable: false }, + VARIANT_UNAVAILABLE: { httpStatus: 404, retryable: false }, + TAG_LINK_NOT_FOUND: { httpStatus: 404, retryable: false }, + + // Cart + CART_NOT_FOUND: { httpStatus: 404, retryable: false }, + CART_EXPIRED: { httpStatus: 410, retryable: false }, + CART_EMPTY: { httpStatus: 422, retryable: false }, + /** Caller did not supply an owner token but the cart requires one. */ + CART_TOKEN_REQUIRED: { httpStatus: 401, retryable: false }, + /** Supplied owner token does not match the stored hash. */ + CART_TOKEN_INVALID: { httpStatus: 403, retryable: false }, + + // Order + ORDER_NOT_FOUND: { httpStatus: 404, retryable: false }, + ORDER_STATE_CONFLICT: { httpStatus: 409, retryable: false }, + PAYMENT_CONFLICT: { httpStatus: 409, retryable: false }, + /** Caller did not supply a finalizeToken but the order requires one. */ + ORDER_TOKEN_REQUIRED: { httpStatus: 401, retryable: false }, + /** Supplied finalizeToken does not match the stored hash. 
*/ + ORDER_TOKEN_INVALID: { httpStatus: 403, retryable: false }, + + // Payment + PAYMENT_INITIATION_FAILED: { httpStatus: 502, retryable: true }, + PAYMENT_CONFIRMATION_FAILED: { httpStatus: 502, retryable: false }, + PAYMENT_ALREADY_PROCESSED: { httpStatus: 409, retryable: false }, + PROVIDER_UNAVAILABLE: { httpStatus: 503, retryable: true }, + + // Webhooks + WEBHOOK_SIGNATURE_INVALID: { httpStatus: 401, retryable: false }, + WEBHOOK_REPLAY_DETECTED: { httpStatus: 200, retryable: false }, + + // Discounts / coupons + INVALID_DISCOUNT: { httpStatus: 422, retryable: false }, + DISCOUNT_EXPIRED: { httpStatus: 410, retryable: false }, + + // Features / config + FEATURE_NOT_ENABLED: { httpStatus: 501, retryable: false }, + CURRENCY_MISMATCH: { httpStatus: 422, retryable: false }, + SHIPPING_REQUIRED: { httpStatus: 422, retryable: false }, + + // Abuse / limits + RATE_LIMITED: { httpStatus: 429, retryable: true }, + PAYLOAD_TOO_LARGE: { httpStatus: 413, retryable: false }, +} as const satisfies Record<string, { httpStatus: number; retryable: boolean }>; + +export type CommerceErrorCode = keyof typeof COMMERCE_ERRORS; + +/** Wire-level / public API error code (snake_case), stable across versions. 
*/ +export const COMMERCE_ERROR_WIRE_CODES = { + INVENTORY_CHANGED: "inventory_changed", + INSUFFICIENT_STOCK: "insufficient_stock", + PRODUCT_UNAVAILABLE: "product_unavailable", + ASSET_LINK_NOT_FOUND: "asset_link_not_found", + ASSET_NOT_FOUND: "asset_not_found", + BUNDLE_COMPONENT_NOT_FOUND: "bundle_component_not_found", + CATEGORY_LINK_NOT_FOUND: "category_link_not_found", + DIGITAL_ASSET_NOT_FOUND: "digital_asset_not_found", + DIGITAL_ENTITLEMENT_NOT_FOUND: "digital_entitlement_not_found", + TAG_LINK_NOT_FOUND: "tag_link_not_found", + VARIANT_UNAVAILABLE: "variant_unavailable", + CART_NOT_FOUND: "cart_not_found", + CART_EXPIRED: "cart_expired", + CART_EMPTY: "cart_empty", + CART_TOKEN_REQUIRED: "cart_token_required", + CART_TOKEN_INVALID: "cart_token_invalid", + ORDER_NOT_FOUND: "order_not_found", + ORDER_STATE_CONFLICT: "order_state_conflict", + PAYMENT_CONFLICT: "payment_conflict", + ORDER_TOKEN_REQUIRED: "order_token_required", + ORDER_TOKEN_INVALID: "order_token_invalid", + PAYMENT_INITIATION_FAILED: "payment_initiation_failed", + PAYMENT_CONFIRMATION_FAILED: "payment_confirmation_failed", + PAYMENT_ALREADY_PROCESSED: "payment_already_processed", + PROVIDER_UNAVAILABLE: "provider_unavailable", + WEBHOOK_SIGNATURE_INVALID: "webhook_signature_invalid", + WEBHOOK_REPLAY_DETECTED: "webhook_replay_detected", + INVALID_DISCOUNT: "invalid_discount", + DISCOUNT_EXPIRED: "discount_expired", + FEATURE_NOT_ENABLED: "feature_not_enabled", + CURRENCY_MISMATCH: "currency_mismatch", + SHIPPING_REQUIRED: "shipping_required", + RATE_LIMITED: "rate_limited", + PAYLOAD_TOO_LARGE: "payload_too_large", +} as const satisfies Record<CommerceErrorCode, string>; + +export type CommerceWireErrorCode = (typeof COMMERCE_ERROR_WIRE_CODES)[CommerceErrorCode]; + +export function commerceErrorCodeToWire(code: CommerceErrorCode): CommerceWireErrorCode { + return COMMERCE_ERROR_WIRE_CODES[code]; +} diff --git a/packages/plugins/commerce/src/kernel/finalize-decision.test.ts 
b/packages/plugins/commerce/src/kernel/finalize-decision.test.ts new file mode 100644 index 000000000..b033f32ad --- /dev/null +++ b/packages/plugins/commerce/src/kernel/finalize-decision.test.ts @@ -0,0 +1,138 @@ +import { describe, expect, it } from "vitest"; + +import { decidePaymentFinalize } from "./finalize-decision.js"; + +describe("decidePaymentFinalize", () => { + const cid = "corr-1"; + + it("proceeds when order awaits payment and no processed receipt", () => { + expect( + decidePaymentFinalize({ + orderStatus: "payment_pending", + receipt: { exists: false }, + correlationId: cid, + }), + ).toEqual({ action: "proceed", correlationId: cid }); + }); + + it("proceeds when order is authorized and no receipt exists", () => { + expect( + decidePaymentFinalize({ + orderStatus: "authorized", + receipt: { exists: false }, + correlationId: cid, + }), + ).toEqual({ action: "proceed", correlationId: cid }); + }); + + it("noop when order already paid and webhook receipt already processed (replay)", () => { + const d = decidePaymentFinalize({ + orderStatus: "paid", + receipt: { exists: true, status: "processed" }, + correlationId: cid, + }); + expect(d.action).toBe("noop"); + if (d.action === "noop") { + expect(d.httpStatus).toBe(200); + expect(d.code).toBe("WEBHOOK_REPLAY_DETECTED"); + expect(d.reason).toBe("webhook_receipt_processed"); + } + }); + + it("noop when webhook was already processed", () => { + const d = decidePaymentFinalize({ + orderStatus: "payment_pending", + receipt: { exists: true, status: "processed" }, + correlationId: cid, + }); + expect(d).toEqual({ + action: "noop", + reason: "webhook_receipt_processed", + httpStatus: 200, + code: "WEBHOOK_REPLAY_DETECTED", + }); + }); + + it("noop when webhook is duplicate", () => { + const d = decidePaymentFinalize({ + orderStatus: "payment_pending", + receipt: { exists: true, status: "duplicate" }, + correlationId: cid, + }); + expect(d).toMatchObject({ + action: "noop", + reason: "webhook_receipt_duplicate", 
+ httpStatus: 200, + code: "WEBHOOK_REPLAY_DETECTED", + }); + }); + + it("resumes finalization when webhook row is pending and order is already paid", () => { + const d = decidePaymentFinalize({ + orderStatus: "paid", + receipt: { exists: true, status: "pending" }, + correlationId: cid, + }); + expect(d).toEqual({ action: "proceed", correlationId: cid }); + }); + + it("continues when webhook row is pending and payment is still in progress", () => { + const d = decidePaymentFinalize({ + orderStatus: "payment_pending", + receipt: { exists: true, status: "pending" }, + correlationId: cid, + }); + expect(d).toEqual({ action: "proceed", correlationId: cid }); + }); + + it("continues when webhook row is pending while still authorized", () => { + const d = decidePaymentFinalize({ + orderStatus: "authorized", + receipt: { exists: true, status: "pending" }, + correlationId: cid, + }); + expect(d).toEqual({ action: "proceed", correlationId: cid }); + }); + + it("conflict when webhook is error", () => { + const d = decidePaymentFinalize({ + orderStatus: "payment_pending", + receipt: { exists: true, status: "error" }, + correlationId: cid, + }); + expect(d).toMatchObject({ + action: "noop", + reason: "webhook_error", + httpStatus: 409, + code: "ORDER_STATE_CONFLICT", + }); + }); + + it("conflict when order is in draft", () => { + const d = decidePaymentFinalize({ + orderStatus: "draft", + receipt: { exists: false }, + correlationId: cid, + }); + expect(d).toMatchObject({ + action: "noop", + reason: "order_not_finalizable", + httpStatus: 409, + code: "ORDER_STATE_CONFLICT", + }); + }); + + it("conflict when order is canceled", () => { + const d = decidePaymentFinalize({ + orderStatus: "canceled", + receipt: { exists: false }, + correlationId: cid, + }); + expect(d).toMatchObject({ + action: "noop", + reason: "order_not_finalizable", + httpStatus: 409, + code: "ORDER_STATE_CONFLICT", + }); + }); +}); diff --git a/packages/plugins/commerce/src/kernel/finalize-decision.ts 
b/packages/plugins/commerce/src/kernel/finalize-decision.ts new file mode 100644 index 000000000..cb8161a5e --- /dev/null +++ b/packages/plugins/commerce/src/kernel/finalize-decision.ts @@ -0,0 +1,141 @@ +/** + * Pure decision step: **may this finalize attempt proceed** given the current + * read model (order phase + webhook receipt row view). + * + * This is **not** the full payment-reconciliation or HTTP error surface. + * Signature verification, provider errors, inventory conflicts, and ledger + * writes live in orchestration and storage; they may introduce additional + * codes and outcomes beyond `FinalizeNoopCode`. + * + * `FinalizeNoopCode` stays intentionally narrow: only outcomes this helper + * can emit today. Do not overload `ORDER_STATE_CONFLICT` for unrelated + * domains here—extend orchestration or add dedicated decision helpers when + * those paths exist. + * + * Storage must insert `webhookReceipts` with a unique `externalEventId`; + * this module only interprets the read model passed in. + */ + +export type OrderPaymentPhase = + | "draft" + | "payment_pending" + | "authorized" + | "paid" + | "payment_conflict" + | "processing" + | "fulfilled" + | "refund_pending" + | "refunded" + | "canceled"; + +/** + * Minimal receipt state for idempotent finalize. **Storage-facing semantics to + * pin before persistence ships:** + * + * - **processed** — this `externalEventId` was fully handled; side effects + * (e.g. order transition) completed successfully. Terminal. + * - **duplicate** — derived duplicate classification from storage/orchestration: + * a delivery must not execute finalize again because it is redundant relative + * to an already-known receipt. Retry is not useful here. + * - **pending** — receipt row exists but processing is incomplete. Retry may be + * valid once that row resolves. + * - **error** — terminal failure recorded for this receipt row. 
Today this is used + * when the order record disappears while finalization is running; orchestration + * blocks automatic replay unless a future explicit recovery path is added. + */ +export type WebhookReceiptView = + | { exists: false } + | { exists: true; status: "processed" | "duplicate" | "error" | "pending" }; + +/** Internal ids; at HTTP boundary use `commerceErrorCodeToWire()` from `./errors`. */ +export type FinalizeNoopCode = "WEBHOOK_REPLAY_DETECTED" | "ORDER_STATE_CONFLICT"; +export type FinalizeNoopReason = + | "order_already_paid" + | "webhook_receipt_processed" + | "webhook_receipt_duplicate" + | "webhook_error" + | "webhook_pending" + | "order_not_finalizable"; + +export type FinalizeDecision = + | { action: "proceed"; correlationId: string } + | { + action: "noop"; + reason: FinalizeNoopReason; + httpStatus: number; + code: FinalizeNoopCode; + }; + +const FINALIZABLE: ReadonlySet<OrderPaymentPhase> = new Set(["payment_pending", "authorized"]); + +export function decidePaymentFinalize(input: { + orderStatus: OrderPaymentPhase; + receipt: WebhookReceiptView; + correlationId: string; +}): FinalizeDecision { + const { orderStatus, receipt, correlationId } = input; + + if (receipt.exists) { + if (receipt.status === "pending") { + if ( + orderStatus === "payment_pending" || + orderStatus === "authorized" || + orderStatus === "paid" + ) { + return { action: "proceed", correlationId }; + } + + return { + action: "noop", + reason: "webhook_pending", + httpStatus: 409, + code: "ORDER_STATE_CONFLICT", + }; + } + + if (receipt.status === "processed") { + return { + action: "noop", + reason: "webhook_receipt_processed", + httpStatus: 200, + code: "WEBHOOK_REPLAY_DETECTED", + }; + } + + if (receipt.status === "duplicate") { + return { + action: "noop", + reason: "webhook_receipt_duplicate", + httpStatus: 200, + code: "WEBHOOK_REPLAY_DETECTED", + }; + } + + return { + action: "noop", + reason: "webhook_error", + httpStatus: 409, + code: "ORDER_STATE_CONFLICT", + }; + } + + if 
(orderStatus === "paid") { + return { + action: "noop", + reason: "order_already_paid", + httpStatus: 200, + code: "WEBHOOK_REPLAY_DETECTED", + }; + } + + if (!FINALIZABLE.has(orderStatus)) { + return { + action: "noop", + reason: "order_not_finalizable", + httpStatus: 409, + code: "ORDER_STATE_CONFLICT", + }; + } + + return { action: "proceed", correlationId }; +} diff --git a/packages/plugins/commerce/src/kernel/idempotency-key.test.ts b/packages/plugins/commerce/src/kernel/idempotency-key.test.ts new file mode 100644 index 000000000..6727f9b82 --- /dev/null +++ b/packages/plugins/commerce/src/kernel/idempotency-key.test.ts @@ -0,0 +1,22 @@ +import { describe, expect, it } from "vitest"; + +import { validateIdempotencyKey } from "./idempotency-key.js"; + +describe("validateIdempotencyKey", () => { + it("rejects empty", () => { + expect(validateIdempotencyKey(undefined)).toBe(false); + expect(validateIdempotencyKey("")).toBe(false); + }); + + it("rejects too short", () => { + expect(validateIdempotencyKey("123456789012345")).toBe(false); // 15 + }); + + it("accepts 16-char printable", () => { + expect(validateIdempotencyKey("abcdefghijklmnop")).toBe(true); + }); + + it("rejects non-printable", () => { + expect(validateIdempotencyKey("abc\ndefghijklmnop")).toBe(false); + }); +}); diff --git a/packages/plugins/commerce/src/kernel/idempotency-key.ts b/packages/plugins/commerce/src/kernel/idempotency-key.ts new file mode 100644 index 000000000..5ebaf5edd --- /dev/null +++ b/packages/plugins/commerce/src/kernel/idempotency-key.ts @@ -0,0 +1,19 @@ +import { COMMERCE_LIMITS } from "./limits.js"; + +const PRINTABLE_ASCII = /^[\x21-\x7E]+$/; + +/** + * Validates client-supplied Idempotency-Key (header or body). + * Does not hash — storage layer hashes with route + user scope. 
+ */ +export function validateIdempotencyKey(key: string | undefined): key is string { + if (key === undefined || key === "") return false; + const len = key.length; + if ( + len < COMMERCE_LIMITS.minIdempotencyKeyLength || + len > COMMERCE_LIMITS.maxIdempotencyKeyLength + ) { + return false; + } + return PRINTABLE_ASCII.test(key); +} diff --git a/packages/plugins/commerce/src/kernel/limits.ts b/packages/plugins/commerce/src/kernel/limits.ts new file mode 100644 index 000000000..da2cc477b --- /dev/null +++ b/packages/plugins/commerce/src/kernel/limits.ts @@ -0,0 +1,29 @@ +/** Hard caps — enforce in route handlers before kernel work. */ +export const COMMERCE_LIMITS = { + maxCartLineItems: 50, + maxLineItemQty: 999, + maxIdempotencyKeyLength: 128, + minIdempotencyKeyLength: 16, + /** Server-side idempotency replay window (matches architecture TTL guidance). */ + idempotencyRecordTtlMs: 86_400_000, + /** Default fixed window for public cart/checkout rate limits (ms) */ + defaultRateWindowMs: 60_000, + defaultCheckoutPerIpPerWindow: 30, + defaultCartMutationsPerTokenPerWindow: 120, + defaultWebhookPerIpPerWindow: 120, + /** + * Finalization diagnostics (`queryFinalizationState`) per client IP per window. + * Tuned for moderate dashboard/MCP polling without hammering plugin storage. + */ + defaultFinalizationDiagnosticsPerIpPerWindow: 60, + /** Short KV read-through TTL for finalization diagnostics (Option B). */ + finalizationDiagnosticsCacheTtlMs: 10_000, + /** Bound attacker-controlled strings on webhook JSON (before Stripe raw body lands). */ + maxWebhookFieldLength: 512, + /** Cap on `recommendations` route `limit` query/body field. */ + maxRecommendationsLimit: 20, + /** Max raw webhook payload bytes validated before signature verification. */ + maxWebhookBodyBytes: 65_536, + /** Inventory threshold considered low-stock for product list summary display. 
*/ + lowStockThreshold: 0, +} as const; diff --git a/packages/plugins/commerce/src/kernel/provider-policy.ts b/packages/plugins/commerce/src/kernel/provider-policy.ts new file mode 100644 index 000000000..3b67122bd --- /dev/null +++ b/packages/plugins/commerce/src/kernel/provider-policy.ts @@ -0,0 +1,14 @@ +/** + * Defaults for outbound payment-provider HTTP calls (Layer B applies these). + * Adapters may override per gateway. + */ +export const PROVIDER_HTTP_POLICY = { + initiateTimeoutMs: 15_000, + refundTimeoutMs: 30_000, + /** Max retries for safe GET-style provider status polls only */ + maxIdempotentRetries: 2, + retryBackoffMs: [200, 800] as const, + circuitFailureThreshold: 5, + circuitWindowMs: 60_000, + circuitCooldownMs: 30_000, +} as const; diff --git a/packages/plugins/commerce/src/kernel/rate-limit-window.test.ts b/packages/plugins/commerce/src/kernel/rate-limit-window.test.ts new file mode 100644 index 000000000..adb9f3008 --- /dev/null +++ b/packages/plugins/commerce/src/kernel/rate-limit-window.test.ts @@ -0,0 +1,57 @@ +import { describe, expect, it } from "vitest"; + +import { nextRateLimitState } from "./rate-limit-window.js"; + +describe("nextRateLimitState", () => { + const windowMs = 60_000; + + it("allows first request in empty window", () => { + const r = nextRateLimitState(null, 1_000, 3, windowMs); + expect(r.allowed).toBe(true); + expect(r.bucket).toEqual({ count: 1, windowStartMs: 1_000 }); + }); + + it("increments within window", () => { + const b1 = nextRateLimitState(null, 1_000, 3, windowMs); + const b2 = nextRateLimitState(b1.bucket, 2_000, 3, windowMs); + const b3 = nextRateLimitState(b2.bucket, 3_000, 3, windowMs); + expect(b3.allowed).toBe(true); + expect(b3.bucket.count).toBe(3); + }); + + it("blocks when limit reached", () => { + let bucket = nextRateLimitState(null, 0, 2, windowMs).bucket; + bucket = nextRateLimitState(bucket, 100, 2, windowMs).bucket; + const blocked = nextRateLimitState(bucket, 200, 2, windowMs); + 
expect(blocked.allowed).toBe(false); + expect(blocked.bucket.count).toBe(2); + }); + + it("resets after window elapses", () => { + let bucket = nextRateLimitState(null, 0, 1, windowMs).bucket; + bucket = nextRateLimitState(bucket, 100, 1, windowMs).bucket; + expect(nextRateLimitState(bucket, 100, 1, windowMs).allowed).toBe(false); + const fresh = nextRateLimitState(bucket, windowMs + 1, 1, windowMs); + expect(fresh.allowed).toBe(true); + expect(fresh.bucket.count).toBe(1); + }); + + it("resets exactly at window boundary", () => { + const first = nextRateLimitState(null, 0, 1, windowMs).bucket; + const second = nextRateLimitState(first, windowMs, 1, windowMs); + expect(second.allowed).toBe(true); + expect(second.bucket).toEqual({ count: 1, windowStartMs: windowMs }); + }); + + it("blocks when window config is invalid", () => { + const denied = nextRateLimitState({ count: 1, windowStartMs: 1_000 }, 2_000, 0, windowMs); + expect(denied.allowed).toBe(false); + expect(denied.bucket).toEqual({ count: 1, windowStartMs: 1_000 }); + }); + + it("blocks when window size config is invalid", () => { + const denied = nextRateLimitState({ count: 1, windowStartMs: 1_000 }, 2_000, 2, -1); + expect(denied.allowed).toBe(false); + expect(denied.bucket).toEqual({ count: 1, windowStartMs: 1_000 }); + }); +}); diff --git a/packages/plugins/commerce/src/kernel/rate-limit-window.ts b/packages/plugins/commerce/src/kernel/rate-limit-window.ts new file mode 100644 index 000000000..b7aff213c --- /dev/null +++ b/packages/plugins/commerce/src/kernel/rate-limit-window.ts @@ -0,0 +1,64 @@ +export type RateBucket = { count: number; windowStartMs: number }; + +/** + * Fixed-window counter (simple, KV-friendly). Call after read-modify-write on KV. + * + * Fail-safe behavior: invalid inputs are treated as a hard rate limit block instead + * of silently disabling the limiter. 
+ */ +export function nextRateLimitState( + prev: RateBucket | null, + nowMs: number, + limit: number, + windowMs: number, +): { allowed: boolean; bucket: RateBucket } { + const previousWindow = + prev === null || !Number.isFinite(prev.count) || !Number.isFinite(prev.windowStartMs) + ? null + : prev; + const safeWindowStartMs = previousWindow?.windowStartMs ?? nowMs; + const safeNowMs = Number.isFinite(nowMs) ? nowMs : Number.NaN; + + if ( + !Number.isFinite(safeNowMs) || + safeNowMs < 0 || + !Number.isFinite(limit) || + !Number.isInteger(limit) || + limit < 1 || + !Number.isFinite(windowMs) || + !Number.isInteger(windowMs) || + windowMs < 1 + ) { + return { + allowed: false, + bucket: { + count: previousWindow ? previousWindow.count : 0, + windowStartMs: safeWindowStartMs, + }, + }; + } + + const previousCount = Math.max(0, Math.trunc(previousWindow ? previousWindow.count : 0)); + + if (!previousWindow || safeNowMs - previousWindow.windowStartMs >= windowMs) { + return { + allowed: true, + bucket: { count: 1, windowStartMs: safeNowMs }, + }; + } + + if (previousCount >= limit) { + return { + allowed: false, + bucket: { + count: previousCount, + windowStartMs: previousWindow.windowStartMs, + }, + }; + } + + return { + allowed: true, + bucket: { count: previousCount + 1, windowStartMs: previousWindow.windowStartMs }, + }; +} diff --git a/packages/plugins/commerce/src/lib/cart-fingerprint.test.ts b/packages/plugins/commerce/src/lib/cart-fingerprint.test.ts new file mode 100644 index 000000000..c86600835 --- /dev/null +++ b/packages/plugins/commerce/src/lib/cart-fingerprint.test.ts @@ -0,0 +1,41 @@ +import { describe, expect, it } from "vitest"; + +import { cartContentFingerprint } from "./cart-fingerprint.js"; + +describe("cartContentFingerprint", () => { + it("changes when line data changes", () => { + const a = cartContentFingerprint([ + { + productId: "p", + quantity: 1, + inventoryVersion: 1, + unitPriceMinor: 50, + }, + ]); + const b = cartContentFingerprint([ + 
{ + productId: "p", + quantity: 2, + inventoryVersion: 1, + unitPriceMinor: 50, + }, + ]); + expect(a).not.toBe(b); + }); + + it("is stable under line reorder", () => { + const line1 = { + productId: "a", + quantity: 1, + inventoryVersion: 1, + unitPriceMinor: 1, + }; + const line2 = { + productId: "b", + quantity: 1, + inventoryVersion: 1, + unitPriceMinor: 2, + }; + expect(cartContentFingerprint([line1, line2])).toBe(cartContentFingerprint([line2, line1])); + }); +}); diff --git a/packages/plugins/commerce/src/lib/cart-fingerprint.ts b/packages/plugins/commerce/src/lib/cart-fingerprint.ts new file mode 100644 index 000000000..1197cc3a1 --- /dev/null +++ b/packages/plugins/commerce/src/lib/cart-fingerprint.ts @@ -0,0 +1,12 @@ +/** + * Stable fingerprint of cart sellable content for idempotency scoping. + * Any change to lines, versions, qty, or prices yields a different hash. + */ + +import type { CartLineItem } from "../types.js"; +import { projectCartLineItemsForFingerprint } from "./cart-lines.js"; + +export function cartContentFingerprint(lines: CartLineItem[]): string { + const normalized = projectCartLineItemsForFingerprint(lines); + return JSON.stringify(normalized); +} diff --git a/packages/plugins/commerce/src/lib/cart-lines.test.ts b/packages/plugins/commerce/src/lib/cart-lines.test.ts new file mode 100644 index 000000000..e32204caf --- /dev/null +++ b/packages/plugins/commerce/src/lib/cart-lines.test.ts @@ -0,0 +1,51 @@ +import { describe, expect, it } from "vitest"; + +import { + projectCartLineItemsForFingerprint, + projectCartLineItemsForStorage, +} from "./cart-lines.js"; + +describe("cart line item projections", () => { + it("projects only stable cart line fields for storage", () => { + const input = [ + { + productId: "sku-1", + quantity: 2, + inventoryVersion: 9, + unitPriceMinor: 1234, + variantId: "variant-1", + extraField: "should disappear", + } as const, + ]; + + const projected = projectCartLineItemsForStorage(input); + + 
expect(projected).toEqual([ + { + productId: "sku-1", + variantId: "variant-1", + quantity: 2, + inventoryVersion: 9, + unitPriceMinor: 1234, + }, + ]); + }); + + it("normalizes for fingerprinting with deterministic sorting", () => { + const input = [ + { productId: "beta", quantity: 1, inventoryVersion: 1, unitPriceMinor: 100 }, + { productId: "alpha", quantity: 1, variantId: "z", inventoryVersion: 1, unitPriceMinor: 100 }, + { productId: "alpha", quantity: 2, inventoryVersion: 1, unitPriceMinor: 200 }, + { productId: "alpha", quantity: 1, variantId: "a", inventoryVersion: 1, unitPriceMinor: 150 }, + ]; + + const projected = projectCartLineItemsForFingerprint(input); + + expect(projected).toHaveLength(4); + expect(projected[0]?.productId).toBe("alpha"); + expect(projected[0]?.variantId).toBe(""); + expect(projected[1]?.variantId).toBe("a"); + expect(projected[2]?.variantId).toBe("z"); + expect(projected[3]?.productId).toBe("beta"); + }); +}); diff --git a/packages/plugins/commerce/src/lib/cart-lines.ts b/packages/plugins/commerce/src/lib/cart-lines.ts new file mode 100644 index 000000000..8fd5019c4 --- /dev/null +++ b/packages/plugins/commerce/src/lib/cart-lines.ts @@ -0,0 +1,52 @@ +import type { CartLineItem } from "../types.js"; +import { sortedImmutable } from "./sort-immutable.js"; + +export type CanonicalCartLineItem = { + productId: string; + variantId?: string; + quantity: number; + inventoryVersion: number; + unitPriceMinor: number; +}; + +type CartFingerprintLine = { + productId: string; + variantId: string; + quantity: number; + inventoryVersion: number; + unitPriceMinor: number; +}; + +export function projectCartLineItemsForStorage( + lines: ReadonlyArray<CartLineItem>, +): CanonicalCartLineItem[] { + return lines.map((line) => ({ + productId: line.productId, + variantId: line.variantId, + quantity: line.quantity, + inventoryVersion: line.inventoryVersion, + unitPriceMinor: line.unitPriceMinor, + })); +} + +function compareByProductAndVariant( + left: { productId: 
string; variantId: string }, + right: { productId: string; variantId: string }, +) { + const productOrder = left.productId.localeCompare(right.productId); + if (productOrder !== 0) return productOrder; + return left.variantId.localeCompare(right.variantId); +} + +export function projectCartLineItemsForFingerprint( + lines: ReadonlyArray<CartLineItem>, +): CartFingerprintLine[] { + const projected = Array.from(lines, (line) => ({ + productId: line.productId, + variantId: line.variantId ?? "", + quantity: line.quantity, + inventoryVersion: line.inventoryVersion, + unitPriceMinor: line.unitPriceMinor, + })); + return sortedImmutable(projected, compareByProductAndVariant); +} diff --git a/packages/plugins/commerce/src/lib/cart-owner-token.ts b/packages/plugins/commerce/src/lib/cart-owner-token.ts new file mode 100644 index 000000000..c0906448f --- /dev/null +++ b/packages/plugins/commerce/src/lib/cart-owner-token.ts @@ -0,0 +1,41 @@ +import { equalSha256HexDigestAsync, sha256HexAsync } from "../lib/crypto-adapter.js"; +import { throwCommerceApiError } from "../route-errors.js"; +import type { StoredCart } from "../types.js"; + +export type CartOwnerTokenOperation = "read" | "mutate" | "checkout"; + +/** + * The raw `ownerToken` must be presented and match `ownerTokenHash` for all carts. 
+ */ +export async function assertCartOwnerToken( + cart: StoredCart, + ownerToken: string | undefined, + op: CartOwnerTokenOperation, +): Promise<void> { + if (!cart.ownerTokenHash) { + throwCommerceApiError({ + code: "CART_TOKEN_REQUIRED", + message: "Cart ownership token is required but not configured", + }); + } + + const presented = ownerToken?.trim(); + if (!presented) { + const messages: Record<CartOwnerTokenOperation, string> = { + read: "An owner token is required to read this cart", + mutate: "An owner token is required to modify this cart", + checkout: "An owner token is required to check out this cart", + }; + throwCommerceApiError({ + code: "CART_TOKEN_REQUIRED", + message: messages[op], + }); + } + const presentedHash = await sha256HexAsync(presented); + if (!(await equalSha256HexDigestAsync(presentedHash, cart.ownerTokenHash))) { + throwCommerceApiError({ + code: "CART_TOKEN_INVALID", + message: "Owner token is invalid", + }); + } +} diff --git a/packages/plugins/commerce/src/lib/cart-validation.ts b/packages/plugins/commerce/src/lib/cart-validation.ts new file mode 100644 index 000000000..c4b5fa1c6 --- /dev/null +++ b/packages/plugins/commerce/src/lib/cart-validation.ts @@ -0,0 +1,22 @@ +import { COMMERCE_LIMITS } from "../kernel/limits.js"; +import type { CartLineItem } from "../types.js"; + +export function validateCartLineItems(lines: ReadonlyArray<CartLineItem>): string | null { + for (const line of lines) { + if ( + !Number.isInteger(line.quantity) || + line.quantity < 1 || + line.quantity > COMMERCE_LIMITS.maxLineItemQty + ) { + return `Line item quantity must be between 1 and ${COMMERCE_LIMITS.maxLineItemQty}`; + } + if (!Number.isInteger(line.inventoryVersion) || line.inventoryVersion < 0) { + return "Line item inventory version must be a non-negative integer"; + } + if (!Number.isInteger(line.unitPriceMinor) || line.unitPriceMinor < 0) { + return "Line item unit price must be a non-negative integer"; + } + } + + return null; +} diff --git 
a/packages/plugins/commerce/src/lib/catalog-bundles.test.ts b/packages/plugins/commerce/src/lib/catalog-bundles.test.ts new file mode 100644 index 000000000..012789c45 --- /dev/null +++ b/packages/plugins/commerce/src/lib/catalog-bundles.test.ts @@ -0,0 +1,119 @@ +import { describe, expect, it } from "vitest"; + +import { computeBundleSummary } from "./catalog-bundles.js"; + +const skuA = { + id: "sku_1", + productId: "prod_bundle", + skuCode: "B-A", + status: "active", + unitPriceMinor: 200, + inventoryQuantity: 12, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", +} as const; + +const skuB = { + id: "sku_2", + productId: "prod_parent", + skuCode: "B-B", + status: "active", + unitPriceMinor: 50, + inventoryQuantity: 3, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", +} as const; + +describe("bundle discount summary", () => { + it("computes fixed-discount availability and final price", () => { + const out = computeBundleSummary("bundle_1", "fixed_amount", 180, undefined, [ + { + component: { + id: "c1", + bundleProductId: "bundle_1", + componentSkuId: "sku_1", + quantity: 2, + position: 0, + createdAt: "2026", + updatedAt: "2026", + }, + sku: skuA, + }, + { + component: { + id: "c2", + bundleProductId: "bundle_1", + componentSkuId: "sku_2", + quantity: 1, + position: 1, + createdAt: "2026", + updatedAt: "2026", + }, + sku: skuB, + }, + ]); + expect(out.subtotalMinor).toBe(450); + expect(out.discountAmountMinor).toBe(180); + expect(out.finalPriceMinor).toBe(270); + expect(out.availability).toBe(3); + expect(out.components[0]!.availableBundleQuantity).toBe(6); + expect(out.components[1]!.availableBundleQuantity).toBe(3); + }); + + it("computes percentage discounts with floor behavior", () => { + const out = computeBundleSummary("bundle_1", "percentage", undefined, 
2_000, [ + { + component: { + id: "c1", + bundleProductId: "bundle_1", + componentSkuId: "sku_1", + quantity: 2, + position: 0, + createdAt: "2026", + updatedAt: "2026", + }, + sku: skuA, + }, + ]); + expect(out.subtotalMinor).toBe(400); + expect(out.discountAmountMinor).toBe(80); + expect(out.finalPriceMinor).toBe(320); + }); + + it("sets availability to zero when any component is inactive", () => { + const out = computeBundleSummary("bundle_1", "none", undefined, undefined, [ + { + component: { + id: "c1", + bundleProductId: "bundle_1", + componentSkuId: "sku_1", + quantity: 2, + position: 0, + createdAt: "2026", + updatedAt: "2026", + }, + sku: skuA, + }, + { + component: { + id: "c2", + bundleProductId: "bundle_1", + componentSkuId: "sku_2", + quantity: 1, + position: 1, + createdAt: "2026", + updatedAt: "2026", + }, + sku: { ...skuB, status: "inactive" }, + }, + ]); + expect(out.availability).toBe(0); + expect(out.components[1]!.availableBundleQuantity).toBe(0); + }); +}); diff --git a/packages/plugins/commerce/src/lib/catalog-bundles.ts b/packages/plugins/commerce/src/lib/catalog-bundles.ts new file mode 100644 index 000000000..ba65adbde --- /dev/null +++ b/packages/plugins/commerce/src/lib/catalog-bundles.ts @@ -0,0 +1,84 @@ +import type { BundleDiscountType, StoredBundleComponent, StoredProductSku } from "../types.js"; + +export type BundleComputeComponentSummary = { + componentId: string; + componentSkuId: string; + componentSkuCode: string; + componentProductId: string; + componentPriceMinor: number; + quantityPerBundle: number; + subtotalContributionMinor: number; + availableBundleQuantity: number; +}; + +export type BundleComputeSummary = { + productId: string; + subtotalMinor: number; + discountType: BundleDiscountType; + discountValueMinor: number; + discountValueBps: number; + discountAmountMinor: number; + finalPriceMinor: number; + availability: number; + components: BundleComputeComponentSummary[]; +}; + +export type BundleComputeInputLine = { + 
component: StoredBundleComponent; + sku: StoredProductSku; +}; + +export function computeBundleSummary( + productId: string, + discountType: BundleDiscountType | undefined, + discountValueMinor: number | undefined, + discountValueBps: number | undefined, + lines: BundleComputeInputLine[], +): BundleComputeSummary { + const type: BundleDiscountType = discountType ?? "none"; + const resolvedDiscountValueMinor = Math.max(0, discountValueMinor ?? 0); + const resolvedDiscountValueBps = Math.min(10_000, Math.max(0, discountValueBps ?? 0)); + + const summaryLines: BundleComputeComponentSummary[] = lines.map((line) => { + const qty = Math.max(1, line.component.quantity); + const componentAvailable = + line.sku.status !== "active" ? 0 : Math.floor(line.sku.inventoryQuantity / qty); + return { + componentId: line.component.id, + componentSkuId: line.component.componentSkuId, + componentSkuCode: line.sku.skuCode, + componentProductId: line.sku.productId, + componentPriceMinor: line.sku.unitPriceMinor, + quantityPerBundle: line.component.quantity, + subtotalContributionMinor: line.sku.unitPriceMinor * line.component.quantity, + availableBundleQuantity: componentAvailable, + }; + }); + + const subtotalMinor = summaryLines.reduce((sum, line) => sum + line.subtotalContributionMinor, 0); + const rawDiscountAmount = + type === "fixed_amount" + ? resolvedDiscountValueMinor + : type === "percentage" + ? Math.floor((subtotalMinor * resolvedDiscountValueBps) / 10_000) + : 0; + const discountAmountMinor = Math.max(0, Math.min(subtotalMinor, rawDiscountAmount)); + const finalPriceMinor = Math.max(0, subtotalMinor - discountAmountMinor); + + const availability = + summaryLines.length === 0 + ? 
0 + : Math.min(...summaryLines.map((line) => line.availableBundleQuantity)); + + return { + productId, + subtotalMinor, + discountType: type, + discountValueMinor: resolvedDiscountValueMinor, + discountValueBps: resolvedDiscountValueBps, + discountAmountMinor, + finalPriceMinor, + availability, + components: summaryLines, + }; +} diff --git a/packages/plugins/commerce/src/lib/catalog-domain.test.ts b/packages/plugins/commerce/src/lib/catalog-domain.test.ts new file mode 100644 index 000000000..3f0cc56de --- /dev/null +++ b/packages/plugins/commerce/src/lib/catalog-domain.test.ts @@ -0,0 +1,109 @@ +import { describe, expect, it } from "vitest"; + +import type { StoredProduct, StoredProductSku } from "../types.js"; +import { applyProductSkuUpdatePatch, applyProductUpdatePatch } from "./catalog-domain.js"; + +const isoNow = "2026-01-01T00:00:00.000Z"; + +function asProductPatch( + value: Parameters[1], +): Parameters[1] { + return value as Parameters[1]; +} + +function asSkuPatch( + value: Parameters[1], +): Parameters[1] { + return value as Parameters[1]; +} + +describe("catalog-domain helpers", () => { + it("prevents immutable product fields from being updated", () => { + const product: StoredProduct = { + id: "prod_1", + type: "simple", + status: "draft", + visibility: "hidden", + slug: "existing", + title: "Original", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2025-12-01T00:00:00.000Z", + updatedAt: "2025-12-01T00:00:00.000Z", + }; + + expect(() => + applyProductUpdatePatch(product, asProductPatch({ type: "bundle" }), isoNow), + ).toThrow(); + }); + + it("prevents slug rewrites on active products", () => { + const product: StoredProduct = { + id: "prod_1", + type: "simple", + status: "active", + visibility: "public", + slug: "existing", + title: "Original", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + 
createdAt: "2025-12-01T00:00:00.000Z", + updatedAt: "2025-12-01T00:00:00.000Z", + }; + + expect(() => + applyProductUpdatePatch(product, asProductPatch({ slug: "new-slug" }), isoNow), + ).toThrow(); + }); + + it("applies safe mutable product and sku updates", () => { + const product: StoredProduct = { + id: "prod_1", + type: "simple", + status: "draft", + visibility: "hidden", + slug: "existing", + title: "Original", + shortDescription: "", + longDescription: "", + featured: false, + sortOrder: 0, + requiresShippingDefault: true, + createdAt: "2025-12-01T00:00:00.000Z", + updatedAt: "2025-12-01T00:00:00.000Z", + }; + + const productResult = applyProductUpdatePatch( + product, + asProductPatch({ title: "Updated" }), + isoNow, + ); + expect(productResult.title).toBe("Updated"); + expect(productResult.updatedAt).toBe(isoNow); + expect(productResult.id).toBe("prod_1"); + + const sku: StoredProductSku = { + id: "sku_1", + productId: "prod_1", + skuCode: "SKU-1", + status: "active", + unitPriceMinor: 1000, + inventoryQuantity: 5, + inventoryVersion: 1, + requiresShipping: true, + isDigital: false, + createdAt: "2025-12-01T00:00:00.000Z", + updatedAt: "2025-12-01T00:00:00.000Z", + }; + + const skuResult = applyProductSkuUpdatePatch(sku, asSkuPatch({ unitPriceMinor: 1200 }), isoNow); + expect(skuResult.unitPriceMinor).toBe(1200); + expect(skuResult.productId).toBe("prod_1"); + }); +}); diff --git a/packages/plugins/commerce/src/lib/catalog-domain.ts b/packages/plugins/commerce/src/lib/catalog-domain.ts new file mode 100644 index 000000000..ad4a4c519 --- /dev/null +++ b/packages/plugins/commerce/src/lib/catalog-domain.ts @@ -0,0 +1,113 @@ +import { PluginRouteError } from "emdash"; + +import type { + ProductSkuUpdateInput as ProductSkuUpdateInputSchema, + ProductUpdateInput as ProductUpdateInputSchema, +} from "../schemas.js"; +import type { StoredProduct, StoredProductSku } from "../types.js"; + +export const PRODUCT_IMMUTABLE_FIELDS = [ + "id", + "type", + "createdAt", +] 
as const satisfies readonly (keyof StoredProduct)[]; + +export const PRODUCT_SKU_IMMUTABLE_FIELDS = [ + "id", + "productId", + "createdAt", +] as const satisfies readonly (keyof StoredProductSku)[]; + +type ProductPatch = Omit; +type ProductSkuPatch = Omit; + +type DraftProductForLifecycle = Pick; + +export function applyProductUpdatePatch( + existing: StoredProduct, + patch: T, + nowIso: string, +): StoredProduct { + const patchMap = patch as Record; + + for (const field of PRODUCT_IMMUTABLE_FIELDS) { + const proposed = patchMap[field]; + if (proposed !== undefined && proposed !== existing[field]) { + throw PluginRouteError.badRequest(`Cannot update immutable field: ${field}`); + } + } + + if (patch.slug !== undefined && existing.status === "active" && patch.slug !== existing.slug) { + throw PluginRouteError.badRequest("Cannot change slug after a product is active"); + } + + const next = applyProductLifecycle( + { + ...existing, + ...patch, + updatedAt: nowIso, + }, + nowIso, + ); + return next; +} + +export function applyProductStatusTransition( + existing: StoredProduct, + nextStatus: StoredProduct["status"], + nowIso: string, +): StoredProduct { + return applyProductLifecycle( + { + ...existing, + status: nextStatus, + }, + nowIso, + ); +} + +export function applyProductSkuUpdatePatch( + existing: StoredProductSku, + patch: T, + nowIso: string, +): StoredProductSku { + const patchMap = patch as Record; + for (const field of PRODUCT_SKU_IMMUTABLE_FIELDS) { + const proposed = patchMap[field]; + if (proposed !== undefined && proposed !== existing[field]) { + throw PluginRouteError.badRequest(`Cannot update immutable field: ${field}`); + } + } + + return { + ...existing, + ...patch, + updatedAt: nowIso, + }; +} + +function applyProductLifecycle( + product: T, + nowIso: string, +): T { + if (product.status === "active") { + return { + ...product, + publishedAt: product.publishedAt ?? 
nowIso, + archivedAt: undefined, + }; + } + + if (product.status === "archived") { + return { + ...product, + archivedAt: nowIso, + }; + } + + return { + ...product, + archivedAt: undefined, + publishedAt: product.publishedAt, + }; +} diff --git a/packages/plugins/commerce/src/lib/catalog-dto.ts b/packages/plugins/commerce/src/lib/catalog-dto.ts new file mode 100644 index 000000000..16f0668c7 --- /dev/null +++ b/packages/plugins/commerce/src/lib/catalog-dto.ts @@ -0,0 +1,100 @@ +import type { + StoredCategory, + StoredProduct, + StoredProductAttribute, + StoredProductSku, + StoredProductTag, +} from "../types.js"; +import type { BundleComputeSummary } from "./catalog-bundles.js"; + +export type BundleSummaryDTO = BundleComputeSummary; + +export type ProductCategoryDTO = Pick< + StoredCategory, + "id" | "name" | "slug" | "parentId" | "position" +>; + +export type ProductTagDTO = Pick; + +export interface ProductDigitalEntitlementSummary { + skuId: string; + entitlements: Array<{ + entitlementId: string; + digitalAssetId: string; + digitalAssetLabel?: string; + grantedQuantity: number; + downloadLimit?: number; + downloadExpiryDays?: number; + isManualOnly: boolean; + isPrivate: boolean; + }>; +} + +export type VariantMatrixDTO = { + skuId: string; + skuCode: string; + status: StoredProductSku["status"]; + unitPriceMinor: number; + compareAtPriceMinor?: number; + inventoryQuantity: number; + inventoryVersion: number; + requiresShipping: boolean; + isDigital: boolean; + image?: ProductPrimaryImageDTO; + options: Array<{ + attributeId: string; + attributeValueId: string; + }>; +}; + +export interface ProductInventorySummaryDTO { + /** Number of SKUs attached to the product. */ + skuCount: number; + /** Number of SKUs currently active. */ + activeSkuCount: number; + /** Sum of inventory across all SKUs. 
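The lifecycle stamping rules in `applyProductLifecycle` condense to a small sketch: activation sets `publishedAt` once and never overwrites it, archiving stamps `archivedAt`, and leaving the archived state clears it again.

```typescript
// Sketch of the product lifecycle timestamp rules.
type LifecycleFields = {
  status: "draft" | "active" | "archived";
  publishedAt?: string;
  archivedAt?: string;
};

function lifecycle(p: LifecycleFields, nowIso: string): LifecycleFields {
  if (p.status === "active") {
    // First activation stamps publishedAt; re-activation preserves it.
    return { ...p, publishedAt: p.publishedAt ?? nowIso, archivedAt: undefined };
  }
  if (p.status === "archived") {
    return { ...p, archivedAt: nowIso };
  }
  // Any non-archived, non-active state clears archivedAt.
  return { ...p, archivedAt: undefined };
}
```

Keeping the original `publishedAt` on re-activation is what makes "first published" a stable, auditable fact for catalog history.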
*/ + totalInventoryQuantity: number; +} + +export interface ProductPriceRangeDTO { + minUnitPriceMinor?: number; + maxUnitPriceMinor?: number; +} + +export interface ProductPrimaryImageDTO { + linkId: string; + assetId: string; + provider: string; + externalAssetId: string; + fileName?: string; + altText?: string; +} + +export interface ProductDetailDTO { + product: StoredProduct; + skus: StoredProductSku[]; + attributes?: StoredProductAttribute[]; + variantMatrix?: VariantMatrixDTO[]; + categories: ProductCategoryDTO[]; + tags: ProductTagDTO[]; + digitalEntitlements?: ProductDigitalEntitlementSummary[]; + bundleSummary?: BundleSummaryDTO; + primaryImage?: ProductPrimaryImageDTO; + galleryImages?: ProductPrimaryImageDTO[]; +} + +export interface CatalogListingDTO { + product: StoredProduct; + priceRange: ProductPriceRangeDTO; + inventorySummary: ProductInventorySummaryDTO; + primaryImage?: ProductPrimaryImageDTO; + galleryImages?: ProductPrimaryImageDTO[]; + lowStockSkuCount?: number; + categories: ProductCategoryDTO[]; + tags: ProductTagDTO[]; +} + +export type ProductAdminDTO = CatalogListingDTO & { + /** Explicitly include low-cardinality state for admin surfaces. 
*/ + lowStockSkuCount: number; +}; diff --git a/packages/plugins/commerce/src/lib/catalog-order-snapshots.ts b/packages/plugins/commerce/src/lib/catalog-order-snapshots.ts new file mode 100644 index 000000000..74de46ee3 --- /dev/null +++ b/packages/plugins/commerce/src/lib/catalog-order-snapshots.ts @@ -0,0 +1,329 @@ +import type { StorageCollection } from "emdash"; + +import type { + OrderLineItemBundleComponentSummary, + OrderLineItemBundleSummary, + OrderLineItemDigitalEntitlementSnapshot, + OrderLineItemImageSnapshot, + OrderLineItemOptionSelection, + OrderLineItemSnapshot, + StoredBundleComponent, + StoredDigitalAsset, + StoredDigitalEntitlement, + StoredInventoryStock, + StoredProduct, + StoredProductAsset, + StoredProductAssetLink, + StoredProductSku, + StoredProductSkuOptionValue, +} from "../types.js"; +import { computeBundleSummary } from "./catalog-bundles.js"; +import { inventoryStockDocId } from "./inventory-stock.js"; +import { sortedImmutable } from "./sort-immutable.js"; + +export type CatalogSnapshotCollections = { + products: Pick<StorageCollection<StoredProduct>, "get" | "query">; + productSkus: Pick<StorageCollection<StoredProductSku>, "get" | "query">; + productSkuOptionValues: Pick<StorageCollection<StoredProductSkuOptionValue>, "get" | "query">; + productDigitalAssets: Pick<StorageCollection<StoredDigitalAsset>, "get" | "query">; + productDigitalEntitlements: Pick<StorageCollection<StoredDigitalEntitlement>, "get" | "query">; + productAssetLinks: Pick<StorageCollection<StoredProductAssetLink>, "get" | "query">; + productAssets: Pick<StorageCollection<StoredProductAsset>, "get" | "query">; + bundleComponents: Pick<StorageCollection<StoredBundleComponent>, "get" | "query">; + /** Required for bundle snapshots: per-component stock versions at checkout. 
*/ + inventoryStock: { get(id: string): Promise }; +}; + +type QueryCollection = Pick, "get" | "query">; + +type SnapshotLineInput = { + productId: string; + variantId?: string; + quantity: number; + unitPriceMinor: number; + inventoryVersion: number; +}; + +export async function buildOrderLineSnapshots( + lines: ReadonlyArray, + currency: string, + catalog: CatalogSnapshotCollections, +): Promise { + const snapshots = await Promise.all( + lines.map((line) => buildOrderLineSnapshot(line, currency, catalog)), + ); + return snapshots; +} + +function createFallbackLineSnapshot( + line: SnapshotLineInput, + currency: string, +): OrderLineItemSnapshot { + const lineSubtotalMinor = line.unitPriceMinor * line.quantity; + return { + productId: line.productId, + skuId: line.variantId ?? line.productId, + productType: "simple", + productTitle: line.productId, + skuCode: line.variantId ?? line.productId, + selectedOptions: [], + currency, + unitPriceMinor: line.unitPriceMinor, + lineSubtotalMinor, + lineDiscountMinor: 0, + lineTotalMinor: lineSubtotalMinor, + requiresShipping: true, + isDigital: false, + }; +} + +async function buildOrderLineSnapshot( + line: SnapshotLineInput, + currency: string, + catalog: CatalogSnapshotCollections, +): Promise { + const product = await catalog.products.get(line.productId); + if (!product) { + return createFallbackLineSnapshot(line, currency); + } + + const base: OrderLineItemSnapshot = { + productId: product.id, + productType: product.type, + productTitle: product.title, + productSlug: product.slug, + skuId: line.variantId ?? line.productId, + skuCode: line.variantId ?? 
line.productId, + selectedOptions: [], + currency, + unitPriceMinor: line.unitPriceMinor, + lineSubtotalMinor: line.unitPriceMinor * line.quantity, + lineDiscountMinor: 0, + lineTotalMinor: line.unitPriceMinor * line.quantity, + requiresShipping: true, + isDigital: false, + }; + + const sku = await resolveSkuForSnapshot(line, product, catalog.productSkus); + if (sku) { + base.skuId = sku.id; + base.skuCode = sku.skuCode; + base.compareAtPriceMinor = sku.compareAtPriceMinor; + base.requiresShipping = sku.requiresShipping; + base.isDigital = sku.isDigital; + base.unitPriceMinor = sku.unitPriceMinor; + base.lineSubtotalMinor = sku.unitPriceMinor * line.quantity; + base.lineTotalMinor = base.lineSubtotalMinor; + if (product.type === "variable") { + base.selectedOptions = await querySkuOptionSelections(sku.id, catalog.productSkuOptionValues); + } + } + + if (product.type === "bundle") { + const bundleSummary = await buildBundleSummary(product.id, catalog); + if (bundleSummary) { + base.bundleSummary = bundleSummary.summary; + base.unitPriceMinor = bundleSummary.summary.finalPriceMinor; + base.lineSubtotalMinor = bundleSummary.summary.subtotalMinor * line.quantity; + base.lineDiscountMinor = bundleSummary.summary.discountAmountMinor * line.quantity; + base.lineTotalMinor = bundleSummary.summary.finalPriceMinor * line.quantity; + base.requiresShipping = bundleSummary.requiresShipping; + } + } + + const targetType = line.variantId ? "sku" : "product"; + const targetId = line.variantId ?? product.id; + const preferredRoles = line.variantId + ? 
(["variant_image", "primary_image"] as const) + : (["primary_image"] as const); + base.image = await queryRepresentativeImage({ + productAssetLinks: catalog.productAssetLinks, + productAssets: catalog.productAssets, + targetType, + targetId, + roles: preferredRoles, + }); + if (!base.image && line.variantId) { + base.image = await queryRepresentativeImage({ + productAssetLinks: catalog.productAssetLinks, + productAssets: catalog.productAssets, + targetType: "product", + targetId: product.id, + roles: ["primary_image"], + }); + } + + if (sku) { + const entitlements = await collectDigitalEntitlements(sku.id, catalog); + if (entitlements.length > 0) { + base.digitalEntitlements = entitlements; + } + } + + return base; +} + +async function resolveSkuForSnapshot( + line: SnapshotLineInput, + product: StoredProduct, + productSkus: Pick, "get" | "query">, +): Promise { + if (line.variantId) { + const sku = await productSkus.get(line.variantId); + if (!sku || sku.productId !== line.productId) { + return null; + } + return sku; + } + + if (product.type === "variable") { + return null; + } + + const rows = await productSkus.query({ where: { productId: line.productId }, limit: 5 }); + if (rows.items.length !== 1) { + return null; + } + const row = rows.items[0]; + if (!row) return null; + return row.data; +} + +async function buildBundleSummary( + productId: string, + catalog: CatalogSnapshotCollections, +): Promise<{ summary: OrderLineItemBundleSummary; requiresShipping: boolean } | undefined> { + const componentRows = await catalog.bundleComponents.query({ + where: { bundleProductId: productId }, + }); + if (componentRows.items.length === 0) return undefined; + + const componentLines: { component: StoredBundleComponent; sku: StoredProductSku }[] = []; + for (const row of componentRows.items) { + const component = row.data; + const sku = await catalog.productSkus.get(component.componentSkuId); + if (!sku) continue; + componentLines.push({ component, sku }); + } + if 
(componentLines.length === 0) return undefined; + + const product = await catalog.products.get(productId); + if (!product) return undefined; + + const summary = computeBundleSummary( + productId, + product.bundleDiscountType, + product.bundleDiscountValueMinor, + product.bundleDiscountValueBps, + componentLines.map((entry) => ({ + component: entry.component, + sku: entry.sku, + })), + ); + const components: OrderLineItemBundleComponentSummary[] = await Promise.all( + summary.components.map(async (component) => { + const stockId = inventoryStockDocId(component.componentProductId, component.componentSkuId); + const stock = await catalog.inventoryStock.get(stockId); + return { + componentId: component.componentId, + componentSkuId: component.componentSkuId, + componentSkuCode: component.componentSkuCode, + componentProductId: component.componentProductId, + componentPriceMinor: component.componentPriceMinor, + quantityPerBundle: component.quantityPerBundle, + subtotalContributionMinor: component.subtotalContributionMinor, + availableBundleQuantity: component.availableBundleQuantity, + componentInventoryVersion: stock?.version ?? 
-1, + }; + }), + ); + + const out: OrderLineItemBundleSummary = { + productId, + subtotalMinor: summary.subtotalMinor, + discountType: summary.discountType, + discountValueMinor: summary.discountValueMinor, + discountValueBps: summary.discountValueBps, + discountAmountMinor: summary.discountAmountMinor, + finalPriceMinor: summary.finalPriceMinor, + availability: summary.availability, + components, + }; + const requiresShipping = componentLines.some((line) => line.sku.requiresShipping); + return { summary: out, requiresShipping }; +} + +async function collectDigitalEntitlements( + skuId: string, + catalog: CatalogSnapshotCollections, +): Promise { + const entitlements = await catalog.productDigitalEntitlements.query({ + where: { skuId }, + limit: 200, + }); + const out: OrderLineItemDigitalEntitlementSnapshot[] = []; + for (const row of entitlements.items) { + const entitlement = row.data; + const asset = await catalog.productDigitalAssets.get(entitlement.digitalAssetId); + if (!asset) continue; + out.push({ + entitlementId: entitlement.id, + digitalAssetId: entitlement.digitalAssetId, + digitalAssetLabel: asset.label, + grantedQuantity: entitlement.grantedQuantity, + downloadLimit: asset.downloadLimit, + downloadExpiryDays: asset.downloadExpiryDays, + isManualOnly: asset.isManualOnly, + isPrivate: asset.isPrivate, + }); + } + return out; +} + +async function querySkuOptionSelections( + skuId: string, + productSkuOptionValues: QueryCollection, +): Promise { + const options = await productSkuOptionValues.query({ where: { skuId } }); + const ordered = sortedImmutable( + options.items.map((row) => ({ + attributeId: row.data.attributeId, + attributeValueId: row.data.attributeValueId, + })), + (left, right) => + left.attributeId.localeCompare(right.attributeId) || + left.attributeValueId.localeCompare(right.attributeValueId), + ); + return ordered; +} + +async function queryRepresentativeImage(input: { + productAssetLinks: QueryCollection; + productAssets: 
QueryCollection; + targetType: StoredProductAssetLink["targetType"]; + targetId: string; + roles: readonly StoredProductAssetLink["role"][]; +}): Promise { + const links = await input.productAssetLinks.query({ + where: { targetType: input.targetType, targetId: input.targetId }, + }); + const sorted = sortedImmutable( + links.items.map((row) => row.data), + (left, right) => left.position - right.position || left.id.localeCompare(right.id), + ); + const acceptedRoles = new Set(input.roles); + for (const link of sorted) { + if (!acceptedRoles.has(link.role)) continue; + const asset = await input.productAssets.get(link.assetId); + if (!asset) continue; + return { + linkId: link.id, + assetId: asset.id, + provider: asset.provider, + externalAssetId: asset.externalAssetId, + fileName: asset.fileName, + altText: asset.altText, + }; + } + return undefined; +} diff --git a/packages/plugins/commerce/src/lib/catalog-variants.test.ts b/packages/plugins/commerce/src/lib/catalog-variants.test.ts new file mode 100644 index 000000000..73a349c1f --- /dev/null +++ b/packages/plugins/commerce/src/lib/catalog-variants.test.ts @@ -0,0 +1,147 @@ +import { describe, expect, it } from "vitest"; + +import type { StoredProductAttribute, StoredProductAttributeValue } from "../types.js"; +import { + collectVariantDefiningAttributes, + validateVariableSkuOptions, +} from "./catalog-variants.js"; + +describe("catalog variant invariants", () => { + const colorAttribute: StoredProductAttribute = { + id: "attr_color", + productId: "prod_1", + name: "Color", + code: "color", + kind: "variant_defining", + position: 0, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }; + const sizeAttribute: StoredProductAttribute = { + id: "attr_size", + productId: "prod_1", + name: "Size", + code: "size", + kind: "variant_defining", + position: 1, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }; + const labelAttribute: StoredProductAttribute 
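The representative-image lookup in `queryRepresentativeImage` reduces to a small selection rule, sketched here without the storage round-trips: order links by position (ties broken by id), then take the first link whose role is in the preferred set.

```typescript
// Sketch of the representative-image selection order used for snapshots.
type LinkSketch = { id: string; role: string; position: number };

function pickImage(
  links: readonly LinkSketch[],
  roles: readonly string[],
): LinkSketch | undefined {
  const accepted = new Set(roles);
  // Deterministic order: position first, then id as a stable tiebreaker.
  const sorted = [...links].sort(
    (a, b) => a.position - b.position || a.id.localeCompare(b.id),
  );
  return sorted.find((link) => accepted.has(link.role));
}
```

For variant lines the caller passes `["variant_image", "primary_image"]` and falls back to the product-level `primary_image` when nothing matches, which is why the preference set is an argument rather than a constant.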
= { + id: "attr_label", + productId: "prod_1", + name: "Label", + code: "label", + kind: "descriptive", + position: 2, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }; + + const valueColorRed: StoredProductAttributeValue = { + id: "val_red", + attributeId: "attr_color", + value: "Red", + code: "red", + position: 0, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }; + const valueColorBlue: StoredProductAttributeValue = { + id: "val_blue", + attributeId: "attr_color", + value: "Blue", + code: "blue", + position: 1, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }; + const valueSizeS: StoredProductAttributeValue = { + id: "val_s", + attributeId: "attr_size", + value: "Small", + code: "s", + position: 0, + createdAt: "2026-01-01T00:00:00.000Z", + updatedAt: "2026-01-01T00:00:00.000Z", + }; + it("filters variant-defining attributes", () => { + const selected = collectVariantDefiningAttributes([ + colorAttribute, + sizeAttribute, + labelAttribute, + ]); + expect(selected.map((row) => row.code)).toEqual(["color", "size"]); + }); + + it("rejects SKU options missing or extra variant-defining assignments", () => { + const variantAttributes = collectVariantDefiningAttributes([colorAttribute, sizeAttribute]); + const attributeValues = [valueColorRed, valueColorBlue, valueSizeS]; + expect(() => + validateVariableSkuOptions({ + productId: "prod_1", + variantAttributes, + attributeValues, + optionValues: [{ attributeId: colorAttribute.id, attributeValueId: valueColorRed.id }], + existingSignatures: new Set(), + }), + ).toThrowError(); + expect(() => + validateVariableSkuOptions({ + productId: "prod_1", + variantAttributes, + attributeValues, + optionValues: [ + { attributeId: colorAttribute.id, attributeValueId: valueColorRed.id }, + { attributeId: sizeAttribute.id, attributeValueId: valueSizeS.id }, + { attributeId: colorAttribute.id, attributeValueId: valueColorBlue.id 
}, + ], + existingSignatures: new Set(), + }), + ).toThrowError(); + }); + + it("rejects unknown and duplicate option pair definitions", () => { + const variantAttributes = collectVariantDefiningAttributes([colorAttribute, sizeAttribute]); + const attributeValues = [valueColorRed, valueSizeS]; + expect(() => + validateVariableSkuOptions({ + productId: "prod_1", + variantAttributes, + attributeValues, + optionValues: [ + { attributeId: colorAttribute.id, attributeValueId: valueColorRed.id }, + { attributeId: sizeAttribute.id, attributeValueId: "missing_value" }, + ], + existingSignatures: new Set(), + }), + ).toThrowError(); + + expect(() => + validateVariableSkuOptions({ + productId: "prod_1", + variantAttributes, + attributeValues, + optionValues: [ + { attributeId: colorAttribute.id, attributeValueId: valueColorRed.id }, + { attributeId: colorAttribute.id, attributeValueId: valueColorBlue.id }, + ], + existingSignatures: new Set(), + }), + ).toThrowError(); + }); + + it("rejects duplicate option combinations across SKUs", () => { + const variantAttributes = collectVariantDefiningAttributes([colorAttribute]); + const attributeValues = [valueColorRed, valueColorBlue]; + expect(() => + validateVariableSkuOptions({ + productId: "prod_1", + variantAttributes, + attributeValues, + optionValues: [{ attributeId: colorAttribute.id, attributeValueId: valueColorRed.id }], + existingSignatures: new Set([`${colorAttribute.id}:${valueColorRed.id}`]), + }), + ).toThrowError(); + }); +}); diff --git a/packages/plugins/commerce/src/lib/catalog-variants.ts b/packages/plugins/commerce/src/lib/catalog-variants.ts new file mode 100644 index 000000000..476219fbe --- /dev/null +++ b/packages/plugins/commerce/src/lib/catalog-variants.ts @@ -0,0 +1,103 @@ +import { PluginRouteError } from "emdash"; + +import type { StoredProductAttribute, StoredProductAttributeValue } from "../types.js"; +import { sortedImmutableNoCompare } from "./sort-immutable.js"; + +export type SkuOptionAssignment = 
{ + attributeId: string; + attributeValueId: string; +}; + +export type VariantDefiningAttribute = StoredProductAttribute & { kind: "variant_defining" }; + +export function normalizeSkuOptionSignature(options: readonly SkuOptionAssignment[]): string { + return sortedImmutableNoCompare( + Array.from(options, (row) => `${row.attributeId}:${row.attributeValueId}`), + ).join("|"); +} + +export function collectVariantDefiningAttributes( + attributes: readonly StoredProductAttribute[], +): VariantDefiningAttribute[] { + return attributes.filter( + (attribute): attribute is VariantDefiningAttribute => attribute.kind === "variant_defining", + ); +} + +function buildAllowedValuesByAttribute( + attributeValues: readonly StoredProductAttributeValue[], +): Map<string, Set<string>> { + const map = new Map<string, Set<string>>(); + for (const value of attributeValues) { + const set = map.get(value.attributeId) ?? new Set<string>(); + set.add(value.id); + map.set(value.attributeId, set); + } + return map; +} + +export function validateVariableSkuOptions({ + productId, + variantAttributes, + attributeValues, + optionValues, + existingSignatures, +}: { + productId: string; + variantAttributes: readonly VariantDefiningAttribute[]; + attributeValues: readonly StoredProductAttributeValue[]; + optionValues: readonly SkuOptionAssignment[]; + existingSignatures: ReadonlySet<string>; +}) { + const expectedAttributeIds = Array.from(variantAttributes, (attribute) => attribute.id); + const expectedCount = expectedAttributeIds.length; + if (optionValues.length !== expectedCount) { + throw PluginRouteError.badRequest( + `Product ${productId} requires exactly ${expectedCount} option values for variable SKUs`, + ); + } + + const usedAttributeIds = new Set<string>(); + const seenValuePairs = new Set<string>(); + + const allowedValuesByAttribute = buildAllowedValuesByAttribute(attributeValues); + const expectedSet = new Set(expectedAttributeIds); + + for (const option of optionValues) { + if (!expectedSet.has(option.attributeId)) { + throw PluginRouteError.badRequest( 
+ `Option attribute ${option.attributeId} is not variant-defining`, + ); + } + if (usedAttributeIds.has(option.attributeId)) { + throw PluginRouteError.badRequest(`Duplicate option for attribute ${option.attributeId}`); + } + usedAttributeIds.add(option.attributeId); + + const allowedValues = allowedValuesByAttribute.get(option.attributeId); + if (!allowedValues || !allowedValues.has(option.attributeValueId)) { + throw PluginRouteError.badRequest( + `Option value ${option.attributeValueId} is not defined for attribute ${option.attributeId}`, + ); + } + + const pair = `${option.attributeId}:${option.attributeValueId}`; + if (seenValuePairs.has(pair)) { + throw PluginRouteError.badRequest(`Duplicate option assignment pair ${pair}`); + } + seenValuePairs.add(pair); + } + + if (usedAttributeIds.size !== expectedAttributeIds.length) { + throw PluginRouteError.badRequest( + `Missing option values for product ${productId}: expected ${expectedAttributeIds.join(", ")}`, + ); + } + + const signature = normalizeSkuOptionSignature(optionValues); + if (existingSignatures.has(signature)) { + throw PluginRouteError.badRequest(`Duplicate variant combination for product ${productId}`); + } + + return signature; +} diff --git a/packages/plugins/commerce/src/lib/checkout-inventory-validation.ts b/packages/plugins/commerce/src/lib/checkout-inventory-validation.ts new file mode 100644 index 000000000..1685e4b84 --- /dev/null +++ b/packages/plugins/commerce/src/lib/checkout-inventory-validation.ts @@ -0,0 +1,104 @@ +/** + * Validates that cart/checkout line items have sufficient stock using the same + * ownership model as finalization: bundle products use component SKU stock only; + * no bundle-owned inventory row is required. 
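The duplicate-combination guard above hinges on `normalizeSkuOptionSignature` producing an order-independent key; a minimal sketch:

```typescript
// Sketch of the variant-combination signature: attribute/value pairs are
// serialized and sorted so the same combination collides regardless of the
// order in which options were submitted.
type PairSketch = { attributeId: string; attributeValueId: string };

function optionSignature(options: readonly PairSketch[]): string {
  return options
    .map((o) => `${o.attributeId}:${o.attributeValueId}`)
    .sort()
    .join("|");
}
```

A SKU's signature is checked against the set of signatures already stored for the product, so two SKUs can never claim the same Color/Size combination.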
+ */
+
+import { throwCommerceApiError } from "../route-errors.js";
+import type {
+  StoredBundleComponent,
+  StoredInventoryStock,
+  StoredProduct,
+  StoredProductSku,
+} from "../types.js";
+import { inventoryStockDocId } from "./inventory-stock.js";
+
+type GetCollection<T> = { get(id: string): Promise<T | null> };
+
+type QueryBundleComponents = {
+  query(options?: {
+    where?: Record<string, unknown>;
+    limit?: number;
+  }): Promise<{ items: Array<{ id: string; data: StoredBundleComponent }>; hasMore: boolean }>;
+};
+
+export type CheckoutInventoryValidationPorts = {
+  products: GetCollection<StoredProduct>;
+  bundleComponents: QueryBundleComponents;
+  productSkus: GetCollection<StoredProductSku>;
+  inventoryStock: GetCollection<StoredInventoryStock>;
+};
+
+type LineLike = {
+  productId: string;
+  variantId?: string;
+  quantity: number;
+};
+
+export async function validateLineItemsStockForCheckout(
+  lines: ReadonlyArray<LineLike>,
+  ports: CheckoutInventoryValidationPorts,
+): Promise<void> {
+  for (const line of lines) {
+    const product = await ports.products.get(line.productId);
+    if (!product) {
+      throwCommerceApiError({
+        code: "PRODUCT_UNAVAILABLE",
+        message: `Product is not available: ${line.productId}`,
+      });
+    }
+
+    if (product.type === "bundle") {
+      const componentRows = await ports.bundleComponents.query({
+        where: { bundleProductId: line.productId },
+      });
+      if (componentRows.items.length === 0) {
+        throwCommerceApiError({
+          code: "PRODUCT_UNAVAILABLE",
+          message: `Bundle has no components: ${line.productId}`,
+        });
+      }
+      for (const row of componentRows.items) {
+        const component = row.data;
+        const sku = await ports.productSkus.get(component.componentSkuId);
+        if (!sku) {
+          throwCommerceApiError({
+            code: "PRODUCT_UNAVAILABLE",
+            message: `Bundle component SKU missing: ${component.componentSkuId}`,
+          });
+        }
+        const need = Math.max(1, component.quantity) * line.quantity;
+        const stockId = inventoryStockDocId(sku.productId, sku.id);
+        const inv = await ports.inventoryStock.get(stockId);
+        if (!inv) {
+          throwCommerceApiError({
+            code: "PRODUCT_UNAVAILABLE",
+            message: `Product is not available: ${sku.productId}`,
+          });
+        }
+        if (inv.quantity < need) {
+          throwCommerceApiError({
+            code: "INSUFFICIENT_STOCK",
+            message: `Insufficient stock for product ${sku.productId}`,
+          });
+        }
+      }
+      continue;
+    }
+
+    const stockId = inventoryStockDocId(line.productId, line.variantId ?? "");
+    const inv = await ports.inventoryStock.get(stockId);
+    if (!inv) {
+      throwCommerceApiError({
+        code: "PRODUCT_UNAVAILABLE",
+        message: `Product is not available: ${line.productId}`,
+      });
+    }
+    if (inv.quantity < line.quantity) {
+      throwCommerceApiError({
+        code: "INSUFFICIENT_STOCK",
+        message: `Insufficient stock for product ${line.productId}`,
+      });
+    }
+  }
+}
diff --git a/packages/plugins/commerce/src/lib/crypto-adapter.ts b/packages/plugins/commerce/src/lib/crypto-adapter.ts
new file mode 100644
index 000000000..4b59a2d8a
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/crypto-adapter.ts
@@ -0,0 +1,179 @@
+/**
+ * Runtime-portable crypto primitives.
+ *
+ * Prefers the Web Crypto API (`globalThis.crypto.subtle`), available in both
+ * Cloudflare Workers and modern Node.js (on `globalThis` since Node 19, and as
+ * `webcrypto` from `node:crypto` since Node 15). Falls back to `node:crypto`
+ * only when `crypto.subtle` is absent so the plugin stays usable in older Node
+ * environments without breaking Workers or edge runtimes.
+ *
+ * All public functions are async to accommodate the Web Crypto path.
+ */
+
+const subtle: SubtleCrypto | undefined =
+  typeof globalThis !== "undefined" &&
+  typeof (globalThis as { crypto?: Crypto }).crypto?.subtle !== "undefined"
+    ? (globalThis as { crypto: Crypto }).crypto.subtle
+    : undefined;
+
+type NodeCryptoModule = typeof import("node:crypto");
+let nodeCryptoModulePromise: Promise<NodeCryptoModule | null> | null = null;
+
+async function nodeCryptoModule(): Promise<NodeCryptoModule | null> {
+  if (!nodeCryptoModulePromise) {
+    nodeCryptoModulePromise = import("node:crypto").catch(() => null);
+  }
+  return nodeCryptoModulePromise;
+}
+
+// ---------------------------------------------------------------------------
+// SHA-256 hex digest
+// ---------------------------------------------------------------------------
+
+async function sha256HexWebCrypto(input: string): Promise<string> {
+  const encoded = new TextEncoder().encode(input);
+  const buf = await subtle!.digest("SHA-256", encoded);
+  return Array.from(new Uint8Array(buf), (b) => b.toString(16).padStart(2, "0")).join("");
+}
+
+async function sha256HexNode(input: string): Promise<string> {
+  const nodeCrypto = await nodeCryptoModule();
+  if (!nodeCrypto) {
+    throw new Error("Node crypto module unavailable for SHA-256 fallback");
+  }
+  return nodeCrypto.createHash("sha256").update(input, "utf8").digest("hex");
+}
+
+export async function sha256HexAsync(input: string): Promise<string> {
+  if (subtle) return sha256HexWebCrypto(input);
+  return sha256HexNode(input);
+}
+
+// ---------------------------------------------------------------------------
+// Constant-time comparison of two 64-char hex SHA-256 digests
+// ---------------------------------------------------------------------------
+
+async function equalSha256HexDigestWebCrypto(a: string, b: string): Promise<boolean> {
+  if (a.length !== 64 || b.length !== 64) return false;
+  const aBytes = hexToUint8Array(a);
+  const bBytes = hexToUint8Array(b);
+  if (!aBytes || !bBytes) return false;
+  // Web Crypto has no timingSafeEqual equivalent, so XOR-accumulate over the
+  // fixed 32-byte length and check for zero. Iteration count and control flow
+  // are data-independent, which is adequate for fixed-length digest comparison.
+  let diff = 0;
+  for (let i = 0; i < 32; i++) {
+    diff |= (aBytes[i] ?? 0) ^ (bBytes[i] ?? 0);
+  }
+  return diff === 0;
+}
+
+async function equalSha256HexDigestNode(a: string, b: string): Promise<boolean> {
+  if (a.length !== 64 || b.length !== 64) return false;
+  try {
+    const nodeCrypto = await nodeCryptoModule();
+    if (!nodeCrypto) return false;
+    return nodeCrypto.timingSafeEqual(Buffer.from(a, "hex"), Buffer.from(b, "hex"));
+  } catch {
+    return false;
+  }
+}
+
+export async function equalSha256HexDigestAsync(a: string, b: string): Promise<boolean> {
+  if (subtle) return equalSha256HexDigestWebCrypto(a, b);
+  return equalSha256HexDigestNode(a, b);
+}
+
+// ---------------------------------------------------------------------------
+// Random bytes → hex string
+// ---------------------------------------------------------------------------
+
+export async function randomHex(byteLength = 24): Promise<string> {
+  const buf = new Uint8Array(byteLength);
+  if (
+    typeof globalThis !== "undefined" &&
+    typeof (globalThis as { crypto?: Crypto }).crypto?.getRandomValues === "function"
+  ) {
+    (globalThis as { crypto: Crypto }).crypto.getRandomValues(buf);
+  } else {
+    const nodeCrypto = await nodeCryptoModule();
+    if (!nodeCrypto) {
+      throw new Error("Node crypto module unavailable for random fallback");
+    }
+    const nodeBuf = nodeCrypto.randomBytes(byteLength);
+    buf.set(nodeBuf);
+  }
+  return Array.from(buf, (b) => b.toString(16).padStart(2, "0")).join("");
+}
+
+// ---------------------------------------------------------------------------
+// HMAC-SHA256 (Stripe webhook signature)
+// ---------------------------------------------------------------------------
+
+async function hmacSha256HexWebCrypto(secret: string, message: string): Promise<string> {
+  const keyMaterial = new TextEncoder().encode(secret);
+  const key = await subtle!.importKey(
+    "raw",
+    keyMaterial,
+    { name: "HMAC", hash: "SHA-256" },
+    false,
+    ["sign"],
+  );
+  const sig = await subtle!.sign("HMAC", key, new TextEncoder().encode(message));
+  return Array.from(new Uint8Array(sig), (b) => b.toString(16).padStart(2, "0")).join("");
+}
+
+async function hmacSha256HexNode(secret: string, message: string): Promise<string> {
+  const nodeCrypto = await nodeCryptoModule();
+  if (!nodeCrypto) {
+    throw new Error("Node crypto module unavailable for HMAC fallback");
+  }
+  return nodeCrypto.createHmac("sha256", secret).update(message).digest("hex");
+}
+
+export async function hmacSha256HexAsync(secret: string, message: string): Promise<string> {
+  if (subtle) return hmacSha256HexWebCrypto(secret, message);
+  return hmacSha256HexNode(secret, message);
+}
+
+// ---------------------------------------------------------------------------
+// Constant-time hex comparison (generic, for HMAC results)
+// ---------------------------------------------------------------------------
+
+export async function constantTimeEqualHexAsync(a: string, b: string): Promise<boolean> {
+  if (a.length !== b.length) return false;
+  const aBytes = hexToUint8Array(a);
+  const bBytes = hexToUint8Array(b);
+  if (!aBytes || !bBytes) return false;
+  if (subtle) {
+    let diff = 0;
+    for (let i = 0; i < aBytes.length; i++) {
+      diff |= (aBytes[i] ?? 0) ^ (bBytes[i] ?? 0);
+    }
+    return diff === 0;
+  }
+  try {
+    const nodeCrypto = await nodeCryptoModule();
+    if (!nodeCrypto) return false;
+    return nodeCrypto.timingSafeEqual(Buffer.from(aBytes), Buffer.from(bBytes));
+  } catch {
+    return false;
+  }
+}
+
+// ---------------------------------------------------------------------------
+// Helpers
+// ---------------------------------------------------------------------------
+
+function hexToUint8Array(hex: string): Uint8Array | null {
+  if (hex.length % 2 !== 0) return null;
+  const out = new Uint8Array(hex.length / 2);
+  for (let i = 0; i < out.length; i++) {
+    const byte = Number.parseInt(hex.slice(i * 2, i * 2 + 2), 16);
+    if (Number.isNaN(byte)) return null;
+    out[i] = byte;
+  }
+  return out;
+}
diff --git a/packages/plugins/commerce/src/lib/finalization-diagnostics-readthrough.ts b/packages/plugins/commerce/src/lib/finalization-diagnostics-readthrough.ts
new file mode 100644
index 000000000..c92d91bfc
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/finalization-diagnostics-readthrough.ts
@@ -0,0 +1,112 @@
+/**
+ * Read-through cache + in-flight coalescing for `queryFinalizationState`.
+ *
+ * EmDash serverless defaults: many warm isolates each have their own in-memory
+ * singleflight map; KV cache + fixed-window rate limits align reads across
+ * instances and protect storage from dashboard/MCP polling bursts.
+ */
+
+import type { RouteContext } from "emdash";
+
+import { COMMERCE_LIMITS } from "../kernel/limits.js";
+import type { FinalizationStatus } from "../orchestration/finalize-payment.js";
+import { throwCommerceApiError } from "../route-errors.js";
+import { sha256HexAsync } from "./crypto-adapter.js";
+import { buildRateLimitActorKey } from "./rate-limit-identity.js";
+import { consumeKvRateLimit } from "./rate-limit-kv.js";
+
+const CACHE_KEY_PREFIX = "state:finalize_diag:v1:";
+
+type CachedEnvelopeV1 = {
+  v: 1;
+  expiresAtMs: number;
+  status: FinalizationStatus;
+};
+
+const inFlightByStableKey = new Map<string, Promise<FinalizationStatus>>();
+
+function isFinalizationStatusLike(value: unknown): value is FinalizationStatus {
+  if (!value || typeof value !== "object") return false;
+  const o = value as Record<string, unknown>;
+  return (
+    typeof o.receiptStatus === "string" &&
+    typeof o.isInventoryApplied === "boolean" &&
+    typeof o.isOrderPaid === "boolean" &&
+    typeof o.isPaymentAttemptSucceeded === "boolean" &&
+    typeof o.isReceiptProcessed === "boolean" &&
+    typeof o.resumeState === "string"
+  );
+}
+
+function parseCachedEnvelope(raw: unknown, nowMs: number): FinalizationStatus | null {
+  if (!raw || typeof raw !== "object") return null;
+  const o = raw as Record<string, unknown>;
+  if (o.v !== 1) return null;
+  if (typeof o.expiresAtMs !== "number" || o.expiresAtMs <= nowMs) return null;
+  if (!isFinalizationStatusLike(o.status)) return null;
+  return o.status;
+}
+
+export type FinalizationDiagnosticsInput = {
+  orderId: string;
+  providerId: string;
+  externalEventId: string;
+};
+
+/**
+ * Rate-limits diagnostics reads per client IP, then returns a cached result when
+ * fresh, otherwise runs `fetcher` with per-isolate in-flight dedupe.
+ */
+export async function readFinalizationStatusWithGuards(
+  ctx: RouteContext,
+  input: FinalizationDiagnosticsInput,
+  fetcher: () => Promise<FinalizationStatus>,
+): Promise<FinalizationStatus> {
+  const nowMs = Date.now();
+  const actorKey = await buildRateLimitActorKey(ctx, "finalize_diag");
+
+  const allowed = await consumeKvRateLimit({
+    kv: ctx.kv,
+    keySuffix: `finalize_diag:ip:${actorKey}`,
+    limit: COMMERCE_LIMITS.defaultFinalizationDiagnosticsPerIpPerWindow,
+    windowMs: COMMERCE_LIMITS.defaultRateWindowMs,
+    nowMs,
+  });
+  if (!allowed) {
+    throwCommerceApiError({
+      code: "RATE_LIMITED",
+      message: "Too many finalization diagnostics requests; try again shortly",
+    });
+  }
+
+  const stableKey = await sha256HexAsync(
+    `${input.orderId}\0${input.providerId}\0${input.externalEventId}`,
+  );
+  const kvCacheKey = `${CACHE_KEY_PREFIX}${stableKey.slice(0, 48)}`;
+
+  const cached = parseCachedEnvelope(await ctx.kv.get(kvCacheKey), nowMs);
+  if (cached) {
+    return structuredClone(cached);
+  }
+
+  let pending = inFlightByStableKey.get(stableKey);
+  if (!pending) {
+    pending = (async () => {
+      try {
+        const status = await fetcher();
+        const envelope: CachedEnvelopeV1 = {
+          v: 1,
+          expiresAtMs: Date.now() + COMMERCE_LIMITS.finalizationDiagnosticsCacheTtlMs,
+          status,
+        };
+        await ctx.kv.set(kvCacheKey, envelope);
+        return status;
+      } finally {
+        inFlightByStableKey.delete(stableKey);
+      }
+    })();
+    inFlightByStableKey.set(stableKey, pending);
+  }
+
+  return structuredClone(await pending);
}
diff --git a/packages/plugins/commerce/src/lib/idempotency-ttl.test.ts b/packages/plugins/commerce/src/lib/idempotency-ttl.test.ts
new file mode 100644
index 000000000..c97cd19c4
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/idempotency-ttl.test.ts
@@ -0,0 +1,22 @@
+import { describe, expect, it } from "vitest";
+
+import { COMMERCE_LIMITS } from "../kernel/limits.js";
+import { isIdempotencyRecordFresh } from "./idempotency-ttl.js";
+
+describe("isIdempotencyRecordFresh", () => {
+  it("returns false for invalid timestamps", () => {
+    expect(isIdempotencyRecordFresh("not-a-date", Date.now())).toBe(false);
+  });
+
+  it("returns false when older than TTL", () => {
+    const old = new Date(
+      Date.now() - COMMERCE_LIMITS.idempotencyRecordTtlMs - 60_000,
+    ).toISOString();
+    expect(isIdempotencyRecordFresh(old, Date.now())).toBe(false);
+  });
+
+  it("returns true inside TTL window", () => {
+    const recent = new Date(Date.now() - 60_000).toISOString();
+    expect(isIdempotencyRecordFresh(recent, Date.now())).toBe(true);
+  });
+});
diff --git a/packages/plugins/commerce/src/lib/idempotency-ttl.ts b/packages/plugins/commerce/src/lib/idempotency-ttl.ts
new file mode 100644
index 000000000..155d6e453
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/idempotency-ttl.ts
@@ -0,0 +1,10 @@
+import { COMMERCE_LIMITS } from "../kernel/limits.js";
+
+/**
+ * Returns true when an idempotency record is still within its TTL window.
+ */
+export function isIdempotencyRecordFresh(createdAtIso: string, nowMs: number): boolean {
+  const t = Date.parse(createdAtIso);
+  if (!Number.isFinite(t)) return false;
+  return nowMs - t < COMMERCE_LIMITS.idempotencyRecordTtlMs;
+}
diff --git a/packages/plugins/commerce/src/lib/inventory-stock.ts b/packages/plugins/commerce/src/lib/inventory-stock.ts
new file mode 100644
index 000000000..f4dc635fc
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/inventory-stock.ts
@@ -0,0 +1,3 @@
+export function inventoryStockDocId(productId: string, variantId: string): string {
+  return `stock:${encodeURIComponent(productId)}:${encodeURIComponent(variantId)}`;
+}
diff --git a/packages/plugins/commerce/src/lib/merge-line-items.test.ts b/packages/plugins/commerce/src/lib/merge-line-items.test.ts
new file mode 100644
index 000000000..d6517f8eb
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/merge-line-items.test.ts
@@ -0,0 +1,47 @@
+import { describe, expect, it } from "vitest";
+
+import { mergeLineItemsBySku } from "./merge-line-items.js";
+
+const INVENTORY_VERSION_CONFLICT_PATTERN = /inventoryVersion/;
+
+describe("mergeLineItemsBySku", () => {
+  it("sums quantities for identical SKU snapshots", () => {
+    const out = mergeLineItemsBySku([
+      {
+        productId: "a",
+        variantId: "",
+        quantity: 1,
+        inventoryVersion: 2,
+        unitPriceMinor: 100,
+      },
+      {
+        productId: "a",
+        variantId: "",
+        quantity: 3,
+        inventoryVersion: 2,
+        unitPriceMinor: 100,
+      },
+    ]);
+    expect(out).toHaveLength(1);
+    expect(out[0]!.quantity).toBe(4);
+  });
+
+  it("throws when duplicate SKU has mismatched version", () => {
+    expect(() =>
+      mergeLineItemsBySku([
+        {
+          productId: "a",
+          quantity: 1,
+          inventoryVersion: 1,
+          unitPriceMinor: 100,
+        },
+        {
+          productId: "a",
+          quantity: 1,
+          inventoryVersion: 2,
+          unitPriceMinor: 100,
+        },
+      ]),
+    ).toThrow(INVENTORY_VERSION_CONFLICT_PATTERN);
+  });
+});
diff --git a/packages/plugins/commerce/src/lib/merge-line-items.ts b/packages/plugins/commerce/src/lib/merge-line-items.ts
new file mode 100644
index 000000000..eac552094
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/merge-line-items.ts
@@ -0,0 +1,61 @@
+/**
+ * Merge duplicate SKU rows so inventory finalize applies one decrement per (productId, variantId).
+ * Duplicate lines must share the same snapshot version and unit price (enforced at checkout).
+ */
+
+export type MergeableLine = {
+  productId: string;
+  variantId?: string;
+  quantity: number;
+  inventoryVersion: number;
+  unitPriceMinor: number;
+};
+
+export class LineConflictError extends Error {
+  constructor(
+    message: string,
+    public readonly productId: string,
+    public readonly variantId: string | undefined,
+    public readonly expected: { inventoryVersion: number; unitPriceMinor: number },
+    public readonly actual: { inventoryVersion: number; unitPriceMinor: number },
+  ) {
+    super(message);
+    this.name = "LineConflictError";
+  }
+}
+
+function lineKey(line: MergeableLine): string {
+  return `${line.productId}\u0000${line.variantId ?? ""}`;
+}
+
+export function mergeLineItemsBySku<T extends MergeableLine>(lines: T[]): T[] {
+  const map = new Map<string, T>();
+  for (const line of lines) {
+    const k = lineKey(line);
+    const cur = map.get(k);
+    if (!cur) {
+      map.set(k, { ...line });
+      continue;
+    }
+    if (cur.inventoryVersion !== line.inventoryVersion) {
+      throw new LineConflictError(
+        `mergeLineItemsBySku: conflicting inventoryVersion for ${line.productId}/${line.variantId ?? ""}`,
+        line.productId,
+        line.variantId,
+        { inventoryVersion: cur.inventoryVersion, unitPriceMinor: cur.unitPriceMinor },
+        { inventoryVersion: line.inventoryVersion, unitPriceMinor: line.unitPriceMinor },
+      );
+    }
+    if (cur.unitPriceMinor !== line.unitPriceMinor) {
+      throw new LineConflictError(
+        `mergeLineItemsBySku: conflicting unitPriceMinor for ${line.productId}/${line.variantId ?? ""}`,
+        line.productId,
+        line.variantId,
+        { inventoryVersion: cur.inventoryVersion, unitPriceMinor: cur.unitPriceMinor },
+        { inventoryVersion: line.inventoryVersion, unitPriceMinor: line.unitPriceMinor },
+      );
+    }
+    map.set(k, { ...cur, quantity: cur.quantity + line.quantity });
+  }
+  return [...map.values()];
+}
diff --git a/packages/plugins/commerce/src/lib/order-inventory-lines.ts b/packages/plugins/commerce/src/lib/order-inventory-lines.ts
new file mode 100644
index 000000000..e0979d324
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/order-inventory-lines.ts
@@ -0,0 +1,68 @@
+/**
+ * Expands order lines for inventory preflight and mutation: bundle lines become
+ * one row per component SKU (quantity × bundles). Non-bundle lines pass through.
+ * Duplicate component SKUs are merged after expansion via {@link mergeLineItemsBySku}.
+ */
+
+import type { OrderLineItem } from "../types.js";
+import { mergeLineItemsBySku } from "./merge-line-items.js";
+
+export class BundleSnapshotError extends Error {
+  constructor(
+    message: string,
+    public readonly productId: string,
+    public readonly code: "MISSING_BUNDLE_SNAPSHOT" | "INVALID_COMPONENT_INVENTORY",
+  ) {
+    super(message);
+    this.name = "BundleSnapshotError";
+  }
+}
+
+function expandBundleLineToComponents(line: OrderLineItem): OrderLineItem[] {
+  const bundle = line.snapshot?.bundleSummary;
+  if (!bundle || bundle.components.length === 0) {
+    throw new BundleSnapshotError(
+      `Bundle snapshot is incomplete for product ${line.productId}`,
+      line.productId,
+      "MISSING_BUNDLE_SNAPSHOT",
+    );
+  }
+
+  for (const component of bundle.components) {
+    if (
+      !Number.isFinite(component.componentInventoryVersion) ||
+      component.componentInventoryVersion < 0
+    ) {
+      throw new BundleSnapshotError(
+        `Bundle snapshot missing component inventory version for product ${line.productId} component ${component.componentId}`,
+        line.productId,
+        "INVALID_COMPONENT_INVENTORY",
+      );
+    }
+  }
+
+  return bundle.components.map((component) => ({
+    productId: component.componentProductId,
+    variantId: component.componentSkuId,
+    quantity: component.quantityPerBundle * line.quantity,
+    inventoryVersion: component.componentInventoryVersion,
+    unitPriceMinor: component.componentPriceMinor,
+  }));
+}
+
+/**
+ * Merge cart/order lines, expand bundles to component SKUs, merge again so the
+ * same component requested by multiple bundle lines is decremented once.
+ */
+export function toInventoryDeductionLines(lines: ReadonlyArray<OrderLineItem>): OrderLineItem[] {
+  const mergedBundles = mergeLineItemsBySku([...lines]);
+  const expanded: OrderLineItem[] = [];
+  for (const line of mergedBundles) {
+    if (line.snapshot?.productType === "bundle") {
+      expanded.push(...expandBundleLineToComponents(line));
+    } else {
+      expanded.push(line);
+    }
+  }
+  return mergeLineItemsBySku(expanded);
+}
diff --git a/packages/plugins/commerce/src/lib/ordered-rows.test.ts b/packages/plugins/commerce/src/lib/ordered-rows.test.ts
new file mode 100644
index 000000000..ae4d68a20
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/ordered-rows.test.ts
@@ -0,0 +1,170 @@
+import { describe, expect, it } from "vitest";
+
+import {
+  addOrderedRow,
+  moveOrderedRow,
+  mutateOrderedChildren,
+  normalizeOrderedChildren,
+  normalizeOrderedPosition,
+  removeOrderedRow,
+  sortOrderedRowsByPosition,
+} from "./ordered-rows.js";
+
+type Row = { id: string; position: number; createdAt?: string; updatedAt?: string };
+
+describe("ordered rows helpers", () => {
+  it("sortOrderedRowsByPosition uses createdAt as tiebreaker for equal positions", () => {
+    const rows: Row[] = [
+      { id: "late", position: 0, createdAt: "2026-01-01T00:00:00.000Z" },
+      { id: "early", position: 0, createdAt: "2025-01-01T00:00:00.000Z" },
+      { id: "next", position: 1, createdAt: "2026-01-01T00:00:00.000Z" },
+    ];
+    const sorted = sortOrderedRowsByPosition(rows);
+    expect(sorted.map((row) => row.id)).toEqual(["early", "late", "next"]);
+  });
+
+  it("normalizes ordered rows to dense zero-based positions", () => {
+    const normalized = normalizeOrderedChildren([
+      { id: "a", position: 4 },
+      { id: "b", position: 9, createdAt: "2026-01-01T00:00:00.000Z" },
+    ]);
+    expect(normalized.map((row) => row.position)).toEqual([0, 1]);
+  });
+
+  it("normalizes requested position input", () => {
+    expect(normalizeOrderedPosition(-4)).toBe(0);
+    expect(normalizeOrderedPosition(1.9)).toBe(1);
+    expect(normalizeOrderedPosition(99)).toBe(99);
+  });
+
+  it("normalizes positions when adding a row (clamps oversized and negative input)", () => {
+    const rows: Row[] = [
+      { id: "first", position: 0 },
+      { id: "second", position: 2 },
+    ];
+
+    const withHead = addOrderedRow([...rows], { id: "head", position: 99 }, -9);
+    expect(withHead.map((row) => row.position)).toEqual([0, 1, 2]);
+    expect(withHead.map((row) => row.id)).toEqual(["head", "first", "second"]);
+
+    const withTail = addOrderedRow([...rows], { id: "tail", position: 99 }, 10);
+    expect(withTail.map((row) => row.position)).toEqual([0, 1, 2]);
+    expect(withTail.map((row) => row.id)).toEqual(["first", "second", "tail"]);
+  });
+
+  it("removes by id and re-normalizes", () => {
+    const rows: Row[] = [
+      { id: "keep", position: 0 },
+      { id: "drop", position: 1 },
+      { id: "keep2", position: 2 },
+    ];
+    const kept = removeOrderedRow(rows, "drop");
+    expect(kept.map((row) => row.id)).toEqual(["keep", "keep2"]);
+    expect(kept.map((row) => row.position)).toEqual([0, 1]);
+  });
+
+  it("moves a row and keeps index behavior stable", () => {
+    const rows: Row[] = [
+      { id: "left", position: 0 },
+      { id: "mid", position: 1 },
+      { id: "right", position: 2 },
+    ];
+    const reordered = moveOrderedRow([...rows], "right", 0);
+    expect(reordered.map((row) => row.id)).toEqual(["right", "left", "mid"]);
+    expect(reordered.map((row) => row.position)).toEqual([0, 1, 2]);
+  });
+
+  it("moveOrderedRow throws for missing row ids", () => {
+    const rows: Row[] = [
+      { id: "left", position: 0 },
+      { id: "mid", position: 1 },
+    ];
+    expect(() => moveOrderedRow([...rows], "missing", 0)).toThrowError(
+      "Ordered row not found in target list",
+    );
+  });
+
+  it("mutateOrderedChildren preserves move not found message overrides", async () => {
+    const rows: Row[] = [{ id: "left", position: 0 }];
+    const collection = {
+      put: async (_id: string, _row: Row) => {},
+    } as any;
+
+    await expect(() =>
+      mutateOrderedChildren({
+        collection,
+        rows,
+        mutation: {
+          kind: "move",
+          rowId: "missing",
+          requestedPosition: 0,
+          notFoundMessage: "row not found",
+        },
+        nowIso: "2026-01-01T00:00:00.000Z",
+      }),
+    ).rejects.toThrowError("row not found");
+  });
+
+  it("mutateOrderedChildren persists normalized rows after mutation", async () => {
+    const rows: Row[] = [
+      { id: "left", position: 0 },
+      { id: "mid", position: 1 },
+      { id: "right", position: 2 },
+    ];
+    const persisted: Row[] = [];
+    const collection = {
+      put: async (_id: string, row: Row) => {
+        persisted.push({ ...row });
+      },
+    } as any;
+
+    const out = await mutateOrderedChildren({
+      collection,
+      rows,
+      mutation: {
+        kind: "move",
+        rowId: "left",
+        requestedPosition: 2,
+      },
+      nowIso: "2026-01-01T00:00:00.000Z",
+    });
+
+    expect(out.map((row) => row.id)).toEqual(["mid", "right", "left"]);
+    expect(out.every((row) => row.updatedAt === "2026-01-01T00:00:00.000Z")).toBe(true);
+    expect(persisted.map((row) => row.id)).toEqual(["mid", "right", "left"]);
+  });
+
+  it("mutateOrderedChildren uses batch writes and batch deletion for supported collections", async () => {
+    const rows: Row[] = [
+      { id: "left", position: 0 },
+      { id: "mid", position: 1 },
+      { id: "right", position: 2 },
+    ];
+    const persisted: Row[] = [];
+    const deleted: string[] = [];
+    const collection = {
+      putMany: async (items: Array<{ id: string; data: Row }>) => {
+        for (const item of items) {
+          persisted.push({ ...item.data });
+        }
+      },
+      deleteMany: async (ids: string[]) => {
+        deleted.push(...ids);
+      },
+    } as any;
+
+    await mutateOrderedChildren({
+      collection,
+      rows,
+      mutation: {
+        kind: "remove",
+        removedRowId: "mid",
+      },
+      nowIso: "2026-01-01T00:00:00.000Z",
+    });
+
+    expect(persisted.map((row) => row.id)).toEqual(["left", "right"]);
+    expect(persisted.every((row) => row.updatedAt === "2026-01-01T00:00:00.000Z")).toBe(true);
+    expect(deleted).toEqual(["mid"]);
+  });
+});
diff --git a/packages/plugins/commerce/src/lib/ordered-rows.ts b/packages/plugins/commerce/src/lib/ordered-rows.ts
new file mode 100644
index 000000000..2f7cd7452
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/ordered-rows.ts
@@ -0,0 +1,144 @@
+import { PluginRouteError } from "emdash";
+import type { StorageCollection } from "emdash";
+
+import { sortedImmutable } from "./sort-immutable.js";
+
+type Collection<T> = StorageCollection<T>;
+
+export type OrderedRow = {
+  id: string;
+  position: number;
+};
+
+export type OrderedChildMutation<T extends OrderedRow> =
+  | { kind: "add"; row: T; requestedPosition: number }
+  | { kind: "remove"; removedRowId: string }
+  | {
+      kind: "move";
+      rowId: string;
+      requestedPosition: number;
+      notFoundMessage?: string;
+    };
+
+export function sortOrderedRowsByPosition<T extends OrderedRow & { createdAt?: string }>(
+  rows: T[],
+): T[] {
+  const sorted = sortedImmutable(rows, (left, right) => {
+    if (left.position === right.position) {
+      return (left.createdAt ?? "").localeCompare(right.createdAt ?? "");
+    }
+    return left.position - right.position;
+  });
+  return sorted;
+}
+
+export function normalizeOrderedPosition(input: number): number {
+  return Math.max(0, Math.trunc(input));
+}
+
+export function normalizeOrderedChildren<T extends OrderedRow>(rows: T[]): T[] {
+  return rows.map((row, idx) => ({
+    ...row,
+    position: idx,
+  }));
+}
+
+export function addOrderedRow<T extends OrderedRow>(
+  rows: T[],
+  row: T,
+  requestedPosition: number,
+): T[] {
+  const normalizedPosition = Math.min(normalizeOrderedPosition(requestedPosition), rows.length);
+  const nextOrder = [...rows];
+  nextOrder.splice(normalizedPosition, 0, row);
+  return normalizeOrderedChildren(nextOrder);
+}
+
+export function removeOrderedRow<T extends OrderedRow>(rows: T[], removedRowId: string): T[] {
+  return normalizeOrderedChildren(rows.filter((row) => row.id !== removedRowId));
+}
+
+export function moveOrderedRow<T extends OrderedRow>(
+  rows: T[],
+  rowId: string,
+  requestedPosition: number,
+): T[] {
+  const fromIndex = rows.findIndex((row) => row.id === rowId);
+  if (fromIndex === -1) {
+    throw PluginRouteError.badRequest("Ordered row not found in target list");
+  }
+ + const nextOrder = [...rows]; + const [moving] = nextOrder.splice(fromIndex, 1); + if (!moving) { + throw PluginRouteError.badRequest("Ordered row not found in target list"); + } + + const insertionIndex = Math.min(normalizeOrderedPosition(requestedPosition), rows.length - 1); + nextOrder.splice(insertionIndex, 0, moving); + return normalizeOrderedChildren(nextOrder); +} + +export async function persistOrderedRows( + collection: Collection, + rows: T[], + nowIso: string, +): Promise { + const normalized = normalizeOrderedChildren(rows).map((row) => ({ + ...row, + updatedAt: nowIso, + })); + const items = normalized.map((row) => ({ id: row.id, data: row })); + if (items.length > 0) { + if ("putMany" in collection && typeof collection.putMany === "function") { + await collection.putMany(items); + } else { + for (const item of items) { + await collection.put(item.id, item.data); + } + } + } + return normalized; +} + +export async function mutateOrderedChildren(params: { + collection: Collection; + rows: T[]; + mutation: OrderedChildMutation; + nowIso: string; +}): Promise { + const { collection, rows, mutation, nowIso } = params; + let normalized: T[] = []; + const removeIds: string[] = []; + switch (mutation.kind) { + case "add": + normalized = addOrderedRow(rows, mutation.row, mutation.requestedPosition); + break; + case "remove": + normalized = removeOrderedRow(rows, mutation.removedRowId); + removeIds.push(mutation.removedRowId); + break; + case "move": { + const { rowId, requestedPosition } = mutation; + const fromIndex = rows.findIndex((candidate) => candidate.id === rowId); + if (fromIndex === -1) { + throw PluginRouteError.badRequest( + mutation.notFoundMessage ?? 
"Ordered row not found in target list", + ); + } + normalized = moveOrderedRow(rows, rowId, requestedPosition); + break; + } + } + const persisted = await persistOrderedRows(collection, normalized, nowIso); + if (removeIds.length > 0) { + if ("deleteMany" in collection && typeof collection.deleteMany === "function") { + await collection.deleteMany(removeIds); + } else { + for (const removeId of removeIds) { + await collection.delete(removeId); + } + } + } + return persisted; +} diff --git a/packages/plugins/commerce/src/lib/rate-limit-identity.ts b/packages/plugins/commerce/src/lib/rate-limit-identity.ts new file mode 100644 index 000000000..443790fad --- /dev/null +++ b/packages/plugins/commerce/src/lib/rate-limit-identity.ts @@ -0,0 +1,68 @@ +/** + * Shared actor identifiers for request-based rate limiting. + * We prefer concrete client IP when available, then trusted proxy headers, + * then deterministic route/session fallbacks. + */ + +import { sha256HexAsync } from "./crypto-adapter.js"; + +type RateLimitIdentityContext = { + request: { + headers: Headers; + url: string; + }; + requestMeta: { + ip?: string | null; + }; +}; + +function normalizeIp(raw?: string | null): string | undefined { + const trimmed = raw?.trim(); + if (!trimmed || trimmed.toLowerCase() === "unknown") { + return undefined; + } + return trimmed; +} + +function parseForwardedIp(raw: string | null): string | undefined { + if (!raw) return undefined; + const first = raw.split(",")[0]?.trim(); + return normalizeIp(first); +} + +function fallbackRateLimitActor(scope: string, ctx: RateLimitIdentityContext): string { + const userAgent = normalizeIp(ctx.request.headers.get("user-agent")); + if (userAgent) { + return `${scope}:ua:${userAgent}`; + } + + const requestId = normalizeIp( + ctx.request.headers.get("x-request-id") || ctx.request.headers.get("cf-ray"), + ); + if (requestId) { + return `${scope}:rid:${requestId}`; + } + + let pathname = "unknown"; + try { + pathname = new 
URL(ctx.request.url).pathname;
+  } catch {
+    // Request urls are usually absolute, but stay deterministic in odd test/runtime cases.
+    pathname = "/";
+  }
+  return `${scope}:path:${pathname}`;
+}
+
+export async function buildRateLimitActorKey(
+  ctx: RateLimitIdentityContext,
+  scope: string,
+): Promise<string> {
+  const ipFromMetadata = normalizeIp(ctx.requestMeta.ip);
+  const ipFromHeaders =
+    parseForwardedIp(ctx.request.headers.get("x-forwarded-for")) ??
+    parseForwardedIp(ctx.request.headers.get("x-real-ip"));
+
+  const actor = ipFromMetadata ?? ipFromHeaders ?? fallbackRateLimitActor(scope, ctx);
+  const digest = await sha256HexAsync(actor);
+  return digest.slice(0, 32);
+}
diff --git a/packages/plugins/commerce/src/lib/rate-limit-kv.ts b/packages/plugins/commerce/src/lib/rate-limit-kv.ts
new file mode 100644
index 000000000..8c301add4
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/rate-limit-kv.ts
@@ -0,0 +1,41 @@
+/**
+ * Fixed-window rate limiting using plugin KV (survives across requests in production).
+ *
+ * **Best-effort only.** `consumeKvRateLimit` is a read-modify-write cycle with no
+ * atomic guarantee. Under concurrent requests the counter can undercount, meaning
+ * the actual rate allowed may exceed the configured limit. This is acceptable for
+ * abuse throttling and cost control, but must not be relied on as a hard security
+ * boundary or billing gate.
+ */
+
+import type { KVAccess } from "emdash";
+
+import { nextRateLimitState, type RateBucket } from "../kernel/rate-limit-window.js";
+
+const BUCKET_KEY = "state:ratelimit:";
+
+function parseBucket(raw: unknown): RateBucket | null {
+  if (raw === null || typeof raw !== "object") return null;
+  const o = raw as Record<string, unknown>;
+  const count = o.count;
+  const windowStartMs = o.windowStartMs;
+  if (typeof count !== "number" || typeof windowStartMs !== "number") return null;
+  return { count, windowStartMs };
+}
+
+/**
+ * @returns `true` if the request is allowed; `false` if rate limited.
+ */
+export async function consumeKvRateLimit(input: {
+  kv: KVAccess;
+  keySuffix: string;
+  limit: number;
+  windowMs: number;
+  nowMs: number;
+}): Promise<boolean> {
+  const key = `${BUCKET_KEY}${input.keySuffix}`;
+  const prev = parseBucket(await input.kv.get(key));
+  const { allowed, bucket } = nextRateLimitState(prev, input.nowMs, input.limit, input.windowMs);
+  await input.kv.set(key, bucket);
+  return allowed;
+}
diff --git a/packages/plugins/commerce/src/lib/require-post.test.ts b/packages/plugins/commerce/src/lib/require-post.test.ts
new file mode 100644
index 000000000..f6886d158
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/require-post.test.ts
@@ -0,0 +1,31 @@
+import { PluginRouteError } from "emdash";
+import { describe, expect, it } from "vitest";
+
+import { requirePost } from "./require-post.js";
+
+describe("requirePost", () => {
+  it("allows POST", () => {
+    expect(() =>
+      requirePost({
+        request: new Request("https://x.test/a", { method: "POST" }),
+      } as never),
+    ).not.toThrow();
+  });
+
+  it("rejects GET with 405", () => {
+    expect(() =>
+      requirePost({
+        request: new Request("https://x.test/a", { method: "GET" }),
+      } as never),
+    ).toThrow(PluginRouteError);
+
+    try {
+      requirePost({
+        request: new Request("https://x.test/a", { method: "GET" }),
+      } as never);
+    } catch (e) {
+      expect(e).toBeInstanceOf(PluginRouteError);
+      expect((e as PluginRouteError).status).toBe(405);
+    }
+  });
+});
diff --git a/packages/plugins/commerce/src/lib/require-post.ts b/packages/plugins/commerce/src/lib/require-post.ts
new file mode 100644
index 000000000..cfcd80f1f
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/require-post.ts
@@ -0,0 +1,9 @@
+import type { RouteContext } from "emdash";
+import { PluginRouteError } from "emdash";
+
+/** Aligns with documented route pattern: mutate endpoints should reject GET/HEAD.
*/
+export function requirePost(ctx: RouteContext): void {
+  if (ctx.request.method !== "POST") {
+    throw new PluginRouteError("METHOD_NOT_ALLOWED", "Only POST is allowed", 405);
+  }
+}
diff --git a/packages/plugins/commerce/src/lib/sort-immutable.ts b/packages/plugins/commerce/src/lib/sort-immutable.ts
new file mode 100644
index 000000000..c6612188c
--- /dev/null
+++ b/packages/plugins/commerce/src/lib/sort-immutable.ts
@@ -0,0 +1,14 @@
+type SortableArray<T> = T[] & { toSorted(compareFn?: (left: T, right: T) => number): T[] };
+
+export function sortedImmutable<T>(
+  items: readonly T[],
+  compare: (left: T, right: T) => number,
+): T[] {
+  const cloned = [...items];
+  return (cloned as SortableArray<T>).toSorted(compare);
+}
+
+export function sortedImmutableNoCompare<T>(items: readonly T[]): T[] {
+  const cloned = [...items];
+  return (cloned as SortableArray<T>).toSorted();
+}
diff --git a/packages/plugins/commerce/src/orchestration/finalize-payment-inventory.test.ts b/packages/plugins/commerce/src/orchestration/finalize-payment-inventory.test.ts
new file mode 100644
index 000000000..4190b520e
--- /dev/null
+++ b/packages/plugins/commerce/src/orchestration/finalize-payment-inventory.test.ts
@@ -0,0 +1,247 @@
+import { describe, expect, it } from "vitest";
+
+import type { OrderLineItem, StoredInventoryLedgerEntry, StoredInventoryStock } from "../types.js";
+import { applyInventoryForOrder, inventoryStockDocId } from "./finalize-payment-inventory.js";
+
+type MemOpts = { where?: Record<string, unknown>; limit?: number };
+
+class MemColl<T> {
+  constructor(public readonly rows = new Map<string, T>()) {}
+
+  async get(id: string): Promise<T | null> {
+    const row = this.rows.get(id);
+    return row ? structuredClone(row) : null;
+  }
+
+  async put(id: string, data: T): Promise<void> {
+    this.rows.set(id, structuredClone(data));
+  }
+
+  async query(
+    options: MemOpts = {},
+  ): Promise<{ items: Array<{ id: string; data: T }>; hasMore: boolean }> {
+    const where = options.where ?? {};
+    const limit = options.limit ??
1000; + let items = Array.from(this.rows.entries(), ([id, data]) => ({ id, data })); + for (const [field, value] of Object.entries(where)) { + items = items.filter((item) => (item.data as Record)[field] === value); + } + return { items: items.slice(0, limit), hasMore: false }; + } +} + +function bundleOrderLine(overrides: Partial = {}): OrderLineItem { + const bundleProductId = "bundle_tx_1"; + const compProductId = "comp_prod_1"; + const compSkuId = "comp_sku_1"; + return { + productId: bundleProductId, + quantity: 2, + inventoryVersion: 1, + unitPriceMinor: 0, + snapshot: { + productId: bundleProductId, + skuId: bundleProductId, + productType: "bundle", + productTitle: "Bundle", + skuCode: bundleProductId, + selectedOptions: [], + currency: "USD", + unitPriceMinor: 0, + lineSubtotalMinor: 0, + lineDiscountMinor: 0, + lineTotalMinor: 0, + requiresShipping: true, + isDigital: false, + bundleSummary: { + productId: bundleProductId, + subtotalMinor: 1000, + discountType: "none", + discountValueMinor: 0, + discountValueBps: 0, + discountAmountMinor: 0, + finalPriceMinor: 1000, + availability: 10, + components: [ + { + componentId: "bc_1", + componentSkuId: compSkuId, + componentSkuCode: "COMP-1", + componentProductId: compProductId, + componentPriceMinor: 500, + quantityPerBundle: 3, + subtotalContributionMinor: 1500, + availableBundleQuantity: 10, + componentInventoryVersion: 4, + }, + ], + }, + }, + ...overrides, + }; +} + +describe("finalize-payment-inventory bundle expansion", () => { + const now = "2026-04-10T12:00:00.000Z"; + + it("decrements component SKU stock for bundle lines (no bundle-owned stock row)", async () => { + const line = bundleOrderLine(); + const compProductId = "comp_prod_1"; + const compSkuId = "comp_sku_1"; + const stockId = inventoryStockDocId(compProductId, compSkuId); + const inventoryStock = new MemColl( + new Map([ + [ + stockId, + { + productId: compProductId, + variantId: compSkuId, + version: 4, + quantity: 100, + updatedAt: now, + }, 
+ ], + ]), + ); + const inventoryLedger = new MemColl(); + + await applyInventoryForOrder( + { inventoryStock, inventoryLedger }, + { lineItems: [line] }, + "order_bundle_1", + now, + ); + + const after = await inventoryStock.get(stockId); + // 2 bundles × 3 units per bundle = 6 + expect(after?.quantity).toBe(94); + expect(after?.version).toBe(5); + }); + + it("throws ORDER_STATE_CONFLICT when a bundle snapshot lacks valid component versions", async () => { + const bundleProductId = "bundle_legacy_1"; + const line: OrderLineItem = { + productId: bundleProductId, + quantity: 1, + inventoryVersion: 2, + unitPriceMinor: 100, + snapshot: { + productId: bundleProductId, + skuId: bundleProductId, + productType: "bundle", + productTitle: "Legacy", + skuCode: bundleProductId, + selectedOptions: [], + currency: "USD", + unitPriceMinor: 100, + lineSubtotalMinor: 100, + lineDiscountMinor: 0, + lineTotalMinor: 100, + requiresShipping: true, + isDigital: false, + bundleSummary: { + productId: bundleProductId, + subtotalMinor: 100, + discountType: "none", + discountValueMinor: 0, + discountValueBps: 0, + discountAmountMinor: 0, + finalPriceMinor: 100, + availability: 1, + components: [ + { + componentId: "c1", + componentSkuId: "sku_x", + componentSkuCode: "X", + componentProductId: "p_x", + componentPriceMinor: 100, + quantityPerBundle: 1, + subtotalContributionMinor: 100, + availableBundleQuantity: 1, + componentInventoryVersion: -1, + }, + ], + }, + }, + }; + const stockId = inventoryStockDocId(bundleProductId, ""); + const inventoryStock = new MemColl( + new Map([ + [ + stockId, + { + productId: bundleProductId, + variantId: "", + version: 2, + quantity: 5, + updatedAt: now, + }, + ], + ]), + ); + const inventoryLedger = new MemColl(); + + await expect( + applyInventoryForOrder( + { inventoryStock, inventoryLedger }, + { lineItems: [line] }, + "order_legacy_bundle", + now, + ), + ).rejects.toMatchObject({ + code: "ORDER_STATE_CONFLICT", + }); + }); + + it("throws 
PRODUCT_UNAVAILABLE when authoritative stock row is missing", async () => { + const line: OrderLineItem = { + productId: "simple_legacy_1", + quantity: 1, + inventoryVersion: 3, + unitPriceMinor: 500, + snapshot: { + productId: "simple_legacy_1", + skuId: "simple_legacy_1", + productType: "simple", + productTitle: "Simple Legacy", + skuCode: "SIMPLE-LEGACY", + selectedOptions: [], + currency: "USD", + unitPriceMinor: 500, + lineSubtotalMinor: 500, + lineDiscountMinor: 0, + lineTotalMinor: 500, + requiresShipping: true, + isDigital: false, + }, + }; + const missingStockNow = "2026-04-10T12:00:00.000Z"; + const inventoryStock = new MemColl( + new Map([ + [ + inventoryStockDocId("simple_legacy_1", "legacy_sku"), + { + productId: "simple_legacy_1", + variantId: "legacy_sku", + version: 3, + quantity: 3, + updatedAt: missingStockNow, + }, + ], + ]), + ); + const inventoryLedger = new MemColl(); + + await expect( + applyInventoryForOrder( + { inventoryStock, inventoryLedger }, + { lineItems: [line] }, + "legacy-order", + missingStockNow, + ), + ).rejects.toMatchObject({ + code: "PRODUCT_UNAVAILABLE", + }); + expect(inventoryLedger.rows.size).toBe(0); + }); +}); diff --git a/packages/plugins/commerce/src/orchestration/finalize-payment-inventory.ts b/packages/plugins/commerce/src/orchestration/finalize-payment-inventory.ts new file mode 100644 index 000000000..5a26e6176 --- /dev/null +++ b/packages/plugins/commerce/src/orchestration/finalize-payment-inventory.ts @@ -0,0 +1,339 @@ +import type { StorageCollection } from "emdash"; + +import type { CommerceErrorCode } from "../kernel/errors.js"; +import { inventoryStockDocId } from "../lib/inventory-stock.js"; +import { LineConflictError, mergeLineItemsBySku } from "../lib/merge-line-items.js"; +import { BundleSnapshotError, toInventoryDeductionLines } from "../lib/order-inventory-lines.js"; +import type { OrderLineItem, StoredInventoryLedgerEntry, StoredInventoryStock } from "../types.js"; + +export { inventoryStockDocId }; 
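The bundle tests above rely on component expansion: a bundle owns no stock row of its own, so each bundle order line deducts `quantity × quantityPerBundle` units from every component SKU. A minimal standalone sketch of that arithmetic, under the assumption that the component shape below (`componentSkuId`, `quantityPerBundle`) is simplified from the plugin's snapshot type rather than being its actual definition:

```typescript
// Illustrative-only type: a pared-down stand-in for the bundle component
// snapshot rows used in the tests above.
type BundleComponent = { componentSkuId: string; quantityPerBundle: number };

// For one bundle order line, compute the per-component inventory deltas:
// each component SKU loses bundleQuantity * quantityPerBundle units.
function expandBundleDeductions(
  bundleQuantity: number,
  components: BundleComponent[],
): Map<string, number> {
  const deltas = new Map<string, number>();
  for (const component of components) {
    const prior = deltas.get(component.componentSkuId) ?? 0;
    deltas.set(component.componentSkuId, prior - bundleQuantity * component.quantityPerBundle);
  }
  return deltas;
}

// Mirrors the first bundle test: 2 bundles x 3 units per bundle = 6 deducted.
const deltas = expandBundleDeductions(2, [{ componentSkuId: "comp_sku_1", quantityPerBundle: 3 }]);
console.log(deltas.get("comp_sku_1")); // → -6
```

Accumulating into a `Map` also covers the merged-duplicate-line case: two lines naming the same component SKU fold into a single movement, matching the one-ledger-row-per-SKU invariant the tests assert.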
+ +type CollectionGetPut = Pick, "get" | "put">; +type QueryCollection = Pick, "query" | "put">; + +type FinalizeInventoryPorts = { + inventoryLedger: QueryCollection; + inventoryStock: CollectionGetPut; +}; + +export class InventoryFinalizeError extends Error { + constructor( + public code: CommerceErrorCode, + message: string, + public details?: Record, + ) { + super(message); + this.name = "InventoryFinalizeError"; + } +} + +type InventoryMutation = { + line: OrderLineItem; + stockId: string; + currentStock: StoredInventoryStock; + nextStock: StoredInventoryStock; + ledgerId: string; +}; + +function inventoryLedgerEntryId(orderId: string, productId: string, variantId: string): string { + return `line:${encodeURIComponent(orderId)}:${encodeURIComponent(productId)}:${encodeURIComponent(variantId)}`; +} + +function normalizeInventoryMutations( + orderId: string, + lineItems: OrderLineItem[], + stockRows: Map, + nowIso: string, +): InventoryMutation[] { + let merged: OrderLineItem[]; + try { + merged = mergeLineItemsBySku(lineItems); + } catch (error) { + if (error instanceof LineConflictError) { + throw new InventoryFinalizeError("ORDER_STATE_CONFLICT", error.message, { + orderId, + reason: "line_conflict", + productId: error.productId, + variantId: error.variantId ?? null, + expected: error.expected, + actual: error.actual, + }); + } + const msg = error instanceof Error ? error.message : String(error); + throw new InventoryFinalizeError("ORDER_STATE_CONFLICT", msg, { orderId }); + } + + return merged.map((line) => { + const stockId = inventoryStockDocId(line.productId, line.variantId ?? 
""); + const stock = stockRows.get(stockId); + if (!stock) { + throw new InventoryFinalizeError( + "PRODUCT_UNAVAILABLE", + `No inventory record for product ${line.productId}`, + { + productId: line.productId, + }, + ); + } + if (stock.version !== line.inventoryVersion) { + throw new InventoryFinalizeError( + "INVENTORY_CHANGED", + "Inventory version changed since checkout", + { productId: line.productId, expected: line.inventoryVersion, current: stock.version }, + ); + } + if (stock.quantity < line.quantity) { + throw new InventoryFinalizeError("INSUFFICIENT_STOCK", "Not enough stock to finalize order", { + productId: line.productId, + requested: line.quantity, + available: stock.quantity, + }); + } + const variantId = line.variantId ?? ""; + return { + line, + stockId, + currentStock: stock, + nextStock: { + ...stock, + version: stock.version + 1, + quantity: stock.quantity - line.quantity, + updatedAt: nowIso, + }, + ledgerId: inventoryLedgerEntryId(orderId, line.productId, variantId), + }; + }); +} + +async function applyInventoryMutation( + ports: FinalizeInventoryPorts, + orderId: string, + nowIso: string, + mutation: InventoryMutation, +): Promise { + const latest = await ports.inventoryStock.get(mutation.stockId); + if (!latest) { + throw new InventoryFinalizeError( + "PRODUCT_UNAVAILABLE", + `No inventory record for product ${mutation.line.productId}`, + { + productId: mutation.line.productId, + }, + ); + } + if (latest.version !== mutation.currentStock.version) { + throw new InventoryFinalizeError( + "INVENTORY_CHANGED", + "Inventory changed between preflight and write", + { + productId: mutation.line.productId, + expectedVersion: mutation.currentStock.version, + currentVersion: latest.version, + }, + ); + } + if (latest.quantity < mutation.line.quantity) { + throw new InventoryFinalizeError("INSUFFICIENT_STOCK", "Not enough stock at write time", { + productId: mutation.line.productId, + requested: mutation.line.quantity, + available: latest.quantity, + 
}); + } + const entry: StoredInventoryLedgerEntry = { + productId: mutation.line.productId, + variantId: mutation.line.variantId ?? "", + delta: -mutation.line.quantity, + referenceType: "order", + referenceId: orderId, + createdAt: nowIso, + }; + await ports.inventoryLedger.put(mutation.ledgerId, entry); + await ports.inventoryStock.put(mutation.stockId, mutation.nextStock); +} + +async function applyInventoryMutations( + ports: FinalizeInventoryPorts, + orderId: string, + nowIso: string, + stockRows: Map, + orderLines: OrderLineItem[], +): Promise { + const existing = await ports.inventoryLedger.query({ + where: { referenceType: "order", referenceId: orderId }, + limit: 1000, + }); + const seen = new Set(existing.items.map((row) => row.id)); + + let merged: OrderLineItem[]; + try { + merged = toInventoryDeductionLines(orderLines); + } catch (error) { + if (error instanceof BundleSnapshotError) { + throw new InventoryFinalizeError("ORDER_STATE_CONFLICT", error.message, { + reason: + error.code === "MISSING_BUNDLE_SNAPSHOT" + ? "bundle_snapshot_incomplete" + : "bundle_component_invalid_inventory", + productId: error.productId, + }); + } + if (error instanceof LineConflictError) { + throw new InventoryFinalizeError("ORDER_STATE_CONFLICT", error.message, { + reason: "line_conflict", + productId: error.productId, + variantId: error.variantId ?? null, + expected: error.expected, + actual: error.actual, + }); + } + const msg = error instanceof Error ? error.message : String(error); + throw new InventoryFinalizeError("ORDER_STATE_CONFLICT", msg, { orderId }); + } + + /** + * Reconcile pass: for lines where the ledger row was written but the stock + * write did not complete (crash between `inventoryLedger.put` and + * `inventoryStock.put` in `applyInventoryMutation`). + * + * `stock.version === line.inventoryVersion` means the stock was never updated + * despite the ledger entry existing — finish just the stock write. 
+ * `stock.version > inventoryVersion` means the stock was already updated; + * nothing to do for that line. + */ + for (const line of merged) { + const variantId = line.variantId ?? ""; + const stockId = inventoryStockDocId(line.productId, variantId); + const ledgerId = inventoryLedgerEntryId(orderId, line.productId, variantId); + if (!seen.has(ledgerId)) continue; + const stock = stockRows.get(stockId); + if (!stock) { + throw new InventoryFinalizeError( + "PRODUCT_UNAVAILABLE", + `No inventory record for product ${line.productId}`, + { productId: line.productId }, + ); + } + if (stock.version === line.inventoryVersion) { + await ports.inventoryStock.put(stockId, { + ...stock, + version: stock.version + 1, + quantity: stock.quantity - line.quantity, + updatedAt: nowIso, + }); + } + } + + // Apply pass: lines that have no ledger entry yet. + const linesNeedingWork: OrderLineItem[] = []; + for (const line of merged) { + const variantId = line.variantId ?? ""; + const ledgerId = inventoryLedgerEntryId(orderId, line.productId, variantId); + if (seen.has(ledgerId)) continue; + linesNeedingWork.push(line); + } + + const planned = normalizeInventoryMutations(orderId, linesNeedingWork, stockRows, nowIso); + for (const mutation of planned) { + await applyInventoryMutation(ports, orderId, nowIso, mutation); + seen.add(mutation.ledgerId); + } +} + +export function readCurrentStockRows( + inventoryStock: CollectionGetPut, + lines: OrderLineItem[], +): Promise> { + return (async () => { + const out = new Map(); + const stockLineById = new Map(); + let deductionLines: OrderLineItem[]; + try { + deductionLines = toInventoryDeductionLines(lines); + } catch (error) { + if (error instanceof BundleSnapshotError) { + throw new InventoryFinalizeError( + "ORDER_STATE_CONFLICT", + `Unable to build inventory deduction lines: ${error.message}`, + { + reason: + error.code === "MISSING_BUNDLE_SNAPSHOT" + ? 
"bundle_snapshot_incomplete" + : "bundle_component_invalid_inventory", + productId: error.productId, + }, + ); + } + if (error instanceof LineConflictError) { + throw new InventoryFinalizeError( + "ORDER_STATE_CONFLICT", + `Unable to build inventory deduction lines: ${error.message}`, + { + reason: "line_conflict", + productId: error.productId, + variantId: error.variantId ?? null, + expected: error.expected, + actual: error.actual, + }, + ); + } + const message = error instanceof Error ? error.message : String(error); + throw new InventoryFinalizeError( + "ORDER_STATE_CONFLICT", + `Unable to build inventory deduction lines: ${message}`, + { + reason: "bundle_snapshot_incomplete", + }, + ); + } + for (const line of deductionLines) { + const stockId = inventoryStockDocId(line.productId, line.variantId ?? ""); + stockLineById.set(stockId, line); + } + + const stockRows = await Promise.all( + Array.from(stockLineById.entries()).map(async ([stockId, line]) => ({ + stockId, + productId: line.productId, + stock: await inventoryStock.get(stockId), + })), + ); + for (const { stockId, productId, stock } of stockRows) { + if (!stock) { + throw new InventoryFinalizeError( + "PRODUCT_UNAVAILABLE", + `No inventory record for product ${productId}`, + { + productId, + }, + ); + } + out.set(stockId, stock); + } + return out; + })(); +} + +export async function applyInventoryForOrder( + ports: FinalizeInventoryPorts, + order: { lineItems: OrderLineItem[] }, + orderId: string, + nowIso: string, +): Promise { + const stockRows = await readCurrentStockRows(ports.inventoryStock, order.lineItems); + await applyInventoryMutations(ports, orderId, nowIso, stockRows, order.lineItems); +} + +export function mapInventoryErrorToApiCode(code: CommerceErrorCode): CommerceErrorCode { + return code === "PRODUCT_UNAVAILABLE" || code === "INSUFFICIENT_STOCK" + ? 
"PAYMENT_CONFLICT" + : code; +} + +export function isTerminalInventoryFailure(code: CommerceErrorCode): boolean { + return ( + code === "PRODUCT_UNAVAILABLE" || + code === "INSUFFICIENT_STOCK" || + code === "INVENTORY_CHANGED" || + code === "ORDER_STATE_CONFLICT" + ); +} diff --git a/packages/plugins/commerce/src/orchestration/finalize-payment-status.test.ts b/packages/plugins/commerce/src/orchestration/finalize-payment-status.test.ts new file mode 100644 index 000000000..62a7e1d10 --- /dev/null +++ b/packages/plugins/commerce/src/orchestration/finalize-payment-status.test.ts @@ -0,0 +1,125 @@ +import { describe, expect, it } from "vitest"; + +import { deriveFinalizationResumeState } from "./finalize-payment-status.js"; + +describe("deriveFinalizationResumeState", () => { + it("returns replay_processed when receipt is already processed", () => { + expect( + deriveFinalizationResumeState({ + receiptStatus: "processed", + isInventoryApplied: false, + isOrderPaid: false, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: false, + }), + ).toBe("replay_processed"); + }); + + it("returns replay_processed when receipt row is marked processed through receipt flag", () => { + expect( + deriveFinalizationResumeState({ + receiptStatus: "missing", + isInventoryApplied: false, + isOrderPaid: false, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: true, + }), + ).toBe("replay_processed"); + }); + + it("returns replay_duplicate for duplicate receipts", () => { + expect( + deriveFinalizationResumeState({ + receiptStatus: "duplicate", + isInventoryApplied: false, + isOrderPaid: false, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: false, + }), + ).toBe("replay_duplicate"); + }); + + it("returns error for terminal error receipts", () => { + expect( + deriveFinalizationResumeState({ + receiptStatus: "error", + isInventoryApplied: true, + isOrderPaid: false, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: false, + }), + ).toBe("error"); + }); + + 
it("returns event_unknown when completed work exists without a receipt row", () => { + expect( + deriveFinalizationResumeState({ + receiptStatus: "missing", + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: false, + }), + ).toBe("event_unknown"); + }); + + it("returns not_started when finalization has not begun", () => { + expect( + deriveFinalizationResumeState({ + receiptStatus: "missing", + isInventoryApplied: false, + isOrderPaid: false, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: false, + }), + ).toBe("not_started"); + }); + + it("returns pending_inventory when inventory ledger is not yet written", () => { + expect( + deriveFinalizationResumeState({ + receiptStatus: "pending", + isInventoryApplied: false, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: false, + }), + ).toBe("pending_inventory"); + }); + + it("returns pending_order when payment phase update has not completed", () => { + expect( + deriveFinalizationResumeState({ + receiptStatus: "pending", + isInventoryApplied: true, + isOrderPaid: false, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: false, + }), + ).toBe("pending_order"); + }); + + it("returns pending_attempt when payment attempt finalization has not completed", () => { + expect( + deriveFinalizationResumeState({ + receiptStatus: "pending", + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: false, + }), + ).toBe("pending_attempt"); + }); + + it("returns pending_receipt when only receipt write remains", () => { + expect( + deriveFinalizationResumeState({ + receiptStatus: "pending", + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: false, + }), + ).toBe("pending_receipt"); + }); +}); diff --git a/packages/plugins/commerce/src/orchestration/finalize-payment-status.ts 
b/packages/plugins/commerce/src/orchestration/finalize-payment-status.ts new file mode 100644 index 000000000..0073e1695 --- /dev/null +++ b/packages/plugins/commerce/src/orchestration/finalize-payment-status.ts @@ -0,0 +1,54 @@ +import type { WebhookReceiptErrorCode } from "../types.js"; + +export type FinalizationStatus = { + /** Raw webhook-receipt status for quick runbook triage. */ + receiptStatus: "missing" | "pending" | "processed" | "error" | "duplicate"; + /** At least one inventory ledger row exists for this order. */ + isInventoryApplied: boolean; + /** Order paymentPhase is "paid". */ + isOrderPaid: boolean; + /** At least one payment attempt for this order+provider is "succeeded". */ + isPaymentAttemptSucceeded: boolean; + /** Webhook receipt for this event is "processed". */ + isReceiptProcessed: boolean; + /** Optional terminal error classification when `receiptStatus === "error"`. */ + receiptErrorCode?: WebhookReceiptErrorCode; + /** + * Human-readable resume state for operations that consume this helper as a + * status surface (MCP, support tooling, runbooks). + * `event_unknown` means the order/attempt/ledger already indicate completion + * but no receipt row exists for this external event id. 
+ */
+  resumeState:
+    | "not_started"
+    | "replay_processed"
+    | "replay_duplicate"
+    | "error"
+    | "event_unknown"
+    | "pending_inventory"
+    | "pending_order"
+    | "pending_attempt"
+    | "pending_receipt";
+};
+
+export function deriveFinalizationResumeState(input: {
+  receiptStatus: FinalizationStatus["receiptStatus"];
+  isInventoryApplied: boolean;
+  isOrderPaid: boolean;
+  isPaymentAttemptSucceeded: boolean;
+  isReceiptProcessed: boolean;
+}): FinalizationStatus["resumeState"] {
+  if (input.receiptStatus === "processed" || input.isReceiptProcessed) return "replay_processed";
+  if (input.receiptStatus === "duplicate") return "replay_duplicate";
+  if (input.receiptStatus === "error") return "error";
+  if (input.receiptStatus === "missing") {
+    if (input.isInventoryApplied && input.isOrderPaid && input.isPaymentAttemptSucceeded) {
+      return "event_unknown";
+    }
+    return "not_started";
+  }
+  if (!input.isInventoryApplied) return "pending_inventory";
+  if (!input.isOrderPaid) return "pending_order";
+  if (!input.isPaymentAttemptSucceeded) return "pending_attempt";
+  return "pending_receipt";
+}
diff --git a/packages/plugins/commerce/src/orchestration/finalize-payment.test.ts b/packages/plugins/commerce/src/orchestration/finalize-payment.test.ts
new file mode 100644
index 000000000..e68d1ef76
--- /dev/null
+++ b/packages/plugins/commerce/src/orchestration/finalize-payment.test.ts
@@ -0,0 +1,2433 @@
+import { beforeAll, describe, expect, it } from "vitest";
+
+import { sha256HexAsync } from "../lib/crypto-adapter.js";
+import type {
+  StoredInventoryLedgerEntry,
+  StoredInventoryStock,
+  StoredOrder,
+  StoredPaymentAttempt,
+  StoredWebhookReceipt,
+} from "../types.js";
+import {
+  finalizePaymentFromWebhook,
+  type FinalizePaymentPorts,
+  inventoryStockDocId,
+  queryFinalizationStatus,
+  receiptToView,
+  webhookReceiptDocId,
+} from "./finalize-payment.js";
+
+/** Raw finalize token matching `FINALIZE_HASH` on test orders.
*/ +const FINALIZE_RAW = "unit_test_finalize_secret_ok____________"; +let FINALIZE_HASH = ""; + +function asMemCollection(collection: MemColl): MemColl { + return collection; +} + +beforeAll(async () => { + FINALIZE_HASH = await sha256HexAsync(FINALIZE_RAW); +}); + +type MemQueryOptions = { + where?: Record; + limit?: number; + cursor?: string; + orderBy?: Partial>; +}; + +type TestFinalizePaymentPorts = FinalizePaymentPorts & { + orders: MemColl; + webhookReceipts: MemColl; + paymentAttempts: MemColl; + inventoryLedger: MemColl; + inventoryStock: MemColl; +}; + +type MemPaginated = { items: T[]; hasMore: boolean; cursor?: string }; + +class MemColl { + constructor(public readonly rows = new Map()) {} + + async get(id: string): Promise { + const row = this.rows.get(id); + return row ? structuredClone(row) : null; + } + + async put(id: string, data: T): Promise { + this.rows.set(id, structuredClone(data)); + } + + async query(options?: MemQueryOptions): Promise> { + const where = options?.where ?? {}; + const limit = Math.min(options?.limit ?? 
50, 100); + const orderBy = options?.orderBy; + const items: Array<{ id: string; data: T }> = []; + for (const [id, data] of this.rows) { + const ok = Object.entries(where).every( + ([k, v]) => (data as Record)[k] === v, + ); + if (ok) items.push({ id, data: structuredClone(data) }); + } + if (orderBy && Object.keys(orderBy).length > 0) { + items.sort((a, b) => { + for (const [field, dir] of Object.entries(orderBy) as Array< + ["createdAt" | "orderId" | "providerId" | "status", "asc" | "desc"] + >) { + if ( + field !== "createdAt" && + field !== "orderId" && + field !== "providerId" && + field !== "status" + ) + continue; + const rowA = a.data as Record; + const rowB = b.data as Record; + const av = rowA[field]; + const bv = rowB[field]; + if (av === bv) continue; + if (dir === "desc") return String(av).localeCompare(String(bv)) * -1; + return String(av).localeCompare(String(bv)); + } + return a.id.localeCompare(b.id); + }); + } + const trimmed = items.slice(0, limit); + return { items: trimmed, hasMore: false }; + } +} + +function withOneTimePutFailure(collection: MemColl): MemColl { + let shouldFail = true; + return { + get rows() { + return collection.rows; + }, + get: (id: string) => collection.get(id), + query: (options?: MemQueryOptions) => collection.query(options), + put: async (id: string, data: T): Promise => { + if (shouldFail) { + shouldFail = false; + throw new Error("simulated storage write failure"); + } + await collection.put(id, data); + }, + } as MemColl; +} + +/** Succeeds on the first `succeedCount` puts, then fails exactly once. 
*/ +function withNthPutFailure( + collection: MemColl, + failOnNth: number, +): MemColl { + let callCount = 0; + let hasFailed = false; + return { + get rows() { + return collection.rows; + }, + get: (id: string) => collection.get(id), + query: (options?: MemQueryOptions) => collection.query(options), + put: async (id: string, data: T): Promise => { + callCount++; + if (callCount === failOnNth && !hasFailed) { + hasFailed = true; + throw new Error("simulated storage write failure"); + } + await collection.put(id, data); + }, + } as MemColl; +} + +type MemCollWithPutIfAbsent = MemColl & { + putIfAbsent(id: string, data: T): Promise; +}; +type MemCollWithClaiming = MemCollWithPutIfAbsent & { + compareAndSwap(id: string, expectedVersion: string, data: T): Promise; +}; + +function memCollWithPutIfAbsent(collection: MemColl): MemCollWithClaiming { + return { + get rows() { + return collection.rows; + }, + get: collection.get.bind(collection), + query: collection.query.bind(collection), + put: collection.put.bind(collection), + putIfAbsent: async (id: string, data: T): Promise => { + if (collection.rows.has(id)) return false; + collection.rows.set(id, structuredClone(data)); + return true; + }, + compareAndSwap: async (id: string, expectedVersion: string, data: T): Promise => { + const existing = collection.rows.get(id); + if (!existing) return false; + const version = (existing as Record).updatedAt; + if (typeof version !== "string" || version !== expectedVersion) return false; + collection.rows.set(id, structuredClone(data)); + return true; + }, + } as MemCollWithClaiming; +} + +function stealWebhookClaim( + webhookRows: Map, + receiptId: string, +): void { + const current = webhookRows.get(receiptId); + if (!current) return; + webhookRows.set(receiptId, { + ...current, + claimOwner: "other-worker", + claimToken: "stolen-token", + claimVersion: "2026-04-02T11:00:00.000Z", + }); +} + +function portsFromState(state: { + orders: Map; + webhookReceipts: Map; + 
+  paymentAttempts: Map;
+  inventoryLedger: Map;
+  inventoryStock: Map;
+}): TestFinalizePaymentPorts {
+  return {
+    orders: new MemColl(state.orders),
+    webhookReceipts: new MemColl(state.webhookReceipts),
+    paymentAttempts: new MemColl(state.paymentAttempts),
+    inventoryLedger: new MemColl(state.inventoryLedger),
+    inventoryStock: new MemColl(state.inventoryStock),
+  } as TestFinalizePaymentPorts;
+}
+
+const now = "2026-04-02T12:00:00.000Z";
+
+function baseOrder(overrides: Partial<StoredOrder> = {}): StoredOrder {
+  return {
+    cartId: "cart_1",
+    paymentPhase: "payment_pending",
+    currency: "USD",
+    lineItems: [
+      {
+        productId: "p1",
+        quantity: 2,
+        inventoryVersion: 3,
+        unitPriceMinor: 500,
+      },
+    ],
+    totalMinor: 1000,
+    finalizeTokenHash: FINALIZE_HASH,
+    createdAt: now,
+    updatedAt: now,
+    ...overrides,
+  };
+}
+
+describe("finalizePaymentFromWebhook", () => {
+  it("finalizes: paid order, processed receipt, stock decrement, ledger row, attempt succeeded", async () => {
+    const orderId = "order_1";
+    const stockId = inventoryStockDocId("p1", "");
+    const state = {
+      orders: new Map([[orderId, baseOrder()]]),
+      webhookReceipts: new Map(),
+      paymentAttempts: new Map([
+        [
+          "pa_1",
+          {
+            orderId,
+            providerId: "stripe",
+            status: "pending",
+            createdAt: now,
+            updatedAt: now,
+          },
+        ],
+      ]),
+      inventoryLedger: new Map(),
+      inventoryStock: new Map([
+        [
+          stockId,
+          {
+            productId: "p1",
+            variantId: "",
+            version: 3,
+            quantity: 10,
+            updatedAt: now,
+          },
+        ],
+      ]),
+    };
+
+    const ports = portsFromState(state);
+    const ext = "evt_test_finalize";
+    const res = await finalizePaymentFromWebhook(ports, {
+      orderId,
+      providerId: "stripe",
+      externalEventId: ext,
+      correlationId: "cid-1",
+      finalizeToken: FINALIZE_RAW,
+      nowIso: now,
+    });
+
+    expect(res).toEqual({ kind: "completed", orderId });
+
+    const rid = webhookReceiptDocId("stripe", ext);
+    const receipt = await ports.webhookReceipts.get(rid);
+    expect(receipt?.status).toBe("processed");
+
+    const order = await
ports.orders.get(orderId); + expect(order?.paymentPhase).toBe("paid"); + + const stock = await ports.inventoryStock.get(stockId); + expect(stock?.quantity).toBe(8); + expect(stock?.version).toBe(4); + + const ledger = await ports.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + expect(ledger.items[0]!.data.delta).toBe(-2); + expect(ledger.items[0]!.data.referenceId).toBe(orderId); + + const pa = await ports.paymentAttempts.get("pa_1"); + expect(pa?.status).toBe("succeeded"); + }); + + it("merges duplicate SKU lines into one inventory movement", async () => { + const orderId = "order_merge"; + const stockId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [ + { + productId: "p1", + quantity: 1, + inventoryVersion: 3, + unitPriceMinor: 500, + }, + { + productId: "p1", + quantity: 1, + inventoryVersion: 3, + unitPriceMinor: 500, + }, + ], + totalMinor: 1000, + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [ + stockId, + { + productId: "p1", + variantId: "", + version: 3, + quantity: 10, + updatedAt: now, + }, + ], + ]), + }; + + const ports = portsFromState(state); + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: "evt_merge_lines", + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res.kind).toBe("completed"); + const ledger = await ports.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + expect(ledger.items[0]!.data.delta).toBe(-2); + const stock = await ports.inventoryStock.get(stockId); + expect(stock?.quantity).toBe(8); + }); + + it("chooses the earliest pending provider-specific payment attempt", async () => { + const orderId = "order_attempts"; + const stockId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ 
+ lineItems: [ + { + productId: "p1", + quantity: 1, + inventoryVersion: 3, + unitPriceMinor: 500, + }, + ], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "attempt_newest", + { + orderId, + providerId: "stripe", + status: "pending", + createdAt: "2026-04-02T12:00:02.000Z", + updatedAt: "2026-04-02T12:00:02.000Z", + }, + ], + [ + "attempt_earliest", + { + orderId, + providerId: "stripe", + status: "pending", + createdAt: "2026-04-02T12:00:00.000Z", + updatedAt: "2026-04-02T12:00:00.000Z", + }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [ + stockId, + { + productId: "p1", + variantId: "", + version: 3, + quantity: 10, + updatedAt: now, + }, + ], + ]), + }; + + const ports = portsFromState(state); + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: "evt_attempts", + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "completed", orderId }); + + const chosen = await ports.paymentAttempts.get("attempt_earliest"); + const ignored = await ports.paymentAttempts.get("attempt_newest"); + expect(chosen?.status).toBe("succeeded"); + expect(ignored?.status).toBe("pending"); + }); + + it("does not partially apply stock if preflight catches an invalid line", async () => { + const orderId = "order_partial_fail"; + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [ + { + productId: "p1", + quantity: 1, + inventoryVersion: 3, + unitPriceMinor: 500, + }, + { + productId: "p2", + variantId: "v1", + quantity: 9, + inventoryVersion: 3, + unitPriceMinor: 250, + }, + ], + totalMinor: 7250, + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 3, + quantity: 10, + updatedAt: now, + }, + ], + [ + inventoryStockDocId("p2", 
"v1"), + { + productId: "p2", + variantId: "v1", + version: 3, + quantity: 2, + updatedAt: now, + }, + ], + ]), + }; + const ports = portsFromState(state); + const extId = "evt_partial_fail"; + const result = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(result).toMatchObject({ + kind: "api_error", + error: { code: "PAYMENT_CONFLICT" }, + }); + + const firstStock = await ports.inventoryStock.get(inventoryStockDocId("p1", "")); + expect(firstStock?.quantity).toBe(10); + const firstVersion = firstStock?.version; + const secondStock = await ports.inventoryStock.get(inventoryStockDocId("p2", "v1")); + expect(secondStock?.quantity).toBe(2); + expect(secondStock?.version).toBe(3); + expect(firstVersion).toBe(3); + + const ledger = await ports.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(0); + const order = await ports.orders.get(orderId); + expect(order?.paymentPhase).toBe("payment_pending"); + const receipt = await ports.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(receipt?.status).toBe("error"); + }); + + it("resumes safely when order persistence fails after inventory write", async () => { + const orderId = "order_resume_order_fail"; + const extId = "evt_order_fail"; + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [ + { + productId: "p1", + quantity: 1, + inventoryVersion: 3, + unitPriceMinor: 500, + }, + ], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 3, + quantity: 10, + updatedAt: now, + }, + ], + ]), + }; + const basePorts = portsFromState(state) as FinalizePaymentPorts & { + orders: MemColl; + }; + const ports = { + ...basePorts, + orders: 
withOneTimePutFailure(asMemCollection(basePorts.orders)), + }; + + const first = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(first).toMatchObject({ kind: "api_error", error: { code: "ORDER_STATE_CONFLICT" } }); + + const stock = await basePorts.inventoryStock.get(inventoryStockDocId("p1", "")); + expect(stock?.quantity).toBe(9); + const ledger = await basePorts.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + + const second = await finalizePaymentFromWebhook(basePorts, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(second).toEqual({ kind: "completed", orderId }); + + const paidOrder = await basePorts.orders.get(orderId); + expect(paidOrder?.paymentPhase).toBe("paid"); + const receipt = await basePorts.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(receipt?.status).toBe("processed"); + }); + + it("retries safely when payment-attempt finalization fails", async () => { + const orderId = "order_resume_attempt_fail"; + const extId = "evt_attempt_fail"; + const state = { + orders: new Map([[orderId, baseOrder()]]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_retry", + { + orderId, + providerId: "stripe", + status: "pending", + createdAt: now, + updatedAt: now, + }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [ + inventoryStockDocId("p1", ""), + { + productId: "p1", + variantId: "", + version: 3, + quantity: 10, + updatedAt: now, + }, + ], + ]), + }; + const ports = portsFromState(state); + const basePorts = { + ...ports, + paymentAttempts: withOneTimePutFailure(asMemCollection(ports.paymentAttempts)), + } as typeof ports; + + const first = await finalizePaymentFromWebhook(basePorts, { + orderId, + providerId: "stripe", + 
externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(first).toMatchObject({ kind: "api_error", error: { code: "ORDER_STATE_CONFLICT" } }); + + const paidOrder = await ports.orders.get(orderId); + expect(paidOrder?.paymentPhase).toBe("paid"); + + const pendingAttempt = await ports.paymentAttempts.query({ + where: { orderId: orderId, providerId: "stripe", status: "pending" }, + limit: 5, + }); + expect(pendingAttempt.items).toHaveLength(1); + + const receipt = await ports.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(receipt?.status).toBe("pending"); + + const second = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(second).toEqual({ kind: "completed", orderId }); + + const succeededAttempt = await ports.paymentAttempts.query({ + where: { orderId: orderId, providerId: "stripe", status: "succeeded" }, + limit: 5, + }); + expect(succeededAttempt.items).toHaveLength(1); + const retryReceipt = await ports.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(retryReceipt?.status).toBe("processed"); + }); + + it("rejects finalize when token is missing but order requires one", async () => { + const orderId = "order_1"; + const state = { + orders: new Map([[orderId, baseOrder()]]), + webhookReceipts: new Map(), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map(), + }; + + const res = await finalizePaymentFromWebhook(portsFromState(state), { + orderId, + providerId: "stripe", + externalEventId: "evt_no_tok", + correlationId: "cid", + finalizeToken: "", + nowIso: now, + }); + + expect(res).toMatchObject({ + kind: "api_error", + error: { code: "ORDER_TOKEN_REQUIRED" }, + }); + }); + + it("rejects finalize when token does not match", async () => { + const orderId = "order_1"; + const state = { + orders: new 
Map([[orderId, baseOrder()]]), + webhookReceipts: new Map(), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map(), + }; + + const res = await finalizePaymentFromWebhook(portsFromState(state), { + orderId, + providerId: "stripe", + externalEventId: "evt_bad_tok", + correlationId: "cid", + finalizeToken: "wrong_token___________________________", + nowIso: now, + }); + + expect(res).toMatchObject({ + kind: "api_error", + error: { code: "ORDER_TOKEN_INVALID" }, + }); + }); + + it("duplicate externalEventId replay returns replay (200-class semantics)", async () => { + const orderId = "order_1"; + const ext = "evt_dup"; + const rid = webhookReceiptDocId("stripe", ext); + const state = { + orders: new Map([[orderId, baseOrder()]]), + webhookReceipts: new Map([ + [ + rid, + { + providerId: "stripe", + externalEventId: ext, + orderId, + status: "processed", + createdAt: now, + updatedAt: now, + }, + ], + ]), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map(), + }; + + const ports = portsFromState(state); + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: ext, + correlationId: "cid", + finalizeToken: "", + nowIso: now, + }); + + expect(res).toEqual({ kind: "replay", reason: "webhook_receipt_processed" }); + }); + + it("order already paid without receipt row still replays", async () => { + const orderId = "order_1"; + const state = { + orders: new Map([[orderId, baseOrder({ paymentPhase: "paid" })]]), + webhookReceipts: new Map(), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map(), + }; + + const res = await finalizePaymentFromWebhook(portsFromState(state), { + orderId, + providerId: "stripe", + externalEventId: "evt_x", + correlationId: "cid", + finalizeToken: "", + nowIso: now, + }); + + expect(res.kind).toBe("replay"); + if (res.kind === "replay") expect(res.reason).toBe("order_already_paid"); + }); + + 
it("resumes completion for a paid order with a pending webhook receipt", async () => { + const orderId = "order_paid_pending"; + const ext = "evt_paid_pending"; + const rid = webhookReceiptDocId("stripe", ext); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + paymentPhase: "paid", + lineItems: [ + { + productId: "p1", + quantity: 2, + inventoryVersion: 3, + unitPriceMinor: 500, + }, + ], + }), + ], + ]), + webhookReceipts: new Map([ + [ + rid, + { + providerId: "stripe", + externalEventId: ext, + orderId, + status: "pending", + createdAt: now, + updatedAt: now, + }, + ], + ]), + paymentAttempts: new Map([ + [ + "pa_paid", + { + orderId, + providerId: "stripe", + status: "pending", + createdAt: now, + updatedAt: now, + }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map(), + }; + const ports = portsFromState(state); + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: ext, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "completed", orderId }); + const paidOrder = await ports.orders.get(orderId); + expect(paidOrder?.paymentPhase).toBe("paid"); + const receipt = await ports.webhookReceipts.get(rid); + expect(receipt?.status).toBe("processed"); + const attempt = await ports.paymentAttempts.get("pa_paid"); + expect(attempt?.status).toBe("succeeded"); + const final = await queryFinalizationStatus(ports, orderId, "stripe", ext); + expect(final).toMatchObject({ + resumeState: "replay_processed", + receiptStatus: "processed", + isInventoryApplied: false, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: true, + }); + }); + + it("pending receipt still requires finalize token", async () => { + const orderId = "order_1"; + const ext = "evt_pending"; + const rid = webhookReceiptDocId("stripe", ext); + const state = { + orders: new Map([[orderId, baseOrder()]]), + webhookReceipts: new Map([ + [ + 
rid, + { + providerId: "stripe", + externalEventId: ext, + orderId, + status: "pending", + createdAt: now, + updatedAt: now, + }, + ], + ]), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map(), + }; + + const ports = portsFromState(state); + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: ext, + correlationId: "cid", + finalizeToken: "", + nowIso: now, + }); + + expect(res).toMatchObject({ + kind: "api_error", + error: { code: "ORDER_TOKEN_REQUIRED" }, + }); + const pendingStatus = await queryFinalizationStatus(ports, orderId, "stripe", ext); + expect(pendingStatus).toMatchObject({ + receiptStatus: "pending", + isInventoryApplied: false, + isOrderPaid: false, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: false, + resumeState: "pending_inventory", + }); + }); + + it("keeps a pending event resumable when finalize token is initially missing", async () => { + const orderId = "order_1"; + const ext = "evt_pending_retry"; + const rid = webhookReceiptDocId("stripe", ext); + const stockId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([[orderId, baseOrder()]]), + webhookReceipts: new Map([ + [ + rid, + { + providerId: "stripe", + externalEventId: ext, + orderId, + status: "pending", + createdAt: now, + updatedAt: now, + }, + ], + ]), + paymentAttempts: new Map([ + [ + "pa_pending_retry", + { + orderId, + providerId: "stripe", + status: "pending", + createdAt: now, + updatedAt: now, + }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [ + stockId, + { + productId: "p1", + variantId: "", + version: 3, + quantity: 10, + updatedAt: now, + }, + ], + ]), + }; + + const ports = portsFromState(state); + const first = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: ext, + correlationId: "cid", + finalizeToken: "", + nowIso: now, + }); + expect(first).toMatchObject({ + kind: 
"api_error", + error: { code: "ORDER_TOKEN_REQUIRED" }, + }); + const preRetryStatus = await queryFinalizationStatus(ports, orderId, "stripe", ext); + expect(preRetryStatus).toMatchObject({ + receiptStatus: "pending", + isInventoryApplied: false, + isOrderPaid: false, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: false, + resumeState: "pending_inventory", + }); + const pending = await ports.webhookReceipts.get(rid); + expect(pending?.status).toBe("pending"); + + const second = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: ext, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(second).toEqual({ kind: "completed", orderId }); + + const final = await queryFinalizationStatus(ports, orderId, "stripe", ext); + expect(final).toMatchObject({ + receiptStatus: "processed", + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: true, + resumeState: "replay_processed", + }); + + const stock = await ports.inventoryStock.get(stockId); + expect(stock?.version).toBe(4); + expect(stock?.quantity).toBe(8); + const ledger = await ports.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + }); + + it("marks pending receipt as error when order leaves finalizable phase between reads", async () => { + const orderId = "order_state_conflict"; + const ext = "evt_state_conflict"; + const rid = webhookReceiptDocId("stripe", ext); + const state = { + orders: new Map([[orderId, baseOrder()]]), + webhookReceipts: new Map(), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map(), + }; + + const basePorts = portsFromState(state) as FinalizePaymentPorts & { + orders: MemColl; + }; + let getCount = 0; + const orderStateMutatingOrders = { + ...basePorts.orders, + get: async (id: string) => { + const row = await basePorts.orders.get(id); + getCount += 1; + if (row && getCount === 2 && id === 
orderId) { + const drifted = { ...row, paymentPhase: "processing" as const }; + basePorts.orders.rows.set(id, drifted); + return drifted; + } + return row; + }, + }; + + const ports = { ...basePorts, orders: orderStateMutatingOrders } as FinalizePaymentPorts; + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: ext, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(res).toMatchObject({ + kind: "api_error", + error: { code: "ORDER_STATE_CONFLICT" }, + }); + + const receipt = await basePorts.webhookReceipts.get(rid); + expect(receipt?.status).toBe("error"); + expect(receipt?.errorCode).toBe("ORDER_STATE_CONFLICT"); + }); + + it("marks pending receipt as error when order disappears between reads", async () => { + const orderId = "order_disappears"; + const ext = "evt_disappears"; + const rid = webhookReceiptDocId("stripe", ext); + const state = { + orders: new Map([[orderId, baseOrder()]]), + webhookReceipts: new Map(), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map(), + }; + + const basePorts = portsFromState(state) as FinalizePaymentPorts & { + orders: MemColl; + }; + let orderReadCount = 0; + const disappearingOrders = { + ...basePorts.orders, + get: async (id: string) => { + const row = await basePorts.orders.get(id); + orderReadCount += 1; + if (id === orderId && orderReadCount >= 2) { + basePorts.orders.rows.delete(id); + return null; + } + return row; + }, + } as MemColl; + + const ports = { ...basePorts, orders: disappearingOrders } as FinalizePaymentPorts; + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: ext, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(res).toMatchObject({ + kind: "api_error", + error: { code: "ORDER_NOT_FOUND", message: "Order not found" }, + }); + + const receipt = await basePorts.webhookReceipts.get(rid); + 
expect(receipt?.status).toBe("error"); + expect(receipt?.errorCode).toBe("ORDER_NOT_FOUND"); + expect(receipt?.errorDetails).toMatchObject({ orderId, correlationId: "cid" }); + const order = await basePorts.orders.get(orderId); + expect(order).toBeNull(); + }); + + it("inventory version mismatch sets payment_conflict and returns INVENTORY_CHANGED", async () => { + const orderId = "order_1"; + const stockId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([[orderId, baseOrder()]]), + webhookReceipts: new Map(), + paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [ + stockId, + { + productId: "p1", + variantId: "", + version: 99, + quantity: 10, + updatedAt: now, + }, + ], + ]), + }; + + const ports = portsFromState(state); + const ext = "evt_inv"; + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: ext, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toMatchObject({ + kind: "api_error", + error: { code: "INVENTORY_CHANGED" }, + }); + const status = await queryFinalizationStatus(ports, orderId, "stripe", ext); + expect(status).toMatchObject({ + receiptStatus: "error", + isInventoryApplied: false, + isOrderPaid: false, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: false, + receiptErrorCode: "INVENTORY_CHANGED", + resumeState: "error", + }); + const order = await ports.orders.get(orderId); + expect(order?.paymentPhase).toBe("payment_pending"); + const rid = webhookReceiptDocId("stripe", ext); + const rec = await ports.webhookReceipts.get(rid); + expect(rec?.status).toBe("error"); + expect(rec?.errorCode).toBe("INVENTORY_CHANGED"); + }); + + it("terminalized inventory mismatch receipt blocks same-event replay", async () => { + const orderId = "order_1"; + const stockId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([[orderId, baseOrder()]]), + webhookReceipts: new Map(), + 
paymentAttempts: new Map(), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [ + stockId, + { + productId: "p1", + variantId: "", + version: 99, + quantity: 10, + updatedAt: now, + }, + ], + ]), + }; + + const ports = portsFromState(state); + const ext = "evt_inv_terminal"; + const first = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: ext, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(first).toMatchObject({ + kind: "api_error", + error: { code: "INVENTORY_CHANGED" }, + }); + + const second = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: ext, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(second).toMatchObject({ + kind: "api_error", + error: { code: "ORDER_STATE_CONFLICT" }, + }); + + const rid = webhookReceiptDocId("stripe", ext); + const receipt = await ports.webhookReceipts.get(rid); + expect(receipt?.status).toBe("error"); + }); + + it("receiptToView maps storage rows for the kernel", () => { + expect(receiptToView(null)).toEqual({ exists: false }); + expect( + receiptToView({ + providerId: "stripe", + externalEventId: "e", + orderId: "o", + status: "duplicate", + createdAt: now, + updatedAt: now, + }), + ).toEqual({ exists: true, status: "duplicate" }); + }); + + it("resumes correctly when ledger write succeeds but stock write fails", async () => { + /** + * Sharpest inventory edge: `inventoryLedger.put` succeeds but + * `inventoryStock.put` throws. The receipt is left `pending`, ledger row + * exists, stock is still at the pre-mutation version. + * + * On retry the reconcile pass in `applyInventoryMutations` must detect + * "ledger exists, stock.version === inventoryVersion" and finish the stock + * write without re-writing the ledger. 
+ */ + const orderId = "order_ledger_ok_stock_fail"; + const extId = "evt_stock_fail"; + const stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_lsf", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const basePorts = portsFromState(state); + // Wrap inventoryStock so the first put (stock update) fails. + const ports = { + ...basePorts, + inventoryStock: withOneTimePutFailure(asMemCollection(basePorts.inventoryStock)), + } as FinalizePaymentPorts; + + // First attempt: ledger write succeeds, stock write throws (hard storage error). + await expect( + finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }), + ).rejects.toThrow("simulated storage write failure"); + const interrupted = await queryFinalizationStatus(basePorts, orderId, "stripe", extId); + expect(interrupted).toMatchObject({ + receiptStatus: "pending", + isInventoryApplied: true, + isOrderPaid: false, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: false, + resumeState: "pending_order", + }); + + // After first attempt: ledger row must exist, stock must NOT yet be updated. 
+ const ledgerAfterFirst = await basePorts.inventoryLedger.query({ limit: 10 }); + expect(ledgerAfterFirst.items).toHaveLength(1); + const stockAfterFirst = await basePorts.inventoryStock.get(stockDocId); + expect(stockAfterFirst?.version).toBe(3); // stock unchanged + expect(stockAfterFirst?.quantity).toBe(10); // quantity unchanged + + // Second attempt on basePorts (stock write works): reconcile pass should + // detect ledger-exists + stock.version === inventoryVersion and finish it. + const second = await finalizePaymentFromWebhook(basePorts, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(second).toEqual({ kind: "completed", orderId }); + + const stockAfterRetry = await basePorts.inventoryStock.get(stockDocId); + expect(stockAfterRetry?.version).toBe(4); // stock updated + expect(stockAfterRetry?.quantity).toBe(8); // 10 - 2 + + const ledgerAfterRetry = await basePorts.inventoryLedger.query({ limit: 10 }); + expect(ledgerAfterRetry.items).toHaveLength(1); // no duplicate ledger row + + const status = await queryFinalizationStatus(basePorts, orderId, "stripe", extId); + expect(status).toMatchObject({ + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: true, + receiptStatus: "processed", + resumeState: "replay_processed", + }); + }); + + it("retries safely when payment attempt finalization write fails", async () => { + const orderId = "order_pending_attempt"; + const extId = "evt_attempt_fail"; + const stockId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_retry_attempt", + { + orderId, + providerId: "stripe", + status: "pending", + createdAt: now, + updatedAt: now, + }, + ], + ]), + 
inventoryLedger: new Map(), + inventoryStock: new Map([ + [ + stockId, + { + productId: "p1", + variantId: "", + version: 3, + quantity: 10, + updatedAt: now, + }, + ], + ]), + }; + + const basePorts = portsFromState(state) as FinalizePaymentPorts & { + paymentAttempts: MemColl; + }; + const ports = { + ...basePorts, + paymentAttempts: withNthPutFailure(basePorts.paymentAttempts, 1), + } as FinalizePaymentPorts; + + const first = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(first).toMatchObject({ + kind: "api_error", + error: { code: "ORDER_STATE_CONFLICT" }, + }); + const pendingAttempt = await queryFinalizationStatus(basePorts, orderId, "stripe", extId); + expect(pendingAttempt).toMatchObject({ + receiptStatus: "pending", + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: false, + isReceiptProcessed: false, + resumeState: "pending_attempt", + }); + + const attemptBeforeRetry = await basePorts.paymentAttempts.get("pa_retry_attempt"); + expect(attemptBeforeRetry?.status).toBe("pending"); + const stockAfterFirst = await basePorts.inventoryStock.get(stockId); + expect(stockAfterFirst?.version).toBe(4); + expect(stockAfterFirst?.quantity).toBe(8); + + const receipt = await basePorts.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(receipt?.status).toBe("pending"); + + const second = await finalizePaymentFromWebhook(basePorts, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(second).toEqual({ kind: "completed", orderId }); + + const attemptAfterRetry = await basePorts.paymentAttempts.get("pa_retry_attempt"); + expect(attemptAfterRetry?.status).toBe("succeeded"); + const status = await queryFinalizationStatus(basePorts, orderId, "stripe", extId); + expect(status).toMatchObject({ + 
receiptStatus: "processed", + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: true, + resumeState: "replay_processed", + }); + + const ledger = await basePorts.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + }); + + it("completes on retry when final receipt processed write fails", async () => { + /** + * Everything succeeds (inventory, order→paid, payment attempt→succeeded) + * but the final `webhookReceipts.put(status: "processed")` throws. + * + * Receipt is left `pending`. On retry: order is already paid, inventory + * is already applied, attempt is already succeeded. Only the receipt + * write needs to complete. + */ + const orderId = "order_receipt_fail"; + const extId = "evt_receipt_fail"; + const state = { + orders: new Map([[orderId, baseOrder()]]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_rf", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [ + inventoryStockDocId("p1", ""), + { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }, + ], + ]), + }; + + const basePorts = portsFromState(state); + // The second webhookReceipts.put (status→processed) fails; the first + // (status→pending) must succeed so the receipt is left in pending state. + const ports = { + ...basePorts, + webhookReceipts: withNthPutFailure(asMemCollection(basePorts.webhookReceipts), 2), + }; + + // First attempt: throws when writing status→processed. + await expect( + finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }), + ).rejects.toThrow("simulated storage write failure"); + + // After first attempt: all side effects must be done except receipt→processed. 
+ const status = await queryFinalizationStatus(basePorts, orderId, "stripe", extId); + expect(status.isInventoryApplied).toBe(true); + expect(status.isOrderPaid).toBe(true); + expect(status.isPaymentAttemptSucceeded).toBe(true); + expect(status.isReceiptProcessed).toBe(false); // this is the unfinished bit + expect(status).toMatchObject({ + resumeState: "pending_receipt", + receiptStatus: "pending", + }); + + const pendingReceipt = await basePorts.webhookReceipts.get( + webhookReceiptDocId("stripe", extId), + ); + expect(pendingReceipt?.status).toBe("pending"); + + // Second attempt on basePorts: should complete just the receipt write. + const second = await finalizePaymentFromWebhook(basePorts, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + expect(second).toEqual({ kind: "completed", orderId }); + + const finalStatus = await queryFinalizationStatus(basePorts, orderId, "stripe", extId); + expect(finalStatus).toMatchObject({ + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: true, + receiptStatus: "processed", + resumeState: "replay_processed", + }); + }); + + it("reports event_unknown when order is fully settled but receipt row is missing", async () => { + const orderId = "order_event_unknown"; + const extId = "evt_missing_receipt"; + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + paymentPhase: "paid", + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_event_unknown", + { orderId, providerId: "stripe", status: "succeeded", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map([ + [ + "ledger_event_unknown", + { + productId: "p1", + variantId: "", + delta: -2, + referenceType: "order", + referenceId: orderId, + createdAt: now, + }, + ], + ]), + inventoryStock: new Map(), + }; + + const ports = portsFromState(state); + const status = await 
queryFinalizationStatus(ports, orderId, "stripe", extId); + expect(status).toMatchObject({ + receiptStatus: "missing", + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: false, + resumeState: "event_unknown", + }); + }); + + it("concurrent same-event finalize: preserves single terminal side effect and replay-safe follow-up", async () => { + /** + * Two concurrent deliveries of the same gateway event should converge on one + * terminalized payment state and remain replay-safe once finalized. + */ + const orderId = "order_concurrent"; + const extId = "evt_concurrent"; + const stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_concurrent", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const ports = portsFromState(state); + const logs = Array<{ level: "info" | "warn"; message: string; data?: unknown }>(); + const portsWithLogs = { + ...ports, + log: { + info: (message: string, data?: unknown) => logs.push({ level: "info", message, data }), + warn: (message: string, data?: unknown) => logs.push({ level: "warn", message, data }), + }, + }; + const input = { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }; + + const [r1, r2] = await Promise.all([ + finalizePaymentFromWebhook(portsWithLogs, input), + finalizePaymentFromWebhook(portsWithLogs, input), + ]); + + // In-process race windows may drive both through `proceed`; idempotent writes + // should still converge on one terminal state. 
+ expect(r1).toEqual({ kind: "completed", orderId }); + expect(r2).toEqual({ kind: "completed", orderId }); + + // Stock is decremented exactly once (idempotent overwrites, same values). + const finalStock = await ports.inventoryStock.get(stockDocId); + expect(finalStock?.version).toBe(4); + expect(finalStock?.quantity).toBe(8); // 10 - 2 + + // Ledger has exactly one entry (both wrote the same id). + const ledger = await ports.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + + const replay = await finalizePaymentFromWebhook(portsWithLogs, input); + expect(replay).toEqual({ kind: "replay", reason: "webhook_receipt_processed" }); + + const finalStatus = await queryFinalizationStatus(portsWithLogs, orderId, "stripe", extId); + expect(finalStatus).toMatchObject({ + receiptStatus: "processed", + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: true, + resumeState: "replay_processed", + }); + + expect(logs.some((entry) => entry.message === "commerce.finalize.inventory_reconcile")).toBe( + true, + ); + expect( + logs.some((entry) => entry.message === "commerce.finalize.payment_attempt_update_attempt"), + ).toBe(true); + expect(logs.some((entry) => entry.message === "commerce.finalize.completed")).toBe(true); + expect(logs.some((entry) => entry.message === "commerce.finalize.noop")).toBe(true); + }); + + it("claim-aware same-event concurrency: only one worker applies side effects", async () => { + const orderId = "order_claim_once"; + const extId = "evt_claim_once"; + const stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_claim_once", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: 
new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const basePorts = portsFromState(state); + const claimableReceipts = basePorts.webhookReceipts as MemColl<StoredWebhookReceipt>; + const ports = { + ...basePorts, + webhookReceipts: { + get: claimableReceipts.get.bind(claimableReceipts), + query: claimableReceipts.query.bind(claimableReceipts), + put: claimableReceipts.put.bind(claimableReceipts), + putIfAbsent: async (id: string, data: StoredWebhookReceipt): Promise<boolean> => { + if (claimableReceipts.rows.has(id)) return false; + await claimableReceipts.put(id, data); + return true; + }, + } as FinalizePaymentPorts["webhookReceipts"], + } as FinalizePaymentPorts; + const input = { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }; + + const [first, second] = await Promise.all([ + finalizePaymentFromWebhook(ports, input), + finalizePaymentFromWebhook(ports, input), + ]); + const outcomes = [first, second].map((result) => result.kind); + expect(outcomes).toContain("completed"); + expect(outcomes).toContain("replay"); + + const stock = await ports.inventoryStock.get(stockDocId); + expect(stock?.version).toBe(4); + expect(stock?.quantity).toBe(8); + + const ledger = await ports.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + + const receipt = await ports.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(receipt?.status).toBe("processed"); + }); + + it("does not steal a fresh claimed in-flight webhook receipt", async () => { + const orderId = "order_fresh_claim"; + const extId = "evt_fresh_claim"; + const stockDocId = inventoryStockDocId("p1", ""); + const freshClaimExpiresAt = "2026-04-02T12:00:30.000Z"; + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), +
], + ]), + webhookReceipts: new Map([ + [ + webhookReceiptDocId("stripe", extId), + { + providerId: "stripe", + externalEventId: extId, + orderId, + status: "pending", + correlationId: "cid", + createdAt: now, + updatedAt: now, + claimState: "claimed", + claimOwner: "other-worker", + claimToken: "other-token", + claimVersion: now, + claimExpiresAt: freshClaimExpiresAt, + }, + ], + ]), + paymentAttempts: new Map([ + [ + "pa_fresh_claim", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const basePorts = portsFromState(state); + const claimableReceipts = basePorts.webhookReceipts as MemColl<StoredWebhookReceipt>; + const ports = { + ...basePorts, + webhookReceipts: { + get: claimableReceipts.get.bind(claimableReceipts), + query: claimableReceipts.query.bind(claimableReceipts), + put: claimableReceipts.put.bind(claimableReceipts), + putIfAbsent: async (id: string, data: StoredWebhookReceipt): Promise<boolean> => { + if (claimableReceipts.rows.has(id)) return false; + await claimableReceipts.put(id, data); + return true; + }, + } as FinalizePaymentPorts["webhookReceipts"], + } as FinalizePaymentPorts; + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "replay", reason: "webhook_receipt_in_flight" }); + + const order = await ports.orders.get(orderId); + expect(order?.paymentPhase).toBe("payment_pending"); + const receipt = await ports.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(receipt?.claimState).toBe("claimed"); + }); + + it("reclaims a stale claimed webhook receipt and completes finalize", async () => { + const orderId = "order_stale_claim"; + const extId = "evt_stale_claim"; + const
stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map([ + [ + webhookReceiptDocId("stripe", extId), + { + providerId: "stripe", + externalEventId: extId, + orderId, + status: "pending", + correlationId: "cid", + createdAt: "2026-04-02T11:00:00.000Z", + updatedAt: now, + claimState: "claimed", + claimOwner: "other-worker", + claimToken: "other-token", + claimVersion: "2026-04-02T11:00:00.000Z", + claimExpiresAt: "2026-04-02T11:59:00.000Z", + }, + ], + ]), + paymentAttempts: new Map([ + [ + "pa_stale_claim", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const basePorts = portsFromState(state); + const ports = { + ...basePorts, + webhookReceipts: memCollWithPutIfAbsent( + basePorts.webhookReceipts as MemColl<StoredWebhookReceipt>, + ), + } as FinalizePaymentPorts; + + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "completed", orderId }); + const receipt = await ports.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(receipt?.status).toBe("processed"); + expect(receipt?.claimState).toBe("released"); + }); + + it("treats malformed claimExpiresAt as a replay-safe lease boundary", async () => { + const orderId = "order_strict_bad_claim_expires_at"; + const extId = "evt_strict_bad_claim_expires_at"; + const stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor:
500 }], + }), + ], + ]), + webhookReceipts: new Map([ + [ + webhookReceiptDocId("stripe", extId), + { + providerId: "stripe", + externalEventId: extId, + orderId, + status: "pending", + correlationId: "cid", + createdAt: now, + updatedAt: now, + claimState: "claimed", + claimOwner: "other-worker", + claimToken: "other-token", + claimVersion: now, + claimExpiresAt: "definitely-not-an-rfc3339-timestamp", + }, + ], + ]), + paymentAttempts: new Map([ + [ + "pa_strict_bad_claim_expires_at", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const basePorts = portsFromState(state); + const ports = { + ...basePorts, + webhookReceipts: memCollWithPutIfAbsent( + basePorts.webhookReceipts as MemColl<StoredWebhookReceipt>, + ), + } as FinalizePaymentPorts; + + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "replay", reason: "webhook_receipt_claim_retry_failed" }); + + const order = await basePorts.orders.get(orderId); + expect(order?.paymentPhase).toBe("payment_pending"); + const pa = await basePorts.paymentAttempts.get("pa_strict_bad_claim_expires_at"); + expect(pa?.status).toBe("pending"); + const stock = await basePorts.inventoryStock.get(stockDocId); + expect(stock?.quantity).toBe(10); + const ledger = await basePorts.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(0); + const receipt = await basePorts.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(receipt?.status).toBe("pending"); + expect(receipt?.claimState).toBe("claimed"); + }); + + it("aborts before side-effects when observed claim lease is already expired", async () => { + const orderId =
"order_claim_expired_while_inflight"; + const extId = "evt_claim_expired_while_inflight"; + const stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_claim_expired_while_inflight", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const basePorts = portsFromState(state); + const claimableReceipts = memCollWithPutIfAbsent( + basePorts.webhookReceipts as MemColl, + ); + const webhookRows = claimableReceipts.rows; + const ports = { + ...basePorts, + webhookReceipts: { + get: claimableReceipts.get.bind(claimableReceipts), + put: claimableReceipts.put.bind(claimableReceipts), + query: claimableReceipts.query.bind(claimableReceipts), + rows: webhookRows, + compareAndSwap: claimableReceipts.compareAndSwap.bind(claimableReceipts), + putIfAbsent: async (id: string, data: StoredWebhookReceipt): Promise => { + const inserted = await claimableReceipts.putIfAbsent(id, data); + if (inserted) { + const current = webhookRows.get(id); + if (current) { + webhookRows.set(id, { + ...current, + claimExpiresAt: "2026-04-02T11:00:00.000Z", + }); + } + } + return inserted; + }, + }, + } as FinalizePaymentPorts; + + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "replay", reason: "webhook_receipt_claim_retry_failed" }); + + const order = await basePorts.orders.get(orderId); + expect(order?.paymentPhase).toBe("payment_pending"); + const pa = await 
basePorts.paymentAttempts.get("pa_claim_expired_while_inflight"); + expect(pa?.status).toBe("pending"); + const stock = await basePorts.inventoryStock.get(stockDocId); + expect(stock?.quantity).toBe(10); + const ledger = await basePorts.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(0); + const receipt = await basePorts.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(receipt?.status).toBe("pending"); + expect(receipt?.claimExpiresAt).toBe("2026-04-02T11:00:00.000Z"); + }); + + it("aborts before order/payment writes when claim is stolen after inventory step", async () => { + const orderId = "order_claim_stolen_before_finalize_writes"; + const extId = "evt_claim_stolen_before_finalize_writes"; + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [], + totalMinor: 0, + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_claim_stolen_before_finalize_writes", + { + orderId, + providerId: "stripe", + status: "pending", + createdAt: now, + updatedAt: now, + }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map(), + }; + + const basePorts = portsFromState(state); + const webhookRows = basePorts.webhookReceipts.rows; + const webhookReceipts = memCollWithPutIfAbsent( + basePorts.webhookReceipts as MemColl<StoredWebhookReceipt>, + ); + const rid = webhookReceiptDocId("stripe", extId); + const ports = { + ...basePorts, + inventoryLedger: { + rows: basePorts.inventoryLedger.rows, + get: basePorts.inventoryLedger.get.bind(basePorts.inventoryLedger), + put: basePorts.inventoryLedger.put.bind(basePorts.inventoryLedger), + query: async (options: Parameters<MemColl<StoredInventoryLedgerEntry>["query"]>[0]) => { + const result = await basePorts.inventoryLedger.query(options); + stealWebhookClaim(webhookRows, rid); + return result; + }, + }, + webhookReceipts, + } as FinalizePaymentPorts; + + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId:
"cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "replay", reason: "webhook_receipt_in_flight" }); + + const order = await basePorts.orders.get(orderId); + expect(order?.paymentPhase).toBe("payment_pending"); + const pa = await basePorts.paymentAttempts.get("pa_claim_stolen_before_finalize_writes"); + expect(pa?.status).toBe("pending"); + const ledger = await basePorts.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(0); + const receipt = await basePorts.webhookReceipts.get(rid); + expect(receipt?.status).toBe("pending"); + expect(receipt?.claimOwner).toBe("other-worker"); + }); + + it("aborts before payment attempt update when claim is stolen during order write", async () => { + const orderId = "order_claim_stolen_during_order_write"; + const extId = "evt_claim_stolen_during_order_write"; + const stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_claim_stolen_during_order_write", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const basePorts = portsFromState(state); + const webhookRows = basePorts.webhookReceipts.rows; + const webhookReceipts = memCollWithPutIfAbsent( + basePorts.webhookReceipts as MemColl, + ); + const rid = webhookReceiptDocId("stripe", extId); + const ports = { + ...basePorts, + orders: { + rows: basePorts.orders.rows, + get: basePorts.orders.get.bind(basePorts.orders), + query: basePorts.orders.query.bind(basePorts.orders), + put: async (id: string, data: StoredOrder): Promise => { + stealWebhookClaim(webhookRows, rid); + await 
basePorts.orders.put(id, data); + }, + }, + webhookReceipts, + } as FinalizePaymentPorts; + + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "replay", reason: "webhook_receipt_in_flight" }); + + const order = await basePorts.orders.get(orderId); + expect(order?.paymentPhase).toBe("paid"); + const pa = await basePorts.paymentAttempts.get("pa_claim_stolen_during_order_write"); + expect(pa?.status).toBe("pending"); + const stock = await basePorts.inventoryStock.get(stockDocId); + expect(stock?.quantity).toBe(8); + const ledger = await basePorts.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + const receipt = await basePorts.webhookReceipts.get(rid); + expect(receipt?.status).toBe("pending"); + expect(receipt?.claimOwner).toBe("other-worker"); + }); + + it("aborts before processed receipt when claim is stolen during payment attempt write", async () => { + const orderId = "order_claim_stolen_during_attempt_write"; + const extId = "evt_claim_stolen_during_attempt_write"; + const stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_claim_stolen_during_attempt_write", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const basePorts = portsFromState(state); + const webhookRows = basePorts.webhookReceipts.rows; + const webhookReceipts = memCollWithPutIfAbsent( + basePorts.webhookReceipts as MemColl<StoredWebhookReceipt>, + ); + const rid =
webhookReceiptDocId("stripe", extId); + const ports = { + ...basePorts, + paymentAttempts: { + rows: basePorts.paymentAttempts.rows, + get: basePorts.paymentAttempts.get.bind(basePorts.paymentAttempts), + query: basePorts.paymentAttempts.query.bind(basePorts.paymentAttempts), + put: async (id: string, data: StoredPaymentAttempt): Promise => { + stealWebhookClaim(webhookRows, rid); + await basePorts.paymentAttempts.put(id, data); + }, + }, + webhookReceipts, + } as FinalizePaymentPorts; + + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "replay", reason: "webhook_receipt_in_flight" }); + + const order = await basePorts.orders.get(orderId); + expect(order?.paymentPhase).toBe("paid"); + const pa = await basePorts.paymentAttempts.get("pa_claim_stolen_during_attempt_write"); + expect(pa?.status).toBe("succeeded"); + const stock = await basePorts.inventoryStock.get(stockDocId); + expect(stock?.quantity).toBe(8); + const ledger = await basePorts.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + const receipt = await basePorts.webhookReceipts.get(rid); + expect(receipt?.status).toBe("pending"); + expect(receipt?.claimOwner).toBe("other-worker"); + }); + + it("aborts when another worker marks receipt processed during order write", async () => { + const orderId = "order_claim_processed_during_order_write"; + const extId = "evt_claim_processed_during_order_write"; + const stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_claim_processed_during_order_write", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now 
}, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const basePorts = portsFromState(state); + const webhookRows = basePorts.webhookReceipts.rows; + const webhookReceipts = memCollWithPutIfAbsent( + basePorts.webhookReceipts as MemColl<StoredWebhookReceipt>, + ); + const rid = webhookReceiptDocId("stripe", extId); + const ports = { + ...basePorts, + orders: { + rows: basePorts.orders.rows, + get: basePorts.orders.get.bind(basePorts.orders), + query: basePorts.orders.query.bind(basePorts.orders), + put: async (id: string, data: StoredOrder): Promise<void> => { + await basePorts.orders.put(id, data); + const current = webhookRows.get(rid); + if (current) { + webhookRows.set(rid, { + ...current, + status: "processed", + claimState: "released", + claimOwner: undefined, + claimToken: undefined, + claimVersion: undefined, + claimExpiresAt: undefined, + updatedAt: now, + }); + } + }, + }, + webhookReceipts, + } as FinalizePaymentPorts; + + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "replay", reason: "webhook_receipt_processed" }); + + const order = await basePorts.orders.get(orderId); + expect(order?.paymentPhase).toBe("paid"); + const pa = await basePorts.paymentAttempts.get("pa_claim_processed_during_order_write"); + expect(pa?.status).toBe("pending"); + const stock = await basePorts.inventoryStock.get(stockDocId); + expect(stock?.quantity).toBe(8); + const ledger = await basePorts.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + const receipt = await basePorts.webhookReceipts.get(rid); + expect(receipt?.status).toBe("processed"); + expect(receipt?.claimState).toBe("released"); + }); + + it("pending receipt with unparseable updatedAt is treated as stale claim and
finalizes", async () => { + const orderId = "order_bad_receipt_ts"; + const extId = "evt_bad_receipt_ts"; + const rid = webhookReceiptDocId("stripe", extId); + const stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 1, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map([ + [ + rid, + { + providerId: "stripe", + externalEventId: extId, + orderId, + status: "pending", + correlationId: "cid", + createdAt: now, + updatedAt: "not-an-iso-timestamp", + }, + ], + ]), + paymentAttempts: new Map([ + [ + "pa_bad_ts", + { orderId, providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const basePorts = portsFromState(state); + const ports = { + ...basePorts, + webhookReceipts: memCollWithPutIfAbsent( + basePorts.webhookReceipts as MemColl, + ), + } as FinalizePaymentPorts; + + const res = await finalizePaymentFromWebhook(ports, { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }); + + expect(res).toEqual({ kind: "completed", orderId }); + const receipt = await ports.webhookReceipts.get(rid); + expect(receipt?.status).toBe("processed"); + }); + + it("stress: many in-process duplicate same-event finalizations converge on one inventory result", async () => { + const orderId = "order_concurrent_many"; + const extId = "evt_concurrent_many"; + const stockDocId = inventoryStockDocId("p1", ""); + const state = { + orders: new Map([ + [ + orderId, + baseOrder({ + lineItems: [{ productId: "p1", quantity: 2, inventoryVersion: 3, unitPriceMinor: 500 }], + }), + ], + ]), + webhookReceipts: new Map(), + paymentAttempts: new Map([ + [ + "pa_concurrent_many", + { orderId, 
providerId: "stripe", status: "pending", createdAt: now, updatedAt: now }, + ], + ]), + inventoryLedger: new Map(), + inventoryStock: new Map([ + [stockDocId, { productId: "p1", variantId: "", version: 3, quantity: 10, updatedAt: now }], + ]), + }; + + const ports = portsFromState(state); + const input = { + orderId, + providerId: "stripe", + externalEventId: extId, + correlationId: "cid", + finalizeToken: FINALIZE_RAW, + nowIso: now, + }; + + const results = await Promise.all( + Array.from({ length: 8 }, (_index) => finalizePaymentFromWebhook(ports, input)), + ); + expect(results).toHaveLength(8); + for (const result of results) { + expect(result).toEqual({ kind: "completed", orderId }); + } + + const finalStock = await ports.inventoryStock.get(stockDocId); + expect(finalStock?.version).toBe(4); + expect(finalStock?.quantity).toBe(8); + + const ledger = await ports.inventoryLedger.query({ limit: 10 }); + expect(ledger.items).toHaveLength(1); + + const receipt = await ports.webhookReceipts.get(webhookReceiptDocId("stripe", extId)); + expect(receipt?.status).toBe("processed"); + }); +}); diff --git a/packages/plugins/commerce/src/orchestration/finalize-payment.ts b/packages/plugins/commerce/src/orchestration/finalize-payment.ts new file mode 100644 index 000000000..6b954d539 --- /dev/null +++ b/packages/plugins/commerce/src/orchestration/finalize-payment.ts @@ -0,0 +1,964 @@ +/** + * Storage-backed payment finalization (webhook path). + * + * Ordering follows architecture §20.5: claim a `webhookReceipts` row (`pending`) → + * preflight inventory + stock/ledger mutation + order status update. + * + * `decidePaymentFinalize` interprets the read model only; this module performs writes. + * + * **Concurrency:** `webhookReceipts.putIfAbsent` (when available) plus pending/fresh claim + * rules serialize same-event overlap; terminal receipt rows short-circuit losers without + * overwriting `processed`/`duplicate`/`error` state. 
+ */ + +import type { CommerceApiErrorInput } from "../kernel/api-errors.js"; +import type { CommerceErrorCode } from "../kernel/errors.js"; +import { decidePaymentFinalize, type WebhookReceiptView } from "../kernel/finalize-decision.js"; +import { equalSha256HexDigestAsync, sha256HexAsync } from "../lib/crypto-adapter.js"; +import type { + StoredInventoryLedgerEntry, + StoredInventoryStock, + StoredOrder, + StoredPaymentAttempt, + StoredWebhookReceipt, + WebhookReceiptErrorCode, + WebhookReceiptClaimState, +} from "../types.js"; +import { + InventoryFinalizeError, + applyInventoryForOrder, + inventoryStockDocId, + isTerminalInventoryFailure, + mapInventoryErrorToApiCode, +} from "./finalize-payment-inventory.js"; +import { + deriveFinalizationResumeState, + type FinalizationStatus, +} from "./finalize-payment-status.js"; + +type FinalizeQueryPage<T> = { + items: Array<{ id: string; data: T }>; + hasMore: boolean; + cursor?: string; +}; + +type FinalizeQueryOptions = { + where?: Record<string, unknown>; + limit?: number; + orderBy?: Partial<Record<string, "asc" | "desc">>; + cursor?: string; +}; + +export type FinalizeLogPort = { + info(message: string, data?: unknown): void; + warn(message: string, data?: unknown): void; +}; + +/** Narrow storage surface for tests and `ctx.storage` (structural match).
*/ +export type FinalizeCollection<T> = { + get(id: string): Promise<T | null>; + put(id: string, data: T): Promise<void>; + putIfAbsent?(id: string, data: T): Promise<boolean>; + compareAndSwap?(id: string, expectedVersion: string, data: T): Promise<boolean>; +}; + +export type QueryableCollection<T> = FinalizeCollection<T> & { + query(options?: FinalizeQueryOptions): Promise<FinalizeQueryPage<T>>; +}; + +export type FinalizePaymentAttemptCollection = FinalizeCollection<StoredPaymentAttempt> & { + query( + options?: FinalizeQueryOptions, + ): Promise<FinalizeQueryPage<StoredPaymentAttempt>>; +}; + +export type FinalizePaymentPorts = { + orders: FinalizeCollection<StoredOrder>; + webhookReceipts: FinalizeCollection<StoredWebhookReceipt>; + paymentAttempts: FinalizePaymentAttemptCollection; + inventoryLedger: QueryableCollection<StoredInventoryLedgerEntry>; + inventoryStock: QueryableCollection<StoredInventoryStock>; + log?: FinalizeLogPort; +}; + +const WEBHOOK_RECEIPT_CLAIM_LEASE_WINDOW_MS = 30_000; +const FINALIZE_INVARIANT_CHECKS = process.env.COMMERCE_ENABLE_FINALIZE_INVARIANT_CHECKS === "1"; +/** + * Canonical finalize control-flow now always uses strict lease semantics. + * `COMMERCE_USE_LEASED_FINALIZE` is retained for rollout evidence and + * operational command parity only. + */ + +export type FinalizeWebhookInput = { + orderId: string; + providerId: string; + externalEventId: string; + correlationId: string; + /** Required for all orders. */ + finalizeToken: string; + /** Inject clock in tests.
*/ + nowIso?: string; +}; + +export type FinalizeWebhookResult = + | { kind: "completed"; orderId: string } + | { kind: "replay"; reason: string } + | { kind: "api_error"; error: CommerceApiErrorInput }; + +type FinalizeFlowDecision = + | { kind: "noop"; result: FinalizeWebhookResult; reason: string } + | { kind: "invalid_token"; result: FinalizeWebhookResult } + | { kind: "proceed"; existingReceipt: StoredWebhookReceipt | null }; + +type FinalizeLogContext = { + orderId: string; + providerId: string; + externalEventId: string; + correlationId: string; +}; + +function buildFinalizeLogContext(input: FinalizeWebhookInput): FinalizeLogContext { + return { + orderId: input.orderId, + providerId: input.providerId, + externalEventId: input.externalEventId, + correlationId: input.correlationId, + }; +} + +/** Stable document id for a webhook receipt (primary-key dedupe per event). */ +export function webhookReceiptDocId(providerId: string, externalEventId: string): string { + return `wr:${encodeURIComponent(providerId)}:${encodeURIComponent(externalEventId)}`; +} + +export function receiptToView(stored: StoredWebhookReceipt | null): WebhookReceiptView { + if (!stored) return { exists: false }; + return { exists: true, status: stored.status }; +} + +function noopToResult( + decision: Extract<ReturnType<typeof decidePaymentFinalize>, { action: "noop" }>, + orderId: string, +): FinalizeWebhookResult { + if (decision.httpStatus === 200) { + return { kind: "replay", reason: decision.reason }; + } + return { + kind: "api_error", + error: { + code: "ORDER_STATE_CONFLICT", + message: noopConflictMessage(decision.reason), + details: { reason: decision.reason, orderId }, + }, + }; +} + +async function buildFinalizationDecision( + order: StoredOrder, + existingReceipt: StoredWebhookReceipt | null, + correlationId: string, + orderId: string, + inputFinalizeToken: string | undefined, +): Promise<FinalizeFlowDecision> { + const decision = decidePaymentFinalize({ + orderStatus: order.paymentPhase, + receipt: receiptToView(existingReceipt), +
correlationId, + }); + if (decision.action === "noop") { + return { kind: "noop", result: noopToResult(decision, orderId), reason: decision.reason }; + } + const tokenErr = await verifyFinalizeToken(order, inputFinalizeToken); + if (tokenErr) { + return { kind: "invalid_token", result: tokenErr }; + } + return { kind: "proceed", existingReceipt }; +} + +/** + * A receipt in `pending` status means finalization has started but may not be + * complete. Specifically: + * - inventory may or may not have been applied + * - order phase may or may not have been set to `paid` + * - payment attempt may or may not have been marked `succeeded` + * + * `pending` is the "retry me" signal — not a terminal state. The next call to + * `finalizePaymentFromWebhook` for the same event will resume from wherever the + * previous attempt stopped. + * + * Terminal receipt states: + * - `processed` — all side effects completed successfully + * - `error` — a non-retryable failure was recorded; do not auto-replay + * - `duplicate` — event is a known redundant delivery; treat as replay + */ +function createPendingReceipt( + input: FinalizeWebhookInput, + existingReceipt: StoredWebhookReceipt | null, + nowIso: string, + claimState?: WebhookReceiptClaimState, + claimVersion?: string, + claimOwner?: string, + claimToken?: string, + claimExpiresAt?: string, +): StoredWebhookReceipt { + return { + providerId: input.providerId, + externalEventId: input.externalEventId, + orderId: input.orderId, + status: "pending", + correlationId: input.correlationId, + createdAt: existingReceipt?.createdAt ?? 
nowIso, + updatedAt: nowIso, + claimState, + claimVersion, + claimOwner, + claimToken, + claimExpiresAt, + }; +} + +type ClaimWebhookReceiptResult = + | { kind: "acquired"; persisted: boolean; receipt: StoredWebhookReceipt } + | { kind: "replay"; result: FinalizeWebhookResult }; + +function createClaimContext(nowIso: string): { + claimOwner: string; + claimToken: string; + claimVersion: string; + claimExpiresAt: string; +} { + const claimToken = + typeof globalThis.crypto?.randomUUID === "function" + ? globalThis.crypto.randomUUID() + : `${Date.now().toString(36)}-${Math.random().toString(16).slice(2, 10)}`; + const nowMs = Date.parse(nowIso); + const claimExpiresAt = Number.isFinite(nowMs) + ? new Date(nowMs + WEBHOOK_RECEIPT_CLAIM_LEASE_WINDOW_MS).toISOString() + : nowIso; + + return { + claimOwner: `worker:${claimToken}`, + claimToken, + claimVersion: nowIso, + claimExpiresAt, + }; +} + +function parseClaimTimestampMs(timestamp: string | undefined): number | null { + const value = Date.parse(timestamp ?? ""); + return Number.isFinite(value) ? 
value : null; +} + +function isClaimLeaseExpired(claimExpiresAt: string | undefined, nowIso: string): boolean { + const nowMs = parseClaimTimestampMs(nowIso); + const expiresMs = parseClaimTimestampMs(claimExpiresAt); + if (nowMs === null || expiresMs === null) return true; + return nowMs > expiresMs; +} + +function canTakeClaim( + existing: StoredWebhookReceipt, + nowIso: string, +): { canTake: boolean; reason: FinalizeWebhookResult } { + switch (existing.claimState) { + case "claimed": { + const nowMs = parseClaimTimestampMs(nowIso); + const expiresMs = parseClaimTimestampMs(existing.claimExpiresAt); + if (nowMs === null || expiresMs === null) { + return { + canTake: false, + reason: { kind: "replay", reason: "webhook_receipt_claim_retry_failed" }, + }; + } + const isInFlight = nowMs <= expiresMs; + if (isInFlight) { + return { canTake: false, reason: { kind: "replay", reason: "webhook_receipt_in_flight" } }; + } + return { + canTake: true, + reason: { kind: "replay", reason: "webhook_receipt_claim_retry_failed" }, + }; + } + case "unclaimed": + case "released": + default: + return { + canTake: true, + reason: { kind: "replay", reason: "webhook_receipt_claim_retry_failed" }, + }; + } +} + +function withClaimedMetadata( + receipt: StoredWebhookReceipt, + claimContext: ReturnType<typeof createClaimContext>, + expectedVersion: string, + nowIso: string, +): StoredWebhookReceipt { + return { + ...receipt, + claimState: "claimed", + claimOwner: claimContext.claimOwner, + claimToken: claimContext.claimToken, + claimVersion: expectedVersion, + claimExpiresAt: claimContext.claimExpiresAt, + updatedAt: nowIso, + }; +} + +async function claimWebhookReceipt({ + ports, + receiptId, + receipt, + nowIso, +}: { + ports: FinalizePaymentPorts; + receiptId: string; + receipt: StoredWebhookReceipt; + nowIso: string; +}): Promise<ClaimWebhookReceiptResult> { + if (!ports.webhookReceipts.putIfAbsent) { + return { kind: "acquired", persisted: false, receipt }; + } + + const claimContext = createClaimContext(nowIso); + const stagedReceipt
= createPendingReceipt( + { + orderId: receipt.orderId, + providerId: receipt.providerId, + externalEventId: receipt.externalEventId, + correlationId: receipt.correlationId ?? "", + finalizeToken: "", + }, + receipt, + nowIso, + "claimed", + claimContext.claimVersion, + claimContext.claimOwner, + claimContext.claimToken, + claimContext.claimExpiresAt, + ); + + const claimedNow = await ports.webhookReceipts.putIfAbsent(receiptId, stagedReceipt); + if (claimedNow) { + return { kind: "acquired", persisted: true, receipt: stagedReceipt }; + } + + const existing = await ports.webhookReceipts.get(receiptId); + if (!existing) { + const replayInsert = await ports.webhookReceipts.putIfAbsent(receiptId, stagedReceipt); + if (replayInsert) return { kind: "acquired", persisted: true, receipt: stagedReceipt }; + return { + kind: "replay", + result: { kind: "replay", reason: "webhook_receipt_claim_retry_failed" }, + }; + } + + if (existing.status === "processed") { + return { kind: "replay", result: { kind: "replay", reason: "webhook_receipt_processed" } }; + } + if (existing.status === "duplicate") { + return { kind: "replay", result: { kind: "replay", reason: "webhook_receipt_duplicate" } }; + } + if (existing.status === "error") { + return { kind: "replay", result: { kind: "replay", reason: "webhook_error" } }; + } + + const { canTake, reason } = canTakeClaim(existing, nowIso); + if (!canTake) { + return { + kind: "replay", + result: reason, + }; + } + + const claimedExistingReceipt = withClaimedMetadata( + existing, + claimContext, + existing.updatedAt, + nowIso, + ); + if (!ports.webhookReceipts.compareAndSwap) { + return { kind: "acquired", persisted: false, receipt: claimedExistingReceipt }; + } + + const stolen = await ports.webhookReceipts.compareAndSwap( + receiptId, + existing.updatedAt, + claimedExistingReceipt, + ); + if (!stolen) { + return { + kind: "replay", + result: { kind: "replay", reason: "webhook_receipt_claim_retry_failed" }, + }; + } + + return { kind: 
"acquired", persisted: true, receipt: claimedExistingReceipt }; +} + +function noopConflictMessage(reason: string): string { + switch (reason) { + case "webhook_pending": + return "Webhook receipt is still pending processing"; + case "webhook_error": + return "Webhook receipt is in a terminal error state"; + case "order_not_finalizable": + return "Order is not in a finalizable payment state"; + default: + return "Finalize could not proceed"; + } +} + +async function verifyFinalizeToken( + order: StoredOrder, + token: string | undefined, +): Promise<FinalizeWebhookResult | null> { + const expected = order.finalizeTokenHash; + if (!token) { + return { + kind: "api_error", + error: { + code: "ORDER_TOKEN_REQUIRED", + message: "finalizeToken is required to finalize this order", + }, + }; + } + const digest = await sha256HexAsync(token); + if (!(await equalSha256HexDigestAsync(digest, expected))) { + return { + kind: "api_error", + error: { + code: "ORDER_TOKEN_INVALID", + message: "Invalid finalize token for this order", + }, + }; + } + return null; +} + +async function persistReceiptStatus( + ports: FinalizePaymentPorts, + receiptId: string, + receipt: StoredWebhookReceipt, + status: StoredWebhookReceipt["status"], + nowIso: string, + errorCode?: StoredWebhookReceipt["errorCode"], + errorDetails?: Record<string, unknown>, +): Promise<void> { + const isTerminal = status === "processed" || status === "duplicate" || status === "error"; + await ports.webhookReceipts.put(receiptId, { + ...receipt, + status, + errorCode: status === "error" ? errorCode : undefined, + errorDetails: status === "error" ? (errorDetails ?? receipt.errorDetails) : undefined, + claimState: isTerminal ? "released" : receipt.claimState, + claimOwner: isTerminal ? undefined : receipt.claimOwner, + claimToken: isTerminal ? undefined : receipt.claimToken, + claimExpiresAt: isTerminal ? undefined : receipt.claimExpiresAt, + claimVersion: isTerminal ?
undefined : receipt.claimVersion, + updatedAt: nowIso, + }); +} + +function getActiveClaim( + receipt: StoredWebhookReceipt, +): { + claimOwner: string; + claimToken: string; + claimVersion: string; + claimExpiresAt?: string; +} | null { + if ( + receipt.claimState !== "claimed" || + !receipt.claimOwner || + !receipt.claimToken || + !receipt.claimVersion + ) { + return null; + } + + return { + claimOwner: receipt.claimOwner, + claimToken: receipt.claimToken, + claimVersion: receipt.claimVersion, + claimExpiresAt: receipt.claimExpiresAt, + }; +} + +async function assertClaimStillActive( + ports: FinalizePaymentPorts, + receiptId: string, + claimedReceipt: StoredWebhookReceipt, + nowIso: string, +): Promise<FinalizeWebhookResult | null> { + const activeClaim = getActiveClaim(claimedReceipt); + if (!activeClaim) return null; + + const liveReceipt = await ports.webhookReceipts.get(receiptId); + if (!liveReceipt) { + return { kind: "replay", reason: "webhook_receipt_claim_retry_failed" }; + } + + if (liveReceipt.status === "processed") { + return { kind: "replay", reason: "webhook_receipt_processed" }; + } + if (liveReceipt.status === "duplicate") { + return { kind: "replay", reason: "webhook_receipt_duplicate" }; + } + if (liveReceipt.status === "error") { + return { kind: "replay", reason: "webhook_error" }; + } + + if (liveReceipt.claimState !== "claimed") { + return { kind: "replay", reason: "webhook_receipt_in_flight" }; + } + + if ( + liveReceipt.claimOwner !== activeClaim.claimOwner || + liveReceipt.claimToken !== activeClaim.claimToken || + liveReceipt.claimVersion !== activeClaim.claimVersion + ) { + return { kind: "replay", reason: "webhook_receipt_in_flight" }; + } + + if (isClaimLeaseExpired(liveReceipt.claimExpiresAt, nowIso)) { + return { kind: "replay", reason: "webhook_receipt_claim_retry_failed" }; + } + + return null; +} + +function mapInventoryFinalizeErrorToReceiptCode(code: CommerceErrorCode): WebhookReceiptErrorCode { + if (code === "PRODUCT_UNAVAILABLE") return
"PRODUCT_UNAVAILABLE"; + if (code === "INSUFFICIENT_STOCK") return "INSUFFICIENT_STOCK"; + if (code === "INVENTORY_CHANGED") return "INVENTORY_CHANGED"; + if (code === "ORDER_STATE_CONFLICT") return "ORDER_STATE_CONFLICT"; + return "ORDER_STATE_CONFLICT"; +} + +async function markPaymentAttemptSucceeded( + ports: FinalizePaymentPorts, + orderId: string, + providerId: string, + nowIso: string, +): Promise<void> { + const res = await ports.paymentAttempts.query({ + where: { orderId, providerId, status: "pending" }, + orderBy: { createdAt: "asc" }, + limit: 1, + }); + const match = res.items[0]; + if (!match) return; + + const next: StoredPaymentAttempt = { + ...match.data, + status: "succeeded", + updatedAt: nowIso, + }; + await ports.paymentAttempts.put(match.id, next); +} + +/** + * Finalization state transitions — what each combination means for retry: + * + * | Receipt | Order phase | Interpretation | + * |-------------|-------------------|---------------------------------------| + * | (none) | payment_pending | Nothing written; safe to start fresh | + * | pending | payment_pending | Partial progress; resume from here | + * | pending | paid | Last write (receipt→processed) failed | + * | processed | paid | Replay; all side effects complete | + * | error | any | Terminal; do not auto-retry | + * | duplicate | any | Replay; redundant delivery | + * + * Cross-worker concurrency caveat: + * if a process stalls while processing an event (for longer than the claim window), + * another worker may start and replay this event. The claim window keeps overlap low, + * and idempotent writes keep the path safe if this still happens. + * + * A `pending` receipt means the current node claimed this event and something + * failed partway through.
This function handles all partial-success sub-cases: + * - inventory ledger written, stock write incomplete → reconcile pass + * - inventory done, order.put failed → skip inventory, retry order + * - order paid, attempt update failed → skip both, retry attempt + * - everything done except receipt→processed → skip all writes, mark processed + * When inventory preconditions are permanently invalid (missing stock, + * insufficient stock, or stale version snapshot), the receipt transitions to + * `error` so retries do not replay known terminal conflicts. + */ +/** + * Single authoritative finalize entry for gateway webhooks (Stripe first). + */ +export async function finalizePaymentFromWebhook( + ports: FinalizePaymentPorts, + input: FinalizeWebhookInput, +): Promise<FinalizeWebhookResult> { + const nowIso = input.nowIso ?? new Date().toISOString(); + const logContext = buildFinalizeLogContext(input); + const receiptId = webhookReceiptDocId(input.providerId, input.externalEventId); + + const order = await ports.orders.get(input.orderId); + if (!order) { + ports.log?.warn("commerce.finalize.order_not_found", { + ...logContext, + stage: "initial_lookup", + }); + return { + kind: "api_error", + error: { code: "ORDER_NOT_FOUND", message: "Order not found" }, + }; + } + + const existingReceipt = await ports.webhookReceipts.get(receiptId); + const decision = await buildFinalizationDecision( + order, + existingReceipt, + input.correlationId, + input.orderId, + input.finalizeToken, + ); + switch (decision.kind) { + case "noop": + ports.log?.info("commerce.finalize.noop", { + ...logContext, + reason: decision.reason, + }); + return decision.result; + case "invalid_token": + ports.log?.warn("commerce.finalize.token_rejected", logContext); + return decision.result; + case "proceed": + break; + default: + break; + } + + const stagedReceipt = createPendingReceipt(input, decision.existingReceipt, nowIso); + const claim = await claimWebhookReceipt({ + ports, + receiptId, + receipt: stagedReceipt, + nowIso,
+ }); + if (claim.kind === "replay") { + ports.log?.info("commerce.finalize.noop", { + ...logContext, + reason: "webhook_receipt_claim_in_flight", + }); + return claim.result; + } + + const pendingReceipt = claim.receipt; + if (!claim.persisted) { + await ports.webhookReceipts.put(receiptId, pendingReceipt); + } + ports.log?.info("commerce.finalize.receipt_pending", { + ...logContext, + stage: "pending_receipt_written", + priorReceiptStatus: decision.existingReceipt?.status, + }); + { + const claimCheck = await assertClaimStillActive(ports, receiptId, pendingReceipt, nowIso); + if (claimCheck) return claimCheck; + } + + const freshOrder = await ports.orders.get(input.orderId); + if (!freshOrder) { + ports.log?.warn("commerce.finalize.order_not_found", { + ...logContext, + stage: "post_pending_lookup", + }); + + /** + * Operational meaning of `error` today: + * order row disappeared while finalization was running. + * Treat as terminal and escalate rather than auto-retrying indefinitely. + */ + { + const claimCheck = await assertClaimStillActive(ports, receiptId, pendingReceipt, nowIso); + if (claimCheck) return claimCheck; + } + await persistReceiptStatus( + ports, + receiptId, + pendingReceipt, + "error", + nowIso, + "ORDER_NOT_FOUND", + { orderId: input.orderId, correlationId: input.correlationId }, + ); + return { + kind: "api_error", + error: { code: "ORDER_NOT_FOUND", message: "Order not found" }, + }; + } + + const shouldApplyInventory = freshOrder.paymentPhase !== "paid"; + if (shouldApplyInventory) { + if (freshOrder.paymentPhase !== "payment_pending" && freshOrder.paymentPhase !== "authorized") { + ports.log?.warn("commerce.finalize.order_not_finalizable", { + ...logContext, + paymentPhase: freshOrder.paymentPhase, + }); + /** + * Order moved to a non-finalizable phase between the initial read and + * the pending-receipt write (e.g. concurrent finalize completed first). 
+ * Mark the receipt `error` so it does not stay stuck in `pending` + * and operators get a clear terminal signal. + */ + { + const claimCheck = await assertClaimStillActive(ports, receiptId, pendingReceipt, nowIso); + if (claimCheck) return claimCheck; + } + await persistReceiptStatus( + ports, + receiptId, + pendingReceipt, + "error", + nowIso, + "ORDER_STATE_CONFLICT", + { paymentPhase: freshOrder.paymentPhase }, + ); + return { + kind: "api_error", + error: { + code: "ORDER_STATE_CONFLICT", + message: "Order is not in a finalizable payment state", + details: { paymentPhase: freshOrder.paymentPhase }, + }, + }; + } + + try { + { + const claimCheck = await assertClaimStillActive(ports, receiptId, pendingReceipt, nowIso); + if (claimCheck) return claimCheck; + } + ports.log?.info("commerce.finalize.inventory_reconcile", { + ...logContext, + paymentPhase: freshOrder.paymentPhase, + }); + await applyInventoryForOrder(ports, freshOrder, input.orderId, nowIso); + { + const claimCheck = await assertClaimStillActive(ports, receiptId, pendingReceipt, nowIso); + if (claimCheck) return claimCheck; + } + ports.log?.info("commerce.finalize.inventory_applied", { + ...logContext, + orderId: input.orderId, + }); + } catch (err) { + if (err instanceof InventoryFinalizeError) { + const apiCode = mapInventoryErrorToApiCode(err.code); + if (isTerminalInventoryFailure(err.code)) { + ports.log?.warn("commerce.finalize.inventory_failed_terminal", { + ...logContext, + code: apiCode, + details: err.details, + }); + { + const claimCheck = await assertClaimStillActive( + ports, + receiptId, + pendingReceipt, + nowIso, + ); + if (claimCheck) return claimCheck; + } + await persistReceiptStatus( + ports, + receiptId, + pendingReceipt, + "error", + nowIso, + mapInventoryFinalizeErrorToReceiptCode(err.code), + { + ...err.details, + inventoryErrorCode: err.code, + commerceErrorCode: apiCode, + }, + ); + } else { + ports.log?.warn("commerce.finalize.inventory_failed", { + ...logContext, + code: 
apiCode, + details: err.details, + }); + } + return { + kind: "api_error", + error: { + code: apiCode, + message: err.message, + details: err.details, + }, + }; + } + throw err; + } + } + + if (freshOrder.paymentPhase !== "paid") { + ports.log?.info("commerce.finalize.order_settlement_attempt", { + ...logContext, + orderId: input.orderId, + paymentPhase: freshOrder.paymentPhase, + }); + const paidOrder: StoredOrder = { + ...freshOrder, + paymentPhase: "paid", + updatedAt: nowIso, + }; + try { + { + const claimCheck = await assertClaimStillActive(ports, receiptId, pendingReceipt, nowIso); + if (claimCheck) return claimCheck; + } + await ports.orders.put(input.orderId, paidOrder); + } catch (err) { + ports.log?.warn("commerce.finalize.order_update_failed", { + ...logContext, + details: err instanceof Error ? err.message : String(err), + }); + return { + kind: "api_error", + error: { + code: "ORDER_STATE_CONFLICT", + message: "Failed to persist order finalization", + details: { orderId: input.orderId }, + }, + }; + } + } + + try { + { + const claimCheck = await assertClaimStillActive(ports, receiptId, pendingReceipt, nowIso); + if (claimCheck) return claimCheck; + } + ports.log?.info("commerce.finalize.payment_attempt_update_attempt", { + ...logContext, + orderId: input.orderId, + providerId: input.providerId, + }); + await markPaymentAttemptSucceeded(ports, input.orderId, input.providerId, nowIso); + { + const claimCheck = await assertClaimStillActive(ports, receiptId, pendingReceipt, nowIso); + if (claimCheck) return claimCheck; + } + } catch (err) { + ports.log?.warn("commerce.finalize.attempt_update_failed", { + ...logContext, + details: err instanceof Error ? err.message : String(err), + }); + return { + kind: "api_error", + error: { + code: "ORDER_STATE_CONFLICT", + message: "Failed to persist payment attempt finalization", + details: { orderId: input.orderId }, + }, + }; + } + + /** + * Intentionally let this fail loudly. 
+ * All prior side effects are persisted; with `pendingReceipt` + resume logic, + * retry is safe and expected to complete this final write. + */ + try { + { + const claimCheck = await assertClaimStillActive(ports, receiptId, pendingReceipt, nowIso); + if (claimCheck) return claimCheck; + } + ports.log?.info("commerce.finalize.receipt_processed", { + ...logContext, + stage: "finalize", + }); + await persistReceiptStatus(ports, receiptId, pendingReceipt, "processed", nowIso); + } catch (err) { + ports.log?.warn("commerce.finalize.receipt_processed_write_failed", { + ...logContext, + details: err instanceof Error ? err.message : String(err), + }); + throw err; + } + + ports.log?.info("commerce.finalize.completed", { + ...logContext, + stage: "completed", + }); + + if (FINALIZE_INVARIANT_CHECKS) { + await validateFinalizationInvariants(ports, input, logContext); + } + + return { kind: "completed", orderId: input.orderId }; +} + +export async function queryFinalizationStatus( + ports: FinalizePaymentPorts, + orderId: string, + providerId: string, + externalEventId: string, +): Promise<FinalizationStatus> { + const receiptId = webhookReceiptDocId(providerId, externalEventId); + const [order, receipt, ledgerPage, attemptPage] = await Promise.all([ + ports.orders.get(orderId), + ports.webhookReceipts.get(receiptId), + ports.inventoryLedger.query({ + where: { referenceType: "order", referenceId: orderId }, + limit: 1, + }), + ports.paymentAttempts.query({ where: { orderId, providerId, status: "succeeded" }, limit: 1 }), + ]); + const status: FinalizationStatus = { + receiptStatus: receipt?.status ??
"missing", + isInventoryApplied: ledgerPage.items.length > 0, + isOrderPaid: order?.paymentPhase === "paid", + isPaymentAttemptSucceeded: attemptPage.items.length > 0, + isReceiptProcessed: receipt?.status === "processed", + receiptErrorCode: receipt?.errorCode, + resumeState: "not_started", + }; + status.resumeState = deriveFinalizationResumeState(status); + return status; +} + +async function validateFinalizationInvariants( + ports: FinalizePaymentPorts, + input: FinalizeWebhookInput, + logContext: FinalizeLogContext, +): Promise<void> { + const status = await queryFinalizationStatus( + ports, + input.orderId, + input.providerId, + input.externalEventId, + ); + if (!status.isOrderPaid) { + ports.log?.warn("commerce.finalize.invariant_failed", { + ...logContext, + reason: "order_not_paid_after_complete", + resumeState: status.resumeState, + }); + } + if (!status.isPaymentAttemptSucceeded) { + ports.log?.warn("commerce.finalize.invariant_failed", { + ...logContext, + reason: "payment_attempt_not_succeeded_after_complete", + resumeState: status.resumeState, + }); + } + if (!status.isInventoryApplied) { + ports.log?.warn("commerce.finalize.invariant_failed", { + ...logContext, + reason: "inventory_not_applied_after_complete", + resumeState: status.resumeState, + }); + } +} + +export type { FinalizationStatus } from "./finalize-payment-status.js"; +export { deriveFinalizationResumeState } from "./finalize-payment-status.js"; +export { inventoryStockDocId } from "./finalize-payment-inventory.js"; diff --git a/packages/plugins/commerce/src/route-errors.ts b/packages/plugins/commerce/src/route-errors.ts new file mode 100644 index 000000000..f4ad22b0d --- /dev/null +++ b/packages/plugins/commerce/src/route-errors.ts @@ -0,0 +1,15 @@ +/** + * Bridge kernel {@link toCommerceApiError} to {@link PluginRouteError} for route handlers.
+ */ + +import { PluginRouteError } from "emdash"; + +import { toCommerceApiError, type CommerceApiErrorInput } from "./kernel/api-errors.js"; + +export function throwCommerceApiError(input: CommerceApiErrorInput): never { + const e = toCommerceApiError(input); + throw new PluginRouteError(e.code, e.message, e.httpStatus, { + retryable: e.retryable, + details: e.details, + }); +} diff --git a/packages/plugins/commerce/src/schemas.ts b/packages/plugins/commerce/src/schemas.ts new file mode 100644 index 000000000..183ea8316 --- /dev/null +++ b/packages/plugins/commerce/src/schemas.ts @@ -0,0 +1,494 @@ +/** + * Zod input validation for commerce plugin routes. + */ + +import { z } from "astro/zod"; + +import { COMMERCE_LIMITS } from "./kernel/limits.js"; + +const bounded = (max: number) => z.string().min(1).max(max); +type BundleDiscountType = "none" | "fixed_amount" | "percentage"; +type BundleDiscountInput = { + bundleDiscountType?: BundleDiscountType; + bundleDiscountValueMinor?: number; + bundleDiscountValueBps?: number; +}; + +function addBundleDiscountIssue(ctx: z.RefinementCtx, message: string, path: string[]): void { + ctx.addIssue({ + code: z.ZodIssueCode.custom, + message, + path, + }); +} + +function validateBundleDiscountForProductType( + ctx: z.RefinementCtx, + productType: "simple" | "variable" | "bundle", + input: BundleDiscountInput, +): void { + const hasDiscountType = input.bundleDiscountType !== undefined; + const hasFixedAmountValue = input.bundleDiscountValueMinor !== undefined; + const hasBpsValue = input.bundleDiscountValueBps !== undefined; + const discountType: BundleDiscountType = input.bundleDiscountType ?? 
"none"; + + if (productType !== "bundle") { + if (hasDiscountType || hasFixedAmountValue || hasBpsValue) { + addBundleDiscountIssue(ctx, "Bundle discount fields are only supported for bundle products", [ + "bundleDiscountType", + ]); + } + return; + } + + if (discountType === "fixed_amount" && hasBpsValue) { + addBundleDiscountIssue(ctx, "bundleDiscountValueBps can only be used with percentage bundles", [ + "bundleDiscountValueBps", + ]); + return; + } + + if (discountType === "percentage" && hasFixedAmountValue) { + addBundleDiscountIssue( + ctx, + "bundleDiscountValueMinor can only be used with fixed-amount bundles", + ["bundleDiscountValueMinor"], + ); + return; + } + + if (discountType === "none" && (hasFixedAmountValue || hasBpsValue)) { + addBundleDiscountIssue(ctx, "Bundle discount values cannot be set when discount type is none", [ + "bundleDiscountValueMinor", + "bundleDiscountValueBps", + ]); + } +} + +function validateBundleDiscountPatchShape(ctx: z.RefinementCtx, input: BundleDiscountInput): void { + const hasFixedAmountValue = input.bundleDiscountValueMinor !== undefined; + const hasBpsValue = input.bundleDiscountValueBps !== undefined; + + if (input.bundleDiscountType === "fixed_amount" && hasBpsValue) { + addBundleDiscountIssue(ctx, "bundleDiscountValueBps can only be used with percentage bundles", [ + "bundleDiscountValueBps", + ]); + } + + if (input.bundleDiscountType === "percentage" && hasFixedAmountValue) { + addBundleDiscountIssue( + ctx, + "bundleDiscountValueMinor can only be used with fixed-amount bundles", + ["bundleDiscountValueMinor"], + ); + } +} + +/** + * Shared cart line item fragment — same invariants enforced at cart boundary + * and re-checked at checkout (defence in depth, not duplication). 
+ */ +export const cartLineItemSchema = z.object({ + productId: bounded(COMMERCE_LIMITS.maxWebhookFieldLength), + variantId: z.string().min(0).max(COMMERCE_LIMITS.maxWebhookFieldLength).optional(), + quantity: z + .number() + .int() + .min(1, "Quantity must be at least 1") + .max( + COMMERCE_LIMITS.maxLineItemQty, + `Quantity must not exceed ${COMMERCE_LIMITS.maxLineItemQty}`, + ), + /** + * Snapshot of the inventory version at the time the item was added to the cart. + * Used for optimistic concurrency during finalize. + */ + inventoryVersion: z.number().int().min(0, "Inventory version must be a non-negative integer"), + /** Price in the smallest currency unit (e.g. cents). Must be non-negative. */ + unitPriceMinor: z.number().int().min(0, "Unit price must be a non-negative integer"), +}); + +export type CartLineItemInput = z.infer<typeof cartLineItemSchema>; + +export const cartUpsertInputSchema = z.object({ + cartId: bounded(COMMERCE_LIMITS.maxWebhookFieldLength), + currency: z.string().min(3).max(3).toUpperCase(), + lineItems: z + .array(cartLineItemSchema) + .min(0) + .max( + COMMERCE_LIMITS.maxCartLineItems, + `Cart must not exceed ${COMMERCE_LIMITS.maxCartLineItems} line items`, + ), + /** + * Required when mutating an existing cart. + * Absent on first creation — the server issues a fresh token and returns it once. + */ + ownerToken: z.string().min(16).max(256).optional(), +}); + +export type CartUpsertInput = z.infer<typeof cartUpsertInputSchema>; + +export const cartGetInputSchema = z.object({ + cartId: bounded(COMMERCE_LIMITS.maxWebhookFieldLength), + /** + * Required to prove ownership for reads. + */ + ownerToken: z.string().min(16).max(256), +}); + +export type CartGetInput = z.infer<typeof cartGetInputSchema>; + +export const checkoutInputSchema = z.object({ + cartId: bounded(COMMERCE_LIMITS.maxWebhookFieldLength), + /** Optional when `Idempotency-Key` header is set. */ + idempotencyKey: z.string().optional(), + /** + * Required for checkout to verify cart ownership.
+ */ + ownerToken: z.string().min(16).max(256), +}); + +export type CheckoutInput = z.infer<typeof checkoutInputSchema>; + +/** + * Possession proof for order read: must match checkout's `finalizeToken` for this `orderId`. + */ +export const checkoutGetOrderInputSchema = z.object({ + orderId: bounded(COMMERCE_LIMITS.maxWebhookFieldLength), + finalizeToken: z.string().min(16).max(256), +}); + +export type CheckoutGetOrderInput = z.infer<typeof checkoutGetOrderInputSchema>; + +const stripeWebhookEventDataSchema = z.object({ + id: bounded(COMMERCE_LIMITS.maxWebhookFieldLength), + type: z.string().min(1).max(128), + data: z.object({ + object: z.object({ + id: z.string().min(1).max(COMMERCE_LIMITS.maxWebhookFieldLength).optional(), + metadata: z + .record(z.string(), z.string().max(COMMERCE_LIMITS.maxWebhookFieldLength)) + .optional(), + }), + }), +}); + +const stripeWebhookEventInputSchema = stripeWebhookEventDataSchema; + +export const stripeWebhookInputSchema = stripeWebhookEventInputSchema; + +export type StripeWebhookInput = z.infer<typeof stripeWebhookInputSchema>; +export type StripeWebhookEventInput = z.infer<typeof stripeWebhookEventInputSchema>; + +export const recommendationsInputSchema = z.object({ + /** Hint for “similar to this product” (catalog id).
*/
+  productId: bounded(COMMERCE_LIMITS.maxWebhookFieldLength).optional(),
+  variantId: bounded(COMMERCE_LIMITS.maxWebhookFieldLength).optional(),
+  cartId: bounded(COMMERCE_LIMITS.maxWebhookFieldLength).optional(),
+  limit: z.coerce.number().int().min(1).max(COMMERCE_LIMITS.maxRecommendationsLimit).optional(),
+});
+
+export type RecommendationsInput = z.infer<typeof recommendationsInputSchema>;
+
+export const productCreateInputSchema = z
+  .object({
+    type: z.enum(["simple", "variable", "bundle"]).default("simple"),
+    status: z.enum(["draft", "active", "archived"]).default("draft"),
+    visibility: z.enum(["public", "hidden"]).default("hidden"),
+    slug: z.string().trim().min(2).max(128).toLowerCase(),
+    title: z.string().trim().min(1).max(160),
+    shortDescription: z.string().trim().max(320).default(""),
+    longDescription: z.string().trim().max(8_000).default(""),
+    brand: z.string().trim().max(128).optional(),
+    vendor: z.string().trim().max(128).optional(),
+    featured: z.boolean().default(false),
+    sortOrder: z.number().int().min(0).max(10_000).default(0),
+    requiresShippingDefault: z.boolean().default(true),
+    taxClassDefault: z.string().trim().max(64).optional(),
+    attributes: z
+      .array(
+        z.object({
+          name: z.string().trim().min(1).max(128),
+          code: z.string().trim().min(1).max(64).toLowerCase(),
+          kind: z.enum(["variant_defining", "descriptive"]).default("descriptive"),
+          position: z.number().int().min(0).max(10_000).default(0),
+          values: z
+            .array(
+              z.object({
+                value: z.string().trim().min(1).max(128),
+                code: z.string().trim().min(1).max(64).toLowerCase(),
+                position: z.number().int().min(0).max(10_000).default(0),
+              }),
+            )
+            .min(1)
+            .default([]),
+        }),
+      )
+      .default([]),
+    bundleDiscountType: z.enum(["none", "fixed_amount", "percentage"]).default("none"),
+    bundleDiscountValueMinor: z.number().int().min(0).optional(),
+    bundleDiscountValueBps: z.number().int().min(0).max(10_000).optional(),
+  })
+  .superRefine((input, ctx) => {
+    validateBundleDiscountForProductType(ctx, input.type,
input);
+  });
+export type ProductCreateInput = z.input<typeof productCreateInputSchema>;
+
+export const productGetInputSchema = z.object({
+  productId: z.string().trim().min(3).max(128),
+});
+export type ProductGetInput = z.infer<typeof productGetInputSchema>;
+
+export const productListInputSchema = z.object({
+  type: z.enum(["simple", "variable", "bundle"]).optional(),
+  status: z.enum(["draft", "active", "archived"]).optional(),
+  visibility: z.enum(["public", "hidden"]).optional(),
+  categoryId: bounded(128).optional(),
+  tagId: bounded(128).optional(),
+  limit: z.coerce.number().int().min(1).max(100).default(50),
+});
+export type ProductListInput = z.infer<typeof productListInputSchema>;
+
+export const productSkuCreateInputSchema = z.object({
+  productId: z.string().trim().min(3).max(128),
+  skuCode: z.string().trim().min(1).max(128),
+  status: z.enum(["active", "inactive"]).default("active"),
+  unitPriceMinor: z.number().int().min(0),
+  compareAtPriceMinor: z.number().int().min(0).optional(),
+  inventoryQuantity: z.number().int().min(0),
+  inventoryVersion: z.number().int().min(0).default(1),
+  requiresShipping: z.boolean().default(true),
+  isDigital: z.boolean().default(false),
+  optionValues: z
+    .array(
+      z.object({
+        attributeId: z.string().trim().min(3).max(128),
+        attributeValueId: z.string().trim().min(3).max(128),
+      }),
+    )
+    .default([]),
+});
+export type ProductSkuCreateInput = z.input<typeof productSkuCreateInputSchema>;
+
+export const productSkuListInputSchema = z.object({
+  productId: z.string().trim().min(3).max(128),
+  limit: z.coerce.number().int().min(1).max(100).default(100),
+});
+export type ProductSkuListInput = z.infer<typeof productSkuListInputSchema>;
+
+export const productUpdateInputSchema = z
+  .object({
+    productId: z.string().trim().min(3).max(128),
+    type: z.enum(["simple", "variable", "bundle"]).optional(),
+    status: z.enum(["draft", "active", "archived"]).optional(),
+    visibility: z.enum(["public", "hidden"]).optional(),
+    slug: z.string().trim().min(2).max(128).toLowerCase().optional(),
+    title: z.string().trim().min(1).max(160).optional(),
+    shortDescription:
z.string().trim().max(320).optional(),
+    longDescription: z.string().trim().max(8_000).optional(),
+    brand: z.string().trim().max(128).optional(),
+    vendor: z.string().trim().max(128).optional(),
+    featured: z.boolean().optional(),
+    sortOrder: z.number().int().min(0).max(10_000).optional(),
+    requiresShippingDefault: z.boolean().optional(),
+    taxClassDefault: z.string().trim().max(64).optional(),
+    bundleDiscountType: z.enum(["none", "fixed_amount", "percentage"]).optional(),
+    bundleDiscountValueMinor: z.number().int().min(0).optional(),
+    bundleDiscountValueBps: z.number().int().min(0).max(10_000).optional(),
+  })
+  .superRefine((input, ctx) => {
+    validateBundleDiscountPatchShape(ctx, input);
+  });
+export type ProductUpdateInput = z.infer<typeof productUpdateInputSchema>;
+
+export const productStateInputSchema = z.object({
+  productId: z.string().trim().min(3).max(128),
+  status: z.enum(["draft", "active", "archived"]),
+});
+export type ProductStateInput = z.infer<typeof productStateInputSchema>;
+
+export const productSkuUpdateInputSchema = z.object({
+  skuId: z.string().trim().min(3).max(128),
+  skuCode: z.string().trim().min(1).max(128).optional(),
+  status: z.enum(["active", "inactive"]).optional(),
+  unitPriceMinor: z.number().int().min(0).optional(),
+  compareAtPriceMinor: z.number().int().min(0).optional(),
+  inventoryQuantity: z.number().int().min(0).optional(),
+  inventoryVersion: z.number().int().min(0).optional(),
+  requiresShipping: z.boolean().optional(),
+  isDigital: z.boolean().optional(),
+});
+export type ProductSkuUpdateInput = z.infer<typeof productSkuUpdateInputSchema>;
+
+export const productSkuStateInputSchema = z.object({
+  skuId: z.string().trim().min(3).max(128),
+  status: z.enum(["active", "inactive"]),
+});
+export type ProductSkuStateInput = z.infer<typeof productSkuStateInputSchema>;
+
+export const productAssetRegisterInputSchema = z
+  .object({
+    externalAssetId: bounded(128),
+    provider: z.string().trim().min(1).max(64).default("media"),
+    fileName: z.string().trim().max(260).optional(),
+    altText: z.string().trim().max(260).optional(),
+    mimeType:
z.string().trim().max(128).optional(),
+    byteSize: z.number().int().min(0).optional(),
+    width: z.number().int().min(1).max(20_000).optional(),
+    height: z.number().int().min(1).max(20_000).optional(),
+    metadata: z.record(z.string(), z.unknown()).optional(),
+  })
+  .strict();
+export type ProductAssetRegisterInput = z.infer<typeof productAssetRegisterInputSchema>;
+
+export const productAssetLinkInputSchema = z
+  .object({
+    assetId: z.string().trim().min(3).max(128),
+    targetType: z.enum(["product", "sku"]),
+    targetId: z.string().trim().min(3).max(128),
+    role: z.enum(["primary_image", "gallery_image", "variant_image"]).default("gallery_image"),
+    position: z.number().int().min(0).default(0),
+  })
+  .strict();
+export type ProductAssetLinkInput = z.input<typeof productAssetLinkInputSchema>;
+
+export const productAssetUnlinkInputSchema = z
+  .object({
+    linkId: z.string().trim().min(3).max(128),
+  })
+  .strict();
+export type ProductAssetUnlinkInput = z.infer<typeof productAssetUnlinkInputSchema>;
+
+export const productAssetReorderInputSchema = z
+  .object({
+    linkId: z.string().trim().min(3).max(128),
+    position: z.number().int().min(0),
+  })
+  .strict();
+export type ProductAssetReorderInput = z.infer<typeof productAssetReorderInputSchema>;
+
+export const bundleComponentAddInputSchema = z
+  .object({
+    bundleProductId: bounded(128),
+    componentSkuId: bounded(128),
+    quantity: z.number().int().min(1),
+    position: z.number().int().min(0).default(0),
+  })
+  .strict();
+export type BundleComponentAddInput = z.infer<typeof bundleComponentAddInputSchema>;
+
+export const bundleComponentRemoveInputSchema = z
+  .object({
+    bundleComponentId: bounded(128),
+  })
+  .strict();
+export type BundleComponentRemoveInput = z.infer<typeof bundleComponentRemoveInputSchema>;
+
+export const bundleComponentReorderInputSchema = z
+  .object({
+    bundleComponentId: bounded(128),
+    position: z.number().int().min(0),
+  })
+  .strict();
+export type BundleComponentReorderInput = z.infer<typeof bundleComponentReorderInputSchema>;
+
+export const bundleComputeInputSchema = z
+  .object({
+    productId: bounded(128),
+  })
+  .strict();
+export type BundleComputeInput = z.infer<typeof bundleComputeInputSchema>;
+
+export const categoryCreateInputSchema = z
+  .object({
+    name:
z.string().trim().min(1).max(128),
+    slug: z.string().trim().min(2).max(128).toLowerCase(),
+    parentId: z.string().trim().min(3).max(128).optional(),
+    position: z.number().int().min(0).max(10_000).default(0),
+  })
+  .strict();
+export type CategoryCreateInput = z.infer<typeof categoryCreateInputSchema>;
+
+export const categoryListInputSchema = z
+  .object({
+    parentId: z.string().trim().min(3).max(128).optional(),
+    limit: z.coerce.number().int().min(1).max(100).default(100),
+  })
+  .strict();
+export type CategoryListInput = z.infer<typeof categoryListInputSchema>;
+
+export const productCategoryLinkInputSchema = z
+  .object({
+    productId: bounded(128),
+    categoryId: bounded(128),
+  })
+  .strict();
+export type ProductCategoryLinkInput = z.infer<typeof productCategoryLinkInputSchema>;
+
+export const productCategoryUnlinkInputSchema = z
+  .object({
+    linkId: bounded(128),
+  })
+  .strict();
+export type ProductCategoryUnlinkInput = z.infer<typeof productCategoryUnlinkInputSchema>;
+
+export const tagCreateInputSchema = z
+  .object({
+    name: z.string().trim().min(1).max(128),
+    slug: z.string().trim().min(2).max(128).toLowerCase(),
+  })
+  .strict();
+export type TagCreateInput = z.infer<typeof tagCreateInputSchema>;
+
+export const tagListInputSchema = z
+  .object({
+    limit: z.coerce.number().int().min(1).max(100).default(100),
+  })
+  .strict();
+export type TagListInput = z.infer<typeof tagListInputSchema>;
+
+export const productTagLinkInputSchema = z
+  .object({
+    productId: bounded(128),
+    tagId: bounded(128),
+  })
+  .strict();
+export type ProductTagLinkInput = z.infer<typeof productTagLinkInputSchema>;
+
+export const productTagUnlinkInputSchema = z
+  .object({
+    linkId: bounded(128),
+  })
+  .strict();
+export type ProductTagUnlinkInput = z.infer<typeof productTagUnlinkInputSchema>;
+
+export const digitalAssetCreateInputSchema = z
+  .object({
+    externalAssetId: bounded(128),
+    provider: z.string().trim().min(1).max(64).default("media"),
+    label: z.string().trim().max(260).optional(),
+    downloadLimit: z.number().int().min(1).optional(),
+    downloadExpiryDays: z.number().int().min(1).optional(),
+    isManualOnly: z.boolean().default(false),
+    isPrivate: z.boolean().default(true),
+    metadata: z.record(z.string(),
z.unknown()).optional(),
+  })
+  .strict();
+export type DigitalAssetCreateInput = z.input<typeof digitalAssetCreateInputSchema>;
+
+export const digitalEntitlementCreateInputSchema = z
+  .object({
+    skuId: bounded(128),
+    digitalAssetId: bounded(128),
+    grantedQuantity: z.number().int().min(1).default(1),
+  })
+  .strict();
+export type DigitalEntitlementCreateInput = z.infer<typeof digitalEntitlementCreateInputSchema>;
+
+export const digitalEntitlementRemoveInputSchema = z
+  .object({
+    entitlementId: bounded(128),
+  })
+  .strict();
+export type DigitalEntitlementRemoveInput = z.infer<typeof digitalEntitlementRemoveInputSchema>;
diff --git a/packages/plugins/commerce/src/services/commerce-extension-seams.test.ts b/packages/plugins/commerce/src/services/commerce-extension-seams.test.ts
new file mode 100644
index 000000000..66512bcb9
--- /dev/null
+++ b/packages/plugins/commerce/src/services/commerce-extension-seams.test.ts
@@ -0,0 +1,331 @@
+import { describe, expect, it, vi } from "vitest";
+
+import { COMMERCE_LIMITS } from "../kernel/limits.js";
+import * as rateLimitKv from "../lib/rate-limit-kv.js";
+import { webhookReceiptDocId } from "../orchestration/finalize-payment.js";
+import type {
+  StoredInventoryLedgerEntry,
+  StoredInventoryStock,
+  StoredOrder,
+  StoredPaymentAttempt,
+  StoredWebhookReceipt,
+} from "../types.js";
+import { createRecommendationsRoute, queryFinalizationState } from "./commerce-extension-seams.js";
+
+interface StoredCollection<T> {
+  get(id: string): Promise<T | null>;
+  query(options?: {
+    where?: Record<string, unknown>;
+    limit?: number;
+  }): Promise<{ items: Array<{ id: string; data: T }>; hasMore: boolean }>;
+}
+
+class MemKv {
+  store = new Map<string, unknown>();
+
+  async get<T>(key: string): Promise<T | null> {
+    const row = this.store.get(key);
+    return row === undefined ?
null : (row as T);
+  }
+
+  async set(key: string, value: unknown): Promise<void> {
+    this.store.set(key, value);
+  }
+
+  async delete(key: string): Promise<boolean> {
+    return this.store.delete(key);
+  }
+
+  async list(): Promise<Array<{ key: string; value: unknown }>> {
+    return Array.from(this.store.entries(), ([key, value]) => ({ key, value }));
+  }
+}
+
+class MemCollection<T> implements StoredCollection<T> {
+  constructor(public readonly rows = new Map<string, T>()) {}
+
+  async get(id: string): Promise<T | null> {
+    const row = this.rows.get(id);
+    return row ? structuredClone(row) : null;
+  }
+
+  async query(options?: { where?: Record<string, unknown>; limit?: number }) {
+    const where = options?.where ?? {};
+    const limit = options?.limit ?? 50;
+    const items = [...this.rows]
+      .filter(([_, row]) =>
+        Object.entries(where).every(
+          ([field, value]) => (row as Record<string, unknown>)[field] === value,
+        ),
+      )
+      .slice(0, limit)
+      .map(([id, data]) => ({ id, data: structuredClone(data) }));
+    return { items, hasMore: false };
+  }
+}
+
+describe("createRecommendationsRoute", () => {
+  const ctx = (input: { limit?: number }) =>
+    ({
+      request: new Request("https://example.test/recommendations", { method: "POST" }),
+      input,
+    }) as never;
+
+  it("returns enabled response from a recommendation resolver", async () => {
+    const route = createRecommendationsRoute({
+      providerId: "local-recs",
+      resolver: async () => ({
+        productIds: ["p1", "p2", "p1", ""],
+        reason: "fallback",
+      }),
+    });
+
+    const out = await route(ctx({ limit: 2 }));
+    expect(out).toEqual({
+      ok: true,
+      enabled: true,
+      strategy: "provider",
+      productIds: ["p1", "p2"],
+      providerId: "local-recs",
+      reason: "fallback",
+    });
+  });
+
+  it("degrades to disabled output when resolver is missing", async () => {
+    const route = createRecommendationsRoute();
+    const out = await route(ctx({ limit: 3 }));
+    expect(out).toEqual({
+      ok: true,
+      enabled: false,
+      strategy: "disabled",
+      productIds: [],
+      reason: "no_recommender_configured",
+    });
+  });
+});
+
+describe("queryFinalizationState", () => {
+  const order:
StoredOrder = { + cartId: "cart_1", + paymentPhase: "paid", + currency: "USD", + lineItems: [], + finalizeTokenHash: "placeholder-finalize-token-hash", + totalMinor: 1000, + createdAt: "2026-04-03T12:00:00.000Z", + updatedAt: "2026-04-03T12:00:00.000Z", + }; + + const paymentAttempt: StoredPaymentAttempt = { + orderId: "order_1", + providerId: "stripe", + status: "succeeded", + createdAt: "2026-04-03T12:00:00.000Z", + updatedAt: "2026-04-03T12:00:00.000Z", + }; + + const ledgerEntry: StoredInventoryLedgerEntry = { + productId: "prod_1", + variantId: "", + delta: -1, + referenceType: "order", + referenceId: "order_1", + createdAt: "2026-04-03T12:00:00.000Z", + }; + + const stock: StoredInventoryStock = { + productId: "prod_1", + variantId: "", + version: 1, + quantity: 1, + updatedAt: "2026-04-03T12:00:00.000Z", + }; + + const receipt: StoredWebhookReceipt = { + providerId: "stripe", + externalEventId: "evt_1", + orderId: "order_1", + status: "processed", + createdAt: "2026-04-03T12:00:00.000Z", + updatedAt: "2026-04-03T12:00:00.000Z", + }; + + it("reflects finalized state across read-only service seam", async () => { + const orders = new MemCollection(new Map([["order_1", order]])); + const attempts = new MemCollection(new Map([["a1", paymentAttempt]])); + const inventoryLedger = new MemCollection(new Map([["l1", ledgerEntry]])); + const inventoryStock = new MemCollection(new Map([["s1", stock]])); + const webhookReceipts = new MemCollection( + new Map([[webhookReceiptDocId("stripe", "evt_1"), receipt]]), + ); + + const out = await queryFinalizationState( + { + request: new Request("https://example.test/webhooks/stripe", { method: "POST" }), + storage: { + orders, + paymentAttempts: attempts, + inventoryLedger, + inventoryStock, + webhookReceipts, + }, + requestMeta: { ip: "127.0.0.1" }, + kv: new MemKv(), + log: { + info: () => undefined, + warn: () => undefined, + error: () => undefined, + debug: () => undefined, + }, + } as never, + { + orderId: "order_1", + 
providerId: "stripe", + externalEventId: "evt_1", + }, + ); + expect(out).toMatchObject({ + isInventoryApplied: true, + isOrderPaid: true, + isPaymentAttemptSucceeded: true, + isReceiptProcessed: true, + receiptStatus: "processed", + resumeState: "replay_processed", + }); + }); + + it("rate-limits finalization diagnostics per IP", async () => { + const orders = new MemCollection(new Map([["order_1", order]])); + const attempts = new MemCollection(new Map([["a1", paymentAttempt]])); + const inventoryLedger = new MemCollection(new Map([["l1", ledgerEntry]])); + const inventoryStock = new MemCollection(new Map([["s1", stock]])); + const webhookReceipts = new MemCollection( + new Map([[webhookReceiptDocId("stripe", "evt_1"), receipt]]), + ); + const ctxBase = { + request: new Request("https://example.test/diagnostics", { method: "POST" }), + storage: { + orders, + paymentAttempts: attempts, + inventoryLedger, + inventoryStock, + webhookReceipts, + }, + requestMeta: { ip: "127.0.0.1" }, + kv: new MemKv(), + log: { + info: () => undefined, + warn: () => undefined, + error: () => undefined, + debug: () => undefined, + }, + } as never; + + const consumeSpy = vi + .spyOn(rateLimitKv, "consumeKvRateLimit") + .mockImplementation(async (options) => { + expect(options.limit).toBe(COMMERCE_LIMITS.defaultFinalizationDiagnosticsPerIpPerWindow); + expect(options.windowMs).toBe(COMMERCE_LIMITS.defaultRateWindowMs); + expect(options.keySuffix.startsWith("finalize_diag:ip:")).toBe(true); + return false; + }); + const getSpy = vi.spyOn(orders, "get"); + await expect( + queryFinalizationState(ctxBase, { + orderId: "order_1", + providerId: "stripe", + externalEventId: "evt_1", + }), + ).rejects.toMatchObject({ code: "rate_limited" }); + expect(consumeSpy).toHaveBeenCalledTimes(1); + expect(getSpy).toHaveBeenCalledTimes(0); + consumeSpy.mockRestore(); + getSpy.mockRestore(); + }); + + it("coalesces concurrent identical diagnostics reads (single storage pass)", async () => { + const orders 
= new MemCollection(new Map([["order_1", order]])); + const attempts = new MemCollection(new Map([["a1", paymentAttempt]])); + const inventoryLedger = new MemCollection(new Map([["l1", ledgerEntry]])); + const inventoryStock = new MemCollection(new Map([["s1", stock]])); + const webhookReceipts = new MemCollection( + new Map([[webhookReceiptDocId("stripe", "evt_1"), receipt]]), + ); + const getSpy = vi.spyOn(orders, "get"); + + const ctxBase = { + request: new Request("https://example.test/diagnostics", { method: "POST" }), + storage: { + orders, + paymentAttempts: attempts, + inventoryLedger, + inventoryStock, + webhookReceipts, + }, + requestMeta: { ip: "10.0.0.2" }, + kv: new MemKv(), + log: { + info: () => undefined, + warn: () => undefined, + error: () => undefined, + debug: () => undefined, + }, + } as never; + + const input = { + orderId: "order_1", + providerId: "stripe", + externalEventId: "evt_1", + }; + + await Promise.all([ + queryFinalizationState(ctxBase, input), + queryFinalizationState(ctxBase, input), + ]); + + expect(getSpy.mock.calls.filter((c) => c[0] === "order_1").length).toBe(1); + getSpy.mockRestore(); + }); + + it("serves fresh-enough cached diagnostics without re-querying storage", async () => { + const orders = new MemCollection(new Map([["order_1", order]])); + const attempts = new MemCollection(new Map([["a1", paymentAttempt]])); + const inventoryLedger = new MemCollection(new Map([["l1", ledgerEntry]])); + const inventoryStock = new MemCollection(new Map([["s1", stock]])); + const webhookReceipts = new MemCollection( + new Map([[webhookReceiptDocId("stripe", "evt_1"), receipt]]), + ); + const getSpy = vi.spyOn(orders, "get"); + + const ctxBase = { + request: new Request("https://example.test/diagnostics", { method: "POST" }), + storage: { + orders, + paymentAttempts: attempts, + inventoryLedger, + inventoryStock, + webhookReceipts, + }, + requestMeta: { ip: "10.0.0.3" }, + kv: new MemKv(), + log: { + info: () => undefined, + warn: () 
=> undefined, + error: () => undefined, + debug: () => undefined, + }, + } as never; + + const input = { + orderId: "order_1", + providerId: "stripe", + externalEventId: "evt_1", + }; + + await queryFinalizationState(ctxBase, input); + await queryFinalizationState(ctxBase, input); + + expect(getSpy.mock.calls.filter((c) => c[0] === "order_1").length).toBe(1); + getSpy.mockRestore(); + }); +}); diff --git a/packages/plugins/commerce/src/services/commerce-extension-seams.ts b/packages/plugins/commerce/src/services/commerce-extension-seams.ts new file mode 100644 index 000000000..e1a46d53c --- /dev/null +++ b/packages/plugins/commerce/src/services/commerce-extension-seams.ts @@ -0,0 +1,95 @@ +/** + * Stable service seams for extension and MCP consumers. + * + * These helpers expose read-only or adapter-based entry points so third-party + * packages can integrate without replacing kernel-owned mutation logic. + */ + +import type { RouteContext } from "emdash"; + +import { asCollection } from "../handlers/catalog-conflict.js"; +import { + createRecommendationsHandler, + type RecommendationsHandlerOptions, + type RecommendationsResponse, +} from "../handlers/recommendations.js"; +import { + handlePaymentWebhook, + type CommerceWebhookAdapter, + type WebhookFinalizeResponse, +} from "../handlers/webhook-handler.js"; +import { readFinalizationStatusWithGuards } from "../lib/finalization-diagnostics-readthrough.js"; +import { + queryFinalizationStatus, + type FinalizationStatus, + type FinalizePaymentPorts, +} from "../orchestration/finalize-payment.js"; +import type { RecommendationsInput } from "../schemas.js"; +import type { + StoredInventoryLedgerEntry, + StoredInventoryStock, + StoredOrder, + StoredPaymentAttempt, + StoredWebhookReceipt, +} from "../types.js"; +import { + COMMERCE_MCP_ACTORS, + type CommerceMcpActor, + type CommerceMcpOperationContext, +} from "./commerce-provider-contracts.js"; + +function buildFinalizePorts(ctx: RouteContext): FinalizePaymentPorts { 
+  return {
+    orders: asCollection(ctx.storage.orders),
+    webhookReceipts: asCollection(ctx.storage.webhookReceipts),
+    paymentAttempts: asCollection(ctx.storage.paymentAttempts),
+    inventoryLedger: asCollection(ctx.storage.inventoryLedger),
+    inventoryStock: asCollection(ctx.storage.inventoryStock),
+    log: ctx.log,
+  };
+}
+
+export type { FinalizationStatus, CommerceWebhookAdapter, RecommendationsResponse };
+
+export { COMMERCE_MCP_ACTORS };
+export type { CommerceMcpActor, CommerceMcpOperationContext };
+
+export function createRecommendationsRoute(
+  options: RecommendationsHandlerOptions = {},
+): (ctx: RouteContext) => Promise<RecommendationsResponse> {
+  return createRecommendationsHandler(options);
+}
+
+export function createPaymentWebhookRoute(
+  adapter: CommerceWebhookAdapter,
+): (ctx: RouteContext) => Promise<WebhookFinalizeResponse> {
+  return (ctx: RouteContext) => handlePaymentWebhook(ctx, adapter);
+}
+
+export type FinalizationStatusInput = {
+  orderId: string;
+  providerId: string;
+  externalEventId: string;
+};
+
+/**
+ * Stable read-only status helper for MCP/tooling and operational diagnostics.
+ * Returned state includes both binary checkpoints and a resumability hint so
+ * callers can drive a controlled retry policy from one query.
+ *
+ * Serverless Option B: per-IP KV rate limit, short KV read-through cache, and
+ * in-isolate in-flight coalescing for identical keys (warm Workers/processes.
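The "in-flight coalescing" mentioned in this doc comment means that concurrent identical diagnostics reads inside one warm isolate share a single underlying storage pass, which the tests above verify via spy call counts. A minimal dependency-free sketch of the pattern (the helper names here are illustrative, not the plugin's implementation):

```typescript
// Illustrative sketch only: in-flight coalescing for identical keys.
// Callers that arrive while a read is pending share the same promise.
const inFlight = new Map<string, Promise<unknown>>();
let underlyingReads = 0;

function coalesce<T>(key: string, read: () => Promise<T>): Promise<T> {
  const pending = inFlight.get(key);
  if (pending) return pending as Promise<T>;
  // Start the read immediately and park its promise under the key; clean up
  // once it settles so later requests perform a fresh read.
  const p = read().finally(() => inFlight.delete(key));
  inFlight.set(key, p);
  return p;
}

async function readOnce(): Promise<string> {
  underlyingReads += 1; // runs synchronously on first invocation
  return "finalization-status";
}

// Two identical concurrent requests trigger a single underlying read.
void coalesce("order_1:stripe:evt_1", readOnce);
void coalesce("order_1:stripe:evt_1", readOnce);
```

Because the map is per isolate, this only deduplicates within one warm Worker or process; the KV read-through cache covers the cross-isolate case.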
+ */
+export async function queryFinalizationState(
+  ctx: RouteContext,
+  input: FinalizationStatusInput,
+): Promise<FinalizationStatus> {
+  return readFinalizationStatusWithGuards(ctx, input, () =>
+    queryFinalizationStatus(
+      buildFinalizePorts(ctx),
+      input.orderId,
+      input.providerId,
+      input.externalEventId,
+    ),
+  );
+}
diff --git a/packages/plugins/commerce/src/services/commerce-provider-contracts.test.ts b/packages/plugins/commerce/src/services/commerce-provider-contracts.test.ts
new file mode 100644
index 000000000..8544eb18b
--- /dev/null
+++ b/packages/plugins/commerce/src/services/commerce-provider-contracts.test.ts
@@ -0,0 +1,26 @@
+import { describe, expect, it } from "vitest";
+
+import {
+  PAYMENT_DEFAULTS,
+  COMMERCE_MCP_ACTORS,
+  resolvePaymentProviderId,
+} from "./commerce-provider-contracts.js";
+
+describe("commerce-provider-contracts", () => {
+  it("resolves an empty or missing payment provider id to the default", () => {
+    expect(resolvePaymentProviderId(undefined)).toBe(PAYMENT_DEFAULTS.defaultPaymentProviderId);
+    expect(resolvePaymentProviderId("")).toBe(PAYMENT_DEFAULTS.defaultPaymentProviderId);
+    expect(resolvePaymentProviderId(" ")).toBe(PAYMENT_DEFAULTS.defaultPaymentProviderId);
+  });
+
+  it("preserves explicit provider ids", () => {
+    expect(resolvePaymentProviderId("stripe")).toBe("stripe");
+    expect(resolvePaymentProviderId("paypal")).toBe("paypal");
+  });
+
+  it("exports deterministic MCP actor contract", () => {
+    expect(Object.keys(COMMERCE_MCP_ACTORS)).toEqual(["system", "merchant", "agent", "customer"]);
+    expect(COMMERCE_MCP_ACTORS.system).toBe("system");
+    expect(COMMERCE_MCP_ACTORS.customer).toBe("customer");
+  });
+});
diff --git a/packages/plugins/commerce/src/services/commerce-provider-contracts.ts b/packages/plugins/commerce/src/services/commerce-provider-contracts.ts
new file mode 100644
index 000000000..7dccfb4cb
--- /dev/null
+++ b/packages/plugins/commerce/src/services/commerce-provider-contracts.ts
@@ -0,0 +1,80 @@
+import type {
RouteContext } from "emdash";
+
+export type CommerceProviderType = "payment" | "shipping" | "tax" | "fulfillment";
+
+const DEFAULT_PAYMENT_PROVIDER_ID = "stripe";
+
+/** Standard checkout/provider default used by the money path contracts. */
+export const PAYMENT_DEFAULTS = {
+  defaultPaymentProviderId: DEFAULT_PAYMENT_PROVIDER_ID,
+} as const;
+
+/**
+ * Stripe webhook signature verification bounds (shared by `webhooks/stripe` and tests).
+ * Tolerance is seconds of clock skew allowed between signature timestamp and server time.
+ */
+export const STRIPE_WEBHOOK_SIGNATURE = {
+  defaultToleranceSeconds: 300,
+  minToleranceSeconds: 30,
+  maxToleranceSeconds: 7_200,
+} as const;
+
+/**
+ * Resolve a provider identifier from user input and preserve deterministic defaults.
+ * Empty/whitespace values are treated as "unset" and map to the checkout default.
+ */
+export function resolvePaymentProviderId(value: string | undefined): string {
+  const normalized = value?.trim() ?? "";
+  return normalized.length > 0 ? normalized : PAYMENT_DEFAULTS.defaultPaymentProviderId;
+}
+
+export interface CommerceProviderDescriptor {
+  providerId: string;
+  providerType: CommerceProviderType;
+  isActive: boolean;
+  displayName?: string;
+}
+
+export interface CommerceWebhookInput {
+  orderId: string;
+  externalEventId: string;
+  finalizeToken: string;
+}
+
+/**
+ * Provider-specific webhook adapter contract for third-party payment integrations.
+ * The adapter is responsible for request authenticity checks and extracting
+ * domain inputs for finalize orchestration.
+ */
+export interface CommerceWebhookAdapter {
+  /** Canonical provider id used in receipts/attempts and payment diagnostics. */
+  providerId: string;
+  /** Verify request authenticity and freshness before any checkout mutation is performed. */
+  verifyRequest(ctx: RouteContext): Promise<void>;
+  /** Convert a raw provider request into finalized orchestration input fields.
*/ + buildFinalizeInput(ctx: RouteContext): CommerceWebhookInput; + /** Stable request correlation for logs and replay diagnostics. */ + buildCorrelationId(ctx: RouteContext): string; + /** Provider-scoped suffix for webhook rate-limit keys. */ + buildRateLimitSuffix(ctx: RouteContext): string; +} + +export type CommerceWebhookFinalizeResponse = + | { ok: true; replay: true; reason: string } + | { ok: true; replay: false; orderId: string }; + +export const COMMERCE_MCP_ACTORS = { + system: "system", + merchant: "merchant", + agent: "agent", + customer: "customer", +} as const; + +export type CommerceMcpActor = keyof typeof COMMERCE_MCP_ACTORS; + +export type CommerceMcpOperationContext = { + actor: CommerceMcpActor; + actorId?: string; + requestId?: string; + traceId?: string; +}; diff --git a/packages/plugins/commerce/src/settings-keys.ts b/packages/plugins/commerce/src/settings-keys.ts new file mode 100644 index 000000000..b234464ac --- /dev/null +++ b/packages/plugins/commerce/src/settings-keys.ts @@ -0,0 +1,10 @@ +/** + * KV keys for admin `settingsSchema` (EmDash stores these under the plugin prefix). + * Read with `ctx.kv.get("settings:stripeSecretKey")` etc. + */ +export const COMMERCE_SETTINGS_KEYS = { + stripePublishableKey: "settings:stripePublishableKey", + stripeSecretKey: "settings:stripeSecretKey", + stripeWebhookSecret: "settings:stripeWebhookSecret", + defaultCurrency: "settings:defaultCurrency", +} as const; diff --git a/packages/plugins/commerce/src/storage.ts b/packages/plugins/commerce/src/storage.ts new file mode 100644 index 000000000..7784920be --- /dev/null +++ b/packages/plugins/commerce/src/storage.ts @@ -0,0 +1,308 @@ +/** + * Declared plugin storage collections and indexes (EmDash `_plugin_storage`). 
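The `settings-keys.ts` file above documents a KV convention: admin settings are stored under a `settings:` prefix within the plugin's namespace and read back via `ctx.kv.get(...)`. A minimal sketch of that lookup pattern, using an in-memory map as a stand-in for the real EmDash KV surface (the `readSetting` helper is hypothetical):

```typescript
// Illustrative sketch only: the settings-key prefix convention, with a plain
// Map standing in for the plugin's `ctx.kv`. Keys mirror COMMERCE_SETTINGS_KEYS.
const SETTINGS_KEYS = {
  stripeSecretKey: "settings:stripeSecretKey",
  defaultCurrency: "settings:defaultCurrency",
} as const;

// Only the currency has been configured by the admin in this example.
const kv = new Map<string, string>([["settings:defaultCurrency", "USD"]]);

function readSetting(key: string): string | null {
  // Mirrors a KV get: absent keys surface as null, not undefined.
  return kv.get(key) ?? null;
}

const currency = readSetting(SETTINGS_KEYS.defaultCurrency); // "USD"
const secret = readSetting(SETTINGS_KEYS.stripeSecretKey); // null until configured
```

Callers must therefore treat every settings read as nullable and fail closed when a required secret (such as the webhook secret) has not been configured.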
+ */ + +import type { PluginStorageConfig } from "emdash"; + +export type CommerceStorage = PluginStorageConfig & { + products: { + indexes: ["type", "status", "visibility", "slug", "createdAt", "updatedAt", "featured"]; + uniqueIndexes: [["slug"]]; + }; + productAttributes: { + indexes: [ + "productId", + "kind", + "code", + "position", + ["productId", "kind"], + ["productId", "code"], + ]; + uniqueIndexes: [["productId", "code"]]; + }; + productAttributeValues: { + indexes: ["attributeId", "code", "position", ["attributeId", "code"]]; + uniqueIndexes: [["attributeId", "code"]]; + }; + productSkuOptionValues: { + indexes: ["skuId", "attributeId", "attributeValueId"]; + uniqueIndexes: [["skuId", "attributeId"]]; + }; + digitalAssets: { + indexes: [ + "provider", + "externalAssetId", + "label", + "isPrivate", + "isManualOnly", + "createdAt", + ["provider", "externalAssetId"], + ]; + uniqueIndexes: [["provider", "externalAssetId"]]; + }; + digitalEntitlements: { + indexes: ["skuId", "digitalAssetId", "createdAt"]; + uniqueIndexes: [["skuId", "digitalAssetId"]]; + }; + categories: { + indexes: [ + "slug", + "name", + "parentId", + "position", + ["parentId", "position"], + ["parentId", "slug"], + ]; + uniqueIndexes: [["slug"]]; + }; + productCategoryLinks: { + indexes: ["productId", "categoryId"]; + uniqueIndexes: [["productId", "categoryId"]]; + }; + productTags: { + indexes: ["slug", "name", "createdAt"]; + uniqueIndexes: [["slug"]]; + }; + productTagLinks: { + indexes: ["productId", "tagId"]; + uniqueIndexes: [["productId", "tagId"]]; + }; + bundleComponents: { + indexes: [ + "bundleProductId", + "componentSkuId", + "position", + "createdAt", + ["bundleProductId", "position"], + ]; + uniqueIndexes: [["bundleProductId", "componentSkuId"]]; + }; + productAssets: { + indexes: [ + "provider", + "externalAssetId", + "createdAt", + "updatedAt", + ["provider", "externalAssetId"], + ]; + uniqueIndexes: [["provider", "externalAssetId"]]; + }; + productAssetLinks: { + 
indexes: [ + "targetType", + "targetId", + "role", + "position", + "createdAt", + "assetId", + ["targetType", "targetId"], + ]; + uniqueIndexes: [["targetType", "targetId", "assetId"]]; + }; + productSkus: { + indexes: ["productId", "status", "requiresShipping", "createdAt", "skuCode"]; + uniqueIndexes: [["skuCode"]]; + }; + orders: { + indexes: ["paymentPhase", "createdAt", "cartId"]; + }; + carts: { + indexes: ["updatedAt"]; + }; + paymentAttempts: { + indexes: [ + "orderId", + "providerId", + "status", + "createdAt", + ["orderId", "status"], + ["orderId", "providerId", "status"], + ["providerId", "createdAt"], + ]; + }; + webhookReceipts: { + indexes: [ + "providerId", + "externalEventId", + "orderId", + "status", + "createdAt", + ["providerId", "externalEventId"], + ["orderId", "createdAt"], + ]; + uniqueIndexes: [["providerId", "externalEventId"]]; + }; + idempotencyKeys: { + indexes: ["route", "createdAt", ["keyHash", "route"]]; + uniqueIndexes: [["keyHash", "route"]]; + }; + inventoryLedger: { + indexes: [ + "productId", + "variantId", + "referenceType", + "referenceId", + "createdAt", + ["productId", "createdAt"], + ["variantId", "createdAt"], + ["referenceType", "referenceId"], + ]; + uniqueIndexes: [["referenceType", "referenceId", "productId", "variantId"]]; + }; + /** Materialized per SKU stock + monotonic version for finalize-time checks. 
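The `webhookReceipts` collection above enforces a unique `["providerId", "externalEventId"]` pair, which is what makes webhook finalization idempotent: a replayed provider event maps to an existing receipt and is skipped. A minimal sketch of that dedupe rule, assuming a simple `provider:event` id format (the real `webhookReceiptDocId` helper may format ids differently):

```typescript
// Illustrative sketch only: replay suppression via a unique receipt id per
// (providerId, externalEventId) pair. The id format here is an assumption.
function receiptDocId(providerId: string, externalEventId: string): string {
  return `${providerId}:${externalEventId}`;
}

const receipts = new Set<string>();

function recordOnce(providerId: string, eventId: string): boolean {
  const id = receiptDocId(providerId, eventId);
  if (receipts.has(id)) return false; // replay: finalize side effects are skipped
  receipts.add(id);
  return true;
}

const first = recordOnce("stripe", "evt_1"); // true: first delivery processes
const replay = recordOnce("stripe", "evt_1"); // false: redelivery is a no-op
```

In the real storage layer the uniqueness constraint, rather than an in-memory set, provides this guarantee even across concurrent workers.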
*/ + inventoryStock: { + indexes: ["productId", "variantId", "updatedAt", ["productId", "variantId"]]; + uniqueIndexes: [["productId", "variantId"]]; + }; +}; + +export const COMMERCE_STORAGE_CONFIG: PluginStorageConfig = { + products: { + indexes: ["type", "status", "visibility", "slug", "createdAt", "updatedAt", "featured"], + uniqueIndexes: [["slug"]], + }, + productAttributes: { + indexes: [ + "productId", + "kind", + "code", + "position", + ["productId", "kind"], + ["productId", "code"], + ], + uniqueIndexes: [["productId", "code"]], + }, + productAttributeValues: { + indexes: ["attributeId", "code", "position", ["attributeId", "code"]], + uniqueIndexes: [["attributeId", "code"]], + }, + productSkuOptionValues: { + indexes: ["skuId", "attributeId", "attributeValueId"], + uniqueIndexes: [["skuId", "attributeId"]], + }, + digitalAssets: { + indexes: [ + "provider", + "externalAssetId", + "label", + "isPrivate", + "isManualOnly", + "createdAt", + ["provider", "externalAssetId"], + ], + uniqueIndexes: [["provider", "externalAssetId"]], + }, + digitalEntitlements: { + indexes: ["skuId", "digitalAssetId", "createdAt"], + uniqueIndexes: [["skuId", "digitalAssetId"]], + }, + categories: { + indexes: [ + "slug", + "name", + "parentId", + "position", + ["parentId", "position"], + ["parentId", "slug"], + ], + uniqueIndexes: [["slug"]], + }, + productCategoryLinks: { + indexes: ["productId", "categoryId"], + uniqueIndexes: [["productId", "categoryId"]], + }, + productTags: { + indexes: ["slug", "name", "createdAt"], + uniqueIndexes: [["slug"]], + }, + productTagLinks: { + indexes: ["productId", "tagId"], + uniqueIndexes: [["productId", "tagId"]], + }, + bundleComponents: { + indexes: [ + "bundleProductId", + "componentSkuId", + "position", + "createdAt", + ["bundleProductId", "position"], + ], + uniqueIndexes: [["bundleProductId", "componentSkuId"]], + }, + productAssets: { + indexes: [ + "provider", + "externalAssetId", + "createdAt", + "updatedAt", + ["provider", 
"externalAssetId"], + ], + uniqueIndexes: [["provider", "externalAssetId"]], + }, + productAssetLinks: { + indexes: [ + "targetType", + "targetId", + "role", + "position", + "createdAt", + "assetId", + ["targetType", "targetId"], + ], + uniqueIndexes: [["targetType", "targetId", "assetId"]], + }, + productSkus: { + indexes: ["productId", "status", "requiresShipping", "createdAt", "skuCode"], + uniqueIndexes: [["skuCode"]], + }, + orders: { + indexes: ["paymentPhase", "createdAt", "cartId"], + }, + carts: { + indexes: ["updatedAt"], + }, + paymentAttempts: { + indexes: [ + "orderId", + "providerId", + "status", + "createdAt", + ["orderId", "status"], + ["orderId", "providerId", "status"], + ["providerId", "createdAt"], + ], + }, + webhookReceipts: { + indexes: [ + "providerId", + "externalEventId", + "orderId", + "status", + "createdAt", + ["providerId", "externalEventId"], + ["orderId", "createdAt"], + ], + uniqueIndexes: [["providerId", "externalEventId"]], + }, + idempotencyKeys: { + indexes: ["route", "createdAt", ["keyHash", "route"]], + uniqueIndexes: [["keyHash", "route"]], + }, + inventoryLedger: { + indexes: [ + "productId", + "variantId", + "referenceType", + "referenceId", + "createdAt", + ["productId", "createdAt"], + ["variantId", "createdAt"], + ["referenceType", "referenceId"], + ], + uniqueIndexes: [["referenceType", "referenceId", "productId", "variantId"]], + }, + inventoryStock: { + indexes: ["productId", "variantId", "updatedAt", ["productId", "variantId"]], + uniqueIndexes: [["productId", "variantId"]], + }, +}; diff --git a/packages/plugins/commerce/src/types.ts b/packages/plugins/commerce/src/types.ts new file mode 100644 index 000000000..b46b30ce8 --- /dev/null +++ b/packages/plugins/commerce/src/types.ts @@ -0,0 +1,384 @@ +/** + * Plugin storage document shapes for commerce (stage-1 vertical slice). + * Field names use camelCase so they match indexed JSON paths. 
+ */ + +import type { OrderPaymentPhase } from "./kernel/finalize-decision.js"; + +export type { OrderPaymentPhase }; + +export interface CartLineItem { + /** + * Stable catalog reference. Human-readable titles and `shortDescription` for + * embeddings live on the product document, not on this row. + */ + productId: string; + /** Empty string when the catalog does not use variants. */ + variantId?: string; + quantity: number; + /** Inventory version captured when the line was last mutated (optimistic finalize). */ + inventoryVersion: number; + unitPriceMinor: number; +} + +export interface StoredCart { + currency: string; + lineItems: CartLineItem[]; + /** + * SHA-256 hex of the owner token issued at cart creation. + * Reads (`cart/get`) and mutations (`cart/upsert`) must present the matching raw token. + */ + ownerTokenHash: string; + createdAt: string; + updatedAt: string; +} + +export interface OrderLineItem { + snapshot?: OrderLineItemSnapshot; + /** Catalog id for historical order display. 
*/ + productId: string; + variantId?: string; + quantity: number; + inventoryVersion: number; + unitPriceMinor: number; +} + +export interface OrderLineItemOptionSelection { + attributeId: string; + attributeValueId: string; +} + +export interface OrderLineItemImageSnapshot { + linkId: string; + assetId: string; + provider: string; + externalAssetId: string; + fileName?: string; + altText?: string; +} + +export interface OrderLineItemDigitalEntitlementSnapshot { + entitlementId: string; + digitalAssetId: string; + digitalAssetLabel?: string; + grantedQuantity: number; + downloadLimit?: number; + downloadExpiryDays?: number; + isManualOnly: boolean; + isPrivate: boolean; +} + +export interface OrderLineItemBundleComponentSummary { + componentId: string; + componentSkuId: string; + componentSkuCode: string; + componentProductId: string; + componentPriceMinor: number; + quantityPerBundle: number; + subtotalContributionMinor: number; + availableBundleQuantity: number; + /** + * Component SKU stock `version` captured at checkout for optimistic finalize. + * Missing or negative values indicate an incomplete bundle snapshot and should + * fail finalize reconciliation. 
+ */ + componentInventoryVersion: number; +} + +export interface OrderLineItemBundleSummary { + productId: string; + subtotalMinor: number; + discountType: BundleDiscountType; + discountValueMinor: number; + discountValueBps: number; + discountAmountMinor: number; + finalPriceMinor: number; + availability: number; + components: OrderLineItemBundleComponentSummary[]; +} + +export interface OrderLineItemSnapshot { + productId: string; + skuId: string; + productType: ProductType; + productTitle: string; + productSlug?: string; + skuCode: string; + selectedOptions: OrderLineItemOptionSelection[]; + currency: string; + unitPriceMinor: number; + compareAtPriceMinor?: number; + lineSubtotalMinor: number; + lineDiscountMinor: number; + lineTotalMinor: number; + requiresShipping: boolean; + isDigital: boolean; + image?: OrderLineItemImageSnapshot; + bundleSummary?: OrderLineItemBundleSummary; + digitalEntitlements?: OrderLineItemDigitalEntitlementSnapshot[]; +} + +export interface StoredOrder { + cartId: string; + paymentPhase: OrderPaymentPhase; + currency: string; + lineItems: OrderLineItem[]; + totalMinor: number; + /** + * SHA-256 hex of the finalize token generated by checkout. + * Webhook finalize and order reads must present the matching raw token (e.g. copied from + * PaymentIntent metadata) or verification fails. 
+ */ + finalizeTokenHash: string; + createdAt: string; + updatedAt: string; +} + +export type PaymentAttemptStatus = "pending" | "succeeded" | "failed"; + +export interface StoredPaymentAttempt { + orderId: string; + providerId: string; + status: PaymentAttemptStatus; + externalRef?: string; + createdAt: string; + updatedAt: string; +} + +export type WebhookReceiptStatus = "processed" | "duplicate" | "pending" | "error"; + +export type WebhookReceiptErrorCode = + | "ORDER_NOT_FOUND" + | "ORDER_STATE_CONFLICT" + | "INVENTORY_CHANGED" + | "INSUFFICIENT_STOCK" + | "PRODUCT_UNAVAILABLE" + | "VARIANT_UNAVAILABLE"; + +export type WebhookReceiptClaimState = "unclaimed" | "claimed" | "released"; + +export interface StoredWebhookReceipt { + providerId: string; + externalEventId: string; + orderId: string; + status: WebhookReceiptStatus; + correlationId?: string; + createdAt: string; + updatedAt: string; + /** Optional terminal error classification for operational triage and recovery tooling. */ + errorCode?: WebhookReceiptErrorCode; + /** Optional operational details for terminal error receipts. */ + errorDetails?: Record; + /** Lease owner for concurrency recovery / claim transfer. */ + claimOwner?: string; + /** Lease token tied to a claim attempt (opaque to storage layer). */ + claimToken?: string; + /** Lease expiry timestamp (ISO-8601 string) for stale-claim recovery. */ + claimExpiresAt?: string; + /** Storage version observed when claim token was issued (for CAS-style transitions). */ + claimVersion?: string; + /** High-level state of claim ownership (`claimed` when actively owned). */ + claimState?: WebhookReceiptClaimState; +} + +export interface StoredIdempotencyKey { + route: string; + keyHash: string; + httpStatus: number; + responseBody: unknown; + createdAt: string; +} + +/** Append-only movement row; materialized quantity lives in {@link StoredInventoryStock}. 
*/ +export interface StoredInventoryLedgerEntry { + productId: string; + /** Empty string when the catalog does not use variants. */ + variantId: string; + delta: number; + referenceType: string; + referenceId: string; + createdAt: string; +} + +export interface StoredInventoryStock { + productId: string; + variantId: string; + version: number; + quantity: number; + updatedAt: string; +} + +export type ProductType = "simple" | "variable" | "bundle"; +export type ProductStatus = "draft" | "active" | "archived"; +export type ProductVisibility = "public" | "hidden"; +export type ProductSkuStatus = "active" | "inactive"; +export type BundleDiscountType = "none" | "fixed_amount" | "percentage"; + +export interface StoredProduct { + id: string; + type: ProductType; + status: ProductStatus; + visibility: ProductVisibility; + slug: string; + title: string; + shortDescription: string; + longDescription: string; + brand?: string; + vendor?: string; + featured: boolean; + sortOrder: number; + requiresShippingDefault: boolean; + taxClassDefault?: string; + metadataJson?: Record; + bundleDiscountType?: BundleDiscountType; + bundleDiscountValueMinor?: number; + bundleDiscountValueBps?: number; + createdAt: string; + updatedAt: string; + publishedAt?: string; + archivedAt?: string; +} + +export interface StoredProductSku { + id: string; + productId: string; + skuCode: string; + status: ProductSkuStatus; + unitPriceMinor: number; + compareAtPriceMinor?: number; + inventoryQuantity: number; + inventoryVersion: number; + requiresShipping: boolean; + isDigital: boolean; + createdAt: string; + updatedAt: string; +} + +export type ProductAttributeKind = "variant_defining" | "descriptive"; + +export interface StoredProductAttribute { + id: string; + productId: string; + name: string; + code: string; + kind: ProductAttributeKind; + position: number; + createdAt: string; + updatedAt: string; +} + +export interface StoredProductAttributeValue { + id: string; + attributeId: string; + value: 
string; + code: string; + position: number; + createdAt: string; + updatedAt: string; +} + +export interface StoredProductSkuOptionValue { + id: string; + skuId: string; + attributeId: string; + attributeValueId: string; + createdAt: string; + updatedAt: string; +} + +export interface StoredDigitalAsset { + id: string; + provider: string; + externalAssetId: string; + label?: string; + downloadLimit?: number; + downloadExpiryDays?: number; + isManualOnly: boolean; + isPrivate: boolean; + createdAt: string; + updatedAt: string; + metadata?: Record; +} + +export interface StoredDigitalEntitlement { + id: string; + skuId: string; + digitalAssetId: string; + grantedQuantity: number; + createdAt: string; + updatedAt: string; +} + +export interface StoredCategory { + id: string; + name: string; + slug: string; + parentId?: string; + position: number; + createdAt: string; + updatedAt: string; +} + +export interface StoredProductCategoryLink { + id: string; + productId: string; + categoryId: string; + createdAt: string; + updatedAt: string; +} + +export interface StoredProductTag { + id: string; + name: string; + slug: string; + createdAt: string; + updatedAt: string; +} + +export interface StoredProductTagLink { + id: string; + productId: string; + tagId: string; + createdAt: string; + updatedAt: string; +} + +export interface StoredBundleComponent { + id: string; + bundleProductId: string; + componentSkuId: string; + quantity: number; + position: number; + createdAt: string; + updatedAt: string; +} + +export type ProductAssetLinkTarget = "product" | "sku"; + +export type ProductAssetRole = "primary_image" | "gallery_image" | "variant_image"; + +export interface StoredProductAsset { + id: string; + provider: string; + externalAssetId: string; + fileName?: string; + altText?: string; + mimeType?: string; + byteSize?: number; + width?: number; + height?: number; + createdAt: string; + updatedAt: string; + metadata?: Record; +} + +export interface StoredProductAssetLink { + 
id: string; + targetType: ProductAssetLinkTarget; + targetId: string; + assetId: string; + role: ProductAssetRole; + position: number; + createdAt: string; + updatedAt: string; +} diff --git a/packages/plugins/commerce/tsconfig.json b/packages/plugins/commerce/tsconfig.json new file mode 100644 index 000000000..0732533c3 --- /dev/null +++ b/packages/plugins/commerce/tsconfig.json @@ -0,0 +1,20 @@ +{ + "compilerOptions": { + "target": "ES2022", + "module": "preserve", + "moduleResolution": "bundler", + "moduleDetection": "force", + "verbatimModuleSyntax": true, + "strict": true, + "noEmit": true, + "skipLibCheck": true, + "rootDir": "src", + "lib": ["es2022", "DOM", "DOM.Iterable"], + "noUncheckedIndexedAccess": true, + "noImplicitOverride": true, + "isolatedModules": true, + "resolveJsonModule": true, + "esModuleInterop": true + }, + "include": ["src/**/*.ts"] +} diff --git a/packages/plugins/commerce/vitest.config.ts b/packages/plugins/commerce/vitest.config.ts new file mode 100644 index 000000000..cfbd4c3fc --- /dev/null +++ b/packages/plugins/commerce/vitest.config.ts @@ -0,0 +1,8 @@ +import { defineConfig } from "vitest/config"; + +export default defineConfig({ + test: { + environment: "node", + include: ["src/**/*.test.ts"], + }, +}); diff --git a/packages/plugins/forms/src/client/index.ts b/packages/plugins/forms/src/client/index.ts index 991740068..6729f122e 100644 --- a/packages/plugins/forms/src/client/index.ts +++ b/packages/plugins/forms/src/client/index.ts @@ -460,8 +460,12 @@ function restoreState(form: HTMLFormElement) { // Restore field values for (const [name, value] of Object.entries(state.values)) { const input = form.elements.namedItem(name); - if (input && "value" in input) { - (input as unknown as HTMLInputElement).value = value; + if ( + input instanceof HTMLInputElement || + input instanceof HTMLTextAreaElement || + input instanceof HTMLSelectElement + ) { + input.value = value; } } @@ -525,11 +529,13 @@ function initTurnstile(form: 
HTMLFormElement) { } function renderTurnstile(container: HTMLElement, siteKey: string) { - const w = window as unknown as { + interface TurnstileWindow { turnstile?: { render: (el: HTMLElement, opts: Record) => void; }; - }; + } + + const w = window as Window & TurnstileWindow; if (w.turnstile) { w.turnstile.render(container, { sitekey: siteKey }); } diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index 1bc96580e..9f2ad0f23 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -614,7 +614,7 @@ importers: version: 7.3.1(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) vitest: specifier: 'catalog:' - version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) vitest-browser-react: specifier: ^2.0.5 version: 2.0.5(@types/react-dom@19.2.3(@types/react@19.2.14))(@types/react@19.2.14)(react-dom@19.2.4(react@19.2.4))(react@19.2.4)(vitest@4.0.18) @@ -660,7 +660,7 @@ importers: version: 5.9.3 vitest: specifier: 'catalog:' - version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) packages/blocks: dependencies: @@ -715,7 +715,7 @@ importers: version: 5.9.3 vitest: specifier: 'catalog:' - version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) packages/blocks/playground: dependencies: @@ -798,7 +798,7 @@ 
importers: version: 5.9.3 vitest: specifier: 'catalog:' - version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) packages/core: dependencies: @@ -1027,7 +1027,7 @@ importers: version: 5.9.3 vitest: specifier: 'catalog:' - version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) packages/marketplace: dependencies: @@ -1058,7 +1058,7 @@ importers: version: 5.9.3 vitest: specifier: 'catalog:' - version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) wrangler: specifier: 'catalog:' version: 4.71.0(@cloudflare/workers-types@4.20260305.1) @@ -1086,7 +1086,7 @@ importers: version: 19.2.14 vitest: specifier: 'catalog:' - version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) packages/plugins/api-test: dependencies: @@ -1108,7 +1108,7 @@ importers: devDependencies: vitest: specifier: 'catalog:' - version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + version: 
4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) packages/plugins/audit-log: dependencies: @@ -1129,6 +1129,25 @@ importers: specifier: 'catalog:' version: 19.2.14 + packages/plugins/commerce: + dependencies: + ulidx: + specifier: ^2.4.1 + version: 2.4.1 + devDependencies: + astro: + specifier: 'catalog:' + version: 6.0.1(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(rollup@4.55.2)(tsx@4.21.0)(typescript@5.9.3)(yaml@2.8.2) + emdash: + specifier: workspace:* + version: link:../../core + typescript: + specifier: 'catalog:' + version: 5.9.3 + vitest: + specifier: 'catalog:' + version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + packages/plugins/embeds: dependencies: '@emdash-cms/blocks': @@ -1223,7 +1242,7 @@ importers: version: 5.9.3 vitest: specifier: 'catalog:' - version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + version: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) optionalDependencies: '@x402/svm': specifier: ^2.8.0 @@ -11665,7 +11684,7 @@ snapshots: '@vitest/mocker': 4.0.18(vite@7.3.1(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2)) playwright: 1.58.2 tinyrainbow: 3.0.3 - vitest: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + vitest: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) transitivePeerDependencies: - bufferutil - msw @@ -11699,7 +11718,7 @@ snapshots: pngjs: 7.0.0 sirv: 3.0.2 tinyrainbow: 3.0.3 - vitest: 
4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + vitest: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) ws: 8.19.0 transitivePeerDependencies: - bufferutil @@ -15958,7 +15977,7 @@ snapshots: dependencies: react: 19.2.4 react-dom: 19.2.4(react@19.2.4) - vitest: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(@vitest/ui@4.0.17)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + vitest: 4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) optionalDependencies: '@types/react': 19.2.14 '@types/react-dom': 19.2.3(@types/react@19.2.14) @@ -16003,6 +16022,45 @@ snapshots: - tsx - yaml + vitest@4.0.18(@types/node@24.10.13)(@vitest/browser-playwright@4.0.18)(jiti@2.6.1)(jsdom@26.1.0)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2): + dependencies: + '@vitest/expect': 4.0.18 + '@vitest/mocker': 4.0.18(vite@6.4.1(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2)) + '@vitest/pretty-format': 4.0.18 + '@vitest/runner': 4.0.18 + '@vitest/snapshot': 4.0.18 + '@vitest/spy': 4.0.18 + '@vitest/utils': 4.0.18 + es-module-lexer: 1.7.0 + expect-type: 1.3.0 + magic-string: 0.30.21 + obug: 2.1.1 + pathe: 2.0.3 + picomatch: 4.0.3 + std-env: 3.10.0 + tinybench: 2.9.0 + tinyexec: 1.0.2 + tinyglobby: 0.2.15 + tinyrainbow: 3.0.3 + vite: 6.4.1(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2) + why-is-node-running: 2.3.0 + optionalDependencies: + '@types/node': 24.10.13 + '@vitest/browser-playwright': 4.0.18(playwright@1.58.2)(vite@7.3.1(@types/node@24.10.13)(jiti@2.6.1)(lightningcss@1.31.1)(tsx@4.21.0)(yaml@2.8.2))(vitest@4.0.18) + jsdom: 26.1.0 + transitivePeerDependencies: + - jiti + - less + - lightningcss + - msw + - sass + - 
sass-embedded + - stylus + - sugarss + - terser + - tsx + - yaml + volar-service-css@0.0.68(@volar/language-service@2.4.27): dependencies: vscode-css-languageservice: 6.3.9 diff --git a/prompts.txt b/prompts.txt new file mode 100644 index 000000000..86734d31f --- /dev/null +++ b/prompts.txt @@ -0,0 +1,29 @@ +1. **Evaluate 4 Strategies:** Against complexity, **DRY**, **YAGNI**, and scalability. +2. **Describe Each:** One paragraph per strategy, detailing trade-offs. +3. **Compare & Choose:** Summarize side-by-side; pick best overall. +4. **Implement:** Provide runnable, concise code aligned with chosen approach. + + +#handover + +Let us hand this project over to a new developer to further develop, test and debug the next phase. Please make sure all documentation (.md files) are current. Please provide them with all the information they need to succeed and tackle next steps. Please edit @HANDOVER.md as follows: +- **Goal:** Brief a new developer on project status in short & concise paragraphs covering: + 1. The big-picture purposes of the app, and the specific problem we are trying to solve at this time. + 2. Completed work and outcomes + 3. Failures, open issues, and lessons learned + 4. Files changed, if that matters for future development, key insights, and “gotchas” to avoid + 5. Key files and directories +- **Tone:** Technical README style—fact-only, no speculation or fluff. Make sure it is DRY & YAGNI. + +# new +Please take over this project for an EmDash ecommerce plugin. Read @handover.md to get you started, and any other documentation you find useful from this project. Proceed like a 10x engineer working the next version of of this app. + + + + +Just as a sanity check, Let us take a step back and look at all the code changes you have made throughout this coding session from beginning to end. 
Given what you now know:
+- **Objectives:** Spot logic flaws, edge cases, performance issues, technical debt, duplicated or semi-duplicated processes or data that could be consolidated, etc.
+- **Refactoring Options:** Without changing your solution, propose 4 strategies, each with cognitive, performance, DRY, YAGNI, and scalability analysis, with a focus on EmDash best practices.
+- **Recommendation:** Compare, select, and recommend the best refactoring in code like a 10x engineer (do not implement)
+- **Validate:** Make sure all your recommendations are supported by validated problems and fixes, and not assumptions.
+- **IMPORTANT:** Don't over-engineer, and don't fix what is not broken
\ No newline at end of file
diff --git a/scripts/build-commerce-external-review-zip.sh b/scripts/build-commerce-external-review-zip.sh
new file mode 100755
index 000000000..5773beaaf
--- /dev/null
+++ b/scripts/build-commerce-external-review-zip.sh
@@ -0,0 +1,36 @@
+#!/usr/bin/env bash
+set -euo pipefail
+ROOT="$(cd "$(dirname "$0")/.." && pwd)"
+cd "$ROOT"
+
+rm -f commerce-plugin-external-review.zip
+rm -rf .review-staging
+mkdir -p .review-staging/packages/plugins
+
+rsync -a --exclude 'node_modules' --exclude '.vite' \
+  packages/plugins/commerce/ .review-staging/packages/plugins/commerce/
+
+REVIEW_FILES=(
+  "@THIRD_PARTY_REVIEW_PACKAGE.md"
+  "external_review.md"
+  "SHARE_WITH_REVIEWER.md"
+  "HANDOVER.md"
+  "commerce-plugin-architecture.md"
+  "3rd-party-checklist.md"
+  "emdash-commerce-third-party-review-memo.md"
+  "emdash_commerce_review_update_ordered_children.md"
+)
+
+for file in "${REVIEW_FILES[@]}"; do
+  if [ -f "$file" ]; then
+    mkdir -p ".review-staging/$(dirname "$file")"
+    cp "$file" ".review-staging/$file"
+  fi
+done
+
+find .review-staging -type f -name '*.zip' -delete
+
+(cd .review-staging && zip -rq ../commerce-plugin-external-review.zip .)
+rm -rf .review-staging
+
+echo "Wrote $ROOT/commerce-plugin-external-review.zip"