diff --git a/.claude/skills/ci-prep/SKILL.md b/.claude/skills/ci-prep/SKILL.md index 86cd81f..82c46a3 100644 --- a/.claude/skills/ci-prep/SKILL.md +++ b/.claude/skills/ci-prep/SKILL.md @@ -3,7 +3,7 @@ name: ci-prep description: Prepares the current branch for CI by running the exact same steps locally and fixing issues. If CI is already failing, fetches the GH Actions logs first to diagnose. Use before pushing, when CI is red, or when the user says "fix ci". argument-hint: "[--failing] [optional job name to focus on]" --- - + # CI Prep @@ -11,12 +11,12 @@ Prepare the current state for CI. If CI is already failing, fetch and analyze th ## Arguments -- `--failing` -- Indicates a GitHub Actions run is already failing. When present, you MUST execute **Step 1** before doing anything else. +- `--failing` — Indicates a GitHub Actions run is already failing. When present, you MUST execute **Step 1** before doing anything else. - Any other argument is treated as a job name to focus on (but all failures are still reported). If `--failing` is NOT passed, skip directly to **Step 2**. -## Step 1 -- Fetch failed CI logs (only when `--failing`) +## Step 1 — Fetch failed CI logs (only when `--failing`) You MUST do this before any other work. @@ -40,23 +40,23 @@ gh run view "$RUN_ID" --log-failed Read **every line** of `--log-failed` output. For each failure note the exact file, line, and error message. If a job name argument was provided, prioritize that job but still report all failures. -## Step 2 -- Analyze the CI workflow +## Step 2 — Analyze the CI workflow 1. Find the CI workflow file. Look in `.github/workflows/` for `ci.yml`, `build.yml`, `test.yml`, `checks.yml`, `main.yml`, `pull_request.yml`, or any workflow triggered on `pull_request` or `push`. 2. Read the workflow file completely. Parse every job and every step. -3. 
Extract the ordered list of commands the CI actually runs (e.g., `make lint`, `make fmt-check`, `make test`, `make coverage-check`, `make build`, or whatever the workflow specifies -- it may use `npm`, `cargo`, `dotnet`, raw shell commands, or anything else). +3. Extract the ordered list of commands the CI actually runs (e.g., `make lint`, `make fmt-check`, `make test`, `make coverage-check`, `make build`, or whatever the workflow specifies — it may use `npm`, `cargo`, `dotnet`, raw shell commands, or anything else). 4. Note any environment variables, matrix strategies, or conditional steps that affect execution. **Do NOT assume the steps are `make lint`, `make test`, `make coverage-check`, `make build`.** The actual CI may run different commands, in a different order, with different targets. Extract what the CI *actually does*. -## Step 3 -- Run each CI step locally, in order +## Step 3 — Run each CI step locally, in order Work through failures in this priority order: -1. **Formatting** -- run auto-formatters first to clear noise -2. **Compilation errors** -- must compile before lint/test -3. **Lint violations** -- fix the code pattern -4. **Runtime / test failures** -- fix source code to satisfy the test +1. **Formatting** — run auto-formatters first to clear noise +2. **Compilation errors** — must compile before lint/test +3. **Lint violations** — fix the code pattern +4. 
**Runtime / test failures** — fix source code to satisfy the test

For each command extracted from the CI workflow:

@@ -67,20 +67,21 @@ For each command extracted from the CI workflow:

### Hard constraints

-- **NEVER modify test files** -- fix the source code, not the tests
-- **NEVER add suppressions** (`#pragma warning disable`, `// eslint-disable`, `#[allow(...)]`)
+- **NEVER modify test files** — fix the source code, not the tests
+- **NEVER add suppressions** (`#[allow(...)]`, `// eslint-disable`, `#pragma warning disable`)
+- **NEVER use `any` in TypeScript** to silence type errors
- **NEVER delete or ignore failing tests**
- **NEVER remove assertions**

If stuck on the same failure after 5 attempts, ask the user for help.

-## Step 4 -- Report
+## Step 4 — Report

- List every step that was run and its result (pass/fail/fixed).
- If any step could not be fixed, report what failed and why.
- Confirm whether the branch is ready to push.

-## Step 5 -- Commit/Push (only when `--failing`)
+## Step 5 — Commit/Push (only when `--failing`)

Once all CI steps pass locally:

@@ -96,10 +97,10 @@ Once all CI steps pass locally:

- Fix issues found in each step before moving to the next
- Never skip steps or suppress errors
- If the CI workflow has multiple jobs, run all of them (respecting dependency order)
-- Skip steps that are CI-infrastructure-only (checkout, setup-node/python/rust actions, cache steps, artifact uploads) -- focus on the actual build/test/lint commands
+- Skip steps that are CI-infrastructure-only (checkout, setup-node/python/rust actions, cache steps, artifact uploads) — focus on the actual build/test/lint commands

## Success criteria

- Every command that CI runs has been executed locally and passed
- All fixes are applied to the working tree
-- The CI passes successfully (if you are correcting an existing failure)
+- The CI passes successfully (if you are fixing an existing failure)
diff --git a/.claude/skills/code-dedup/SKILL.md 
b/.claude/skills/code-dedup/SKILL.md index 2c4450e..21f29ab 100644
--- a/.claude/skills/code-dedup/SKILL.md
+++ b/.claude/skills/code-dedup/SKILL.md
@@ -1,20 +1,23 @@
---
name: code-dedup
-description: Searches for duplicate code, duplicate tests, and dead code, then safely merges or removes them. Use when the user says "deduplicate", "find duplicates", "remove dead code", "DRY up", or "code dedup". Requires test coverage -- refuses to touch untested code.
+description: Searches for duplicate code, duplicate tests, and dead code, then safely merges or removes them. Use when the user says "deduplicate", "find duplicates", "remove dead code", "DRY up", or "code dedup". Requires test coverage — refuses to touch untested code.
---
- 
+
# Code Dedup

-Carefully search for duplicate code, duplicate tests, and dead code across the repo. Merge duplicates and delete dead code -- but only when test coverage proves the change is safe.
+Carefully search for duplicate code, duplicate tests, and dead code across the repo. Merge duplicates and delete dead code — but only when test coverage proves the change is safe.

-## Prerequisites -- hard gate
+## Prerequisites — hard gate

Before touching ANY code, verify these conditions. If any fail, stop and report why.

-1. Run `make test` -- all tests must pass. If tests fail, stop. Do not dedup a broken codebase.
-2. Run `make coverage-check` -- coverage must meet the repo's threshold. If it doesn't, stop.
-3. This is a C# repo with static typing -- proceed.
+1. Run `make test` — all tests must pass. If tests fail, stop. Do not dedup a broken codebase.
+2. Run `make coverage-check` — coverage must meet the repo's threshold. If it doesn't, stop.
+3. Verify the project uses **static typing**. Check for:
+   - C#: typed by default — proceed
+   - Python: must have type annotations AND a type checker configured (pyright or mypy in pyproject.toml / Makefile) — proceed
+   - **Untyped Python: STOP. 
Refuse to dedup.** Print: "This codebase has no static type checking. Deduplication without types is reckless — too high a risk of silent breakage. Add type checking first." ## Steps @@ -30,65 +33,80 @@ Dedup Progress: - [ ] Step 6: Verification passed (tests green, coverage stable) ``` -### Step 1 -- Inventory test coverage +### Step 1 — Inventory test coverage + +Before deciding what to touch, understand what is tested. 1. Run `make test` and `make coverage-check` to confirm green baseline -2. Note the current coverage percentage -- this is the floor. It must not drop. +2. Note the current coverage percentage — this is the floor. It must not drop. 3. Identify which files/modules have coverage and which do not. Only files WITH coverage are candidates for dedup. -### Step 2 -- Scan for dead code +### Step 2 — Scan for dead code + +Search for code that is never called, never imported, never referenced. 1. Look for unused exports, unused functions, unused classes, unused variables -2. C# analyzer warnings for unused members -- check `make lint` output -3. For each candidate: **grep the entire codebase** for references. Only mark as dead if truly zero references. +2. Use language-appropriate tools where available: + - C#: analyzer warnings for unused members + - Python: look for functions/classes with zero imports across the codebase +3. For each candidate: **grep the entire codebase** for references (including tests, scripts, configs). Only mark as dead if truly zero references. 4. List all dead code found with file paths and line numbers. Do NOT delete yet. -### Step 3 -- Scan for duplicate code +### Step 3 — Scan for duplicate code + +Search for code blocks that do the same thing in multiple places. 1. Look for functions/methods with identical or near-identical logic 2. Look for copy-pasted blocks (same structure, maybe different variable names) 3. Look for multiple implementations of the same algorithm or pattern -4. 
Check across module boundaries -- duplicates often hide in different projects -5. For each duplicate pair: note both locations, what they do, and how they differ +4. Check across module boundaries — duplicates often hide in different packages/projects +5. For each duplicate pair: note both locations, what they do, and how they differ (if at all) 6. List all duplicates found. Do NOT merge yet. -### Step 4 -- Scan for duplicate tests +### Step 4 — Scan for duplicate tests + +Search for tests that verify the same behavior. 1. Look for test functions with identical assertions against the same code paths 2. Look for test fixtures/helpers that are duplicated across test files -3. Look for integration tests that fully cover what a unit test also covers (keep the integration test) +3. Look for integration tests that fully cover what a unit test also covers (keep the integration test, mark the unit test as redundant per CLAUDE.md rules) 4. List all duplicate tests found. Do NOT delete yet. -### Step 5 -- Apply changes (one at a time) +### Step 5 — Apply changes (one at a time) -For each change: **change -> test -> verify coverage -> continue or revert**. +For each change, follow this cycle: **change → test → verify coverage → continue or revert**. #### 5a. Remove dead code +- Delete dead code identified in Step 2 - After each deletion: run `make test` and `make coverage-check` -- If tests fail or coverage drops: **revert immediately** +- If tests fail or coverage drops: **revert immediately** and investigate +- Dead code removal should never break tests or drop coverage #### 5b. Merge duplicate code -- Extract shared logic into a single function/module, update all call sites +- For each duplicate pair: extract the shared logic into a single function/module +- Update all call sites to use the shared version - After each merge: run `make test` and `make coverage-check` -- If tests fail: **revert immediately** +- If tests fail: **revert immediately**. 
The duplicates may have subtle differences you missed. +- If coverage drops: the shared code must have equivalent test coverage. Add tests if needed before proceeding. #### 5c. Remove duplicate tests - Delete the redundant test (keep the more thorough one) - After each deletion: run `make coverage-check` -- If coverage drops: **revert immediately** +- If coverage drops: **revert immediately**. The "duplicate" test was covering something the other wasn't. -### Step 6 -- Final verification +### Step 6 — Final verification -1. Run `make test` -- all tests must still pass -2. Run `make coverage-check` -- coverage must be >= the baseline from Step 1 -3. Run `make lint` and `make fmt-check` -- code must be clean +1. Run `make test` — all tests must still pass +2. Run `make coverage-check` — coverage must be >= the baseline from Step 1 +3. Run `make lint` and `make fmt-check` — code must be clean 4. Report: what was removed, what was merged, final coverage vs baseline ## Rules -- **No test coverage = do not touch.** If a file has no tests covering it, leave it alone entirely. -- **Coverage must not drop.** The coverage floor from Step 1 is sacred. -- **One change at a time.** Never batch multiple dedup changes before testing. -- **When in doubt, leave it.** -- **Preserve public API surface.** -- **Three similar lines is fine.** Only dedup when shared logic is substantial (>10 lines) or 3+ copies. +- **No test coverage = do not touch.** If a file has no tests covering it, leave it alone entirely. You cannot safely dedup what you cannot verify. +- **Coverage must not drop.** If removing or merging code causes coverage to decrease, revert and investigate. The coverage floor from Step 1 is sacred. +- **Untyped code = refuse to dedup.** Untyped Python is too dangerous. Types are the safety net that catches breakage at compile time. Without them, silent runtime errors are near-certain. +- **One change at a time.** Make one dedup change, run tests, verify coverage. 
Never batch multiple dedup changes before testing. +- **When in doubt, leave it.** If two code blocks look similar but you're not 100% sure they're functionally identical, leave both. False dedup is worse than duplication. +- **Preserve public API surface.** Do not change function signatures, class names, or module exports that external code depends on. Internal refactoring only. +- **Three similar lines is fine.** Do not create abstractions for trivial duplication. The cure must not be worse than the disease. Only dedup when the shared logic is substantial (>10 lines) or when there are 3+ copies. diff --git a/.claude/skills/spec-check/SKILL.md b/.claude/skills/spec-check/SKILL.md index d411db8..683cfb7 100644 --- a/.claude/skills/spec-check/SKILL.md +++ b/.claude/skills/spec-check/SKILL.md @@ -3,7 +3,7 @@ name: spec-check description: Audit spec/plan documents against the codebase. Ensures every spec section has implementing code, tests, and matching logic. Use when the user says "check specs", "spec audit", or "verify specs". argument-hint: "[optional spec ID or filename filter]" --- - + # spec-check @@ -13,7 +13,7 @@ Audit spec/plan documents against the codebase. Ensures every spec section has i ## Arguments -- `$ARGUMENTS` -- optional spec name or ID to check (e.g., `AUTH-TOKEN-VERIFY` or `repo-standards`). If empty, check ALL specs. Spec IDs are descriptive slugs, NEVER numbered (see Step 1). +- `$ARGUMENTS` — optional spec name or ID to check (e.g., `AUTH-TOKEN-VERIFY` or `repo-standards`). If empty, check ALL specs. Spec IDs are descriptive slugs, NEVER numbered (see Step 1). ## Instructions @@ -28,13 +28,22 @@ Before checking code/test references, verify that the specs themselves are well- 1. Find all spec documents (see locations in Step 2). 2. Extract every section ID using the regex `\[([A-Z][A-Z0-9]*(-[A-Z0-9]+)+)\]`. 3. **Flag invalid IDs:** - - Numbered IDs (`[SPEC-001]`, `[REQ-003]`, `[CI-004]`) -- must be renamed to descriptive hierarchical slugs. 
- - Single-word IDs (`[TIMEOUT]`) -- must have a group prefix. - - IDs with trailing numbers (`[FEAT-AUTH-01]`) -- the number is meaningless, remove it. + - Numbered IDs (`[SPEC-001]`, `[REQ-003]`, `[CI-004]`) — must be renamed to descriptive hierarchical slugs. + - Single-word IDs (`[TIMEOUT]`) — must have a group prefix. + - IDs with trailing numbers (`[FEAT-AUTH-01]`) — the number is meaningless, remove it. 4. **Check group clustering:** The first word of each ID is its group. All sections in the same group MUST appear together (adjacent) in the document. If they're scattered, flag it. 5. **Check for missing IDs:** Any heading that defines a requirement or behavior should have an ID. Flag headings in spec files that look like they define behavior but lack an ID. -If any ID violations are found, report them all and **STOP**. +If any ID violations are found, report them all and **STOP**: +``` +SPEC ID VIOLATIONS: + +- docs/specs/AUTH-SPEC.md line 12: [SPEC-001] → rename to descriptive ID (e.g., [AUTH-LOGIN]) +- docs/specs/AUTH-SPEC.md line 30: [AUTH-TOKEN-VERIFY] and [AUTH-LOGIN] are not adjacent (scattered group) +- docs/specs/CI-SPEC.md line 5: "## Coverage thresholds" has no spec ID + +Fix spec IDs first, then re-run spec-check. +``` If all IDs are valid, proceed to Step 2. @@ -52,23 +61,45 @@ Search for markdown files that contain spec sections with IDs. Look in these loc Use Glob to find candidate files, then use Grep to confirm they contain spec IDs. -**Spec ID patterns** -- IDs appear in square brackets, typically at the start of a heading or section line. Match this regex pattern: +**Spec ID patterns** — IDs appear in square brackets, typically at the start of a heading or section line. Match this regex pattern: ``` \[([A-Z][A-Z0-9]*(-[A-Z0-9]+)+)\] ``` -Spec IDs are **hierarchical descriptive slugs, NEVER numbered.** The format is `[GROUP-TOPIC]` or `[GROUP-TOPIC-DETAIL]`. 
+Spec IDs are **hierarchical descriptive slugs, NEVER numbered.** The format is `[GROUP-TOPIC]` or `[GROUP-TOPIC-DETAIL]`. The first word is the **group** — all sections sharing the same group MUST appear together in the spec's table of contents. IDs are uppercase, hyphen-separated, unique across the repo, and MUST NOT contain sequential numbers. + +The hierarchy depth varies by repo: two words for simple repos (`[AUTH-LOGIN]`), three for most (`[AUTH-TOKEN-VERIFY]`), four for complex domains (`[AUTH-OAUTH-REFRESH-FLOW]`). The hierarchy mirrors the spec document's heading structure. + +Examples of valid spec IDs (note how groups cluster): +- `[AUTH-LOGIN]`, `[AUTH-TOKEN-VERIFY]`, `[AUTH-TOKEN-REFRESH]` — all in the AUTH group +- `[CI-TIMEOUT]`, `[CI-LINT]`, `[CI-COVERAGE]` — all in the CI group +- `[LINT-ESLINT]`, `[LINT-RUFF]` — all in the LINT group +- `[FEAT-DARK-MODE]`, `[FEAT-SEARCH-FILTER]` — all in the FEAT group -For each file, extract every spec ID and its associated section title and full section content. +Examples of INVALID spec IDs: +- `[SPEC-001]` — numbered, meaningless +- `[FEAT-AUTH-01]` — trailing number +- `[REQ-003]` — sequential index, no group hierarchy +- `[CI-004]` — numbered, tells the reader nothing +- `[TIMEOUT]` — no group prefix, ungrouped + +For each file, extract every spec ID and its associated section title (the heading text after the ID) and the full section content (everything until the next heading of equal or higher level). --- ### Step 3: Filter specs -- If `$ARGUMENTS` is non-empty, filter the discovered specs. +- If `$ARGUMENTS` is non-empty, filter the discovered specs: + - If it matches a spec ID exactly (e.g., `AUTH-TOKEN-VERIFY`), check only that spec. + - If it matches a partial name (e.g., `repo-standards`), check all specs in files whose path contains that string. - If `$ARGUMENTS` is empty, process ALL discovered specs. 
+If filtering produces zero specs, report an error:
+```
+ERROR: No specs found matching "$ARGUMENTS". Discovered spec files: [list them]
+```
+
---

### Step 4: Check each spec section

@@ -77,59 +108,222 @@ For EACH spec section that has an ID, perform checks A, B, and C below. **Stop o

#### Check A: Code references the spec ID

-Search the entire codebase for the spec ID string, **excluding** `docs/`, `node_modules/`, `.git/`, and `*.md` files.
+Search the entire codebase for the spec ID string, **excluding** these directories:
+- `docs/`
+- `node_modules/`
+- `.git/`
+- `*.md` files (markdown is docs, not code)
+
+Use Grep with the literal spec ID (e.g., `[AUTH-TOKEN-VERIFY]`) to find references in code files.
+
+Code files should contain comments referencing the spec ID. The search must catch **all** comment styles across languages:
-Any comment containing the exact spec ID string counts as a valid code reference.
+**C-style `//` comments** (JavaScript, TypeScript, Rust, C#, F#, Java, Kotlin, Go, Swift, Dart):
+- `// Implements [AUTH-TOKEN-VERIFY]`
+- `// [AUTH-TOKEN-VERIFY]`
+- `// Tests [AUTH-TOKEN-VERIFY]` (also counts as a code reference)
+- `/// Implements [AUTH-TOKEN-VERIFY]` (doc comments)
+
+**Hash `#` comments** (Python, Ruby, Shell/Bash, YAML, TOML):
+- `# Implements [AUTH-TOKEN-VERIFY]`
+- `# [AUTH-TOKEN-VERIFY]`
+- `# Tests [AUTH-TOKEN-VERIFY]`
+
+**HTML/XML comments** (HTML, SVG, XML, XAML, JSX templates):
+- `<!-- Implements [AUTH-TOKEN-VERIFY] -->`
+- `<!-- [AUTH-TOKEN-VERIFY] -->`
+
+**ML-style comments** (F#, OCaml):
+- `(* Implements [AUTH-TOKEN-VERIFY] *)`
+
+**Lua comments:**
+- `-- Implements [AUTH-TOKEN-VERIFY]`
+
+**CSS comments:**
+- `/* Implements [AUTH-TOKEN-VERIFY] */`
+
+**The key rule:** any comment in any language containing the exact spec ID string (e.g., `[AUTH-TOKEN-VERIFY]`) counts as a valid code reference. The Grep search uses the literal spec ID string, so it naturally matches all comment styles. 
Do NOT restrict the search to specific comment prefixes — just search for the spec ID string itself. **If NO code files reference the spec ID:** ``` -SPEC VIOLATION: [ID] "Section Title" has no implementing code. +SPEC VIOLATION: [AUTH-TOKEN-VERIFY] "Section Title" has no implementing code. -ACTION REQUIRED: Add a comment referencing [ID] in the file(s) that implement +Every spec section must have at least one code file that references it via a comment +containing the spec ID (e.g., `// Implements [AUTH-TOKEN-VERIFY]`). + +ACTION REQUIRED: Add a comment referencing [AUTH-TOKEN-VERIFY] in the file(s) that implement this spec section, then re-run spec-check. ``` -**STOP HERE.** +**STOP HERE. Do not continue to other checks.** #### Check B: Tests reference the spec ID -Search test files (`**/*.Tests/**`, `**/*Tests.*`, `**/*Test.*`) for the literal spec ID string. +Search test files for the spec ID. Test files are found in: +- `test/` +- `tests/` +- `**/*.test.*` +- `**/*.spec.*` +- `**/*_test.*` +- `**/test_*.*` +- `**/*Tests.*` +- `**/*Test.*` + +Use Grep to search these locations for the literal spec ID string. + +Tests should contain the spec ID in comments, test names, or annotations. 
The search must catch **all** test frameworks across languages:
+
+**JavaScript/TypeScript** (Jest, Mocha, Vitest, Playwright):
+- `// Tests [AUTH-TOKEN-VERIFY]`
+- `describe('[AUTH-TOKEN-VERIFY] Authentication flow', () => ...)`
+- `test('[AUTH-TOKEN-VERIFY] should verify token', () => ...)`
+- `it('[AUTH-TOKEN-VERIFY] verifies token', () => ...)`
+
+**Python** (pytest, unittest):
+- `# Tests [AUTH-TOKEN-VERIFY]`
+- `def test_auth_token_verify_flow():`
+- `class TestAuthTokenVerify:`
+
+**Rust:**
+- `// Tests [AUTH-TOKEN-VERIFY]`
+- `#[test] // Tests [AUTH-TOKEN-VERIFY]`
+
+**C#** (xUnit, NUnit, MSTest):
+- `// Tests [AUTH-TOKEN-VERIFY]`
+- `[Fact] // Tests [AUTH-TOKEN-VERIFY]`
+- `[Test] // Tests [AUTH-TOKEN-VERIFY]`
+- `[TestMethod] // Tests [AUTH-TOKEN-VERIFY]`
+
+**F#** (xUnit, Expecto):
+- `// Tests [AUTH-TOKEN-VERIFY]`
+- `[<Fact>] // Tests [AUTH-TOKEN-VERIFY]`
+- `testCase "[AUTH-TOKEN-VERIFY] description" <| fun () ->`
+
+**Java/Kotlin** (JUnit, TestNG):
+- `// Tests [AUTH-TOKEN-VERIFY]`
+- `@Test // Tests [AUTH-TOKEN-VERIFY]`
+
+**Go:**
+- `// Tests [AUTH-TOKEN-VERIFY]`
+- `func TestAuthTokenVerify(t *testing.T) { // Tests [AUTH-TOKEN-VERIFY]`
+
+**Swift** (XCTest):
+- `// Tests [AUTH-TOKEN-VERIFY]`
+- `func testAuthTokenVerify() { // Tests [AUTH-TOKEN-VERIFY]`
+
+**Dart** (flutter_test):
+- `// Tests [AUTH-TOKEN-VERIFY]`
+- `test('[AUTH-TOKEN-VERIFY] description', () { ... });`
+
+**Ruby** (RSpec, Minitest):
+- `# Tests [AUTH-TOKEN-VERIFY]`
+- `describe '[AUTH-TOKEN-VERIFY] Authentication' do`
+- `it '[AUTH-TOKEN-VERIFY] verifies token' do`
+
+**Shell** (bats, shunit2):
+- `# Tests [AUTH-TOKEN-VERIFY]`
+- `@test "[AUTH-TOKEN-VERIFY] description" {`
+
+**The key rule:** same as Check A — search for the literal spec ID string in test files. Any occurrence of the exact spec ID in a test file counts. Do NOT restrict to specific patterns — just search for the spec ID string itself. 
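The literal-string search described above can be sketched with plain `grep`. This is a minimal illustration only: the paths and the `[AUTH-TOKEN-VERIFY]` ID are hypothetical stand-ins, not files from any real repo.

```bash
# Build a tiny throwaway tree to demonstrate the search (hypothetical paths).
mkdir -p /tmp/spec-demo/tests /tmp/spec-demo/docs
printf '# Tests [AUTH-TOKEN-VERIFY]\ndef test_verify(): ...\n' > /tmp/spec-demo/tests/test_auth.py
printf '[AUTH-TOKEN-VERIFY] spec text\n' > /tmp/spec-demo/docs/spec.md

# -r: recurse, -F: treat the pattern as a literal string (brackets are regex
# metacharacters, so -F matters), -l: print matching file names only.
# --exclude-dir keeps docs/ out of the results, as the checks require.
grep -rFl --exclude-dir=docs '[AUTH-TOKEN-VERIFY]' /tmp/spec-demo
# → /tmp/spec-demo/tests/test_auth.py
```

Because `-F` disables regex interpretation, one invocation matches the ID inside any comment or test-name syntax, which is why the rule avoids prefix-specific patterns.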
**If NO test files reference the spec ID:** ``` -SPEC VIOLATION: [ID] "Section Title" has no tests. +SPEC VIOLATION: [AUTH-TOKEN-VERIFY] "Section Title" has no tests. + +Every spec section must have corresponding tests that reference the spec ID. -ACTION REQUIRED: Add tests for [ID] with a comment or test name containing +ACTION REQUIRED: Add tests for [AUTH-TOKEN-VERIFY] with a comment or test name containing the spec ID, then re-run spec-check. ``` -**STOP HERE.** +**STOP HERE. Do not continue to other checks.** #### Check C: Code logic matches the spec -1. Read the spec section content carefully. -2. Read the implementing code. -3. Compare spec vs. code. Be SENSITIVE and PEDANTIC. Check for ordering violations, missing conditions, wrong logic, missing steps, wrong defaults. -4. If the code deviates from the spec, report a detailed error with quotes from both spec and code. +This is the most critical check. You must: + +1. **Read the spec section content carefully.** Understand exactly what behavior, logic, ordering, conditions, and constraints the spec describes. -**STOP HERE on any deviation.** +2. **Read the implementing code.** Use the references found in Check A to locate the implementing files. Read the relevant functions/sections. -5. If the code matches the spec, this check passes. Move to the next spec. +3. **Compare spec vs. code.** Be SENSITIVE and PEDANTIC. Check for: + - **Ordering violations** — If the spec says A happens before B, the code must do A before B. + - **Missing conditions** — If the spec says "only when X", the code must have that condition. + - **Extra behavior** — If the code does something the spec doesn't mention, flag it only if it contradicts the spec. + - **Wrong logic** — If the spec says "greater than" but code uses "greater than or equal", that's a violation. + - **Missing steps** — If the spec describes 5 steps but code only implements 3, that's a violation. 
+ - **Wrong defaults** — If the spec says "default to X" but code defaults to Y, that's a violation. + +4. **If the code deviates from the spec**, report a detailed error: + +``` +SPEC VIOLATION: [AUTH-TOKEN-VERIFY] Code does not match spec. + +SPEC SAYS: +> "The authentication flow must verify the token expiry before checking permissions" +> (from docs/specs/AUTH-SPEC.md, line 42) + +CODE DOES: +> `if (hasPermission(user)) { verifyToken(token); }` (src/auth.ts:42) + +DEVIATION: The code checks permissions BEFORE verifying token expiry. +The spec explicitly requires token expiry verification FIRST. + +ACTION REQUIRED: Reorder the logic in src/auth.ts to verify token expiry +before checking permissions, as specified in [AUTH-TOKEN-VERIFY]. +``` + +**STOP HERE. Do not continue to other specs.** + +5. **If the code matches the spec**, this check passes. Move to the next spec. --- ### Step 5: Report results -On failure: output ONLY the first violation found. +#### On failure (any check fails): + +Output ONLY the first violation found. Use the exact error format shown above. Do not summarize other specs. Do not offer to fix the code. Just report the violation. + +End with: +``` +spec-check FAILED. Fix the violation above and re-run. +``` + +#### On success (all specs pass): + +Output a summary table: + +``` +spec-check PASSED. All specs verified. + +| Spec ID | Title | Code References | Test References | Logic Match | +|----------------|--------------------------|-----------------|-----------------|-------------| +| [AUTH-TOKEN-VERIFY] | Authentication flow | src/auth.ts | tests/auth.test.ts | PASS | +| [RATE-LIMIT-CONFIG] | Rate limiting | src/rate.ts | tests/rate.test.ts | PASS | +| ... | ... | ... | ... | ... | + +Checked N spec sections across M files. All have implementing code, tests, and matching logic. +``` + +--- + +## Search strategy summary -On success: output a summary table of all specs checked. +1. 
**Validate spec IDs:** Check all IDs are hierarchical, descriptive, grouped, and non-numbered +2. **Find spec files:** Glob for `docs/**/*.md`, `SPEC.md`, `PLAN.md`, `specs/**/*.md` +3. **Extract spec IDs:** Grep for `\[[A-Z][A-Z0-9]*(-[A-Z0-9]+)+\]` in those files +4. **Find code refs:** Grep for the literal spec ID in all files, excluding `docs/`, `node_modules/`, `.git/`, `*.md` +5. **Find test refs:** Grep for the literal spec ID in test directories and test file patterns +6. **Read and compare:** Read the spec section content and the implementing code, compare logic ## Key principles - **Fail fast.** Stop on the first violation. One fix at a time. -- **Be pedantic.** If the spec says it, the code must do it. -- **Quote everything.** Always quote the spec text and the code in error messages. +- **Be pedantic.** If the spec says it, the code must do it. No "close enough". +- **Quote everything.** Always quote the spec text and the code in error messages so the developer sees exactly what's wrong. - **Be actionable.** Every error must tell the developer what file to change and what to do. -- **No numbered IDs.** Spec IDs are hierarchical descriptive slugs, NEVER sequential numbers. +- **Exclude docs from code search.** Markdown files are documentation, not implementation. Only search actual code files for spec references. +- **No numbered IDs.** Spec IDs are hierarchical descriptive slugs (`[AUTH-TOKEN-VERIFY]`), NEVER sequential numbers (`[SPEC-001]`). The first word is the group — sections sharing a group must be adjacent in the TOC. If you encounter numbered or ungrouped IDs, flag them as a violation. diff --git a/.claude/skills/submit-pr/SKILL.md b/.claude/skills/submit-pr/SKILL.md index 33dd896..72526cc 100644 --- a/.claude/skills/submit-pr/SKILL.md +++ b/.claude/skills/submit-pr/SKILL.md @@ -3,7 +3,7 @@ name: submit-pr description: Creates a pull request with a well-structured description after verifying CI passes. 
Use when the user asks to submit, create, or open a pull request. disable-model-invocation: true --- - + # Submit PR @@ -11,9 +11,9 @@ Create a pull request for the current branch with a well-structured description. ## Steps -1. Run `make ci` -- must pass completely before creating PR +1. Run `make ci` — must pass completely before creating PR 2. **Generate the diff against main.** Run `git diff main...HEAD > /tmp/pr-diff.txt` to capture the full diff between the current branch and the head of main. This is the ONLY source of truth for what the PR contains. **Warning:** the diff can be very large. If the diff file exceeds context limits, process it in chunks (e.g., read sections with `head`/`tail` or split by file) rather than trying to load it all at once. -3. **Derive the PR title and description SOLELY from the diff.** Read the diff output and summarize what changed. Ignore commit messages, branch names, and any other metadata -- only the actual code/content diff matters. +3. **Derive the PR title and description SOLELY from the diff.** Read the diff output and summarize what changed. Ignore commit messages, branch names, and any other metadata — only the actual code/content diff matters. 4. Write PR body using the template in `.github/pull_request_template.md` 5. Fill in (based on the diff analysis from step 3): - TLDR: one sentence @@ -27,7 +27,7 @@ Create a pull request for the current branch with a well-structured description. 
## Rules - Never create a PR if `make ci` fails -- PR description must be specific and tight -- no vague placeholders +- PR description must be specific and tight — no vague placeholders - Link to the relevant GitHub issue if one exists ## Success criteria diff --git a/.claude/skills/upgrade-packages/SKILL.md b/.claude/skills/upgrade-packages/SKILL.md index 11326c6..b2eb963 100644 --- a/.claude/skills/upgrade-packages/SKILL.md +++ b/.claude/skills/upgrade-packages/SKILL.md @@ -1,45 +1,71 @@ --- name: upgrade-packages -description: Upgrade all dependencies/packages to their latest versions for C#/.NET. Use when the user says "upgrade packages", "update dependencies", "bump versions", "update packages", or "upgrade deps". +description: Upgrade all dependencies/packages to their latest versions for C#/.NET and Python. Use when the user says "upgrade packages", "update dependencies", "bump versions", "update packages", or "upgrade deps". argument-hint: "[--check-only] [--major] [package-name]" --- - + # Upgrade Packages -Upgrade all project dependencies to their latest compatible (or latest major, if `--major`) versions. +Upgrade all project dependencies to their latest compatible (or latest major, if `--major`) versions for HealthcareSamples (C#/.NET primary, Python embedding service + scripts). ## Arguments -- `--check-only` -- List outdated packages without upgrading. Stop after Step 2. -- `--major` -- Include major version bumps (breaking changes). Without this flag, stay within semver-compatible ranges. +- `--check-only` — List outdated packages without upgrading. Stop after Step 2. +- `--major` — Include major version bumps (breaking changes). Without this flag, stay within semver-compatible ranges. - Any other argument is treated as a specific package name to upgrade (instead of all packages). -## Step 1 -- Detect language and package manager +## Step 1 — Detect language and package manager -This is a C#/.NET repo. 
Manifest files: -- `HealthcareSamples.sln` -- `Directory.Build.props` -- Individual `.csproj` files across Clinical, Scheduling, ICD10, Dashboard, and Shared projects +Inspect the repo for these manifest files: -## Step 2 -- List outdated packages +| Manifest file | Language | Package manager | +|---|---|---| +| `*.csproj` / `*.sln` | C# / .NET | NuGet (dotnet) | +| `Directory.Build.props` | C# / .NET | NuGet (dotnet) — central version pinning | +| `requirements.txt` | Python (ICD10/embedding-service, ICD10/scripts/CreateDb) | pip | + +This repo uses both. Process .NET first, then Python. + +## Step 2 — List outdated packages + +Run the appropriate command BEFORE upgrading anything. Show the user what will change. + +### C# / .NET (NuGet) ```bash -dotnet list package --outdated +dotnet list HealthcareSamples.sln package --outdated ``` - -For transitive dependencies too: `dotnet list package --outdated --include-transitive` +For transitive dependencies too: `dotnet list HealthcareSamples.sln package --outdated --include-transitive` **Read the docs:** https://learn.microsoft.com/en-us/dotnet/core/tools/dotnet-list-package +### Python (pip) +The Python pieces use plain `requirements.txt` files. Install each in a venv and run `pip list --outdated`: +```bash +# embedding service +python -m venv /tmp/embedding-venv +/tmp/embedding-venv/bin/pip install -r ICD10/embedding-service/requirements.txt +/tmp/embedding-venv/bin/pip list --outdated + +# DB scripts +python -m venv /tmp/scripts-venv +/tmp/scripts-venv/bin/pip install -r ICD10/scripts/CreateDb/requirements.txt +/tmp/scripts-venv/bin/pip list --outdated +``` + +**Read the docs:** https://pip.pypa.io/en/stable/cli/pip_install/#cmdoption-U + If `--check-only` was passed, **stop here** and report the outdated list.
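For the Python side of Step 2, `pip list --outdated --format=json` gives machine-readable output. A minimal sketch of turning that into the report the user sees; the `outdated_report` helper is hypothetical, but `name`, `version`, and `latest_version` are the fields pip emits:

```python
import json

def outdated_report(pip_json):
    """Format `pip list --outdated --format=json` output as 'name old -> new' lines."""
    return [
        f"{pkg['name']} {pkg['version']} -> {pkg['latest_version']}"
        for pkg in json.loads(pip_json)
    ]

# Example shape of pip's JSON output (versions here are illustrative):
sample = '[{"name": "fastapi", "version": "0.110.0", "latest_version": "0.115.0"}]'
```

An empty JSON array means nothing is outdated, which is the signal to stop early under `--check-only`.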
-## Step 3 -- Read the official upgrade docs +## Step 3 — Read the official upgrade docs **Before running any upgrade command, you MUST fetch and read the official documentation URL listed above for the detected package manager.** Use WebFetch to retrieve the page. This ensures you use the correct flags and understand the behavior. Do not guess at flags or options from memory. -## Step 4 -- Upgrade packages +## Step 4 — Upgrade packages + +Run the upgrade. If a specific package name was given as an argument, upgrade only that package. +### C# / .NET (NuGet) There is NO single `dotnet upgrade-all` command. You must upgrade each package individually: ```bash # For each outdated package from Step 2: @@ -58,7 +84,17 @@ dotnet outdated --upgrade ``` **Read the docs:** https://github.com/dotnet-outdated/dotnet-outdated -## Step 5 -- Verify the upgrade +### Python (pip) +For `requirements.txt`. Note that `pip install --upgrade -r` cannot move versions pinned with `==`; upgrade pinned packages by name (`pip install --upgrade <package>`) first, and review the `pip freeze` output before committing it, since freeze pins every transitive dependency: +```bash +/tmp/embedding-venv/bin/pip install --upgrade -r ICD10/embedding-service/requirements.txt +/tmp/embedding-venv/bin/pip freeze > ICD10/embedding-service/requirements.txt + +/tmp/scripts-venv/bin/pip install --upgrade -r ICD10/scripts/CreateDb/requirements.txt +/tmp/scripts-venv/bin/pip freeze > ICD10/scripts/CreateDb/requirements.txt +``` + +## Step 5 — Verify the upgrade After upgrading, run the project's build and test suite to confirm nothing broke: @@ -68,17 +104,17 @@ make ci If tests fail: 1. Read the failure output carefully -2. Check the changelog / migration guide for the upgraded packages +2. Check the changelog / migration guide for the upgraded packages (fetch the release notes URL if available) 3. Fix breaking changes in the code 4. Re-run tests -5. If stuck after 3 attempts on the same failure, report it to the user +5.
If stuck after 3 attempts on the same failure, report it to the user with the error details and the package that caused it -## Step 6 -- Report +## Step 6 — Report Provide a summary: - Packages upgraded (old version -> new version) -- Packages skipped (and why) +- Packages skipped (and why, e.g., major version bump without `--major` flag) - Build/test result after upgrade - Any breaking changes that were fixed - Any packages that could not be upgraded (with error details) @@ -90,8 +126,8 @@ Provide a summary: - **Always run tests after upgrading** to catch breakage immediately - **Never remove packages** unless they were explicitly deprecated and replaced - **Never downgrade packages** unless rolling back a broken upgrade -- **Never modify lockfiles manually** -- let the package manager regenerate them -- **Commit nothing** -- leave changes in the working tree for the user to review +- **Never modify lockfiles manually** — let the package manager regenerate them +- **Commit nothing** — leave changes in the working tree for the user to review ## Success criteria diff --git a/.claude/skills/website-audit/SKILL.md b/.claude/skills/website-audit/SKILL.md index b511de6..5948cbf 100644 --- a/.claude/skills/website-audit/SKILL.md +++ b/.claude/skills/website-audit/SKILL.md @@ -2,7 +2,7 @@ name: website-audit description: Audits a website for SEO, AI search performance, structured data, mobile usability, broken links, and social media cards. Fixes issues found. Use when the user mentions "audit website", "SEO", "fix search ranking", "AI search", "structured data", "social media cards", or "website performance". --- - + # Website Audit @@ -26,11 +26,11 @@ Audit Progress: - [ ] Step 12: Report findings ``` -- Check the outputted HTML/CSS/JavaScript AFTER the website is generated by the static content generator. - Don't just check the static content before the website is generated. 
+- Check the outputted HTML/CSS/JavaScript AFTER the website is generated by the static content generator. - Don't just check the static content before the website is generated. - Fix issues at the core where the static content templates are stored - not in the outputted HTML (e.g. _site) - Never manually edit the generated website content directly -## Step 1 -- Read guidelines +## Step 1 — Read guidelines Fetch and read each of these before auditing. These are the authoritative references for every step that follows. @@ -38,40 +38,42 @@ Fetch and read each of these before auditing. These are the authoritative refere - [Top ways to ensure content performs well in Google's AI experiences](https://developers.google.com/search/blog/2025/05/succeeding-in-ai-search) - [SEO Starter Guide](https://developers.google.com/search/docs/fundamentals/seo-starter-guide) +If the repo has a business plan doc, take it into account when auditing content. + Identify the website source files in the repo. Determine the framework (static site generator, Next.js, Hugo, etc.) so you know where to find templates, metadata, and content. -## Step 2 -- Audit AI search readiness +## Step 2 — Audit AI search readiness Apply the guidance from the AI search article. Check: -1. **Content quality** -- Is content original, expert-level, and comprehensive? Flag thin or duplicated pages. -2. **Clear structure** -- Do pages use descriptive headings, lists, and concise answers to likely questions? -3. **Entity clarity** -- Are key terms, products, and concepts defined clearly so AI can extract them? -4. **Freshness signals** -- Are dates, update timestamps, and authorship present? +1. **Content quality** — Is content original, expert-level, and comprehensive? Flag thin or duplicated pages. +2. **Clear structure** — Do pages use descriptive headings, lists, and concise answers to likely questions? +3. **Entity clarity** — Are key terms, products, and concepts defined clearly so AI can extract them? +4.
**Freshness signals** — Are dates, update timestamps, and authorship present? Fix issues directly in the source files. For each fix, note what changed and why. -## Step 3 -- Audit SEO and keywords +## Step 3 — Audit SEO and keywords 1. Search [Google Trends](https://trends.google.com/home) for trending keywords related to the website's content. 2. Review each page's `<title>`, `<meta name="description">`, and `<h1>` tags. -3. Check for keyword opportunities -- can trending terms be naturally inserted into headings, descriptions, or body content? +3. Check for keyword opportunities — can trending terms be naturally inserted into headings, descriptions, or body content? 4. Verify each page has a unique, descriptive title (50-60 chars) and meta description (150-160 chars). 5. Check image `alt` attributes describe the image content and include relevant keywords where natural. Apply the [SEO Starter Guide](https://developers.google.com/search/docs/fundamentals/seo-starter-guide) principles. Fix issues directly. -## Step 4 -- Audit crawling and indexing +## Step 4 — Audit crawling and indexing Reference: [Overview of crawling and indexing topics](https://developers.google.com/search/docs/crawling-indexing) -1. **robots.txt** -- Locate and review it. Verify it doesn't block important pages. Reference: [robots.txt spec](https://developers.google.com/search/docs/crawling-indexing/robots-txt) -2. **Sitemap** -- Locate the sitemap (or sitemap index). Verify all important pages are listed and no dead URLs are included. Reference: [Sitemap guidelines](https://developers.google.com/search/docs/crawling-indexing/sitemaps/large-sitemaps) -3. **Meta robots tags** -- Check for unintended `noindex` or `nofollow` directives on pages that should be indexed. +1. **robots.txt** — Locate and review it. Verify it doesn't block important pages. Reference: [robots.txt spec](https://developers.google.com/search/docs/crawling-indexing/robots-txt) +2.
**Sitemap** — Locate the sitemap (or sitemap index). Verify all important pages are listed and no dead URLs are included. Reference: [Sitemap guidelines](https://developers.google.com/search/docs/crawling-indexing/sitemaps/large-sitemaps) +3. **Meta robots tags** — Check for unintended `noindex` or `nofollow` directives on pages that should be indexed. Note: robots.txt and sitemaps are often auto-generated. If so, check the generator config rather than the output file. -## Step 5 -- Audit broken links and canonicalization +## Step 5 — Audit broken links and canonicalization Reference: [What is canonicalization](https://developers.google.com/search/docs/crawling-indexing/canonicalization) @@ -80,7 +82,7 @@ Reference: [What is canonicalization](https://developers.google.com/search/docs/ 3. Check for duplicate content accessible via multiple URLs (with/without trailing slash, www vs non-www). 4. Verify redirects use 301 (permanent) not 302 (temporary) where appropriate. -## Step 6 -- Audit mobile usability +## Step 6 — Audit mobile usability Reference: [Mobile-first indexing best practices](https://developers.google.com/search/docs/crawling-indexing/mobile/mobile-sites-mobile-first-indexing) @@ -89,7 +91,7 @@ Reference: [Mobile-first indexing best practices](https://developers.google.com/ 3. Verify touch targets are adequately sized (min 48x48px). 4. Check font sizes are readable without zooming (min 16px body text). -## Step 7 -- Audit structured data +## Step 7 — Audit structured data Reference: [Structured data guidelines](https://developers.google.com/search/docs/appearance/structured-data/sd-policies) @@ -102,7 +104,7 @@ Reference: [Structured data guidelines](https://developers.google.com/search/doc - **FAQ** for pages with question/answer content 4. Validate JSON-LD syntax is correct. 
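Step 7's "validate JSON-LD syntax" check can be automated with a small validator. A sketch under the assumption that `@context` and `@type` are the minimum top-level keys worth flagging; the `check_jsonld` helper is illustrative, not part of the audit tooling:

```python
import json

REQUIRED_KEYS = {"@context", "@type"}

def check_jsonld(snippet):
    """Return a list of problems with a JSON-LD block (empty list means it passes)."""
    try:
        data = json.loads(snippet)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc.msg}"]
    if not isinstance(data, dict):
        return ["top-level value must be an object"]
    # Report any missing required keys in a stable order.
    return [f"missing {key}" for key in sorted(REQUIRED_KEYS - data.keys())]

# A minimal Article block that passes the check:
good = json.dumps({
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "datePublished": "2025-01-01",
})
```

Run it against every `<script type="application/ld+json">` block extracted from the generated pages, then hand anything that parses cleanly to Google's Rich Results Test for the type-specific rules.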
-## Step 8 -- Audit social media cards +## Step 8 — Audit social media cards Reference: [Implementing Social Media Preview Cards](https://documentation.platformos.com/use-cases/implementing-social-media-preview-cards) @@ -124,31 +126,31 @@ Ensure that all claims are backed up with a link to a reputable source. As an ex Search for the authoritative URL and add a link to the URL. If it is not available, change the claim to something that can be substantiated. -## Step 10 -- Audit Design Compliance +## Step 10 — Audit Design Compliance Read the design system docs and view the design screens in the designsystem folder. -## Step 11 -- Test with Playwright +## Step 11 — Test with Playwright Build and run the website locally using `make website-run` (or the project's equivalent dev server command). **Desktop tests (1280x720):** -1. Navigate to the homepage -- take a screenshot. -2. Navigate to each major section -- verify pages load without errors. +1. Navigate to the homepage — take a screenshot. +2. Navigate to each major section — verify pages load without errors. 3. Check the browser console for JavaScript errors. 4. Verify all navigation links work. **Mobile tests (375x667, iPhone SE):** 1. Resize the browser to mobile dimensions. -2. Navigate to the homepage -- take a screenshot. +2. Navigate to the homepage — take a screenshot. 3. Verify the layout is responsive (no horizontal overflow, readable text). 4. Test navigation menu (hamburger menu if applicable). If any page fails to load or has console errors, fix the issue and retest. -## Step 12 -- Report findings +## Step 12 — Report findings Summarize the audit results: @@ -170,8 +172,8 @@ Summarize the audit results: ## Rules -- **Fix issues directly** -- don't just report them. Only flag issues as warnings when they require human judgment (e.g., content tone, keyword selection). -- **One step at a time** -- complete each step before moving to the next.
-- **Preserve existing content** -- improve structure and metadata without rewriting the author's voice. -- **No keyword stuffing** -- keywords must read naturally in context. -- **Respect the framework** -- edit templates/configs, not generated output files. +- **Fix issues directly** — don't just report them. Only flag issues as warnings when they require human judgment (e.g., content tone, keyword selection). +- **One step at a time** — complete each step before moving to the next. +- **Preserve existing content** — improve structure and metadata without rewriting the author's voice. +- **No keyword stuffing** — keywords must read naturally in context. +- **Respect the framework** — edit templates/configs, not generated output files. diff --git a/.clinerules/00-read-instructions.md b/.clinerules/00-read-instructions.md index 3af39da..f0ce473 100644 --- a/.clinerules/00-read-instructions.md +++ b/.clinerules/00-read-instructions.md @@ -1,4 +1,4 @@ -<!-- agent-pmo:d58c330 --> +<!-- agent-pmo:29b9dcf --> # Single Source of Truth diff --git a/.config/dotnet-tools.json b/.config/dotnet-tools.json new file mode 100644 index 0000000..3d55487 --- /dev/null +++ b/.config/dotnet-tools.json @@ -0,0 +1,41 @@ +{ + "version": 1, + "isRoot": true, + "tools": { + "csharpier": { + "version": "1.2.6", + "commands": [ + "csharpier" + ], + "rollForward": false + }, + "dataprovidermigrate": { + "version": "0.9.5-beta", + "commands": [ + "DataProviderMigrate" + ], + "rollForward": false + }, + "dataprovider": { + "version": "0.9.5-beta", + "commands": [ + "DataProvider" + ], + "rollForward": false + }, + "lql": { + "version": "0.9.5-beta", + "commands": [ + "Lql" + ], + "rollForward": false + }, + "h5-compiler": { + "version": "26.3.64893", + "commands": [ + "h5" + ], + "rollForward": false + } + } +} \ No newline at end of file diff --git a/.github/copilot-instructions.md b/.github/copilot-instructions.md index 0e46e98..7595149 100644 --- a/.github/copilot-instructions.md +++ 
b/.github/copilot-instructions.md @@ -1,4 +1,4 @@ -<!-- agent-pmo:d58c330 --> +<!-- agent-pmo:29b9dcf --> @CLAUDE.md diff --git a/.github/pull_request_template.md b/.github/pull_request_template.md index cddd9c5..135e75c 100644 --- a/.github/pull_request_template.md +++ b/.github/pull_request_template.md @@ -1,22 +1,11 @@ -<!-- agent-pmo:d58c330 --> +<!-- agent-pmo:29b9dcf --> ## TLDR <!-- One sentence: what does this PR do? --> -## What Was Added? -<!-- New functionality, new files, new dependencies. Delete section if nothing new. --> - -## What Was Changed or Deleted? -<!-- Modified behaviour, removed code, breaking changes. --> +## Details +<!-- New functionality, new files, new dependencies. What changed? --> ## How Do The Automated Tests Prove It Works? <!-- Name specific tests or describe what the test output demonstrates. --> <!-- "Tests pass" is not acceptable. Be specific. --> - -## Spec / Doc Changes -<!-- If any spec, CLAUDE.md, README, or doc was updated, summarise here. --> -<!-- Delete section if no docs changed. 
--> - -## Breaking Changes -- [ ] None -<!-- Or describe any breaking API / behaviour changes below --> diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 206624a..37e8a80 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -1,4 +1,4 @@ -# agent-pmo:d58c330 +# agent-pmo:29b9dcf name: CI on: @@ -12,10 +12,14 @@ concurrency: cancel-in-progress: true jobs: - lint: - name: Lint + ci: + name: CI runs-on: ubuntu-latest - timeout-minutes: 10 + timeout-minutes: 30 + env: + DB_PASSWORD: changeme + TEST_POSTGRES_CONNECTION: Host=localhost;Database=postgres;Username=postgres;Password=changeme + ICD10_TEST_CONNECTION_STRING: Host=localhost;Database=icd10;Username=postgres;Password=changeme steps: - uses: actions/checkout@v4 @@ -26,30 +30,53 @@ jobs: - run: dotnet restore - run: dotnet tool restore - - name: Lint - run: make lint + # csharpier needs no DB -- run it first so format issues fail + # before paying the cost of docker, codegen, or Playwright. + - name: Format check + run: make fmt-check - test: - name: Test - runs-on: ubuntu-latest - timeout-minutes: 10 - needs: lint - steps: - - uses: actions/checkout@v4 + - name: Start Postgres (pgvector) via docker compose + run: make db-up - - uses: actions/setup-dotnet@v4 - with: - dotnet-version: '10.0.x' + - name: Migrate Postgres schemas + run: make db-migrate - - run: dotnet restore - - run: dotnet tool restore + # `make lint` runs the full Release build, which triggers + # `dotnet DataProvider postgres` codegen against the live database, + # so it has to come after db-up + db-migrate. Still kept ahead of + # the embedding service / Playwright steps to fail fast on warnings. 
+ - name: Lint + run: make lint + + - name: Start embedding service + run: | + cd ICD10/embedding-service + docker compose up -d --build + # Wait until /health responds 200 (model load can take ~60s) + for i in $(seq 1 60); do + if curl -sf http://localhost:8000/health > /dev/null; then + echo "Embedding service ready" + exit 0 + fi + sleep 2 + done + echo "Embedding service failed to become healthy" + docker compose logs + exit 1 + + - name: Install Playwright browsers + run: | + dotnet build Dashboard/Dashboard.Integration.Tests/Dashboard.Integration.Tests.csproj --configuration Release + dotnet tool install --global Microsoft.Playwright.CLI || true + export PATH="$PATH:$HOME/.dotnet/tools" + playwright install --with-deps chromium - name: Test run: make test + # Per-project thresholds live in coverage-thresholds.json (default 80%). + # Bump entries by floor(measured) - 1 whenever real coverage improves. - name: Coverage check - env: - COVERAGE_THRESHOLD: ${{ vars.COVERAGE_THRESHOLD_DOTNET || '80' }} run: make coverage-check - name: Upload coverage @@ -61,19 +88,5 @@ jobs: TestResults/**/coverage.* retention-days: 7 - build: - name: Build - runs-on: ubuntu-latest - timeout-minutes: 10 - needs: test - steps: - - uses: actions/checkout@v4 - - - uses: actions/setup-dotnet@v4 - with: - dotnet-version: '10.0.x' - - - run: dotnet restore - - name: Build run: make build diff --git a/.gitignore b/.gitignore index f2ac821..b1a18a9 100644 --- a/.gitignore +++ b/.gitignore @@ -1,4 +1,4 @@ -# agent-pmo:d58c330 +# agent-pmo:29b9dcf # ============================================================================= # UNIVERSAL @@ -71,6 +71,16 @@ obj/ out/ publish/ artifacts/ + +# DataProvider codegen output (regenerated each build by `dotnet DataProvider postgres`) +**/Generated/ +**/Generated/**/*.g.cs +**/Generated/.timestamp +**/*.generated.sql +# Local SQLite databases (no longer used; project is on Postgres) +*.db +*.db-shm +*.db-wal *.user *.userosscache *.suo @@ -79,8 
+89,13 @@ artifacts/ project.lock.json *.nupkg *.snupkg +!nupkgs/*.nupkg **/packages/* !**/packages/build/ !**/packages/repositories.config *.TargetFrameworkMonikers *.dotCover + + +*.nupkg +*.generated.sql \ No newline at end of file diff --git a/AGENTS.md b/AGENTS.md index 3af39da..f0ce473 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -1,4 +1,4 @@ -<!-- agent-pmo:d58c330 --> +<!-- agent-pmo:29b9dcf --> # Single Source of Truth diff --git a/CLAUDE.md b/CLAUDE.md index 9ca39ee..cc4d404 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -1,10 +1,14 @@ -<!-- agent-pmo:d58c330 --> - # HealthcareSamples -- Agent Instructions +⚠️ CRITICAL: **Reduce token usage.** Check file size before loading. Write less. Delete fluff and dead code. Alert user when context is loaded with pointless files. ⚠️ + +⚠️ MIGRATING ANY DB WITH ANYTHING OTHER THAN Data Provider Migrations is COMPLETELY ILLEGAL ⚠️ + > Read this entire file before writing any code. > These rules are NON-NEGOTIABLE. Violations will be rejected in review. +<!-- agent-pmo:29b9dcf --> + ## Project Overview HealthcareSamples is a comprehensive demonstration of the DataProvider .NET toolkit. It contains three FHIR-compliant microservices (Clinical API, Scheduling API, ICD-10 API) with bidirectional sync workers, semantic search via pgvector embeddings, a React dashboard (H5 transpiler), and Docker configuration. All medical data follows the FHIR R5 specification. @@ -44,7 +48,7 @@ If the TMC server is available: ## Logging Standards -- **Use a structured logging library.** Never use `Console.WriteLine` or `Debug.WriteLine` for diagnostics. Use `Microsoft.Extensions.Logging` with Serilog. +- **Use a structured logging library.** Never use `Console.WriteLine` or `Debug.WriteLine` for diagnostics. Use `Microsoft.Extensions.Logging`. - **Log at entry/exit of all significant operations.** Use appropriate levels: `error`, `warn`, `info`, `debug`, `trace`. 
- **Logging must be throughout the app.** Every service, handler, and non-trivial operation should log. Silent failures are forbidden. - **SaaS / server apps:** Log to the database for persistence and queryability. Log calls that write to the database or file MUST be async or run on a background thread -- never block the request path with I/O logging. @@ -56,7 +60,7 @@ If the TMC server is available: | Language | Library | Notes | |----------|---------|-------| -| C# | `Microsoft.Extensions.Logging` | With Serilog for structured output | +| C# | `Microsoft.Extensions.Logging` | | ## Hard Rules -- C# diff --git a/Clinical/Clinical.Api/Clinical.Api.csproj b/Clinical/Clinical.Api/Clinical.Api.csproj index 55ca32b..7f4cce3 100644 --- a/Clinical/Clinical.Api/Clinical.Api.csproj +++ b/Clinical/Clinical.Api/Clinical.Api.csproj @@ -1,7 +1,7 @@ <Project Sdk="Microsoft.NET.Sdk.Web"> <PropertyGroup> <OutputType>Exe</OutputType> - <NoWarn>CA1515;CA2100;RS1035;CA1508;CA2234</NoWarn> + <NoWarn>$(NoWarn);CA1515;CA2100;RS1035;CA1508;CA2234;CS1591</NoWarn> <EnableLqlTranspile>true</EnableLqlTranspile> </PropertyGroup> @@ -12,12 +12,17 @@ <ItemGroup> <PackageReference Include="Npgsql" Version="9.0.2" /> - <PackageReference Include="MelbourneDev.DataProvider" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Lql.Postgres" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Sync.Postgres" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Selecta" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Migration" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Migration.Postgres" Version="0.1.0" /> + <PackageReference Include="Nimblesite.DataProvider.Core" Version="$(DataProviderVersion)" /> + <PackageReference Include="Nimblesite.Lql.Postgres" Version="$(DataProviderVersion)" /> + <PackageReference Include="Nimblesite.Sync.Postgres" Version="$(DataProviderVersion)" /> + <PackageReference + 
Include="Nimblesite.DataProvider.Migration.Core" + Version="$(DataProviderVersion)" + /> + <PackageReference + Include="Nimblesite.DataProvider.Migration.Postgres" + Version="$(DataProviderVersion)" + /> </ItemGroup> <ItemGroup> @@ -34,19 +39,10 @@ </Content> </ItemGroup> - <!-- Create database from YAML using Migration.Cli (installed as dotnet tool) --> - <Target Name="CreateDatabaseSchema" BeforeTargets="TranspileLqlAndGenerateDataProvider"> - <Exec - Command="dotnet migration-cli -- --schema "$(MSBuildProjectDirectory)/clinical-schema.yaml" --output "$(MSBuildProjectDirectory)/clinical.db" --provider sqlite" - WorkingDirectory="$(MSBuildProjectDirectory)" - StandardOutputImportance="High" - StandardErrorImportance="High" - /> - </Target> - - <!-- Pre-compile: transpile LQL to SQL, then generate C# from SQL using CLI tools --> + <!-- Pre-compile: transpile LQL to SQL, then generate C# via `dotnet DataProvider postgres`. + Requires a live Postgres with the clinical schema migrated (see `make db-migrate`). 
--> <Target - Name="TranspileLqlAndGenerateDataProvider" + Name="GenerateDataProvider" BeforeTargets="BeforeCompile;CoreCompile" Inputs="$(MSBuildProjectDirectory)/DataProvider.json;@(AdditionalFiles);@(LqlFiles)" Outputs="$(MSBuildProjectDirectory)/Generated/.timestamp" @@ -58,19 +54,17 @@ </ItemGroup> <Message Importance="High" Text="Transpiling LQL files (@(LqlFiles))" /> <Exec - Command="dotnet lqlcli-sqlite -- --input "%(LqlFiles.Identity)" --output "%(LqlFiles.RootDir)%(LqlFiles.Directory)%(LqlFiles.Filename).generated.sql"" + Command="dotnet Lql postgres --input "%(LqlFiles.Identity)" --output "%(LqlFiles.RootDir)%(LqlFiles.Directory)%(LqlFiles.Filename).generated.sql"" Condition="'$(EnableLqlTranspile)' == 'true' and @(LqlFiles) != ''" WorkingDirectory="$(MSBuildProjectDirectory)" StandardOutputImportance="High" StandardErrorImportance="High" - ContinueOnError="WarnAndContinue" /> <Exec - Command="dotnet dataprovider-sqlite-cli -- --project-dir "$(MSBuildProjectDirectory)" --config "$(MSBuildProjectDirectory)/DataProvider.json" --out "$(MSBuildProjectDirectory)/Generated" --connection-type NpgsqlConnection" + Command="dotnet DataProvider postgres --project-dir "$(MSBuildProjectDirectory)" --config "$(MSBuildProjectDirectory)/DataProvider.json" --out "$(MSBuildProjectDirectory)/Generated"" WorkingDirectory="$(MSBuildProjectDirectory)" StandardOutputImportance="High" StandardErrorImportance="High" - IgnoreExitCode="true" /> <Touch Files="$(MSBuildProjectDirectory)/Generated/.timestamp" AlwaysCreate="true" /> <ItemGroup> diff --git a/Clinical/Clinical.Api/DataProvider.json b/Clinical/Clinical.Api/DataProvider.json index 3c6fe7d..fc5feeb 100644 --- a/Clinical/Clinical.Api/DataProvider.json +++ b/Clinical/Clinical.Api/DataProvider.json @@ -1,67 +1,71 @@ { - "queries": [ - { - "name": "GetPatients", - "sqlFile": "Queries/GetPatients.generated.sql" - }, - { - "name": "GetPatientById", - "sqlFile": "Queries/GetPatientById.generated.sql" - }, - { - "name": 
"SearchPatients", - "sqlFile": "Queries/SearchPatients.generated.sql" - }, - { - "name": "GetEncountersByPatient", - "sqlFile": "Queries/GetEncountersByPatient.generated.sql" - }, - { - "name": "GetConditionsByPatient", - "sqlFile": "Queries/GetConditionsByPatient.generated.sql" - }, - { - "name": "GetMedicationsByPatient", - "sqlFile": "Queries/GetMedicationsByPatient.generated.sql" - } - ], - "tables": [ - { - "schema": "main", - "name": "fhir_Patient", - "generateInsert": true, - "generateUpdate": true, - "generateDelete": true, - "excludeColumns": ["Id"], - "primaryKeyColumns": ["Id"] - }, - { - "schema": "main", - "name": "fhir_Encounter", - "generateInsert": true, - "generateUpdate": false, - "generateDelete": false, - "excludeColumns": ["Id"], - "primaryKeyColumns": ["Id"] - }, - { - "schema": "main", - "name": "fhir_Condition", - "generateInsert": true, - "generateUpdate": false, - "generateDelete": false, - "excludeColumns": ["Id"], - "primaryKeyColumns": ["Id"] - }, - { - "schema": "main", - "name": "fhir_MedicationRequest", - "generateInsert": true, - "generateUpdate": false, - "generateDelete": false, - "excludeColumns": ["Id"], - "primaryKeyColumns": ["Id"] - } - ], - "connectionString": "Data Source=clinical.db" -} + "queries": [ + { + "name": "GetPatients", + "sqlFile": "Queries/GetPatients.generated.sql" + }, + { + "name": "GetPatientById", + "sqlFile": "Queries/GetPatientById.generated.sql" + }, + { + "name": "SearchPatients", + "sqlFile": "Queries/SearchPatients.generated.sql" + }, + { + "name": "GetEncountersByPatient", + "sqlFile": "Queries/GetEncountersByPatient.generated.sql" + }, + { + "name": "GetConditionsByPatient", + "sqlFile": "Queries/GetConditionsByPatient.generated.sql" + }, + { + "name": "GetMedicationsByPatient", + "sqlFile": "Queries/GetMedicationsByPatient.generated.sql" + } + ], + "tables": [ + { + "schema": "public", + "name": "fhir_patient", + "generateInsert": true, + "generateUpdate": true, + "generateDelete": true, + 
"primaryKeyColumns": [ + "Id" + ] + }, + { + "schema": "public", + "name": "fhir_encounter", + "generateInsert": true, + "generateUpdate": false, + "generateDelete": false, + "primaryKeyColumns": [ + "Id" + ] + }, + { + "schema": "public", + "name": "fhir_condition", + "generateInsert": true, + "generateUpdate": false, + "generateDelete": false, + "primaryKeyColumns": [ + "Id" + ] + }, + { + "schema": "public", + "name": "fhir_medicationrequest", + "generateInsert": true, + "generateUpdate": false, + "generateDelete": false, + "primaryKeyColumns": [ + "Id" + ] + } + ], + "connectionString": "Host=localhost;Port=5432;Database=clinical;Username=postgres;Password=changeme" +} \ No newline at end of file diff --git a/Clinical/Clinical.Api/DatabaseSetup.cs b/Clinical/Clinical.Api/DatabaseSetup.cs index d923560..9ae91e6 100644 --- a/Clinical/Clinical.Api/DatabaseSetup.cs +++ b/Clinical/Clinical.Api/DatabaseSetup.cs @@ -1,5 +1,5 @@ -using Migration; -using Migration.Postgres; +using Nimblesite.DataProvider.Migration.Core; +using Nimblesite.DataProvider.Migration.Postgres; using InitError = Outcome.Result<bool, string>.Error<bool, string>; using InitOk = Outcome.Result<bool, string>.Ok<bool, string>; using InitResult = Outcome.Result<bool, string>; @@ -38,16 +38,7 @@ public static InitResult Initialize(NpgsqlConnection connection, ILogger logger) { var yamlPath = Path.Combine(AppContext.BaseDirectory, "clinical-schema.yaml"); var schema = SchemaYamlSerializer.FromYamlFile(yamlPath); - - foreach (var table in schema.Tables) - { - var ddl = PostgresDdlGenerator.Generate(new CreateTableOperation(table)); - using var cmd = connection.CreateCommand(); - cmd.CommandText = ddl; - cmd.ExecuteNonQuery(); - logger.Log(LogLevel.Debug, "Created table {TableName}", table.Name); - } - + PostgresDdlGenerator.MigrateSchema(connection, schema); logger.Log(LogLevel.Information, "Created Clinical database schema from YAML"); } catch (Exception ex) diff --git 
a/Clinical/Clinical.Api/Generated/.timestamp b/Clinical/Clinical.Api/Generated/.timestamp deleted file mode 100644 index e69de29..0000000 diff --git a/Clinical/Clinical.Api/Generated/GetConditionsByPatient.g.cs b/Clinical/Clinical.Api/Generated/GetConditionsByPatient.g.cs deleted file mode 100644 index e23e6af..0000000 --- a/Clinical/Clinical.Api/Generated/GetConditionsByPatient.g.cs +++ /dev/null @@ -1,163 +0,0 @@ -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'GetConditionsByPatient'. -/// </summary> -public static partial class GetConditionsByPatientExtensions -{ - /// <summary> - /// Executes 'GetConditionsByPatient.sql' and maps results. - /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <param name="patientId">Query parameter.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<GetConditionsByPatient>, SqlError>> GetConditionsByPatientAsync(this NpgsqlConnection connection, object patientId) - { - const string sql = @"SELECT fhir_Condition.Id, fhir_Condition.ClinicalStatus, fhir_Condition.VerificationStatus, fhir_Condition.Category, fhir_Condition.Severity, fhir_Condition.CodeSystem, fhir_Condition.CodeValue, fhir_Condition.CodeDisplay, fhir_Condition.SubjectReference, fhir_Condition.EncounterReference, fhir_Condition.OnsetDateTime, fhir_Condition.RecordedDate, fhir_Condition.RecorderReference, fhir_Condition.NoteText, fhir_Condition.LastUpdated, fhir_Condition.VersionId FROM fhir_Condition WHERE fhir_Condition.SubjectReference = @patientId ORDER BY fhir_Condition.RecordedDate DESC"; - - try - { - var results = ImmutableList.CreateBuilder<GetConditionsByPatient>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - if (patientId is not null and not DBNull) 
- command.Parameters.AddWithValue("@patientId", patientId); - else - command.Parameters.Add(new NpgsqlParameter("@patientId", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - - using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false)) - { - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new GetConditionsByPatient( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1), - reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2), - reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3), - reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4), - reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5), - reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6), - reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7), - reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8), - reader.IsDBNull(9) ? null : reader.GetFieldValue<string>(9), - reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10), - reader.IsDBNull(11) ? null : reader.GetFieldValue<string>(11), - reader.IsDBNull(12) ? null : reader.GetFieldValue<string>(12), - reader.IsDBNull(13) ? null : reader.GetFieldValue<string>(13), - reader.IsDBNull(14) ? null : reader.GetFieldValue<string>(14), - reader.IsDBNull(15) ? default(long) : reader.GetFieldValue<long>(15) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<GetConditionsByPatient>, SqlError>.Ok<ImmutableList<GetConditionsByPatient>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<GetConditionsByPatient>, SqlError>.Error<ImmutableList<GetConditionsByPatient>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'GetConditionsByPatient' query. 
-/// </summary> -public record GetConditionsByPatient -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'ClinicalStatus'.</summary> - public string ClinicalStatus { get; init; } - - /// <summary>Column 'VerificationStatus'.</summary> - public string VerificationStatus { get; init; } - - /// <summary>Column 'Category'.</summary> - public string Category { get; init; } - - /// <summary>Column 'Severity'.</summary> - public string Severity { get; init; } - - /// <summary>Column 'CodeSystem'.</summary> - public string CodeSystem { get; init; } - - /// <summary>Column 'CodeValue'.</summary> - public string CodeValue { get; init; } - - /// <summary>Column 'CodeDisplay'.</summary> - public string CodeDisplay { get; init; } - - /// <summary>Column 'SubjectReference'.</summary> - public string SubjectReference { get; init; } - - /// <summary>Column 'EncounterReference'.</summary> - public string EncounterReference { get; init; } - - /// <summary>Column 'OnsetDateTime'.</summary> - public string OnsetDateTime { get; init; } - - /// <summary>Column 'RecordedDate'.</summary> - public string RecordedDate { get; init; } - - /// <summary>Column 'RecorderReference'.</summary> - public string RecorderReference { get; init; } - - /// <summary>Column 'NoteText'.</summary> - public string NoteText { get; init; } - - /// <summary>Column 'LastUpdated'.</summary> - public string LastUpdated { get; init; } - - /// <summary>Column 'VersionId'.</summary> - public long VersionId { get; init; } - - /// <summary>Initializes a new instance of GetConditionsByPatient.</summary> - public GetConditionsByPatient( - string Id, - string ClinicalStatus, - string VerificationStatus, - string Category, - string Severity, - string CodeSystem, - string CodeValue, - string CodeDisplay, - string SubjectReference, - string EncounterReference, - string OnsetDateTime, - string RecordedDate, - string RecorderReference, - string NoteText, - string LastUpdated, - 
long VersionId - ) - { - this.Id = Id; - this.ClinicalStatus = ClinicalStatus; - this.VerificationStatus = VerificationStatus; - this.Category = Category; - this.Severity = Severity; - this.CodeSystem = CodeSystem; - this.CodeValue = CodeValue; - this.CodeDisplay = CodeDisplay; - this.SubjectReference = SubjectReference; - this.EncounterReference = EncounterReference; - this.OnsetDateTime = OnsetDateTime; - this.RecordedDate = RecordedDate; - this.RecorderReference = RecorderReference; - this.NoteText = NoteText; - this.LastUpdated = LastUpdated; - this.VersionId = VersionId; - } -} diff --git a/Clinical/Clinical.Api/Generated/GetEncountersByPatient.g.cs b/Clinical/Clinical.Api/Generated/GetEncountersByPatient.g.cs deleted file mode 100644 index 15b16da..0000000 --- a/Clinical/Clinical.Api/Generated/GetEncountersByPatient.g.cs +++ /dev/null @@ -1,139 +0,0 @@ -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'GetEncountersByPatient'. -/// </summary> -public static partial class GetEncountersByPatientExtensions -{ - /// <summary> - /// Executes 'GetEncountersByPatient.sql' and maps results. 
- /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <param name="patientId">Query parameter.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<GetEncountersByPatient>, SqlError>> GetEncountersByPatientAsync(this NpgsqlConnection connection, object patientId) - { - const string sql = @"SELECT fhir_Encounter.Id, fhir_Encounter.Status, fhir_Encounter.Class, fhir_Encounter.PatientId, fhir_Encounter.PractitionerId, fhir_Encounter.ServiceType, fhir_Encounter.ReasonCode, fhir_Encounter.PeriodStart, fhir_Encounter.PeriodEnd, fhir_Encounter.Notes, fhir_Encounter.LastUpdated, fhir_Encounter.VersionId FROM fhir_Encounter WHERE fhir_Encounter.PatientId = @patientId ORDER BY fhir_Encounter.PeriodStart DESC"; - - try - { - var results = ImmutableList.CreateBuilder<GetEncountersByPatient>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - if (patientId is not null and not DBNull) - command.Parameters.AddWithValue("@patientId", patientId); - else - command.Parameters.Add(new NpgsqlParameter("@patientId", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - - using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false)) - { - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new GetEncountersByPatient( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1), - reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2), - reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3), - reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4), - reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5), - reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6), - reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7), - reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8), - reader.IsDBNull(9) ? 
null : reader.GetFieldValue<string>(9), - reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10), - reader.IsDBNull(11) ? default(long) : reader.GetFieldValue<long>(11) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<GetEncountersByPatient>, SqlError>.Ok<ImmutableList<GetEncountersByPatient>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<GetEncountersByPatient>, SqlError>.Error<ImmutableList<GetEncountersByPatient>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'GetEncountersByPatient' query. -/// </summary> -public record GetEncountersByPatient -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'Status'.</summary> - public string Status { get; init; } - - /// <summary>Column 'Class'.</summary> - public string Class { get; init; } - - /// <summary>Column 'PatientId'.</summary> - public string PatientId { get; init; } - - /// <summary>Column 'PractitionerId'.</summary> - public string PractitionerId { get; init; } - - /// <summary>Column 'ServiceType'.</summary> - public string ServiceType { get; init; } - - /// <summary>Column 'ReasonCode'.</summary> - public string ReasonCode { get; init; } - - /// <summary>Column 'PeriodStart'.</summary> - public string PeriodStart { get; init; } - - /// <summary>Column 'PeriodEnd'.</summary> - public string PeriodEnd { get; init; } - - /// <summary>Column 'Notes'.</summary> - public string Notes { get; init; } - - /// <summary>Column 'LastUpdated'.</summary> - public string LastUpdated { get; init; } - - /// <summary>Column 'VersionId'.</summary> - public long VersionId { get; init; } - - /// <summary>Initializes a new instance of GetEncountersByPatient.</summary> - public GetEncountersByPatient( - string Id, - string Status, - string Class, - string PatientId, - string PractitionerId, - string ServiceType, - string ReasonCode, - string PeriodStart, 
- string PeriodEnd, - string Notes, - string LastUpdated, - long VersionId - ) - { - this.Id = Id; - this.Status = Status; - this.Class = Class; - this.PatientId = PatientId; - this.PractitionerId = PractitionerId; - this.ServiceType = ServiceType; - this.ReasonCode = ReasonCode; - this.PeriodStart = PeriodStart; - this.PeriodEnd = PeriodEnd; - this.Notes = Notes; - this.LastUpdated = LastUpdated; - this.VersionId = VersionId; - } -} diff --git a/Clinical/Clinical.Api/Generated/GetMedicationsByPatient.g.cs b/Clinical/Clinical.Api/Generated/GetMedicationsByPatient.g.cs deleted file mode 100644 index 65fe481..0000000 --- a/Clinical/Clinical.Api/Generated/GetMedicationsByPatient.g.cs +++ /dev/null @@ -1,157 +0,0 @@ -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'GetMedicationsByPatient'. -/// </summary> -public static partial class GetMedicationsByPatientExtensions -{ - /// <summary> - /// Executes 'GetMedicationsByPatient.sql' and maps results. 
- /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <param name="patientId">Query parameter.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<GetMedicationsByPatient>, SqlError>> GetMedicationsByPatientAsync(this NpgsqlConnection connection, object patientId) - { - const string sql = @"SELECT fhir_MedicationRequest.Id, fhir_MedicationRequest.Status, fhir_MedicationRequest.Intent, fhir_MedicationRequest.PatientId, fhir_MedicationRequest.PractitionerId, fhir_MedicationRequest.EncounterId, fhir_MedicationRequest.MedicationCode, fhir_MedicationRequest.MedicationDisplay, fhir_MedicationRequest.DosageInstruction, fhir_MedicationRequest.Quantity, fhir_MedicationRequest.Unit, fhir_MedicationRequest.Refills, fhir_MedicationRequest.AuthoredOn, fhir_MedicationRequest.LastUpdated, fhir_MedicationRequest.VersionId FROM fhir_MedicationRequest WHERE fhir_MedicationRequest.PatientId = @patientId ORDER BY fhir_MedicationRequest.AuthoredOn DESC"; - - try - { - var results = ImmutableList.CreateBuilder<GetMedicationsByPatient>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - if (patientId is not null and not DBNull) - command.Parameters.AddWithValue("@patientId", patientId); - else - command.Parameters.Add(new NpgsqlParameter("@patientId", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - - using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false)) - { - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new GetMedicationsByPatient( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1), - reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2), - reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3), - reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4), - reader.IsDBNull(5) ? 
null : reader.GetFieldValue<string>(5), - reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6), - reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7), - reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8), - reader.IsDBNull(9) ? default(double) : reader.GetFieldValue<double>(9), - reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10), - reader.IsDBNull(11) ? default(long) : reader.GetFieldValue<long>(11), - reader.IsDBNull(12) ? null : reader.GetFieldValue<string>(12), - reader.IsDBNull(13) ? null : reader.GetFieldValue<string>(13), - reader.IsDBNull(14) ? default(long) : reader.GetFieldValue<long>(14) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<GetMedicationsByPatient>, SqlError>.Ok<ImmutableList<GetMedicationsByPatient>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<GetMedicationsByPatient>, SqlError>.Error<ImmutableList<GetMedicationsByPatient>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'GetMedicationsByPatient' query. 
-/// </summary> -public record GetMedicationsByPatient -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'Status'.</summary> - public string Status { get; init; } - - /// <summary>Column 'Intent'.</summary> - public string Intent { get; init; } - - /// <summary>Column 'PatientId'.</summary> - public string PatientId { get; init; } - - /// <summary>Column 'PractitionerId'.</summary> - public string PractitionerId { get; init; } - - /// <summary>Column 'EncounterId'.</summary> - public string EncounterId { get; init; } - - /// <summary>Column 'MedicationCode'.</summary> - public string MedicationCode { get; init; } - - /// <summary>Column 'MedicationDisplay'.</summary> - public string MedicationDisplay { get; init; } - - /// <summary>Column 'DosageInstruction'.</summary> - public string DosageInstruction { get; init; } - - /// <summary>Column 'Quantity'.</summary> - public double Quantity { get; init; } - - /// <summary>Column 'Unit'.</summary> - public string Unit { get; init; } - - /// <summary>Column 'Refills'.</summary> - public long Refills { get; init; } - - /// <summary>Column 'AuthoredOn'.</summary> - public string AuthoredOn { get; init; } - - /// <summary>Column 'LastUpdated'.</summary> - public string LastUpdated { get; init; } - - /// <summary>Column 'VersionId'.</summary> - public long VersionId { get; init; } - - /// <summary>Initializes a new instance of GetMedicationsByPatient.</summary> - public GetMedicationsByPatient( - string Id, - string Status, - string Intent, - string PatientId, - string PractitionerId, - string EncounterId, - string MedicationCode, - string MedicationDisplay, - string DosageInstruction, - double Quantity, - string Unit, - long Refills, - string AuthoredOn, - string LastUpdated, - long VersionId - ) - { - this.Id = Id; - this.Status = Status; - this.Intent = Intent; - this.PatientId = PatientId; - this.PractitionerId = PractitionerId; - this.EncounterId = EncounterId; - 
this.MedicationCode = MedicationCode; - this.MedicationDisplay = MedicationDisplay; - this.DosageInstruction = DosageInstruction; - this.Quantity = Quantity; - this.Unit = Unit; - this.Refills = Refills; - this.AuthoredOn = AuthoredOn; - this.LastUpdated = LastUpdated; - this.VersionId = VersionId; - } -} diff --git a/Clinical/Clinical.Api/Generated/GetPatientById.g.cs b/Clinical/Clinical.Api/Generated/GetPatientById.g.cs deleted file mode 100644 index 5b2ce69..0000000 --- a/Clinical/Clinical.Api/Generated/GetPatientById.g.cs +++ /dev/null @@ -1,157 +0,0 @@ -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'GetPatientById'. -/// </summary> -public static partial class GetPatientByIdExtensions -{ - /// <summary> - /// Executes 'GetPatientById.sql' and maps results. - /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <param name="id">Query parameter.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<GetPatientById>, SqlError>> GetPatientByIdAsync(this NpgsqlConnection connection, object id) - { - const string sql = @"SELECT fhir_Patient.Id, fhir_Patient.Active, fhir_Patient.GivenName, fhir_Patient.FamilyName, fhir_Patient.BirthDate, fhir_Patient.Gender, fhir_Patient.Phone, fhir_Patient.Email, fhir_Patient.AddressLine, fhir_Patient.City, fhir_Patient.State, fhir_Patient.PostalCode, fhir_Patient.Country, fhir_Patient.LastUpdated, fhir_Patient.VersionId FROM fhir_Patient WHERE fhir_Patient.Id = @id"; - - try - { - var results = ImmutableList.CreateBuilder<GetPatientById>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - if (id is not null and not DBNull) - command.Parameters.AddWithValue("@id", id); - else - command.Parameters.Add(new NpgsqlParameter("@id", 
NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - - using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false)) - { - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new GetPatientById( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? default(long) : reader.GetFieldValue<long>(1), - reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2), - reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3), - reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4), - reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5), - reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6), - reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7), - reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8), - reader.IsDBNull(9) ? null : reader.GetFieldValue<string>(9), - reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10), - reader.IsDBNull(11) ? null : reader.GetFieldValue<string>(11), - reader.IsDBNull(12) ? null : reader.GetFieldValue<string>(12), - reader.IsDBNull(13) ? null : reader.GetFieldValue<string>(13), - reader.IsDBNull(14) ? default(long) : reader.GetFieldValue<long>(14) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<GetPatientById>, SqlError>.Ok<ImmutableList<GetPatientById>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<GetPatientById>, SqlError>.Error<ImmutableList<GetPatientById>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'GetPatientById' query. 
-/// </summary> -public record GetPatientById -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'Active'.</summary> - public long Active { get; init; } - - /// <summary>Column 'GivenName'.</summary> - public string GivenName { get; init; } - - /// <summary>Column 'FamilyName'.</summary> - public string FamilyName { get; init; } - - /// <summary>Column 'BirthDate'.</summary> - public string BirthDate { get; init; } - - /// <summary>Column 'Gender'.</summary> - public string Gender { get; init; } - - /// <summary>Column 'Phone'.</summary> - public string Phone { get; init; } - - /// <summary>Column 'Email'.</summary> - public string Email { get; init; } - - /// <summary>Column 'AddressLine'.</summary> - public string AddressLine { get; init; } - - /// <summary>Column 'City'.</summary> - public string City { get; init; } - - /// <summary>Column 'State'.</summary> - public string State { get; init; } - - /// <summary>Column 'PostalCode'.</summary> - public string PostalCode { get; init; } - - /// <summary>Column 'Country'.</summary> - public string Country { get; init; } - - /// <summary>Column 'LastUpdated'.</summary> - public string LastUpdated { get; init; } - - /// <summary>Column 'VersionId'.</summary> - public long VersionId { get; init; } - - /// <summary>Initializes a new instance of GetPatientById.</summary> - public GetPatientById( - string Id, - long Active, - string GivenName, - string FamilyName, - string BirthDate, - string Gender, - string Phone, - string Email, - string AddressLine, - string City, - string State, - string PostalCode, - string Country, - string LastUpdated, - long VersionId - ) - { - this.Id = Id; - this.Active = Active; - this.GivenName = GivenName; - this.FamilyName = FamilyName; - this.BirthDate = BirthDate; - this.Gender = Gender; - this.Phone = Phone; - this.Email = Email; - this.AddressLine = AddressLine; - this.City = City; - this.State = State; - this.PostalCode = PostalCode; - 
this.Country = Country; - this.LastUpdated = LastUpdated; - this.VersionId = VersionId; - } -} diff --git a/Clinical/Clinical.Api/Generated/GetPatients.g.cs b/Clinical/Clinical.Api/Generated/GetPatients.g.cs deleted file mode 100644 index 218223f..0000000 --- a/Clinical/Clinical.Api/Generated/GetPatients.g.cs +++ /dev/null @@ -1,172 +0,0 @@ -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'GetPatients'. -/// </summary> -public static partial class GetPatientsExtensions -{ - /// <summary> - /// Executes 'GetPatients.sql' and maps results. - /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <param name="active">Query parameter.</param> - /// <param name="familyName">Query parameter.</param> - /// <param name="givenName">Query parameter.</param> - /// <param name="gender">Query parameter.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<GetPatients>, SqlError>> GetPatientsAsync(this NpgsqlConnection connection, object active, object familyName, object givenName, object gender) - { - const string sql = @"SELECT fhir_Patient.Id, fhir_Patient.Active, fhir_Patient.GivenName, fhir_Patient.FamilyName, fhir_Patient.BirthDate, fhir_Patient.Gender, fhir_Patient.Phone, fhir_Patient.Email, fhir_Patient.AddressLine, fhir_Patient.City, fhir_Patient.State, fhir_Patient.PostalCode, fhir_Patient.Country, fhir_Patient.LastUpdated, fhir_Patient.VersionId FROM fhir_Patient WHERE (@active IS NULL OR fhir_Patient.Active = @active) AND (@familyName IS NULL OR fhir_Patient.FamilyName LIKE '%' || @familyName || '%') AND (@givenName IS NULL OR fhir_Patient.GivenName LIKE '%' || @givenName || '%') AND (@gender IS NULL OR fhir_Patient.Gender = @gender) ORDER BY fhir_Patient.FamilyName , 
fhir_Patient.GivenName "; - - try - { - var results = ImmutableList.CreateBuilder<GetPatients>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - if (active is not null and not DBNull) - command.Parameters.AddWithValue("@active", active); - else - command.Parameters.Add(new NpgsqlParameter("@active", NpgsqlTypes.NpgsqlDbType.Bigint) { Value = DBNull.Value }); - if (familyName is not null and not DBNull) - command.Parameters.AddWithValue("@familyName", familyName); - else - command.Parameters.Add(new NpgsqlParameter("@familyName", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - if (givenName is not null and not DBNull) - command.Parameters.AddWithValue("@givenName", givenName); - else - command.Parameters.Add(new NpgsqlParameter("@givenName", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - if (gender is not null and not DBNull) - command.Parameters.AddWithValue("@gender", gender); - else - command.Parameters.Add(new NpgsqlParameter("@gender", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - - using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false)) - { - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new GetPatients( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? default(long) : reader.GetFieldValue<long>(1), - reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2), - reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3), - reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4), - reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5), - reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6), - reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7), - reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8), - reader.IsDBNull(9) ? null : reader.GetFieldValue<string>(9), - reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10), - reader.IsDBNull(11) ? 
null : reader.GetFieldValue<string>(11), - reader.IsDBNull(12) ? null : reader.GetFieldValue<string>(12), - reader.IsDBNull(13) ? null : reader.GetFieldValue<string>(13), - reader.IsDBNull(14) ? default(long) : reader.GetFieldValue<long>(14) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<GetPatients>, SqlError>.Ok<ImmutableList<GetPatients>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<GetPatients>, SqlError>.Error<ImmutableList<GetPatients>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'GetPatients' query. -/// </summary> -public record GetPatients -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'Active'.</summary> - public long Active { get; init; } - - /// <summary>Column 'GivenName'.</summary> - public string GivenName { get; init; } - - /// <summary>Column 'FamilyName'.</summary> - public string FamilyName { get; init; } - - /// <summary>Column 'BirthDate'.</summary> - public string BirthDate { get; init; } - - /// <summary>Column 'Gender'.</summary> - public string Gender { get; init; } - - /// <summary>Column 'Phone'.</summary> - public string Phone { get; init; } - - /// <summary>Column 'Email'.</summary> - public string Email { get; init; } - - /// <summary>Column 'AddressLine'.</summary> - public string AddressLine { get; init; } - - /// <summary>Column 'City'.</summary> - public string City { get; init; } - - /// <summary>Column 'State'.</summary> - public string State { get; init; } - - /// <summary>Column 'PostalCode'.</summary> - public string PostalCode { get; init; } - - /// <summary>Column 'Country'.</summary> - public string Country { get; init; } - - /// <summary>Column 'LastUpdated'.</summary> - public string LastUpdated { get; init; } - - /// <summary>Column 'VersionId'.</summary> - public long VersionId { get; init; } - - /// <summary>Initializes a new instance of 
GetPatients.</summary> - public GetPatients( - string Id, - long Active, - string GivenName, - string FamilyName, - string BirthDate, - string Gender, - string Phone, - string Email, - string AddressLine, - string City, - string State, - string PostalCode, - string Country, - string LastUpdated, - long VersionId - ) - { - this.Id = Id; - this.Active = Active; - this.GivenName = GivenName; - this.FamilyName = FamilyName; - this.BirthDate = BirthDate; - this.Gender = Gender; - this.Phone = Phone; - this.Email = Email; - this.AddressLine = AddressLine; - this.City = City; - this.State = State; - this.PostalCode = PostalCode; - this.Country = Country; - this.LastUpdated = LastUpdated; - this.VersionId = VersionId; - } -} diff --git a/Clinical/Clinical.Api/Generated/SearchPatients.g.cs b/Clinical/Clinical.Api/Generated/SearchPatients.g.cs deleted file mode 100644 index 017f3ed..0000000 --- a/Clinical/Clinical.Api/Generated/SearchPatients.g.cs +++ /dev/null @@ -1,157 +0,0 @@ -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'SearchPatients'. -/// </summary> -public static partial class SearchPatientsExtensions -{ - /// <summary> - /// Executes 'SearchPatients.sql' and maps results. 
- /// </summary>
- /// <param name="connection">Open NpgsqlConnection connection.</param>
- /// <param name="term">Query parameter.</param>
- /// <returns>Result of records or SQL error.</returns>
- public static async Task<Result<ImmutableList<SearchPatients>, SqlError>> SearchPatientsAsync(this NpgsqlConnection connection, object term)
- {
-     const string sql = @"SELECT fhir_Patient.Id, fhir_Patient.Active, fhir_Patient.GivenName, fhir_Patient.FamilyName, fhir_Patient.BirthDate, fhir_Patient.Gender, fhir_Patient.Phone, fhir_Patient.Email, fhir_Patient.AddressLine, fhir_Patient.City, fhir_Patient.State, fhir_Patient.PostalCode, fhir_Patient.Country, fhir_Patient.LastUpdated, fhir_Patient.VersionId FROM fhir_Patient WHERE fhir_Patient.GivenName LIKE @term OR fhir_Patient.FamilyName LIKE @term OR fhir_Patient.Email LIKE @term ORDER BY fhir_Patient.FamilyName , fhir_Patient.GivenName ";
-
-     try
-     {
-         var results = ImmutableList.CreateBuilder<SearchPatients>();
-
-         using (var command = new NpgsqlCommand(sql, connection))
-         {
-             if (term is not null and not DBNull)
-                 command.Parameters.AddWithValue("@term", term);
-             else
-                 command.Parameters.Add(new NpgsqlParameter("@term", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value });
-
-             using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false))
-             {
-                 while (await reader.ReadAsync().ConfigureAwait(false))
-                 {
-                     var item = new SearchPatients(
-                         reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0),
-                         reader.IsDBNull(1) ? default(long) : reader.GetFieldValue<long>(1),
-                         reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2),
-                         reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3),
-                         reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4),
-                         reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5),
-                         reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6),
-                         reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7),
-                         reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8),
-                         reader.IsDBNull(9) ? null : reader.GetFieldValue<string>(9),
-                         reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10),
-                         reader.IsDBNull(11) ? null : reader.GetFieldValue<string>(11),
-                         reader.IsDBNull(12) ? null : reader.GetFieldValue<string>(12),
-                         reader.IsDBNull(13) ? null : reader.GetFieldValue<string>(13),
-                         reader.IsDBNull(14) ? default(long) : reader.GetFieldValue<long>(14)
-                     );
-                     results.Add(item);
-                 }
-             }
-         }
-
-         return new Result<ImmutableList<SearchPatients>, SqlError>.Ok<ImmutableList<SearchPatients>, SqlError>(results.ToImmutable());
-     }
-     catch (Exception ex)
-     {
-         return new Result<ImmutableList<SearchPatients>, SqlError>.Error<ImmutableList<SearchPatients>, SqlError>(new SqlError("Database error", ex));
-     }
- }
-}
-
-/// <summary>
-/// Result row for 'SearchPatients' query.
-/// </summary>
-public record SearchPatients
-{
-    /// <summary>Column 'Id'.</summary>
-    public string Id { get; init; }
-
-    /// <summary>Column 'Active'.</summary>
-    public long Active { get; init; }
-
-    /// <summary>Column 'GivenName'.</summary>
-    public string GivenName { get; init; }
-
-    /// <summary>Column 'FamilyName'.</summary>
-    public string FamilyName { get; init; }
-
-    /// <summary>Column 'BirthDate'.</summary>
-    public string BirthDate { get; init; }
-
-    /// <summary>Column 'Gender'.</summary>
-    public string Gender { get; init; }
-
-    /// <summary>Column 'Phone'.</summary>
-    public string Phone { get; init; }
-
-    /// <summary>Column 'Email'.</summary>
-    public string Email { get; init; }
-
-    /// <summary>Column 'AddressLine'.</summary>
-    public string AddressLine { get; init; }
-
-    /// <summary>Column 'City'.</summary>
-    public string City { get; init; }
-
-    /// <summary>Column 'State'.</summary>
-    public string State { get; init; }
-
-    /// <summary>Column 'PostalCode'.</summary>
-    public string PostalCode { get; init; }
-
-    /// <summary>Column 'Country'.</summary>
-    public string Country { get; init; }
-
-    /// <summary>Column 'LastUpdated'.</summary>
-    public string LastUpdated { get; init; }
-
-    /// <summary>Column 'VersionId'.</summary>
-    public long VersionId { get; init; }
-
-    /// <summary>Initializes a new instance of SearchPatients.</summary>
-    public SearchPatients(
-        string Id,
-        long Active,
-        string GivenName,
-        string FamilyName,
-        string BirthDate,
-        string Gender,
-        string Phone,
-        string Email,
-        string AddressLine,
-        string City,
-        string State,
-        string PostalCode,
-        string Country,
-        string LastUpdated,
-        long VersionId
-    )
-    {
-        this.Id = Id;
-        this.Active = Active;
-        this.GivenName = GivenName;
-        this.FamilyName = FamilyName;
-        this.BirthDate = BirthDate;
-        this.Gender = Gender;
-        this.Phone = Phone;
-        this.Email = Email;
-        this.AddressLine = AddressLine;
-        this.City = City;
-        this.State = State;
-        this.PostalCode = PostalCode;
-        this.Country = Country;
-        this.LastUpdated = LastUpdated;
-        this.VersionId = VersionId;
-    }
-}
diff --git a/Clinical/Clinical.Api/Generated/fhir_ConditionOperations.g.cs b/Clinical/Clinical.Api/Generated/fhir_ConditionOperations.g.cs
deleted file mode 100644
index 6ab81d6..0000000
--- a/Clinical/Clinical.Api/Generated/fhir_ConditionOperations.g.cs
+++ /dev/null
@@ -1,62 +0,0 @@
-#nullable enable
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Data;
-using System.Globalization;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated
-{
-    /// <summary>
-    /// Extension methods for table operations on fhir_Condition
-    /// </summary>
-    public static partial class fhir_ConditionExtensions
-    {
-
-        /// <summary>
-        /// Inserts a new row into the fhir_Condition table.
-        /// </summary>
-        public static async Task<Result<int, SqlError>> Insertfhir_ConditionAsync(this IDbTransaction transaction, string? id, string? clinicalstatus, string? verificationstatus, string? category, string? severity, string? codesystem, string? codevalue, string? codedisplay, string? subjectreference, string? encounterreference, string? onsetdatetime, string? recordeddate, string? recorderreference, string? notetext, string? lastupdated, long? versionid)
-        {
-            const string sql = "INSERT INTO fhir_Condition (Id, ClinicalStatus, VerificationStatus, Category, Severity, CodeSystem, CodeValue, CodeDisplay, SubjectReference, EncounterReference, OnsetDateTime, RecordedDate, RecorderReference, NoteText, LastUpdated, VersionId) VALUES (@Id, @ClinicalStatus, @VerificationStatus, @Category, @Severity, @CodeSystem, @CodeValue, @CodeDisplay, @SubjectReference, @EncounterReference, @OnsetDateTime, @RecordedDate, @RecorderReference, @NoteText, @LastUpdated, @VersionId)";
-
-            if (transaction.Connection is null)
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Transaction has no connection"));
-
-            try
-            {
-                using (var command = new NpgsqlCommand(sql, (NpgsqlConnection)transaction.Connection!, (NpgsqlTransaction)transaction))
-                {
-                    command.Parameters.AddWithValue("@Id", id ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@ClinicalStatus", clinicalstatus ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@VerificationStatus", verificationstatus ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Category", category ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Severity", severity ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@CodeSystem", codesystem ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@CodeValue", codevalue ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@CodeDisplay", codedisplay ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@SubjectReference", subjectreference ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@EncounterReference", encounterreference ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@OnsetDateTime", onsetdatetime ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@RecordedDate", recordeddate ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@RecorderReference", recorderreference ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@NoteText", notetext ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@LastUpdated", lastupdated ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@VersionId", versionid ?? (object)DBNull.Value);
-
-                    var rowsAffected = await command.ExecuteNonQueryAsync().ConfigureAwait(false);
-                    return new Result<int, SqlError>.Ok<int, SqlError>(rowsAffected);
-                }
-            }
-            catch (Exception ex)
-            {
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Insert failed", ex));
-            }
-        }
-
-    }
-}
diff --git a/Clinical/Clinical.Api/Generated/fhir_EncounterOperations.g.cs b/Clinical/Clinical.Api/Generated/fhir_EncounterOperations.g.cs
deleted file mode 100644
index b0bf4d7..0000000
--- a/Clinical/Clinical.Api/Generated/fhir_EncounterOperations.g.cs
+++ /dev/null
@@ -1,58 +0,0 @@
-#nullable enable
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Data;
-using System.Globalization;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated
-{
-    /// <summary>
-    /// Extension methods for table operations on fhir_Encounter
-    /// </summary>
-    public static partial class fhir_EncounterExtensions
-    {
-
-        /// <summary>
-        /// Inserts a new row into the fhir_Encounter table.
-        /// </summary>
-        public static async Task<Result<int, SqlError>> Insertfhir_EncounterAsync(this IDbTransaction transaction, string? id, string? status, string? @class, string? patientid, string? practitionerid, string? servicetype, string? reasoncode, string? periodstart, string? periodend, string? notes, string? lastupdated, long? versionid)
-        {
-            const string sql = "INSERT INTO fhir_Encounter (Id, Status, Class, PatientId, PractitionerId, ServiceType, ReasonCode, PeriodStart, PeriodEnd, Notes, LastUpdated, VersionId) VALUES (@Id, @Status, @Class, @PatientId, @PractitionerId, @ServiceType, @ReasonCode, @PeriodStart, @PeriodEnd, @Notes, @LastUpdated, @VersionId)";
-
-            if (transaction.Connection is null)
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Transaction has no connection"));
-
-            try
-            {
-                using (var command = new NpgsqlCommand(sql, (NpgsqlConnection)transaction.Connection!, (NpgsqlTransaction)transaction))
-                {
-                    command.Parameters.AddWithValue("@Id", id ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Status", status ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Class", @class ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@PatientId", patientid ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@PractitionerId", practitionerid ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@ServiceType", servicetype ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@ReasonCode", reasoncode ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@PeriodStart", periodstart ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@PeriodEnd", periodend ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Notes", notes ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@LastUpdated", lastupdated ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@VersionId", versionid ?? (object)DBNull.Value);
-
-                    var rowsAffected = await command.ExecuteNonQueryAsync().ConfigureAwait(false);
-                    return new Result<int, SqlError>.Ok<int, SqlError>(rowsAffected);
-                }
-            }
-            catch (Exception ex)
-            {
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Insert failed", ex));
-            }
-        }
-
-    }
-}
diff --git a/Clinical/Clinical.Api/Generated/fhir_MedicationRequestOperations.g.cs b/Clinical/Clinical.Api/Generated/fhir_MedicationRequestOperations.g.cs
deleted file mode 100644
index 9107a72..0000000
--- a/Clinical/Clinical.Api/Generated/fhir_MedicationRequestOperations.g.cs
+++ /dev/null
@@ -1,61 +0,0 @@
-#nullable enable
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Data;
-using System.Globalization;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated
-{
-    /// <summary>
-    /// Extension methods for table operations on fhir_MedicationRequest
-    /// </summary>
-    public static partial class fhir_MedicationRequestExtensions
-    {
-
-        /// <summary>
-        /// Inserts a new row into the fhir_MedicationRequest table.
-        /// </summary>
-        public static async Task<Result<int, SqlError>> Insertfhir_MedicationRequestAsync(this IDbTransaction transaction, string? id, string? status, string? intent, string? patientid, string? practitionerid, string? encounterid, string? medicationcode, string? medicationdisplay, string? dosageinstruction, double? quantity, string? unit, long? refills, string? authoredon, string? lastupdated, long? versionid)
-        {
-            const string sql = "INSERT INTO fhir_MedicationRequest (Id, Status, Intent, PatientId, PractitionerId, EncounterId, MedicationCode, MedicationDisplay, DosageInstruction, Quantity, Unit, Refills, AuthoredOn, LastUpdated, VersionId) VALUES (@Id, @Status, @Intent, @PatientId, @PractitionerId, @EncounterId, @MedicationCode, @MedicationDisplay, @DosageInstruction, @Quantity, @Unit, @Refills, @AuthoredOn, @LastUpdated, @VersionId)";
-
-            if (transaction.Connection is null)
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Transaction has no connection"));
-
-            try
-            {
-                using (var command = new NpgsqlCommand(sql, (NpgsqlConnection)transaction.Connection!, (NpgsqlTransaction)transaction))
-                {
-                    command.Parameters.AddWithValue("@Id", id ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Status", status ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Intent", intent ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@PatientId", patientid ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@PractitionerId", practitionerid ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@EncounterId", encounterid ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@MedicationCode", medicationcode ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@MedicationDisplay", medicationdisplay ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@DosageInstruction", dosageinstruction ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Quantity", quantity ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Unit", unit ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Refills", refills ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@AuthoredOn", authoredon ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@LastUpdated", lastupdated ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@VersionId", versionid ?? (object)DBNull.Value);
-
-                    var rowsAffected = await command.ExecuteNonQueryAsync().ConfigureAwait(false);
-                    return new Result<int, SqlError>.Ok<int, SqlError>(rowsAffected);
-                }
-            }
-            catch (Exception ex)
-            {
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Insert failed", ex));
-            }
-        }
-
-    }
-}
diff --git a/Clinical/Clinical.Api/Generated/fhir_PatientOperations.g.cs b/Clinical/Clinical.Api/Generated/fhir_PatientOperations.g.cs
deleted file mode 100644
index 7bd62a9..0000000
--- a/Clinical/Clinical.Api/Generated/fhir_PatientOperations.g.cs
+++ /dev/null
@@ -1,102 +0,0 @@
-#nullable enable
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Data;
-using System.Globalization;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated
-{
-    /// <summary>
-    /// Extension methods for table operations on fhir_Patient
-    /// </summary>
-    public static partial class fhir_PatientExtensions
-    {
-
-        /// <summary>
-        /// Inserts a new row into the fhir_Patient table.
-        /// </summary>
-        public static async Task<Result<int, SqlError>> Insertfhir_PatientAsync(this IDbTransaction transaction, string? id, long? active, string? givenname, string? familyname, string? birthdate, string? gender, string? phone, string? email, string? addressline, string? city, string? state, string? postalcode, string? country, string? lastupdated, long? versionid)
-        {
-            const string sql = "INSERT INTO fhir_Patient (Id, Active, GivenName, FamilyName, BirthDate, Gender, Phone, Email, AddressLine, City, State, PostalCode, Country, LastUpdated, VersionId) VALUES (@Id, @Active, @GivenName, @FamilyName, @BirthDate, @Gender, @Phone, @Email, @AddressLine, @City, @State, @PostalCode, @Country, @LastUpdated, @VersionId)";
-
-            if (transaction.Connection is null)
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Transaction has no connection"));
-
-            try
-            {
-                using (var command = new NpgsqlCommand(sql, (NpgsqlConnection)transaction.Connection!, (NpgsqlTransaction)transaction))
-                {
-                    command.Parameters.AddWithValue("@Id", id ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Active", active ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@GivenName", givenname ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@FamilyName", familyname ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@BirthDate", birthdate ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Gender", gender ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Phone", phone ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Email", email ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@AddressLine", addressline ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@City", city ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@State", state ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@PostalCode", postalcode ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Country", country ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@LastUpdated", lastupdated ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@VersionId", versionid ?? (object)DBNull.Value);
-
-                    var rowsAffected = await command.ExecuteNonQueryAsync().ConfigureAwait(false);
-                    return new Result<int, SqlError>.Ok<int, SqlError>(rowsAffected);
-                }
-            }
-            catch (Exception ex)
-            {
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Insert failed", ex));
-            }
-        }
-
-
-        /// <summary>
-        /// Updates a row in the fhir_Patient table.
-        /// </summary>
-        public static async Task<Result<int, SqlError>> Updatefhir_PatientAsync(this IDbTransaction transaction, string id, long? active, string givenname, string familyname, string birthdate, string gender, string phone, string email, string addressline, string city, string state, string postalcode, string country, string lastupdated, long? versionid)
-        {
-            const string sql = "UPDATE fhir_Patient SET Active = @Active, GivenName = @GivenName, FamilyName = @FamilyName, BirthDate = @BirthDate, Gender = @Gender, Phone = @Phone, Email = @Email, AddressLine = @AddressLine, City = @City, State = @State, PostalCode = @PostalCode, Country = @Country, LastUpdated = @LastUpdated, VersionId = @VersionId WHERE Id = @Id";
-
-            if (transaction.Connection is null)
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Transaction has no connection"));
-
-            try
-            {
-                using (var command = new NpgsqlCommand(sql, (NpgsqlConnection)transaction.Connection!, (NpgsqlTransaction)transaction))
-                {
-                    command.Parameters.AddWithValue("@Id", id ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Active", active ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@GivenName", givenname ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@FamilyName", familyname ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@BirthDate", birthdate ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Gender", gender ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Phone", phone ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Email", email ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@AddressLine", addressline ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@City", city ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@State", state ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@PostalCode", postalcode ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Country", country ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@LastUpdated", lastupdated ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@VersionId", versionid ?? (object)DBNull.Value);
-
-                    var rowsAffected = await command.ExecuteNonQueryAsync().ConfigureAwait(false);
-                    return new Result<int, SqlError>.Ok<int, SqlError>(rowsAffected);
-                }
-            }
-            catch (Exception ex)
-            {
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Update failed", ex));
-            }
-        }
-
-    }
-}
diff --git a/Clinical/Clinical.Api/GlobalUsings.cs b/Clinical/Clinical.Api/GlobalUsings.cs
index 3f63934..6d7738a 100644
--- a/Clinical/Clinical.Api/GlobalUsings.cs
+++ b/Clinical/Clinical.Api/GlobalUsings.cs
@@ -1,95 +1,135 @@
 global using System;
 global using Generated;
 global using Microsoft.Extensions.Logging;
+global using Nimblesite.Sql.Model;
+global using Nimblesite.Sync.Core;
+global using Nimblesite.Sync.Postgres;
 global using Npgsql;
 global using Outcome;
-global using Sync;
-global using Sync.Postgres;
 global using GetConditionsError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetConditionsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.GetConditionsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 
 // GetConditionsByPatient query result type aliases
 global using GetConditionsOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetConditionsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Ok<
     System.Collections.Immutable.ImmutableList<Generated.GetConditionsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 global using GetEncountersError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetEncountersByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.GetEncountersByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 
 // GetEncountersByPatient query result type aliases
 global using GetEncountersOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetEncountersByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Ok<
     System.Collections.Immutable.ImmutableList<Generated.GetEncountersByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 global using GetMedicationsError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetMedicationsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.GetMedicationsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 
 // GetMedicationsByPatient query result type aliases
 global using GetMedicationsOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetMedicationsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Ok<
     System.Collections.Immutable.ImmutableList<Generated.GetMedicationsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 global using GetPatientByIdError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetPatientById>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.GetPatientById>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.GetPatientById>,
+    Nimblesite.Sql.Model.SqlError
+>;
 
 // GetPatientById query result type aliases
 global using GetPatientByIdOk = Outcome.Result<
    System.Collections.Immutable.ImmutableList<Generated.GetPatientById>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetPatientById>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetPatientById>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetPatientsError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetPatients>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.GetPatients>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.GetPatients>,
+    Nimblesite.Sql.Model.SqlError
+>;
 
 // GetPatients query result type aliases
 global using GetPatientsOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetPatients>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetPatients>, Selecta.SqlError>;
-global using InsertError = Outcome.Result<int, Selecta.SqlError>.Error<int, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetPatients>,
+    Nimblesite.Sql.Model.SqlError
+>;
+global using InsertError = Outcome.Result<System.Guid?, Nimblesite.Sql.Model.SqlError>.Error<
+    System.Guid?,
+    Nimblesite.Sql.Model.SqlError
+>;
 
 // Insert result type aliases
-global using InsertOk = Outcome.Result<int, Selecta.SqlError>.Ok<int, Selecta.SqlError>;
+global using InsertOk = Outcome.Result<System.Guid?, Nimblesite.Sql.Model.SqlError>.Ok<
+    System.Guid?,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using SearchPatientsError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.SearchPatients>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.SearchPatients>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.SearchPatients>,
+    Nimblesite.Sql.Model.SqlError
+>;
 
 // SearchPatients query result type aliases
 global using SearchPatientsOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.SearchPatients>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.SearchPatients>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.SearchPatients>,
+    Nimblesite.Sql.Model.SqlError
+>;
 
 // Sync result type aliases
-global using StringSyncError = Outcome.Result<string, Sync.SyncError>.Error<string, Sync.SyncError>;
-global using StringSyncOk = Outcome.Result<string, Sync.SyncError>.Ok<string, Sync.SyncError>;
+global using StringSyncError = Outcome.Result<string, Nimblesite.Sync.Core.SyncError>.Error<
+    string,
+    Nimblesite.Sync.Core.SyncError
+>;
+global using StringSyncOk = Outcome.Result<string, Nimblesite.Sync.Core.SyncError>.Ok<
+    string,
+    Nimblesite.Sync.Core.SyncError
+>;
 global using SyncLogListError = Outcome.Result<
-    System.Collections.Generic.IReadOnlyList<Sync.SyncLogEntry>,
-    Sync.SyncError
->.Error<System.Collections.Generic.IReadOnlyList<Sync.SyncLogEntry>, Sync.SyncError>;
+    System.Collections.Generic.IReadOnlyList<Nimblesite.Sync.Core.SyncLogEntry>,
+    Nimblesite.Sync.Core.SyncError
+>.Error<
+    System.Collections.Generic.IReadOnlyList<Nimblesite.Sync.Core.SyncLogEntry>,
+    Nimblesite.Sync.Core.SyncError
+>;
 global using SyncLogListOk = Outcome.Result<
-    System.Collections.Generic.IReadOnlyList<Sync.SyncLogEntry>,
-    Sync.SyncError
->.Ok<System.Collections.Generic.IReadOnlyList<Sync.SyncLogEntry>, Sync.SyncError>;
+    System.Collections.Generic.IReadOnlyList<Nimblesite.Sync.Core.SyncLogEntry>,
+    Nimblesite.Sync.Core.SyncError
+>.Ok<
+    System.Collections.Generic.IReadOnlyList<Nimblesite.Sync.Core.SyncLogEntry>,
+    Nimblesite.Sync.Core.SyncError
+>;
 
 // Update result type aliases
-global using UpdateOk = Outcome.Result<int, Selecta.SqlError>.Ok<int, Selecta.SqlError>;
+global using UpdateOk = Outcome.Result<int, Nimblesite.Sql.Model.SqlError>.Ok<
+    int,
+    Nimblesite.Sql.Model.SqlError
+>;
diff --git a/Clinical/Clinical.Api/Program.cs b/Clinical/Clinical.Api/Program.cs
index db9bb2a..a6d7bdc 100644
--- a/Clinical/Clinical.Api/Program.cs
+++ b/Clinical/Clinical.Api/Program.cs
@@ -94,20 +94,22 @@ Func<NpgsqlConnection> getConn
 ) =>
 {
     using var conn = getConn();
+    // Active and gender filters are applied in C# because LQL `is null`
+    // checks produce non-nullable params with no "match-all" sentinel.
+    // String LIKE filters survive empty-string params via LIKE '%%'.
     var result = await conn.GetPatientsAsync(
-            active.HasValue
-                ? active.Value
-                    ? 1
-                    : 0
-                : DBNull.Value,
-            familyName ?? (object)DBNull.Value,
-            givenName ?? (object)DBNull.Value,
-            gender ?? (object)DBNull.Value
+            familyName ?? string.Empty,
+            givenName ?? string.Empty
         )
         .ConfigureAwait(false);
     return result switch
     {
-        GetPatientsOk(var patients) => Results.Ok(patients),
+        GetPatientsOk(var patients) => Results.Ok(
+            patients
+                .Where(p => !active.HasValue || p.Active == (active.Value ? 1 : 0))
+                .Where(p => string.IsNullOrEmpty(gender) || p.Gender == gender)
+                .ToImmutableList()
+        ),
         GetPatientsError(var err) => Results.Problem(err.Message),
     };
 }
@@ -161,7 +163,7 @@ Func<NpgsqlConnection> getConn
 );
 
 var result = await transaction
-    .Insertfhir_PatientAsync(
+    .Insertfhir_patientAsync(
         id,
         request.Active ? 1 : 0,
         request.GivenName,
@@ -251,7 +253,7 @@ Func<NpgsqlConnection> getConn
 );
 
 var result = await transaction
-    .Updatefhir_PatientAsync(
+    .Updatefhir_patientAsync(
         id,
         request.Active ? 1 : 0,
         request.GivenName,
@@ -374,7 +376,7 @@ Func<NpgsqlConnection> getConn
 );
 
 var result = await transaction
-    .Insertfhir_EncounterAsync(
+    .Insertfhir_encounterAsync(
         id,
         request.Status,
         request.Class,
@@ -470,23 +472,23 @@ Func<NpgsqlConnection> getConn
 var recordedDate = DateTime.UtcNow.ToString("yyyy-MM-dd", CultureInfo.InvariantCulture);
 
 var result = await transaction
-    .Insertfhir_ConditionAsync(
-        id: id,
-        clinicalstatus: request.ClinicalStatus,
-        verificationstatus: request.VerificationStatus,
-        category: request.Category,
-        severity: request.Severity,
-        codesystem: request.CodeSystem,
-        codevalue: request.CodeValue,
-        codedisplay: request.CodeDisplay,
-        subjectreference: patientId,
-        encounterreference: request.EncounterReference,
-        onsetdatetime: request.OnsetDateTime,
-        recordeddate: recordedDate,
-        recorderreference: request.RecorderReference,
-        notetext: request.NoteText,
-        lastupdated: now,
-        versionid: 1
+    .Insertfhir_conditionAsync(
+        id,
+        request.ClinicalStatus,
+        request.VerificationStatus,
+        request.Category,
+        request.Severity,
+        request.CodeSystem,
+        request.CodeValue,
+        request.CodeDisplay,
+        patientId,
+        request.EncounterReference,
+        request.OnsetDateTime,
+        recordedDate,
+        request.RecorderReference,
+        request.NoteText,
+        now,
+        1
     )
     .ConfigureAwait(false);
@@ -579,7 +581,7 @@ Func<NpgsqlConnection> getConn
 );
 
 var result = await transaction
-    .Insertfhir_MedicationRequestAsync(
+    .Insertfhir_medicationrequestAsync(
         id,
         request.Status,
         request.Intent,
@@ -788,7 +790,7 @@ Func<NpgsqlConnection> getConn
 using var conn = getConn();
 using var cmd = conn.CreateCommand();
 cmd.CommandText =
-    "SELECT ProviderId, FirstName, LastName, Specialty, SyncedAt FROM sync_Provider";
+    "SELECT \"ProviderId\", \"FirstName\", \"LastName\", \"Specialty\", \"SyncedAt\" FROM sync_provider";
 using var reader = cmd.ExecuteReader();
 var providers = new List<object>();
 while (reader.Read())
diff --git a/Clinical/Clinical.Api/Queries/GetConditionsByPatient.generated.sql b/Clinical/Clinical.Api/Queries/GetConditionsByPatient.generated.sql
deleted file mode 100644
index 703b770..0000000
--- a/Clinical/Clinical.Api/Queries/GetConditionsByPatient.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Condition.Id, fhir_Condition.ClinicalStatus, fhir_Condition.VerificationStatus, fhir_Condition.Category, fhir_Condition.Severity, fhir_Condition.CodeSystem, fhir_Condition.CodeValue, fhir_Condition.CodeDisplay, fhir_Condition.SubjectReference, fhir_Condition.EncounterReference, fhir_Condition.OnsetDateTime, fhir_Condition.RecordedDate, fhir_Condition.RecorderReference, fhir_Condition.NoteText, fhir_Condition.LastUpdated, fhir_Condition.VersionId FROM fhir_Condition WHERE fhir_Condition.SubjectReference = @patientId ORDER BY fhir_Condition.RecordedDate DESC
\ No newline at end of file
diff --git a/Clinical/Clinical.Api/Queries/GetConditionsByPatient.lql b/Clinical/Clinical.Api/Queries/GetConditionsByPatient.lql
index 6b88fa6..970a63b 100644
--- a/Clinical/Clinical.Api/Queries/GetConditionsByPatient.lql
+++ b/Clinical/Clinical.Api/Queries/GetConditionsByPatient.lql
@@ -1,6 +1,6 @@
 -- Get conditions for a patient
 -- Parameters: @patientId
-fhir_Condition
-|> filter(fn(row) => row.fhir_Condition.SubjectReference = @patientId)
-|> select(fhir_Condition.Id, fhir_Condition.ClinicalStatus, fhir_Condition.VerificationStatus, fhir_Condition.Category, fhir_Condition.Severity, fhir_Condition.CodeSystem, fhir_Condition.CodeValue, fhir_Condition.CodeDisplay, fhir_Condition.SubjectReference, fhir_Condition.EncounterReference, fhir_Condition.OnsetDateTime, fhir_Condition.RecordedDate, fhir_Condition.RecorderReference, fhir_Condition.NoteText, fhir_Condition.LastUpdated, fhir_Condition.VersionId)
-|> order_by(fhir_Condition.RecordedDate desc)
+fhir_condition
+|> filter(fn(row) => row.fhir_condition.SubjectReference = @patientId)
+|> select(fhir_condition.Id, fhir_condition.ClinicalStatus, fhir_condition.VerificationStatus, fhir_condition.Category, fhir_condition.Severity, fhir_condition.CodeSystem, fhir_condition.CodeValue, fhir_condition.CodeDisplay, fhir_condition.SubjectReference, fhir_condition.EncounterReference, fhir_condition.OnsetDateTime, fhir_condition.RecordedDate, fhir_condition.RecorderReference, fhir_condition.NoteText, fhir_condition.LastUpdated, fhir_condition.VersionId)
+|> order_by(fhir_condition.RecordedDate desc)
diff --git a/Clinical/Clinical.Api/Queries/GetEncountersByPatient.generated.sql b/Clinical/Clinical.Api/Queries/GetEncountersByPatient.generated.sql
deleted file mode 100644
index 6b4fd8d..0000000
--- a/Clinical/Clinical.Api/Queries/GetEncountersByPatient.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Encounter.Id, fhir_Encounter.Status, fhir_Encounter.Class, fhir_Encounter.PatientId, fhir_Encounter.PractitionerId, fhir_Encounter.ServiceType, fhir_Encounter.ReasonCode, fhir_Encounter.PeriodStart, fhir_Encounter.PeriodEnd, fhir_Encounter.Notes, fhir_Encounter.LastUpdated, fhir_Encounter.VersionId FROM fhir_Encounter WHERE fhir_Encounter.PatientId = @patientId ORDER BY fhir_Encounter.PeriodStart DESC
\ No newline at end of file
diff --git a/Clinical/Clinical.Api/Queries/GetEncountersByPatient.lql b/Clinical/Clinical.Api/Queries/GetEncountersByPatient.lql
index 2f6530f..e01f806 100644
--- a/Clinical/Clinical.Api/Queries/GetEncountersByPatient.lql
+++ b/Clinical/Clinical.Api/Queries/GetEncountersByPatient.lql
@@ -1,6 +1,6 @@
 -- Get encounters for a patient
 -- Parameters: @patientId
-fhir_Encounter
-|> filter(fn(row) => row.fhir_Encounter.PatientId = @patientId)
-|> select(fhir_Encounter.Id, fhir_Encounter.Status, fhir_Encounter.Class, fhir_Encounter.PatientId, fhir_Encounter.PractitionerId, fhir_Encounter.ServiceType, fhir_Encounter.ReasonCode, fhir_Encounter.PeriodStart, fhir_Encounter.PeriodEnd, fhir_Encounter.Notes, fhir_Encounter.LastUpdated, fhir_Encounter.VersionId)
-|> order_by(fhir_Encounter.PeriodStart desc)
+fhir_encounter
+|> filter(fn(row) => row.fhir_encounter.PatientId = @patientId)
+|> select(fhir_encounter.Id, fhir_encounter.Status, fhir_encounter.Class, fhir_encounter.PatientId, fhir_encounter.PractitionerId, fhir_encounter.ServiceType, fhir_encounter.ReasonCode, fhir_encounter.PeriodStart, fhir_encounter.PeriodEnd, fhir_encounter.Notes, fhir_encounter.LastUpdated, fhir_encounter.VersionId)
+|> order_by(fhir_encounter.PeriodStart desc)
diff --git a/Clinical/Clinical.Api/Queries/GetMedicationsByPatient.generated.sql b/Clinical/Clinical.Api/Queries/GetMedicationsByPatient.generated.sql
deleted file mode 100644
index 2c77367..0000000
--- a/Clinical/Clinical.Api/Queries/GetMedicationsByPatient.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_MedicationRequest.Id, fhir_MedicationRequest.Status, fhir_MedicationRequest.Intent, fhir_MedicationRequest.PatientId, fhir_MedicationRequest.PractitionerId, fhir_MedicationRequest.EncounterId, fhir_MedicationRequest.MedicationCode, fhir_MedicationRequest.MedicationDisplay, fhir_MedicationRequest.DosageInstruction, fhir_MedicationRequest.Quantity, fhir_MedicationRequest.Unit, fhir_MedicationRequest.Refills, fhir_MedicationRequest.AuthoredOn, fhir_MedicationRequest.LastUpdated, fhir_MedicationRequest.VersionId FROM fhir_MedicationRequest WHERE fhir_MedicationRequest.PatientId = @patientId ORDER BY fhir_MedicationRequest.AuthoredOn DESC
\ No newline at end of file
diff --git a/Clinical/Clinical.Api/Queries/GetMedicationsByPatient.lql b/Clinical/Clinical.Api/Queries/GetMedicationsByPatient.lql
index b7e53d3..9faf86f 100644
--- a/Clinical/Clinical.Api/Queries/GetMedicationsByPatient.lql
+++ b/Clinical/Clinical.Api/Queries/GetMedicationsByPatient.lql
@@ -1,6 +1,6 @@
 -- Get medication requests for a patient
 -- Parameters: @patientId
-fhir_MedicationRequest
-|> filter(fn(row) => row.fhir_MedicationRequest.PatientId = @patientId)
-|> select(fhir_MedicationRequest.Id, fhir_MedicationRequest.Status, fhir_MedicationRequest.Intent, fhir_MedicationRequest.PatientId, fhir_MedicationRequest.PractitionerId, fhir_MedicationRequest.EncounterId, fhir_MedicationRequest.MedicationCode, fhir_MedicationRequest.MedicationDisplay, fhir_MedicationRequest.DosageInstruction, fhir_MedicationRequest.Quantity, fhir_MedicationRequest.Unit, fhir_MedicationRequest.Refills, fhir_MedicationRequest.AuthoredOn, fhir_MedicationRequest.LastUpdated, fhir_MedicationRequest.VersionId)
-|> order_by(fhir_MedicationRequest.AuthoredOn desc)
+fhir_medicationrequest
+|> filter(fn(row) => row.fhir_medicationrequest.PatientId = @patientId)
+|> select(fhir_medicationrequest.Id, fhir_medicationrequest.Status, fhir_medicationrequest.Intent, fhir_medicationrequest.PatientId, fhir_medicationrequest.PractitionerId, fhir_medicationrequest.EncounterId, fhir_medicationrequest.MedicationCode, fhir_medicationrequest.MedicationDisplay, fhir_medicationrequest.DosageInstruction, fhir_medicationrequest.Quantity, fhir_medicationrequest.Unit, fhir_medicationrequest.Refills, fhir_medicationrequest.AuthoredOn, fhir_medicationrequest.LastUpdated, fhir_medicationrequest.VersionId)
+|> order_by(fhir_medicationrequest.AuthoredOn desc)
diff --git a/Clinical/Clinical.Api/Queries/GetPatientById.generated.sql b/Clinical/Clinical.Api/Queries/GetPatientById.generated.sql
deleted file mode 100644
index 23f0fd0..0000000
--- a/Clinical/Clinical.Api/Queries/GetPatientById.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Patient.Id, fhir_Patient.Active, fhir_Patient.GivenName, fhir_Patient.FamilyName, fhir_Patient.BirthDate, fhir_Patient.Gender, fhir_Patient.Phone, fhir_Patient.Email, fhir_Patient.AddressLine, fhir_Patient.City, fhir_Patient.State, fhir_Patient.PostalCode, fhir_Patient.Country, fhir_Patient.LastUpdated, fhir_Patient.VersionId FROM fhir_Patient WHERE fhir_Patient.Id = @id
\ No newline at end of file
diff --git a/Clinical/Clinical.Api/Queries/GetPatientById.lql b/Clinical/Clinical.Api/Queries/GetPatientById.lql
index 250e0ee..5299198 100644
--- a/Clinical/Clinical.Api/Queries/GetPatientById.lql
+++ b/Clinical/Clinical.Api/Queries/GetPatientById.lql
@@ -1,5 +1,5 @@
 -- Get patient by ID
 -- Parameters: @id
-fhir_Patient
-|> filter(fn(row) => row.fhir_Patient.Id = @id)
-|> select(fhir_Patient.Id, fhir_Patient.Active, fhir_Patient.GivenName, fhir_Patient.FamilyName, fhir_Patient.BirthDate, fhir_Patient.Gender, fhir_Patient.Phone, fhir_Patient.Email, fhir_Patient.AddressLine, fhir_Patient.City, fhir_Patient.State, fhir_Patient.PostalCode, fhir_Patient.Country, fhir_Patient.LastUpdated, fhir_Patient.VersionId)
+fhir_patient
+|> filter(fn(row) => row.fhir_patient.Id = @id)
+|> select(fhir_patient.Id, fhir_patient.Active, fhir_patient.GivenName, fhir_patient.FamilyName, fhir_patient.BirthDate, fhir_patient.Gender, fhir_patient.Phone, fhir_patient.Email, fhir_patient.AddressLine, fhir_patient.City, fhir_patient.State, fhir_patient.PostalCode, fhir_patient.Country, fhir_patient.LastUpdated, fhir_patient.VersionId)
diff --git a/Clinical/Clinical.Api/Queries/GetPatients.generated.sql b/Clinical/Clinical.Api/Queries/GetPatients.generated.sql
deleted file mode 100644
index a4ffa5b..0000000
--- a/Clinical/Clinical.Api/Queries/GetPatients.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Patient.Id, fhir_Patient.Active, fhir_Patient.GivenName, fhir_Patient.FamilyName, fhir_Patient.BirthDate, fhir_Patient.Gender, fhir_Patient.Phone, fhir_Patient.Email, fhir_Patient.AddressLine, fhir_Patient.City, fhir_Patient.State, fhir_Patient.PostalCode, fhir_Patient.Country, fhir_Patient.LastUpdated, fhir_Patient.VersionId FROM fhir_Patient WHERE (@active IS NULL OR fhir_Patient.Active = @active) AND (@familyName IS NULL OR fhir_Patient.FamilyName LIKE '%' || @familyName || '%') AND (@givenName IS NULL OR fhir_Patient.GivenName LIKE '%' || @givenName || '%') AND (@gender IS NULL OR fhir_Patient.Gender = @gender) ORDER BY fhir_Patient.FamilyName ,
fhir_Patient.GivenName \ No newline at end of file diff --git a/Clinical/Clinical.Api/Queries/GetPatients.lql b/Clinical/Clinical.Api/Queries/GetPatients.lql index 6d47e4c..a566633 100644 --- a/Clinical/Clinical.Api/Queries/GetPatients.lql +++ b/Clinical/Clinical.Api/Queries/GetPatients.lql @@ -1,6 +1,7 @@ -- Get patients with optional FHIR search parameters --- Parameters: @active, @familyName, @givenName, @gender -fhir_Patient -|> filter(fn(p) => (@active is null or p.fhir_Patient.Active = @active) and (@familyName is null or p.fhir_Patient.FamilyName like '%' || @familyName || '%') and (@givenName is null or p.fhir_Patient.GivenName like '%' || @givenName || '%') and (@gender is null or p.fhir_Patient.Gender = @gender)) -|> select(fhir_Patient.Id, fhir_Patient.Active, fhir_Patient.GivenName, fhir_Patient.FamilyName, fhir_Patient.BirthDate, fhir_Patient.Gender, fhir_Patient.Phone, fhir_Patient.Email, fhir_Patient.AddressLine, fhir_Patient.City, fhir_Patient.State, fhir_Patient.PostalCode, fhir_Patient.Country, fhir_Patient.LastUpdated, fhir_Patient.VersionId) -|> order_by(fhir_Patient.FamilyName, fhir_Patient.GivenName) +-- Parameters: @familyName, @givenName (active and gender filters applied in C#). +-- Empty string passed for @familyName / @givenName matches all rows via LIKE '%%'. 
+fhir_patient
+|> filter(fn(p) => (@familyName is null or p.fhir_patient.FamilyName like '%' || @familyName || '%') and (@givenName is null or p.fhir_patient.GivenName like '%' || @givenName || '%'))
+|> select(fhir_patient.Id, fhir_patient.Active, fhir_patient.GivenName, fhir_patient.FamilyName, fhir_patient.BirthDate, fhir_patient.Gender, fhir_patient.Phone, fhir_patient.Email, fhir_patient.AddressLine, fhir_patient.City, fhir_patient.State, fhir_patient.PostalCode, fhir_patient.Country, fhir_patient.LastUpdated, fhir_patient.VersionId)
+|> order_by(fhir_patient.FamilyName, fhir_patient.GivenName)
diff --git a/Clinical/Clinical.Api/Queries/SearchPatients.generated.sql b/Clinical/Clinical.Api/Queries/SearchPatients.generated.sql
deleted file mode 100644
index 2538861..0000000
--- a/Clinical/Clinical.Api/Queries/SearchPatients.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Patient.Id, fhir_Patient.Active, fhir_Patient.GivenName, fhir_Patient.FamilyName, fhir_Patient.BirthDate, fhir_Patient.Gender, fhir_Patient.Phone, fhir_Patient.Email, fhir_Patient.AddressLine, fhir_Patient.City, fhir_Patient.State, fhir_Patient.PostalCode, fhir_Patient.Country, fhir_Patient.LastUpdated, fhir_Patient.VersionId FROM fhir_Patient WHERE fhir_Patient.GivenName LIKE @term OR fhir_Patient.FamilyName LIKE @term OR fhir_Patient.Email LIKE @term ORDER BY fhir_Patient.FamilyName , fhir_Patient.GivenName
\ No newline at end of file
diff --git a/Clinical/Clinical.Api/Queries/SearchPatients.lql b/Clinical/Clinical.Api/Queries/SearchPatients.lql
index 8a256b1..dd52235 100644
--- a/Clinical/Clinical.Api/Queries/SearchPatients.lql
+++ b/Clinical/Clinical.Api/Queries/SearchPatients.lql
@@ -1,6 +1,6 @@
 -- Search patients by name or email
 -- Parameters: @term
-fhir_Patient
-|> filter(fn(row) => row.fhir_Patient.GivenName like @term or row.fhir_Patient.FamilyName like @term or row.fhir_Patient.Email like @term)
-|> select(fhir_Patient.Id, fhir_Patient.Active, fhir_Patient.GivenName, fhir_Patient.FamilyName, fhir_Patient.BirthDate, fhir_Patient.Gender, fhir_Patient.Phone, fhir_Patient.Email, fhir_Patient.AddressLine, fhir_Patient.City, fhir_Patient.State, fhir_Patient.PostalCode, fhir_Patient.Country, fhir_Patient.LastUpdated, fhir_Patient.VersionId)
-|> order_by(fhir_Patient.FamilyName, fhir_Patient.GivenName)
+fhir_patient
+|> filter(fn(row) => row.fhir_patient.GivenName like @term or row.fhir_patient.FamilyName like @term or row.fhir_patient.Email like @term)
+|> select(fhir_patient.Id, fhir_patient.Active, fhir_patient.GivenName, fhir_patient.FamilyName, fhir_patient.BirthDate, fhir_patient.Gender, fhir_patient.Phone, fhir_patient.Email, fhir_patient.AddressLine, fhir_patient.City, fhir_patient.State, fhir_patient.PostalCode, fhir_patient.Country, fhir_patient.LastUpdated, fhir_patient.VersionId)
+|> order_by(fhir_patient.FamilyName, fhir_patient.GivenName)
diff --git a/Clinical/Clinical.Api/clinical-schema.yaml b/Clinical/Clinical.Api/clinical-schema.yaml
index 379c4ee..44e8e10 100644
--- a/Clinical/Clinical.Api/clinical-schema.yaml
+++ b/Clinical/Clinical.Api/clinical-schema.yaml
@@ -1,6 +1,6 @@
 name: clinical
 tables:
-- name: fhir_Patient
+- name: fhir_patient
   columns:
   - name: Id
     type: Text
@@ -44,10 +44,10 @@ tables:
     columns:
     - GivenName
   primaryKey:
-    name: PK_fhir_Patient
+    name: PK_fhir_patient
     columns:
     - Id
-- name: fhir_Encounter
+- name: fhir_encounter
   columns:
   - name: Id
     type: Text
@@ -82,17 +82,17 @@ tables:
     columns:
     - PatientId
   foreignKeys:
-  - name: FK_fhir_Encounter_PatientId
+  - name: FK_fhir_encounter_PatientId
     columns:
     - PatientId
-    referencedTable: fhir_Patient
+    referencedTable: fhir_patient
     referencedColumns:
     - Id
   primaryKey:
-    name: PK_fhir_Encounter
+    name: PK_fhir_encounter
     columns:
     - Id
-- name: fhir_Condition
+- name: fhir_condition
   columns:
   - name: Id
     type: Text
@@ -139,17 +139,17 @@ tables:
     columns:
     - SubjectReference
   foreignKeys:
-  - name: FK_fhir_Condition_SubjectReference
+  - name: FK_fhir_condition_SubjectReference
     columns:
     - SubjectReference
-    referencedTable: fhir_Patient
+    referencedTable: fhir_patient
     referencedColumns:
     - Id
   primaryKey:
-    name: PK_fhir_Condition
+    name: PK_fhir_condition
     columns:
     - Id
-- name: fhir_MedicationRequest
+- name: fhir_medicationrequest
   columns:
   - name: Id
     type: Text
@@ -192,23 +192,23 @@ tables:
     columns:
     - PatientId
   foreignKeys:
-  - name: FK_fhir_MedicationRequest_PatientId
+  - name: FK_fhir_medicationrequest_PatientId
     columns:
     - PatientId
-    referencedTable: fhir_Patient
+    referencedTable: fhir_patient
     referencedColumns:
     - Id
-  - name: FK_fhir_MedicationRequest_EncounterId
+  - name: FK_fhir_medicationrequest_EncounterId
     columns:
     - EncounterId
-    referencedTable: fhir_Encounter
+    referencedTable: fhir_encounter
     referencedColumns:
     - Id
   primaryKey:
-    name: PK_fhir_MedicationRequest
+    name: PK_fhir_medicationrequest
     columns:
     - Id
-- name: sync_Provider
+- name: sync_provider
   columns:
   - name: ProviderId
     type: Text
@@ -222,6 +222,6 @@
     type: Text
     defaultValue: CURRENT_TIMESTAMP
   primaryKey:
-    name: PK_sync_Provider
+    name: PK_sync_provider
     columns:
     - ProviderId
diff --git a/Clinical/Clinical.Api/clinical.db b/Clinical/Clinical.Api/clinical.db
index 1f874dd..831892a 100644
Binary files a/Clinical/Clinical.Api/clinical.db and b/Clinical/Clinical.Api/clinical.db differ
diff --git a/Clinical/Clinical.Sync/Clinical.Sync.csproj b/Clinical/Clinical.Sync/Clinical.Sync.csproj
index 314a31d..4f59854 100644
--- a/Clinical/Clinical.Sync/Clinical.Sync.csproj
+++ b/Clinical/Clinical.Sync/Clinical.Sync.csproj
@@ -7,8 +7,8 @@
   <ItemGroup>
     <PackageReference Include="Npgsql" Version="9.0.2" />
     <PackageReference Include="Microsoft.Extensions.Hosting" Version="10.0.0" />
-    <PackageReference Include="MelbourneDev.Sync" Version="0.1.0" />
-    <PackageReference Include="MelbourneDev.Sync.Postgres" Version="0.1.0" />
+    <PackageReference Include="Nimblesite.Sync.Core" Version="$(DataProviderVersion)" />
+    <PackageReference Include="Nimblesite.Sync.Postgres" Version="$(DataProviderVersion)" />
   </ItemGroup>

   <ItemGroup>
diff --git a/Clinical/Clinical.Sync/SyncMappings.json b/Clinical/Clinical.Sync/SyncMappings.json
index b5cefb2..f090ab5 100644
--- a/Clinical/Clinical.Sync/SyncMappings.json
+++ b/Clinical/Clinical.Sync/SyncMappings.json
@@ -2,7 +2,7 @@
   "mappings": [
     {
       "source_table": "fhir_Practitioner",
-      "target_table": "sync_Provider",
+      "target_table": "sync_provider",
       "column_mappings": [
         {
           "source": "Id",
diff --git a/Clinical/Clinical.Sync/SyncWorker.cs b/Clinical/Clinical.Sync/SyncWorker.cs
index 307ce41..2c5894e 100644
--- a/Clinical/Clinical.Sync/SyncWorker.cs
+++ b/Clinical/Clinical.Sync/SyncWorker.cs
@@ -7,7 +7,7 @@
 namespace Clinical.Sync;

 /// <summary>
-/// Background service that pulls Practitioner data from Scheduling.Api and maps to sync_Provider.
+/// Background service that pulls Practitioner data from Scheduling.Api and maps to sync_provider.
 /// </summary>
 internal sealed class SyncWorker : BackgroundService
 {
@@ -223,7 +223,7 @@ SyncChange change
     {
         using var cmd = conn.CreateCommand();
         cmd.Transaction = (NpgsqlTransaction)transaction;
-        cmd.CommandText = "DELETE FROM sync_Provider WHERE ProviderId = @id";
+        cmd.CommandText = "DELETE FROM sync_provider WHERE \"ProviderId\" = @id";
         cmd.Parameters.AddWithValue("@id", rowId);
         cmd.ExecuteNonQuery();
         _logger.Log(LogLevel.Debug, "Deleted provider {ProviderId}", rowId);
@@ -244,13 +244,13 @@ SyncChange change
         using var upsertCmd = conn.CreateCommand();
         upsertCmd.Transaction = (NpgsqlTransaction)transaction;
         upsertCmd.CommandText = """
-            INSERT INTO sync_Provider (ProviderId, FirstName, LastName, Specialty, SyncedAt)
+            INSERT INTO sync_provider ("ProviderId", "FirstName", "LastName", "Specialty", "SyncedAt")
             VALUES (@providerId, @firstName, @lastName, @specialty, @syncedAt)
-            ON CONFLICT(ProviderId) DO UPDATE SET
-                FirstName = @firstName,
-                LastName = @lastName,
-                Specialty = @specialty,
-                SyncedAt = @syncedAt
+            ON CONFLICT("ProviderId") DO UPDATE SET
+                "FirstName" = @firstName,
+                "LastName" = @lastName,
+                "Specialty" = @specialty,
+                "SyncedAt" = @syncedAt
             """;

         upsertCmd.Parameters.AddWithValue(
diff --git a/Dashboard/Dashboard.Integration.Tests/Dashboard.Integration.Tests.csproj b/Dashboard/Dashboard.Integration.Tests/Dashboard.Integration.Tests.csproj
index 5ae22d2..beaec13 100644
--- a/Dashboard/Dashboard.Integration.Tests/Dashboard.Integration.Tests.csproj
+++ b/Dashboard/Dashboard.Integration.Tests/Dashboard.Integration.Tests.csproj
@@ -21,7 +21,10 @@
     <PackageReference Include="Npgsql" Version="9.0.2" />
     <PackageReference Include="Testcontainers.PostgreSql" Version="4.3.0" />
     <PackageReference Include="Microsoft.Bcl.Memory" Version="10.0.5" />
-    <PackageReference Include="MelbourneDev.Gatekeeper" Version="0.1.0" />
+    <PackageReference Include="coverlet.collector" Version="6.0.4">
+      <PrivateAssets>all</PrivateAssets>
+      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
+    </PackageReference>
   </ItemGroup>

   <ItemGroup>
@@ -29,6 +32,8 @@
     <ProjectReference Include="../../Clinical/Clinical.Sync/Clinical.Sync.csproj" />
     <ProjectReference Include="../../Scheduling/Scheduling.Api/Scheduling.Api.csproj" />
     <ProjectReference Include="../../Scheduling/Scheduling.Sync/Scheduling.Sync.csproj" />
+    <ProjectReference Include="../../Gatekeeper/Gatekeeper.Api/Gatekeeper.Api.csproj" />
+    <ProjectReference Include="../../ICD10/ICD10.TestSupport/ICD10.TestSupport.csproj" />
   </ItemGroup>

   <!-- Copy Dashboard.Web wwwroot for Playwright tests -->
diff --git a/Dashboard/Dashboard.Integration.Tests/E2EFixture.cs b/Dashboard/Dashboard.Integration.Tests/E2EFixture.cs
index 86cfe34..2ebc415 100644
--- a/Dashboard/Dashboard.Integration.Tests/E2EFixture.cs
+++ b/Dashboard/Dashboard.Integration.Tests/E2EFixture.cs
@@ -3,6 +3,7 @@
 using System.Security.Cryptography;
 using System.Text;
 using System.Text.Json;
+using ICD10.TestSupport;
 using Microsoft.AspNetCore.Builder;
 using Microsoft.AspNetCore.Hosting;
 using Microsoft.AspNetCore.Hosting.Server;
@@ -19,7 +20,7 @@ namespace Dashboard.Integration.Tests;
 /// <summary>
 /// Shared fixture that starts all services ONCE for all E2E tests.
 /// Set E2E_USE_LOCAL=true to skip Testcontainers/process startup and run against
-/// an already-running local dev stack (started via scripts/start-local.sh).
+/// an already-running local dev stack (started via `make start-local`).
 /// </summary>
 public sealed class E2EFixture : IAsyncLifetime
 {
@@ -141,14 +142,28 @@ await Task.WhenAll(
         var samplesDir = Path.GetFullPath(
             Path.Combine(testAssemblyDir, "..", "..", "..", "..", "..")
         );
-        var rootDir = Path.GetFullPath(Path.Combine(samplesDir, ".."));
-
-        // Run ICD-10 migration and import official CDC data
-        await SetupIcd10DatabaseAsync(icd10ConnStr, samplesDir, rootDir);
+        // Run ICD-10 migration and import official CDC data.
+        // ICD-10 is optional in the E2E suite (see ICD-10 API skip block below) - if
+        // setup fails (e.g. embedding service or Python toolchain unavailable), continue
+        // without it instead of failing the entire fixture.
+        var icd10Ready = false;
+        try
+        {
+            await SetupIcd10DatabaseAsync(icd10ConnStr, samplesDir);
+            icd10Ready = true;
+        }
+        catch (Exception ex)
+        {
+            Console.WriteLine(
+                $"[E2E] WARNING: ICD-10 database setup failed ({ex.Message}); "
+                    + "ICD-10 dependent tests will be skipped"
+            );
+        }

         var clinicalProjectDir = Path.Combine(samplesDir, "Clinical", "Clinical.Api");
         var schedulingProjectDir = Path.Combine(samplesDir, "Scheduling", "Scheduling.Api");
-        var gatekeeperProjectDir = Path.Combine(rootDir, "Gatekeeper", "Gatekeeper.Api");
+        var gatekeeperProjectDir = Path.Combine(samplesDir, "Gatekeeper", "Gatekeeper.Api");
         var icd10ProjectDir = Path.Combine(samplesDir, "ICD10", "ICD10.Api");

         var configuration = ResolveBuildConfiguration(testAssemblyDir);
@@ -226,12 +241,12 @@ await Task.WhenAll(
             ["ConnectionStrings__Postgres"] = icd10ConnStr,
             ["ConnectionStrings__DefaultConnection"] = icd10ConnStr,
         };
-        if (File.Exists(icd10Dll))
+        if (icd10Ready && File.Exists(icd10Dll))
         {
             _icd10Process = StartApiFromDll(icd10Dll, icd10ProjectDir, Icd10Url, icd10Env);
             Console.WriteLine($"[E2E] ICD-10 API starting on {Icd10Url}");
         }
-        else
+        else if (!File.Exists(icd10Dll))
         {
             Console.WriteLine($"[E2E] ICD-10 API DLL missing: {icd10Dll}");
         }
@@ -841,27 +856,6 @@ private static async Task WaitForServiceReachableAsync(string baseUrl, string en
         );
     }

-    /// <summary>
-    /// Waits for a service to be reachable (any HTTP response).
-    /// Used in local mode where services may be running but have DB issues.
-    /// </summary>
-    private static async Task WaitForServiceReachableAsync(string baseUrl, string endpoint)
-    {
-        using var client = new HttpClient { Timeout = TimeSpan.FromSeconds(2) };
-        for (var i = 0; i < 60; i++)
-        {
-            try
-            {
-                _ = await client.GetAsync($"{baseUrl}{endpoint}");
-                Console.WriteLine($"[E2E] Service reachable: {baseUrl}");
-                return;
-            }
-            catch { }
-            await Task.Delay(500);
-        }
-        throw new TimeoutException($"Service at {baseUrl} is not reachable");
-    }
-
     /// <summary>
     /// Creates an authenticated HTTP client with test JWT token.
     /// </summary>
@@ -1061,117 +1055,27 @@ private static async Task SeedAsync(HttpClient client, string url, string json)
     /// Sets up the ICD-10 database by running migration and importing official CDC data.
     /// Skips import if data already exists in the database.
     /// </summary>
-    private static async Task SetupIcd10DatabaseAsync(
-        string connectionString,
-        string samplesDir,
-        string rootDir
-    )
+    private static async Task SetupIcd10DatabaseAsync(string connectionString, string samplesDir)
     {
         Console.WriteLine("[E2E] Setting up ICD-10 database...");

         var icd10ProjectDir = Path.Combine(samplesDir, "ICD10", "ICD10.Api");
         var schemaPath = Path.Combine(icd10ProjectDir, "icd10-schema.yaml");
-        var migrationCliDir = Path.Combine(rootDir, "Migration", "Migration.Cli");
-        var scriptsDir = Path.Combine(samplesDir, "ICD10", "scripts", "CreateDb");

         // Check if schema already exists and has data
         if (await Icd10DatabaseHasDataAsync(connectionString))
         {
             Console.WriteLine(
-                "[E2E] ICD-10 database already has data - skipping migration and import"
+                "[E2E] ICD-10 database already has data - skipping migration and seed"
             );
             return;
         }

-        // Step 1: Run migration to create schema
-        Console.WriteLine("[E2E] Running ICD-10 schema migration...");
-        var configuration = ResolveBuildConfiguration(
-            Path.GetDirectoryName(typeof(E2EFixture).Assembly.Location)!
-        );
-        var migrationDll = Path.Combine(
-            migrationCliDir,
-            "bin",
-            configuration,
-            "net10.0",
-            "Migration.Cli.dll"
-        );
-
-        int migrationResult;
-        if (File.Exists(migrationDll))
-        {
-            Console.WriteLine($"[E2E] Using pre-built Migration.Cli: {migrationDll}");
-            migrationResult = await RunProcessAsync(
-                "dotnet",
-                $"exec \"{migrationDll}\" --schema \"{schemaPath}\" --output \"{connectionString}\" --provider postgres",
-                rootDir,
-                timeoutMs: 600_000
-            );
-        }
-        else
-        {
-            Console.WriteLine(
-                $"[E2E] Migration.Cli DLL not found at {migrationDll}, falling back to dotnet run"
-            );
-            migrationResult = await RunProcessAsync(
-                "dotnet",
-                $"run --project \"{migrationCliDir}\" -- --schema \"{schemaPath}\" --output \"{connectionString}\" --provider postgres",
-                rootDir,
-                timeoutMs: 600_000
-            );
-        }
-
-        if (migrationResult != 0)
-        {
-            throw new Exception($"ICD-10 migration failed with exit code {migrationResult}");
-        }
-
-        Console.WriteLine("[E2E] ICD-10 schema created successfully");
-
-        // Step 2: Set up Python virtual environment
-        var venvDir = Path.Combine(samplesDir, "ICD10", ".venv");
-        var pythonScript = Path.Combine(scriptsDir, "import_postgres.py");
-
-        if (!File.Exists(pythonScript))
-        {
-            throw new FileNotFoundException($"ICD-10 import script not found: {pythonScript}");
-        }
-
-        Console.WriteLine("[E2E] Setting up Python environment...");
-        if (!Directory.Exists(venvDir))
-        {
-            var venvResult = await RunProcessAsync("python3", $"-m venv \"{venvDir}\"", scriptsDir);
-            if (venvResult != 0)
-            {
-                throw new Exception($"Failed to create Python virtual environment");
-            }
-        }
-
-        // Install requirements
-        var requirementsPath = Path.Combine(scriptsDir, "requirements.txt");
-        var pipResult = await RunProcessAsync(
-            $"{venvDir}/bin/pip",
-            $"install -r \"{requirementsPath}\"",
-            scriptsDir
-        );
-        if (pipResult != 0)
-        {
-            throw new Exception($"Failed to install Python dependencies");
-        }
-
-        // Step 3: Import official CDC ICD-10 data
-        Console.WriteLine("[E2E] Importing official CDC ICD-10 data...");
-        var importResult = await RunProcessAsync(
-            $"{venvDir}/bin/python",
-            $"\"{pythonScript}\" --connection-string \"{connectionString}\"",
-            scriptsDir,
-            timeoutMs: 600_000
-        );
-
-        if (importResult != 0)
-        {
-            throw new Exception($"ICD-10 data import failed with exit code {importResult}");
-        }
-
+        // Apply schema and seed deterministic E2E reference data via the shared
+        // ICD10.TestSupport library. This avoids the ~3-minute Python CDC import
+        // (44k codes + embeddings) that previously made every dashboard run hang.
+        Console.WriteLine("[E2E] Applying ICD-10 schema and seeding test data...");
+        await Task.Run(() => Icd10TestDatabase.Initialize(connectionString, schemaPath));
         Console.WriteLine("[E2E] ICD-10 database setup complete");
     }
diff --git a/Dashboard/Dashboard.Integration.Tests/xunit.runner.json b/Dashboard/Dashboard.Integration.Tests/xunit.runner.json
index 8d67726..34723a8 100644
--- a/Dashboard/Dashboard.Integration.Tests/xunit.runner.json
+++ b/Dashboard/Dashboard.Integration.Tests/xunit.runner.json
@@ -3,6 +3,7 @@
   "parallelizeAssembly": false,
   "parallelizeTestCollections": false,
   "maxParallelThreads": 1,
+  "stopOnFail": true,
   "diagnosticMessages": true,
   "longRunningTestSeconds": 30,
   "methodDisplay": "method"
diff --git a/Dashboard/Dashboard.Web/.config/dotnet-tools.json b/Dashboard/Dashboard.Web/.config/dotnet-tools.json
deleted file mode 100644
index c93ea06..0000000
--- a/Dashboard/Dashboard.Web/.config/dotnet-tools.json
+++ /dev/null
@@ -1,13 +0,0 @@
-{
-  "version": 1,
-  "isRoot": true,
-  "tools": {
-    "h5-compiler": {
-      "version": "26.3.64893",
-      "commands": [
-        "h5"
-      ],
-      "rollForward": false
-    }
-  }
-}
\ No newline at end of file
diff --git a/Directory.Build.props b/Directory.Build.props
index d030a72..8c7bbfb 100644
--- a/Directory.Build.props
+++ b/Directory.Build.props
@@ -3,6 +3,8 @@
     <NuGetAudit>false</NuGetAudit>
     <NuGetAuditMode>disabled</NuGetAuditMode>
     <RestoreAuditProperties>false</RestoreAuditProperties>
+    <!-- Single source of truth for every Nimblesite DataProvider / Lql / Sync / Migration package version. -->
+    <DataProviderVersion>0.9.6-beta</DataProviderVersion>
     <Version>0.1.0</Version>
     <Authors>ChristianFindlay</Authors>
     <Company>MelbourneDeveloper</Company>
@@ -18,10 +20,9 @@
     <Nullable>enable</Nullable>
     <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
     <WarningsAsErrors>IDE0301;IDE0063;IDE0005;MSB3243</WarningsAsErrors>
-    <NoWarn>CA1016;CA1303;EPS06;IDE0290;CA1062;CA1002;IDE0090;CA1017;CS8509;IDE0037;NU1900;NU1901;NU1902;NU1903;NU1904</NoWarn>
+    <NoWarn>CA1016;CA1303;EPS06;IDE0290;CA1062;CA1002;IDE0090;CA1017;CS8509;IDE0037;NU1900;NU1901;NU1902;NU1903;NU1904;CS1591</NoWarn>
     <WarningsNotAsErrors>$(WarningsNotAsErrors);CA1303;EPS06;CA1016;IDE0290;CA1062;CA1002;CA1017;CS8509;IDE0037</WarningsNotAsErrors>
     <WarningLevel>9999</WarningLevel>
-    <EnforceExtendedAnalyzerRules>true</EnforceExtendedAnalyzerRules>
     <EnableNETAnalyzers>true</EnableNETAnalyzers>
     <AnalysisMode>All</AnalysisMode>
     <EnforceCodeStyleInBuild>true</EnforceCodeStyleInBuild>
@@ -60,6 +61,14 @@
     </PackageReference>
   </ItemGroup>

+  <!-- Shared xunit runner config (stopOnFail) for every test project that doesn't override it -->
+  <ItemGroup Condition="'$(IsTestProject)' == 'true' And !Exists('$(MSBuildProjectDirectory)\xunit.runner.json')">
+    <Content Include="$(MSBuildThisFileDirectory)xunit.runner.json">
+      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
+      <Link>xunit.runner.json</Link>
+    </Content>
+  </ItemGroup>
+
   <!-- Code Analysis packages only for non-test projects -->
   <ItemGroup Condition="'$(IsTestProject)' != 'true'">
     <PackageReference Include="Microsoft.CodeAnalysis.CSharp" Version="4.5.0" PrivateAssets="all" />
@@ -92,5 +101,4 @@
       <IncludeAssets>runtime; build; native; contentfiles; analyzers</IncludeAssets>
     </PackageReference>
   </ItemGroup>
-
 </Project>
diff --git a/Gatekeeper/Gatekeeper.Api.Tests/AuthenticationTests.cs b/Gatekeeper/Gatekeeper.Api.Tests/AuthenticationTests.cs
new file mode 100644
index 0000000..92da28e
--- /dev/null
+++ b/Gatekeeper/Gatekeeper.Api.Tests/AuthenticationTests.cs
@@ -0,0 +1,306 @@
+namespace Gatekeeper.Api.Tests;
+
+/// <summary>
+/// Integration tests for Gatekeeper authentication endpoints.
+/// Tests WebAuthn/FIDO2 passkey registration and login flows.
+/// </summary>
+public sealed class AuthenticationTests : IClassFixture<GatekeeperTestFixture>
+{
+    private readonly HttpClient _client;
+
+    public AuthenticationTests(GatekeeperTestFixture fixture)
+    {
+        _client = fixture.CreateClient();
+    }
+
+    [Fact]
+    public async Task RegisterBegin_WithValidEmail_ReturnsChallenge()
+    {
+        var request = new { Email = "test@example.com", DisplayName = "Test User" };
+
+        var response = await _client.PostAsJsonAsync("/auth/register/begin", request);
+
+        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
+
+        var content = await response.Content.ReadAsStringAsync();
+        var doc = JsonDocument.Parse(content);
+
+        Assert.True(doc.RootElement.TryGetProperty("ChallengeId", out var challengeId));
+        Assert.False(string.IsNullOrEmpty(challengeId.GetString()));
+
+        // API returns OptionsJson as a JSON string (for JS to parse)
+        Assert.True(doc.RootElement.TryGetProperty("OptionsJson", out var optionsJson));
+        var parsedOptions = JsonDocument.Parse(optionsJson.GetString()!);
+        Assert.True(parsedOptions.RootElement.TryGetProperty("challenge", out _));
+    }
+
+    [Fact]
+    public async Task RegisterBegin_RequiresResidentKey_ForDiscoverableCredentials()
+    {
+        // Registration must require resident keys so login works without email
+        var request = new { Email = "resident@example.com", DisplayName = "Resident User" };
+
+        var response = await _client.PostAsJsonAsync("/auth/register/begin", request);
+
+        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
+
+        var content = await response.Content.ReadAsStringAsync();
+        var doc = JsonDocument.Parse(content);
+        var optionsJson = doc.RootElement.GetProperty("OptionsJson").GetString()!;
+        var options = JsonDocument.Parse(optionsJson);
+
+        // Verify authenticatorSelection requires resident key
+        Assert.True(
+            options.RootElement.TryGetProperty("authenticatorSelection", out var authSelection)
+        );
+        Assert.True(authSelection.TryGetProperty("residentKey", out var residentKey));
+        Assert.Equal("required", residentKey.GetString());
+    }
+
+    [Fact]
+    public async Task RegisterBegin_RequiresUserVerification()
+    {
+        // Registration must require user verification for security
+        var request = new { Email = "verify@example.com", DisplayName = "Verify User" };
+
+        var response = await _client.PostAsJsonAsync("/auth/register/begin", request);
+
+        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
+
+        var content = await response.Content.ReadAsStringAsync();
+        var doc = JsonDocument.Parse(content);
+        var optionsJson = doc.RootElement.GetProperty("OptionsJson").GetString()!;
+        var options = JsonDocument.Parse(optionsJson);
+
+        var authSelection = options.RootElement.GetProperty("authenticatorSelection");
+        Assert.True(authSelection.TryGetProperty("userVerification", out var userVerification));
+        Assert.Equal("required", userVerification.GetString());
+    }
+
+    [Fact]
+    public async Task LoginBegin_WithEmptyBody_ReturnsChallenge_ForDiscoverableCredentials()
+    {
+        // Discoverable credentials flow: no email needed, browser shows all passkeys
+        // Server returns challenge with empty allowCredentials
+        var response = await _client.PostAsJsonAsync("/auth/login/begin", new { });
+
+        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
+
+        var content = await response.Content.ReadAsStringAsync();
+        var doc = JsonDocument.Parse(content);
+
+        // Should return a valid challenge
+        Assert.True(doc.RootElement.TryGetProperty("ChallengeId", out var challengeId));
+        Assert.False(string.IsNullOrEmpty(challengeId.GetString()));
+
+        // Verify options structure
+        Assert.True(doc.RootElement.TryGetProperty("OptionsJson", out var optionsJson));
+        var options = JsonDocument.Parse(optionsJson.GetString()!);
+        Assert.True(options.RootElement.TryGetProperty("challenge", out _));
+
+        // allowCredentials should be empty for discoverable credentials
+        Assert.True(
+            options.RootElement.TryGetProperty("allowCredentials", out var allowCredentials)
+        );
+        Assert.Equal(JsonValueKind.Array, allowCredentials.ValueKind);
+        Assert.Equal(0, allowCredentials.GetArrayLength());
+    }
+
+    [Fact]
+    public async Task LoginBegin_RequiresUserVerification()
+    {
+        // Login must require user verification (Touch ID, Face ID, etc.)
+        var response = await _client.PostAsJsonAsync("/auth/login/begin", new { });
+
+        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
+
+        var content = await response.Content.ReadAsStringAsync();
+        var doc = JsonDocument.Parse(content);
+        var optionsJson = doc.RootElement.GetProperty("OptionsJson").GetString()!;
+        var options = JsonDocument.Parse(optionsJson);
+
+        Assert.True(
+            options.RootElement.TryGetProperty("userVerification", out var userVerification)
+        );
+        Assert.Equal("required", userVerification.GetString());
+    }
+
+    [Fact]
+    public async Task LoginComplete_WithInvalidChallengeId_ReturnsError()
+    {
+        // Attempting to complete login with invalid challenge should fail
+        // The endpoint validates the challenge ID and returns an error
+        var request = new
+        {
+            ChallengeId = "non-existent-challenge-id",
+            OptionsJson = "{}",
+            AssertionResponse = new
+            {
+                Id = "ZmFrZS1jcmVkZW50aWFsLWlk", // base64url encoded
+                RawId = "ZmFrZS1jcmVkZW50aWFsLWlk",
+                Type = "public-key",
+                Response = new
+                {
+                    AuthenticatorData = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA",
+                    ClientDataJson = "eyJ0eXBlIjoid2ViYXV0aG4uZ2V0IiwiY2hhbGxlbmdlIjoiYWFhYSIsIm9yaWdpbiI6Imh0dHA6Ly9sb2NhbGhvc3Q6NTE3MyJ9",
+                    Signature = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA",
+                    UserHandle = (string?)null,
+                },
+            },
+        };
+
+        var response = await _client.PostAsJsonAsync("/auth/login/complete", request);
+
+        // Should return an error (either BadRequest for validation or Problem for processing)
+        Assert.True(
+            response.StatusCode is HttpStatusCode.BadRequest or HttpStatusCode.InternalServerError,
+            $"Expected error status code but got {response.StatusCode}"
+        );
+    }
+
+    [Fact]
+    public async Task RegisterComplete_WithInvalidChallengeId_ReturnsError()
+    {
+        // Attempting to complete registration with invalid challenge should fail
+        var request = new
+        {
+            ChallengeId = "non-existent-challenge-id",
+            OptionsJson = "{}",
+            AttestationResponse = new
+            {
+                Id = "ZmFrZS1jcmVkZW50aWFsLWlk", // base64url encoded
+                RawId = "ZmFrZS1jcmVkZW50aWFsLWlk",
+                Type = "public-key",
+                Response = new
+                {
+                    AttestationObject = "o2NmbXRkbm9uZWdhdHRTdG10oGhhdXRoRGF0YVjE",
+                    ClientDataJson = "eyJ0eXBlIjoid2ViYXV0aG4uY3JlYXRlIiwiY2hhbGxlbmdlIjoiYWFhYSIsIm9yaWdpbiI6Imh0dHA6Ly9sb2NhbGhvc3Q6NTE3MyJ9",
+                },
+            },
+        };
+
+        var response = await _client.PostAsJsonAsync("/auth/register/complete", request);
+
+        // Should return an error (either BadRequest for validation or Problem for processing)
+        Assert.True(
+            response.StatusCode is HttpStatusCode.BadRequest or HttpStatusCode.InternalServerError,
+            $"Expected error status code but got {response.StatusCode}"
+        );
+    }
+
+    [Fact]
+    public async Task Session_WithoutToken_ReturnsUnauthorized()
+    {
+        var response = await _client.GetAsync("/auth/session");
+
+        Assert.Equal(HttpStatusCode.Unauthorized, response.StatusCode);
+    }
+
+    [Fact]
+    public async Task Session_WithInvalidToken_ReturnsUnauthorized()
+    {
+        _client.DefaultRequestHeaders.Authorization =
+            new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", "invalid-token");
+
+        var response = await _client.GetAsync("/auth/session");
+
+        Assert.Equal(HttpStatusCode.Unauthorized, response.StatusCode);
+    }
+
+    [Fact]
+    public async Task Logout_WithoutToken_ReturnsUnauthorized()
+    {
+        var response = await _client.PostAsync("/auth/logout", null);
+
+        Assert.Equal(HttpStatusCode.Unauthorized, response.StatusCode);
+    }
+}
+
+/// <summary>
+/// Tests for Base64Url encoding used in WebAuthn credential IDs.
+/// </summary>
+public sealed class Base64UrlTests
+{
+    [Fact]
+    public void Encode_ProducesUrlSafeOutput()
+    {
+        // Standard base64 uses + and /, base64url uses - and _
+        var input = new byte[] { 0xfb, 0xff, 0xfe }; // Would produce +//+ in standard base64
+
+        var result = Base64Url.Encode(input);
+
+        Assert.DoesNotContain("+", result);
+        Assert.DoesNotContain("/", result);
+        Assert.DoesNotContain("=", result);
+        Assert.Contains("-", result); // Should use - instead of +
+        Assert.Contains("_", result); // Should use _ instead of /
+    }
+
+    [Fact]
+    public void Encode_Decode_RoundTrip()
+    {
+        var original = new byte[] { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 };
+
+        var encoded = Base64Url.Encode(original);
+        var decoded = Base64Url.Decode(encoded);
+
+        Assert.Equal(original, decoded);
+    }
+
+    [Fact]
+    public void Decode_HandlesNoPadding()
+    {
+        // base64url typically omits padding
+        var encoded = "AQIDBA"; // No = padding
+
+        var decoded = Base64Url.Decode(encoded);
+
+        Assert.Equal(new byte[] { 1, 2, 3, 4 }, decoded);
+    }
+
+    [Fact]
+    public void Decode_HandlesUrlSafeCharacters()
+    {
+        // Test decoding with - and _ (url-safe chars)
+        var encoded = "-_8"; // base64url for 0xfb, 0xff
+
+        var decoded = Base64Url.Decode(encoded);
+
+        Assert.Equal(new byte[] { 0xfb, 0xff }, decoded);
+    }
+
+    [Fact]
+    public void Encode_MatchesWebAuthnCredentialIdFormat()
+    {
+        // WebAuthn credential IDs use base64url encoding
+        // This test verifies our encoding matches the expected format
+        var credentialId = new byte[]
+        {
+            0x01,
+            0x02,
+            0x03,
+            0x04,
+            0x05,
+            0x06,
+            0x07,
+            0x08,
+            0x09,
+            0x0a,
+            0x0b,
+            0x0c,
+            0x0d,
+            0x0e,
+            0x0f,
+            0x10,
+        };
+
+        var encoded = Base64Url.Encode(credentialId);
+
+        // Should be AQIDBAUGBwgJCgsMDQ4PEA (no padding)
+        Assert.Equal("AQIDBAUGBwgJCgsMDQ4PEA", encoded);
+
+        // Verify round-trip
+        var decoded = Base64Url.Decode(encoded);
+        Assert.Equal(credentialId, decoded);
+    }
+}
diff --git a/Gatekeeper/Gatekeeper.Api.Tests/AuthorizationTests.cs b/Gatekeeper/Gatekeeper.Api.Tests/AuthorizationTests.cs
new file mode 100644
index 0000000..ce92188
--- /dev/null
+++ b/Gatekeeper/Gatekeeper.Api.Tests/AuthorizationTests.cs
@@ -0,0 +1,646 @@
+using System.Globalization;
+using Npgsql;
+using Outcome;
+
+namespace Gatekeeper.Api.Tests;
+
+/// <summary>
+/// Integration tests for Gatekeeper authorization endpoints.
+/// Tests RBAC permission checks, resource grants, and bulk evaluation.
+/// </summary>
+public sealed class AuthorizationTests : IClassFixture<GatekeeperTestFixture>
+{
+    private readonly GatekeeperTestFixture _fixture;
+
+    public AuthorizationTests(GatekeeperTestFixture fixture) => _fixture = fixture;
+
+    [Fact]
+    public async Task Check_WithoutToken_ReturnsUnauthorized()
+    {
+        var client = _fixture.CreateClient();
+
+        var response = await client.GetAsync("/authz/check?permission=test:read");
+
+        Assert.Equal(HttpStatusCode.Unauthorized, response.StatusCode);
+    }
+
+    [Fact]
+    public async Task Check_WithInvalidToken_ReturnsUnauthorized()
+    {
+        var client = _fixture.CreateClient();
+        client.DefaultRequestHeaders.Authorization =
+            new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", "invalid-token");
+
+        var response = await client.GetAsync("/authz/check?permission=test:read");
+
+        Assert.Equal(HttpStatusCode.Unauthorized, response.StatusCode);
+    }
+
+    [Fact]
+    public async Task Check_WithValidToken_UserHasDefaultPermissions_ReturnsAllowed()
+    {
+        var client = _fixture.CreateClient();
+        var token = await _fixture.CreateTestUserAndGetToken("authz-user-1@example.com");
+        client.DefaultRequestHeaders.Authorization =
+            new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token);
+
+        // Debug: Check what's in the database using DataProvider extensions
+        using var conn = _fixture.OpenConnection();
+        var rolePermsResult = await conn.GetRolePermissionsAsync("role-user");
+        var
rolePerms = rolePermsResult switch + { + GetRolePermissionsOk ok => ok.Value.Select(p => $"role-user->{p.code}").ToList(), + GetRolePermissionsError err => [$"(error: {err.Value.Message})"], + }; + + // Default 'user' role has 'user:profile' permission + var response = await client.GetAsync("/authz/check?permission=user:profile"); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + Assert.True( + doc.RootElement.GetProperty("Allowed").GetBoolean(), + $"Response: {content}, RolePerms: [{string.Join(", ", rolePerms)}]" + ); + Assert.Contains("user:profile", doc.RootElement.GetProperty("Reason").GetString()); + } + + [Fact] + public async Task Check_WithValidToken_UserLacksPermission_ReturnsDenied() + { + var client = _fixture.CreateClient(); + var token = await _fixture.CreateTestUserAndGetToken("authz-user-2@example.com"); + client.DefaultRequestHeaders.Authorization = + new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token); + + // Default 'user' role does NOT have 'admin:users' permission + var response = await client.GetAsync("/authz/check?permission=admin:users"); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + Assert.False(doc.RootElement.GetProperty("Allowed").GetBoolean()); + Assert.Equal("no matching permission", doc.RootElement.GetProperty("Reason").GetString()); + } + + [Fact] + public async Task Check_AdminWildcardPermission_MatchesSubPermissions() + { + var client = _fixture.CreateClient(); + var token = await _fixture.CreateAdminUserAndGetToken("admin-wildcard@example.com"); + client.DefaultRequestHeaders.Authorization = + new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token); + + // Admin role has 'admin:*' which should match 'admin:users' + var response = await 
client.GetAsync("/authz/check?permission=admin:users"); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + Assert.True(doc.RootElement.GetProperty("Allowed").GetBoolean()); + Assert.Contains("admin", doc.RootElement.GetProperty("Reason").GetString()); + } + + [Fact] + public async Task Check_AdminWildcardPermission_MatchesNestedSubPermissions() + { + var client = _fixture.CreateClient(); + var token = await _fixture.CreateAdminUserAndGetToken("admin-nested@example.com"); + client.DefaultRequestHeaders.Authorization = + new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token); + + // Admin role has 'admin:*' which should match 'admin:users:create' + var response = await client.GetAsync("/authz/check?permission=admin:users:create"); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + Assert.True(doc.RootElement.GetProperty("Allowed").GetBoolean()); + } + + [Fact] + public async Task Permissions_WithoutToken_ReturnsUnauthorized() + { + var client = _fixture.CreateClient(); + + var response = await client.GetAsync("/authz/permissions"); + + Assert.Equal(HttpStatusCode.Unauthorized, response.StatusCode); + } + + [Fact] + public async Task Permissions_WithValidToken_ReturnsUserPermissions() + { + var client = _fixture.CreateClient(); + var token = await _fixture.CreateTestUserAndGetToken("authz-perms@example.com"); + client.DefaultRequestHeaders.Authorization = + new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token); + + var response = await client.GetAsync("/authz/permissions"); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + + Assert.True(doc.RootElement.TryGetProperty("Permissions", out var perms)); + 
Assert.Equal(JsonValueKind.Array, perms.ValueKind); + + // Default user role has 'user:profile' and 'user:credentials' + var permCodes = perms + .EnumerateArray() + .Select(p => p.GetProperty("code").GetString()) + .ToList(); + Assert.Contains("user:profile", permCodes); + Assert.Contains("user:credentials", permCodes); + } + + [Fact] + public async Task Permissions_AdminUser_ReturnsAdminPermissions() + { + var client = _fixture.CreateClient(); + var token = await _fixture.CreateAdminUserAndGetToken("admin-perms@example.com"); + client.DefaultRequestHeaders.Authorization = + new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token); + + var response = await client.GetAsync("/authz/permissions"); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + + var perms = doc.RootElement.GetProperty("Permissions"); + var permCodes = perms + .EnumerateArray() + .Select(p => p.GetProperty("code").GetString()) + .ToList(); + Assert.Contains("admin:*", permCodes); + } + + [Fact] + public async Task Evaluate_WithoutToken_ReturnsUnauthorized() + { + var client = _fixture.CreateClient(); + + var request = new + { + Checks = new[] + { + new + { + Permission = "test:read", + ResourceType = (string?)null, + ResourceId = (string?)null, + }, + }, + }; + var response = await client.PostAsJsonAsync("/authz/evaluate", request); + + Assert.Equal(HttpStatusCode.Unauthorized, response.StatusCode); + } + + [Fact] + public async Task Evaluate_WithValidToken_ReturnsBulkResults() + { + var client = _fixture.CreateClient(); + var token = await _fixture.CreateTestUserAndGetToken("authz-eval@example.com"); + client.DefaultRequestHeaders.Authorization = + new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token); + + var request = new + { + Checks = new[] + { + new + { + Permission = "user:profile", + ResourceType = (string?)null, + ResourceId = (string?)null, + }, + 
new + { + Permission = "admin:users", + ResourceType = (string?)null, + ResourceId = (string?)null, + }, + new + { + Permission = "user:credentials", + ResourceType = (string?)null, + ResourceId = (string?)null, + }, + }, + }; + + var response = await client.PostAsJsonAsync("/authz/evaluate", request); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + + Assert.True(doc.RootElement.TryGetProperty("Results", out var results)); + Assert.Equal(3, results.GetArrayLength()); + + var resultsList = results.EnumerateArray().ToList(); + + // user:profile - allowed + Assert.True(resultsList[0].GetProperty("Allowed").GetBoolean()); + + // admin:users - denied + Assert.False(resultsList[1].GetProperty("Allowed").GetBoolean()); + + // user:credentials - allowed + Assert.True(resultsList[2].GetProperty("Allowed").GetBoolean()); + } + + [Fact] + public async Task Evaluate_EmptyChecks_ReturnsEmptyResults() + { + var client = _fixture.CreateClient(); + var token = await _fixture.CreateTestUserAndGetToken("authz-empty@example.com"); + client.DefaultRequestHeaders.Authorization = + new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token); + + var request = new { Checks = Array.Empty<object>() }; + + var response = await client.PostAsJsonAsync("/authz/evaluate", request); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + + Assert.True(doc.RootElement.TryGetProperty("Results", out var results)); + Assert.Equal(0, results.GetArrayLength()); + } + + [Fact] + public async Task Check_WithResourceGrant_AllowsAccessToSpecificResource() + { + var client = _fixture.CreateClient(); + var (token, userId) = await _fixture.CreateTestUserAndGetTokenWithId( + "resource-grant@example.com" + ); + client.DefaultRequestHeaders.Authorization = + new 
System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token); + + // Grant access to a specific patient record + await _fixture.GrantResourceAccess(userId, "patient", "patient-123", "patient:read"); + + var response = await client.GetAsync( + "/authz/check?permission=patient:read&resourceType=patient&resourceId=patient-123" + ); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + Assert.True(doc.RootElement.GetProperty("Allowed").GetBoolean()); + Assert.Contains("resource-grant", doc.RootElement.GetProperty("Reason").GetString()); + } + + [Fact] + public async Task Check_WithResourceGrant_DeniesAccessToDifferentResource() + { + var client = _fixture.CreateClient(); + var (token, userId) = await _fixture.CreateTestUserAndGetTokenWithId( + "resource-deny@example.com" + ); + client.DefaultRequestHeaders.Authorization = + new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token); + + // Grant access only to patient-123 + await _fixture.GrantResourceAccess(userId, "patient", "patient-123", "patient:read"); + + // Check access to patient-456 (should be denied) + var response = await client.GetAsync( + "/authz/check?permission=patient:read&resourceType=patient&resourceId=patient-456" + ); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + Assert.False(doc.RootElement.GetProperty("Allowed").GetBoolean()); + } + + [Fact] + public async Task Check_WithExpiredResourceGrant_DeniesAccess() + { + var client = _fixture.CreateClient(); + var (token, userId) = await _fixture.CreateTestUserAndGetTokenWithId( + "expired-grant@example.com" + ); + client.DefaultRequestHeaders.Authorization = + new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", token); + + // Grant access that's already expired + await 
_fixture.GrantResourceAccessExpired(userId, "order", "order-999", "order:read"); + + var response = await client.GetAsync( + "/authz/check?permission=order:read&resourceType=order&resourceId=order-999" + ); + + Assert.Equal(HttpStatusCode.OK, response.StatusCode); + var content = await response.Content.ReadAsStringAsync(); + var doc = JsonDocument.Parse(content); + Assert.False(doc.RootElement.GetProperty("Allowed").GetBoolean()); + } +} + +/// <summary> +/// Test fixture providing shared setup for Gatekeeper tests. +/// Creates test users and tokens without WebAuthn ceremony. +/// Uses PostgreSQL test database. +/// </summary> +public sealed class GatekeeperTestFixture : IDisposable +{ + private readonly WebApplicationFactory<Program> _factory; + private readonly byte[] _signingKey; + private readonly string _dbName; + private readonly string _connectionString; + + public GatekeeperTestFixture() + { + var baseConnectionString = + Environment.GetEnvironmentVariable("TEST_POSTGRES_CONNECTION") + ?? 
"Host=localhost;Database=postgres;Username=postgres;Password=changeme"; + + _dbName = $"test_gatekeeper_{Guid.NewGuid():N}"; + _signingKey = new byte[32]; + + // Create test database + using (var adminConn = new NpgsqlConnection(baseConnectionString)) + { + adminConn.Open(); + using var createCmd = adminConn.CreateCommand(); + createCmd.CommandText = $"CREATE DATABASE {_dbName}"; + createCmd.ExecuteNonQuery(); + } + + // Build connection string for test database + _connectionString = baseConnectionString.Replace( + "Database=postgres", + $"Database={_dbName}" + ); + + _factory = new WebApplicationFactory<Program>().WithWebHostBuilder(builder => + { + builder.UseSetting("ConnectionStrings:Postgres", _connectionString); + builder.UseSetting("Jwt:SigningKey", Convert.ToBase64String(_signingKey)); + }); + + // Initialize database by making HTTP requests through the factory + // This ensures the app creates and seeds the database before we access it directly + using var client = _factory.CreateClient(); + // Make a request that forces full app initialization + _ = client.PostAsJsonAsync("/auth/login/begin", new { }).GetAwaiter().GetResult(); + } + + /// <summary>Creates a fresh HTTP client for testing.</summary> + public HttpClient CreateClient() => _factory.CreateClient(); + + /// <summary>Opens a database connection for direct data access.</summary> + public NpgsqlConnection OpenConnection() + { + var conn = new NpgsqlConnection(_connectionString); + conn.Open(); + return conn; + } + + /// <summary> + /// Creates a test user and returns a valid JWT token. + /// Bypasses WebAuthn by directly inserting user and generating token. + /// Uses DataProvider generated methods for data access. + /// </summary> + public async Task<string> CreateTestUserAndGetToken(string email) + { + var (token, _) = await CreateTestUserAndGetTokenWithId(email).ConfigureAwait(false); + return token; + } + + /// <summary> + /// Creates a test user and returns both the token and user ID. 
+ /// Uses DataProvider generated methods for data access. + /// </summary> + public async Task<(string Token, string UserId)> CreateTestUserAndGetTokenWithId(string email) + { + using var conn = OpenConnection(); + await using var tx = await conn.BeginTransactionAsync().ConfigureAwait(false); + + var userId = Guid.NewGuid().ToString(); + var now = DateTime.UtcNow.ToString("o", CultureInfo.InvariantCulture); + + // Insert user using DataProvider generated method + await tx.Insertgk_userAsync( + userId, + "Test User", + email, + now, + null, // last_login_at + true, // is_active + null // metadata + ) + .ConfigureAwait(false); + + // Link user to role using DataProvider generated method + await tx.Insertgk_user_roleAsync( + userId, + "role-user", + now, + null, // granted_by + null // expires_at + ) + .ConfigureAwait(false); + + await tx.CommitAsync().ConfigureAwait(false); + + var token = TokenService.CreateToken( + userId, + "Test User", + email, + ["user"], + _signingKey, + TimeSpan.FromHours(1) + ); + + return (token, userId); + } + + /// <summary> + /// Creates an admin user and returns a valid JWT token. + /// Uses DataProvider generated methods for data access. 
+ /// </summary> + public async Task<string> CreateAdminUserAndGetToken(string email) + { + using var conn = OpenConnection(); + await using var tx = await conn.BeginTransactionAsync().ConfigureAwait(false); + + var userId = Guid.NewGuid().ToString(); + var now = DateTime.UtcNow.ToString("o", CultureInfo.InvariantCulture); + + // Insert user using DataProvider generated method + await tx.Insertgk_userAsync( + userId, + "Admin User", + email, + now, + null, // last_login_at + true, // is_active + null // metadata + ) + .ConfigureAwait(false); + + // Link user to admin role using DataProvider generated method + await tx.Insertgk_user_roleAsync( + userId, + "role-admin", + now, + null, // granted_by + null // expires_at + ) + .ConfigureAwait(false); + + await tx.CommitAsync().ConfigureAwait(false); + + var token = TokenService.CreateToken( + userId, + "Admin User", + email, + ["admin"], + _signingKey, + TimeSpan.FromHours(1) + ); + + return token; + } + + /// <summary> + /// Grants resource-level access to a user. + /// Uses DataProvider generated methods for data access. + /// </summary> + public async Task GrantResourceAccess( + string userId, + string resourceType, + string resourceId, + string permissionCode + ) + { + using var conn = OpenConnection(); + + // Look up existing permission by code BEFORE starting transaction + var permLookupResult = await conn.GetPermissionByCodeAsync(permissionCode) + .ConfigureAwait(false); + var existingPerm = permLookupResult switch + { + GetPermissionByCodeOk ok => ok.Value.FirstOrDefault(), + GetPermissionByCodeError err => throw new InvalidOperationException( + $"Permission lookup failed: {err.Value.Message}, Exception: {err.Value.InnerException?.Message}" + ), + }; + + var permId = + existingPerm?.id + ?? 
throw new InvalidOperationException( + $"Permission '{permissionCode}' not found in seeded database" + ); + + await using var tx = await conn.BeginTransactionAsync().ConfigureAwait(false); + var now = DateTime.UtcNow.ToString("o", CultureInfo.InvariantCulture); + var grantId = Guid.NewGuid().ToString(); + + // Grant access using DataProvider generated method + var grantResult = await tx.Insertgk_resource_grantAsync( + grantId, + userId, + resourceType, + resourceId, + permId, + now, + null, // granted_by + null // expires_at + ) + .ConfigureAwait(false); + + if (grantResult is Result<Guid?, SqlError>.Error<Guid?, SqlError> grantErr) + { + throw new InvalidOperationException( + $"Failed to insert grant: {grantErr.Value.Message}" + ); + } + + await tx.CommitAsync().ConfigureAwait(false); + } + + /// <summary> + /// Grants resource-level access that has already expired. + /// Uses DataProvider generated methods for data access. + /// </summary> + public async Task GrantResourceAccessExpired( + string userId, + string resourceType, + string resourceId, + string permissionCode + ) + { + using var conn = OpenConnection(); + + // Look up existing permission by code BEFORE starting transaction + var permLookupResult = await conn.GetPermissionByCodeAsync(permissionCode) + .ConfigureAwait(false); + var existingPerm = permLookupResult switch + { + GetPermissionByCodeOk ok => ok.Value.FirstOrDefault(), + GetPermissionByCodeError err => throw new InvalidOperationException( + $"Permission lookup failed: {err.Value.Message}" + ), + }; + + var permId = + existingPerm?.id + ?? 
throw new InvalidOperationException( + $"Permission '{permissionCode}' not found in seeded database" + ); + + await using var tx = await conn.BeginTransactionAsync().ConfigureAwait(false); + var now = DateTime.UtcNow.ToString("o", CultureInfo.InvariantCulture); + var expired = DateTime.UtcNow.AddHours(-1).ToString("o", CultureInfo.InvariantCulture); + var grantId = Guid.NewGuid().ToString(); + + // Grant access with expired timestamp using DataProvider generated method + await tx.Insertgk_resource_grantAsync( + grantId, + userId, + resourceType, + resourceId, + permId, + now, + null, // granted_by + expired // expires_at + ) + .ConfigureAwait(false); + + await tx.CommitAsync().ConfigureAwait(false); + } + + /// <summary>Disposes the test fixture and cleans up test database.</summary> + public void Dispose() + { + _factory.Dispose(); + + var baseConnectionString = + Environment.GetEnvironmentVariable("TEST_POSTGRES_CONNECTION") + ?? "Host=localhost;Database=postgres;Username=postgres;Password=changeme"; + + // Drop the test database + using var adminConn = new NpgsqlConnection(baseConnectionString); + adminConn.Open(); + + // Terminate any existing connections to the database + using var terminateCmd = adminConn.CreateCommand(); + terminateCmd.CommandText = + $"SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = '{_dbName}'"; + terminateCmd.ExecuteNonQuery(); + + using var dropCmd = adminConn.CreateCommand(); + dropCmd.CommandText = $"DROP DATABASE IF EXISTS {_dbName}"; + dropCmd.ExecuteNonQuery(); + } +} diff --git a/Gatekeeper/Gatekeeper.Api.Tests/Gatekeeper.Api.Tests.csproj b/Gatekeeper/Gatekeeper.Api.Tests/Gatekeeper.Api.Tests.csproj new file mode 100644 index 0000000..a84bc10 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api.Tests/Gatekeeper.Api.Tests.csproj @@ -0,0 +1,38 @@ +<Project Sdk="Microsoft.NET.Sdk"> + <PropertyGroup> + <OutputType>Library</OutputType> + <IsTestProject>true</IsTestProject> + 
<RootNamespace>Gatekeeper.Api.Tests</RootNamespace> + <NoWarn>CS1591;CA1707;CA1307;CA1062;CA1515;CA2100;CA1822;CA1859;CA1849;CA2234;CA1812;CA2007;CA2000;xUnit1030</NoWarn> + </PropertyGroup> + + <ItemGroup> + <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.12.0" /> + <PackageReference Include="xunit" Version="2.9.2" /> + <PackageReference Include="xunit.runner.visualstudio" Version="3.0.0"> + <PrivateAssets>all</PrivateAssets> + <IncludeAssets>runtime; build; native; contentfiles; analyzers</IncludeAssets> + </PackageReference> + <PackageReference Include="Microsoft.AspNetCore.Mvc.Testing" Version="9.0.3" /> + <PackageReference Include="Npgsql" Version="9.0.2" /> + <PackageReference + Include="Nimblesite.DataProvider.Migration.Postgres" + Version="$(DataProviderVersion)" + /> + <PackageReference Include="coverlet.collector" Version="6.0.4"> + <PrivateAssets>all</PrivateAssets> + <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets> + </PackageReference> + </ItemGroup> + + <ItemGroup> + <ProjectReference Include="..\Gatekeeper.Api\Gatekeeper.Api.csproj" /> + </ItemGroup> + + <!-- Copy YAML schema from Gatekeeper.Api to test output --> + <ItemGroup> + <Content Include="..\Gatekeeper.Api\gatekeeper-schema.yaml" Link="gatekeeper-schema.yaml"> + <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory> + </Content> + </ItemGroup> +</Project> diff --git a/Gatekeeper/Gatekeeper.Api.Tests/GlobalUsings.cs b/Gatekeeper/Gatekeeper.Api.Tests/GlobalUsings.cs new file mode 100644 index 0000000..70958fe --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api.Tests/GlobalUsings.cs @@ -0,0 +1,51 @@ +#pragma warning disable IDE0005 // Using directive is unnecessary + +global using System.Net; +global using System.Net.Http.Json; +global using System.Text.Json; +global using Generated; +global using Microsoft.AspNetCore.Mvc.Testing; +global using Nimblesite.Sql.Model; +global using Xunit; +global using GetPermissionByCodeError = 
Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetPermissionByCode>, + Nimblesite.Sql.Model.SqlError +>.Error< + System.Collections.Immutable.ImmutableList<Generated.GetPermissionByCode>, + Nimblesite.Sql.Model.SqlError +>; +global using GetPermissionByCodeOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetPermissionByCode>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetPermissionByCode>, + Nimblesite.Sql.Model.SqlError +>; +global using GetRolePermissionsError = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetRolePermissions>, + Nimblesite.Sql.Model.SqlError +>.Error< + System.Collections.Immutable.ImmutableList<Generated.GetRolePermissions>, + Nimblesite.Sql.Model.SqlError +>; +global using GetRolePermissionsOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetRolePermissions>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetRolePermissions>, + Nimblesite.Sql.Model.SqlError +>; +global using GetSessionRevokedError = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetSessionRevoked>, + Nimblesite.Sql.Model.SqlError +>.Error< + System.Collections.Immutable.ImmutableList<Generated.GetSessionRevoked>, + Nimblesite.Sql.Model.SqlError +>; +global using GetSessionRevokedOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetSessionRevoked>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetSessionRevoked>, + Nimblesite.Sql.Model.SqlError +>; diff --git a/Gatekeeper/Gatekeeper.Api.Tests/TokenServiceTests.cs b/Gatekeeper/Gatekeeper.Api.Tests/TokenServiceTests.cs new file mode 100644 index 0000000..29c7ea1 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api.Tests/TokenServiceTests.cs @@ -0,0 +1,597 @@ +using System.Globalization; +using Nimblesite.DataProvider.Migration.Core; +using 
Nimblesite.DataProvider.Migration.Postgres; +using Npgsql; + +namespace Gatekeeper.Api.Tests; + +/// <summary> +/// Unit tests for TokenService JWT creation, validation, and revocation. +/// </summary> +public sealed class TokenServiceTests +{ + private static readonly byte[] TestSigningKey = new byte[32]; + + [Fact] + public void CreateToken_ReturnsValidJwtFormat() + { + var token = TokenService.CreateToken( + "user-123", + "Test User", + "test@example.com", + ["user", "admin"], + TestSigningKey, + TimeSpan.FromHours(1) + ); + + // JWT has 3 parts separated by dots + var parts = token.Split('.'); + Assert.Equal(3, parts.Length); + + // All parts should be base64url encoded (no padding) + Assert.DoesNotContain("=", parts[0]); + Assert.DoesNotContain("=", parts[1]); + Assert.DoesNotContain("=", parts[2]); + } + + [Fact] + public void CreateToken_HeaderContainsCorrectAlgorithm() + { + var token = TokenService.CreateToken( + "user-123", + "Test User", + "test@example.com", + ["user"], + TestSigningKey, + TimeSpan.FromHours(1) + ); + + var parts = token.Split('.'); + var headerJson = Base64UrlDecode(parts[0]); + var header = JsonDocument.Parse(headerJson); + + Assert.Equal("HS256", header.RootElement.GetProperty("alg").GetString()); + Assert.Equal("JWT", header.RootElement.GetProperty("typ").GetString()); + } + + [Fact] + public void CreateToken_PayloadContainsAllClaims() + { + var token = TokenService.CreateToken( + "user-456", + "Jane Doe", + "jane@example.com", + ["admin", "manager"], + TestSigningKey, + TimeSpan.FromHours(2) + ); + + var parts = token.Split('.'); + var payloadJson = Base64UrlDecode(parts[1]); + var payload = JsonDocument.Parse(payloadJson); + + Assert.Equal("user-456", payload.RootElement.GetProperty("sub").GetString()); + Assert.Equal("Jane Doe", payload.RootElement.GetProperty("name").GetString()); + Assert.Equal("jane@example.com", payload.RootElement.GetProperty("email").GetString()); + + var roles = payload + .RootElement.GetProperty("roles") 
+ .EnumerateArray() + .Select(e => e.GetString()) + .ToList(); + Assert.Contains("admin", roles); + Assert.Contains("manager", roles); + + Assert.True(payload.RootElement.TryGetProperty("jti", out var jti)); + Assert.False(string.IsNullOrEmpty(jti.GetString())); + + Assert.True(payload.RootElement.TryGetProperty("iat", out _)); + Assert.True(payload.RootElement.TryGetProperty("exp", out _)); + } + + [Fact] + public void CreateToken_ExpirationIsCorrect() + { + var beforeCreate = DateTimeOffset.UtcNow; + + var token = TokenService.CreateToken( + "user-789", + "Test", + "test@example.com", + [], + TestSigningKey, + TimeSpan.FromMinutes(30) + ); + + var parts = token.Split('.'); + var payloadJson = Base64UrlDecode(parts[1]); + var payload = JsonDocument.Parse(payloadJson); + + var exp = payload.RootElement.GetProperty("exp").GetInt64(); + var iat = payload.RootElement.GetProperty("iat").GetInt64(); + var expTime = DateTimeOffset.FromUnixTimeSeconds(exp); + var iatTime = DateTimeOffset.FromUnixTimeSeconds(iat); + + // exp should be ~30 minutes after iat + var diff = expTime - iatTime; + Assert.True(diff.TotalMinutes >= 29 && diff.TotalMinutes <= 31); + + // exp should be ~30 minutes from now + var expFromNow = expTime - beforeCreate; + Assert.True(expFromNow.TotalMinutes >= 29 && expFromNow.TotalMinutes <= 31); + } + + [Fact] + public async Task ValidateTokenAsync_ValidToken_ReturnsOk() + { + var (conn, dbPath) = CreateTestDb(); + try + { + var token = TokenService.CreateToken( + "user-valid", + "Valid User", + "valid@example.com", + ["user"], + TestSigningKey, + TimeSpan.FromHours(1) + ); + + var result = await TokenService.ValidateTokenAsync( + conn, + token, + TestSigningKey, + checkRevocation: false + ); + + Assert.IsType<TokenService.TokenValidationOk>(result); + var ok = (TokenService.TokenValidationOk)result; + Assert.Equal("user-valid", ok.Claims.UserId); + Assert.Equal("Valid User", ok.Claims.DisplayName); + Assert.Equal("valid@example.com", ok.Claims.Email); + 
Assert.Contains("user", ok.Claims.Roles); + } + finally + { + CleanupTestDb(conn, dbPath); + } + } + + [Fact] + public async Task ValidateTokenAsync_InvalidFormat_ReturnsError() + { + var (conn, dbPath) = CreateTestDb(); + try + { + var result = await TokenService.ValidateTokenAsync( + conn, + "not-a-jwt", + TestSigningKey, + checkRevocation: false + ); + + Assert.IsType<TokenService.TokenValidationError>(result); + var error = (TokenService.TokenValidationError)result; + Assert.Equal("Invalid token format", error.Reason); + } + finally + { + CleanupTestDb(conn, dbPath); + } + } + + [Fact] + public async Task ValidateTokenAsync_TwoPartToken_ReturnsError() + { + var (conn, dbPath) = CreateTestDb(); + try + { + var result = await TokenService.ValidateTokenAsync( + conn, + "header.payload", + TestSigningKey, + checkRevocation: false + ); + + Assert.IsType<TokenService.TokenValidationError>(result); + var error = (TokenService.TokenValidationError)result; + Assert.Equal("Invalid token format", error.Reason); + } + finally + { + CleanupTestDb(conn, dbPath); + } + } + + [Fact] + public async Task ValidateTokenAsync_InvalidSignature_ReturnsError() + { + var (conn, dbPath) = CreateTestDb(); + try + { + var token = TokenService.CreateToken( + "user-sig", + "Sig User", + "sig@example.com", + [], + TestSigningKey, + TimeSpan.FromHours(1) + ); + + // Use different key for validation + var differentKey = new byte[32]; + differentKey[0] = 0xFF; + + var result = await TokenService.ValidateTokenAsync( + conn, + token, + differentKey, + checkRevocation: false + ); + + Assert.IsType<TokenService.TokenValidationError>(result); + var error = (TokenService.TokenValidationError)result; + Assert.Equal("Invalid signature", error.Reason); + } + finally + { + CleanupTestDb(conn, dbPath); + } + } + + [Fact] + public async Task ValidateTokenAsync_ExpiredToken_ReturnsError() + { + var (conn, dbPath) = CreateTestDb(); + try + { + // Create token that expired 2 hours ago + var token = 
TokenService.CreateToken( + "user-expired", + "Expired User", + "expired@example.com", + [], + TestSigningKey, + TimeSpan.FromHours(-2) // Negative = already expired + ); + + var result = await TokenService.ValidateTokenAsync( + conn, + token, + TestSigningKey, + checkRevocation: false + ); + + Assert.IsType<TokenService.TokenValidationError>(result); + var error = (TokenService.TokenValidationError)result; + Assert.Equal("Token expired", error.Reason); + } + finally + { + CleanupTestDb(conn, dbPath); + } + } + + [Fact] + public async Task ValidateTokenAsync_RevokedToken_ReturnsError() + { + var (conn, dbPath) = CreateTestDb(); + try + { + var token = TokenService.CreateToken( + "user-revoked", + "Revoked User", + "revoked@example.com", + [], + TestSigningKey, + TimeSpan.FromHours(1) + ); + + // Extract JTI and revoke + var parts = token.Split('.'); + var payloadJson = Base64UrlDecode(parts[1]); + var payload = JsonDocument.Parse(payloadJson); + var jti = payload.RootElement.GetProperty("jti").GetString()!; + + var now = DateTime.UtcNow.ToString("o", CultureInfo.InvariantCulture); + var exp = DateTime.UtcNow.AddHours(1).ToString("o", CultureInfo.InvariantCulture); + + // Insert user and revoked session using raw SQL (consistent with other tests) + using var tx = conn.BeginTransaction(); + + using var userCmd = conn.CreateCommand(); + userCmd.Transaction = tx; + userCmd.CommandText = + @"INSERT INTO gk_user (id, display_name, email, created_at, last_login_at, is_active, metadata) + VALUES (@id, @name, @email, @now, NULL, true, NULL)"; + userCmd.Parameters.AddWithValue("@id", "user-revoked"); + userCmd.Parameters.AddWithValue("@name", "Revoked User"); + userCmd.Parameters.AddWithValue("@email", DBNull.Value); + userCmd.Parameters.AddWithValue("@now", now); + await userCmd.ExecuteNonQueryAsync().ConfigureAwait(false); + + using var sessionCmd = conn.CreateCommand(); + sessionCmd.Transaction = tx; + sessionCmd.CommandText = + @"INSERT INTO gk_session (id, user_id, 
credential_id, created_at, expires_at, last_activity_at, ip_address, user_agent, is_revoked) + VALUES (@id, @user_id, NULL, @created, @expires, @activity, NULL, NULL, true)"; + sessionCmd.Parameters.AddWithValue("@id", jti); + sessionCmd.Parameters.AddWithValue("@user_id", "user-revoked"); + sessionCmd.Parameters.AddWithValue("@created", now); + sessionCmd.Parameters.AddWithValue("@expires", exp); + sessionCmd.Parameters.AddWithValue("@activity", now); + await sessionCmd.ExecuteNonQueryAsync().ConfigureAwait(false); + + tx.Commit(); + + var result = await TokenService.ValidateTokenAsync( + conn, + token, + TestSigningKey, + checkRevocation: true + ); + + Assert.IsType<TokenService.TokenValidationError>(result); + var error = (TokenService.TokenValidationError)result; + Assert.Equal("Token revoked", error.Reason); + } + finally + { + CleanupTestDb(conn, dbPath); + } + } + + [Fact] + public async Task ValidateTokenAsync_RevokedToken_IgnoredWhenCheckRevocationFalse() + { + var (conn, dbPath) = CreateTestDb(); + try + { + var token = TokenService.CreateToken( + "user-revoked2", + "Revoked User 2", + "revoked2@example.com", + [], + TestSigningKey, + TimeSpan.FromHours(1) + ); + + // Extract JTI and revoke + var parts = token.Split('.'); + var payloadJson = Base64UrlDecode(parts[1]); + var payload = JsonDocument.Parse(payloadJson); + var jti = payload.RootElement.GetProperty("jti").GetString()!; + + var now = DateTime.UtcNow.ToString("o", CultureInfo.InvariantCulture); + var exp = DateTime.UtcNow.AddHours(1).ToString("o", CultureInfo.InvariantCulture); + + // Insert user and revoked session using raw SQL (consistent with other tests) + using var tx = conn.BeginTransaction(); + + using var userCmd = conn.CreateCommand(); + userCmd.Transaction = tx; + userCmd.CommandText = + @"INSERT INTO gk_user (id, display_name, email, created_at, last_login_at, is_active, metadata) + VALUES (@id, @name, @email, @now, NULL, true, NULL)"; + userCmd.Parameters.AddWithValue("@id", 
"user-revoked2"); + userCmd.Parameters.AddWithValue("@name", "Revoked User 2"); + userCmd.Parameters.AddWithValue("@email", DBNull.Value); + userCmd.Parameters.AddWithValue("@now", now); + await userCmd.ExecuteNonQueryAsync().ConfigureAwait(false); + + using var sessionCmd = conn.CreateCommand(); + sessionCmd.Transaction = tx; + sessionCmd.CommandText = + @"INSERT INTO gk_session (id, user_id, credential_id, created_at, expires_at, last_activity_at, ip_address, user_agent, is_revoked) + VALUES (@id, @user_id, NULL, @created, @expires, @activity, NULL, NULL, true)"; + sessionCmd.Parameters.AddWithValue("@id", jti); + sessionCmd.Parameters.AddWithValue("@user_id", "user-revoked2"); + sessionCmd.Parameters.AddWithValue("@created", now); + sessionCmd.Parameters.AddWithValue("@expires", exp); + sessionCmd.Parameters.AddWithValue("@activity", now); + await sessionCmd.ExecuteNonQueryAsync().ConfigureAwait(false); + + tx.Commit(); + + // With checkRevocation: false, should still validate + var result = await TokenService.ValidateTokenAsync( + conn, + token, + TestSigningKey, + checkRevocation: false + ); + + Assert.IsType<TokenService.TokenValidationOk>(result); + } + finally + { + CleanupTestDb(conn, dbPath); + } + } + + [Fact] + public async Task RevokeTokenAsync_SetsIsRevokedFlag() + { + var (conn, dbPath) = CreateTestDb(); + try + { + var jti = Guid.NewGuid().ToString(); + var userId = "user-test"; + var now = DateTime.UtcNow.ToString("o", CultureInfo.InvariantCulture); + var exp = DateTime.UtcNow.AddHours(1).ToString("o", CultureInfo.InvariantCulture); + + // Insert user and session using raw SQL (TEXT PK doesn't return rowid) + using var tx = conn.BeginTransaction(); + + using var userCmd = conn.CreateCommand(); + userCmd.Transaction = tx; + userCmd.CommandText = + @"INSERT INTO gk_user (id, display_name, email, created_at, last_login_at, is_active, metadata) + VALUES (@id, @name, @email, @now, NULL, true, NULL)"; + userCmd.Parameters.AddWithValue("@id", userId); + 
userCmd.Parameters.AddWithValue("@name", "Test User"); + userCmd.Parameters.AddWithValue("@email", DBNull.Value); + userCmd.Parameters.AddWithValue("@now", now); + await userCmd.ExecuteNonQueryAsync().ConfigureAwait(false); + + using var sessionCmd = conn.CreateCommand(); + sessionCmd.Transaction = tx; + sessionCmd.CommandText = + @"INSERT INTO gk_session (id, user_id, credential_id, created_at, expires_at, last_activity_at, ip_address, user_agent, is_revoked) + VALUES (@id, @user_id, NULL, @created, @expires, @activity, NULL, NULL, false)"; + sessionCmd.Parameters.AddWithValue("@id", jti); + sessionCmd.Parameters.AddWithValue("@user_id", userId); + sessionCmd.Parameters.AddWithValue("@created", now); + sessionCmd.Parameters.AddWithValue("@expires", exp); + sessionCmd.Parameters.AddWithValue("@activity", now); + await sessionCmd.ExecuteNonQueryAsync().ConfigureAwait(false); + + tx.Commit(); + + // Revoke + await TokenService.RevokeTokenAsync(conn, jti); + + // Verify using DataProvider generated method + var revokedResult = await conn.GetSessionRevokedAsync(jti); + var isRevoked = revokedResult switch + { + GetSessionRevokedOk ok => ok.Value.FirstOrDefault()?.is_revoked ?? 
false, + GetSessionRevokedError err => throw new InvalidOperationException( + $"GetSessionRevoked failed: {err.Value.Message}, {err.Value.InnerException?.Message}" + ), + }; + + Assert.True(isRevoked); + } + finally + { + CleanupTestDb(conn, dbPath); + } + } + + [Fact] + public void ExtractBearerToken_ValidHeader_ReturnsToken() + { + var token = TokenService.ExtractBearerToken("Bearer abc123xyz"); + + Assert.Equal("abc123xyz", token); + } + + [Fact] + public void ExtractBearerToken_NullHeader_ReturnsNull() + { + var token = TokenService.ExtractBearerToken(null); + + Assert.Null(token); + } + + [Fact] + public void ExtractBearerToken_EmptyHeader_ReturnsNull() + { + var token = TokenService.ExtractBearerToken(""); + + Assert.Null(token); + } + + [Fact] + public void ExtractBearerToken_NonBearerScheme_ReturnsNull() + { + var token = TokenService.ExtractBearerToken("Basic abc123xyz"); + + Assert.Null(token); + } + + [Fact] + public void ExtractBearerToken_BearerWithoutSpace_ReturnsNull() + { + var token = TokenService.ExtractBearerToken("Bearerabc123xyz"); + + Assert.Null(token); + } + + private static (NpgsqlConnection Connection, string DbName) CreateTestDb() + { + // Connect to PostgreSQL server - use environment variable or default to localhost + var baseConnectionString = + Environment.GetEnvironmentVariable("TEST_POSTGRES_CONNECTION") + ?? 
"Host=localhost;Database=postgres;Username=postgres;Password=changeme"; + + var dbName = $"test_tokenservice_{Guid.NewGuid():N}"; + + // Create test database + using (var adminConn = new NpgsqlConnection(baseConnectionString)) + { + adminConn.Open(); + using var createCmd = adminConn.CreateCommand(); + createCmd.CommandText = $"CREATE DATABASE {dbName}"; + createCmd.ExecuteNonQuery(); + } + + // Connect to the new test database + var testConnectionString = baseConnectionString.Replace( + "Database=postgres", + $"Database={dbName}" + ); + var conn = new NpgsqlConnection(testConnectionString); + conn.Open(); + + // Use the YAML schema to create only the needed tables + // gk_credential is needed because gk_session has a FK to it + var yamlPath = Path.Combine(AppContext.BaseDirectory, "gatekeeper-schema.yaml"); + var schema = SchemaYamlSerializer.FromYamlFile(yamlPath); + var neededTables = new[] { "gk_user", "gk_credential", "gk_session" }; + + foreach (var table in schema.Tables.Where(t => neededTables.Contains(t.Name))) + { + var ddl = PostgresDdlGenerator.Generate(new CreateTableOperation(table)); + foreach ( + var statement in ddl.Split( + ';', + StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries + ) + ) + { + if (string.IsNullOrWhiteSpace(statement)) + { + continue; + } + using var cmd = conn.CreateCommand(); + cmd.CommandText = statement; + cmd.ExecuteNonQuery(); + } + } + + return (conn, dbName); + } + + private static void CleanupTestDb(NpgsqlConnection connection, string dbName) + { + var baseConnectionString = + Environment.GetEnvironmentVariable("TEST_POSTGRES_CONNECTION") + ?? 
"Host=localhost;Database=postgres;Username=postgres;Password=changeme"; + + connection.Close(); + connection.Dispose(); + + // Drop the test database + using var adminConn = new NpgsqlConnection(baseConnectionString); + adminConn.Open(); + + // Terminate any existing connections to the database + using var terminateCmd = adminConn.CreateCommand(); + terminateCmd.CommandText = + $"SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = '{dbName}'"; + terminateCmd.ExecuteNonQuery(); + + using var dropCmd = adminConn.CreateCommand(); + dropCmd.CommandText = $"DROP DATABASE IF EXISTS {dbName}"; + dropCmd.ExecuteNonQuery(); + } + + private static string Base64UrlDecode(string input) + { + var padded = input.Replace("-", "+").Replace("_", "/"); + var padding = (4 - (padded.Length % 4)) % 4; + padded += new string('=', padding); + return System.Text.Encoding.UTF8.GetString(Convert.FromBase64String(padded)); + } +} diff --git a/Gatekeeper/Gatekeeper.Api/AuthorizationService.cs b/Gatekeeper/Gatekeeper.Api/AuthorizationService.cs new file mode 100644 index 0000000..34ba5f1 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/AuthorizationService.cs @@ -0,0 +1,117 @@ +using System.Text; + +namespace Gatekeeper.Api; + +/// <summary> +/// Service for evaluating authorization decisions. +/// </summary> +public static class AuthorizationService +{ + /// <summary> + /// Checks if a user has a specific permission, optionally scoped to a resource. + /// </summary> + public static async Task<(bool Allowed, string Reason)> CheckPermissionAsync( + NpgsqlConnection conn, + string userId, + string permissionCode, + string? resourceType, + string? 
resourceId, + string now + ) + { + // Step 1: Check resource-level grants first (most specific) + if (!string.IsNullOrEmpty(resourceType) && !string.IsNullOrEmpty(resourceId)) + { + var grantResult = await conn.CheckResourceGrantAsync( + userId, + resourceType, + resourceId, + permissionCode, + now + ) + .ConfigureAwait(false); + + if (grantResult is CheckResourceGrantOk grantOk && grantOk.Value.Count > 0) + { + return (true, $"resource-grant:{resourceType}/{resourceId}"); + } + } + + // Step 2: Check user permissions (direct grants and role-based) + var permResult = await conn.GetUserPermissionsAsync(userId, now).ConfigureAwait(false); + var permissions = permResult is GetUserPermissionsOk ok ? ok.Value : []; + + foreach (var perm in permissions) + { + if (perm.code is null) + { + continue; + } + var matches = PermissionMatches(perm.code, permissionCode); + if (!matches) + { + continue; + } + + // Check scope - handle both string and byte[] types from generated code + var scopeType = ToStringValue(perm.scope_type); + var scopeValue = ToStringValue(perm.scope_value); + + var scopeMatches = scopeType switch + { + null or "" or "all" => true, + "record" => scopeValue == resourceId, + _ => false, + }; + + if (scopeMatches) + { + // source_type is role_id for role-based permissions, permission_id for direct grants + // source_name is role name for role-based, permission code for direct + var source = + perm.source_name != perm.code ? $"role:{perm.source_name}" : "direct-grant"; + return (true, $"{source} grants {perm.code}"); + } + } + + return (false, "no matching permission"); + } + + /// <summary> + /// Converts a value to string, handling byte[] from SQLite. + /// </summary> + private static string? ToStringValue(object? value) => + value switch + { + null => null, + string s => s, + byte[] bytes => Encoding.UTF8.GetString(bytes), + _ => value.ToString(), + }; + + /// <summary> + /// Checks if a permission code matches a target, supporting wildcards. 
+ /// </summary> + private static bool PermissionMatches(string grantedCode, string targetCode) + { + if (grantedCode == targetCode) + { + return true; + } + + // Handle wildcards like "admin:*" matching "admin:users" + if (grantedCode.EndsWith(":*", StringComparison.Ordinal)) + { + var prefix = grantedCode[..^1]; // Remove "*" + return targetCode.StartsWith(prefix, StringComparison.Ordinal); + } + + // Handle global wildcard + if (grantedCode == "*:*" || grantedCode == "*") + { + return true; + } + + return false; + } +} diff --git a/Gatekeeper/Gatekeeper.Api/DataProvider.json b/Gatekeeper/Gatekeeper.Api/DataProvider.json new file mode 100644 index 0000000..44deec1 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/DataProvider.json @@ -0,0 +1,32 @@ +{ + "queries": [ + { "name": "GetUserByEmail", "sqlFile": "Sql/GetUserByEmail.sql" }, + { "name": "GetUserById", "sqlFile": "Sql/GetUserById.sql" }, + { "name": "GetUserCredentials", "sqlFile": "Sql/GetUserCredentials.sql" }, + { "name": "GetCredentialById", "sqlFile": "Sql/GetCredentialById.sql" }, + { "name": "GetSessionById", "sqlFile": "Sql/GetSessionById.sql" }, + { "name": "GetChallengeById", "sqlFile": "Sql/GetChallengeById.sql" }, + { "name": "GetUserRoles", "sqlFile": "Sql/GetUserRoles.sql" }, + { "name": "GetUserPermissions", "sqlFile": "Sql/GetUserPermissions.sql" }, + { "name": "CheckResourceGrant", "sqlFile": "Sql/CheckResourceGrant.sql" }, + { "name": "GetActivePolicies", "sqlFile": "Sql/GetActivePolicies.sql" }, + { "name": "GetAllRoles", "sqlFile": "Sql/GetAllRoles.sql" }, + { "name": "GetAllPermissions", "sqlFile": "Sql/GetAllPermissions.sql" }, + { "name": "GetCredentialsByUserId", "sqlFile": "Sql/GetCredentialsByUserId.sql" }, + { "name": "GetRolePermissions", "sqlFile": "Sql/GetRolePermissions.sql" }, + { "name": "GetAllUsers", "sqlFile": "Sql/GetAllUsers.sql" }, + { "name": "CheckPermission", "sqlFile": "Sql/CheckPermission.sql" }, + { "name": "GetSessionRevoked", "sqlFile": "Sql/GetSessionRevoked.sql" 
}, + { "name": "GetSessionForRevoke", "sqlFile": "Sql/GetSessionForRevoke.sql" } + ], + "tables": [ + { "schema": "public", "name": "gk_user", "generateInsert": true, "primaryKeyColumns": ["id"] }, + { "schema": "public", "name": "gk_credential", "generateInsert": true, "primaryKeyColumns": ["id"] }, + { "schema": "public", "name": "gk_session", "generateInsert": true, "generateUpdate": true, "primaryKeyColumns": ["id"] }, + { "schema": "public", "name": "gk_challenge", "generateInsert": true, "primaryKeyColumns": ["id"] }, + { "schema": "public", "name": "gk_permission", "generateInsert": true, "primaryKeyColumns": ["id"] }, + { "schema": "public", "name": "gk_resource_grant", "generateInsert": true, "primaryKeyColumns": ["id"] }, + { "schema": "public", "name": "gk_role", "generateInsert": true, "primaryKeyColumns": ["id"] } + ], + "connectionString": "Host=localhost;Port=5432;Database=gatekeeper;Username=postgres;Password=changeme" +} diff --git a/Gatekeeper/Gatekeeper.Api/DatabaseSetup.cs b/Gatekeeper/Gatekeeper.Api/DatabaseSetup.cs new file mode 100644 index 0000000..d6cf91b --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/DatabaseSetup.cs @@ -0,0 +1,126 @@ +using Nimblesite.DataProvider.Migration.Core; +using Nimblesite.DataProvider.Migration.Postgres; +using InitError = Outcome.Result<bool, string>.Error<bool, string>; +using InitOk = Outcome.Result<bool, string>.Ok<bool, string>; +using InitResult = Outcome.Result<bool, string>; + +namespace Gatekeeper.Api; + +/// <summary> +/// Database initialization and seeding using Migration library. +/// </summary> +internal static class DatabaseSetup +{ + /// <summary> + /// Initializes the database schema and seeds default data. 
+ /// </summary> + public static InitResult Initialize(NpgsqlConnection conn, ILogger logger) + { + var schemaResult = CreateSchemaFromMigration(conn, logger); + if (schemaResult is InitError) + return schemaResult; + + return SeedDefaultData(conn, logger); + } + + private static InitResult CreateSchemaFromMigration(NpgsqlConnection conn, ILogger logger) + { + logger.LogInformation("Creating database schema from gatekeeper-schema.yaml"); + + try + { + // Load schema from YAML (source of truth) + var yamlPath = Path.Combine(AppContext.BaseDirectory, "gatekeeper-schema.yaml"); + var schema = SchemaYamlSerializer.FromYamlFile(yamlPath); + PostgresDdlGenerator.MigrateSchema(conn, schema); + logger.LogInformation("Created Gatekeeper database schema from YAML"); + return new InitOk(true); + } + catch (Exception ex) + { + logger.LogError(ex, "Failed to create Gatekeeper database schema"); + return new InitError($"Failed to create Gatekeeper database schema: {ex.Message}"); + } + } + + private static InitResult SeedDefaultData(NpgsqlConnection conn, ILogger logger) + { + try + { + var now = DateTime.UtcNow.ToString("o", CultureInfo.InvariantCulture); + + using var checkCmd = conn.CreateCommand(); + checkCmd.CommandText = "SELECT COUNT(*) FROM gk_role WHERE is_system = true"; + var count = Convert.ToInt64(checkCmd.ExecuteScalar(), CultureInfo.InvariantCulture); + + if (count > 0) + { + logger.LogInformation("Database already seeded, skipping"); + return new InitOk(true); + } + + logger.LogInformation("Seeding default roles and permissions"); + + ExecuteNonQuery( + conn, + """ + INSERT INTO gk_role (id, name, description, is_system, created_at) + VALUES ('role-admin', 'admin', 'Full system access', true, @now), + ('role-user', 'user', 'Basic authenticated user', true, @now) + """, + ("@now", now) + ); + + ExecuteNonQuery( + conn, + """ + INSERT INTO gk_permission (id, code, resource_type, action, description, created_at) + VALUES ('perm-admin-all', 'admin:*', 'admin', '*', 
'Full admin access', @now), + ('perm-user-profile', 'user:profile', 'user', 'read', 'View own profile', @now), + ('perm-user-credentials', 'user:credentials', 'user', 'manage', 'Manage own passkeys', @now), + ('perm-patient-read', 'patient:read', 'patient', 'read', 'Read patient records', @now), + ('perm-order-read', 'order:read', 'order', 'read', 'Read order records', @now), + ('perm-sync-read', 'sync:read', 'sync', 'read', 'Read sync data', @now), + ('perm-sync-write', 'sync:write', 'sync', 'write', 'Write sync data', @now) + """, + ("@now", now) + ); + + ExecuteNonQuery( + conn, + """ + INSERT INTO gk_role_permission (role_id, permission_id, granted_at) + VALUES ('role-admin', 'perm-admin-all', @now), + ('role-admin', 'perm-sync-read', @now), + ('role-admin', 'perm-sync-write', @now), + ('role-user', 'perm-user-profile', @now), + ('role-user', 'perm-user-credentials', @now) + """, + ("@now", now) + ); + + logger.LogInformation("Default data seeded successfully"); + return new InitOk(true); + } + catch (Exception ex) + { + logger.LogError(ex, "Failed to seed Gatekeeper default data"); + return new InitError($"Failed to seed Gatekeeper default data: {ex.Message}"); + } + } + + private static void ExecuteNonQuery( + NpgsqlConnection conn, + string sql, + params (string name, object value)[] parameters + ) + { + using var cmd = conn.CreateCommand(); + cmd.CommandText = sql; + foreach (var (name, value) in parameters) + { + cmd.Parameters.AddWithValue(name, value); + } + cmd.ExecuteNonQuery(); + } +} diff --git a/Gatekeeper/Gatekeeper.Api/FileLoggerProvider.cs b/Gatekeeper/Gatekeeper.Api/FileLoggerProvider.cs new file mode 100644 index 0000000..8514a99 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/FileLoggerProvider.cs @@ -0,0 +1,109 @@ +namespace Gatekeeper.Api; + +/// <summary> +/// Extension methods for adding file logging. +/// </summary> +public static class FileLoggingExtensions +{ + /// <summary> + /// Adds file logging to the logging builder. 
+ /// </summary> + public static ILoggingBuilder AddFileLogging(this ILoggingBuilder builder, string path) + { + // CA2000: DI container takes ownership and disposes when application shuts down +#pragma warning disable CA2000 + builder.Services.AddSingleton<ILoggerProvider>(new FileLoggerProvider(path)); +#pragma warning restore CA2000 + return builder; + } +} + +/// <summary> +/// Simple file logger provider for writing logs to disk. +/// </summary> +public sealed class FileLoggerProvider : ILoggerProvider +{ + private readonly string _path; + private readonly object _lock = new(); + + /// <summary> + /// Initializes a new instance of FileLoggerProvider. + /// </summary> + public FileLoggerProvider(string path) + { + _path = path; + } + + /// <summary> + /// Creates a logger for the specified category. + /// </summary> + public ILogger CreateLogger(string categoryName) => new FileLogger(_path, categoryName, _lock); + + /// <summary> + /// Disposes the provider. + /// </summary> + public void Dispose() + { + // Nothing to dispose - singleton managed by DI container + } +} + +/// <summary> +/// Simple file logger that appends log entries to a file. +/// </summary> +public sealed class FileLogger : ILogger +{ + private readonly string _path; + private readonly string _category; + private readonly object _lock; + + /// <summary> + /// Initializes a new instance of FileLogger. + /// </summary> + public FileLogger(string path, string category, object lockObj) + { + _path = path; + _category = category; + _lock = lockObj; + } + + /// <summary> + /// Begins a logical operation scope. + /// </summary> + public IDisposable? BeginScope<TState>(TState state) + where TState : notnull => null; + + /// <summary> + /// Checks if the given log level is enabled. + /// </summary> + public bool IsEnabled(LogLevel logLevel) => logLevel != LogLevel.None; + + /// <summary> + /// Writes a log entry to the file. 
+ /// </summary> + public void Log<TState>( + LogLevel logLevel, + EventId eventId, + TState state, + Exception? exception, + Func<TState, Exception?, string> formatter + ) + { + if (!IsEnabled(logLevel)) + { + return; + } + + var message = formatter(state, exception); + var line = $"{DateTime.UtcNow:yyyy-MM-dd HH:mm:ss.fff} [{logLevel}] {_category}: {message}"; + if (exception != null) + { + line += Environment.NewLine + exception; + } + + lock (_lock) + { + File.AppendAllText(_path, line + Environment.NewLine); + } + } +} diff --git a/Gatekeeper/Gatekeeper.Api/Gatekeeper.Api.csproj b/Gatekeeper/Gatekeeper.Api/Gatekeeper.Api.csproj new file mode 100644 index 0000000..6ef69d1 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Gatekeeper.Api.csproj @@ -0,0 +1,63 @@ +<Project Sdk="Microsoft.NET.Sdk.Web"> + <PropertyGroup> + <OutputType>Exe</OutputType> + <PackageId>MelbourneDev.Gatekeeper</PackageId> + <NoWarn>$(NoWarn);CA1515;CA2100;RS1035;CA1508;CA2234;CA1819;CA2007;EPC12;CS1591</NoWarn> + </PropertyGroup> + + <!-- Exclude Generated folder from default globbing - we include it explicitly in the target --> + <ItemGroup> + <Compile Remove="Generated/**" /> + </ItemGroup> + + <ItemGroup> + <PackageReference Include="Fido2" Version="4.0.0" /> + <PackageReference Include="Fido2.AspNet" Version="4.0.0" /> + <PackageReference Include="Npgsql" Version="9.0.2" /> + <PackageReference Include="Nimblesite.DataProvider.Core" Version="$(DataProviderVersion)" /> + <PackageReference + Include="Nimblesite.DataProvider.Migration.Core" + Version="$(DataProviderVersion)" + /> + <PackageReference + Include="Nimblesite.DataProvider.Migration.Postgres" + Version="$(DataProviderVersion)" + /> + <PackageReference Include="Nimblesite.Sync.Postgres" Version="$(DataProviderVersion)" /> + </ItemGroup> + + <ItemGroup> + <ProjectReference Include="..\..\Shared\Authorization\Authorization.csproj" /> + </ItemGroup> + + <ItemGroup> + <AdditionalFiles Include="Sql/*.sql" /> + <AdditionalFiles 
Include="DataProvider.json" /> + <!-- YAML schema is source of truth, stored in git --> + <Content Include="gatekeeper-schema.yaml"> + <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory> + </Content> + </ItemGroup> + + <!-- Pre-compile: generate C# from SQL via `dotnet DataProvider postgres`. + Requires a live Postgres with the gatekeeper schema migrated (see `make db-migrate`). --> + <Target + Name="GenerateDataProvider" + BeforeTargets="BeforeCompile;CoreCompile" + Inputs="$(MSBuildProjectDirectory)/DataProvider.json;@(AdditionalFiles)" + Outputs="$(MSBuildProjectDirectory)/Generated/.timestamp" + > + <RemoveDir Directories="$(MSBuildProjectDirectory)/Generated" /> + <MakeDir Directories="$(MSBuildProjectDirectory)/Generated" /> + <Exec + Command="dotnet DataProvider postgres --project-dir "$(MSBuildProjectDirectory)" --config "$(MSBuildProjectDirectory)/DataProvider.json" --out "$(MSBuildProjectDirectory)/Generated"" + WorkingDirectory="$(MSBuildProjectDirectory)" + StandardOutputImportance="High" + StandardErrorImportance="High" + /> + <Touch Files="$(MSBuildProjectDirectory)/Generated/.timestamp" AlwaysCreate="true" /> + <ItemGroup> + <Compile Include="$(MSBuildProjectDirectory)/Generated/**/*.g.cs" /> + </ItemGroup> + </Target> +</Project> diff --git a/Gatekeeper/Gatekeeper.Api/GlobalUsings.cs b/Gatekeeper/Gatekeeper.Api/GlobalUsings.cs new file mode 100644 index 0000000..a557fca --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/GlobalUsings.cs @@ -0,0 +1,92 @@ +#pragma warning disable IDE0005 // Using directive is unnecessary (some are unused but needed for tests) + +global using System; +global using System.Globalization; +global using System.Text.Json; +global using Fido2NetLib; +global using Fido2NetLib.Objects; +global using Generated; +global using Microsoft.Extensions.Logging; +global using Nimblesite.Sql.Model; +global using Npgsql; +global using Outcome; +global using CheckResourceGrantOk = Outcome.Result< + 
System.Collections.Immutable.ImmutableList<Generated.CheckResourceGrant>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.CheckResourceGrant>, + Nimblesite.Sql.Model.SqlError +>; +// Insert result type alias +global using GetChallengeByIdOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetChallengeById>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetChallengeById>, + Nimblesite.Sql.Model.SqlError +>; +// Additional query result type aliases +global using GetCredentialByIdOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetCredentialById>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetCredentialById>, + Nimblesite.Sql.Model.SqlError +>; +global using GetSessionRevokedError = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetSessionRevoked>, + Nimblesite.Sql.Model.SqlError +>.Error< + System.Collections.Immutable.ImmutableList<Generated.GetSessionRevoked>, + Nimblesite.Sql.Model.SqlError +>; +global using GetSessionRevokedOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetSessionRevoked>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetSessionRevoked>, + Nimblesite.Sql.Model.SqlError +>; +// Query result type aliases +global using GetUserByEmailOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetUserByEmail>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetUserByEmail>, + Nimblesite.Sql.Model.SqlError +>; +global using GetUserByIdOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetUserById>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetUserById>, + Nimblesite.Sql.Model.SqlError +>; +global using 
GetUserCredentialsError = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetUserCredentials>, + Nimblesite.Sql.Model.SqlError +>.Error< + System.Collections.Immutable.ImmutableList<Generated.GetUserCredentials>, + Nimblesite.Sql.Model.SqlError +>; +global using GetUserCredentialsOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetUserCredentials>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetUserCredentials>, + Nimblesite.Sql.Model.SqlError +>; +global using GetUserPermissionsOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetUserPermissions>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetUserPermissions>, + Nimblesite.Sql.Model.SqlError +>; +global using GetUserRolesOk = Outcome.Result< + System.Collections.Immutable.ImmutableList<Generated.GetUserRoles>, + Nimblesite.Sql.Model.SqlError +>.Ok< + System.Collections.Immutable.ImmutableList<Generated.GetUserRoles>, + Nimblesite.Sql.Model.SqlError +>; diff --git a/Gatekeeper/Gatekeeper.Api/JunctionTableInserts.cs b/Gatekeeper/Gatekeeper.Api/JunctionTableInserts.cs new file mode 100644 index 0000000..73637ea --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/JunctionTableInserts.cs @@ -0,0 +1,174 @@ +// Hand-written replacements for junction-table insert extensions. +// +// The DataProvider Postgres code generator unconditionally appends +// "RETURNING id" to every generated INSERT, but `gk_user_role` and +// `gk_role_permission` use composite primary keys and do not have an +// `id` column. The generated code therefore throws at runtime +// ("column 'id' does not exist"), the error is silently swallowed +// into a Result.Error by the generated try/catch, and dependent +// queries (e.g. /authz/permissions) return empty data. 
+// +// We disable generateInsert for these tables in DataProvider.json and +// provide drop-in replacements here so existing call sites continue +// to compile against the same `Insertgk_user_roleAsync` / +// `Insertgk_role_permissionAsync` extension method names. + +#nullable enable + +using System.Data; + +namespace Generated; + +/// <summary> +/// Hand-written extension methods for inserting into <c>gk_user_role</c>. +/// Replaces the broken DataProvider-generated version. +/// </summary> +public static class gk_user_roleExtensions +{ + private const string Sql = + @"INSERT INTO public.gk_user_role (user_id, role_id, granted_at, granted_by, expires_at) + VALUES (@user_id, @role_id, @granted_at, @granted_by, @expires_at) + ON CONFLICT DO NOTHING"; + + /// <summary>Inserts a row into <c>gk_user_role</c>.</summary> + public static async Task<Result<Guid?, SqlError>> Insertgk_user_roleAsync( + this NpgsqlConnection conn, + string user_id, + string role_id, + string? granted_at, + string? granted_by, + string? expires_at + ) + { + try + { + await using var cmd = new NpgsqlCommand(Sql, conn); + BindParameters(cmd, user_id, role_id, granted_at, granted_by, expires_at); + _ = await cmd.ExecuteNonQueryAsync().ConfigureAwait(false); + return new Result<Guid?, SqlError>.Ok<Guid?, SqlError>(null); + } + catch (Exception ex) + { + return new Result<Guid?, SqlError>.Error<Guid?, SqlError>(SqlError.FromException(ex)); + } + } + + /// <summary>Transaction overload of <see cref="Insertgk_user_roleAsync(NpgsqlConnection, string, string, string?, string?, string?)"/>.</summary> + public static async Task<Result<Guid?, SqlError>> Insertgk_user_roleAsync( + this IDbTransaction transaction, + string user_id, + string role_id, + string? granted_at, + string? granted_by, + string? 
expires_at + ) + { + if (transaction.Connection is not NpgsqlConnection conn) + { + return new Result<Guid?, SqlError>.Error<Guid?, SqlError>( + new SqlError("Transaction.Connection must be NpgsqlConnection") + ); + } + + try + { + await using var cmd = new NpgsqlCommand(Sql, conn, (NpgsqlTransaction)transaction); + BindParameters(cmd, user_id, role_id, granted_at, granted_by, expires_at); + _ = await cmd.ExecuteNonQueryAsync().ConfigureAwait(false); + return new Result<Guid?, SqlError>.Ok<Guid?, SqlError>(null); + } + catch (Exception ex) + { + return new Result<Guid?, SqlError>.Error<Guid?, SqlError>(SqlError.FromException(ex)); + } + } + + private static void BindParameters( + NpgsqlCommand cmd, + string user_id, + string role_id, + string? granted_at, + string? granted_by, + string? expires_at + ) + { + cmd.Parameters.AddWithValue("user_id", user_id); + cmd.Parameters.AddWithValue("role_id", role_id); + cmd.Parameters.AddWithValue("granted_at", (object?)granted_at ?? DBNull.Value); + cmd.Parameters.AddWithValue("granted_by", (object?)granted_by ?? DBNull.Value); + cmd.Parameters.AddWithValue("expires_at", (object?)expires_at ?? DBNull.Value); + } +} + +/// <summary> +/// Hand-written extension methods for inserting into <c>gk_role_permission</c>. +/// Replaces the broken DataProvider-generated version. +/// </summary> +public static class gk_role_permissionExtensions +{ + private const string Sql = + @"INSERT INTO public.gk_role_permission (role_id, permission_id, granted_at) + VALUES (@role_id, @permission_id, @granted_at) + ON CONFLICT DO NOTHING"; + + /// <summary>Inserts a row into <c>gk_role_permission</c>.</summary> + public static async Task<Result<Guid?, SqlError>> Insertgk_role_permissionAsync( + this NpgsqlConnection conn, + string role_id, + string permission_id, + string? 
granted_at + ) + { + try + { + await using var cmd = new NpgsqlCommand(Sql, conn); + BindParameters(cmd, role_id, permission_id, granted_at); + _ = await cmd.ExecuteNonQueryAsync().ConfigureAwait(false); + return new Result<Guid?, SqlError>.Ok<Guid?, SqlError>(null); + } + catch (Exception ex) + { + return new Result<Guid?, SqlError>.Error<Guid?, SqlError>(SqlError.FromException(ex)); + } + } + + /// <summary>Transaction overload of <see cref="Insertgk_role_permissionAsync(NpgsqlConnection, string, string, string?)"/>.</summary> + public static async Task<Result<Guid?, SqlError>> Insertgk_role_permissionAsync( + this IDbTransaction transaction, + string role_id, + string permission_id, + string? granted_at + ) + { + if (transaction.Connection is not NpgsqlConnection conn) + { + return new Result<Guid?, SqlError>.Error<Guid?, SqlError>( + new SqlError("Transaction.Connection must be NpgsqlConnection") + ); + } + + try + { + await using var cmd = new NpgsqlCommand(Sql, conn, (NpgsqlTransaction)transaction); + BindParameters(cmd, role_id, permission_id, granted_at); + _ = await cmd.ExecuteNonQueryAsync().ConfigureAwait(false); + return new Result<Guid?, SqlError>.Ok<Guid?, SqlError>(null); + } + catch (Exception ex) + { + return new Result<Guid?, SqlError>.Error<Guid?, SqlError>(SqlError.FromException(ex)); + } + } + + private static void BindParameters( + NpgsqlCommand cmd, + string role_id, + string permission_id, + string? granted_at + ) + { + cmd.Parameters.AddWithValue("role_id", role_id); + cmd.Parameters.AddWithValue("permission_id", permission_id); + cmd.Parameters.AddWithValue("granted_at", (object?)granted_at ?? 
DBNull.Value); + } +} diff --git a/Gatekeeper/Gatekeeper.Api/Program.cs b/Gatekeeper/Gatekeeper.Api/Program.cs new file mode 100644 index 0000000..50ef2e4 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Program.cs @@ -0,0 +1,729 @@ +#pragma warning disable IDE0037 // Use inferred member name + +using System.Text; +using Gatekeeper.Api; +using Microsoft.AspNetCore.Http.Json; +using InitError = Outcome.Result<bool, string>.Error<bool, string>; + +var builder = WebApplication.CreateBuilder(args); + +// File logging - use LOG_PATH env var or default to /tmp in containers +var logPath = + Environment.GetEnvironmentVariable("LOG_PATH") + ?? ( + Environment.GetEnvironmentVariable("DOTNET_RUNNING_IN_CONTAINER") == "true" + ? "/tmp/gatekeeper.log" + : Path.Combine(AppContext.BaseDirectory, "gatekeeper.log") + ); +builder.Logging.AddFileLogging(logPath); + +builder.Services.Configure<JsonOptions>(options => + options.SerializerOptions.PropertyNamingPolicy = null +); + +builder.Services.AddCors(options => + options.AddPolicy( + "Dashboard", + policy => policy.AllowAnyOrigin().AllowAnyHeader().AllowAnyMethod() + ) +); + +var serverDomain = builder.Configuration["Fido2:ServerDomain"] ?? "localhost"; +var serverName = builder.Configuration["Fido2:ServerName"] ?? "Gatekeeper"; +var origin = builder.Configuration["Fido2:Origin"] ?? "http://localhost:5173"; + +builder.Services.AddFido2(options => +{ + options.ServerDomain = serverDomain; + options.ServerName = serverName; + options.Origins = new HashSet<string> { origin }; + options.TimestampDriftTolerance = 300000; +}); + +var connectionString = + builder.Configuration.GetConnectionString("Postgres") + ?? throw new InvalidOperationException("PostgreSQL connection string 'Postgres' is required"); + +builder.Services.AddSingleton(new DbConfig(connectionString)); + +var signingKeyBase64 = builder.Configuration["Jwt:SigningKey"]; +var signingKey = string.IsNullOrEmpty(signingKeyBase64) + ? 
new byte[32] // Default dev key (32 zeros) - MUST match Clinical/Scheduling APIs + : Convert.FromBase64String(signingKeyBase64); +builder.Services.AddSingleton(new JwtConfig(signingKey, TimeSpan.FromHours(24))); + +var app = builder.Build(); + +using (var conn = new NpgsqlConnection(connectionString)) +{ + conn.Open(); + if (DatabaseSetup.Initialize(conn, app.Logger) is InitError initErr) + Environment.FailFast(initErr.Value); +} + +app.UseCors("Dashboard"); + +static string Now() => DateTime.UtcNow.ToString("o", CultureInfo.InvariantCulture); + +static NpgsqlConnection OpenConnection(DbConfig db) +{ + var conn = new NpgsqlConnection(db.ConnectionString); + conn.Open(); + return conn; +} + +var authGroup = app.MapGroup("/auth").WithTags("Authentication"); + +authGroup.MapPost( + "/register/begin", + async (RegisterBeginRequest request, IFido2 fido2, DbConfig db, ILogger<Program> logger) => + { + try + { + using var conn = OpenConnection(db); + var now = Now(); + + var existingUser = await conn.GetUserByEmailAsync(request.Email).ConfigureAwait(false); + var isNewUser = existingUser is not GetUserByEmailOk { Value.Count: > 0 }; + var userId = isNewUser + ? Guid.NewGuid().ToString() + : ((GetUserByEmailOk)existingUser).Value[0].id ?? Guid.NewGuid().ToString(); + + if (isNewUser) + { + await using var tx = await conn.BeginTransactionAsync().ConfigureAwait(false); + _ = await tx.Insertgk_userAsync( + userId, + request.DisplayName, + request.Email, + now, + null, + true, + null + ) + .ConfigureAwait(false); + await tx.CommitAsync().ConfigureAwait(false); + } + + var existingCredentials = await conn.GetUserCredentialsAsync(userId!) 
+ .ConfigureAwait(false); + var excludeCredentials = existingCredentials switch + { + GetUserCredentialsOk ok => ok + .Value.Where(c => c.id is not null) + .Select(c => new PublicKeyCredentialDescriptor(Base64Url.Decode(c.id!))) + .ToList(), + GetUserCredentialsError _ => [], + }; + + var user = new Fido2User + { + Id = Encoding.UTF8.GetBytes(userId!), + Name = request.Email, + DisplayName = request.DisplayName, + }; + // Don't restrict to platform authenticators only - allows security keys too + // Chrome on macOS can timeout with Platform-only restriction + var authSelector = new AuthenticatorSelection + { + ResidentKey = ResidentKeyRequirement.Required, + UserVerification = UserVerificationRequirement.Required, + }; + + var options = fido2.RequestNewCredential( + new RequestNewCredentialParams + { + User = user, + ExcludeCredentials = excludeCredentials, + AuthenticatorSelection = authSelector, + AttestationPreference = AttestationConveyancePreference.None, + } + ); + var challengeId = Guid.NewGuid().ToString(); + var challengeExpiry = DateTime + .UtcNow.AddMinutes(5) + .ToString("o", CultureInfo.InvariantCulture); + + await using var tx2 = await conn.BeginTransactionAsync().ConfigureAwait(false); + _ = await tx2.Insertgk_challengeAsync( + challengeId, + userId, + options.Challenge, + "registration", + now, + challengeExpiry + ) + .ConfigureAwait(false); + await tx2.CommitAsync().ConfigureAwait(false); + + return Results.Ok(new { ChallengeId = challengeId, OptionsJson = options.ToJson() }); + } + catch (Exception ex) + { + logger.LogError(ex, "Registration begin failed"); + return Results.Problem("Registration failed"); + } + } +); + +authGroup.MapPost( + "/login/begin", + async (IFido2 fido2, DbConfig db, ILogger<Program> logger) => + { + try + { + using var conn = OpenConnection(db); + var now = Now(); + + // Discoverable credentials: empty allowCredentials lets browser show all stored passkeys + // The credential contains userHandle which we use in 
/login/complete to identify the user + // See: https://webauthn.guide/ and fido2-net-lib docs + var options = fido2.GetAssertionOptions( + new GetAssertionOptionsParams + { + AllowedCredentials = [], // Empty = discoverable credentials + UserVerification = UserVerificationRequirement.Required, + } + ); + var challengeId = Guid.NewGuid().ToString(); + var challengeExpiry = DateTime + .UtcNow.AddMinutes(5) + .ToString("o", CultureInfo.InvariantCulture); + + await using var tx = await conn.BeginTransactionAsync().ConfigureAwait(false); + _ = await tx.Insertgk_challengeAsync( + challengeId, + null, // No user ID - discovered from credential in /login/complete + options.Challenge, + "authentication", + now, + challengeExpiry + ) + .ConfigureAwait(false); + await tx.CommitAsync().ConfigureAwait(false); + + return Results.Ok(new { ChallengeId = challengeId, OptionsJson = options.ToJson() }); + } + catch (Exception ex) + { + logger.LogError(ex, "Login begin failed"); + return Results.Problem("Login failed"); + } + } +); + +authGroup.MapPost( + "/register/complete", + async ( + RegisterCompleteRequest request, + IFido2 fido2, + DbConfig db, + JwtConfig jwtConfig, + ILogger<Program> logger + ) => + { + try + { + using var conn = OpenConnection(db); + var now = Now(); + + // Get the stored challenge + var challengeResult = await conn.GetChallengeByIdAsync(request.ChallengeId, now) + .ConfigureAwait(false); + if (challengeResult is not GetChallengeByIdOk { Value.Count: > 0 } challengeOk) + { + return Results.BadRequest(new { Error = "Challenge not found or expired" }); + } + + var storedChallenge = challengeOk.Value[0]; + if (string.IsNullOrEmpty(storedChallenge.user_id)) + { + return Results.BadRequest(new { Error = "Invalid challenge" }); + } + + // Parse the authenticator response + var options = CredentialCreateOptions.FromJson(request.OptionsJson); + + // Verify the attestation + var credentialResult = await fido2 + .MakeNewCredentialAsync( + new MakeNewCredentialParams + 
{ + AttestationResponse = request.AttestationResponse, + OriginalOptions = options, + IsCredentialIdUniqueToUserCallback = async (args, ct) => + { + var existing = await conn.GetCredentialByIdAsync( + Base64Url.Encode(args.CredentialId) + ) + .ConfigureAwait(false); + return existing is not GetCredentialByIdOk { Value.Count: > 0 }; + }, + } + ) + .ConfigureAwait(false); + + var cred = credentialResult; + + // Store the credential - use base64url encoding to match WebAuthn spec + await using var tx = await conn.BeginTransactionAsync().ConfigureAwait(false); + _ = await tx.Insertgk_credentialAsync( + Base64Url.Encode(cred.Id), + storedChallenge.user_id, + cred.PublicKey, + (int?)cred.SignCount, + cred.AaGuid.ToString(), + cred.Type.ToString(), + cred.Transports != null ? string.Join(",", cred.Transports) : null, + cred.AttestationFormat, + now, + null, + request.DeviceName, + cred.IsBackupEligible, + cred.IsBackedUp + ) + .ConfigureAwait(false); + + // Assign default user role + _ = await tx.Insertgk_user_roleAsync( + storedChallenge.user_id, + "role-user", + now, + null, + null + ) + .ConfigureAwait(false); + + await tx.CommitAsync().ConfigureAwait(false); + + // Get user info for token + var userResult = await conn.GetUserByIdAsync(storedChallenge.user_id) + .ConfigureAwait(false); + var user = userResult is GetUserByIdOk { Value.Count: > 0 } userOk + ? userOk.Value[0] + : null; + + // Get user roles + var rolesResult = await conn.GetUserRolesAsync(storedChallenge.user_id, now) + .ConfigureAwait(false); + var roles = rolesResult is GetUserRolesOk rolesOk + ? rolesOk + .Value.Select(r => r.name) + .Where(n => n is not null) + .Select(n => n!) + .ToList() + : new List<string>(); + + // Generate JWT + var token = TokenService.CreateToken( + storedChallenge.user_id ?? 
string.Empty, + user?.display_name, + user?.email, + roles, + jwtConfig.SigningKey, + jwtConfig.TokenLifetime + ); + + return Results.Ok( + new + { + Token = token, + UserId = storedChallenge.user_id, + DisplayName = user?.display_name, + Email = user?.email, + Roles = roles, + } + ); + } + catch (Exception ex) + { + logger.LogError(ex, "Registration complete failed"); + return Results.Problem("Registration failed"); + } + } +); + +authGroup.MapPost( + "/login/complete", + async ( + LoginCompleteRequest request, + IFido2 fido2, + DbConfig db, + JwtConfig jwtConfig, + ILogger<Program> logger + ) => + { + try + { + using var conn = OpenConnection(db); + var now = Now(); + + // Get the stored challenge + var challengeResult = await conn.GetChallengeByIdAsync(request.ChallengeId, now) + .ConfigureAwait(false); + if (challengeResult is not GetChallengeByIdOk { Value.Count: > 0 } challengeOk) + { + return Results.BadRequest(new { Error = "Challenge not found or expired" }); + } + + var storedChallenge = challengeOk.Value[0]; + + var credentialId = request.AssertionResponse.Id; + logger.LogInformation("Login attempt - credential ID: {CredentialId}", credentialId); + var credResult = await conn.GetCredentialByIdAsync(credentialId).ConfigureAwait(false); + if (credResult is not GetCredentialByIdOk { Value.Count: > 0 } credOk) + { + logger.LogWarning("Credential not found for ID: {CredentialId}", credentialId); + return Results.BadRequest(new { Error = "Credential not found" }); + } + + var storedCred = credOk.Value[0]; + + // Parse the assertion options + var options = AssertionOptions.FromJson(request.OptionsJson); + + // Verify the assertion + var assertionResult = await fido2 + .MakeAssertionAsync( + new MakeAssertionParams + { + AssertionResponse = request.AssertionResponse, + OriginalOptions = options, + StoredPublicKey = storedCred.public_key ?? Array.Empty<byte>(), + StoredSignatureCounter = (uint)(storedCred.sign_count ?? 
0), + IsUserHandleOwnerOfCredentialIdCallback = (args, _) => + { + var userIdFromHandle = Encoding.UTF8.GetString(args.UserHandle); + return Task.FromResult(storedCred.user_id == userIdFromHandle); + }, + } + ) + .ConfigureAwait(false); + + // Update sign count and last used + using var updateCmd = conn.CreateCommand(); + updateCmd.CommandText = + @" + UPDATE gk_credential + SET sign_count = @signCount, last_used_at = @now + WHERE id = @id"; + updateCmd.Parameters.AddWithValue("@signCount", (long)assertionResult.SignCount); + updateCmd.Parameters.AddWithValue("@now", now); + updateCmd.Parameters.AddWithValue("@id", credentialId); + await updateCmd.ExecuteNonQueryAsync().ConfigureAwait(false); + + // Update user last login + using var userUpdateCmd = conn.CreateCommand(); + userUpdateCmd.CommandText = "UPDATE gk_user SET last_login_at = @now WHERE id = @id"; + userUpdateCmd.Parameters.AddWithValue("@now", now); + userUpdateCmd.Parameters.AddWithValue( + "@id", + (object?)storedCred.user_id ?? DBNull.Value + ); + await userUpdateCmd.ExecuteNonQueryAsync().ConfigureAwait(false); + + // Get user info for token + var userResult = await conn.GetUserByIdAsync(storedCred.user_id ?? string.Empty) + .ConfigureAwait(false); + var user = userResult is GetUserByIdOk { Value.Count: > 0 } userOk + ? userOk.Value[0] + : null; + + // Get user roles + var rolesResult = await conn.GetUserRolesAsync(storedCred.user_id ?? string.Empty, now) + .ConfigureAwait(false); + var roles = rolesResult is GetUserRolesOk rolesOk + ? rolesOk + .Value.Select(r => r.name) + .Where(n => n is not null) + .Select(n => n!) + .ToList() + : new List<string>(); + + // Generate JWT + var token = TokenService.CreateToken( + storedCred.user_id ?? 
string.Empty, + user?.display_name, + user?.email, + roles, + jwtConfig.SigningKey, + jwtConfig.TokenLifetime + ); + + return Results.Ok( + new + { + Token = token, + UserId = storedCred.user_id, + DisplayName = user?.display_name, + Email = user?.email, + Roles = roles, + } + ); + } + catch (Exception ex) + { + logger.LogError(ex, "Login complete failed"); + return Results.Problem("Login failed"); + } + } +); + +authGroup.MapGet( + "/session", + async (HttpContext ctx, DbConfig db, JwtConfig jwtConfig) => + { + var token = TokenService.ExtractBearerToken(ctx.Request.Headers.Authorization); + if (string.IsNullOrEmpty(token)) + { + return Results.Unauthorized(); + } + + using var conn = OpenConnection(db); + + var result = await TokenService + .ValidateTokenAsync(conn, token, jwtConfig.SigningKey, checkRevocation: true) + .ConfigureAwait(false); + if (result is not TokenService.TokenValidationOk ok) + { + return Results.Unauthorized(); + } + + return Results.Ok( + new + { + ok.Claims.UserId, + ok.Claims.DisplayName, + ok.Claims.Email, + ok.Claims.Roles, + ExpiresAt = DateTimeOffset + .FromUnixTimeSeconds(ok.Claims.Exp) + .ToString("o", CultureInfo.InvariantCulture), + } + ); + } +); + +authGroup.MapPost( + "/logout", + async (HttpContext ctx, DbConfig db, JwtConfig jwtConfig) => + { + var token = TokenService.ExtractBearerToken(ctx.Request.Headers.Authorization); + if (string.IsNullOrEmpty(token)) + { + return Results.Unauthorized(); + } + + using var conn = OpenConnection(db); + + var result = await TokenService + .ValidateTokenAsync(conn, token, jwtConfig.SigningKey, checkRevocation: false) + .ConfigureAwait(false); + if (result is TokenService.TokenValidationOk ok) + { + await TokenService.RevokeTokenAsync(conn, ok.Claims.Jti).ConfigureAwait(false); + } + + return Results.NoContent(); + } +); + +var authzGroup = app.MapGroup("/authz").WithTags("Authorization"); + +authzGroup.MapGet( + "/check", + async ( + string permission, + string? resourceType, + string? 
resourceId, + HttpContext ctx, + DbConfig db, + JwtConfig jwtConfig + ) => + { + var token = TokenService.ExtractBearerToken(ctx.Request.Headers.Authorization); + if (string.IsNullOrEmpty(token)) + { + return Results.Unauthorized(); + } + + using var conn = OpenConnection(db); + + var validateResult = await TokenService + .ValidateTokenAsync(conn, token, jwtConfig.SigningKey, checkRevocation: true) + .ConfigureAwait(false); + if (validateResult is not TokenService.TokenValidationOk ok) + { + return Results.Unauthorized(); + } + + var (allowed, reason) = await AuthorizationService + .CheckPermissionAsync( + conn, + ok.Claims.UserId, + permission, + resourceType, + resourceId, + Now() + ) + .ConfigureAwait(false); + return Results.Ok(new { Allowed = allowed, Reason = reason }); + } +); + +authzGroup.MapGet( + "/permissions", + async (HttpContext ctx, DbConfig db, JwtConfig jwtConfig) => + { + var token = TokenService.ExtractBearerToken(ctx.Request.Headers.Authorization); + if (string.IsNullOrEmpty(token)) + { + return Results.Unauthorized(); + } + + using var conn = OpenConnection(db); + + var validateResult = await TokenService + .ValidateTokenAsync(conn, token, jwtConfig.SigningKey, checkRevocation: true) + .ConfigureAwait(false); + if (validateResult is not TokenService.TokenValidationOk ok) + { + return Results.Unauthorized(); + } + + var permissionsResult = await conn.GetUserPermissionsAsync(ok.Claims.UserId, Now()) + .ConfigureAwait(false); + var permissions = permissionsResult is GetUserPermissionsOk permOk + ? 
permOk + .Value.Select(p => new + { + p.code, + p.source_name, + p.source_type, + p.scope_type, + p.scope_value, + }) + .ToList() + : []; + + return Results.Ok(new { Permissions = permissions }); + } +); + +authzGroup.MapPost( + "/evaluate", + async (EvaluateRequest request, HttpContext ctx, DbConfig db, JwtConfig jwtConfig) => + { + var token = TokenService.ExtractBearerToken(ctx.Request.Headers.Authorization); + if (string.IsNullOrEmpty(token)) + { + return Results.Unauthorized(); + } + + using var conn = OpenConnection(db); + + var validateResult = await TokenService + .ValidateTokenAsync(conn, token, jwtConfig.SigningKey, checkRevocation: true) + .ConfigureAwait(false); + if (validateResult is not TokenService.TokenValidationOk ok) + { + return Results.Unauthorized(); + } + + var now = Now(); + var results = new List<object>(); + foreach (var check in request.Checks) + { + var (allowed, _) = await AuthorizationService + .CheckPermissionAsync( + conn, + ok.Claims.UserId, + check.Permission, + check.ResourceType, + check.ResourceId, + now + ) + .ConfigureAwait(false); + results.Add( + new + { + check.Permission, + check.ResourceId, + Allowed = allowed, + } + ); + } + + return Results.Ok(new { Results = results }); + } +); + +app.Run(); + +namespace Gatekeeper.Api +{ + /// <summary> + /// Program entry point marker for WebApplicationFactory. + /// </summary> + public partial class Program { } + + /// <summary>Database connection configuration.</summary> + public sealed record DbConfig(string ConnectionString); + + /// <summary>JWT signing configuration.</summary> + public sealed record JwtConfig(byte[] SigningKey, TimeSpan TokenLifetime); + + /// <summary>Request to begin passkey registration.</summary> + public sealed record RegisterBeginRequest(string Email, string DisplayName); + + /// <summary>Request to begin passkey login.</summary> + public sealed record LoginBeginRequest(string? 
Email); + + /// <summary>Request to evaluate multiple permissions.</summary> + public sealed record EvaluateRequest(List<PermissionCheck> Checks); + + /// <summary>Single permission check.</summary> + public sealed record PermissionCheck( + string Permission, + string? ResourceType, + string? ResourceId + ); + + /// <summary>Request to complete passkey registration.</summary> + public sealed record RegisterCompleteRequest( + string ChallengeId, + string OptionsJson, + AuthenticatorAttestationRawResponse AttestationResponse, + string? DeviceName + ); + + /// <summary>Request to complete passkey login.</summary> + public sealed record LoginCompleteRequest( + string ChallengeId, + string OptionsJson, + AuthenticatorAssertionRawResponse AssertionResponse + ); + + /// <summary>Base64URL encoding utilities for WebAuthn credential IDs.</summary> + public static class Base64Url + { + /// <summary>Encodes bytes to base64url string.</summary> + public static string Encode(byte[] input) => + Convert + .ToBase64String(input) + .Replace("+", "-", StringComparison.Ordinal) + .Replace("/", "_", StringComparison.Ordinal) + .TrimEnd('='); + + /// <summary>Decodes base64url string to bytes.</summary> + public static byte[] Decode(string input) + { + var padded = input + .Replace("-", "+", StringComparison.Ordinal) + .Replace("_", "/", StringComparison.Ordinal); + var padding = (4 - (padded.Length % 4)) % 4; + padded += new string('=', padding); + return Convert.FromBase64String(padded); + } + } +} diff --git a/Gatekeeper/Gatekeeper.Api/Properties/launchSettings.json b/Gatekeeper/Gatekeeper.Api/Properties/launchSettings.json new file mode 100644 index 0000000..7b7463b --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Properties/launchSettings.json @@ -0,0 +1,14 @@ +{ + "profiles": { + "Gatekeeper.Api": { + "commandName": "Project", + "dotnetRunMessages": true, + "launchBrowser": false, + "applicationUrl": "http://localhost:5002", + "environmentVariables": { + "ASPNETCORE_ENVIRONMENT": 
"Development", + "ConnectionStrings__Postgres": "Host=localhost;Database=gatekeeper;Username=gatekeeper;Password=changeme" + } + } + } +} diff --git a/Gatekeeper/Gatekeeper.Api/Sql/CheckPermission.sql b/Gatekeeper/Gatekeeper.Api/Sql/CheckPermission.sql new file mode 100644 index 0000000..2d606f1 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/CheckPermission.sql @@ -0,0 +1,24 @@ +-- name: CheckPermission +-- Checks if user has a specific permission code (via roles or direct grant) +SELECT 1 AS has_permission +FROM gk_permission p +WHERE p.code = @permissionCode + AND ( + -- Check role permissions + EXISTS ( + SELECT 1 FROM gk_role_permission rp + JOIN gk_user_role ur ON rp.role_id = ur.role_id + WHERE rp.permission_id = p.id + AND ur.user_id = @userId + AND (ur.expires_at IS NULL OR ur.expires_at > @now) + ) + OR + -- Check direct permissions + EXISTS ( + SELECT 1 FROM gk_user_permission up + WHERE up.permission_id = p.id + AND up.user_id = @userId + AND (up.expires_at IS NULL OR up.expires_at > @now) + ) + ) +LIMIT 1 diff --git a/Gatekeeper/Gatekeeper.Api/Sql/CheckResourceGrant.sql b/Gatekeeper/Gatekeeper.Api/Sql/CheckResourceGrant.sql new file mode 100644 index 0000000..ec8489e --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/CheckResourceGrant.sql @@ -0,0 +1,10 @@ +-- name: CheckResourceGrant +SELECT rg.id, rg.user_id, rg.resource_type, rg.resource_id, rg.permission_id, + rg.granted_at, rg.granted_by, rg.expires_at, p.code as permission_code +FROM gk_resource_grant rg +JOIN gk_permission p ON rg.permission_id = p.id +WHERE rg.user_id = @user_id + AND rg.resource_type = @resource_type + AND rg.resource_id = @resource_id + AND p.code = @permission_code + AND (rg.expires_at IS NULL OR rg.expires_at > @now) diff --git a/Gatekeeper/Gatekeeper.Api/Sql/CountSystemRoles.sql b/Gatekeeper/Gatekeeper.Api/Sql/CountSystemRoles.sql new file mode 100644 index 0000000..ccad080 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/CountSystemRoles.sql @@ -0,0 +1,2 @@ +-- name: 
CountSystemRoles +SELECT COUNT(*) as cnt FROM gk_role WHERE is_system = true diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetActivePolicies.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetActivePolicies.sql new file mode 100644 index 0000000..39bd443 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetActivePolicies.sql @@ -0,0 +1,7 @@ +-- name: GetActivePolicies +SELECT id, name, description, resource_type, action, condition, effect, priority +FROM gk_policy +WHERE is_active = true + AND (resource_type = @resource_type OR resource_type = '*') + AND (action = @action OR action = '*') +ORDER BY priority DESC diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetAllPermissions.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetAllPermissions.sql new file mode 100644 index 0000000..9272aad --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetAllPermissions.sql @@ -0,0 +1,4 @@ +-- name: GetAllPermissions +SELECT id, code, resource_type, action, description, created_at +FROM gk_permission +ORDER BY resource_type, action diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetAllRoles.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetAllRoles.sql new file mode 100644 index 0000000..2c087d4 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetAllRoles.sql @@ -0,0 +1,4 @@ +-- name: GetAllRoles +SELECT id, name, description, is_system, created_at, parent_role_id +FROM gk_role +ORDER BY name diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetAllUsers.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetAllUsers.sql new file mode 100644 index 0000000..173fd7c --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetAllUsers.sql @@ -0,0 +1,4 @@ +-- name: GetAllUsers +SELECT id, display_name, email, created_at, last_login_at, is_active +FROM gk_user +ORDER BY display_name diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetChallengeById.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetChallengeById.sql new file mode 100644 index 0000000..344a195 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetChallengeById.sql @@ -0,0 +1,4 @@ +-- name: GetChallengeById +SELECT 
id, user_id, challenge, type, created_at, expires_at +FROM gk_challenge +WHERE id = @id AND expires_at > @now diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetCredentialById.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetCredentialById.sql new file mode 100644 index 0000000..e15905c --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetCredentialById.sql @@ -0,0 +1,7 @@ +-- name: GetCredentialById +SELECT c.id, c.user_id, c.public_key, c.sign_count, c.aaguid, c.credential_type, c.transports, + c.attestation_format, c.created_at, c.last_used_at, c.device_name, c.is_backup_eligible, c.is_backed_up, + u.display_name, u.email +FROM gk_credential c +JOIN gk_user u ON c.user_id = u.id +WHERE c.id = @id AND u.is_active = true diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetCredentialsByUserId.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetCredentialsByUserId.sql new file mode 100644 index 0000000..ff8f83a --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetCredentialsByUserId.sql @@ -0,0 +1,6 @@ +-- name: GetCredentialsByUserId +SELECT id, user_id, public_key, sign_count, aaguid, credential_type, transports, + attestation_format, created_at, last_used_at, device_name, + is_backup_eligible, is_backed_up +FROM gk_credential +WHERE user_id = @userId diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetPermissionByCode.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetPermissionByCode.sql new file mode 100644 index 0000000..7bb724f --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetPermissionByCode.sql @@ -0,0 +1,4 @@ +-- name: GetPermissionByCode +SELECT id, code, resource_type, action, description, created_at +FROM gk_permission +WHERE code = @code diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetRolePermissions.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetRolePermissions.sql new file mode 100644 index 0000000..933ce01 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetRolePermissions.sql @@ -0,0 +1,6 @@ +-- name: GetRolePermissions +SELECT p.id, p.code, p.resource_type, p.action, p.description, p.created_at, + 
rp.granted_at +FROM gk_permission p +JOIN gk_role_permission rp ON p.id = rp.permission_id +WHERE rp.role_id = @roleId diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetSessionById.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetSessionById.sql new file mode 100644 index 0000000..4d71809 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetSessionById.sql @@ -0,0 +1,7 @@ +-- name: GetSessionById +SELECT s.id, s.user_id, s.credential_id, s.created_at, s.expires_at, s.last_activity_at, + s.ip_address, s.user_agent, s.is_revoked, + u.display_name, u.email +FROM gk_session s +JOIN gk_user u ON s.user_id = u.id +WHERE s.id = @id AND s.is_revoked = false AND s.expires_at > @now AND u.is_active = true diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetSessionForRevoke.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetSessionForRevoke.sql new file mode 100644 index 0000000..92dd7e5 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetSessionForRevoke.sql @@ -0,0 +1,6 @@ +-- Gets a session for revocation (no filters) +-- @jti: The session ID (JWT ID) to get +SELECT id, user_id, credential_id, created_at, expires_at, last_activity_at, + ip_address, user_agent, is_revoked +FROM gk_session +WHERE id = @jti diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetSessionRevoked.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetSessionRevoked.sql new file mode 100644 index 0000000..1525177 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetSessionRevoked.sql @@ -0,0 +1,3 @@ +-- Gets the revocation status of a session +-- @jti: The session ID (JWT ID) to check +SELECT is_revoked FROM gk_session WHERE id = @jti diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetUserByEmail.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetUserByEmail.sql new file mode 100644 index 0000000..6532c54 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetUserByEmail.sql @@ -0,0 +1,4 @@ +-- name: GetUserByEmail +SELECT id, display_name, email, created_at, last_login_at, is_active, metadata +FROM gk_user +WHERE email = @email AND is_active = true diff --git 
a/Gatekeeper/Gatekeeper.Api/Sql/GetUserById.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetUserById.sql new file mode 100644 index 0000000..3e4f33a --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetUserById.sql @@ -0,0 +1,4 @@ +-- name: GetUserById +SELECT id, display_name, email, created_at, last_login_at, is_active, metadata +FROM gk_user +WHERE id = @id diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetUserCredentials.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetUserCredentials.sql new file mode 100644 index 0000000..1b6c59f --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetUserCredentials.sql @@ -0,0 +1,5 @@ +-- name: GetUserCredentials +SELECT id, user_id, public_key, sign_count, aaguid, credential_type, transports, + attestation_format, created_at, last_used_at, device_name, is_backup_eligible, is_backed_up +FROM gk_credential +WHERE user_id = @user_id diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetUserPermissions.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetUserPermissions.sql new file mode 100644 index 0000000..cdf9225 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetUserPermissions.sql @@ -0,0 +1,26 @@ +-- name: GetUserPermissions +-- Returns all permissions for a user: from roles + direct grants +-- Note: source_type column uses role name prefix to indicate source (role-based vs direct) +SELECT DISTINCT p.id, p.code, p.resource_type, p.action, p.description, + r.name as source_name, + ur.role_id as source_type, + NULL as scope_type, + NULL as scope_value +FROM gk_user_role ur +JOIN gk_role r ON ur.role_id = r.id +JOIN gk_role_permission rp ON r.id = rp.role_id +JOIN gk_permission p ON rp.permission_id = p.id +WHERE ur.user_id = @user_id + AND (ur.expires_at IS NULL OR ur.expires_at > @now) + +UNION ALL + +SELECT p.id, p.code, p.resource_type, p.action, p.description, + p.code as source_name, + up.permission_id as source_type, + COALESCE(up.scope_type, p.resource_type) as scope_type, + COALESCE(up.scope_value, p.action) as scope_value +FROM gk_user_permission up +JOIN 
gk_permission p ON up.permission_id = p.id +WHERE up.user_id = @user_id + AND (up.expires_at IS NULL OR up.expires_at > @now) diff --git a/Gatekeeper/Gatekeeper.Api/Sql/GetUserRoles.sql b/Gatekeeper/Gatekeeper.Api/Sql/GetUserRoles.sql new file mode 100644 index 0000000..f2675ed --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/GetUserRoles.sql @@ -0,0 +1,6 @@ +-- name: GetUserRoles +SELECT r.id, r.name, r.description, r.is_system, ur.granted_at, ur.expires_at +FROM gk_user_role ur +JOIN gk_role r ON ur.role_id = r.id +WHERE ur.user_id = @user_id + AND (ur.expires_at IS NULL OR ur.expires_at > @now) diff --git a/Gatekeeper/Gatekeeper.Api/Sql/RevokeSession.sql b/Gatekeeper/Gatekeeper.Api/Sql/RevokeSession.sql new file mode 100644 index 0000000..4a37a3b --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/Sql/RevokeSession.sql @@ -0,0 +1,3 @@ +-- Revokes a session by setting is_revoked = true +-- @jti: The session ID (JWT ID) to revoke +UPDATE gk_session SET is_revoked = true WHERE id = @jti RETURNING id, is_revoked diff --git a/Gatekeeper/Gatekeeper.Api/TokenService.cs b/Gatekeeper/Gatekeeper.Api/TokenService.cs new file mode 100644 index 0000000..2947582 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/TokenService.cs @@ -0,0 +1,194 @@ +using System.Security.Cryptography; +using System.Text; +using System.Text.Json; +using Microsoft.Extensions.Logging; +using Npgsql; + +namespace Gatekeeper.Api; + +/// <summary> +/// JWT token generation and validation service. +/// </summary> +public static class TokenService +{ + /// <summary>Token claims data.</summary> + public sealed record TokenClaims( + string UserId, + string? DisplayName, + string? Email, + IReadOnlyList<string> Roles, + string Jti, + long Exp + ); + + /// <summary>Successful token validation result.</summary> + public sealed record TokenValidationOk(TokenClaims Claims); + + /// <summary>Failed token validation result.</summary> + public sealed record TokenValidationError(string Reason); + + /// <summary> + /// Extracts the token from a Bearer authorization header.
+ /// </summary> + public static string? ExtractBearerToken(string? authHeader) => + authHeader?.StartsWith("Bearer ", StringComparison.Ordinal) == true + ? authHeader["Bearer ".Length..] + : null; + + /// <summary> + /// Creates a JWT token for the given user. + /// </summary> + public static string CreateToken( + string userId, + string? displayName, + string? email, + IReadOnlyList<string> roles, + byte[] signingKey, + TimeSpan lifetime + ) + { + var now = DateTimeOffset.UtcNow; + var exp = now.Add(lifetime); + var jti = Guid.NewGuid().ToString(); + + var header = Base64UrlEncode( + JsonSerializer.SerializeToUtf8Bytes(new { alg = "HS256", typ = "JWT" }) + ); + + var payload = Base64UrlEncode( + JsonSerializer.SerializeToUtf8Bytes( + new + { + sub = userId, + name = displayName, + email, + roles, + jti, + iat = now.ToUnixTimeSeconds(), + exp = exp.ToUnixTimeSeconds(), + } + ) + ); + + var signature = ComputeSignature(header, payload, signingKey); + return $"{header}.{payload}.{signature}"; + } + + /// <summary> + /// Validates a JWT token. + /// </summary> + public static async Task<object> ValidateTokenAsync( + NpgsqlConnection conn, + string token, + byte[] signingKey, + bool checkRevocation, + ILogger? 
logger = null + ) + { + try + { + var parts = token.Split('.'); + if (parts.Length != 3) + { + return new TokenValidationError("Invalid token format"); + } + + var expectedSignature = ComputeSignature(parts[0], parts[1], signingKey); + if ( + !CryptographicOperations.FixedTimeEquals( + Encoding.UTF8.GetBytes(expectedSignature), + Encoding.UTF8.GetBytes(parts[2]) + ) + ) + { + return new TokenValidationError("Invalid signature"); + } + + var payloadBytes = Base64UrlDecode(parts[1]); + using var doc = JsonDocument.Parse(payloadBytes); + var root = doc.RootElement; + + var exp = root.GetProperty("exp").GetInt64(); + if (DateTimeOffset.UtcNow.ToUnixTimeSeconds() > exp) + { + return new TokenValidationError("Token expired"); + } + + var jti = root.GetProperty("jti").GetString() ?? string.Empty; + + if (checkRevocation) + { + var isRevoked = await IsTokenRevokedAsync(conn, jti).ConfigureAwait(false); + if (isRevoked) + { + return new TokenValidationError("Token revoked"); + } + } + + var roles = root.TryGetProperty("roles", out var rolesElement) + ? rolesElement.EnumerateArray().Select(e => e.GetString() ?? string.Empty).ToList() + : []; + + var claims = new TokenClaims( + UserId: root.GetProperty("sub").GetString() ?? string.Empty, + DisplayName: root.TryGetProperty("name", out var nameElem) + ? nameElem.GetString() + : null, + Email: root.TryGetProperty("email", out var emailElem) + ? emailElem.GetString() + : null, + Roles: roles, + Jti: jti, + Exp: exp + ); + + return new TokenValidationOk(claims); + } + catch (Exception ex) + { + logger?.LogError(ex, "Token validation failed"); + return new TokenValidationError("Token validation failed"); + } + } + + /// <summary> + /// Revokes a token by JTI using DataProvider generated method. 
+ /// </summary> + public static async Task RevokeTokenAsync(NpgsqlConnection conn, string jti) => + _ = await conn.RevokeSessionAsync(jti).ConfigureAwait(false); + + private static async Task<bool> IsTokenRevokedAsync(NpgsqlConnection conn, string jti) + { + var result = await conn.GetSessionRevokedAsync(jti).ConfigureAwait(false); + return result switch + { + GetSessionRevokedOk ok => ok.Value.FirstOrDefault()?.is_revoked == true, + GetSessionRevokedError => false, + }; + } + + private static string Base64UrlEncode(byte[] input) => + Convert + .ToBase64String(input) + .Replace("+", "-", StringComparison.Ordinal) + .Replace("/", "_", StringComparison.Ordinal) + .TrimEnd('='); + + private static byte[] Base64UrlDecode(string input) + { + var padded = input + .Replace("-", "+", StringComparison.Ordinal) + .Replace("_", "/", StringComparison.Ordinal); + var padding = (4 - (padded.Length % 4)) % 4; + padded += new string('=', padding); + return Convert.FromBase64String(padded); + } + + private static string ComputeSignature(string header, string payload, byte[] key) + { + var data = Encoding.UTF8.GetBytes($"{header}.{payload}"); + using var hmac = new HMACSHA256(key); + var hash = hmac.ComputeHash(data); + return Base64UrlEncode(hash); + } +} diff --git a/Gatekeeper/Gatekeeper.Api/gatekeeper-schema.yaml b/Gatekeeper/Gatekeeper.Api/gatekeeper-schema.yaml new file mode 100644 index 0000000..809eb19 --- /dev/null +++ b/Gatekeeper/Gatekeeper.Api/gatekeeper-schema.yaml @@ -0,0 +1,397 @@ +name: gatekeeper +tables: +- name: gk_user + columns: + - name: id + type: Text + - name: display_name + type: Text + - name: email + type: Text + - name: created_at + type: Text + - name: last_login_at + type: Text + - name: is_active + type: Boolean + defaultValue: "true" + - name: metadata + type: Json + indexes: + - name: idx_user_email + columns: + - email + isUnique: true + primaryKey: + name: PK_gk_user + columns: + - id +- name: gk_credential + columns: + - name: id + type: Text + 
- name: user_id + type: Text + - name: public_key + type: Blob + - name: sign_count + type: Int + defaultValue: 0 + - name: aaguid + type: Text + - name: credential_type + type: Text + - name: transports + type: Json + - name: attestation_format + type: Text + - name: created_at + type: Text + - name: last_used_at + type: Text + - name: device_name + type: Text + - name: is_backup_eligible + type: Boolean + - name: is_backed_up + type: Boolean + indexes: + - name: idx_credential_user + columns: + - user_id + foreignKeys: + - name: FK_gk_credential_user_id + columns: + - user_id + referencedTable: gk_user + referencedColumns: + - id + onDelete: Cascade + primaryKey: + name: PK_gk_credential + columns: + - id +- name: gk_session + columns: + - name: id + type: Text + - name: user_id + type: Text + - name: credential_id + type: Text + - name: created_at + type: Text + - name: expires_at + type: Text + - name: last_activity_at + type: Text + - name: ip_address + type: Text + - name: user_agent + type: Text + - name: is_revoked + type: Boolean + defaultValue: "false" + indexes: + - name: idx_session_user + columns: + - user_id + - name: idx_session_expires + columns: + - expires_at + foreignKeys: + - name: FK_gk_session_user_id + columns: + - user_id + referencedTable: gk_user + referencedColumns: + - id + onDelete: Cascade + - name: FK_gk_session_credential_id + columns: + - credential_id + referencedTable: gk_credential + referencedColumns: + - id + primaryKey: + name: PK_gk_session + columns: + - id +- name: gk_challenge + columns: + - name: id + type: Text + - name: user_id + type: Text + - name: challenge + type: Blob + - name: type + type: Text + - name: created_at + type: Text + - name: expires_at + type: Text + primaryKey: + name: PK_gk_challenge + columns: + - id +- name: gk_role + columns: + - name: id + type: Text + - name: name + type: Text + - name: description + type: Text + - name: is_system + type: Boolean + defaultValue: "false" + - name: created_at + 
type: Text + - name: parent_role_id + type: Text + indexes: + - name: idx_role_name + columns: + - name + isUnique: true + foreignKeys: + - name: FK_gk_role_parent_role_id + columns: + - parent_role_id + referencedTable: gk_role + referencedColumns: + - id + primaryKey: + name: PK_gk_role + columns: + - id +- name: gk_user_role + columns: + - name: user_id + type: Text + - name: role_id + type: Text + - name: granted_at + type: Text + - name: granted_by + type: Text + - name: expires_at + type: Text + foreignKeys: + - name: FK_gk_user_role_user_id + columns: + - user_id + referencedTable: gk_user + referencedColumns: + - id + onDelete: Cascade + - name: FK_gk_user_role_role_id + columns: + - role_id + referencedTable: gk_role + referencedColumns: + - id + onDelete: Cascade + - name: FK_gk_user_role_granted_by + columns: + - granted_by + referencedTable: gk_user + referencedColumns: + - id + primaryKey: + name: PK_gk_user_role + columns: + - user_id + - role_id +- name: gk_permission + columns: + - name: id + type: Text + - name: code + type: Text + - name: resource_type + type: Text + - name: action + type: Text + - name: description + type: Text + - name: created_at + type: Text + indexes: + - name: idx_permission_code + columns: + - code + isUnique: true + - name: idx_permission_resource + columns: + - resource_type + primaryKey: + name: PK_gk_permission + columns: + - id +- name: gk_role_permission + columns: + - name: role_id + type: Text + - name: permission_id + type: Text + - name: granted_at + type: Text + foreignKeys: + - name: FK_gk_role_permission_role_id + columns: + - role_id + referencedTable: gk_role + referencedColumns: + - id + onDelete: Cascade + - name: FK_gk_role_permission_permission_id + columns: + - permission_id + referencedTable: gk_permission + referencedColumns: + - id + onDelete: Cascade + primaryKey: + name: PK_gk_role_permission + columns: + - role_id + - permission_id +- name: gk_user_permission + columns: + - name: user_id + type: 
Text + - name: permission_id + type: Text + - name: scope_type + type: Text + - name: scope_value + type: Text + - name: granted_at + type: Text + - name: granted_by + type: Text + - name: expires_at + type: Text + - name: reason + type: Text + indexes: + - name: idx_user_permission + columns: + - user_id + - permission_id + - scope_value + isUnique: true + foreignKeys: + - name: FK_gk_user_permission_user_id + columns: + - user_id + referencedTable: gk_user + referencedColumns: + - id + onDelete: Cascade + - name: FK_gk_user_permission_permission_id + columns: + - permission_id + referencedTable: gk_permission + referencedColumns: + - id + onDelete: Cascade + - name: FK_gk_user_permission_granted_by + columns: + - granted_by + referencedTable: gk_user + referencedColumns: + - id +- name: gk_resource_grant + columns: + - name: id + type: Text + - name: user_id + type: Text + - name: resource_type + type: Text + - name: resource_id + type: Text + - name: permission_id + type: Text + - name: granted_at + type: Text + - name: granted_by + type: Text + - name: expires_at + type: Text + indexes: + - name: idx_resource_grant_user + columns: + - user_id + - name: idx_resource_grant_resource + columns: + - resource_type + - resource_id + foreignKeys: + - name: FK_gk_resource_grant_user_id + columns: + - user_id + referencedTable: gk_user + referencedColumns: + - id + onDelete: Cascade + - name: FK_gk_resource_grant_permission_id + columns: + - permission_id + referencedTable: gk_permission + referencedColumns: + - id + - name: FK_gk_resource_grant_granted_by + columns: + - granted_by + referencedTable: gk_user + referencedColumns: + - id + primaryKey: + name: PK_gk_resource_grant + columns: + - id + uniqueConstraints: + - name: uq_resource_grant + columns: + - user_id + - resource_type + - resource_id + - permission_id +- name: gk_policy + columns: + - name: id + type: Text + - name: name + type: Text + - name: description + type: Text + - name: resource_type + type: Text 
+ - name: action + type: Text + - name: condition + type: Json + - name: effect + type: Text + defaultValue: "'allow'" + - name: priority + type: Int + defaultValue: 0 + - name: is_active + type: Boolean + defaultValue: "true" + - name: created_at + type: Text + indexes: + - name: idx_policy_name + columns: + - name + isUnique: true + primaryKey: + name: PK_gk_policy + columns: + - id diff --git a/Gatekeeper/Gatekeeper.Api/gatekeeper.db b/Gatekeeper/Gatekeeper.Api/gatekeeper.db new file mode 100644 index 0000000..18d58e3 Binary files /dev/null and b/Gatekeeper/Gatekeeper.Api/gatekeeper.db differ diff --git a/Gatekeeper/README.md b/Gatekeeper/README.md new file mode 100644 index 0000000..94c3ca0 --- /dev/null +++ b/Gatekeeper/README.md @@ -0,0 +1,13 @@ +# Gatekeeper + +An independent authentication and authorization microservice: passkey-only authentication (WebAuthn/FIDO2) and fine-grained role-based access control with record-level permissions. + +| Project | Description | +|---------|-------------| +| `Gatekeeper.Api` | REST API with WebAuthn and authorization endpoints | +| `Gatekeeper.Migration` | Database schema using DataProvider migrations | +| `Gatekeeper.Api.Tests` | Integration tests | + +## Documentation + +- Full specification: [docs/specs/gatekeeper-spec.md](../docs/specs/gatekeeper-spec.md) diff --git a/HealthcareSamples.sln b/HealthcareSamples.sln index 844d9a9..33c90da 100644 --- a/HealthcareSamples.sln +++ b/HealthcareSamples.sln @@ -1,4 +1,4 @@ - + Microsoft Visual Studio Solution File, Format Version 12.00 # Visual Studio Version 17 VisualStudioVersion = 17.5.2.0 @@ -39,64 +39,219 @@ Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Shared", "Shared", "{A1B2C3 EndProject Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Authorization", "Shared\Authorization\Authorization.csproj", "{CA395494-F072-4A5B-9DD4-950530A69E0E}" EndProject +Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Gatekeeper", "Gatekeeper", 
"{048F5F03-6DDC-C04F-70D5-B8139DC8E373}" +EndProject +Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Gatekeeper.Api", "Gatekeeper\Gatekeeper.Api\Gatekeeper.Api.csproj", "{3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}" +EndProject +Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Gatekeeper.Api.Tests", "Gatekeeper\Gatekeeper.Api.Tests\Gatekeeper.Api.Tests.csproj", "{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}" +EndProject +Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "ICD10.TestSupport", "ICD10\ICD10.TestSupport\ICD10.TestSupport.csproj", "{817E658D-F40C-43E5-8D25-92FE882D1760}" +EndProject Global GlobalSection(SolutionConfigurationPlatforms) = preSolution Debug|Any CPU = Debug|Any CPU + Debug|x64 = Debug|x64 + Debug|x86 = Debug|x86 Release|Any CPU = Release|Any CPU + Release|x64 = Release|x64 + Release|x86 = Release|x86 EndGlobalSection GlobalSection(ProjectConfigurationPlatforms) = postSolution {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Debug|Any CPU.Build.0 = Debug|Any CPU + {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Debug|x64.ActiveCfg = Debug|Any CPU + {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Debug|x64.Build.0 = Debug|Any CPU + {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Debug|x86.ActiveCfg = Debug|Any CPU + {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Debug|x86.Build.0 = Debug|Any CPU {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Release|Any CPU.ActiveCfg = Release|Any CPU {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Release|Any CPU.Build.0 = Release|Any CPU + {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Release|x64.ActiveCfg = Release|Any CPU + {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Release|x64.Build.0 = Release|Any CPU + {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Release|x86.ActiveCfg = Release|Any CPU + {D53426B7-469F-4FBB-9935-4AA3C303DE8D}.Release|x86.Build.0 = Release|Any CPU {4189D963-E5AA-4782-AD78-72FBA9536B59}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {4189D963-E5AA-4782-AD78-72FBA9536B59}.Debug|Any CPU.Build.0 = 
Debug|Any CPU + {4189D963-E5AA-4782-AD78-72FBA9536B59}.Debug|x64.ActiveCfg = Debug|Any CPU + {4189D963-E5AA-4782-AD78-72FBA9536B59}.Debug|x64.Build.0 = Debug|Any CPU + {4189D963-E5AA-4782-AD78-72FBA9536B59}.Debug|x86.ActiveCfg = Debug|Any CPU + {4189D963-E5AA-4782-AD78-72FBA9536B59}.Debug|x86.Build.0 = Debug|Any CPU {4189D963-E5AA-4782-AD78-72FBA9536B59}.Release|Any CPU.ActiveCfg = Release|Any CPU {4189D963-E5AA-4782-AD78-72FBA9536B59}.Release|Any CPU.Build.0 = Release|Any CPU + {4189D963-E5AA-4782-AD78-72FBA9536B59}.Release|x64.ActiveCfg = Release|Any CPU + {4189D963-E5AA-4782-AD78-72FBA9536B59}.Release|x64.Build.0 = Release|Any CPU + {4189D963-E5AA-4782-AD78-72FBA9536B59}.Release|x86.ActiveCfg = Release|Any CPU + {4189D963-E5AA-4782-AD78-72FBA9536B59}.Release|x86.Build.0 = Release|Any CPU {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Debug|Any CPU.Build.0 = Debug|Any CPU + {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Debug|x64.ActiveCfg = Debug|Any CPU + {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Debug|x64.Build.0 = Debug|Any CPU + {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Debug|x86.ActiveCfg = Debug|Any CPU + {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Debug|x86.Build.0 = Debug|Any CPU {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Release|Any CPU.ActiveCfg = Release|Any CPU {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Release|Any CPU.Build.0 = Release|Any CPU + {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Release|x64.ActiveCfg = Release|Any CPU + {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Release|x64.Build.0 = Release|Any CPU + {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Release|x86.ActiveCfg = Release|Any CPU + {8131E980-CA39-4BAD-9ADE-34E6597BD00F}.Release|x86.Build.0 = Release|Any CPU {0F990389-7C88-4C7A-99F8-60E5243216FF}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {0F990389-7C88-4C7A-99F8-60E5243216FF}.Debug|Any CPU.Build.0 = Debug|Any CPU + {0F990389-7C88-4C7A-99F8-60E5243216FF}.Debug|x64.ActiveCfg = Debug|Any CPU + 
{0F990389-7C88-4C7A-99F8-60E5243216FF}.Debug|x64.Build.0 = Debug|Any CPU + {0F990389-7C88-4C7A-99F8-60E5243216FF}.Debug|x86.ActiveCfg = Debug|Any CPU + {0F990389-7C88-4C7A-99F8-60E5243216FF}.Debug|x86.Build.0 = Debug|Any CPU {0F990389-7C88-4C7A-99F8-60E5243216FF}.Release|Any CPU.ActiveCfg = Release|Any CPU {0F990389-7C88-4C7A-99F8-60E5243216FF}.Release|Any CPU.Build.0 = Release|Any CPU + {0F990389-7C88-4C7A-99F8-60E5243216FF}.Release|x64.ActiveCfg = Release|Any CPU + {0F990389-7C88-4C7A-99F8-60E5243216FF}.Release|x64.Build.0 = Release|Any CPU + {0F990389-7C88-4C7A-99F8-60E5243216FF}.Release|x86.ActiveCfg = Release|Any CPU + {0F990389-7C88-4C7A-99F8-60E5243216FF}.Release|x86.Build.0 = Release|Any CPU {7782890E-712E-4658-8BF2-0DC5794A87AC}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {7782890E-712E-4658-8BF2-0DC5794A87AC}.Debug|Any CPU.Build.0 = Debug|Any CPU + {7782890E-712E-4658-8BF2-0DC5794A87AC}.Debug|x64.ActiveCfg = Debug|Any CPU + {7782890E-712E-4658-8BF2-0DC5794A87AC}.Debug|x64.Build.0 = Debug|Any CPU + {7782890E-712E-4658-8BF2-0DC5794A87AC}.Debug|x86.ActiveCfg = Debug|Any CPU + {7782890E-712E-4658-8BF2-0DC5794A87AC}.Debug|x86.Build.0 = Debug|Any CPU {7782890E-712E-4658-8BF2-0DC5794A87AC}.Release|Any CPU.ActiveCfg = Release|Any CPU {7782890E-712E-4658-8BF2-0DC5794A87AC}.Release|Any CPU.Build.0 = Release|Any CPU + {7782890E-712E-4658-8BF2-0DC5794A87AC}.Release|x64.ActiveCfg = Release|Any CPU + {7782890E-712E-4658-8BF2-0DC5794A87AC}.Release|x64.Build.0 = Release|Any CPU + {7782890E-712E-4658-8BF2-0DC5794A87AC}.Release|x86.ActiveCfg = Release|Any CPU + {7782890E-712E-4658-8BF2-0DC5794A87AC}.Release|x86.Build.0 = Release|Any CPU {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Debug|Any CPU.Build.0 = Debug|Any CPU + {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Debug|x64.ActiveCfg = Debug|Any CPU + {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Debug|x64.Build.0 = Debug|Any CPU + 
{C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Debug|x86.ActiveCfg = Debug|Any CPU + {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Debug|x86.Build.0 = Debug|Any CPU {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Release|Any CPU.ActiveCfg = Release|Any CPU {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Release|Any CPU.Build.0 = Release|Any CPU + {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Release|x64.ActiveCfg = Release|Any CPU + {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Release|x64.Build.0 = Release|Any CPU + {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Release|x86.ActiveCfg = Release|Any CPU + {C23F467D-B5F1-400D-9EEA-96E3F467BAB7}.Release|x86.Build.0 = Release|Any CPU {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Debug|Any CPU.Build.0 = Debug|Any CPU + {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Debug|x64.ActiveCfg = Debug|Any CPU + {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Debug|x64.Build.0 = Debug|Any CPU + {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Debug|x86.ActiveCfg = Debug|Any CPU + {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Debug|x86.Build.0 = Debug|Any CPU {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Release|Any CPU.ActiveCfg = Release|Any CPU {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Release|Any CPU.Build.0 = Release|Any CPU + {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Release|x64.ActiveCfg = Release|Any CPU + {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Release|x64.Build.0 = Release|Any CPU + {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Release|x86.ActiveCfg = Release|Any CPU + {94C443C0-AB5B-4FEC-9DB1-C1F29AB86653}.Release|x86.Build.0 = Release|Any CPU {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Debug|Any CPU.Build.0 = Debug|Any CPU + {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Debug|x64.ActiveCfg = Debug|Any CPU + {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Debug|x64.Build.0 = Debug|Any CPU + {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Debug|x86.ActiveCfg = Debug|Any CPU + 
{31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Debug|x86.Build.0 = Debug|Any CPU {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Release|Any CPU.ActiveCfg = Release|Any CPU {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Release|Any CPU.Build.0 = Release|Any CPU + {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Release|x64.ActiveCfg = Release|Any CPU + {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Release|x64.Build.0 = Release|Any CPU + {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Release|x86.ActiveCfg = Release|Any CPU + {31970639-E4E9-4AEF-83A1-B4DF00A4720C}.Release|x86.Build.0 = Release|Any CPU {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Debug|Any CPU.Build.0 = Debug|Any CPU + {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Debug|x64.ActiveCfg = Debug|Any CPU + {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Debug|x64.Build.0 = Debug|Any CPU + {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Debug|x86.ActiveCfg = Debug|Any CPU + {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Debug|x86.Build.0 = Debug|Any CPU {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Release|Any CPU.ActiveCfg = Release|Any CPU {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Release|Any CPU.Build.0 = Release|Any CPU + {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Release|x64.ActiveCfg = Release|Any CPU + {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Release|x64.Build.0 = Release|Any CPU + {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Release|x86.ActiveCfg = Release|Any CPU + {57FF1C59-233D-49F2-B9A5-3E996EB484DE}.Release|x86.Build.0 = Release|Any CPU {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Debug|Any CPU.Build.0 = Debug|Any CPU + {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Debug|x64.ActiveCfg = Debug|Any CPU + {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Debug|x64.Build.0 = Debug|Any CPU + {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Debug|x86.ActiveCfg = Debug|Any CPU + {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Debug|x86.Build.0 = Debug|Any CPU 
{3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Release|Any CPU.ActiveCfg = Release|Any CPU {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Release|Any CPU.Build.0 = Release|Any CPU + {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Release|x64.ActiveCfg = Release|Any CPU + {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Release|x64.Build.0 = Release|Any CPU + {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Release|x86.ActiveCfg = Release|Any CPU + {3A1E29E7-2A50-4F26-96D7-D38D3328E595}.Release|x86.Build.0 = Release|Any CPU {A82453CD-8E3C-44B7-A78F-97F392016385}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {A82453CD-8E3C-44B7-A78F-97F392016385}.Debug|Any CPU.Build.0 = Debug|Any CPU + {A82453CD-8E3C-44B7-A78F-97F392016385}.Debug|x64.ActiveCfg = Debug|Any CPU + {A82453CD-8E3C-44B7-A78F-97F392016385}.Debug|x64.Build.0 = Debug|Any CPU + {A82453CD-8E3C-44B7-A78F-97F392016385}.Debug|x86.ActiveCfg = Debug|Any CPU + {A82453CD-8E3C-44B7-A78F-97F392016385}.Debug|x86.Build.0 = Debug|Any CPU {A82453CD-8E3C-44B7-A78F-97F392016385}.Release|Any CPU.ActiveCfg = Release|Any CPU {A82453CD-8E3C-44B7-A78F-97F392016385}.Release|Any CPU.Build.0 = Release|Any CPU + {A82453CD-8E3C-44B7-A78F-97F392016385}.Release|x64.ActiveCfg = Release|Any CPU + {A82453CD-8E3C-44B7-A78F-97F392016385}.Release|x64.Build.0 = Release|Any CPU + {A82453CD-8E3C-44B7-A78F-97F392016385}.Release|x86.ActiveCfg = Release|Any CPU + {A82453CD-8E3C-44B7-A78F-97F392016385}.Release|x86.Build.0 = Release|Any CPU {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Debug|Any CPU.Build.0 = Debug|Any CPU + {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Debug|x64.ActiveCfg = Debug|Any CPU + {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Debug|x64.Build.0 = Debug|Any CPU + {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Debug|x86.ActiveCfg = Debug|Any CPU + {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Debug|x86.Build.0 = Debug|Any CPU {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Release|Any CPU.ActiveCfg = Release|Any CPU 
{83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Release|Any CPU.Build.0 = Release|Any CPU + {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Release|x64.ActiveCfg = Release|Any CPU + {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Release|x64.Build.0 = Release|Any CPU + {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Release|x86.ActiveCfg = Release|Any CPU + {83E43658-7186-4E8B-AFD0-BDE5DB7BFB58}.Release|x86.Build.0 = Release|Any CPU {CA395494-F072-4A5B-9DD4-950530A69E0E}.Debug|Any CPU.ActiveCfg = Debug|Any CPU {CA395494-F072-4A5B-9DD4-950530A69E0E}.Debug|Any CPU.Build.0 = Debug|Any CPU + {CA395494-F072-4A5B-9DD4-950530A69E0E}.Debug|x64.ActiveCfg = Debug|Any CPU + {CA395494-F072-4A5B-9DD4-950530A69E0E}.Debug|x64.Build.0 = Debug|Any CPU + {CA395494-F072-4A5B-9DD4-950530A69E0E}.Debug|x86.ActiveCfg = Debug|Any CPU + {CA395494-F072-4A5B-9DD4-950530A69E0E}.Debug|x86.Build.0 = Debug|Any CPU {CA395494-F072-4A5B-9DD4-950530A69E0E}.Release|Any CPU.ActiveCfg = Release|Any CPU {CA395494-F072-4A5B-9DD4-950530A69E0E}.Release|Any CPU.Build.0 = Release|Any CPU + {CA395494-F072-4A5B-9DD4-950530A69E0E}.Release|x64.ActiveCfg = Release|Any CPU + {CA395494-F072-4A5B-9DD4-950530A69E0E}.Release|x64.Build.0 = Release|Any CPU + {CA395494-F072-4A5B-9DD4-950530A69E0E}.Release|x86.ActiveCfg = Release|Any CPU + {CA395494-F072-4A5B-9DD4-950530A69E0E}.Release|x86.Build.0 = Release|Any CPU + {3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Debug|Any CPU.ActiveCfg = Debug|Any CPU + {3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Debug|Any CPU.Build.0 = Debug|Any CPU + {3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Debug|x64.ActiveCfg = Debug|Any CPU + {3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Debug|x64.Build.0 = Debug|Any CPU + {3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Debug|x86.ActiveCfg = Debug|Any CPU + {3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Debug|x86.Build.0 = Debug|Any CPU + {3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Release|Any CPU.ActiveCfg = Release|Any CPU + {3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Release|Any CPU.Build.0 = Release|Any CPU + 
+		{3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Release|x64.ActiveCfg = Release|Any CPU
+		{3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Release|x64.Build.0 = Release|Any CPU
+		{3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Release|x86.ActiveCfg = Release|Any CPU
+		{3A6684C8-1A85-4BF7-8B5C-E07F4E123F12}.Release|x86.Build.0 = Release|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Debug|Any CPU.Build.0 = Debug|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Debug|x64.ActiveCfg = Debug|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Debug|x64.Build.0 = Debug|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Debug|x86.ActiveCfg = Debug|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Debug|x86.Build.0 = Debug|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Release|Any CPU.ActiveCfg = Release|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Release|Any CPU.Build.0 = Release|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Release|x64.ActiveCfg = Release|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Release|x64.Build.0 = Release|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Release|x86.ActiveCfg = Release|Any CPU
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358}.Release|x86.Build.0 = Release|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Debug|Any CPU.Build.0 = Debug|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Debug|x64.ActiveCfg = Debug|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Debug|x64.Build.0 = Debug|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Debug|x86.ActiveCfg = Debug|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Debug|x86.Build.0 = Debug|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Release|Any CPU.ActiveCfg = Release|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Release|Any CPU.Build.0 = Release|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Release|x64.ActiveCfg = Release|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Release|x64.Build.0 = Release|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Release|x86.ActiveCfg = Release|Any CPU
+		{817E658D-F40C-43E5-8D25-92FE882D1760}.Release|x86.Build.0 = Release|Any CPU
+	EndGlobalSection
+	GlobalSection(SolutionProperties) = preSolution
+		HideSolutionNode = FALSE
 	EndGlobalSection
 	GlobalSection(NestedProjects) = preSolution
 		{D53426B7-469F-4FBB-9935-4AA3C303DE8D} = {A1B2C3D4-0001-0001-0001-000000000001}
@@ -112,5 +267,8 @@ Global
 		{A82453CD-8E3C-44B7-A78F-97F392016385} = {A1B2C3D4-0001-0001-0001-000000000004}
 		{83E43658-7186-4E8B-AFD0-BDE5DB7BFB58} = {A1B2C3D4-0001-0001-0001-000000000004}
 		{CA395494-F072-4A5B-9DD4-950530A69E0E} = {A1B2C3D4-0001-0001-0001-000000000005}
+		{3A6684C8-1A85-4BF7-8B5C-E07F4E123F12} = {048F5F03-6DDC-C04F-70D5-B8139DC8E373}
+		{0FC88CC8-1203-4215-AEFC-6CFA0A8DB358} = {048F5F03-6DDC-C04F-70D5-B8139DC8E373}
+		{817E658D-F40C-43E5-8D25-92FE882D1760} = {A1B2C3D4-0001-0001-0001-000000000003}
 	EndGlobalSection
 EndGlobal
diff --git a/ICD10/.gitignore b/ICD10/.gitignore
index 09530ff..64e63e3 100644
--- a/ICD10/.gitignore
+++ b/ICD10/.gitignore
@@ -1,8 +1,3 @@
-# Generated files
-*.generated.sql
-*.db
-Generated/
-
 # Python
 __pycache__/
 *.pyc
diff --git a/ICD10/ICD10.Api.Tests/ICD10.Api.Tests.csproj b/ICD10/ICD10.Api.Tests/ICD10.Api.Tests.csproj
index fcb64e0..1020c2b 100644
--- a/ICD10/ICD10.Api.Tests/ICD10.Api.Tests.csproj
+++ b/ICD10/ICD10.Api.Tests/ICD10.Api.Tests.csproj
@@ -15,6 +15,10 @@
         </PackageReference>
         <PackageReference Include="Microsoft.AspNetCore.Mvc.Testing" Version="10.0.3" />
         <PackageReference Include="Npgsql" Version="9.0.2" />
+        <PackageReference
+            Include="Nimblesite.DataProvider.Migration.Postgres"
+            Version="$(DataProviderVersion)"
+        />
         <PackageReference Include="coverlet.collector" Version="6.0.4">
             <PrivateAssets>all</PrivateAssets>
             <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
@@ -23,6 +27,7 @@
     <ItemGroup>
         <ProjectReference Include="..\ICD10.Api\ICD10.Api.csproj" />
+        <ProjectReference Include="..\ICD10.TestSupport\ICD10.TestSupport.csproj" />
        <ProjectReference Include="..\..\Shared\Authorization\Authorization.csproj" />
     </ItemGroup>

 </Project>
diff --git a/ICD10/ICD10.Api.Tests/ICD10ApiFactory.cs b/ICD10/ICD10.Api.Tests/ICD10ApiFactory.cs
index 5dde012..5537cbd 100644
--- a/ICD10/ICD10.Api.Tests/ICD10ApiFactory.cs
+++ b/ICD10/ICD10.Api.Tests/ICD10ApiFactory.cs
@@ -1,7 +1,8 @@
+using ICD10.TestSupport;
 using Microsoft.AspNetCore.Hosting;
 using Microsoft.AspNetCore.Mvc.Testing;
-using Migration;
-using Migration.Postgres;
+using Nimblesite.DataProvider.Migration.Core;
+using Nimblesite.DataProvider.Migration.Postgres;
 using Npgsql;
 
 namespace ICD10.Api.Tests;
diff --git a/ICD10/ICD10.Api/DataProvider.json b/ICD10/ICD10.Api/DataProvider.json
index ffff099..ab11c11 100644
--- a/ICD10/ICD10.Api/DataProvider.json
+++ b/ICD10/ICD10.Api/DataProvider.json
@@ -1,136 +1,145 @@
 {
-  "queries": [
-    {
-      "name": "GetChapters",
-      "sqlFile": "Queries/GetChapters.generated.sql"
-    },
-    {
-      "name": "GetBlocksByChapter",
-      "sqlFile": "Queries/GetBlocksByChapter.generated.sql"
-    },
-    {
-      "name": "GetCategoriesByBlock",
-      "sqlFile": "Queries/GetCategoriesByBlock.generated.sql"
-    },
-    {
-      "name": "GetCodesByCategory",
-      "sqlFile": "Queries/GetCodesByCategory.generated.sql"
-    },
-    {
-      "name": "GetCodeByCode",
-      "sqlFile": "Queries/GetCodeByCode.generated.sql"
-    },
-    {
-      "name": "GetAchiBlocks",
-      "sqlFile": "Queries/GetAchiBlocks.generated.sql"
-    },
-    {
-      "name": "GetAchiCodesByBlock",
-      "sqlFile": "Queries/GetAchiCodesByBlock.generated.sql"
-    },
-    {
-      "name": "GetAchiCodeByCode",
-      "sqlFile": "Queries/GetAchiCodeByCode.generated.sql"
-    },
-    {
-      "name": "GetCodeEmbedding",
-      "sqlFile": "Queries/GetCodeEmbedding.generated.sql"
-    },
-    {
-      "name": "GetAllCodeEmbeddings",
-      "sqlFile": "Queries/GetAllCodeEmbeddings.generated.sql"
-    },
-    {
-      "name": "SearchAchiCodes",
-      "sqlFile": "Queries/SearchAchiCodes.sql"
-    },
-    {
-      "name": "SearchIcd10Codes",
-      "sqlFile": "Queries/SearchIcd10Codes.sql"
-    }
-  ],
-  "tables": [
-    {
-      "schema": "main",
-      "name": "icd10_chapter",
-      "generateInsert": false,
-      "generateUpdate": false,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    },
-    {
-      "schema": "main",
-      "name": "icd10_block",
-      "generateInsert": false,
-      "generateUpdate": false,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    },
-    {
-      "schema": "main",
-      "name": "icd10_category",
-      "generateInsert": false,
-      "generateUpdate": false,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    },
-    {
-      "schema": "main",
-      "name": "icd10_code",
-      "generateInsert": false,
-      "generateUpdate": false,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    },
-    {
-      "schema": "main",
-      "name": "icd10_code_embedding",
-      "generateInsert": false,
-      "generateUpdate": true,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    },
-    {
-      "schema": "main",
-      "name": "achi_block",
-      "generateInsert": false,
-      "generateUpdate": false,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    },
-    {
-      "schema": "main",
-      "name": "achi_code",
-      "generateInsert": false,
-      "generateUpdate": false,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    },
-    {
-      "schema": "main",
-      "name": "achi_code_embedding",
-      "generateInsert": false,
-      "generateUpdate": true,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    },
-    {
-      "schema": "main",
-      "name": "user_search_history",
-      "generateInsert": false,
-      "generateUpdate": false,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    }
-  ],
-  "connectionString": "Data Source=icd10.db"
-}
+    "queries": [
+        {
+            "name": "GetChapters",
+            "sqlFile": "Queries/GetChapters.generated.sql"
+        },
+        {
+            "name": "GetBlocksByChapter",
+            "sqlFile": "Queries/GetBlocksByChapter.generated.sql"
+        },
+        {
+            "name": "GetCategoriesByBlock",
+            "sqlFile": "Queries/GetCategoriesByBlock.generated.sql"
+        },
+        {
+            "name": "GetCodesByCategory",
+            "sqlFile": "Queries/GetCodesByCategory.generated.sql"
+        },
+        {
+            "name": "GetCodeByCode",
+            "sqlFile": "Queries/GetCodeByCode.generated.sql"
+        },
+        {
+            "name": "GetAchiBlocks",
+            "sqlFile": "Queries/GetAchiBlocks.generated.sql"
+        },
+        {
+            "name": "GetAchiCodesByBlock",
+            "sqlFile": "Queries/GetAchiCodesByBlock.generated.sql"
+        },
+        {
+            "name": "GetAchiCodeByCode",
+            "sqlFile": "Queries/GetAchiCodeByCode.generated.sql"
+        },
+        {
+            "name": "GetCodeEmbedding",
+            "sqlFile": "Queries/GetCodeEmbedding.generated.sql"
+        },
+        {
+            "name": "GetAllCodeEmbeddings",
+            "sqlFile": "Queries/GetAllCodeEmbeddings.generated.sql"
+        },
+        {
+            "name": "SearchAchiCodes",
+            "sqlFile": "Queries/SearchAchiCodes.sql"
+        },
+        {
+            "name": "SearchIcd10Codes",
+            "sqlFile": "Queries/SearchIcd10Codes.sql"
+        }
+    ],
+    "tables": [
+        {
+            "schema": "public",
+            "name": "icd10_chapter",
+            "generateInsert": true,
+            "generateUpdate": false,
+            "generateDelete": false,
+            "primaryKeyColumns": [
+                "Id"
+            ]
+        },
+        {
+            "schema": "public",
+            "name": "icd10_block",
+            "generateInsert": true,
+            "generateUpdate": false,
+            "generateDelete": false,
+            "primaryKeyColumns": [
+                "Id"
+            ]
+        },
+        {
+            "schema": "public",
+            "name": "icd10_category",
+            "generateInsert": true,
+            "generateUpdate": false,
+            "generateDelete": false,
+            "primaryKeyColumns": [
+                "Id"
+            ]
+        },
+        {
+            "schema": "public",
+            "name": "icd10_code",
+            "generateInsert": true,
+            "generateUpdate": false,
+            "generateDelete": false,
+            "primaryKeyColumns": [
+                "Id"
+            ]
+        },
+        {
+            "schema": "public",
+            "name": "icd10_code_embedding",
+            "generateInsert": true,
+            "generateUpdate": true,
+            "generateDelete": false,
+            "primaryKeyColumns": [
+                "Id"
+            ]
+        },
+        {
+            "schema": "public",
+            "name": "achi_block",
+            "generateInsert": true,
+            "generateUpdate": false,
+            "generateDelete": false,
+            "primaryKeyColumns": [
+                "Id"
+            ]
+        },
+        {
+            "schema": "public",
+            "name": "achi_code",
+            "generateInsert": true,
+            "generateUpdate": false,
+            "generateDelete": false,
+            "primaryKeyColumns": [
+                "Id"
+            ]
+        },
+        {
+            "schema": "public",
+            "name": "achi_code_embedding",
+            "generateInsert": true,
+            "generateUpdate": true,
+            "generateDelete": false,
+            "primaryKeyColumns": [
+                "Id"
+            ]
+        },
+        {
+            "schema": "public",
+            "name": "user_search_history",
+            "generateInsert": true,
+            "generateUpdate": false,
+            "generateDelete": false,
+            "primaryKeyColumns": [
+                "Id"
+            ]
+        }
+    ],
+    "connectionString": "Host=localhost;Port=5432;Database=icd10;Username=postgres;Password=changeme"
+}
\ No newline at end of file
diff --git a/ICD10/ICD10.Api/DatabaseSetup.cs b/ICD10/ICD10.Api/DatabaseSetup.cs
index e9fc36a..9c37a46 100644
--- a/ICD10/ICD10.Api/DatabaseSetup.cs
+++ b/ICD10/ICD10.Api/DatabaseSetup.cs
@@ -1,5 +1,5 @@
-using Migration;
-using Migration.Postgres;
+using Nimblesite.DataProvider.Migration.Core;
+using Nimblesite.DataProvider.Migration.Postgres;
 using InitError = Outcome.Result<bool, string>.Error<bool, string>;
 using InitOk = Outcome.Result<bool, string>.Ok<bool, string>;
 using InitResult = Outcome.Result<bool, string>;
@@ -50,15 +50,7 @@ public static InitResult Initialize(NpgsqlConnection connection, ILogger logger)
 
         var yamlPath = Path.Combine(AppContext.BaseDirectory, "icd10-schema.yaml");
         var schema = SchemaYamlSerializer.FromYamlFile(yamlPath);
-
-        foreach (var table in schema.Tables)
-        {
-            var ddl = PostgresDdlGenerator.Generate(new CreateTableOperation(table));
-            using var cmd = connection.CreateCommand();
-            cmd.CommandText = ddl;
-            cmd.ExecuteNonQuery();
-            logger.Log(LogLevel.Debug, "Created table {TableName}", table.Name);
-        }
+        PostgresDdlGenerator.MigrateSchema(connection, schema);
 
         // Create vector indexes for fast similarity search
         EnsureVectorIndexes(connection, logger);
@@ -100,7 +92,7 @@ private static void EnsureVectorIndexes(NpgsqlConnection connection, ILogger log
         cmd.CommandText = $"""
             CREATE INDEX IF NOT EXISTS idx_icd10_embedding_vector
             ON icd10_code_embedding
-            USING ivfflat (("embedding"::vector(384)) vector_cosine_ops)
+            USING ivfflat (("Embedding"::vector(384)) vector_cosine_ops)
             WITH (lists = {lists})
             """;
         cmd.ExecuteNonQuery();
@@ -127,7 +119,7 @@ USING ivfflat (("embedding"::vector(384)) vector_cosine_ops)
         cmd.CommandText = $"""
             CREATE INDEX IF NOT EXISTS idx_achi_embedding_vector
             ON achi_code_embedding
-            USING ivfflat (("embedding"::vector(384)) vector_cosine_ops)
+            USING ivfflat (("Embedding"::vector(384)) vector_cosine_ops)
             WITH (lists = {lists})
             """;
         cmd.ExecuteNonQuery();
diff --git a/ICD10/ICD10.Api/GlobalUsings.cs b/ICD10/ICD10.Api/GlobalUsings.cs
index dce7ddf..a97f387 100644
--- a/ICD10/ICD10.Api/GlobalUsings.cs
+++ b/ICD10/ICD10.Api/GlobalUsings.cs
@@ -2,101 +2,156 @@
 global using System.Collections.Immutable;
 global using Generated;
 global using Microsoft.Extensions.Logging;
+global using Nimblesite.Sql.Model;
 global using Npgsql;
 global using Outcome;
 global using GetAchiBlocksError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAchiBlocks>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.GetAchiBlocks>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.GetAchiBlocks>,
+    Nimblesite.Sql.Model.SqlError
+>;
 // GetAchiBlocks query result type aliases
 global using GetAchiBlocksOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAchiBlocks>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetAchiBlocks>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetAchiBlocks>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetAchiCodeByCodeError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAchiCodeByCode>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.GetAchiCodeByCode>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.GetAchiCodeByCode>,
+    Nimblesite.Sql.Model.SqlError
+>;
 // GetAchiCodeByCode query result type aliases
 global using GetAchiCodeByCodeOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAchiCodeByCode>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetAchiCodeByCode>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetAchiCodeByCode>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetAchiCodesByBlockError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAchiCodesByBlock>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.GetAchiCodesByBlock>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 // GetAchiCodesByBlock query result type aliases
 global using GetAchiCodesByBlockOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAchiCodesByBlock>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetAchiCodesByBlock>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetAchiCodesByBlock>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetBlocksByChapterError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetBlocksByChapter>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.GetBlocksByChapter>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.GetBlocksByChapter>,
+    Nimblesite.Sql.Model.SqlError
+>;
 // GetBlocksByChapter query result type aliases
 global using GetBlocksByChapterOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetBlocksByChapter>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetBlocksByChapter>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetBlocksByChapter>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetCategoriesByBlockError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetCategoriesByBlock>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.GetCategoriesByBlock>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 // GetCategoriesByBlock query result type aliases
 global using GetCategoriesByBlockOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetCategoriesByBlock>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetCategoriesByBlock>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetCategoriesByBlock>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetChaptersError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetChapters>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.GetChapters>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.GetChapters>,
+    Nimblesite.Sql.Model.SqlError
+>;
 // GetChapters query result type aliases
 global using GetChaptersOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetChapters>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetChapters>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetChapters>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetCodeByCodeError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetCodeByCode>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.GetCodeByCode>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.GetCodeByCode>,
+    Nimblesite.Sql.Model.SqlError
+>;
 // GetCodeByCode query result type aliases
 global using GetCodeByCodeOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetCodeByCode>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetCodeByCode>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetCodeByCode>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetCodesByCategoryError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetCodesByCategory>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.GetCodesByCategory>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.GetCodesByCategory>,
+    Nimblesite.Sql.Model.SqlError
+>;
 // GetCodesByCategory query result type aliases
 global using GetCodesByCategoryOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetCodesByCategory>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetCodesByCategory>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetCodesByCategory>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using SearchAchiCodesError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.SearchAchiCodes>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.SearchAchiCodes>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.SearchAchiCodes>,
+    Nimblesite.Sql.Model.SqlError
+>;
 // SearchAchiCodes query result type aliases
 global using SearchAchiCodesOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.SearchAchiCodes>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.SearchAchiCodes>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.SearchAchiCodes>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using SearchIcd10CodesError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.SearchIcd10Codes>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.SearchIcd10Codes>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.SearchIcd10Codes>,
+    Nimblesite.Sql.Model.SqlError
+>;
 // SearchIcd10Codes query result type aliases
 global using SearchIcd10CodesOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.SearchIcd10Codes>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.SearchIcd10Codes>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.SearchIcd10Codes>,
+    Nimblesite.Sql.Model.SqlError
+>;
diff --git a/ICD10/ICD10.Api/ICD10.Api.csproj b/ICD10/ICD10.Api/ICD10.Api.csproj
index a8cfcf3..80b55e6 100644
--- a/ICD10/ICD10.Api/ICD10.Api.csproj
+++ b/ICD10/ICD10.Api/ICD10.Api.csproj
@@ -1,7 +1,7 @@
 <Project Sdk="Microsoft.NET.Sdk.Web">
     <PropertyGroup>
         <OutputType>Exe</OutputType>
-        <NoWarn>CA1515;CA2100;RS1035;CA1508;CA2234</NoWarn>
+        <NoWarn>$(NoWarn);CA1515;CA2100;RS1035;CA1508;CA2234;CS1591</NoWarn>
         <EnableLqlTranspile>true</EnableLqlTranspile>
     </PropertyGroup>
 
@@ -12,11 +12,16 @@
 
     <ItemGroup>
         <PackageReference Include="Npgsql" Version="9.0.2" />
-        <PackageReference Include="MelbourneDev.DataProvider" Version="0.1.0" />
-        <PackageReference Include="MelbourneDev.Lql.Postgres" Version="0.1.0" />
-        <PackageReference Include="MelbourneDev.Selecta" Version="0.1.0" />
-        <PackageReference Include="MelbourneDev.Migration" Version="0.1.0" />
-        <PackageReference Include="MelbourneDev.Migration.Postgres" Version="0.1.0" />
+        <PackageReference Include="Nimblesite.DataProvider.Core" Version="$(DataProviderVersion)" />
+        <PackageReference Include="Nimblesite.Lql.Postgres" Version="$(DataProviderVersion)" />
+        <PackageReference
+            Include="Nimblesite.DataProvider.Migration.Core"
+            Version="$(DataProviderVersion)"
+        />
+        <PackageReference
+            Include="Nimblesite.DataProvider.Migration.Postgres"
+            Version="$(DataProviderVersion)"
+        />
     </ItemGroup>
 
     <ItemGroup>
@@ -30,25 +35,12 @@
         <Content Include="icd10-schema.yaml">
            <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
         </Content>
-        <!-- SQLite database for local testing -->
-        <Content Include="icd10.db" Condition="Exists('icd10.db')">
-            <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
-        </Content>
     </ItemGroup>
 
-    <!-- Create database from YAML using Migration.Cli (installed as dotnet tool) -->
-    <Target Name="CreateDatabaseSchema" BeforeTargets="TranspileLqlAndGenerateDataProvider">
-        <Exec
-            Command="dotnet migration-cli -- --schema "$(MSBuildProjectDirectory)/icd10-schema.yaml" --output "$(MSBuildProjectDirectory)/icd10.db" --provider sqlite"
-            WorkingDirectory="$(MSBuildProjectDirectory)"
-            StandardOutputImportance="High"
-            StandardErrorImportance="High"
-        />
-    </Target>
-
-    <!-- Pre-compile: transpile LQL to SQL, then generate C# from SQL using CLI tools -->
+    <!-- Pre-compile: transpile LQL to SQL, then generate C# via `dotnet DataProvider postgres`.
+         Requires a live Postgres with the icd10 schema migrated (see `make db-migrate`). -->
     <Target
-        Name="TranspileLqlAndGenerateDataProvider"
+        Name="GenerateDataProvider"
         BeforeTargets="BeforeCompile;CoreCompile"
         Inputs="$(MSBuildProjectDirectory)/DataProvider.json;@(AdditionalFiles);@(LqlFiles)"
         Outputs="$(MSBuildProjectDirectory)/Generated/.timestamp"
@@ -60,19 +52,17 @@
         </ItemGroup>
         <Message Importance="High" Text="Transpiling LQL files (@(LqlFiles))" />
         <Exec
-            Command="dotnet lqlcli-sqlite -- --input "%(LqlFiles.Identity)" --output "%(LqlFiles.RootDir)%(LqlFiles.Directory)%(LqlFiles.Filename).generated.sql""
+            Command="dotnet Lql postgres --input "%(LqlFiles.Identity)" --output "%(LqlFiles.RootDir)%(LqlFiles.Directory)%(LqlFiles.Filename).generated.sql""
             Condition="'$(EnableLqlTranspile)' == 'true' and @(LqlFiles) != ''"
             WorkingDirectory="$(MSBuildProjectDirectory)"
             StandardOutputImportance="High"
             StandardErrorImportance="High"
-            ContinueOnError="WarnAndContinue"
         />
         <Exec
-            Command="dotnet dataprovider-sqlite-cli -- --project-dir "$(MSBuildProjectDirectory)" --config "$(MSBuildProjectDirectory)/DataProvider.json" --out "$(MSBuildProjectDirectory)/Generated" --connection-type NpgsqlConnection"
+            Command="dotnet DataProvider postgres --project-dir "$(MSBuildProjectDirectory)" --config "$(MSBuildProjectDirectory)/DataProvider.json" --out "$(MSBuildProjectDirectory)/Generated""
             WorkingDirectory="$(MSBuildProjectDirectory)"
             StandardOutputImportance="High"
            StandardErrorImportance="High"
-            IgnoreExitCode="true"
         />
         <Touch Files="$(MSBuildProjectDirectory)/Generated/.timestamp" AlwaysCreate="true" />
         <ItemGroup>
diff --git a/ICD10/ICD10.Api/Program.cs b/ICD10/ICD10.Api/Program.cs
index 8f51fc7..3e984bd 100644
--- a/ICD10/ICD10.Api/Program.cs
+++ b/ICD10/ICD10.Api/Program.cs
@@ -315,12 +315,12 @@ IHttpClientFactory httpClientFactory
     await using (icdCmd.ConfigureAwait(false))
     {
         icdCmd.CommandText = """
-            SELECT c."code", c."shortdescription", c."longdescription",
-                   c."inclusionterms", c."exclusionterms", c."codealso", c."codefirst",
-                   1 - (e."embedding"::vector <=> @queryVector::vector) as similarity
+            SELECT c."Code", c."ShortDescription", c."LongDescription",
+                   c."InclusionTerms", c."ExclusionTerms", c."CodeAlso", c."CodeFirst",
+                   1 - (e."Embedding"::vector <=> @queryVector::vector) as similarity
             FROM icd10_code c
-            JOIN icd10_code_embedding e ON c."id" = e."codeid"
-            ORDER BY e."embedding"::vector <=> @queryVector::vector
+            JOIN icd10_code_embedding e ON c."Id" = e."CodeId"
+            ORDER BY e."Embedding"::vector <=> @queryVector::vector
             LIMIT @limit
             """;
         icdCmd.Parameters.AddWithValue("@queryVector", vectorString);
@@ -370,11 +370,11 @@ LIMIT @limit
     await using (achiCmd.ConfigureAwait(false))
     {
         achiCmd.CommandText = """
-            SELECT c."code", c."shortdescription", c."longdescription",
-                   1 - (e."embedding"::vector <=> @queryVector::vector) as similarity
+            SELECT c."Code", c."ShortDescription", c."LongDescription",
+                   1 - (e."Embedding"::vector <=> @queryVector::vector) as similarity
             FROM achi_code c
-            JOIN achi_code_embedding e ON c."id" = e."codeid"
-            ORDER BY e."embedding"::vector <=> @queryVector::vector
+            JOIN achi_code_embedding e ON c."Id" = e."CodeId"
+            ORDER BY e."Embedding"::vector <=> @queryVector::vector
             LIMIT @limit
             """;
         achiCmd.Parameters.AddWithValue("@queryVector", vectorString);
@@ -484,23 +484,21 @@ LIMIT @limit
 // ============================================================================
 // HELPER METHODS
 // ============================================================================
-/// <summary>
-/// Enriches a code record with derived hierarchy info when DB values are null.
-/// Uses Icd10Chapters to derive chapter/category from code prefix.
-/// </summary>
+// Enriches a code record with derived hierarchy info when DB values are null.
+// Uses Icd10Chapters to derive chapter/category from code prefix.
 static GetCodeByCode EnrichCodeWithDerivedHierarchy(GetCodeByCode code)
 {
     var (chapterNum, chapterTitle) = string.IsNullOrEmpty(code.ChapterNumber)
-        ? Icd10Chapters.GetChapter(code.Code)
+        ? Icd10Chapters.GetChapter(code.Code ?? string.Empty)
         : (code.ChapterNumber, code.ChapterTitle ?? "");
 
     var categoryCode = string.IsNullOrEmpty(code.CategoryCode)
-        ? Icd10Chapters.GetCategory(code.Code)
+        ? Icd10Chapters.GetCategory(code.Code ?? string.Empty)
         : code.CategoryCode;
 
     // Derive block from category when not in DB - use category code as pseudo-block
     var (blockCode, blockTitle) = string.IsNullOrEmpty(code.BlockCode)
-        ? Icd10Chapters.GetBlock(code.Code)
+        ? Icd10Chapters.GetBlock(code.Code ?? string.Empty)
         : (code.BlockCode, code.BlockTitle ?? "");
 
     return code with
@@ -548,9 +546,7 @@ static object ToFhirProcedure(GetAchiCodeByCode code) =>
         Property = new[] { new { Code = "block", ValueString = code.BlockNumber } },
     };
-/// <summary>
-/// Enriches search result with derived hierarchy when DB values are null.
-/// </summary>
+// Enriches search result with derived hierarchy when DB values are null.
 static object EnrichSearchResult(SearchIcd10Codes code)
 {
     var codeValue = code.Code ?? "";
diff --git a/ICD10/ICD10.Api/Queries/SearchAchiCodes.sql b/ICD10/ICD10.Api/Queries/SearchAchiCodes.sql
index c60dbed..4c09bb1 100644
--- a/ICD10/ICD10.Api/Queries/SearchAchiCodes.sql
+++ b/ICD10/ICD10.Api/Queries/SearchAchiCodes.sql
@@ -1,5 +1,5 @@
-SELECT Id, BlockId, Code, ShortDescription, LongDescription, Billable
+SELECT "Id", "BlockId", "Code", "ShortDescription", "LongDescription", "Billable"
 FROM achi_code
-WHERE Code ILIKE @term OR ShortDescription ILIKE @term OR LongDescription ILIKE @term
-ORDER BY Code
+WHERE "Code" ILIKE @term OR "ShortDescription" ILIKE @term OR "LongDescription" ILIKE @term
+ORDER BY "Code"
 LIMIT @limit
diff --git a/ICD10/ICD10.Api/Queries/SearchIcd10Codes.sql b/ICD10/ICD10.Api/Queries/SearchIcd10Codes.sql
index 214060c..386a6fb 100644
--- a/ICD10/ICD10.Api/Queries/SearchIcd10Codes.sql
+++ b/ICD10/ICD10.Api/Queries/SearchIcd10Codes.sql
@@ -1,12 +1,12 @@
-SELECT c.Id, c.Code, c.ShortDescription, c.LongDescription, c.Billable,
-       cat.CategoryCode, cat.Title AS CategoryTitle,
-       b.BlockCode, b.Title AS BlockTitle,
-       ch.ChapterNumber, ch.Title AS ChapterTitle,
-       c.InclusionTerms, c.ExclusionTerms, c.CodeAlso, c.CodeFirst, c.Synonyms, c.Edition
+SELECT c."Id", c."Code", c."ShortDescription", c."LongDescription", c."Billable",
+       cat."CategoryCode", cat."Title" AS "CategoryTitle",
+       b."BlockCode", b."Title" AS "BlockTitle",
+       ch."ChapterNumber", ch."Title" AS "ChapterTitle",
+       c."InclusionTerms", c."ExclusionTerms", c."CodeAlso", c."CodeFirst", c."Synonyms", c."Edition"
 FROM icd10_code c
-LEFT JOIN icd10_category cat ON c.CategoryId = cat.Id
-LEFT JOIN icd10_block b ON cat.BlockId = b.Id
-LEFT JOIN icd10_chapter ch ON b.ChapterId = ch.Id
-WHERE c.Code ILIKE @term OR c.ShortDescription ILIKE @term OR c.LongDescription ILIKE @term
-ORDER BY c.Code
+LEFT JOIN icd10_category cat ON c."CategoryId" = cat."Id"
+LEFT JOIN icd10_block b ON cat."BlockId" = b."Id"
+LEFT JOIN icd10_chapter ch ON b."ChapterId" = ch."Id"
+WHERE c."Code" ILIKE @term OR c."ShortDescription" ILIKE @term OR c."LongDescription" ILIKE @term
+ORDER BY c."Code"
 LIMIT @limit
diff --git a/ICD10/ICD10.TestSupport/ICD10.TestSupport.csproj b/ICD10/ICD10.TestSupport/ICD10.TestSupport.csproj
new file mode 100644
index 0000000..fe163ee
--- /dev/null
+++ b/ICD10/ICD10.TestSupport/ICD10.TestSupport.csproj
@@ -0,0 +1,19 @@
+<Project Sdk="Microsoft.NET.Sdk">
+    <PropertyGroup>
+        <RootNamespace>ICD10.TestSupport</RootNamespace>
+        <NoWarn>CA1707;CA1062;CA1515;CA2100;CA1812;CA1849</NoWarn>
+    </PropertyGroup>
+
+    <ItemGroup>
+        <PackageReference Include="Npgsql" Version="9.0.2" />
+        <PackageReference
+            Include="Nimblesite.DataProvider.Migration.Core"
+            Version="$(DataProviderVersion)"
+        />
+    </ItemGroup>
+
+    <ItemGroup>
+        <ProjectReference Include="..\..\Shared\Authorization\Authorization.csproj" />
+        <ProjectReference Include="..\ICD10.Api\ICD10.Api.csproj" />
+    </ItemGroup>
+</Project>
diff --git a/ICD10/ICD10.TestSupport/Icd10TestDatabase.cs b/ICD10/ICD10.TestSupport/Icd10TestDatabase.cs
new file mode 100644
index 0000000..f0497e3
--- /dev/null
+++ b/ICD10/ICD10.TestSupport/Icd10TestDatabase.cs
@@ -0,0 +1,45 @@
+using Nimblesite.DataProvider.Migration.Core;
+using Nimblesite.DataProvider.Migration.Postgres;
+using Npgsql;
+
+namespace ICD10.TestSupport;
+
+/// <summary>
+/// Helpers to provision an ICD-10 test database (schema + seed data) without
+/// running the heavyweight Python CDC import (~3 minutes for 44k embeddings).
+/// </summary>
+public static class Icd10TestDatabase
+{
+    /// <summary>
+    /// Enables pgvector, applies the icd10-schema.yaml schema via the Migration
+    /// library, then seeds reference data and (if the embedding service at
+    /// http://localhost:8000 is available) embeddings.
+    /// </summary>
+    /// <param name="connectionString">Connection string to a fresh database.</param>
+    /// <param name="schemaYamlPath">Absolute path to icd10-schema.yaml.</param>
+    public static void Initialize(string connectionString, string schemaYamlPath)
+    {
+        if (!File.Exists(schemaYamlPath))
+        {
+            throw new FileNotFoundException(
+                $"icd10-schema.yaml not found at '{schemaYamlPath}'",
+                schemaYamlPath
+            );
+        }
+
+        using var conn = new NpgsqlConnection(connectionString);
+        conn.Open();
+
+        using (var cmd = conn.CreateCommand())
+        {
+            cmd.CommandText = "CREATE EXTENSION IF NOT EXISTS vector";
+            cmd.ExecuteNonQuery();
+        }
+
+        var schema = SchemaYamlSerializer.FromYamlFile(schemaYamlPath);
+        PostgresDdlGenerator.MigrateSchema(conn, schema);
+
+        TestDataSeeder.Seed(conn);
+        TestDataSeeder.SeedEmbeddings(conn);
+    }
+}
diff --git a/ICD10/ICD10.Api.Tests/TestDataSeeder.cs b/ICD10/ICD10.TestSupport/TestDataSeeder.cs
similarity index 59%
rename from ICD10/ICD10.Api.Tests/TestDataSeeder.cs
rename to ICD10/ICD10.TestSupport/TestDataSeeder.cs
index 35949a7..cb89e23 100644
--- a/ICD10/ICD10.Api.Tests/TestDataSeeder.cs
+++ b/ICD10/ICD10.TestSupport/TestDataSeeder.cs
@@ -1,28 +1,42 @@
+using System.Net.Http.Json;
+using System.Text.Json;
+using Generated;
+using Nimblesite.Sql.Model;
 using Npgsql;
+using Outcome;
 
-namespace ICD10.Api.Tests;
+namespace ICD10.TestSupport;
 
 /// <summary>
-/// Seeds ICD-10 reference data into a PostgreSQL test database.
-/// All column names are lowercase to match PostgresDdlGenerator output.
+/// Seeds ICD-10 reference data into a PostgreSQL test database via the
+/// generated DataProvider Insert extension methods.
 /// </summary>
-internal static class TestDataSeeder
+public static class TestDataSeeder
 {
-    internal static void Seed(NpgsqlConnection conn)
+    private const string IcdEmbeddingModel = "MedEmbed-Small-v0.1";
+
+    /// <summary>
+    /// Seeds chapters, blocks, categories, codes, ACHI blocks and ACHI codes
+    /// required by both API and Dashboard E2E tests.
+    /// </summary>
+    public static void Seed(NpgsqlConnection conn)
     {
-        SeedChapters(conn);
-        SeedBlocks(conn);
-        SeedCategories(conn);
-        SeedCodes(conn);
-        SeedAchiBlocks(conn);
-        SeedAchiCodes(conn);
+        SeedChaptersAsync(conn).GetAwaiter().GetResult();
+        SeedBlocksAsync(conn).GetAwaiter().GetResult();
+        SeedCategoriesAsync(conn).GetAwaiter().GetResult();
+        SeedCodesAsync(conn).GetAwaiter().GetResult();
+        SeedAchiBlocksAsync(conn).GetAwaiter().GetResult();
+        SeedAchiCodesAsync(conn).GetAwaiter().GetResult();
     }
 
     /// <summary>
     /// Seeds embeddings by calling the embedding service at localhost:8000.
     /// If the service is unavailable, silently returns (search tests will fail via skip check).
     /// </summary>
-    internal static void SeedEmbeddings(NpgsqlConnection conn)
+    public static void SeedEmbeddings(NpgsqlConnection conn) =>
+        SeedEmbeddingsAsync(conn).GetAwaiter().GetResult();
+
+    private static async Task SeedEmbeddingsAsync(NpgsqlConnection conn)
     {
         var icdItems = new (string EmbId, string CodeId, string Text)[]
         {
@@ -110,105 +124,89 @@ internal static void SeedEmbeddings(NpgsqlConnection conn)
             ),
         };
 
+        List<List<float>>? embeddings;
         try
         {
-            using var client = new HttpClient { Timeout = TimeSpan.FromSeconds(60) };
-
-            var healthCheck = client
-                .GetAsync("http://localhost:8000/health")
-                .GetAwaiter()
-                .GetResult();
-            if (!healthCheck.IsSuccessStatusCode)
-                return;
-
-            var allTexts = icdItems
-                .Select(t => t.Text)
-                .Concat(achiItems.Select(t => t.Text))
-                .ToList();
-
-            var batchResponse = client
-                .PostAsJsonAsync("http://localhost:8000/embed/batch", new { texts = allTexts })
-                .GetAwaiter()
-                .GetResult();
-
-            if (!batchResponse.IsSuccessStatusCode)
-                return;
-
-            var jsonOptions = new JsonSerializerOptions { PropertyNameCaseInsensitive = true };
-            var batchResult = batchResponse
-                .Content.ReadFromJsonAsync<BatchEmbeddingResponse>(jsonOptions)
-                .GetAwaiter()
-                .GetResult();
-
-            if (batchResult is null || batchResult.Embeddings.Count != allTexts.Count)
-                return;
-
-            InsertEmbeddings(
-                conn: conn,
-                table: "icd10_code_embedding",
-                items: icdItems,
-                embeddings: batchResult.Embeddings,
-                offset: 0
+            embeddings = await FetchEmbeddingsAsync(
+                icdItems.Select(t => t.Text).Concat(achiItems.Select(t => t.Text)).ToList()
             );
+        }
+        catch (HttpRequestException)
+        {
+            return;
+        }
+        catch (TaskCanceledException)
+        {
+            return;
+        }
+
+        if (embeddings is null || embeddings.Count != icdItems.Length + achiItems.Length)
+            return;
 
-            InsertEmbeddings(
-                conn: conn,
-                table: "achi_code_embedding",
-                items: achiItems,
-                embeddings: batchResult.Embeddings,
-                offset: icdItems.Length
+        for (var i = 0; i < icdItems.Length; i++)
+        {
+            var (embId, codeId, _) = icdItems[i];
+            EnsureInserted(
+                await conn.Inserticd10_code_embeddingAsync(
+                    Id: embId,
+                    CodeId: codeId,
+                    Embedding: SerializeVector(embeddings[i]),
+                    EmbeddingModel: IcdEmbeddingModel,
+                    LastUpdated: null
+                ),
+                "icd10_code_embedding",
+                embId
             );
         }
-        catch
+
+        for (var i = 0; i < achiItems.Length; i++)
         {
-            // Embedding service unavailable - search tests will be skipped
+            var (embId, codeId, _) = achiItems[i];
+            EnsureInserted(
+                await conn.Insertachi_code_embeddingAsync(
+                    Id: embId,
+                    CodeId: codeId,
+                    Embedding: SerializeVector(embeddings[icdItems.Length + i]),
+                    EmbeddingModel: IcdEmbeddingModel,
+                    LastUpdated: null
+                ),
+                "achi_code_embedding",
+                embId
+            );
         }
     }
 
-    private static void InsertEmbeddings(
-        NpgsqlConnection conn,
-        string table,
-        (string EmbId, string CodeId, string Text)[] items,
-        List<List<float>> embeddings,
-        int offset
-    )
+    private static async Task<List<List<float>>?> FetchEmbeddingsAsync(List<string> texts)
     {
-        using var cmd = conn.CreateCommand();
-        cmd.CommandText = $"""
-            INSERT INTO "public"."{table}" ("id", "codeid", "embedding", "embeddingmodel")
-            VALUES (@id, @codeid, @embedding, @model)
-            """;
+        using var client = new HttpClient { Timeout = TimeSpan.FromSeconds(60) };
 
-        var pId = cmd.Parameters.Add(new NpgsqlParameter("@id", NpgsqlTypes.NpgsqlDbType.Text));
-        var pCodeId = cmd.Parameters.Add(
-            new NpgsqlParameter("@codeid", NpgsqlTypes.NpgsqlDbType.Text)
-        );
-        var pEmbedding = cmd.Parameters.Add(
-            new NpgsqlParameter("@embedding", NpgsqlTypes.NpgsqlDbType.Text)
-        );
-        var pModel = cmd.Parameters.Add(
-            new NpgsqlParameter("@model", NpgsqlTypes.NpgsqlDbType.Text)
+        var healthCheck = await client.GetAsync(new Uri("http://localhost:8000/health"));
+        if (!healthCheck.IsSuccessStatusCode)
+            return null;
+
+        var batchResponse = await client.PostAsJsonAsync(
+            new Uri("http://localhost:8000/embed/batch"),
+            new { texts }
         );
-        cmd.Prepare();
+        if (!batchResponse.IsSuccessStatusCode)
+            return null;
 
-        for (var i = 0; i < items.Length; i++)
-        {
-            pId.Value = items[i].EmbId;
-            pCodeId.Value = items[i].CodeId;
-            pEmbedding.Value =
-                "["
-                + string.Join(
-                    ",",
-                    embeddings[offset + i]
-                        .Select(f => f.ToString(System.Globalization.CultureInfo.InvariantCulture))
-                )
-                + "]";
-            pModel.Value = "MedEmbed-Small-v0.1";
-            cmd.ExecuteNonQuery();
-        }
+        var jsonOptions = new JsonSerializerOptions { PropertyNameCaseInsensitive = true };
+        var batchResult = await batchResponse.Content.ReadFromJsonAsync<BatchEmbeddingResponse>(
+            jsonOptions
+        );
+        return batchResult?.Embeddings;
     }
 
+    private static string SerializeVector(List<float> values) =>
+        "["
+        + string.Join(
+            ",",
+            values.Select(f => f.ToString(System.Globalization.CultureInfo.InvariantCulture))
+        )
+        + "]";
+
     private sealed record BatchEmbeddingResponse(
         List<List<float>> Embeddings,
         string Model,
@@ -216,9 +214,8 @@ private sealed record BatchEmbeddingResponse(
         int Count
     );
 
-    private static void SeedChapters(NpgsqlConnection conn)
+    private static async Task SeedChaptersAsync(NpgsqlConnection conn)
     {
-        // All 21 ICD-10-CM chapters with numeric chapter numbers
        var chapters = new (string Id, string Number, string Title, string Start, string End)[]
        {
             ("ch-01", "1", "Certain infectious and parasitic diseases", "A00", "B99"),
@@ -268,36 +265,25 @@ private static void SeedChapters(NpgsqlConnection conn)
             ),
         };
 
-        using var cmd = conn.CreateCommand();
-        cmd.CommandText = """
-            INSERT INTO "public"."icd10_chapter" ("id", "chapternumber", "title", "coderangestart", "coderangeend")
-            VALUES (@id, @num, @title, @start, @end)
-            """;
-
-        var pId = cmd.Parameters.Add(new NpgsqlParameter("@id", NpgsqlTypes.NpgsqlDbType.Text));
-        var pNum = cmd.Parameters.Add(new NpgsqlParameter("@num", NpgsqlTypes.NpgsqlDbType.Text));
-        var pTitle = cmd.Parameters.Add(
-            new NpgsqlParameter("@title", NpgsqlTypes.NpgsqlDbType.Text)
-        );
-        var pStart = cmd.Parameters.Add(
-            new NpgsqlParameter("@start", NpgsqlTypes.NpgsqlDbType.Text)
-        );
-        var pEnd = cmd.Parameters.Add(new NpgsqlParameter("@end", NpgsqlTypes.NpgsqlDbType.Text));
-
-        cmd.Prepare();
-
         foreach (var (id, number, title, start, end) in chapters)
         {
-            pId.Value = id;
-            pNum.Value = number;
-            pTitle.Value = title;
-            pStart.Value = start;
-            pEnd.Value = end;
-            cmd.ExecuteNonQuery();
+            EnsureInserted(
+                await conn.Inserticd10_chapterAsync(
+                    Id: id,
+                    ChapterNumber: number,
+                    Title: title,
+                    CodeRangeStart: start,
+                    CodeRangeEnd: end,
LastUpdated: null, + VersionId: null + ), + "icd10_chapter", + id + ); } } - private static void SeedBlocks(NpgsqlConnection conn) + private static async Task SeedBlocksAsync(NpgsqlConnection conn) { var blocks = new ( string Id, @@ -337,38 +323,26 @@ string End ("blk-s70-s79", "ch-19", "S70-S79", "Injuries to the hip and thigh", "S70", "S79"), }; - using var cmd = conn.CreateCommand(); - cmd.CommandText = """ - INSERT INTO "public"."icd10_block" ("id", "chapterid", "blockcode", "title", "coderangestart", "coderangeend") - VALUES (@id, @chid, @code, @title, @start, @end) - """; - - var pId = cmd.Parameters.Add(new NpgsqlParameter("@id", NpgsqlTypes.NpgsqlDbType.Text)); - var pChId = cmd.Parameters.Add(new NpgsqlParameter("@chid", NpgsqlTypes.NpgsqlDbType.Text)); - var pCode = cmd.Parameters.Add(new NpgsqlParameter("@code", NpgsqlTypes.NpgsqlDbType.Text)); - var pTitle = cmd.Parameters.Add( - new NpgsqlParameter("@title", NpgsqlTypes.NpgsqlDbType.Text) - ); - var pStart = cmd.Parameters.Add( - new NpgsqlParameter("@start", NpgsqlTypes.NpgsqlDbType.Text) - ); - var pEnd = cmd.Parameters.Add(new NpgsqlParameter("@end", NpgsqlTypes.NpgsqlDbType.Text)); - - cmd.Prepare(); - foreach (var (id, chapterId, blockCode, title, start, end) in blocks) { - pId.Value = id; - pChId.Value = chapterId; - pCode.Value = blockCode; - pTitle.Value = title; - pStart.Value = start; - pEnd.Value = end; - cmd.ExecuteNonQuery(); + EnsureInserted( + await conn.Inserticd10_blockAsync( + Id: id, + ChapterId: chapterId, + BlockCode: blockCode, + Title: title, + CodeRangeStart: start, + CodeRangeEnd: end, + LastUpdated: null, + VersionId: null + ), + "icd10_block", + id + ); } } - private static void SeedCategories(NpgsqlConnection conn) + private static async Task SeedCategoriesAsync(NpgsqlConnection conn) { var categories = new (string Id, string BlockId, string CategoryCode, string Title)[] { @@ -394,34 +368,25 @@ private static void SeedCategories(NpgsqlConnection conn) ("cat-s72", 
"blk-s70-s79", "S72", "Fracture of femur"), }; - using var cmd = conn.CreateCommand(); - cmd.CommandText = """ - INSERT INTO "public"."icd10_category" ("id", "blockid", "categorycode", "title") - VALUES (@id, @bid, @code, @title) - """; - - var pId = cmd.Parameters.Add(new NpgsqlParameter("@id", NpgsqlTypes.NpgsqlDbType.Text)); - var pBid = cmd.Parameters.Add(new NpgsqlParameter("@bid", NpgsqlTypes.NpgsqlDbType.Text)); - var pCode = cmd.Parameters.Add(new NpgsqlParameter("@code", NpgsqlTypes.NpgsqlDbType.Text)); - var pTitle = cmd.Parameters.Add( - new NpgsqlParameter("@title", NpgsqlTypes.NpgsqlDbType.Text) - ); - - cmd.Prepare(); - foreach (var (id, blockId, categoryCode, title) in categories) { - pId.Value = id; - pBid.Value = blockId; - pCode.Value = categoryCode; - pTitle.Value = title; - cmd.ExecuteNonQuery(); + EnsureInserted( + await conn.Inserticd10_categoryAsync( + Id: id, + BlockId: blockId, + CategoryCode: categoryCode, + Title: title, + LastUpdated: null, + VersionId: null + ), + "icd10_category", + id + ); } } - private static void SeedCodes(NpgsqlConnection conn) + private static async Task SeedCodesAsync(NpgsqlConnection conn) { - // All codes required by tests (Id, CategoryId, Code, Short, Long, Synonyms) var codes = new ( string Id, string CategoryId, @@ -455,6 +420,30 @@ string Synonyms "Type 2 diabetes mellitus without complications", "adult-onset diabetes; non-insulin-dependent diabetes" ), + ( + "code-e11-0", + "cat-e11", + "E11.0", + "Type 2 diabetes mellitus with hyperosmolarity", + "Type 2 diabetes mellitus with hyperosmolarity", + "" + ), + ( + "code-e11-21", + "cat-e11", + "E11.21", + "Type 2 diabetes mellitus with diabetic nephropathy", + "Type 2 diabetes mellitus with diabetic nephropathy", + "type 2 diabetes with kidney complications" + ), + ( + "code-e11-65", + "cat-e11", + "E11.65", + "Type 2 diabetes mellitus with hyperglycemia", + "Type 2 diabetes mellitus with hyperglycemia", + "" + ), ( "code-g43-909", "cat-g43", @@ -537,7 +526,6 
@@ string Synonyms "" ), ("code-r07-89", "cat-r07", "R07.89", "Other chest pain", "Other chest pain", ""), - // Additional codes for search tests ( "code-a00-1", "cat-a00", @@ -573,45 +561,34 @@ string Synonyms ), }; - using var cmd = conn.CreateCommand(); - cmd.CommandText = """ - INSERT INTO "public"."icd10_code" - ("id", "categoryid", "code", "shortdescription", "longdescription", - "inclusionterms", "exclusionterms", "codealso", "codefirst", "synonyms", - "billable", "effectivefrom", "effectiveto", "edition") - VALUES (@id, @catid, @code, @short, @long, - '', '', '', '', @synonyms, - 1, '2025-07-01', '', '2025') - """; - - var pId = cmd.Parameters.Add(new NpgsqlParameter("@id", NpgsqlTypes.NpgsqlDbType.Text)); - var pCatId = cmd.Parameters.Add( - new NpgsqlParameter("@catid", NpgsqlTypes.NpgsqlDbType.Text) - ); - var pCode = cmd.Parameters.Add(new NpgsqlParameter("@code", NpgsqlTypes.NpgsqlDbType.Text)); - var pShort = cmd.Parameters.Add( - new NpgsqlParameter("@short", NpgsqlTypes.NpgsqlDbType.Text) - ); - var pLong = cmd.Parameters.Add(new NpgsqlParameter("@long", NpgsqlTypes.NpgsqlDbType.Text)); - var pSynonyms = cmd.Parameters.Add( - new NpgsqlParameter("@synonyms", NpgsqlTypes.NpgsqlDbType.Text) - ); - - cmd.Prepare(); - foreach (var (id, categoryId, code, shortDesc, longDesc, synonyms) in codes) { - pId.Value = id; - pCatId.Value = categoryId; - pCode.Value = code; - pShort.Value = shortDesc; - pLong.Value = longDesc; - pSynonyms.Value = synonyms; - cmd.ExecuteNonQuery(); + EnsureInserted( + await conn.Inserticd10_codeAsync( + Id: id, + CategoryId: categoryId, + Code: code, + ShortDescription: shortDesc, + LongDescription: longDesc, + InclusionTerms: "", + ExclusionTerms: "", + CodeAlso: "", + CodeFirst: "", + Synonyms: synonyms, + Billable: 1, + EffectiveFrom: "2025-07-01", + EffectiveTo: "", + Edition: "2025", + LastUpdated: null, + VersionId: null + ), + "icd10_code", + id + ); } } - private static void SeedAchiBlocks(NpgsqlConnection conn) + private 
static async Task SeedAchiBlocksAsync(NpgsqlConnection conn) { var blocks = new (string Id, string BlockNumber, string Title, string Start, string End)[] { @@ -626,36 +603,25 @@ private static void SeedAchiBlocks(NpgsqlConnection conn) ), }; - using var cmd = conn.CreateCommand(); - cmd.CommandText = """ - INSERT INTO "public"."achi_block" ("id", "blocknumber", "title", "coderangestart", "coderangeend") - VALUES (@id, @num, @title, @start, @end) - """; - - var pId = cmd.Parameters.Add(new NpgsqlParameter("@id", NpgsqlTypes.NpgsqlDbType.Text)); - var pNum = cmd.Parameters.Add(new NpgsqlParameter("@num", NpgsqlTypes.NpgsqlDbType.Text)); - var pTitle = cmd.Parameters.Add( - new NpgsqlParameter("@title", NpgsqlTypes.NpgsqlDbType.Text) - ); - var pStart = cmd.Parameters.Add( - new NpgsqlParameter("@start", NpgsqlTypes.NpgsqlDbType.Text) - ); - var pEnd = cmd.Parameters.Add(new NpgsqlParameter("@end", NpgsqlTypes.NpgsqlDbType.Text)); - - cmd.Prepare(); - foreach (var (id, number, title, start, end) in blocks) { - pId.Value = id; - pNum.Value = number; - pTitle.Value = title; - pStart.Value = start; - pEnd.Value = end; - cmd.ExecuteNonQuery(); + EnsureInserted( + await conn.Insertachi_blockAsync( + Id: id, + BlockNumber: number, + Title: title, + CodeRangeStart: start, + CodeRangeEnd: end, + LastUpdated: null, + VersionId: null + ), + "achi_block", + id + ); } } - private static void SeedAchiCodes(NpgsqlConnection conn) + private static async Task SeedAchiCodesAsync(NpgsqlConnection conn) { var codes = new (string Id, string BlockId, string Code, string Short, string Long)[] { @@ -677,33 +643,35 @@ private static void SeedAchiCodes(NpgsqlConnection conn) ("achi-30571-00", "achi-blk-3", "30571-00", "Cholecystectomy", "Cholecystectomy"), }; - using var cmd = conn.CreateCommand(); - cmd.CommandText = """ - INSERT INTO "public"."achi_code" - ("id", "blockid", "code", "shortdescription", "longdescription", - "billable", "effectivefrom", "effectiveto", "edition") - VALUES (@id, 
@bid, @code, @short, @long, - 1, '2025-07-01', '', '13') - """; - - var pId = cmd.Parameters.Add(new NpgsqlParameter("@id", NpgsqlTypes.NpgsqlDbType.Text)); - var pBid = cmd.Parameters.Add(new NpgsqlParameter("@bid", NpgsqlTypes.NpgsqlDbType.Text)); - var pCode = cmd.Parameters.Add(new NpgsqlParameter("@code", NpgsqlTypes.NpgsqlDbType.Text)); - var pShort = cmd.Parameters.Add( - new NpgsqlParameter("@short", NpgsqlTypes.NpgsqlDbType.Text) - ); - var pLong = cmd.Parameters.Add(new NpgsqlParameter("@long", NpgsqlTypes.NpgsqlDbType.Text)); - - cmd.Prepare(); - foreach (var (id, blockId, code, shortDesc, longDesc) in codes) { - pId.Value = id; - pBid.Value = blockId; - pCode.Value = code; - pShort.Value = shortDesc; - pLong.Value = longDesc; - cmd.ExecuteNonQuery(); + EnsureInserted( + await conn.Insertachi_codeAsync( + Id: id, + BlockId: blockId, + Code: code, + ShortDescription: shortDesc, + LongDescription: longDesc, + Billable: 1, + EffectiveFrom: "2025-07-01", + EffectiveTo: "", + Edition: "13", + LastUpdated: null, + VersionId: null + ), + "achi_code", + id + ); + } + } + + private static void EnsureInserted(Result<Guid?, SqlError> result, string table, string id) + { + if (result is Result<Guid?, SqlError>.Error<Guid?, SqlError> err) + { + throw new InvalidOperationException( + $"Insert into {table} for id '{id}' failed: {err.Value.Message}" + ); } } } diff --git a/Makefile b/Makefile index b835eb8..cf3a933 100644 --- a/Makefile +++ b/Makefile @@ -1,10 +1,10 @@ -# agent-pmo:d58c330 +# agent-pmo:29b9dcf # ============================================================================= # Standard Makefile — HealthcareSamples # Cross-platform: Linux, macOS, Windows (via GNU Make) # ============================================================================= -.PHONY: build test lint fmt fmt-check clean check ci coverage coverage-check setup +.PHONY: build test lint fmt fmt-check clean check ci coverage coverage-check setup db-up db-down db-reset db-wait db-migrate 
kill-ports-local kill-ports-docker clean-local clean-docker start-local start-docker
 
 # -----------------------------------------------------------------------------
 # OS Detection
@@ -20,41 +20,72 @@
 else
 	MKDIR = mkdir -p
 endif
 
-# Coverage threshold (override in CI via env var or per-repo)
-COVERAGE_THRESHOLD ?= 80
+# Per-project coverage thresholds live in this JSON file. Each test project
+# gets its own minimum line-rate. When a project's coverage rises, bump its
+# threshold to the `make coverage-check` measurement minus 1 percentage
+# point, leaving a rounding margin.
+COVERAGE_THRESHOLDS_FILE ?= coverage-thresholds.json
+
+# Postgres dev database (docker compose). Override in CI via env vars.
+DB_COMPOSE_FILE ?= docker/docker-compose.db.yml
+DB_PASSWORD ?= changeme
+DB_HOST ?= localhost
+DB_PORT ?= 5432
+PG_BASE_URL ?= Host=$(DB_HOST);Port=$(DB_PORT);Username=postgres;Password=$(DB_PASSWORD)
 
 # =============================================================================
 # PRIMARY TARGETS
 # =============================================================================
 
-## build: Compile/assemble all artifacts
-build:
+## build: Compile/assemble all artifacts (requires running Postgres + migrated schemas)
+build: db-migrate
 	@echo "==> Building..."
 	dotnet build HealthcareSamples.sln --configuration Release
 
-## test: Run full test suite with coverage
-test:
-	@echo "==> Testing..."
-	dotnet test HealthcareSamples.sln --configuration Release \
-		--settings coverlet.runsettings \
-		--collect:"XPlat Code Coverage" \
-		--results-directory TestResults \
-		--verbosity normal
+# Test projects in execution order. Cheapest / most foundational first so a
+# break in a lower layer fails the run immediately, before slower E2E suites.
+TEST_PROJECTS = \ + Gatekeeper/Gatekeeper.Api.Tests/Gatekeeper.Api.Tests.csproj \ + Clinical/Clinical.Api.Tests/Clinical.Api.Tests.csproj \ + Scheduling/Scheduling.Api.Tests/Scheduling.Api.Tests.csproj \ + ICD10/ICD10.Api.Tests/ICD10.Api.Tests.csproj \ + ICD10/ICD10.Cli.Tests/ICD10.Cli.Tests.csproj \ + Dashboard/Dashboard.Integration.Tests/Dashboard.Integration.Tests.csproj + +## test: Run full test suite with coverage (FAIL FAST) +## - Stops at the first failing test inside an assembly (xunit stopOnFail) +## - Stops at the first failing assembly across the suite (set -e) +## - Each project's coverage lands under TestResults/<project-dir>/ so +## `make coverage-check` can attribute results back to a project. +test: db-migrate + @echo "==> Testing (fail-fast)..." + @set -e; \ + rm -rf TestResults; \ + for proj in $(TEST_PROJECTS); do \ + proj_dir=$$(dirname "$$proj"); \ + echo ""; \ + echo "==> Testing $$proj"; \ + dotnet test "$$proj" --configuration Release \ + --settings coverlet.runsettings \ + --collect:"XPlat Code Coverage" \ + --results-directory "TestResults/$$proj_dir" \ + --verbosity normal \ + || { echo ""; echo "FAIL: $$proj failed -- aborting remaining test projects"; exit 1; }; \ + done ## lint: Run all linters (fails on any warning) -lint: fmt-check +lint: fmt-check db-migrate @echo "==> Linting..." dotnet build HealthcareSamples.sln --configuration Release ## fmt: Format all code in-place fmt: @echo "==> Formatting..." - dotnet csharpier . + dotnet csharpier format . ## fmt-check: Check formatting without modifying fmt-check: @echo "==> Checking format..." - dotnet csharpier . --check + dotnet csharpier check . 
## clean: Remove all build artifacts clean: @@ -82,41 +113,296 @@ coverage: -reporttypes:Html @echo "==> HTML report: coverage/html/index.html" -## coverage-check: Assert thresholds (exits non-zero if below) +## coverage-check: Assert per-project line-rate >= threshold from $(COVERAGE_THRESHOLDS_FILE) +## The JSON file declares { "default_threshold": N, "projects": { "<dir>": { "threshold": N } } }. +## A project that is missing from the file inherits "default_threshold". +## When coverage actually goes UP, edit the file: floor(measured) - 1 to leave a rounding cushion. coverage-check: - @echo "==> Checking coverage thresholds..." - @COBERTURA=$$(find TestResults -name 'coverage.cobertura.xml' | head -1); \ - if [ -z "$$COBERTURA" ]; then echo "FAIL: No coverage.cobertura.xml found"; exit 1; fi; \ - LINE_RATE=$$(grep -oP 'line-rate="\K[^"]+' "$$COBERTURA" | head -1); \ - PCT=$$(awk "BEGIN{printf \"%.1f\", $${LINE_RATE:-0}*100}"); \ - PCT_INT=$$(awk "BEGIN{printf \"%d\", $${LINE_RATE:-0}*100}"); \ - echo "Line coverage: $${PCT}% (threshold: $(COVERAGE_THRESHOLD)%)"; \ - if [ "$$PCT_INT" -lt "$(COVERAGE_THRESHOLD)" ]; then \ - echo "FAIL: $${PCT}% < $(COVERAGE_THRESHOLD)%"; exit 1; \ - else \ - echo "OK: $${PCT}% >= $(COVERAGE_THRESHOLD)%"; \ + @echo "==> Checking coverage thresholds (file: $(COVERAGE_THRESHOLDS_FILE))..." + @command -v jq >/dev/null 2>&1 || { echo "FAIL: jq is required (brew install jq / apt-get install jq)"; exit 1; } + @if [ ! 
-f "$(COVERAGE_THRESHOLDS_FILE)" ]; then \ + echo "FAIL: $(COVERAGE_THRESHOLDS_FILE) not found"; exit 1; \ fi + @set -e; \ + default=$$(jq -r '.default_threshold' $(COVERAGE_THRESHOLDS_FILE)); \ + any_failed=0; \ + for proj in $(TEST_PROJECTS); do \ + proj_dir=$$(dirname "$$proj"); \ + cobertura=$$(find "TestResults/$$proj_dir" -name 'coverage.cobertura.xml' 2>/dev/null | head -1); \ + threshold=$$(jq -r --arg p "$$proj_dir" --arg d "$$default" '.projects[$$p].threshold // ($$d | tonumber)' $(COVERAGE_THRESHOLDS_FILE)); \ + if [ -z "$$cobertura" ]; then \ + echo "FAIL ($$proj_dir): no coverage.cobertura.xml under TestResults/$$proj_dir"; \ + any_failed=1; continue; \ + fi; \ + line_rate=$$(awk 'match($$0, /line-rate="[0-9.]+"/) { s=substr($$0, RSTART+11, RLENGTH-12); print s; exit }' "$$cobertura"); \ + pct=$$(awk "BEGIN{printf \"%.1f\", $${line_rate:-0}*100}"); \ + pct_int=$$(awk "BEGIN{printf \"%d\", $${line_rate:-0}*100}"); \ + if [ "$$pct_int" -lt "$$threshold" ]; then \ + printf "FAIL %-44s %s%% < %s%%\n" "$$proj_dir" "$$pct" "$$threshold"; \ + any_failed=1; \ + else \ + printf "OK %-44s %s%% >= %s%%\n" "$$proj_dir" "$$pct" "$$threshold"; \ + fi; \ + done; \ + if [ "$$any_failed" -ne 0 ]; then \ + echo ""; \ + echo "FAIL: one or more projects below threshold (see $(COVERAGE_THRESHOLDS_FILE))"; \ + exit 1; \ + fi; \ + echo ""; \ + echo "OK: all projects meet their coverage thresholds" ## setup: Post-create dev environment setup setup: @echo "==> Setting up development environment..." - dotnet restore dotnet tool restore + dotnet restore @echo "==> Setup complete. Run 'make ci' to validate." +# ============================================================================= +# DEV DATABASE (Postgres via docker compose) +# ============================================================================= + +## db-up: Start Postgres (pgvector) container in background +db-up: + @echo "==> Starting Postgres..." 
+ DB_PASSWORD=$(DB_PASSWORD) docker compose -f $(DB_COMPOSE_FILE) up -d + @$(MAKE) db-wait + +## db-down: Stop and remove Postgres container (preserves volume) +db-down: + @echo "==> Stopping Postgres..." + docker compose -f $(DB_COMPOSE_FILE) down + +## db-reset: Destroy DB volume and recreate from init scripts +db-reset: + @echo "==> Resetting Postgres (DESTRUCTIVE)..." + docker compose -f $(DB_COMPOSE_FILE) down -v + DB_PASSWORD=$(DB_PASSWORD) docker compose -f $(DB_COMPOSE_FILE) up -d + @$(MAKE) db-wait + +## db-wait: Block until Postgres healthcheck reports healthy +db-wait: + @echo "==> Waiting for Postgres to be ready..." + @for i in $$(seq 1 60); do \ + STATUS=$$(docker inspect --format '{{.State.Health.Status}}' healthcaresamples-db 2>/dev/null || echo "missing"); \ + if [ "$$STATUS" = "healthy" ]; then echo "Postgres ready"; exit 0; fi; \ + sleep 1; \ + done; \ + echo "FAIL: Postgres did not become healthy"; \ + docker logs healthcaresamples-db 2>&1 | tail -50; \ + exit 1 + +## db-migrate: Ensure DB is up and apply YAML schemas via DataProviderMigrate to all four databases +db-migrate: db-up + @echo "==> Migrating Postgres schemas..." 
+	dotnet DataProviderMigrate --schema Gatekeeper/Gatekeeper.Api/gatekeeper-schema.yaml \
+		--output "$(PG_BASE_URL);Database=gatekeeper" --provider postgres
+	dotnet DataProviderMigrate --schema Clinical/Clinical.Api/clinical-schema.yaml \
+		--output "$(PG_BASE_URL);Database=clinical" --provider postgres
+	dotnet DataProviderMigrate --schema Scheduling/Scheduling.Api/scheduling-schema.yaml \
+		--output "$(PG_BASE_URL);Database=scheduling" --provider postgres
+	dotnet DataProviderMigrate --schema ICD10/ICD10.Api/icd10-schema.yaml \
+		--output "$(PG_BASE_URL);Database=icd10" --provider postgres
+
+# =============================================================================
+# LOCAL DEV STACK
+# =============================================================================
+
+# Ports owned by the local dev stack (4 APIs + dashboard + embedding service)
+LOCAL_PORTS := 5002 5080 5001 5090 5173 8000
+# Ports owned by the docker stack: the API and dashboard ports plus the
+# published Postgres host port (5432)
+DOCKER_PORTS := 5432 5002 5080 5001 5090 5173
+
+## kill-ports-local: Free ports used by the local dev stack
+kill-ports-local:
+	@echo "==> Clearing local dev ports..."
+	@for port in $(LOCAL_PORTS); do \
+		pids=$$(lsof -ti :$$port 2>/dev/null || true); \
+		if [ -n "$$pids" ]; then \
+			echo "  killing port $$port: $$pids"; \
+			echo "$$pids" | xargs kill -9 2>/dev/null || true; \
+		fi; \
+	done
+
+## kill-ports-docker: Free ports used by the docker stack (incl. Postgres)
+kill-ports-docker:
+	@echo "==> Clearing docker dev ports..."
+	@for port in $(DOCKER_PORTS); do \
+		pids=$$(lsof -ti :$$port 2>/dev/null || true); \
+		if [ -n "$$pids" ]; then \
+			echo "  killing port $$port: $$pids"; \
+			echo "$$pids" | xargs kill -9 2>/dev/null || true; \
+		fi; \
+	done
+
+## clean-local: Kill local dev processes and drop the Postgres dev volume
+clean-local: kill-ports-local
+	@echo "==> Removing Postgres dev volume..."
+ docker compose -f $(DB_COMPOSE_FILE) down -v 2>/dev/null || true + @echo "Clean complete." + +## clean-docker: Kill docker stack and drop all docker-compose volumes +clean-docker: kill-ports-docker + @echo "==> Removing docker volumes..." + cd docker && docker compose down -v + @echo "Clean complete." + +## start-docker: Build the dashboard locally then start the docker compose stack +## Usage: make start-docker [BUILD=1] +## BUILD=1 force image rebuild (passes --build to docker compose up) +start-docker: + @echo "==> Building Dashboard locally (H5 requires native build)..." + cd Dashboard/Dashboard.Web && \ + dotnet publish -c Release -o ../../docker/dashboard-build --nologo -v q + @echo "==> Starting docker stack..." + cd docker && docker compose up $(if $(BUILD),--build,) + +# Embedded runner for the local dev stack. Inlined as a `define` block so the +# orchestration (background processes, trap-based cleanup, log prefixing) runs +# in a single shell — Make's default one-shell-per-line model can't express it. +define START_LOCAL_RUNNER +set -e +PIDS=() + +cleanup() { + echo "" + echo "Shutting down..." + for pid in "$${PIDS[@]}"; do + kill "$$pid" 2>/dev/null || true + done + wait 2>/dev/null || true + echo "All services stopped." +} +trap cleanup EXIT INT TERM + +DB_PASS="$${DB_PASSWORD:-changeme}" +VENV_DIR="ICD10/.venv" +EMBED_DIR="ICD10/embedding-service" + +echo "Starting Embedding Service on :8000 (model loading may take a moment)..." +"$$VENV_DIR/bin/python" -m uvicorn main:app --host 0.0.0.0 --port 8000 \ + --app-dir "$$EMBED_DIR" 2>&1 | sed 's/^/ [embedding] /' & +PIDS+=($$!) + +populate_icd10() { + local CONN_STR="Host=localhost;Database=icd10;Username=icd10;Password=$$DB_PASS" + local SCRIPTS_DIR="ICD10/scripts/CreateDb" + + echo " [icd10-import] Waiting for ICD10 API..." + for i in $$(seq 1 60); do + if curl -sf http://localhost:5090/health >/dev/null 2>&1; then + echo " [icd10-import] ICD10 API is up." 
+ break + fi + sleep 2 + done + + echo " [icd10-import] Waiting for embedding service..." + for i in $$(seq 1 120); do + if curl -sf http://localhost:8000/health >/dev/null 2>&1; then + echo " [icd10-import] Embedding service ready." + break + fi + sleep 2 + done + + local CHAPTERS + CHAPTERS=$$(curl -sf http://localhost:5090/api/icd10/chapters 2>/dev/null || echo "[]") + if [ "$$CHAPTERS" = "[]" ] || [ "$$CHAPTERS" = "" ]; then + echo " [icd10-import] No ICD10 data found. Running full Postgres import..." + EMBEDDING_SERVICE_URL="http://localhost:8000" \ + "$$VENV_DIR/bin/python" "$$SCRIPTS_DIR/import_postgres.py" \ + --connection-string "$$CONN_STR" \ + || echo " [icd10-import] Import encountered errors (check logs above)" + else + echo " [icd10-import] ICD10 codes already populated. Generating missing embeddings..." + EMBEDDING_SERVICE_URL="http://localhost:8000" \ + "$$VENV_DIR/bin/python" "$$SCRIPTS_DIR/import_postgres.py" \ + --connection-string "$$CONN_STR" --embeddings-only \ + || echo " [icd10-import] Embedding generation encountered errors" + fi +} + +echo "Starting Gatekeeper.Api on :5002..." +ConnectionStrings__Postgres="Host=localhost;Database=gatekeeper;Username=gatekeeper;Password=$$DB_PASS" \ + dotnet run --no-build --project Gatekeeper/Gatekeeper.Api/Gatekeeper.Api.csproj --no-launch-profile \ + --urls "http://localhost:5002" 2>&1 | sed 's/^/ [gatekeeper] /' & +PIDS+=($$!) + +echo "Starting Clinical.Api on :5080..." +ConnectionStrings__Postgres="Host=localhost;Database=clinical;Username=clinical;Password=$$DB_PASS" \ + dotnet run --no-build --project Clinical/Clinical.Api/Clinical.Api.csproj --no-launch-profile \ + --urls "http://localhost:5080" 2>&1 | sed 's/^/ [clinical] /' & +PIDS+=($$!) + +echo "Starting Scheduling.Api on :5001..." 
+ConnectionStrings__Postgres="Host=localhost;Database=scheduling;Username=scheduling;Password=$$DB_PASS" \ + dotnet run --no-build --project Scheduling/Scheduling.Api/Scheduling.Api.csproj --no-launch-profile \ + --urls "http://localhost:5001" 2>&1 | sed 's/^/ [scheduling] /' & +PIDS+=($$!) + +echo "Starting ICD10.Api on :5090..." +ConnectionStrings__Postgres="Host=localhost;Database=icd10;Username=icd10;Password=$$DB_PASS" \ + dotnet run --no-build --project ICD10/ICD10.Api/ICD10.Api.csproj --no-launch-profile \ + --urls "http://localhost:5090" 2>&1 | sed 's/^/ [icd10] /' & +PIDS+=($$!) + +echo "Starting Dashboard on :5173..." +python3 -m http.server 5173 --directory Dashboard/Dashboard.Web/wwwroot 2>&1 | sed 's/^/ [dashboard] /' & +PIDS+=($$!) + +populate_icd10 & +PIDS+=($$!) + +echo "" +echo "════════════════════════════════════════" +echo " Gatekeeper: http://localhost:5002" +echo " Clinical: http://localhost:5080" +echo " Scheduling: http://localhost:5001" +echo " ICD10: http://localhost:5090" +echo " Embedding: http://localhost:8000" +echo " Dashboard: http://localhost:5173" +echo "════════════════════════════════════════" +echo " Press Ctrl+C to stop all services" +echo "" + +wait +endef +export START_LOCAL_RUNNER + +## start-local: Run all 4 APIs locally against the docker postgres dev DB +## Builds projects in Debug, dashboard in Release, then runs everything in +## the foreground with prefixed log output. Ctrl+C cleans up all children. +start-local: db-up + @echo "==> Setting up Python environment..." + @if [ ! -d ICD10/.venv ]; then python3 -m venv ICD10/.venv; fi + @ICD10/.venv/bin/pip install -q -r ICD10/embedding-service/requirements.txt psycopg2-binary click requests + @echo "==> Building all projects..." 
+	dotnet build Gatekeeper/Gatekeeper.Api/Gatekeeper.Api.csproj --nologo -v q
+	dotnet build Clinical/Clinical.Api/Clinical.Api.csproj --nologo -v q
+	dotnet build Scheduling/Scheduling.Api/Scheduling.Api.csproj --nologo -v q
+	dotnet build ICD10/ICD10.Api/ICD10.Api.csproj --nologo -v q
+	dotnet build Dashboard/Dashboard.Web/Dashboard.Web.csproj -c Release --nologo -v q
+	@bash -c "$$START_LOCAL_RUNNER"
+
 # =============================================================================
 # HELP
 # =============================================================================
 help:
 	@echo "Available targets:"
-	@echo "  build          - Compile/assemble all artifacts"
-	@echo "  test           - Run full test suite with coverage"
-	@echo "  lint           - Run all linters (errors mode)"
-	@echo "  fmt            - Format all code in-place"
-	@echo "  fmt-check      - Check formatting (no modification)"
-	@echo "  clean          - Remove build artifacts"
-	@echo "  check          - lint + test (pre-commit)"
-	@echo "  ci             - lint + test + build (full CI)"
-	@echo "  coverage       - Generate and open coverage report"
-	@echo "  coverage-check - Assert coverage thresholds"
-	@echo "  setup          - Post-create dev environment setup"
+	@echo "  build          - Compile/assemble all artifacts"
+	@echo "  test           - Run full test suite with coverage"
+	@echo "  lint           - Run all linters (errors mode)"
+	@echo "  fmt            - Format all code in-place"
+	@echo "  fmt-check      - Check formatting (no modification)"
+	@echo "  clean          - Remove build artifacts"
+	@echo "  check          - lint + test (pre-commit)"
+	@echo "  ci             - lint + test + build (full CI)"
+	@echo "  coverage       - Generate and open coverage report"
+	@echo "  coverage-check - Assert coverage thresholds"
+	@echo "  setup          - Post-create dev environment setup"
+	@echo "  start-local    - Run all 4 APIs locally against docker postgres"
+	@echo "  start-docker   - Build dashboard + docker compose up the full stack"
+	@echo "  clean-local    - Kill local dev processes and drop postgres volume"
+	@echo "  clean-docker   - Kill docker stack and drop all volumes"
diff --git a/NuGet.config b/NuGet.config
new file mode 100644
index 0000000..4d736c1
--- /dev/null
+++ b/NuGet.config
@@ -0,0 +1,7 @@
+<?xml version="1.0" encoding="utf-8"?>
+<configuration>
+  <packageSources>
+    <clear />
+    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" protocolVersion="3" />
+  </packageSources>
+</configuration>
diff --git a/Scheduling/Scheduling.Api/DataProvider.json b/Scheduling/Scheduling.Api/DataProvider.json
index 611251e..2bd02f2 100644
--- a/Scheduling/Scheduling.Api/DataProvider.json
+++ b/Scheduling/Scheduling.Api/DataProvider.json
@@ -1,73 +1,75 @@
 {
-  "queries": [
-    {
-      "name": "GetUpcomingAppointments",
-      "sqlFile": "Queries/GetUpcomingAppointments.generated.sql"
-    },
-    {
-      "name": "GetAppointmentById",
-      "sqlFile": "Queries/GetAppointmentById.generated.sql"
-    },
-    {
-      "name": "GetAppointmentsByPatient",
-      "sqlFile": "Queries/GetAppointmentsByPatient.generated.sql"
-    },
-    {
-      "name": "GetAppointmentsByPractitioner",
-      "sqlFile": "Queries/GetAppointmentsByPractitioner.generated.sql"
-    },
-    {
-      "name": "GetAllPractitioners",
-      "sqlFile": "Queries/GetAllPractitioners.generated.sql"
-    },
-    {
-      "name": "GetPractitionerById",
-      "sqlFile": "Queries/GetPractitionerById.generated.sql"
-    },
-    {
-      "name": "SearchPractitionersBySpecialty",
-      "sqlFile": "Queries/SearchPractitionersBySpecialty.generated.sql"
-    },
-    {
-      "name": "GetAvailableSlots",
-      "sqlFile": "Queries/GetAvailableSlots.generated.sql"
-    },
-    {
-      "name": "GetAppointmentsByStatus",
-      "sqlFile": "Queries/GetAppointmentsByStatus.generated.sql"
-    },
-    {
-      "name": "CheckSchedulingConflicts",
-      "sqlFile": "Queries/CheckSchedulingConflicts.generated.sql"
-    },
-    {
-      "name": "GetProviderAvailability",
-      "sqlFile": "Queries/GetProviderAvailability.generated.sql"
-    },
-    {
-      "name": "GetProviderDailySchedule",
-      "sqlFile": "Queries/GetProviderDailySchedule.generated.sql"
-    }
-  ],
-  "tables": [
-    {
-      "schema": "main",
-      "name": "fhir_Practitioner",
-      "generateInsert": true,
-      "generateUpdate": false,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    },
-    {
-      "schema": "main",
-      "name": "fhir_Appointment",
-      "generateInsert": true,
-      "generateUpdate": false,
-      "generateDelete": false,
-      "excludeColumns": ["Id"],
-      "primaryKeyColumns": ["Id"]
-    }
-  ],
-  "connectionString": "Data Source=scheduling.db"
-}
+  "queries": [
+    {
+      "name": "GetUpcomingAppointments",
+      "sqlFile": "Queries/GetUpcomingAppointments.generated.sql"
+    },
+    {
+      "name": "GetAppointmentById",
+      "sqlFile": "Queries/GetAppointmentById.generated.sql"
+    },
+    {
+      "name": "GetAppointmentsByPatient",
+      "sqlFile": "Queries/GetAppointmentsByPatient.generated.sql"
+    },
+    {
+      "name": "GetAppointmentsByPractitioner",
+      "sqlFile": "Queries/GetAppointmentsByPractitioner.generated.sql"
+    },
+    {
+      "name": "GetAllPractitioners",
+      "sqlFile": "Queries/GetAllPractitioners.generated.sql"
+    },
+    {
+      "name": "GetPractitionerById",
+      "sqlFile": "Queries/GetPractitionerById.generated.sql"
+    },
+    {
+      "name": "SearchPractitionersBySpecialty",
+      "sqlFile": "Queries/SearchPractitionersBySpecialty.generated.sql"
+    },
+    {
+      "name": "GetAvailableSlots",
+      "sqlFile": "Queries/GetAvailableSlots.generated.sql"
+    },
+    {
+      "name": "GetAppointmentsByStatus",
+      "sqlFile": "Queries/GetAppointmentsByStatus.generated.sql"
+    },
+    {
+      "name": "CheckSchedulingConflicts",
+      "sqlFile": "Queries/CheckSchedulingConflicts.generated.sql"
+    },
+    {
+      "name": "GetProviderAvailability",
+      "sqlFile": "Queries/GetProviderAvailability.generated.sql"
+    },
+    {
+      "name": "GetProviderDailySchedule",
+      "sqlFile": "Queries/GetProviderDailySchedule.generated.sql"
+    }
+  ],
+  "tables": [
+    {
+      "schema": "public",
+      "name": "fhir_practitioner",
+      "generateInsert": true,
+      "generateUpdate": false,
+      "generateDelete": false,
+      "primaryKeyColumns": [
+        "Id"
+      ]
+    },
+    {
+      "schema": "public",
+      "name": "fhir_appointment",
+      "generateInsert": true,
+      "generateUpdate": false,
+      "generateDelete": false,
+      "primaryKeyColumns": [
+        "Id"
+      ]
+    }
+  ],
+  "connectionString": "Host=localhost;Port=5432;Database=scheduling;Username=postgres;Password=changeme"
+}
\ No newline at end of file
diff --git a/Scheduling/Scheduling.Api/DatabaseSetup.cs b/Scheduling/Scheduling.Api/DatabaseSetup.cs
index 3c02dd6..179be79 100644
--- a/Scheduling/Scheduling.Api/DatabaseSetup.cs
+++ b/Scheduling/Scheduling.Api/DatabaseSetup.cs
@@ -1,5 +1,5 @@
-using Migration;
-using Migration.Postgres;
+using Nimblesite.DataProvider.Migration.Core;
+using Nimblesite.DataProvider.Migration.Postgres;
 using InitError = Outcome.Result<bool, string>.Error<bool, string>;
 using InitOk = Outcome.Result<bool, string>.Ok<bool, string>;
 using InitResult = Outcome.Result<bool, string>;
@@ -35,16 +35,7 @@ public static InitResult Initialize(NpgsqlConnection connection, ILogger logger)
     {
         var yamlPath = Path.Combine(AppContext.BaseDirectory, "scheduling-schema.yaml");
         var schema = SchemaYamlSerializer.FromYamlFile(yamlPath);
-
-        foreach (var table in schema.Tables)
-        {
-            var ddl = PostgresDdlGenerator.Generate(new CreateTableOperation(table));
-            using var cmd = connection.CreateCommand();
-            cmd.CommandText = ddl;
-            cmd.ExecuteNonQuery();
-            logger.Log(LogLevel.Debug, "Created table {TableName}", table.Name);
-        }
-
+        PostgresDdlGenerator.MigrateSchema(connection, schema);
         logger.Log(LogLevel.Information, "Created Scheduling database schema from YAML");
     }
     catch (Exception ex)
diff --git a/Scheduling/Scheduling.Api/Generated/.timestamp b/Scheduling/Scheduling.Api/Generated/.timestamp
deleted file mode 100644
index e69de29..0000000
diff --git a/Scheduling/Scheduling.Api/Generated/CheckSchedulingConflicts.g.cs b/Scheduling/Scheduling.Api/Generated/CheckSchedulingConflicts.g.cs
deleted file mode 100644
index e62b956..0000000
--- a/Scheduling/Scheduling.Api/Generated/CheckSchedulingConflicts.g.cs
+++ /dev/null
@@ -1,101 +0,0 @@
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using
System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated;
-
-/// <summary>
-/// Extension methods for 'CheckSchedulingConflicts'.
-/// </summary>
-public static partial class CheckSchedulingConflictsExtensions
-{
-    /// <summary>
-    /// Executes 'CheckSchedulingConflicts.sql' and maps results.
-    /// </summary>
-    /// <param name="connection">Open NpgsqlConnection connection.</param>
-    /// <param name="practitionerRef">Query parameter.</param>
-    /// <param name="proposedEnd">Query parameter.</param>
-    /// <param name="proposedStart">Query parameter.</param>
-    /// <returns>Result of records or SQL error.</returns>
-    public static async Task<Result<ImmutableList<CheckSchedulingConflicts>, SqlError>> CheckSchedulingConflictsAsync(this NpgsqlConnection connection, object practitionerRef, object proposedEnd, object proposedStart)
-    {
-        const string sql = @"SELECT fhir_Appointment.Id, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.Status FROM fhir_Appointment WHERE fhir_Appointment.PractitionerReference = @practitionerRef AND fhir_Appointment.Status != 'cancelled' AND fhir_Appointment.StartTime < @proposedEnd AND fhir_Appointment.EndTime > @proposedStart";
-
-        try
-        {
-            var results = ImmutableList.CreateBuilder<CheckSchedulingConflicts>();
-
-            using (var command = new NpgsqlCommand(sql, connection))
-            {
-                if (practitionerRef is not null and not DBNull)
-                    command.Parameters.AddWithValue("@practitionerRef", practitionerRef);
-                else
-                    command.Parameters.Add(new NpgsqlParameter("@practitionerRef", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value });
-                if (proposedEnd is not null and not DBNull)
-                    command.Parameters.AddWithValue("@proposedEnd", proposedEnd);
-                else
-                    command.Parameters.Add(new NpgsqlParameter("@proposedEnd", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value });
-                if (proposedStart is not null and not DBNull)
-                    command.Parameters.AddWithValue("@proposedStart", proposedStart);
-                else
-                    command.Parameters.Add(new NpgsqlParameter("@proposedStart", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value });
-
-                using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false))
-                {
-                    while (await reader.ReadAsync().ConfigureAwait(false))
-                    {
-                        var item = new CheckSchedulingConflicts(
-                            reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0),
-                            reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1),
-                            reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2),
-                            reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3)
-                        );
-                        results.Add(item);
-                    }
-                }
-            }
-
-            return new Result<ImmutableList<CheckSchedulingConflicts>, SqlError>.Ok<ImmutableList<CheckSchedulingConflicts>, SqlError>(results.ToImmutable());
-        }
-        catch (Exception ex)
-        {
-            return new Result<ImmutableList<CheckSchedulingConflicts>, SqlError>.Error<ImmutableList<CheckSchedulingConflicts>, SqlError>(new SqlError("Database error", ex));
-        }
-    }
-}
-
-/// <summary>
-/// Result row for 'CheckSchedulingConflicts' query.
-/// </summary>
-public record CheckSchedulingConflicts
-{
-    /// <summary>Column 'Id'.</summary>
-    public string Id { get; init; }
-
-    /// <summary>Column 'StartTime'.</summary>
-    public string StartTime { get; init; }
-
-    /// <summary>Column 'EndTime'.</summary>
-    public string EndTime { get; init; }
-
-    /// <summary>Column 'Status'.</summary>
-    public string Status { get; init; }
-
-    /// <summary>Initializes a new instance of CheckSchedulingConflicts.</summary>
-    public CheckSchedulingConflicts(
-        string Id,
-        string StartTime,
-        string EndTime,
-        string Status
-    )
-    {
-        this.Id = Id;
-        this.StartTime = StartTime;
-        this.EndTime = EndTime;
-        this.Status = Status;
-    }
-}
diff --git a/Scheduling/Scheduling.Api/Generated/GetAllPractitioners.g.cs b/Scheduling/Scheduling.Api/Generated/GetAllPractitioners.g.cs
deleted file mode 100644
index 32b2386..0000000
--- a/Scheduling/Scheduling.Api/Generated/GetAllPractitioners.g.cs
+++ /dev/null
@@ -1,116 +0,0 @@
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated;
-
-/// <summary>
-/// Extension methods for 'GetAllPractitioners'.
-/// </summary>
-public static partial class GetAllPractitionersExtensions
-{
-    /// <summary>
-    /// Executes 'GetAllPractitioners.sql' and maps results.
-    /// </summary>
-    /// <param name="connection">Open NpgsqlConnection connection.</param>
-    /// <returns>Result of records or SQL error.</returns>
-    public static async Task<Result<ImmutableList<GetAllPractitioners>, SqlError>> GetAllPractitionersAsync(this NpgsqlConnection connection)
-    {
-        const string sql = @"SELECT fhir_Practitioner.Id, fhir_Practitioner.Identifier, fhir_Practitioner.Active, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Practitioner.Qualification, fhir_Practitioner.Specialty, fhir_Practitioner.TelecomEmail, fhir_Practitioner.TelecomPhone FROM fhir_Practitioner ORDER BY fhir_Practitioner.NameFamily , fhir_Practitioner.NameGiven ";
-
-        try
-        {
-            var results = ImmutableList.CreateBuilder<GetAllPractitioners>();
-
-            using (var command = new NpgsqlCommand(sql, connection))
-            {
-
-                using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false))
-                {
-                    while (await reader.ReadAsync().ConfigureAwait(false))
-                    {
-                        var item = new GetAllPractitioners(
-                            reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0),
-                            reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1),
-                            reader.IsDBNull(2) ? default(long) : reader.GetFieldValue<long>(2),
-                            reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3),
-                            reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4),
-                            reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5),
-                            reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6),
-                            reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7),
-                            reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8)
-                        );
-                        results.Add(item);
-                    }
-                }
-            }
-
-            return new Result<ImmutableList<GetAllPractitioners>, SqlError>.Ok<ImmutableList<GetAllPractitioners>, SqlError>(results.ToImmutable());
-        }
-        catch (Exception ex)
-        {
-            return new Result<ImmutableList<GetAllPractitioners>, SqlError>.Error<ImmutableList<GetAllPractitioners>, SqlError>(new SqlError("Database error", ex));
-        }
-    }
-}
-
-/// <summary>
-/// Result row for 'GetAllPractitioners' query.
-/// </summary>
-public record GetAllPractitioners
-{
-    /// <summary>Column 'Id'.</summary>
-    public string Id { get; init; }
-
-    /// <summary>Column 'Identifier'.</summary>
-    public string Identifier { get; init; }
-
-    /// <summary>Column 'Active'.</summary>
-    public long Active { get; init; }
-
-    /// <summary>Column 'NameFamily'.</summary>
-    public string NameFamily { get; init; }
-
-    /// <summary>Column 'NameGiven'.</summary>
-    public string NameGiven { get; init; }
-
-    /// <summary>Column 'Qualification'.</summary>
-    public string Qualification { get; init; }
-
-    /// <summary>Column 'Specialty'.</summary>
-    public string Specialty { get; init; }
-
-    /// <summary>Column 'TelecomEmail'.</summary>
-    public string TelecomEmail { get; init; }
-
-    /// <summary>Column 'TelecomPhone'.</summary>
-    public string TelecomPhone { get; init; }
-
-    /// <summary>Initializes a new instance of GetAllPractitioners.</summary>
-    public GetAllPractitioners(
-        string Id,
-        string Identifier,
-        long Active,
-        string NameFamily,
-        string NameGiven,
-        string Qualification,
-        string Specialty,
-        string TelecomEmail,
-        string TelecomPhone
-    )
-    {
-        this.Id = Id;
-        this.Identifier = Identifier;
-        this.Active = Active;
-        this.NameFamily = NameFamily;
-        this.NameGiven = NameGiven;
-        this.Qualification = Qualification;
-        this.Specialty = Specialty;
-        this.TelecomEmail = TelecomEmail;
-        this.TelecomPhone = TelecomPhone;
-    }
-}
diff --git a/Scheduling/Scheduling.Api/Generated/GetAppointmentById.g.cs b/Scheduling/Scheduling.Api/Generated/GetAppointmentById.g.cs
deleted file mode 100644
index 3f5abfa..0000000
--- a/Scheduling/Scheduling.Api/Generated/GetAppointmentById.g.cs
+++ /dev/null
@@ -1,151 +0,0 @@
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated;
-
-/// <summary>
-/// Extension methods for 'GetAppointmentById'.
-/// </summary>
-public static partial class GetAppointmentByIdExtensions
-{
-    /// <summary>
-    /// Executes 'GetAppointmentById.sql' and maps results.
-    /// </summary>
-    /// <param name="connection">Open NpgsqlConnection connection.</param>
-    /// <param name="id">Query parameter.</param>
-    /// <returns>Result of records or SQL error.</returns>
-    public static async Task<Result<ImmutableList<GetAppointmentById>, SqlError>> GetAppointmentByIdAsync(this NpgsqlConnection connection, object id)
-    {
-        const string sql = @"SELECT fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment FROM fhir_Appointment WHERE fhir_Appointment.Id = @id";
-
-        try
-        {
-            var results = ImmutableList.CreateBuilder<GetAppointmentById>();
-
-            using (var command = new NpgsqlCommand(sql, connection))
-            {
-                if (id is not null and not DBNull)
-                    command.Parameters.AddWithValue("@id", id);
-                else
-                    command.Parameters.Add(new NpgsqlParameter("@id", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value });
-
-                using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false))
-                {
-                    while (await reader.ReadAsync().ConfigureAwait(false))
-                    {
-                        var item = new GetAppointmentById(
-                            reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0),
-                            reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1),
-                            reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2),
-                            reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3),
-                            reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4),
-                            reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5),
-                            reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6),
-                            reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7),
-                            reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8),
-                            reader.IsDBNull(9) ? default(long) : reader.GetFieldValue<long>(9),
-                            reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10),
-                            reader.IsDBNull(11) ? null : reader.GetFieldValue<string>(11),
-                            reader.IsDBNull(12) ? null : reader.GetFieldValue<string>(12),
-                            reader.IsDBNull(13) ? null : reader.GetFieldValue<string>(13)
-                        );
-                        results.Add(item);
-                    }
-                }
-            }
-
-            return new Result<ImmutableList<GetAppointmentById>, SqlError>.Ok<ImmutableList<GetAppointmentById>, SqlError>(results.ToImmutable());
-        }
-        catch (Exception ex)
-        {
-            return new Result<ImmutableList<GetAppointmentById>, SqlError>.Error<ImmutableList<GetAppointmentById>, SqlError>(new SqlError("Database error", ex));
-        }
-    }
-}
-
-/// <summary>
-/// Result row for 'GetAppointmentById' query.
-/// </summary>
-public record GetAppointmentById
-{
-    /// <summary>Column 'Id'.</summary>
-    public string Id { get; init; }
-
-    /// <summary>Column 'Status'.</summary>
-    public string Status { get; init; }
-
-    /// <summary>Column 'ServiceCategory'.</summary>
-    public string ServiceCategory { get; init; }
-
-    /// <summary>Column 'ServiceType'.</summary>
-    public string ServiceType { get; init; }
-
-    /// <summary>Column 'ReasonCode'.</summary>
-    public string ReasonCode { get; init; }
-
-    /// <summary>Column 'Priority'.</summary>
-    public string Priority { get; init; }
-
-    /// <summary>Column 'Description'.</summary>
-    public string Description { get; init; }
-
-    /// <summary>Column 'StartTime'.</summary>
-    public string StartTime { get; init; }
-
-    /// <summary>Column 'EndTime'.</summary>
-    public string EndTime { get; init; }
-
-    /// <summary>Column 'MinutesDuration'.</summary>
-    public long MinutesDuration { get; init; }
-
-    /// <summary>Column 'PatientReference'.</summary>
-    public string PatientReference { get; init; }
-
-    /// <summary>Column 'PractitionerReference'.</summary>
-    public string PractitionerReference { get; init; }
-
-    /// <summary>Column 'Created'.</summary>
-    public string Created { get; init; }
-
-    /// <summary>Column 'Comment'.</summary>
-    public string Comment { get; init; }
-
-    /// <summary>Initializes a new instance of GetAppointmentById.</summary>
-    public GetAppointmentById(
-        string Id,
-        string Status,
-        string ServiceCategory,
-        string ServiceType,
-        string ReasonCode,
-        string Priority,
-        string Description,
-        string StartTime,
-        string EndTime,
-        long MinutesDuration,
-        string PatientReference,
-        string PractitionerReference,
-        string Created,
-        string Comment
-    )
-    {
-        this.Id = Id;
-        this.Status = Status;
-        this.ServiceCategory = ServiceCategory;
-        this.ServiceType = ServiceType;
-        this.ReasonCode = ReasonCode;
-        this.Priority = Priority;
-        this.Description = Description;
-        this.StartTime = StartTime;
-        this.EndTime = EndTime;
-        this.MinutesDuration = MinutesDuration;
-        this.PatientReference = PatientReference;
-        this.PractitionerReference = PractitionerReference;
-        this.Created = Created;
-        this.Comment = Comment;
-    }
-}
diff --git a/Scheduling/Scheduling.Api/Generated/GetAppointmentsByPatient.g.cs b/Scheduling/Scheduling.Api/Generated/GetAppointmentsByPatient.g.cs
deleted file mode 100644
index f8b0a9c..0000000
--- a/Scheduling/Scheduling.Api/Generated/GetAppointmentsByPatient.g.cs
+++ /dev/null
@@ -1,151 +0,0 @@
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated;
-
-/// <summary>
-/// Extension methods for 'GetAppointmentsByPatient'.
-/// </summary>
-public static partial class GetAppointmentsByPatientExtensions
-{
-    /// <summary>
-    /// Executes 'GetAppointmentsByPatient.sql' and maps results.
-    /// </summary>
-    /// <param name="connection">Open NpgsqlConnection connection.</param>
-    /// <param name="patientReference">Query parameter.</param>
-    /// <returns>Result of records or SQL error.</returns>
-    public static async Task<Result<ImmutableList<GetAppointmentsByPatient>, SqlError>> GetAppointmentsByPatientAsync(this NpgsqlConnection connection, object patientReference)
-    {
-        const string sql = @"SELECT fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment FROM fhir_Appointment WHERE fhir_Appointment.PatientReference = @patientReference ORDER BY fhir_Appointment.StartTime DESC";
-
-        try
-        {
-            var results = ImmutableList.CreateBuilder<GetAppointmentsByPatient>();
-
-            using (var command = new NpgsqlCommand(sql, connection))
-            {
-                if (patientReference is not null and not DBNull)
-                    command.Parameters.AddWithValue("@patientReference", patientReference);
-                else
-                    command.Parameters.Add(new NpgsqlParameter("@patientReference", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value });
-
-                using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false))
-                {
-                    while (await reader.ReadAsync().ConfigureAwait(false))
-                    {
-                        var item = new GetAppointmentsByPatient(
-                            reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0),
-                            reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1),
-                            reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2),
-                            reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3),
-                            reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4),
-                            reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5),
-                            reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6),
-                            reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7),
-                            reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8),
-                            reader.IsDBNull(9) ? default(long) : reader.GetFieldValue<long>(9),
-                            reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10),
-                            reader.IsDBNull(11) ? null : reader.GetFieldValue<string>(11),
-                            reader.IsDBNull(12) ? null : reader.GetFieldValue<string>(12),
-                            reader.IsDBNull(13) ? null : reader.GetFieldValue<string>(13)
-                        );
-                        results.Add(item);
-                    }
-                }
-            }
-
-            return new Result<ImmutableList<GetAppointmentsByPatient>, SqlError>.Ok<ImmutableList<GetAppointmentsByPatient>, SqlError>(results.ToImmutable());
-        }
-        catch (Exception ex)
-        {
-            return new Result<ImmutableList<GetAppointmentsByPatient>, SqlError>.Error<ImmutableList<GetAppointmentsByPatient>, SqlError>(new SqlError("Database error", ex));
-        }
-    }
-}
-
-/// <summary>
-/// Result row for 'GetAppointmentsByPatient' query.
-/// </summary>
-public record GetAppointmentsByPatient
-{
-    /// <summary>Column 'Id'.</summary>
-    public string Id { get; init; }
-
-    /// <summary>Column 'Status'.</summary>
-    public string Status { get; init; }
-
-    /// <summary>Column 'ServiceCategory'.</summary>
-    public string ServiceCategory { get; init; }
-
-    /// <summary>Column 'ServiceType'.</summary>
-    public string ServiceType { get; init; }
-
-    /// <summary>Column 'ReasonCode'.</summary>
-    public string ReasonCode { get; init; }
-
-    /// <summary>Column 'Priority'.</summary>
-    public string Priority { get; init; }
-
-    /// <summary>Column 'Description'.</summary>
-    public string Description { get; init; }
-
-    /// <summary>Column 'StartTime'.</summary>
-    public string StartTime { get; init; }
-
-    /// <summary>Column 'EndTime'.</summary>
-    public string EndTime { get; init; }
-
-    /// <summary>Column 'MinutesDuration'.</summary>
-    public long MinutesDuration { get; init; }
-
-    /// <summary>Column 'PatientReference'.</summary>
-    public string PatientReference { get; init; }
-
-    /// <summary>Column 'PractitionerReference'.</summary>
-    public string PractitionerReference { get; init; }
-
-    /// <summary>Column 'Created'.</summary>
-    public string Created { get; init; }
-
-    /// <summary>Column 'Comment'.</summary>
-    public string Comment { get; init; }
-
-    /// <summary>Initializes a new instance of GetAppointmentsByPatient.</summary>
-    public GetAppointmentsByPatient(
-        string Id,
-        string Status,
-        string ServiceCategory,
-        string ServiceType,
-        string ReasonCode,
-        string Priority,
-        string Description,
-        string StartTime,
-        string EndTime,
-        long MinutesDuration,
-        string PatientReference,
-        string PractitionerReference,
-        string Created,
-        string Comment
-    )
-    {
-        this.Id = Id;
-        this.Status = Status;
-        this.ServiceCategory = ServiceCategory;
-        this.ServiceType = ServiceType;
-        this.ReasonCode = ReasonCode;
-        this.Priority = Priority;
-        this.Description = Description;
-        this.StartTime = StartTime;
-        this.EndTime = EndTime;
-        this.MinutesDuration = MinutesDuration;
-        this.PatientReference = PatientReference;
-        this.PractitionerReference = PractitionerReference;
-        this.Created = Created;
-        this.Comment = Comment;
-    }
-}
diff --git a/Scheduling/Scheduling.Api/Generated/GetAppointmentsByPractitioner.g.cs b/Scheduling/Scheduling.Api/Generated/GetAppointmentsByPractitioner.g.cs
deleted file mode 100644
index 3d705c7..0000000
--- a/Scheduling/Scheduling.Api/Generated/GetAppointmentsByPractitioner.g.cs
+++ /dev/null
@@ -1,151 +0,0 @@
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated;
-
-/// <summary>
-/// Extension methods for 'GetAppointmentsByPractitioner'.
-/// </summary>
-public static partial class GetAppointmentsByPractitionerExtensions
-{
-    /// <summary>
-    /// Executes 'GetAppointmentsByPractitioner.sql' and maps results.
-    /// </summary>
-    /// <param name="connection">Open NpgsqlConnection connection.</param>
-    /// <param name="practitionerReference">Query parameter.</param>
-    /// <returns>Result of records or SQL error.</returns>
-    public static async Task<Result<ImmutableList<GetAppointmentsByPractitioner>, SqlError>> GetAppointmentsByPractitionerAsync(this NpgsqlConnection connection, object practitionerReference)
-    {
-        const string sql = @"SELECT fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment FROM fhir_Appointment WHERE fhir_Appointment.PractitionerReference = @practitionerReference AND fhir_Appointment.Status = 'booked' ORDER BY fhir_Appointment.StartTime ";
-
-        try
-        {
-            var results = ImmutableList.CreateBuilder<GetAppointmentsByPractitioner>();
-
-            using (var command = new NpgsqlCommand(sql, connection))
-            {
-                if (practitionerReference is not null and not DBNull)
-                    command.Parameters.AddWithValue("@practitionerReference", practitionerReference);
-                else
-                    command.Parameters.Add(new NpgsqlParameter("@practitionerReference", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value });
-
-                using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false))
-                {
-                    while (await reader.ReadAsync().ConfigureAwait(false))
-                    {
-                        var item = new GetAppointmentsByPractitioner(
-                            reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0),
-                            reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1),
-                            reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2),
-                            reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3),
-                            reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4),
-                            reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5),
-                            reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6),
-                            reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7),
-                            reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8),
-                            reader.IsDBNull(9) ? default(long) : reader.GetFieldValue<long>(9),
-                            reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10),
-                            reader.IsDBNull(11) ? null : reader.GetFieldValue<string>(11),
-                            reader.IsDBNull(12) ? null : reader.GetFieldValue<string>(12),
-                            reader.IsDBNull(13) ? null : reader.GetFieldValue<string>(13)
-                        );
-                        results.Add(item);
-                    }
-                }
-            }
-
-            return new Result<ImmutableList<GetAppointmentsByPractitioner>, SqlError>.Ok<ImmutableList<GetAppointmentsByPractitioner>, SqlError>(results.ToImmutable());
-        }
-        catch (Exception ex)
-        {
-            return new Result<ImmutableList<GetAppointmentsByPractitioner>, SqlError>.Error<ImmutableList<GetAppointmentsByPractitioner>, SqlError>(new SqlError("Database error", ex));
-        }
-    }
-}
-
-/// <summary>
-/// Result row for 'GetAppointmentsByPractitioner' query.
-/// </summary>
-public record GetAppointmentsByPractitioner
-{
-    /// <summary>Column 'Id'.</summary>
-    public string Id { get; init; }
-
-    /// <summary>Column 'Status'.</summary>
-    public string Status { get; init; }
-
-    /// <summary>Column 'ServiceCategory'.</summary>
-    public string ServiceCategory { get; init; }
-
-    /// <summary>Column 'ServiceType'.</summary>
-    public string ServiceType { get; init; }
-
-    /// <summary>Column 'ReasonCode'.</summary>
-    public string ReasonCode { get; init; }
-
-    /// <summary>Column 'Priority'.</summary>
-    public string Priority { get; init; }
-
-    /// <summary>Column 'Description'.</summary>
-    public string Description { get; init; }
-
-    /// <summary>Column 'StartTime'.</summary>
-    public string StartTime { get; init; }
-
-    /// <summary>Column 'EndTime'.</summary>
-    public string EndTime { get; init; }
-
-    /// <summary>Column 'MinutesDuration'.</summary>
-    public long MinutesDuration { get; init; }
-
-    /// <summary>Column 'PatientReference'.</summary>
-    public string PatientReference { get; init; }
-
-    /// <summary>Column 'PractitionerReference'.</summary>
-    public string PractitionerReference { get; init; }
-
-    /// <summary>Column 'Created'.</summary>
-    public string Created { get; init; }
-
-    /// <summary>Column 'Comment'.</summary>
-    public string Comment { get; init; }
-
-    /// <summary>Initializes a new instance of GetAppointmentsByPractitioner.</summary>
-    public GetAppointmentsByPractitioner(
-        string Id,
-        string Status,
-        string ServiceCategory,
-        string ServiceType,
-        string ReasonCode,
-        string Priority,
-        string Description,
-        string StartTime,
-        string EndTime,
-        long MinutesDuration,
-        string PatientReference,
-        string PractitionerReference,
-        string Created,
-        string Comment
-    )
-    {
-        this.Id = Id;
-        this.Status = Status;
-        this.ServiceCategory = ServiceCategory;
-        this.ServiceType = ServiceType;
-        this.ReasonCode = ReasonCode;
-        this.Priority = Priority;
-        this.Description = Description;
-        this.StartTime = StartTime;
-        this.EndTime = EndTime;
-        this.MinutesDuration = MinutesDuration;
-        this.PatientReference = PatientReference;
-        this.PractitionerReference = PractitionerReference;
-        this.Created = Created;
-        this.Comment = Comment;
-    }
-}
diff --git a/Scheduling/Scheduling.Api/Generated/GetAppointmentsByStatus.g.cs b/Scheduling/Scheduling.Api/Generated/GetAppointmentsByStatus.g.cs
deleted file mode 100644
index da0f10e..0000000
--- a/Scheduling/Scheduling.Api/Generated/GetAppointmentsByStatus.g.cs
+++ /dev/null
@@ -1,131 +0,0 @@
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated;
-
-/// <summary>
-/// Extension methods for 'GetAppointmentsByStatus'.
-/// </summary>
-public static partial class GetAppointmentsByStatusExtensions
-{
-    /// <summary>
-    /// Executes 'GetAppointmentsByStatus.sql' and maps results.
-    /// </summary>
-    /// <param name="connection">Open NpgsqlConnection connection.</param>
-    /// <param name="status">Query parameter.</param>
-    /// <param name="dateStart">Query parameter.</param>
-    /// <param name="dateEnd">Query parameter.</param>
-    /// <returns>Result of records or SQL error.</returns>
-    public static async Task<Result<ImmutableList<GetAppointmentsByStatus>, SqlError>> GetAppointmentsByStatusAsync(this NpgsqlConnection connection, object status, object dateStart, object dateEnd)
-    {
-        const string sql = @"SELECT fhir_Appointment.Id, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.Status, sync_ScheduledPatient.DisplayName, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode FROM fhir_Appointment INNER JOIN sync_ScheduledPatient ON fhir_Appointment.PatientReference = sync_ScheduledPatient.PatientId INNER JOIN fhir_Practitioner ON fhir_Appointment.PractitionerReference = fhir_Practitioner.Id WHERE fhir_Appointment.Status = @status AND fhir_Appointment.StartTime >= @dateStart AND fhir_Appointment.StartTime < @dateEnd ORDER BY fhir_Appointment.StartTime ";
-
-        try
-        {
-            var results = ImmutableList.CreateBuilder<GetAppointmentsByStatus>();
-
-            using (var command = new NpgsqlCommand(sql, connection))
-            {
-                if (status is not null and not DBNull)
-                    command.Parameters.AddWithValue("@status", status);
-                else
-                    command.Parameters.Add(new NpgsqlParameter("@status", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value });
-                if (dateStart is not null and not DBNull)
-                    command.Parameters.AddWithValue("@dateStart", dateStart);
-                else
-                    command.Parameters.Add(new NpgsqlParameter("@dateStart", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value });
-                if (dateEnd is not null and not DBNull)
-                    command.Parameters.AddWithValue("@dateEnd", dateEnd);
-                else
-                    command.Parameters.Add(new NpgsqlParameter("@dateEnd", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value });
-
-                using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false))
-                {
-                    while (await reader.ReadAsync().ConfigureAwait(false))
-                    {
-                        var item = new GetAppointmentsByStatus(
-                            reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0),
-                            reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1),
-                            reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2),
-                            reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3),
-                            reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4),
-                            reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5),
-                            reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6),
-                            reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7),
-                            reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8)
-                        );
-                        results.Add(item);
-                    }
-                }
-            }
-
-            return new Result<ImmutableList<GetAppointmentsByStatus>, SqlError>.Ok<ImmutableList<GetAppointmentsByStatus>, SqlError>(results.ToImmutable());
-        }
-        catch (Exception ex)
-        {
-            return new Result<ImmutableList<GetAppointmentsByStatus>, SqlError>.Error<ImmutableList<GetAppointmentsByStatus>, SqlError>(new SqlError("Database error", ex));
-        }
-    }
-}
-
-/// <summary>
-/// Result row for 'GetAppointmentsByStatus' query.
-/// </summary>
-public record GetAppointmentsByStatus
-{
-    /// <summary>Column 'Id'.</summary>
-    public string Id { get; init; }
-
-    /// <summary>Column 'StartTime'.</summary>
-    public string StartTime { get; init; }
-
-    /// <summary>Column 'EndTime'.</summary>
-    public string EndTime { get; init; }
-
-    /// <summary>Column 'Status'.</summary>
-    public string Status { get; init; }
-
-    /// <summary>Column 'DisplayName'.</summary>
-    public string DisplayName { get; init; }
-
-    /// <summary>Column 'NameFamily'.</summary>
-    public string NameFamily { get; init; }
-
-    /// <summary>Column 'NameGiven'.</summary>
-    public string NameGiven { get; init; }
-
-    /// <summary>Column 'ServiceType'.</summary>
-    public string ServiceType { get; init; }
-
-    /// <summary>Column 'ReasonCode'.</summary>
-    public string ReasonCode { get; init; }
-
-    /// <summary>Initializes a new instance of GetAppointmentsByStatus.</summary>
-    public GetAppointmentsByStatus(
-        string Id,
-        string StartTime,
-        string EndTime,
-        string Status,
-        string DisplayName,
-        string NameFamily,
-        string NameGiven,
-        string ServiceType,
-        string ReasonCode
-    )
-    {
-        this.Id = Id;
-        this.StartTime = StartTime;
-        this.EndTime = EndTime;
-        this.Status = Status;
-        this.DisplayName = DisplayName;
-        this.NameFamily = NameFamily;
-        this.NameGiven = NameGiven;
-        this.ServiceType = ServiceType;
-        this.ReasonCode = ReasonCode;
-    }
-}
diff --git a/Scheduling/Scheduling.Api/Generated/GetAvailableSlots.g.cs b/Scheduling/Scheduling.Api/Generated/GetAvailableSlots.g.cs
deleted file mode 100644
index 1fde898..0000000
--- a/Scheduling/Scheduling.Api/Generated/GetAvailableSlots.g.cs
+++ /dev/null
@@ -1,107 +0,0 @@
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated;
-
-/// <summary>
-/// Extension methods for 'GetAvailableSlots'.
-/// </summary> -public static partial class GetAvailableSlotsExtensions -{ - /// <summary> - /// Executes 'GetAvailableSlots.sql' and maps results. - /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <param name="practitionerRef">Query parameter.</param> - /// <param name="fromDate">Query parameter.</param> - /// <param name="toDate">Query parameter.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<GetAvailableSlots>, SqlError>> GetAvailableSlotsAsync(this NpgsqlConnection connection, object practitionerRef, object fromDate, object toDate) - { - const string sql = @"SELECT fhir_Slot.Id, fhir_Slot.Status, fhir_Slot.StartTime, fhir_Slot.EndTime, fhir_Schedule.PractitionerReference FROM fhir_Slot INNER JOIN fhir_Schedule ON fhir_Slot.ScheduleReference = fhir_Schedule.Id WHERE fhir_Schedule.PractitionerReference = @practitionerRef AND fhir_Slot.Status = 'free' AND fhir_Slot.StartTime >= @fromDate AND fhir_Slot.StartTime < @toDate ORDER BY fhir_Slot.StartTime "; - - try - { - var results = ImmutableList.CreateBuilder<GetAvailableSlots>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - if (practitionerRef is not null and not DBNull) - command.Parameters.AddWithValue("@practitionerRef", practitionerRef); - else - command.Parameters.Add(new NpgsqlParameter("@practitionerRef", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - if (fromDate is not null and not DBNull) - command.Parameters.AddWithValue("@fromDate", fromDate); - else - command.Parameters.Add(new NpgsqlParameter("@fromDate", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - if (toDate is not null and not DBNull) - command.Parameters.AddWithValue("@toDate", toDate); - else - command.Parameters.Add(new NpgsqlParameter("@toDate", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - - using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false)) - 
{ - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new GetAvailableSlots( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1), - reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2), - reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3), - reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<GetAvailableSlots>, SqlError>.Ok<ImmutableList<GetAvailableSlots>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<GetAvailableSlots>, SqlError>.Error<ImmutableList<GetAvailableSlots>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'GetAvailableSlots' query. -/// </summary> -public record GetAvailableSlots -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'Status'.</summary> - public string Status { get; init; } - - /// <summary>Column 'StartTime'.</summary> - public string StartTime { get; init; } - - /// <summary>Column 'EndTime'.</summary> - public string EndTime { get; init; } - - /// <summary>Column 'PractitionerReference'.</summary> - public string PractitionerReference { get; init; } - - /// <summary>Initializes a new instance of GetAvailableSlots.</summary> - public GetAvailableSlots( - string Id, - string Status, - string StartTime, - string EndTime, - string PractitionerReference - ) - { - this.Id = Id; - this.Status = Status; - this.StartTime = StartTime; - this.EndTime = EndTime; - this.PractitionerReference = PractitionerReference; - } -} diff --git a/Scheduling/Scheduling.Api/Generated/GetPractitionerById.g.cs b/Scheduling/Scheduling.Api/Generated/GetPractitionerById.g.cs deleted file mode 100644 index 76bdf39..0000000 --- a/Scheduling/Scheduling.Api/Generated/GetPractitionerById.g.cs +++ /dev/null @@ -1,121 +0,0 @@ -using 
System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'GetPractitionerById'. -/// </summary> -public static partial class GetPractitionerByIdExtensions -{ - /// <summary> - /// Executes 'GetPractitionerById.sql' and maps results. - /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <param name="id">Query parameter.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<GetPractitionerById>, SqlError>> GetPractitionerByIdAsync(this NpgsqlConnection connection, object id) - { - const string sql = @"SELECT fhir_Practitioner.Id, fhir_Practitioner.Identifier, fhir_Practitioner.Active, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Practitioner.Qualification, fhir_Practitioner.Specialty, fhir_Practitioner.TelecomEmail, fhir_Practitioner.TelecomPhone FROM fhir_Practitioner WHERE fhir_Practitioner.Id = @id"; - - try - { - var results = ImmutableList.CreateBuilder<GetPractitionerById>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - if (id is not null and not DBNull) - command.Parameters.AddWithValue("@id", id); - else - command.Parameters.Add(new NpgsqlParameter("@id", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - - using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false)) - { - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new GetPractitionerById( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1), - reader.IsDBNull(2) ? default(long) : reader.GetFieldValue<long>(2), - reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3), - reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4), - reader.IsDBNull(5) ? 
null : reader.GetFieldValue<string>(5), - reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6), - reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7), - reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<GetPractitionerById>, SqlError>.Ok<ImmutableList<GetPractitionerById>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<GetPractitionerById>, SqlError>.Error<ImmutableList<GetPractitionerById>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'GetPractitionerById' query. -/// </summary> -public record GetPractitionerById -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'Identifier'.</summary> - public string Identifier { get; init; } - - /// <summary>Column 'Active'.</summary> - public long Active { get; init; } - - /// <summary>Column 'NameFamily'.</summary> - public string NameFamily { get; init; } - - /// <summary>Column 'NameGiven'.</summary> - public string NameGiven { get; init; } - - /// <summary>Column 'Qualification'.</summary> - public string Qualification { get; init; } - - /// <summary>Column 'Specialty'.</summary> - public string Specialty { get; init; } - - /// <summary>Column 'TelecomEmail'.</summary> - public string TelecomEmail { get; init; } - - /// <summary>Column 'TelecomPhone'.</summary> - public string TelecomPhone { get; init; } - - /// <summary>Initializes a new instance of GetPractitionerById.</summary> - public GetPractitionerById( - string Id, - string Identifier, - long Active, - string NameFamily, - string NameGiven, - string Qualification, - string Specialty, - string TelecomEmail, - string TelecomPhone - ) - { - this.Id = Id; - this.Identifier = Identifier; - this.Active = Active; - this.NameFamily = NameFamily; - this.NameGiven = NameGiven; - this.Qualification = Qualification; - this.Specialty 
= Specialty; - this.TelecomEmail = TelecomEmail; - this.TelecomPhone = TelecomPhone; - } -} diff --git a/Scheduling/Scheduling.Api/Generated/GetProviderAvailability.g.cs b/Scheduling/Scheduling.Api/Generated/GetProviderAvailability.g.cs deleted file mode 100644 index 20eb91d..0000000 --- a/Scheduling/Scheduling.Api/Generated/GetProviderAvailability.g.cs +++ /dev/null @@ -1,103 +0,0 @@ -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'GetProviderAvailability'. -/// </summary> -public static partial class GetProviderAvailabilityExtensions -{ - /// <summary> - /// Executes 'GetProviderAvailability.sql' and maps results. - /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <param name="practitionerRef">Query parameter.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<GetProviderAvailability>, SqlError>> GetProviderAvailabilityAsync(this NpgsqlConnection connection, object practitionerRef) - { - const string sql = @"SELECT fhir_Schedule.Id, fhir_Schedule.PractitionerReference, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Schedule.PlanningHorizon, fhir_Schedule.Active FROM fhir_Schedule INNER JOIN fhir_Practitioner ON fhir_Schedule.PractitionerReference = fhir_Practitioner.Id WHERE fhir_Schedule.PractitionerReference = @practitionerRef AND fhir_Schedule.Active = 1"; - - try - { - var results = ImmutableList.CreateBuilder<GetProviderAvailability>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - if (practitionerRef is not null and not DBNull) - command.Parameters.AddWithValue("@practitionerRef", practitionerRef); - else - command.Parameters.Add(new NpgsqlParameter("@practitionerRef", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - - 
using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false)) - { - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new GetProviderAvailability( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1), - reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2), - reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3), - reader.IsDBNull(4) ? default(long) : reader.GetFieldValue<long>(4), - reader.IsDBNull(5) ? default(long) : reader.GetFieldValue<long>(5) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<GetProviderAvailability>, SqlError>.Ok<ImmutableList<GetProviderAvailability>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<GetProviderAvailability>, SqlError>.Error<ImmutableList<GetProviderAvailability>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'GetProviderAvailability' query. 
-/// </summary> -public record GetProviderAvailability -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'PractitionerReference'.</summary> - public string PractitionerReference { get; init; } - - /// <summary>Column 'NameFamily'.</summary> - public string NameFamily { get; init; } - - /// <summary>Column 'NameGiven'.</summary> - public string NameGiven { get; init; } - - /// <summary>Column 'PlanningHorizon'.</summary> - public long PlanningHorizon { get; init; } - - /// <summary>Column 'Active'.</summary> - public long Active { get; init; } - - /// <summary>Initializes a new instance of GetProviderAvailability.</summary> - public GetProviderAvailability( - string Id, - string PractitionerReference, - string NameFamily, - string NameGiven, - long PlanningHorizon, - long Active - ) - { - this.Id = Id; - this.PractitionerReference = PractitionerReference; - this.NameFamily = NameFamily; - this.NameGiven = NameGiven; - this.PlanningHorizon = PlanningHorizon; - this.Active = Active; - } -} diff --git a/Scheduling/Scheduling.Api/Generated/GetProviderDailySchedule.g.cs b/Scheduling/Scheduling.Api/Generated/GetProviderDailySchedule.g.cs deleted file mode 100644 index 437fc7f..0000000 --- a/Scheduling/Scheduling.Api/Generated/GetProviderDailySchedule.g.cs +++ /dev/null @@ -1,161 +0,0 @@ -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'GetProviderDailySchedule'. -/// </summary> -public static partial class GetProviderDailyScheduleExtensions -{ - /// <summary> - /// Executes 'GetProviderDailySchedule.sql' and maps results. 
- /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <param name="practitionerRef">Query parameter.</param> - /// <param name="dateStart">Query parameter.</param> - /// <param name="dateEnd">Query parameter.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<GetProviderDailySchedule>, SqlError>> GetProviderDailyScheduleAsync(this NpgsqlConnection connection, object practitionerRef, object dateStart, object dateEnd) - { - const string sql = @"SELECT fhir_Appointment.Id, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Description, fhir_Appointment.PatientReference, sync_ScheduledPatient.PatientId, sync_ScheduledPatient.DisplayName, sync_ScheduledPatient.ContactPhone, fhir_Appointment.PractitionerReference FROM fhir_Appointment INNER JOIN sync_ScheduledPatient ON fhir_Appointment.PatientReference = sync_ScheduledPatient.PatientId WHERE fhir_Appointment.PractitionerReference = @practitionerRef AND fhir_Appointment.StartTime >= @dateStart AND fhir_Appointment.StartTime < @dateEnd ORDER BY fhir_Appointment.StartTime "; - - try - { - var results = ImmutableList.CreateBuilder<GetProviderDailySchedule>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - if (practitionerRef is not null and not DBNull) - command.Parameters.AddWithValue("@practitionerRef", practitionerRef); - else - command.Parameters.Add(new NpgsqlParameter("@practitionerRef", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - if (dateStart is not null and not DBNull) - command.Parameters.AddWithValue("@dateStart", dateStart); - else - command.Parameters.Add(new NpgsqlParameter("@dateStart", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - if (dateEnd is not null and not DBNull) - 
command.Parameters.AddWithValue("@dateEnd", dateEnd); - else - command.Parameters.Add(new NpgsqlParameter("@dateEnd", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - - using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false)) - { - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new GetProviderDailySchedule( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1), - reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2), - reader.IsDBNull(3) ? default(long) : reader.GetFieldValue<long>(3), - reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4), - reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5), - reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6), - reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7), - reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8), - reader.IsDBNull(9) ? null : reader.GetFieldValue<string>(9), - reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10), - reader.IsDBNull(11) ? null : reader.GetFieldValue<string>(11), - reader.IsDBNull(12) ? null : reader.GetFieldValue<string>(12), - reader.IsDBNull(13) ? null : reader.GetFieldValue<string>(13) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<GetProviderDailySchedule>, SqlError>.Ok<ImmutableList<GetProviderDailySchedule>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<GetProviderDailySchedule>, SqlError>.Error<ImmutableList<GetProviderDailySchedule>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'GetProviderDailySchedule' query. 
-/// </summary> -public record GetProviderDailySchedule -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'StartTime'.</summary> - public string StartTime { get; init; } - - /// <summary>Column 'EndTime'.</summary> - public string EndTime { get; init; } - - /// <summary>Column 'MinutesDuration'.</summary> - public long MinutesDuration { get; init; } - - /// <summary>Column 'Status'.</summary> - public string Status { get; init; } - - /// <summary>Column 'ServiceCategory'.</summary> - public string ServiceCategory { get; init; } - - /// <summary>Column 'ServiceType'.</summary> - public string ServiceType { get; init; } - - /// <summary>Column 'ReasonCode'.</summary> - public string ReasonCode { get; init; } - - /// <summary>Column 'Description'.</summary> - public string Description { get; init; } - - /// <summary>Column 'PatientReference'.</summary> - public string PatientReference { get; init; } - - /// <summary>Column 'PatientId'.</summary> - public string PatientId { get; init; } - - /// <summary>Column 'DisplayName'.</summary> - public string DisplayName { get; init; } - - /// <summary>Column 'ContactPhone'.</summary> - public string ContactPhone { get; init; } - - /// <summary>Column 'PractitionerReference'.</summary> - public string PractitionerReference { get; init; } - - /// <summary>Initializes a new instance of GetProviderDailySchedule.</summary> - public GetProviderDailySchedule( - string Id, - string StartTime, - string EndTime, - long MinutesDuration, - string Status, - string ServiceCategory, - string ServiceType, - string ReasonCode, - string Description, - string PatientReference, - string PatientId, - string DisplayName, - string ContactPhone, - string PractitionerReference - ) - { - this.Id = Id; - this.StartTime = StartTime; - this.EndTime = EndTime; - this.MinutesDuration = MinutesDuration; - this.Status = Status; - this.ServiceCategory = ServiceCategory; - this.ServiceType = ServiceType; - 
this.ReasonCode = ReasonCode; - this.Description = Description; - this.PatientReference = PatientReference; - this.PatientId = PatientId; - this.DisplayName = DisplayName; - this.ContactPhone = ContactPhone; - this.PractitionerReference = PractitionerReference; - } -} diff --git a/Scheduling/Scheduling.Api/Generated/GetUpcomingAppointments.g.cs b/Scheduling/Scheduling.Api/Generated/GetUpcomingAppointments.g.cs deleted file mode 100644 index 338564a..0000000 --- a/Scheduling/Scheduling.Api/Generated/GetUpcomingAppointments.g.cs +++ /dev/null @@ -1,146 +0,0 @@ -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'GetUpcomingAppointments'. -/// </summary> -public static partial class GetUpcomingAppointmentsExtensions -{ - /// <summary> - /// Executes 'GetUpcomingAppointments.sql' and maps results. - /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<GetUpcomingAppointments>, SqlError>> GetUpcomingAppointmentsAsync(this NpgsqlConnection connection) - { - const string sql = @"SELECT fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment FROM fhir_Appointment WHERE fhir_Appointment.Status = 'booked' ORDER BY fhir_Appointment.StartTime "; - - try - { - var results = ImmutableList.CreateBuilder<GetUpcomingAppointments>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - - using (var reader = 
await command.ExecuteReaderAsync().ConfigureAwait(false)) - { - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new GetUpcomingAppointments( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1), - reader.IsDBNull(2) ? null : reader.GetFieldValue<string>(2), - reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3), - reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4), - reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5), - reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6), - reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7), - reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8), - reader.IsDBNull(9) ? default(long) : reader.GetFieldValue<long>(9), - reader.IsDBNull(10) ? null : reader.GetFieldValue<string>(10), - reader.IsDBNull(11) ? null : reader.GetFieldValue<string>(11), - reader.IsDBNull(12) ? null : reader.GetFieldValue<string>(12), - reader.IsDBNull(13) ? null : reader.GetFieldValue<string>(13) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<GetUpcomingAppointments>, SqlError>.Ok<ImmutableList<GetUpcomingAppointments>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<GetUpcomingAppointments>, SqlError>.Error<ImmutableList<GetUpcomingAppointments>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'GetUpcomingAppointments' query. 
-/// </summary> -public record GetUpcomingAppointments -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'Status'.</summary> - public string Status { get; init; } - - /// <summary>Column 'ServiceCategory'.</summary> - public string ServiceCategory { get; init; } - - /// <summary>Column 'ServiceType'.</summary> - public string ServiceType { get; init; } - - /// <summary>Column 'ReasonCode'.</summary> - public string ReasonCode { get; init; } - - /// <summary>Column 'Priority'.</summary> - public string Priority { get; init; } - - /// <summary>Column 'Description'.</summary> - public string Description { get; init; } - - /// <summary>Column 'StartTime'.</summary> - public string StartTime { get; init; } - - /// <summary>Column 'EndTime'.</summary> - public string EndTime { get; init; } - - /// <summary>Column 'MinutesDuration'.</summary> - public long MinutesDuration { get; init; } - - /// <summary>Column 'PatientReference'.</summary> - public string PatientReference { get; init; } - - /// <summary>Column 'PractitionerReference'.</summary> - public string PractitionerReference { get; init; } - - /// <summary>Column 'Created'.</summary> - public string Created { get; init; } - - /// <summary>Column 'Comment'.</summary> - public string Comment { get; init; } - - /// <summary>Initializes a new instance of GetUpcomingAppointments.</summary> - public GetUpcomingAppointments( - string Id, - string Status, - string ServiceCategory, - string ServiceType, - string ReasonCode, - string Priority, - string Description, - string StartTime, - string EndTime, - long MinutesDuration, - string PatientReference, - string PractitionerReference, - string Created, - string Comment - ) - { - this.Id = Id; - this.Status = Status; - this.ServiceCategory = ServiceCategory; - this.ServiceType = ServiceType; - this.ReasonCode = ReasonCode; - this.Priority = Priority; - this.Description = Description; - this.StartTime = StartTime; - this.EndTime 
= EndTime; - this.MinutesDuration = MinutesDuration; - this.PatientReference = PatientReference; - this.PractitionerReference = PractitionerReference; - this.Created = Created; - this.Comment = Comment; - } -} diff --git a/Scheduling/Scheduling.Api/Generated/SearchPractitionersBySpecialty.g.cs b/Scheduling/Scheduling.Api/Generated/SearchPractitionersBySpecialty.g.cs deleted file mode 100644 index c28dc12..0000000 --- a/Scheduling/Scheduling.Api/Generated/SearchPractitionersBySpecialty.g.cs +++ /dev/null @@ -1,121 +0,0 @@ -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - -namespace Generated; - -/// <summary> -/// Extension methods for 'SearchPractitionersBySpecialty'. -/// </summary> -public static partial class SearchPractitionersBySpecialtyExtensions -{ - /// <summary> - /// Executes 'SearchPractitionersBySpecialty.sql' and maps results. - /// </summary> - /// <param name="connection">Open NpgsqlConnection connection.</param> - /// <param name="specialty">Query parameter.</param> - /// <returns>Result of records or SQL error.</returns> - public static async Task<Result<ImmutableList<SearchPractitionersBySpecialty>, SqlError>> SearchPractitionersBySpecialtyAsync(this NpgsqlConnection connection, object specialty) - { - const string sql = @"SELECT fhir_Practitioner.Id, fhir_Practitioner.Identifier, fhir_Practitioner.Active, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Practitioner.Qualification, fhir_Practitioner.Specialty, fhir_Practitioner.TelecomEmail, fhir_Practitioner.TelecomPhone FROM fhir_Practitioner WHERE fhir_Practitioner.Specialty LIKE '%' || @specialty || '%' ORDER BY fhir_Practitioner.NameFamily , fhir_Practitioner.NameGiven "; - - try - { - var results = ImmutableList.CreateBuilder<SearchPractitionersBySpecialty>(); - - using (var command = new NpgsqlCommand(sql, connection)) - { - if (specialty is not null and not 
DBNull) - command.Parameters.AddWithValue("@specialty", specialty); - else - command.Parameters.Add(new NpgsqlParameter("@specialty", NpgsqlTypes.NpgsqlDbType.Text) { Value = DBNull.Value }); - - using (var reader = await command.ExecuteReaderAsync().ConfigureAwait(false)) - { - while (await reader.ReadAsync().ConfigureAwait(false)) - { - var item = new SearchPractitionersBySpecialty( - reader.IsDBNull(0) ? null : reader.GetFieldValue<string>(0), - reader.IsDBNull(1) ? null : reader.GetFieldValue<string>(1), - reader.IsDBNull(2) ? default(long) : reader.GetFieldValue<long>(2), - reader.IsDBNull(3) ? null : reader.GetFieldValue<string>(3), - reader.IsDBNull(4) ? null : reader.GetFieldValue<string>(4), - reader.IsDBNull(5) ? null : reader.GetFieldValue<string>(5), - reader.IsDBNull(6) ? null : reader.GetFieldValue<string>(6), - reader.IsDBNull(7) ? null : reader.GetFieldValue<string>(7), - reader.IsDBNull(8) ? null : reader.GetFieldValue<string>(8) - ); - results.Add(item); - } - } - } - - return new Result<ImmutableList<SearchPractitionersBySpecialty>, SqlError>.Ok<ImmutableList<SearchPractitionersBySpecialty>, SqlError>(results.ToImmutable()); - } - catch (Exception ex) - { - return new Result<ImmutableList<SearchPractitionersBySpecialty>, SqlError>.Error<ImmutableList<SearchPractitionersBySpecialty>, SqlError>(new SqlError("Database error", ex)); - } - } -} - -/// <summary> -/// Result row for 'SearchPractitionersBySpecialty' query. 
-/// </summary> -public record SearchPractitionersBySpecialty -{ - /// <summary>Column 'Id'.</summary> - public string Id { get; init; } - - /// <summary>Column 'Identifier'.</summary> - public string Identifier { get; init; } - - /// <summary>Column 'Active'.</summary> - public long Active { get; init; } - - /// <summary>Column 'NameFamily'.</summary> - public string NameFamily { get; init; } - - /// <summary>Column 'NameGiven'.</summary> - public string NameGiven { get; init; } - - /// <summary>Column 'Qualification'.</summary> - public string Qualification { get; init; } - - /// <summary>Column 'Specialty'.</summary> - public string Specialty { get; init; } - - /// <summary>Column 'TelecomEmail'.</summary> - public string TelecomEmail { get; init; } - - /// <summary>Column 'TelecomPhone'.</summary> - public string TelecomPhone { get; init; } - - /// <summary>Initializes a new instance of SearchPractitionersBySpecialty.</summary> - public SearchPractitionersBySpecialty( - string Id, - string Identifier, - long Active, - string NameFamily, - string NameGiven, - string Qualification, - string Specialty, - string TelecomEmail, - string TelecomPhone - ) - { - this.Id = Id; - this.Identifier = Identifier; - this.Active = Active; - this.NameFamily = NameFamily; - this.NameGiven = NameGiven; - this.Qualification = Qualification; - this.Specialty = Specialty; - this.TelecomEmail = TelecomEmail; - this.TelecomPhone = TelecomPhone; - } -} diff --git a/Scheduling/Scheduling.Api/Generated/fhir_AppointmentOperations.g.cs b/Scheduling/Scheduling.Api/Generated/fhir_AppointmentOperations.g.cs deleted file mode 100644 index 5b36202..0000000 --- a/Scheduling/Scheduling.Api/Generated/fhir_AppointmentOperations.g.cs +++ /dev/null @@ -1,60 +0,0 @@ -#nullable enable -using System; -using System.Collections.Generic; -using System.Collections.Immutable; -using System.Data; -using System.Globalization; -using System.Threading.Tasks; -using Npgsql; -using Outcome; -using Selecta; - 
-namespace Generated
-{
-    /// <summary>
-    /// Extension methods for table operations on fhir_Appointment
-    /// </summary>
-    public static partial class fhir_AppointmentExtensions
-    {
-
-        /// <summary>
-        /// Inserts a new row into the fhir_Appointment table.
-        /// </summary>
-        public static async Task<Result<int, SqlError>> Insertfhir_AppointmentAsync(this IDbTransaction transaction, string? id, string? status, string? servicecategory, string? servicetype, string? reasoncode, string? priority, string? description, string? starttime, string? endtime, long? minutesduration, string? patientreference, string? practitionerreference, string? created, string? comment)
-        {
-            const string sql = "INSERT INTO fhir_Appointment (Id, Status, ServiceCategory, ServiceType, ReasonCode, Priority, Description, StartTime, EndTime, MinutesDuration, PatientReference, PractitionerReference, Created, Comment) VALUES (@Id, @Status, @ServiceCategory, @ServiceType, @ReasonCode, @Priority, @Description, @StartTime, @EndTime, @MinutesDuration, @PatientReference, @PractitionerReference, @Created, @Comment)";
-
-            if (transaction.Connection is null)
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Transaction has no connection"));
-
-            try
-            {
-                using (var command = new NpgsqlCommand(sql, (NpgsqlConnection)transaction.Connection!, (NpgsqlTransaction)transaction))
-                {
-                    command.Parameters.AddWithValue("@Id", id ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Status", status ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@ServiceCategory", servicecategory ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@ServiceType", servicetype ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@ReasonCode", reasoncode ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Priority", priority ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Description", description ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@StartTime", starttime ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@EndTime", endtime ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@MinutesDuration", minutesduration ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@PatientReference", patientreference ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@PractitionerReference", practitionerreference ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Created", created ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Comment", comment ?? (object)DBNull.Value);
-
-                    var rowsAffected = await command.ExecuteNonQueryAsync().ConfigureAwait(false);
-                    return new Result<int, SqlError>.Ok<int, SqlError>(rowsAffected);
-                }
-            }
-            catch (Exception ex)
-            {
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Insert failed", ex));
-            }
-        }
-
-    }
-}
diff --git a/Scheduling/Scheduling.Api/Generated/fhir_PractitionerOperations.g.cs b/Scheduling/Scheduling.Api/Generated/fhir_PractitionerOperations.g.cs
deleted file mode 100644
index 29bd56d..0000000
--- a/Scheduling/Scheduling.Api/Generated/fhir_PractitionerOperations.g.cs
+++ /dev/null
@@ -1,55 +0,0 @@
-#nullable enable
-using System;
-using System.Collections.Generic;
-using System.Collections.Immutable;
-using System.Data;
-using System.Globalization;
-using System.Threading.Tasks;
-using Npgsql;
-using Outcome;
-using Selecta;
-
-namespace Generated
-{
-    /// <summary>
-    /// Extension methods for table operations on fhir_Practitioner
-    /// </summary>
-    public static partial class fhir_PractitionerExtensions
-    {
-
-        /// <summary>
-        /// Inserts a new row into the fhir_Practitioner table.
-        /// </summary>
-        public static async Task<Result<int, SqlError>> Insertfhir_PractitionerAsync(this IDbTransaction transaction, string? id, string? identifier, long? active, string? namefamily, string? namegiven, string? qualification, string? specialty, string? telecomemail, string? telecomphone)
-        {
-            const string sql = "INSERT INTO fhir_Practitioner (Id, Identifier, Active, NameFamily, NameGiven, Qualification, Specialty, TelecomEmail, TelecomPhone) VALUES (@Id, @Identifier, @Active, @NameFamily, @NameGiven, @Qualification, @Specialty, @TelecomEmail, @TelecomPhone)";
-
-            if (transaction.Connection is null)
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Transaction has no connection"));
-
-            try
-            {
-                using (var command = new NpgsqlCommand(sql, (NpgsqlConnection)transaction.Connection!, (NpgsqlTransaction)transaction))
-                {
-                    command.Parameters.AddWithValue("@Id", id ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Identifier", identifier ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Active", active ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@NameFamily", namefamily ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@NameGiven", namegiven ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Qualification", qualification ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@Specialty", specialty ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@TelecomEmail", telecomemail ?? (object)DBNull.Value);
-                    command.Parameters.AddWithValue("@TelecomPhone", telecomphone ?? (object)DBNull.Value);
-
-                    var rowsAffected = await command.ExecuteNonQueryAsync().ConfigureAwait(false);
-                    return new Result<int, SqlError>.Ok<int, SqlError>(rowsAffected);
-                }
-            }
-            catch (Exception ex)
-            {
-                return new Result<int, SqlError>.Error<int, SqlError>(new SqlError("Insert failed", ex));
-            }
-        }
-
-    }
-}
diff --git a/Scheduling/Scheduling.Api/GlobalUsings.cs b/Scheduling/Scheduling.Api/GlobalUsings.cs
index 3a661bb..878b848 100644
--- a/Scheduling/Scheduling.Api/GlobalUsings.cs
+++ b/Scheduling/Scheduling.Api/GlobalUsings.cs
@@ -1,116 +1,149 @@
 global using System;
 global using Generated;
 global using Microsoft.Extensions.Logging;
+global using Nimblesite.Sql.Model;
+global using Nimblesite.Sync.Core;
+global using Nimblesite.Sync.Postgres;
 global using Npgsql;
 global using Outcome;
-global using Selecta;
-global using Sync;
-global using Sync.Postgres;
 
 // Sync result type aliases
-global using BoolSyncError = Outcome.Result<bool, Sync.SyncError>.Error<bool, Sync.SyncError>;
+global using BoolSyncError = Outcome.Result<bool, Nimblesite.Sync.Core.SyncError>.Error<
+    bool,
+    Nimblesite.Sync.Core.SyncError
+>;
 global using GetAllPractitionersError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAllPractitioners>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.GetAllPractitioners>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 
 // GetAllPractitioners query result type aliases
 global using GetAllPractitionersOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAllPractitioners>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetAllPractitioners>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetAllPractitioners>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetAppointmentByIdError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAppointmentById>,
-    Selecta.SqlError
->.Error<System.Collections.Immutable.ImmutableList<Generated.GetAppointmentById>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Error<
+    System.Collections.Immutable.ImmutableList<Generated.GetAppointmentById>,
+    Nimblesite.Sql.Model.SqlError
+>;
 
 // GetAppointmentById query result type aliases
 global using GetAppointmentByIdOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAppointmentById>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetAppointmentById>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetAppointmentById>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetAppointmentsByPatientError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAppointmentsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.GetAppointmentsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 
 // GetAppointmentsByPatient query result type aliases
 global using GetAppointmentsByPatientOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAppointmentsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Ok<
     System.Collections.Immutable.ImmutableList<Generated.GetAppointmentsByPatient>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 global using GetAppointmentsByPractitionerError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAppointmentsByPractitioner>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.GetAppointmentsByPractitioner>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 
 // GetAppointmentsByPractitioner query result type aliases
 global using GetAppointmentsByPractitionerOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetAppointmentsByPractitioner>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Ok<
     System.Collections.Immutable.ImmutableList<Generated.GetAppointmentsByPractitioner>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 global using GetPractitionerByIdError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetPractitionerById>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.GetPractitionerById>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 
 // GetPractitionerById query result type aliases
 global using GetPractitionerByIdOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetPractitionerById>,
-    Selecta.SqlError
->.Ok<System.Collections.Immutable.ImmutableList<Generated.GetPractitionerById>, Selecta.SqlError>;
+    Nimblesite.Sql.Model.SqlError
+>.Ok<
+    System.Collections.Immutable.ImmutableList<Generated.GetPractitionerById>,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using GetUpcomingAppointmentsError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetUpcomingAppointments>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.GetUpcomingAppointments>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 
 // GetUpcomingAppointments query result type aliases
 global using GetUpcomingAppointmentsOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.GetUpcomingAppointments>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Ok<
     System.Collections.Immutable.ImmutableList<Generated.GetUpcomingAppointments>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
+>;
+global using InsertError = Outcome.Result<System.Guid?, Nimblesite.Sql.Model.SqlError>.Error<
+    System.Guid?,
+    Nimblesite.Sql.Model.SqlError
 >;
-global using InsertError = Outcome.Result<int, Selecta.SqlError>.Error<int, Selecta.SqlError>;
 
 // Insert result type aliases
-global using InsertOk = Outcome.Result<int, Selecta.SqlError>.Ok<int, Selecta.SqlError>;
+global using InsertOk = Outcome.Result<System.Guid?, Nimblesite.Sql.Model.SqlError>.Ok<
+    System.Guid?,
+    Nimblesite.Sql.Model.SqlError
+>;
 global using SearchPractitionersError = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.SearchPractitionersBySpecialty>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Error<
     System.Collections.Immutable.ImmutableList<Generated.SearchPractitionersBySpecialty>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >;
 
 // SearchPractitionersBySpecialty query result type aliases
 global using SearchPractitionersOk = Outcome.Result<
     System.Collections.Immutable.ImmutableList<Generated.SearchPractitionersBySpecialty>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
 >.Ok<
     System.Collections.Immutable.ImmutableList<Generated.SearchPractitionersBySpecialty>,
-    Selecta.SqlError
+    Nimblesite.Sql.Model.SqlError
+>;
+global using StringSyncError = Outcome.Result<string, Nimblesite.Sync.Core.SyncError>.Error<
+    string,
+    Nimblesite.Sync.Core.SyncError
+>;
+global using StringSyncOk = Outcome.Result<string, Nimblesite.Sync.Core.SyncError>.Ok<
+    string,
+    Nimblesite.Sync.Core.SyncError
 >;
-global using StringSyncError = Outcome.Result<string, Sync.SyncError>.Error<string, Sync.SyncError>;
-global using StringSyncOk = Outcome.Result<string, Sync.SyncError>.Ok<string, Sync.SyncError>;
 global using SyncLogListError = Outcome.Result<
-    System.Collections.Generic.IReadOnlyList<Sync.SyncLogEntry>,
-    Sync.SyncError
->.Error<System.Collections.Generic.IReadOnlyList<Sync.SyncLogEntry>, Sync.SyncError>;
+    System.Collections.Generic.IReadOnlyList<Nimblesite.Sync.Core.SyncLogEntry>,
+    Nimblesite.Sync.Core.SyncError
+>.Error<
+    System.Collections.Generic.IReadOnlyList<Nimblesite.Sync.Core.SyncLogEntry>,
+    Nimblesite.Sync.Core.SyncError
+>;
 global using SyncLogListOk = Outcome.Result<
-    System.Collections.Generic.IReadOnlyList<Sync.SyncLogEntry>,
-    Sync.SyncError
->.Ok<System.Collections.Generic.IReadOnlyList<Sync.SyncLogEntry>, Sync.SyncError>;
+    System.Collections.Generic.IReadOnlyList<Nimblesite.Sync.Core.SyncLogEntry>,
+    Nimblesite.Sync.Core.SyncError
+>.Ok<
+    System.Collections.Generic.IReadOnlyList<Nimblesite.Sync.Core.SyncLogEntry>,
+    Nimblesite.Sync.Core.SyncError
+>;
diff --git a/Scheduling/Scheduling.Api/Program.cs b/Scheduling/Scheduling.Api/Program.cs
index 0b405b7..f75146b 100644
--- a/Scheduling/Scheduling.Api/Program.cs
+++ b/Scheduling/Scheduling.Api/Program.cs
@@ -141,10 +141,10 @@
     var id = Guid.NewGuid().ToString();
 
     var result = await transaction
-        .Insertfhir_PractitionerAsync(
+        .Insertfhir_practitionerAsync(
             id,
             request.Identifier,
-            1L,
+            1,
             request.NameFamily,
             request.NameGiven,
             request.Qualification ?? string.Empty,
@@ -201,15 +201,15 @@
     using var cmd = conn.CreateCommand();
     cmd.Transaction = transaction;
     cmd.CommandText = """
-        UPDATE fhir_Practitioner
-        SET NameFamily = @nameFamily,
-            NameGiven = @nameGiven,
-            Qualification = @qualification,
-            Specialty = @specialty,
-            TelecomEmail = @telecomEmail,
-            TelecomPhone = @telecomPhone,
-            Active = @active
-        WHERE Id = @id
+        UPDATE fhir_practitioner
+        SET "NameFamily" = @nameFamily,
+            "NameGiven" = @nameGiven,
+            "Qualification" = @qualification,
+            "Specialty" = @specialty,
+            "TelecomEmail" = @telecomEmail,
+            "TelecomPhone" = @telecomPhone,
+            "Active" = @active
+        WHERE "Id" = @id
         """;
     cmd.Parameters.AddWithValue("@id", id);
    cmd.Parameters.AddWithValue("@nameFamily", request.NameFamily);
@@ -354,7 +354,7 @@ UPDATE fhir_Practitioner
    var durationMinutes = (int)(end - start).TotalMinutes;

    var result = await transaction
-        .Insertfhir_AppointmentAsync(
+        .Insertfhir_appointmentAsync(
             id,
             "booked",
             request.ServiceCategory ?? string.Empty,
@@ -428,20 +428,20 @@ UPDATE fhir_Practitioner
     using var cmd = conn.CreateCommand();
     cmd.Transaction = transaction;
     cmd.CommandText = """
-        UPDATE fhir_Appointment
-        SET ServiceCategory = @serviceCategory,
-            ServiceType = @serviceType,
-            ReasonCode = @reasonCode,
-            Priority = @priority,
-            Description = @description,
-            StartTime = @start,
-            EndTime = @end,
-            MinutesDuration = @duration,
-            PatientReference = @patientRef,
-            PractitionerReference = @practitionerRef,
-            Comment = @comment,
-            Status = @status
-        WHERE Id = @id
+        UPDATE fhir_appointment
+        SET "ServiceCategory" = @serviceCategory,
+            "ServiceType" = @serviceType,
+            "ReasonCode" = @reasonCode,
+            "Priority" = @priority,
+            "Description" = @description,
+            "StartTime" = @start,
+            "EndTime" = @end,
+            "MinutesDuration" = @duration,
+            "PatientReference" = @patientRef,
+            "PractitionerReference" = @practitionerRef,
+            "Comment" = @comment,
+            "Status" = @status
+        WHERE "Id" = @id
         """;
     cmd.Parameters.AddWithValue("@id", id);
     cmd.Parameters.AddWithValue(
@@ -508,7 +508,7 @@ UPDATE fhir_Appointment
     using var cmd = conn.CreateCommand();
     cmd.Transaction = transaction;
 
-    cmd.CommandText = "UPDATE fhir_Appointment SET Status = @status WHERE Id = @id";
+    cmd.CommandText = "UPDATE fhir_appointment SET \"Status\" = @status WHERE \"Id\" = @id";
     cmd.Parameters.AddWithValue("@status", status);
     cmd.Parameters.AddWithValue("@id", id);
@@ -739,7 +739,7 @@ Func<NpgsqlConnection> getConn
     using var conn = getConn();
     using var cmd = conn.CreateCommand();
     cmd.CommandText =
-        "SELECT PatientId, DisplayName, ContactPhone, ContactEmail, SyncedAt FROM sync_ScheduledPatient";
+        "SELECT \"PatientId\", \"DisplayName\", \"ContactPhone\", \"ContactEmail\", \"SyncedAt\" FROM sync_scheduledpatient";
     using var reader = cmd.ExecuteReader();
     var patients = new List<object>();
     while (reader.Read())
diff --git a/Scheduling/Scheduling.Api/Queries/CheckSchedulingConflicts.generated.sql b/Scheduling/Scheduling.Api/Queries/CheckSchedulingConflicts.generated.sql
deleted file mode 100644
index ffc4d3b..0000000
--- a/Scheduling/Scheduling.Api/Queries/CheckSchedulingConflicts.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Appointment.Id, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.Status FROM fhir_Appointment WHERE fhir_Appointment.PractitionerReference = @practitionerRef AND fhir_Appointment.Status != 'cancelled' AND fhir_Appointment.StartTime < @proposedEnd AND fhir_Appointment.EndTime > @proposedStart
\ No newline at end of file
diff --git a/Scheduling/Scheduling.Api/Queries/CheckSchedulingConflicts.lql b/Scheduling/Scheduling.Api/Queries/CheckSchedulingConflicts.lql
index 2885529..85bfc69 100644
--- a/Scheduling/Scheduling.Api/Queries/CheckSchedulingConflicts.lql
+++ b/Scheduling/Scheduling.Api/Queries/CheckSchedulingConflicts.lql
@@ -1,5 +1,5 @@
 -- Check for scheduling conflicts
 -- Parameters: @practitionerRef, @proposedStart, @proposedEnd
-fhir_Appointment
-|> filter(fn(row) => row.fhir_Appointment.PractitionerReference = @practitionerRef and row.fhir_Appointment.Status != 'cancelled' and row.fhir_Appointment.StartTime < @proposedEnd and row.fhir_Appointment.EndTime > @proposedStart)
-|> select(fhir_Appointment.Id, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.Status)
+fhir_appointment
+|> filter(fn(row) => row.fhir_appointment.PractitionerReference = @practitionerRef and row.fhir_appointment.Status != 'cancelled' and row.fhir_appointment.StartTime < @proposedEnd and row.fhir_appointment.EndTime > @proposedStart)
+|> select(fhir_appointment.Id, fhir_appointment.StartTime, fhir_appointment.EndTime, fhir_appointment.Status)
diff --git a/Scheduling/Scheduling.Api/Queries/GetAllPractitioners.generated.sql b/Scheduling/Scheduling.Api/Queries/GetAllPractitioners.generated.sql
deleted file mode 100644
index eb27afd..0000000
--- a/Scheduling/Scheduling.Api/Queries/GetAllPractitioners.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Practitioner.Id, fhir_Practitioner.Identifier, fhir_Practitioner.Active, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Practitioner.Qualification, fhir_Practitioner.Specialty, fhir_Practitioner.TelecomEmail, fhir_Practitioner.TelecomPhone FROM fhir_Practitioner ORDER BY fhir_Practitioner.NameFamily , fhir_Practitioner.NameGiven
\ No newline at end of file
diff --git a/Scheduling/Scheduling.Api/Queries/GetAllPractitioners.lql b/Scheduling/Scheduling.Api/Queries/GetAllPractitioners.lql
index cd02457..e4c2630 100644
--- a/Scheduling/Scheduling.Api/Queries/GetAllPractitioners.lql
+++ b/Scheduling/Scheduling.Api/Queries/GetAllPractitioners.lql
@@ -1,4 +1,4 @@
 -- Get all practitioners
-fhir_Practitioner
-|> select(fhir_Practitioner.Id, fhir_Practitioner.Identifier, fhir_Practitioner.Active, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Practitioner.Qualification, fhir_Practitioner.Specialty, fhir_Practitioner.TelecomEmail, fhir_Practitioner.TelecomPhone)
-|> order_by(fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven)
+fhir_practitioner
+|> select(fhir_practitioner.Id, fhir_practitioner.Identifier, fhir_practitioner.Active, fhir_practitioner.NameFamily, fhir_practitioner.NameGiven, fhir_practitioner.Qualification, fhir_practitioner.Specialty, fhir_practitioner.TelecomEmail, fhir_practitioner.TelecomPhone)
+|> order_by(fhir_practitioner.NameFamily, fhir_practitioner.NameGiven)
diff --git a/Scheduling/Scheduling.Api/Queries/GetAppointmentById.generated.sql b/Scheduling/Scheduling.Api/Queries/GetAppointmentById.generated.sql
deleted file mode 100644
index fdc3a9a..0000000
--- a/Scheduling/Scheduling.Api/Queries/GetAppointmentById.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment FROM fhir_Appointment WHERE fhir_Appointment.Id = @id
\ No newline at end of file
diff --git a/Scheduling/Scheduling.Api/Queries/GetAppointmentById.lql b/Scheduling/Scheduling.Api/Queries/GetAppointmentById.lql
index d12e4a7..3e184e4 100644
--- a/Scheduling/Scheduling.Api/Queries/GetAppointmentById.lql
+++ b/Scheduling/Scheduling.Api/Queries/GetAppointmentById.lql
@@ -1,5 +1,5 @@
 -- Get appointment by ID
 -- Parameters: @id
-fhir_Appointment
-|> filter(fn(row) => row.fhir_Appointment.Id = @id)
-|> select(fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment)
+fhir_appointment
+|> filter(fn(row) => row.fhir_appointment.Id = @id)
+|> select(fhir_appointment.Id, fhir_appointment.Status, fhir_appointment.ServiceCategory, fhir_appointment.ServiceType, fhir_appointment.ReasonCode, fhir_appointment.Priority, fhir_appointment.Description, fhir_appointment.StartTime, fhir_appointment.EndTime, fhir_appointment.MinutesDuration, fhir_appointment.PatientReference, fhir_appointment.PractitionerReference, fhir_appointment.Created, fhir_appointment.Comment)
diff --git a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPatient.generated.sql b/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPatient.generated.sql
deleted file mode 100644
index 685633a..0000000
--- a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPatient.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment FROM fhir_Appointment WHERE fhir_Appointment.PatientReference = @patientReference ORDER BY fhir_Appointment.StartTime DESC
\ No newline at end of file
diff --git a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPatient.lql b/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPatient.lql
index 11bd7cc..391aa79 100644
--- a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPatient.lql
+++ b/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPatient.lql
@@ -1,6 +1,6 @@
 -- Get appointments for a patient
 -- Parameters: @patientReference
-fhir_Appointment
-|> filter(fn(row) => row.fhir_Appointment.PatientReference = @patientReference)
-|> select(fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment)
-|> order_by(fhir_Appointment.StartTime desc)
+fhir_appointment
+|> filter(fn(row) => row.fhir_appointment.PatientReference = @patientReference)
+|> select(fhir_appointment.Id, fhir_appointment.Status, fhir_appointment.ServiceCategory, fhir_appointment.ServiceType, fhir_appointment.ReasonCode, fhir_appointment.Priority, fhir_appointment.Description, fhir_appointment.StartTime, fhir_appointment.EndTime, fhir_appointment.MinutesDuration, fhir_appointment.PatientReference, fhir_appointment.PractitionerReference, fhir_appointment.Created, fhir_appointment.Comment)
+|> order_by(fhir_appointment.StartTime desc)
diff --git a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPractitioner.generated.sql b/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPractitioner.generated.sql
deleted file mode 100644
index fa540bf..0000000
--- a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPractitioner.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment FROM fhir_Appointment WHERE fhir_Appointment.PractitionerReference = @practitionerReference AND fhir_Appointment.Status = 'booked' ORDER BY fhir_Appointment.StartTime
\ No newline at end of file
diff --git a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPractitioner.lql b/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPractitioner.lql
index 97effa5..0cff4aa 100644
--- a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPractitioner.lql
+++ b/Scheduling/Scheduling.Api/Queries/GetAppointmentsByPractitioner.lql
@@ -1,6 +1,6 @@
 -- Get appointments for a practitioner
 -- Parameters: @practitionerReference
-fhir_Appointment
-|> filter(fn(row) => row.fhir_Appointment.PractitionerReference = @practitionerReference and row.fhir_Appointment.Status = 'booked')
-|> select(fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment)
-|> order_by(fhir_Appointment.StartTime)
+fhir_appointment
+|> filter(fn(row) => row.fhir_appointment.PractitionerReference = @practitionerReference and row.fhir_appointment.Status = 'booked')
+|> select(fhir_appointment.Id, fhir_appointment.Status, fhir_appointment.ServiceCategory, fhir_appointment.ServiceType, fhir_appointment.ReasonCode, fhir_appointment.Priority, fhir_appointment.Description, fhir_appointment.StartTime, fhir_appointment.EndTime, fhir_appointment.MinutesDuration, fhir_appointment.PatientReference, fhir_appointment.PractitionerReference, fhir_appointment.Created, fhir_appointment.Comment)
+|> order_by(fhir_appointment.StartTime)
diff --git a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByStatus.generated.sql b/Scheduling/Scheduling.Api/Queries/GetAppointmentsByStatus.generated.sql
deleted file mode 100644
index 6318005..0000000
--- a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByStatus.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Appointment.Id, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.Status, sync_ScheduledPatient.DisplayName, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode FROM fhir_Appointment INNER JOIN sync_ScheduledPatient ON fhir_Appointment.PatientReference = sync_ScheduledPatient.PatientId INNER JOIN fhir_Practitioner ON fhir_Appointment.PractitionerReference = fhir_Practitioner.Id WHERE fhir_Appointment.Status = @status AND fhir_Appointment.StartTime >= @dateStart AND fhir_Appointment.StartTime < @dateEnd ORDER BY fhir_Appointment.StartTime
\ No newline at end of file
diff --git a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByStatus.lql b/Scheduling/Scheduling.Api/Queries/GetAppointmentsByStatus.lql
index 32fa9b8..1bd11c4 100644
--- a/Scheduling/Scheduling.Api/Queries/GetAppointmentsByStatus.lql
+++ b/Scheduling/Scheduling.Api/Queries/GetAppointmentsByStatus.lql
@@ -1,8 +1,8 @@
 -- Get appointments by status with patient and practitioner info
 -- Parameters: @status, @dateStart, @dateEnd
-fhir_Appointment
-|> join(sync_ScheduledPatient, on = fhir_Appointment.PatientReference = sync_ScheduledPatient.PatientId)
-|> join(fhir_Practitioner, on = fhir_Appointment.PractitionerReference = fhir_Practitioner.Id)
-|> filter(fn(row) => row.fhir_Appointment.Status = @status and row.fhir_Appointment.StartTime >= @dateStart and row.fhir_Appointment.StartTime < @dateEnd)
-|> select(fhir_Appointment.Id, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.Status, sync_ScheduledPatient.DisplayName, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode)
-|> order_by(fhir_Appointment.StartTime)
+fhir_appointment
+|> join(sync_scheduledpatient, on = fhir_appointment.PatientReference = sync_scheduledpatient.PatientId)
+|> join(fhir_practitioner, on = fhir_appointment.PractitionerReference = fhir_practitioner.Id)
+|> filter(fn(row) => row.fhir_appointment.Status = @status and row.fhir_appointment.StartTime >= @dateStart and row.fhir_appointment.StartTime < @dateEnd)
+|> select(fhir_appointment.Id, fhir_appointment.StartTime, fhir_appointment.EndTime, fhir_appointment.Status, sync_scheduledpatient.DisplayName, fhir_practitioner.NameFamily, fhir_practitioner.NameGiven, fhir_appointment.ServiceType, fhir_appointment.ReasonCode)
+|> order_by(fhir_appointment.StartTime)
diff --git a/Scheduling/Scheduling.Api/Queries/GetAvailableSlots.generated.sql b/Scheduling/Scheduling.Api/Queries/GetAvailableSlots.generated.sql
deleted file mode 100644
index a3ca060..0000000
--- a/Scheduling/Scheduling.Api/Queries/GetAvailableSlots.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Slot.Id, fhir_Slot.Status, fhir_Slot.StartTime, fhir_Slot.EndTime, fhir_Schedule.PractitionerReference FROM fhir_Slot INNER JOIN fhir_Schedule ON fhir_Slot.ScheduleReference = fhir_Schedule.Id WHERE fhir_Schedule.PractitionerReference = @practitionerRef AND fhir_Slot.Status = 'free' AND fhir_Slot.StartTime >= @fromDate AND fhir_Slot.StartTime < @toDate ORDER BY fhir_Slot.StartTime
\ No newline at end of file
diff --git a/Scheduling/Scheduling.Api/Queries/GetAvailableSlots.lql b/Scheduling/Scheduling.Api/Queries/GetAvailableSlots.lql
index 487e8a8..52a3e1b 100644
--- a/Scheduling/Scheduling.Api/Queries/GetAvailableSlots.lql
+++ b/Scheduling/Scheduling.Api/Queries/GetAvailableSlots.lql
@@ -1,7 +1,7 @@
 -- Get available slots for a practitioner
 -- Parameters: @practitionerRef, @fromDate, @toDate
-fhir_Slot
-|> join(fhir_Schedule, on = fhir_Slot.ScheduleReference = fhir_Schedule.Id)
-|> filter(fn(row) => row.fhir_Schedule.PractitionerReference = @practitionerRef and row.fhir_Slot.Status = 'free' and row.fhir_Slot.StartTime >= @fromDate and row.fhir_Slot.StartTime < @toDate)
-|> select(fhir_Slot.Id, fhir_Slot.Status, fhir_Slot.StartTime, fhir_Slot.EndTime, fhir_Schedule.PractitionerReference)
-|> order_by(fhir_Slot.StartTime)
+fhir_slot
+|> join(fhir_schedule, on = fhir_slot.ScheduleReference = fhir_schedule.Id)
+|> filter(fn(row) => row.fhir_schedule.PractitionerReference = @practitionerRef and row.fhir_slot.Status = 'free' and row.fhir_slot.StartTime >= @fromDate and row.fhir_slot.StartTime < @toDate)
+|> select(fhir_slot.Id, fhir_slot.Status, fhir_slot.StartTime, fhir_slot.EndTime, fhir_schedule.PractitionerReference)
+|> order_by(fhir_slot.StartTime)
diff --git a/Scheduling/Scheduling.Api/Queries/GetPractitionerById.generated.sql b/Scheduling/Scheduling.Api/Queries/GetPractitionerById.generated.sql
deleted file mode 100644
index 14c5d35..0000000
--- a/Scheduling/Scheduling.Api/Queries/GetPractitionerById.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Practitioner.Id, fhir_Practitioner.Identifier, fhir_Practitioner.Active, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Practitioner.Qualification, fhir_Practitioner.Specialty, fhir_Practitioner.TelecomEmail, fhir_Practitioner.TelecomPhone FROM fhir_Practitioner WHERE fhir_Practitioner.Id = @id
\ No newline at end of file
diff --git a/Scheduling/Scheduling.Api/Queries/GetPractitionerById.lql b/Scheduling/Scheduling.Api/Queries/GetPractitionerById.lql
index 8aeb570..e8ec48e 100644
--- a/Scheduling/Scheduling.Api/Queries/GetPractitionerById.lql
+++ b/Scheduling/Scheduling.Api/Queries/GetPractitionerById.lql
@@ -1,5 +1,5 @@
 -- Get practitioner by ID
 -- Parameters: @id
-fhir_Practitioner
-|> filter(fn(row) => row.fhir_Practitioner.Id = @id)
-|> select(fhir_Practitioner.Id, fhir_Practitioner.Identifier, fhir_Practitioner.Active, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Practitioner.Qualification, fhir_Practitioner.Specialty, fhir_Practitioner.TelecomEmail, fhir_Practitioner.TelecomPhone)
+fhir_practitioner
+|> filter(fn(row) => row.fhir_practitioner.Id = @id)
+|> select(fhir_practitioner.Id, fhir_practitioner.Identifier, fhir_practitioner.Active, fhir_practitioner.NameFamily, fhir_practitioner.NameGiven, fhir_practitioner.Qualification, fhir_practitioner.Specialty, fhir_practitioner.TelecomEmail, fhir_practitioner.TelecomPhone)
diff --git a/Scheduling/Scheduling.Api/Queries/GetProviderAvailability.generated.sql b/Scheduling/Scheduling.Api/Queries/GetProviderAvailability.generated.sql
deleted file mode 100644
index 72bb92b..0000000
--- a/Scheduling/Scheduling.Api/Queries/GetProviderAvailability.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Schedule.Id, fhir_Schedule.PractitionerReference, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Schedule.PlanningHorizon, fhir_Schedule.Active FROM fhir_Schedule INNER JOIN fhir_Practitioner ON fhir_Schedule.PractitionerReference = fhir_Practitioner.Id WHERE fhir_Schedule.PractitionerReference = @practitionerRef AND fhir_Schedule.Active = 1
\ No newline at end of file
diff --git a/Scheduling/Scheduling.Api/Queries/GetProviderAvailability.lql b/Scheduling/Scheduling.Api/Queries/GetProviderAvailability.lql
index b033b7b..f7aefce 100644
--- a/Scheduling/Scheduling.Api/Queries/GetProviderAvailability.lql
+++ b/Scheduling/Scheduling.Api/Queries/GetProviderAvailability.lql
@@ -1,6 +1,6 @@
 -- Get provider availability schedule
 -- Parameters: @practitionerRef
-fhir_Schedule
-|> join(fhir_Practitioner, on = fhir_Schedule.PractitionerReference = fhir_Practitioner.Id)
-|> filter(fn(row) => row.fhir_Schedule.PractitionerReference = @practitionerRef and row.fhir_Schedule.Active = 1)
-|> select(fhir_Schedule.Id, fhir_Schedule.PractitionerReference, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Schedule.PlanningHorizon, fhir_Schedule.Active)
+fhir_schedule
+|> join(fhir_practitioner, on = fhir_schedule.PractitionerReference = fhir_practitioner.Id)
+|> filter(fn(row) => row.fhir_schedule.PractitionerReference = @practitionerRef and row.fhir_schedule.Active = 1)
+|> select(fhir_schedule.Id, fhir_schedule.PractitionerReference, fhir_practitioner.NameFamily, fhir_practitioner.NameGiven, fhir_schedule.PlanningHorizon, fhir_schedule.Active)
diff --git a/Scheduling/Scheduling.Api/Queries/GetProviderDailySchedule.generated.sql b/Scheduling/Scheduling.Api/Queries/GetProviderDailySchedule.generated.sql
deleted file mode 100644
index 668d0ec..0000000
--- a/Scheduling/Scheduling.Api/Queries/GetProviderDailySchedule.generated.sql
+++ /dev/null
@@ -1 +0,0 @@
-SELECT fhir_Appointment.Id, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Description, fhir_Appointment.PatientReference, sync_ScheduledPatient.PatientId, sync_ScheduledPatient.DisplayName, sync_ScheduledPatient.ContactPhone, fhir_Appointment.PractitionerReference FROM fhir_Appointment INNER JOIN sync_ScheduledPatient ON fhir_Appointment.PatientReference = 
sync_ScheduledPatient.PatientId WHERE fhir_Appointment.PractitionerReference = @practitionerRef AND fhir_Appointment.StartTime >= @dateStart AND fhir_Appointment.StartTime < @dateEnd ORDER BY fhir_Appointment.StartTime \ No newline at end of file diff --git a/Scheduling/Scheduling.Api/Queries/GetProviderDailySchedule.lql b/Scheduling/Scheduling.Api/Queries/GetProviderDailySchedule.lql index 36c1fd9..0f62b15 100644 --- a/Scheduling/Scheduling.Api/Queries/GetProviderDailySchedule.lql +++ b/Scheduling/Scheduling.Api/Queries/GetProviderDailySchedule.lql @@ -1,7 +1,7 @@ -- Get provider daily schedule with patient info -- Parameters: @practitionerRef, @dateStart, @dateEnd -fhir_Appointment -|> join(sync_ScheduledPatient, on = fhir_Appointment.PatientReference = sync_ScheduledPatient.PatientId) -|> filter(fn(row) => row.fhir_Appointment.PractitionerReference = @practitionerRef and row.fhir_Appointment.StartTime >= @dateStart and row.fhir_Appointment.StartTime < @dateEnd) -|> select(fhir_Appointment.Id, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Description, fhir_Appointment.PatientReference, sync_ScheduledPatient.PatientId, sync_ScheduledPatient.DisplayName, sync_ScheduledPatient.ContactPhone, fhir_Appointment.PractitionerReference) -|> order_by(fhir_Appointment.StartTime) +fhir_appointment +|> join(sync_scheduledpatient, on = fhir_appointment.PatientReference = sync_scheduledpatient.PatientId) +|> filter(fn(row) => row.fhir_appointment.PractitionerReference = @practitionerRef and row.fhir_appointment.StartTime >= @dateStart and row.fhir_appointment.StartTime < @dateEnd) +|> select(fhir_appointment.Id, fhir_appointment.StartTime, fhir_appointment.EndTime, fhir_appointment.MinutesDuration, fhir_appointment.Status, fhir_appointment.ServiceCategory, fhir_appointment.ServiceType, 
fhir_appointment.ReasonCode, fhir_appointment.Description, fhir_appointment.PatientReference, sync_scheduledpatient.PatientId, sync_scheduledpatient.DisplayName, sync_scheduledpatient.ContactPhone, fhir_appointment.PractitionerReference) +|> order_by(fhir_appointment.StartTime) diff --git a/Scheduling/Scheduling.Api/Queries/GetUpcomingAppointments.generated.sql b/Scheduling/Scheduling.Api/Queries/GetUpcomingAppointments.generated.sql deleted file mode 100644 index a27dab3..0000000 --- a/Scheduling/Scheduling.Api/Queries/GetUpcomingAppointments.generated.sql +++ /dev/null @@ -1 +0,0 @@ -SELECT fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, fhir_Appointment.Created, fhir_Appointment.Comment FROM fhir_Appointment WHERE fhir_Appointment.Status = 'booked' ORDER BY fhir_Appointment.StartTime \ No newline at end of file diff --git a/Scheduling/Scheduling.Api/Queries/GetUpcomingAppointments.lql b/Scheduling/Scheduling.Api/Queries/GetUpcomingAppointments.lql index 44893dd..f691269 100644 --- a/Scheduling/Scheduling.Api/Queries/GetUpcomingAppointments.lql +++ b/Scheduling/Scheduling.Api/Queries/GetUpcomingAppointments.lql @@ -1,5 +1,5 @@ -- Get all booked appointments (no limit - calendar needs all appointments) -fhir_Appointment -|> filter(fn(row) => row.fhir_Appointment.Status = 'booked') -|> select(fhir_Appointment.Id, fhir_Appointment.Status, fhir_Appointment.ServiceCategory, fhir_Appointment.ServiceType, fhir_Appointment.ReasonCode, fhir_Appointment.Priority, fhir_Appointment.Description, fhir_Appointment.StartTime, fhir_Appointment.EndTime, fhir_Appointment.MinutesDuration, fhir_Appointment.PatientReference, fhir_Appointment.PractitionerReference, 
fhir_Appointment.Created, fhir_Appointment.Comment) -|> order_by(fhir_Appointment.StartTime) +fhir_appointment +|> filter(fn(row) => row.fhir_appointment.Status = 'booked') +|> select(fhir_appointment.Id, fhir_appointment.Status, fhir_appointment.ServiceCategory, fhir_appointment.ServiceType, fhir_appointment.ReasonCode, fhir_appointment.Priority, fhir_appointment.Description, fhir_appointment.StartTime, fhir_appointment.EndTime, fhir_appointment.MinutesDuration, fhir_appointment.PatientReference, fhir_appointment.PractitionerReference, fhir_appointment.Created, fhir_appointment.Comment) +|> order_by(fhir_appointment.StartTime) diff --git a/Scheduling/Scheduling.Api/Queries/SearchPractitionersBySpecialty.generated.sql b/Scheduling/Scheduling.Api/Queries/SearchPractitionersBySpecialty.generated.sql deleted file mode 100644 index e0a423b..0000000 --- a/Scheduling/Scheduling.Api/Queries/SearchPractitionersBySpecialty.generated.sql +++ /dev/null @@ -1 +0,0 @@ -SELECT fhir_Practitioner.Id, fhir_Practitioner.Identifier, fhir_Practitioner.Active, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Practitioner.Qualification, fhir_Practitioner.Specialty, fhir_Practitioner.TelecomEmail, fhir_Practitioner.TelecomPhone FROM fhir_Practitioner WHERE fhir_Practitioner.Specialty LIKE '%' || @specialty || '%' ORDER BY fhir_Practitioner.NameFamily , fhir_Practitioner.NameGiven \ No newline at end of file diff --git a/Scheduling/Scheduling.Api/Queries/SearchPractitionersBySpecialty.lql b/Scheduling/Scheduling.Api/Queries/SearchPractitionersBySpecialty.lql index b7ee685..fbe2fcb 100644 --- a/Scheduling/Scheduling.Api/Queries/SearchPractitionersBySpecialty.lql +++ b/Scheduling/Scheduling.Api/Queries/SearchPractitionersBySpecialty.lql @@ -1,6 +1,6 @@ -- Search practitioners by specialty -- Parameters: @specialty -fhir_Practitioner -|> filter(fn(row) => row.fhir_Practitioner.Specialty like '%' || @specialty || '%') -|> select(fhir_Practitioner.Id, 
fhir_Practitioner.Identifier, fhir_Practitioner.Active, fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven, fhir_Practitioner.Qualification, fhir_Practitioner.Specialty, fhir_Practitioner.TelecomEmail, fhir_Practitioner.TelecomPhone) -|> order_by(fhir_Practitioner.NameFamily, fhir_Practitioner.NameGiven) +fhir_practitioner +|> filter(fn(row) => row.fhir_practitioner.Specialty like '%' || @specialty || '%') +|> select(fhir_practitioner.Id, fhir_practitioner.Identifier, fhir_practitioner.Active, fhir_practitioner.NameFamily, fhir_practitioner.NameGiven, fhir_practitioner.Qualification, fhir_practitioner.Specialty, fhir_practitioner.TelecomEmail, fhir_practitioner.TelecomPhone) +|> order_by(fhir_practitioner.NameFamily, fhir_practitioner.NameGiven) diff --git a/Scheduling/Scheduling.Api/Scheduling.Api.csproj b/Scheduling/Scheduling.Api/Scheduling.Api.csproj index a770e0f..33b9271 100644 --- a/Scheduling/Scheduling.Api/Scheduling.Api.csproj +++ b/Scheduling/Scheduling.Api/Scheduling.Api.csproj @@ -1,7 +1,7 @@ <Project Sdk="Microsoft.NET.Sdk.Web"> <PropertyGroup> <OutputType>Exe</OutputType> - <NoWarn>CA1515;CA2100;RS1035;CA1508;CA2234;CA1812</NoWarn> + <NoWarn>$(NoWarn);CA1515;CA2100;RS1035;CA1508;CA2234;CA1812;CS1591</NoWarn> <EnableLqlTranspile>true</EnableLqlTranspile> </PropertyGroup> @@ -12,12 +12,17 @@ <ItemGroup> <PackageReference Include="Npgsql" Version="9.0.2" /> - <PackageReference Include="MelbourneDev.DataProvider" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Lql.Postgres" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Sync.Postgres" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Selecta" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Migration" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Migration.Postgres" Version="0.1.0" /> + <PackageReference Include="Nimblesite.DataProvider.Core" Version="$(DataProviderVersion)" /> + <PackageReference Include="Nimblesite.Lql.Postgres" 
Version="$(DataProviderVersion)" /> + <PackageReference Include="Nimblesite.Sync.Postgres" Version="$(DataProviderVersion)" /> + <PackageReference + Include="Nimblesite.DataProvider.Migration.Core" + Version="$(DataProviderVersion)" + /> + <PackageReference + Include="Nimblesite.DataProvider.Migration.Postgres" + Version="$(DataProviderVersion)" + /> </ItemGroup> <ItemGroup> @@ -34,19 +39,10 @@ </Content> </ItemGroup> - <!-- Create database from YAML using Migration.Cli (installed as dotnet tool) --> - <Target Name="CreateDatabaseSchema" BeforeTargets="TranspileLqlAndGenerateDataProvider"> - <Exec - Command="dotnet migration-cli -- --schema "$(MSBuildProjectDirectory)/scheduling-schema.yaml" --output "$(MSBuildProjectDirectory)/scheduling.db" --provider sqlite" - WorkingDirectory="$(MSBuildProjectDirectory)" - StandardOutputImportance="High" - StandardErrorImportance="High" - /> - </Target> - - <!-- Pre-compile: transpile LQL to SQL, then generate C# from SQL using CLI tools --> + <!-- Pre-compile: transpile LQL to SQL, then generate C# via `dotnet DataProvider postgres`. + Requires a live Postgres with the scheduling schema migrated (see `make db-migrate`). 
--> <Target - Name="TranspileLqlAndGenerateDataProvider" + Name="GenerateDataProvider" BeforeTargets="BeforeCompile;CoreCompile" Inputs="$(MSBuildProjectDirectory)/DataProvider.json;@(AdditionalFiles);@(LqlFiles)" Outputs="$(MSBuildProjectDirectory)/Generated/.timestamp" @@ -58,19 +54,17 @@ </ItemGroup> <Message Importance="High" Text="Transpiling LQL files (@(LqlFiles))" /> <Exec - Command="dotnet lqlcli-sqlite -- --input "%(LqlFiles.Identity)" --output "%(LqlFiles.RootDir)%(LqlFiles.Directory)%(LqlFiles.Filename).generated.sql"" + Command="dotnet Lql postgres --input "%(LqlFiles.Identity)" --output "%(LqlFiles.RootDir)%(LqlFiles.Directory)%(LqlFiles.Filename).generated.sql"" Condition="'$(EnableLqlTranspile)' == 'true' and @(LqlFiles) != ''" WorkingDirectory="$(MSBuildProjectDirectory)" StandardOutputImportance="High" StandardErrorImportance="High" - ContinueOnError="WarnAndContinue" /> <Exec - Command="dotnet dataprovider-sqlite-cli -- --project-dir "$(MSBuildProjectDirectory)" --config "$(MSBuildProjectDirectory)/DataProvider.json" --out "$(MSBuildProjectDirectory)/Generated" --connection-type NpgsqlConnection" + Command="dotnet DataProvider postgres --project-dir "$(MSBuildProjectDirectory)" --config "$(MSBuildProjectDirectory)/DataProvider.json" --out "$(MSBuildProjectDirectory)/Generated"" WorkingDirectory="$(MSBuildProjectDirectory)" StandardOutputImportance="High" StandardErrorImportance="High" - IgnoreExitCode="true" /> <Touch Files="$(MSBuildProjectDirectory)/Generated/.timestamp" AlwaysCreate="true" /> <ItemGroup> diff --git a/Scheduling/Scheduling.Api/scheduling-schema.yaml b/Scheduling/Scheduling.Api/scheduling-schema.yaml index ec1d910..2f92ac9 100644 --- a/Scheduling/Scheduling.Api/scheduling-schema.yaml +++ b/Scheduling/Scheduling.Api/scheduling-schema.yaml @@ -1,6 +1,6 @@ name: scheduling tables: -- name: fhir_Practitioner +- name: fhir_practitioner columns: - name: Id type: Text @@ -29,10 +29,10 @@ tables: columns: - Specialty primaryKey: - name: 
PK_fhir_Practitioner + name: PK_fhir_practitioner columns: - Id -- name: fhir_Schedule +- name: fhir_schedule columns: - name: Id type: Text @@ -51,17 +51,17 @@ tables: columns: - PractitionerReference foreignKeys: - - name: FK_fhir_Schedule_PractitionerReference + - name: FK_fhir_schedule_PractitionerReference columns: - PractitionerReference - referencedTable: fhir_Practitioner + referencedTable: fhir_practitioner referencedColumns: - Id primaryKey: - name: PK_fhir_Schedule + name: PK_fhir_schedule columns: - Id -- name: fhir_Slot +- name: fhir_slot columns: - name: Id type: Text @@ -87,17 +87,17 @@ tables: columns: - Status foreignKeys: - - name: FK_fhir_Slot_ScheduleReference + - name: FK_fhir_slot_ScheduleReference columns: - ScheduleReference - referencedTable: fhir_Schedule + referencedTable: fhir_schedule referencedColumns: - Id primaryKey: - name: PK_fhir_Slot + name: PK_fhir_slot columns: - Id -- name: fhir_Appointment +- name: fhir_appointment columns: - name: Id type: Text @@ -140,10 +140,10 @@ tables: columns: - PractitionerReference primaryKey: - name: PK_fhir_Appointment + name: PK_fhir_appointment columns: - Id -- name: sync_ScheduledPatient +- name: sync_scheduledpatient columns: - name: PatientId type: Text @@ -157,6 +157,6 @@ tables: type: Text defaultValue: CURRENT_TIMESTAMP primaryKey: - name: PK_sync_ScheduledPatient + name: PK_sync_scheduledpatient columns: - PatientId diff --git a/Scheduling/Scheduling.Api/scheduling.db b/Scheduling/Scheduling.Api/scheduling.db index b93b46f..2e4d75c 100644 Binary files a/Scheduling/Scheduling.Api/scheduling.db and b/Scheduling/Scheduling.Api/scheduling.db differ diff --git a/Scheduling/Scheduling.Sync/Scheduling.Sync.csproj b/Scheduling/Scheduling.Sync/Scheduling.Sync.csproj index 1967999..e7dc0ec 100644 --- a/Scheduling/Scheduling.Sync/Scheduling.Sync.csproj +++ b/Scheduling/Scheduling.Sync/Scheduling.Sync.csproj @@ -6,7 +6,7 @@ <ItemGroup> <PackageReference Include="Npgsql" Version="9.0.2" /> 
<PackageReference Include="Microsoft.Extensions.Hosting" Version="10.0.0" /> - <PackageReference Include="MelbourneDev.Sync" Version="0.1.0" /> - <PackageReference Include="MelbourneDev.Sync.Postgres" Version="0.1.0" /> + <PackageReference Include="Nimblesite.Sync.Core" Version="$(DataProviderVersion)" /> + <PackageReference Include="Nimblesite.Sync.Postgres" Version="$(DataProviderVersion)" /> </ItemGroup> </Project> diff --git a/Scheduling/Scheduling.Sync/SchedulingSyncWorker.cs b/Scheduling/Scheduling.Sync/SchedulingSyncWorker.cs index b1220fb..6b349fb 100644 --- a/Scheduling/Scheduling.Sync/SchedulingSyncWorker.cs +++ b/Scheduling/Scheduling.Sync/SchedulingSyncWorker.cs @@ -133,7 +133,7 @@ await Task.Delay(TimeSpan.FromSeconds(retryDelay), stoppingToken) } /// <summary> - /// Fetches changes from Clinical domain and applies column mappings to sync_ScheduledPatient. + /// Fetches changes from Clinical domain and applies column mappings to sync_scheduledpatient. /// </summary> private async Task SyncPatientDataAsync(CancellationToken cancellationToken) { @@ -191,8 +191,8 @@ private async Task SyncPatientDataAsync(CancellationToken cancellationToken) } /// <summary> - /// Applies a change from Clinical domain to sync_ScheduledPatient with column mapping. - /// Maps: fhir_Patient -> sync_ScheduledPatient + /// Applies a change from Clinical domain to sync_scheduledpatient with column mapping. 
+ /// Maps: fhir_patient -> sync_scheduledpatient /// Transforms: DisplayName = concat(GivenName, ' ', FamilyName) /// </summary> private void ApplyMappedChange(NpgsqlConnection connection, SyncChange change) @@ -230,11 +230,11 @@ private void ApplyMappedChange(NpgsqlConnection connection, SyncChange change) // Transform: DisplayName = concat(GivenName, ' ', FamilyName) var displayName = $"{givenName} {familyName}".Trim(); - // Upsert to sync_ScheduledPatient + // Upsert to sync_scheduledpatient if (change.Operation == SyncChange.Delete) { using var cmd = connection.CreateCommand(); - cmd.CommandText = "DELETE FROM sync_ScheduledPatient WHERE PatientId = @id"; + cmd.CommandText = "DELETE FROM sync_scheduledpatient WHERE \"PatientId\" = @id"; cmd.Parameters.AddWithValue("@id", patientId); cmd.ExecuteNonQuery(); @@ -244,13 +244,13 @@ private void ApplyMappedChange(NpgsqlConnection connection, SyncChange change) { using var cmd = connection.CreateCommand(); cmd.CommandText = """ - INSERT INTO sync_ScheduledPatient (PatientId, DisplayName, ContactPhone, ContactEmail, SyncedAt) + INSERT INTO sync_scheduledpatient ("PatientId", "DisplayName", "ContactPhone", "ContactEmail", "SyncedAt") VALUES (@id, @name, @phone, @email, NOW()) - ON CONFLICT (PatientId) DO UPDATE SET - DisplayName = excluded.DisplayName, - ContactPhone = excluded.ContactPhone, - ContactEmail = excluded.ContactEmail, - SyncedAt = NOW() + ON CONFLICT ("PatientId") DO UPDATE SET + "DisplayName" = excluded."DisplayName", + "ContactPhone" = excluded."ContactPhone", + "ContactEmail" = excluded."ContactEmail", + "SyncedAt" = NOW() """; cmd.Parameters.AddWithValue("@id", patientId); diff --git a/Scheduling/Scheduling.Sync/SyncMappings.json b/Scheduling/Scheduling.Sync/SyncMappings.json index 6bace33..69e7224 100644 --- a/Scheduling/Scheduling.Sync/SyncMappings.json +++ b/Scheduling/Scheduling.Sync/SyncMappings.json @@ -2,7 +2,7 @@ "mappings": [ { "source_table": "fhir_Patient", - "target_table": 
"sync_ScheduledPatient", + "target_table": "sync_scheduledpatient", "column_mappings": [ { "source": "Id", diff --git a/Shared/Authorization/AuthHelpers.cs b/Shared/Authorization/AuthHelpers.cs index 84d304e..d5830ea 100644 --- a/Shared/Authorization/AuthHelpers.cs +++ b/Shared/Authorization/AuthHelpers.cs @@ -159,7 +159,7 @@ public static async Task<PermissionResult> CheckPermissionAsync( } catch (Exception ex) { - return new PermissionResult(false, $"Permission check failed: {ex.Message}"); + return new PermissionResult(false, $"Permission check failed: {ex}"); } } diff --git a/Shared/Authorization/Authorization.csproj b/Shared/Authorization/Authorization.csproj index 7608f91..32f0526 100644 --- a/Shared/Authorization/Authorization.csproj +++ b/Shared/Authorization/Authorization.csproj @@ -8,5 +8,9 @@ <ItemGroup> <FrameworkReference Include="Microsoft.AspNetCore.App" /> + <PackageReference + Include="Nimblesite.DataProvider.Migration.Core" + Version="$(DataProviderVersion)" + /> </ItemGroup> </Project> diff --git a/coverage-thresholds.json b/coverage-thresholds.json new file mode 100644 index 0000000..2379e4f --- /dev/null +++ b/coverage-thresholds.json @@ -0,0 +1,23 @@ +{ + "default_threshold": 80, + "projects": { + "Gatekeeper/Gatekeeper.Api.Tests": { + "threshold": 68 + }, + "Clinical/Clinical.Api.Tests": { + "threshold": 90 + }, + "Scheduling/Scheduling.Api.Tests": { + "threshold": 85 + }, + "ICD10/ICD10.Api.Tests": { + "threshold": 71 + }, + "ICD10/ICD10.Cli.Tests": { + "threshold": 67 + }, + "Dashboard/Dashboard.Integration.Tests": { + "threshold": 42 + } + } +} diff --git a/coverlet.runsettings b/coverlet.runsettings index a740cec..e4a0779 100644 --- a/coverlet.runsettings +++ b/coverlet.runsettings @@ -1,11 +1,11 @@ <?xml version="1.0" encoding="utf-8" ?> -<!-- agent-pmo:d58c330 --> +<!-- agent-pmo:29b9dcf --> <RunSettings> <DataCollectionRunSettings> <DataCollectors> <DataCollector friendlyName="XPlat Code Coverage"> <Configuration> - 
<Format>json,lcov,opencover,cobertura</Format> + <Format>json,cobertura</Format> <Exclude>[*]*.Generated*,[*]*.g.*</Exclude> <ExcludeByFile>**/obj/**/*,**/bin/**/*,**/Migrations/**/*</ExcludeByFile> <ExcludeByAttribute> diff --git a/docker/README.md b/docker/README.md index 7a47e41..f6a27f4 100644 --- a/docker/README.md +++ b/docker/README.md @@ -49,13 +49,13 @@ Then serve the static files however you want (nginx, python, etc). ```bash # Start everything -./scripts/start.sh +make start-docker # Fresh start (wipe databases) -./scripts/start.sh --fresh +make clean-docker start-docker # Rebuild containers -./scripts/start.sh --build +make start-docker BUILD=1 ``` ## Ports diff --git a/docker/docker-compose.db.yml b/docker/docker-compose.db.yml new file mode 100644 index 0000000..cd233b4 --- /dev/null +++ b/docker/docker-compose.db.yml @@ -0,0 +1,20 @@ +services: + db: + image: pgvector/pgvector:pg16 + container_name: healthcaresamples-db + environment: + POSTGRES_USER: postgres + POSTGRES_PASSWORD: ${DB_PASSWORD:-changeme} + volumes: + - db-data:/var/lib/postgresql/data + - ./init-db:/docker-entrypoint-initdb.d + ports: + - "5432:5432" + healthcheck: + test: ["CMD-SHELL", "pg_isready -U postgres"] + interval: 5s + timeout: 3s + retries: 20 + +volumes: + db-data: diff --git a/docs/plans/delete-generated-files-and-postgres-codegen.md b/docs/plans/delete-generated-files-and-postgres-codegen.md new file mode 100644 index 0000000..c85b88f --- /dev/null +++ b/docs/plans/delete-generated-files-and-postgres-codegen.md @@ -0,0 +1,298 @@ +# Plan: Delete Committed Generated Code & Switch to Postgres-Based Generation + +DataProvider reference code here: +/Users/christianfindlay/Documents/Code/ai_cms + +## Progress Checklist + +- [x] Step 1 — Add `dataprovider-postgres` (0.2.7-beta), `lql-postgres` (0.1.8-beta), bump `migration-cli` (0.2.2-beta) to `.config/dotnet-tools.json` +- [x] Step 2 — Create `docker/docker-compose.db.yml` +- [x] Step 3 — Add `db-up`, `db-down`, `db-reset`, 
`db-wait`, `db-migrate` Makefile targets and wire `build`/`test`/`lint` to depend on `db-migrate` +- [x] Step 4 — Update each `DataProvider.json` for Postgres (Clinical, Scheduling, Gatekeeper, ICD10) — connection string + `schema` `main` → `public`, drop `excludeColumns` +- [x] Step 5 — Update each API `.csproj` (Clinical, Scheduling, Gatekeeper, ICD10) — switch to `dataprovider-postgres`, switch LQL to `lql-postgres`, drop `IgnoreExitCode`, delete `CreateDatabaseSchema` target, drop SQLite `icd10.db` `<Content>` +- [x] Step 6 — Delete tracked `Generated/` files from git, update root `.gitignore`, simplify `ICD10/.gitignore` +- [x] Step 7 — Update `.github/workflows/ci.yml` to use `make db-up` / `make db-migrate` instead of inline `services.postgres` +- [x] Step 8 — Patch consumer C# (Gatekeeper/Clinical/Scheduling/ICD10) for new generated record shape (`Result<Guid?>` instead of `Result<int>`, snake_case fields preserved, IDbTransaction overloads) +- [x] Verify — `make build` succeeds with 0 errors / 0 warnings; full `HealthcareSamples.sln` builds clean; all `Generated/` content regenerated each build through `dataprovider-postgres` against live Postgres + +## Context + +Generated `.g.cs` files are currently committed to git in three of four API projects (Clinical, Scheduling, Gatekeeper). The fourth (ICD10) already excludes them via a per-folder `.gitignore`. This causes constant noise: + +- The current uncommitted modification to `Gatekeeper/Gatekeeper.Api/Generated/CheckResourceGrant.g.cs` shows only the **order of parameters** changed between two runs of the generator. Output is non-deterministic across machines/runs. +- Worse, the current generator (`dataprovider-sqlite`) reads schema from a local SQLite mirror created from YAML by `migration-cli --provider sqlite`. SQLite has no real type system, so every generated column comes back as `string`. 
Look at `CheckResourceGrant.g.cs` lines 102–126: `id`, `granted_at`, `expires_at`, `permission_id` are all `string` when they should be `Guid` / `DateTimeOffset`. This is a latent runtime bug in addition to the file-churn problem. +- The MSBuild target uses `IgnoreExitCode="true"` on the codegen step, so silent generation failures get masked and someone could be tempted to hand-edit the resulting stale files. + +The goal: **all four projects must regenerate their data-access code from a live Postgres database on every build**, the generated files must never be committed, and there must never be any need to manually edit generated code. Build/test/CI must spin up the Postgres container before invoking the generators. + +--- + +## Recommended Approach (Summary) + +1. Switch all four projects from `dataprovider-sqlite` to `dataprovider-postgres` for accurate type introspection. +2. Move the migration step from inside each `.csproj` to a single `make db-migrate` target that applies YAML schemas to the live Postgres via `migration-cli --provider postgres`. +3. Add `make db-up` / `make db-down` targets that start/stop the Postgres container via `docker compose`. Make `db-up` a hard prerequisite of `make build`, `make test`, `make ci`. The MSBuild codegen target stays inside each `.csproj` so IDE rebuilds also regenerate, but it now expects Postgres to be reachable and **fails loudly** if it isn't (no more `IgnoreExitCode`). +4. Update the GitHub Actions workflow to use the same `make db-up && make db-migrate` flow instead of the inline `services:` Postgres block, so local and CI run the identical pipeline. +5. Delete tracked `Generated/` files from git, add a single root `**/Generated/` ignore rule, and consolidate the per-folder ICD10 ignores. 
+ +--- + +## Critical Files + +| Path | Change | +| --- | --- | +| `.gitignore` (root) | Add `**/Generated/`, `*.generated.sql`, `*.db` | +| `ICD10/.gitignore` | Remove now-redundant lines (`Generated/`, `*.generated.sql`, `*.db`) | +| `.config/dotnet-tools.json` | Add `nimblesite.dataprovider.postgres.cli` (replaces sqlite version for codegen) | +| `Makefile` | Add `db-up`, `db-down`, `db-reset`, `db-migrate` targets; make `build`/`test`/`ci` depend on them | +| `docker/docker-compose.db.yml` (NEW) | Stripped-down compose with only the `db` service for use during build | +| `Clinical/Clinical.Api/Clinical.Api.csproj` | Switch Exec to `dataprovider-postgres`, remove `migration-cli` Target, remove `IgnoreExitCode` | +| `Scheduling/Scheduling.Api/Scheduling.Api.csproj` | Same | +| `Gatekeeper/Gatekeeper.Api/Gatekeeper.Api.csproj` | Same | +| `ICD10/ICD10.Api/ICD10.Api.csproj` | Same | +| `Clinical/Clinical.Api/DataProvider.json` | Update `connectionString` to Postgres; change `tables[].schema` from `main` to `public` | +| `Scheduling/Scheduling.Api/DataProvider.json` | Same | +| `Gatekeeper/Gatekeeper.Api/DataProvider.json` | Same | +| `ICD10/ICD10.Api/DataProvider.json` | Same | +| `.github/workflows/ci.yml` | Remove `services.postgres` block; add `make db-up` step before lint/test/build | + +--- + +## Step-by-Step Plan + +### Step 1 — Add `dataprovider-postgres` to dotnet tools + +Edit `.config/dotnet-tools.json` and add an entry alongside the existing tools: + +```json +"nimblesite.dataprovider.postgres.cli": { + "version": "0.2.7-beta", + "commands": ["dataprovider-postgres"], + "rollForward": false +} +``` + +Keep `nimblesite.dataprovider.sqlite.cli` for now (ICD10 may still use it temporarily — see Step 8). + +### Step 2 — Create a build-only docker compose file + +Create `docker/docker-compose.db.yml` containing only the `db` service from the existing `docker/docker-compose.yml` (lines 2–16). This is the file `make db-up` will invoke.
Reuses the existing init scripts in `docker/init-db/` which create the four databases (`gatekeeper`, `clinical`, `scheduling`, `icd10`) and roles. + +### Step 3 — Add Makefile targets + +Insert into [Makefile](Makefile) after the existing `setup` target: + +```make +# ============================================================================= +# DATABASE LIFECYCLE (required for code generation + tests) +# ============================================================================= + +DB_COMPOSE := docker compose -f docker/docker-compose.db.yml +DB_PASSWORD ?= changeme + +## db-up: Start Postgres container and wait until healthy +db-up: + @echo "==> Starting Postgres..." + @$(DB_COMPOSE) up -d + @echo "==> Waiting for Postgres to become healthy..." + @for i in $$(seq 1 30); do \ + if $(DB_COMPOSE) exec -T db pg_isready -U postgres >/dev/null 2>&1; then \ + echo "Postgres ready."; exit 0; \ + fi; sleep 1; \ + done; \ + echo "FAIL: Postgres never became healthy"; $(DB_COMPOSE) logs db; exit 1 + +## db-down: Stop Postgres container (preserves volume) +db-down: + @echo "==> Stopping Postgres..." + @$(DB_COMPOSE) down + +## db-reset: Drop volume and restart Postgres clean +db-reset: + @echo "==> Resetting Postgres (DROP volumes)..." + @$(DB_COMPOSE) down -v + @$(MAKE) db-up + @$(MAKE) db-migrate + +## db-migrate: Apply YAML schemas to live Postgres for all four databases +db-migrate: db-up + @echo "==> Applying schemas..." 
+ dotnet migration-cli --schema Gatekeeper/Gatekeeper.Api/gatekeeper-schema.yaml \ + --output "Host=localhost;Database=gatekeeper;Username=gatekeeper;Password=$(DB_PASSWORD)" \ + --provider postgres + dotnet migration-cli --schema Clinical/Clinical.Api/clinical-schema.yaml \ + --output "Host=localhost;Database=clinical;Username=clinical;Password=$(DB_PASSWORD)" \ + --provider postgres + dotnet migration-cli --schema Scheduling/Scheduling.Api/scheduling-schema.yaml \ + --output "Host=localhost;Database=scheduling;Username=scheduling;Password=$(DB_PASSWORD)" \ + --provider postgres + dotnet migration-cli --schema ICD10/ICD10.Api/icd10-schema.yaml \ + --output "Host=localhost;Database=icd10;Username=icd10;Password=$(DB_PASSWORD)" \ + --provider postgres +``` + +Then change the existing primary targets so they depend on a live, migrated database: + +```make +build: db-migrate + @echo "==> Building..." + dotnet build HealthcareSamples.sln --configuration Release + +test: db-migrate + @echo "==> Testing..." + dotnet test ... + +lint: fmt-check db-migrate + @echo "==> Linting..." + dotnet build HealthcareSamples.sln --configuration Release +``` + +`db-migrate` itself depends on `db-up`, so the chain `make ci → lint → db-migrate → db-up` guarantees the container is started before any `dotnet` invocation. Add `db-up`, `db-down`, `db-reset`, `db-migrate` to the `.PHONY` list and to the `help` target. + +### Step 4 — Update each `DataProvider.json` for Postgres + +For all four files (`Clinical/Clinical.Api/DataProvider.json`, `Scheduling/Scheduling.Api/DataProvider.json`, `Gatekeeper/Gatekeeper.Api/DataProvider.json`, `ICD10/ICD10.Api/DataProvider.json`): + +1. Replace the `connectionString` field. Example for Gatekeeper: + ```json + "connectionString": "Host=localhost;Database=gatekeeper;Username=gatekeeper;Password=changeme" + ``` + This is a **dev/codegen-only** connection string. Runtime uses `appsettings.json` / env vars and is unaffected. 
The plaintext `changeme` here is acceptable because the same default already lives in `docker/docker-compose.yml` line 24 and `.github/workflows/ci.yml` line 24.
+
+2. Change every `"schema": "main"` to `"schema": "public"`. SQLite's default schema is `main`; Postgres's is `public`.
+
+### Step 5 — Update each API `.csproj`
+
+For each of the four API csproj files, apply these changes (example shown for `Gatekeeper.Api.csproj`; the same pattern applies to the others — just delete the `CreateDatabaseSchema` Target since migrations now run via `make db-migrate`):
+
+**Delete** the `CreateDatabaseSchema` Target entirely (lines 33–40 in Gatekeeper, equivalent lines in the others). Migrations are now a Makefile concern.
+
+**Replace** the body of `TranspileLqlAndGenerateDataProvider` (note the `&quot;` entities: literal double quotes inside an XML attribute value are invalid, and the item-list comparison in the `Condition` must be quoted):
+
+```xml
+<Target
+  Name="TranspileLqlAndGenerateDataProvider"
+  BeforeTargets="BeforeCompile;CoreCompile"
+  Inputs="$(MSBuildProjectDirectory)/DataProvider.json;@(AdditionalFiles);@(LqlFiles)"
+  Outputs="$(MSBuildProjectDirectory)/Generated/.timestamp"
+>
+  <RemoveDir Directories="$(MSBuildProjectDirectory)/Generated" />
+  <MakeDir Directories="$(MSBuildProjectDirectory)/Generated" />
+  <ItemGroup>
+    <LqlFiles Include="$(MSBuildProjectDirectory)/**/*.lql" />
+  </ItemGroup>
+  <Exec
+    Command="dotnet lqlcli-sqlite --input &quot;%(LqlFiles.Identity)&quot; --output &quot;%(LqlFiles.RootDir)%(LqlFiles.Directory)%(LqlFiles.Filename).generated.sql&quot;"
+    Condition="'$(EnableLqlTranspile)' == 'true' and '@(LqlFiles)' != ''"
+    WorkingDirectory="$(MSBuildProjectDirectory)" />
+  <Exec
+    Command="dotnet dataprovider-postgres --project-dir &quot;$(MSBuildProjectDirectory)&quot; --config &quot;$(MSBuildProjectDirectory)/DataProvider.json&quot; --out &quot;$(MSBuildProjectDirectory)/Generated&quot;"
+    WorkingDirectory="$(MSBuildProjectDirectory)"
+    StandardOutputImportance="High"
+    StandardErrorImportance="High" />
+  <Touch Files="$(MSBuildProjectDirectory)/Generated/.timestamp" AlwaysCreate="true" />
+  <ItemGroup>
+    <Compile
Include="$(MSBuildProjectDirectory)/Generated/**/*.g.cs" /> + </ItemGroup> +</Target> +``` + +Notable diffs from the current target: + +- `dataprovider-sqlite` → `dataprovider-postgres` +- `--connection-type NpgsqlConnection` flag removed (the postgres tool always emits Npgsql code) +- **`IgnoreExitCode="true"` removed** from the codegen step. If Postgres is unreachable, generation fails, and so does the build. This is the enforcement mechanism for "code must always be generated, never hand-edited". +- `ContinueOnError="WarnAndContinue"` removed from the LQL transpile step for the same reason. + +The `<Compile Remove="Generated/**" />` ItemGroup at the top of each csproj stays unchanged. + +### Step 6 — Delete tracked Generated files and update gitignore + +```bash +git rm -r --cached Clinical/Clinical.Api/Generated +git rm -r --cached Scheduling/Scheduling.Api/Generated +git rm -r --cached Gatekeeper/Gatekeeper.Api/Generated +``` + +ICD10 has zero tracked `Generated/` files (verified via `git ls-files`); skip it. + +Edit [.gitignore](.gitignore) and add to the C#/.NET section after line 70 (`obj/`): + +```gitignore +# Generated code (regenerated at build time by TranspileLqlAndGenerateDataProvider) +**/Generated/ +# LQL transpile output +*.generated.sql +# SQLite databases (legacy from previous codegen approach; safe to keep ignored) +*.db +``` + +Edit [ICD10/.gitignore](ICD10/.gitignore) and **delete** lines 1–4 (`# Generated files`, `*.generated.sql`, `*.db`, `Generated/`) — they are now redundant. Leave the Python and IDE sections alone. + +### Step 7 — Update the GitHub Actions workflow + +In [.github/workflows/ci.yml](.github/workflows/ci.yml): + +1. **Delete** the `services.postgres` block (lines 19–31). CI will now use the same `docker compose` flow as local dev via `make db-up`. +2. 
**Insert** a new step right after `dotnet tool restore` (between current line 59 and line 61): + ```yaml + - name: Start database + run: make db-up + - name: Apply schemas + run: make db-migrate + ``` +3. The existing `Lint` / `Test` / `Build` steps already invoke `make`, and those targets now depend on `db-migrate` (which depends on `db-up`), so even if the explicit steps above were removed the chain would still work. We add them explicitly for clarity in the CI log and to fail fast at a recognizable step name. +4. The embedding-service step at lines 42–56 stays as-is — it's unrelated. +5. Move the `Build` step (currently at line 88, the LAST step) to be the FIRST `make` invocation, BEFORE `Lint`, `Test`, `Coverage check`. Right now `make build` runs after `make test`, which is backwards: tests can't run if the build is broken, so build must come first. Order should be: db-up → db-migrate → build → lint → test → coverage-check → upload. + +### Step 8 — Decide ICD10's fate + +ICD10's codegen currently uses the SQLite path *and* references `EnableLqlTranspile=true` for `.lql` files. After this plan, ICD10 must also use `dataprovider-postgres`. Apply the same `.csproj` and `DataProvider.json` changes as the other three. This means the ICD10 SQLite-mirror approach goes away entirely. Verify that no test or runtime code depends on the on-disk `icd10.db` SQLite file (`<Content Include="icd10.db" Condition="Exists('icd10.db')">` in the csproj — that line should also be deleted, since the runtime is Postgres). Once ICD10 is migrated, `nimblesite.dataprovider.sqlite.cli` and `nimblesite.lql.cli.sqlite` can be reviewed for removal from `.config/dotnet-tools.json` (out of scope for this plan if anything still depends on them). + +--- + +## Verification + +1. 
**Local fresh-clone smoke test:** + ```bash + make clean + make db-reset # drops volume, starts Postgres clean, applies all schemas + make ci # lint + test + build, all from a clean slate + git status # MUST be clean — no Generated/ files showing up + ``` + Expected: every project regenerates its `Generated/*.g.cs` files from the live Postgres schema, all tests pass, working tree is clean. + +2. **Type-correctness spot check:** Open `Gatekeeper/Gatekeeper.Api/Generated/CheckResourceGrant.g.cs` after a successful build. The columns `id`, `permission_id`, `granted_at`, `expires_at` should now be `Guid` and `DateTimeOffset`, NOT `string`. This is the proof that Postgres-based introspection is wired up correctly. + +3. **Failure mode check:** + ```bash + make db-down + dotnet build Gatekeeper/Gatekeeper.Api/Gatekeeper.Api.csproj + ``` + Expected: build FAILS with a clear error from `dataprovider-postgres` saying it cannot connect. This verifies that `IgnoreExitCode` removal works — generation failures are now loud. + +4. **Re-build idempotency:** + ```bash + make build + make build + git status + ``` + Expected: clean working tree after both builds. (Note: even if the Nimblesite generator's output ordering is non-deterministic upstream, this no longer matters because the files are gitignored. See Concern (a).) + +5. **CI verification:** Push the branch and confirm the GitHub Actions run goes through the steps in this order: `Start database` → `Apply schemas` → `Build` → `Lint` → `Test` → `Coverage check`. All green. + +--- + +## Concerns / Follow-ups (out of scope but worth tracking) + +**(a) Non-deterministic generator output.** The `CheckResourceGrant.g.cs` parameter-order drift you observed comes from dictionary iteration order in `dataprovider-sqlite`. After this plan it stops mattering for git but should still be filed against `/Users/christianfindlay/Documents/Code/gigs/DataProvider` so future debugging diffs across machines stay quiet. 
Fix: sort keys with `StringComparer.Ordinal` before emitting parameter lists. + +**(b) Connection string secrecy.** `DataProvider.json` will contain `Password=changeme` in plaintext. Acceptable because (i) it's a dev-only password matching `docker-compose.yml` and CI defaults, (ii) the file is committed in source already, (iii) production runtime uses `appsettings.json` / env vars and is unaffected. If a real password ever appears here, it's a process failure regardless of file format. Optional follow-up: support `${DB_PASSWORD}` env var substitution in `DataProvider.json` upstream. + +**(c) MSBuild target re-runs every build.** The current target's `<RemoveDir>` deletes `.timestamp` immediately, so the `Inputs`/`Outputs` incremental check always sees outputs as missing and re-runs. This is wasteful but pre-existing; not changed by this plan. Follow-up: drop `RemoveDir`/`MakeDir` and let `dataprovider-postgres` overwrite in place, so MSBuild's incremental check actually works. + +**(d) `make lint` runs `csharpier check`.** Once generated files are produced fresh on every build, csharpier may flag them. Either (i) the generator must emit csharpier-clean output, or (ii) add `Generated/` to `.csharpierignore`. Recommend (ii) — generated code should never be linted. + +**(e) ICD10 SQLite leftovers.** Once Step 8 lands, the `*.db` files (`icd10.db`, `clinical.db`, `scheduling.db`, `gatekeeper.db`) and `*.generated.sql` files left over from prior builds become orphans. `make clean` should be extended to remove them, or they should be deleted from any working trees as a one-time cleanup. + +**(f) `dataprovider-postgres` argument set.** Confirmed via DataProvider source at `/Users/christianfindlay/Documents/Code/gigs/DataProvider/DataProvider/DataProvider.Postgres.Cli`: it accepts `--project-dir`, `--config`, `--out` and reads the connection string from `DataProvider.json`. No `--connection-string` flag exists, which is why we put the connection string in the JSON. 
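The ordinal-sort fix suggested for concern (a) comes down to ordering column names with a culture-independent comparison before emitting code. A minimal illustration (Python standing in for the C# generator; `emit_parameter_list` is a hypothetical name, and Python's default string sort compares by code point, which matches `StringComparer.Ordinal`):

```python
# Sketch of concern (a): emit parameters in a stable, culture-independent
# order instead of relying on dictionary iteration order. Illustrative only;
# the real fix belongs in the C# generator via StringComparer.Ordinal.

def emit_parameter_list(columns: dict[str, str]) -> str:
    """Render "<type> <name>" pairs sorted ordinally by column name."""
    ordered = sorted(columns.items())  # code-point order, like StringComparer.Ordinal
    return ", ".join(f"{ctype} {name}" for name, ctype in ordered)

# The same columns discovered in different orders on two machines now emit
# identical text, so generated files stop drifting between runs.
run_a = {"permission_id": "Guid", "id": "Guid", "granted_at": "DateTimeOffset"}
run_b = {"id": "Guid", "granted_at": "DateTimeOffset", "permission_id": "Guid"}
assert emit_parameter_list(run_a) == emit_parameter_list(run_b)
print(emit_parameter_list(run_a))
# → DateTimeOffset granted_at, Guid id, Guid permission_id
```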
diff --git a/docs/specs/gatekeeper-spec.md b/docs/specs/gatekeeper-spec.md
new file mode 100644
index 0000000..fd3301a
--- /dev/null
+++ b/docs/specs/gatekeeper-spec.md
@@ -0,0 +1,910 @@
+# Gatekeeper: Authentication & Authorization Microservice
+
+## Overview
+
+Gatekeeper is an independent, deployable authentication and authorization microservice implementing:
+- **Passkey-only authentication** (WebAuthn/FIDO2) - no passwords
+- **Fine-grained RBAC** with record-level permissions
+- **Code-level permission attributes** - C# attributes specify required permissions or roles, distinct from .NET ABAC
+
+This service is framework-agnostic and can be integrated with any system via REST API.
+
+---
+
+## Authoritative References
+
+### WebAuthn/FIDO2 Standards
+- [W3C WebAuthn Specification](https://www.w3.org/TR/webauthn-3/)
+- [FIDO Alliance Technical Specifications](https://fidoalliance.org/specs/fido-v2.0-id-20180227/fido-client-to-authenticator-protocol-v2.0-id-20180227.html)
+- [WebAuthn Guide (webauthn.guide)](https://webauthn.guide/)
+
+### ASP.NET Core Implementation
+- [ASP.NET Core Passkeys Documentation](https://learn.microsoft.com/en-us/aspnet/core/security/authentication/passkeys/?view=aspnetcore-10.0)
+- [fido2-net-lib GitHub](https://github.com/passwordless-lib/fido2-net-lib) - Recommended library for .NET 9
+- [Syncfusion FIDO2 Tutorial](https://www.syncfusion.com/blogs/post/passkey-in-asp-dotnet-core-with-fido2)
+- [damienbod/AspNetCoreIdentityFido2Mfa](https://github.com/damienbod/AspNetCoreIdentityFido2Mfa) - .NET 9 reference implementation
+
+### React Implementation
+- [SimpleWebAuthn Documentation](https://simplewebauthn.dev/docs/packages/browser/)
+- [SimpleWebAuthn Server Package](https://simplewebauthn.dev/docs/packages/server)
+- [Complete React + WebAuthn Guide](https://medium.com/@siddhantahire98/building-a-modern-authentication-system-with-webauthn-passkeys-a-complete-guide-65cac3511049)
+
+### Access Control Design
+- [NocoBase RBAC Design
Guide](https://www.nocobase.com/en/blog/how-to-design-rbac-role-based-access-control-system) +- [Oso RBAC Layer Guide](https://www.osohq.com/learn/rbac-role-based-access-control) +- [SQLFlash Fine-Grained RBAC](https://sqlflash.ai/article/20250617-2/) +- [Hoop.dev Fine-Grained Access Control](https://hoop.dev/blog/fine-grained-access-control-and-rbac-building-secure-and-scalable-permission-systems/) +- [Permify Fine-Grained Access](https://permify.co/post/fine-grained-access-control-where-rbac-falls-short/) + +--- + +## Architecture + +``` +┌─────────────────────────────────────────────────────────────────┐ +│ Client Application │ +│ (React Dashboard, etc.) │ +│ │ +│ ┌─────────────────┐ ┌─────────────────┐ │ +│ │ @simplewebauthn │ │ API Client │ │ +│ │ /browser │ │ (fetch/axios) │ │ +│ └────────┬────────┘ └────────┬────────┘ │ +└───────────┼──────────────────────┼──────────────────────────────┘ + │ │ + ▼ ▼ +┌─────────────────────────────────────────────────────────────────┐ +│ Gatekeeper API (:5002) │ +│ │ +│ ┌─────────────────────────────────────────────────────────┐ │ +│ │ Authentication Layer │ │ +│ │ POST /auth/register/begin - Start passkey creation │ │ +│ │ POST /auth/register/complete - Finish registration │ │ +│ │ POST /auth/login/begin - Start authentication │ │ +│ │ POST /auth/login/complete - Finish authentication │ │ +│ │ POST /auth/logout - Invalidate session │ │ +│ │ GET /auth/session - Get current session │ │ +│ └─────────────────────────────────────────────────────────┘ │ +│ │ +│ ┌─────────────────────────────────────────────────────────┐ │ +│ │ Authorization Layer │ │ +│ │ GET /authz/check - Check permission │ │ +│ │ GET /authz/permissions - List user permissions │ │ +│ │ POST /authz/evaluate - Bulk permission check │ │ +│ └─────────────────────────────────────────────────────────┘ │ +│ │ +│ ┌─────────────────────────────────────────────────────────┐ │ +│ │ Admin API (protected) │ │ +│ │ /admin/users - User management │ │ +│ │ /admin/roles - Role 
management │ │ +│ │ /admin/permissions - Permission management │ │ +│ │ /admin/policies - ABAC policy management │ │ +│ └─────────────────────────────────────────────────────────┘ │ +│ │ +│ ┌─────────────────────────────────────────────────────────┐ │ +│ │ fido2-net-lib (Fido2.AspNet) │ │ +│ │ Attestation & Assertion verification │ │ +│ └─────────────────────────────────────────────────────────┘ │ +└────────────────────────────┬────────────────────────────────────┘ + │ + ▼ +┌─────────────────────────────────────────────────────────────────┐ +│ Database │ +│ │ +│ Users ──┬── Credentials (passkeys) │ +│ ├── UserRoles ── Roles ── RolePermissions │ +│ └── UserPermissions (direct grants) │ +│ │ │ +│ ▼ │ +│ Permissions ── ResourceType + Action + Scope │ +│ │ +│ Policies (ABAC) ── Conditions + Attributes │ +└─────────────────────────────────────────────────────────────────┘ +``` + +--- + +## Database Schema + +```mermaid +erDiagram + %% ═══════════════════════════════════════════════════════════════ + %% CORE AUTHENTICATION + %% ═══════════════════════════════════════════════════════════════ + + gk_user { + text id PK "UUID" + text display_name "NOT NULL" + text email UK "Unique index" + text created_at "NOT NULL, ISO8601" + text last_login_at "ISO8601" + boolean is_active "NOT NULL, default 1" + json metadata "Extensibility" + } + + gk_credential { + text id PK "Base64URL credential ID" + text user_id FK "NOT NULL" + blob public_key "NOT NULL, COSE format" + int sign_count "NOT NULL, default 0" + text aaguid "Authenticator AAGUID" + text credential_type "NOT NULL, 'public-key'" + json transports "['internal','usb','ble','nfc']" + text attestation_format + text created_at "NOT NULL" + text last_used_at + text device_name "User-friendly name" + boolean is_backup_eligible "BE flag" + boolean is_backed_up "BS flag" + } + + gk_session { + text id PK "JWT JTI" + text user_id FK "NOT NULL" + text credential_id FK + text created_at "NOT NULL" + text expires_at "NOT NULL" + text 
last_activity_at "NOT NULL" + text ip_address + text user_agent + boolean is_revoked "NOT NULL, default 0" + } + + gk_challenge { + text id PK "UUID" + text user_id "NULL for login" + blob challenge "NOT NULL, crypto random" + text type "NOT NULL, registration|authentication" + text created_at "NOT NULL" + text expires_at "NOT NULL, 5 min" + } + + %% ═══════════════════════════════════════════════════════════════ + %% RBAC + %% ═══════════════════════════════════════════════════════════════ + + gk_role { + text id PK + text name UK "NOT NULL, unique" + text description + boolean is_system "NOT NULL, default 0" + text created_at "NOT NULL" + text parent_role_id FK "Role hierarchy" + } + + gk_user_role { + text user_id PK,FK "Composite PK" + text role_id PK,FK "Composite PK" + text granted_at "NOT NULL" + text granted_by FK + text expires_at "Temporal grants" + } + + gk_permission { + text id PK + text code UK "NOT NULL, e.g. patient:read" + text resource_type "NOT NULL" + text action "NOT NULL, read|write|delete" + text description + text created_at "NOT NULL" + } + + gk_role_permission { + text role_id PK,FK "Composite PK" + text permission_id PK,FK "Composite PK" + text granted_at "NOT NULL" + } + + gk_user_permission { + text user_id FK "NOT NULL" + text permission_id FK "NOT NULL" + text scope_type "all|record|query" + text scope_value "Record ID or LQL query" + text granted_at "NOT NULL" + text granted_by FK + text expires_at + text reason "Audit trail" + } + + %% ═══════════════════════════════════════════════════════════════ + %% FINE-GRAINED ACCESS CONTROL + %% ═══════════════════════════════════════════════════════════════ + + gk_resource_grant { + text id PK + text user_id FK "NOT NULL" + text resource_type "NOT NULL, e.g. patient" + text resource_id "NOT NULL, e.g. 
patient-uuid" + text permission_id FK "NOT NULL" + text granted_at "NOT NULL" + text granted_by FK + text expires_at + } + + gk_policy { + text id PK + text name UK "NOT NULL" + text description + text resource_type "NOT NULL" + text action "NOT NULL" + json condition "NOT NULL, JSON expression" + text effect "NOT NULL, allow|deny" + int priority "NOT NULL, default 0" + boolean is_active "NOT NULL, default 1" + text created_at "NOT NULL" + } + + %% ═══════════════════════════════════════════════════════════════ + %% RELATIONSHIPS + %% ═══════════════════════════════════════════════════════════════ + + gk_user ||--o{ gk_credential : "has passkeys" + gk_user ||--o{ gk_session : "has sessions" + gk_credential ||--o{ gk_session : "authenticates" + + gk_user ||--o{ gk_user_role : "assigned to" + gk_role ||--o{ gk_user_role : "has members" + gk_user ||--o{ gk_user_role : "grants (granted_by)" + + gk_role ||--o| gk_role : "inherits from (parent)" + + gk_role ||--o{ gk_role_permission : "has" + gk_permission ||--o{ gk_role_permission : "granted to" + + gk_user ||--o{ gk_user_permission : "has direct" + gk_permission ||--o{ gk_user_permission : "granted directly" + gk_user ||--o{ gk_user_permission : "grants (granted_by)" + + gk_user ||--o{ gk_resource_grant : "has access to" + gk_permission ||--o{ gk_resource_grant : "defines access" + gk_user ||--o{ gk_resource_grant : "grants (granted_by)" +``` + +**Policy Condition Examples** (stored as JSON): +```json +{ "user.department": "finance", "resource.status": "draft" } +{ "time.hour": { "$gte": 9, "$lte": 17 } } +{ "user.id": { "$eq": "resource.owner_id" } } +``` + +--- + +## API Specification + +### Authentication Endpoints + +#### POST /auth/register/begin +Start passkey registration for a new or existing user. 
+ +**Request:** +```json +{ + "email": "user@example.com", + "displayName": "John Doe" +} +``` + +**Response:** +```json +{ + "challengeId": "uuid", + "options": { + "challenge": "base64url-encoded-challenge", + "rp": { + "name": "Gatekeeper", + "id": "localhost" + }, + "user": { + "id": "base64url-user-id", + "name": "user@example.com", + "displayName": "John Doe" + }, + "pubKeyCredParams": [ + { "type": "public-key", "alg": -7 }, + { "type": "public-key", "alg": -257 } + ], + "timeout": 60000, + "attestation": "none", + "authenticatorSelection": { + "authenticatorAttachment": "platform", + "residentKey": "required", + "userVerification": "required" + } + } +} +``` + +#### POST /auth/register/complete +Complete passkey registration with authenticator response. + +**Request:** +```json +{ + "challengeId": "uuid", + "response": { + "id": "credential-id", + "rawId": "base64url", + "type": "public-key", + "response": { + "clientDataJSON": "base64url", + "attestationObject": "base64url" + } + }, + "deviceName": "MacBook Pro Touch ID" +} +``` + +**Response:** +```json +{ + "userId": "user-uuid", + "credentialId": "credential-id", + "session": { + "token": "session-token", + "expiresAt": "2025-12-22T00:00:00Z" + } +} +``` + +#### POST /auth/login/begin +Start passkey authentication. + +**Request:** +```json +{ + "email": "user@example.com" // Optional - for discoverable credentials +} +``` + +**Response:** +```json +{ + "challengeId": "uuid", + "options": { + "challenge": "base64url-encoded-challenge", + "timeout": 60000, + "rpId": "localhost", + "allowCredentials": [], // Empty for discoverable credentials + "userVerification": "required" + } +} +``` + +#### POST /auth/login/complete +Complete passkey authentication. 
+ +**Request:** +```json +{ + "challengeId": "uuid", + "response": { + "id": "credential-id", + "rawId": "base64url", + "type": "public-key", + "response": { + "clientDataJSON": "base64url", + "authenticatorData": "base64url", + "signature": "base64url", + "userHandle": "base64url" + } + } +} +``` + +**Response:** +```json +{ + "userId": "user-uuid", + "displayName": "John Doe", + "session": { + "token": "session-token", + "expiresAt": "2025-12-22T00:00:00Z" + } +} +``` + +#### GET /auth/session +Get current session info. + +**Headers:** `Authorization: Bearer <session-token>` + +**Response:** +```json +{ + "userId": "user-uuid", + "displayName": "John Doe", + "email": "user@example.com", + "roles": ["admin", "clinician"], + "expiresAt": "2025-12-22T00:00:00Z" +} +``` + +#### POST /auth/logout +Invalidate current session. + +**Headers:** `Authorization: Bearer <session-token>` + +**Response:** `204 No Content` + +--- + +### Authorization Endpoints + +#### GET /authz/check +Check if current user has a specific permission. + +**Headers:** `Authorization: Bearer <session-token>` + +**Query Parameters:** +- `permission` - Permission code (e.g., `patient:read`) +- `resourceType` - Optional resource type +- `resourceId` - Optional specific resource ID + +**Response:** +```json +{ + "allowed": true, + "reason": "role:admin grants patient:read", + "evaluatedPolicies": ["default-admin-policy"] +} +``` + +#### POST /authz/evaluate +Bulk permission evaluation. 
+ +**Request:** +```json +{ + "checks": [ + { "permission": "patient:read", "resourceId": "patient-123" }, + { "permission": "patient:write", "resourceId": "patient-123" }, + { "permission": "order:delete", "resourceId": "order-456" } + ] +} +``` + +**Response:** +```json +{ + "results": [ + { "permission": "patient:read", "resourceId": "patient-123", "allowed": true }, + { "permission": "patient:write", "resourceId": "patient-123", "allowed": true }, + { "permission": "order:delete", "resourceId": "order-456", "allowed": false } + ] +} +``` + +#### GET /authz/permissions +List all effective permissions for current user. + +**Response:** +```json +{ + "permissions": [ + { + "code": "patient:read", + "source": "role:clinician", + "scope": "all" + }, + { + "code": "patient:write", + "source": "direct-grant", + "scope": "record", + "scopeValue": "patient-123" + } + ] +} +``` + +--- + +### Admin Endpoints + +All admin endpoints require the `admin:*` permission. + +#### Users +- `GET /admin/users` - List users +- `GET /admin/users/{id}` - Get user details +- `POST /admin/users` - Create user (generates registration link) +- `PUT /admin/users/{id}` - Update user +- `DELETE /admin/users/{id}` - Deactivate user +- `GET /admin/users/{id}/credentials` - List user's passkeys +- `DELETE /admin/users/{id}/credentials/{credentialId}` - Revoke passkey + +#### Roles +- `GET /admin/roles` - List roles +- `GET /admin/roles/{id}` - Get role with permissions +- `POST /admin/roles` - Create role +- `PUT /admin/roles/{id}` - Update role +- `DELETE /admin/roles/{id}` - Delete role (if not system role) +- `POST /admin/roles/{id}/permissions` - Add permission to role +- `DELETE /admin/roles/{id}/permissions/{permissionId}` - Remove permission + +#### Permissions +- `GET /admin/permissions` - List permissions +- `POST /admin/permissions` - Create permission +- `DELETE /admin/permissions/{id}` - Delete permission + +#### User Grants +- `POST /admin/users/{id}/roles` - Assign role to user +- 
`DELETE /admin/users/{id}/roles/{roleId}` - Remove role +- `POST /admin/users/{id}/permissions` - Direct permission grant +- `DELETE /admin/users/{id}/permissions/{permissionId}` - Revoke grant +- `POST /admin/users/{id}/resources` - Grant resource-level access +- `DELETE /admin/users/{id}/resources/{grantId}` - Revoke resource access + +--- + +## Project Structure + +``` +Samples/ +└── Gatekeeper/ + ├── spec.md # This file + ├── Gatekeeper.Api/ + │ ├── Gatekeeper.Api.csproj + │ ├── Program.cs # Minimal API setup + │ ├── GlobalUsings.cs + │ ├── Endpoints/ + │ │ ├── AuthEndpoints.cs # /auth/* routes + │ │ ├── AuthzEndpoints.cs # /authz/* routes + │ │ └── AdminEndpoints.cs # /admin/* routes + │ ├── Services/ + │ │ ├── PasskeyService.cs # fido2-net-lib wrapper + │ │ ├── SessionService.cs # Session management + │ │ ├── AuthorizationService.cs # Permission evaluation + │ │ └── PolicyEvaluator.cs # ABAC policy engine + │ ├── Middleware/ + │ │ └── AuthMiddleware.cs # Session validation + │ ├── Sql/ + │ │ ├── GetUserByEmail.sql + │ │ ├── GetUserCredentials.sql + │ │ ├── InsertCredential.sql + │ │ ├── GetUserPermissions.sql + │ │ ├── CheckResourceGrant.sql + │ │ └── ... (DataProvider SQL files) + │ └── gatekeeper.db + │ + ├── Gatekeeper.Api.Tests/ + │ ├── Gatekeeper.Api.Tests.csproj + │ ├── AuthenticationTests.cs + │ ├── AuthorizationTests.cs + │ └── PermissionTests.cs + │ + └── Gatekeeper.Migration/ + ├── Gatekeeper.Migration.csproj + └── Schema.cs # Migration SchemaBuilder +``` + +--- + +## Implementation Guide + +### Dependencies (NuGet) + +```xml +<!-- Gatekeeper.Api.csproj --> +<PackageReference Include="Fido2" Version="4.*" /> +<PackageReference Include="Fido2.AspNet" Version="4.*" /> +<PackageReference Include="Microsoft.Data.Sqlite" Version="9.*" /> +``` + +### FIDO2 Configuration + +```csharp +// Program.cs +builder.Services.AddFido2(options => +{ + options.ServerDomain = builder.Configuration["Fido2:ServerDomain"] ?? 
"localhost"; + options.ServerName = "Gatekeeper"; + options.Origins = new HashSet<string> + { + builder.Configuration["Fido2:Origin"] ?? "http://localhost:5173" + }; + options.TimestampDriftTolerance = 300000; // 5 minutes +}); +``` + +### React Integration + +```typescript +// Using @simplewebauthn/browser +import { + startRegistration, + startAuthentication +} from '@simplewebauthn/browser'; + +// Registration +async function registerPasskey() { + const beginResp = await fetch('/auth/register/begin', { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ email, displayName }) + }); + const { challengeId, options } = await beginResp.json(); + + // Trigger browser passkey creation + const credential = await startRegistration(options); + + const completeResp = await fetch('/auth/register/complete', { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ challengeId, response: credential }) + }); + + return completeResp.json(); +} + +// Authentication +async function loginWithPasskey() { + const beginResp = await fetch('/auth/login/begin', { method: 'POST' }); + const { challengeId, options } = await beginResp.json(); + + const assertion = await startAuthentication(options); + + const completeResp = await fetch('/auth/login/complete', { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ challengeId, response: assertion }) + }); + + return completeResp.json(); +} +``` + +--- + +## Permission Model Examples + +### High-Level Permissions (Menu Access) +``` +menu:dashboard:access - Can see dashboard +menu:patients:access - Can see patients menu +menu:admin:access - Can see admin menu +``` + +### Resource Permissions (CRUD) +``` +patient:create - Can create patients +patient:read - Can read any patient +patient:write - Can update any patient +patient:delete - Can delete any patient + +order:create +order:read +order:write +order:delete +``` + +### 
Record-Level Permissions +Granted via `POST /admin/users/{id}/resources`: +```json +// User can only read order 123456 +{ + "resourceType": "order", + "resourceId": "123456", + "permissionCode": "order:read" +} + +// User can write to order 54345 +{ + "resourceType": "order", + "resourceId": "54345", + "permissionCode": "order:write" +} +``` + +### ABAC Policy Examples +```json +// Policy: Users can only edit resources they own +{ + "name": "owner-edit-policy", + "resource_type": "*", + "action": "write", + "condition": { + "user.id": { "$eq": "resource.owner_id" } + }, + "effect": "allow" +} + +// Policy: Finance users can only access finance resources during business hours +{ + "name": "finance-time-restriction", + "resource_type": "finance_report", + "action": "*", + "condition": { + "$and": [ + { "user.department": "finance" }, + { "context.hour": { "$gte": 9, "$lte": 17 } }, + { "context.day_of_week": { "$in": [1, 2, 3, 4, 5] } } + ] + }, + "effect": "allow" +} + +// Policy: Deny access to archived records except for admins +{ + "name": "archived-restriction", + "resource_type": "*", + "action": "*", + "condition": { + "$and": [ + { "resource.status": "archived" }, + { "user.roles": { "$nin": ["admin"] } } + ] + }, + "effect": "deny", + "priority": 100 +} +``` + +--- + +## Authorization Decision Flow + +``` +┌─────────────────────────────────────────────────────────────────┐ +│ Authorization Request │ +│ User: user-123, Permission: order:write, Resource: order-456 │ +└────────────────────────────┬────────────────────────────────────┘ + │ + ▼ +┌─────────────────────────────────────────────────────────────────┐ +│ Step 1: Check explicit DENY policies (highest priority first) │ +│ → If any DENY matches → DENY │ +└────────────────────────────┬────────────────────────────────────┘ + │ No DENY + ▼ +┌─────────────────────────────────────────────────────────────────┐ +│ Step 2: Check resource-level grants (gk_resource_grant) │ +│ → SELECT * FROM gk_resource_grant │ +│ 
WHERE user_id = ? AND resource_type = ? │ +│ AND resource_id = ? AND permission_id = ? │ +│ → If found and not expired → ALLOW │ +└────────────────────────────┬────────────────────────────────────┘ + │ Not found + ▼ +┌─────────────────────────────────────────────────────────────────┐ +│ Step 3: Check direct user permissions (gk_user_permission) │ +│ → If scope='all' → ALLOW │ +│ → If scope='record' and scope_value matches → ALLOW │ +└────────────────────────────┬────────────────────────────────────┘ + │ Not found + ▼ +┌─────────────────────────────────────────────────────────────────┐ +│ Step 4: Check role permissions (gk_role_permission via roles) │ +│ → Traverse role hierarchy │ +│ → If permission found in any role → ALLOW │ +└────────────────────────────┬────────────────────────────────────┘ + │ Not found + ▼ +┌─────────────────────────────────────────────────────────────────┐ +│ Step 5: Evaluate ABAC ALLOW policies │ +│ → If any ALLOW policy matches → ALLOW │ +└────────────────────────────┬────────────────────────────────────┘ + │ No match + ▼ +┌─────────────────────────────────────────────────────────────────┐ +│ Default: DENY │ +└─────────────────────────────────────────────────────────────────┘ +``` + +--- + +## Security Considerations + +1. **No Password Storage** - Only public keys stored; private keys never leave user devices +2. **Challenge Expiry** - All challenges expire after 5 minutes +3. **Session Rotation** - Sessions can be invalidated server-side +4. **Sign Count Verification** - Detect cloned authenticators +5. **User Verification Required** - Biometric/PIN required for all operations +6. 
**Audit Logging** - All authentication and authorization events logged
+
+---
+
+## Integration with Other Services
+
+Other microservices integrate via middleware:
+
+```csharp
+// In Clinical.Api or Scheduling.Api
+app.Use(async (context, next) =>
+{
+    var token = context.Request.Headers["Authorization"]
+        .ToString()
+        .Replace("Bearer ", "");
+
+    if (string.IsNullOrEmpty(token))
+    {
+        context.Response.StatusCode = 401;
+        return;
+    }
+
+    // Validate with Gatekeeper. Build the request explicitly so the
+    // Authorization header is forwarded (HttpClient.GetAsync has no
+    // HttpRequestMessage overload).
+    using var client = new HttpClient();
+    using var request = new HttpRequestMessage(
+        HttpMethod.Get, "http://localhost:5002/auth/session");
+    request.Headers.Authorization = new("Bearer", token);
+    var response = await client.SendAsync(request);
+
+    if (!response.IsSuccessStatusCode)
+    {
+        context.Response.StatusCode = 401;
+        return;
+    }
+
+    var session = await response.Content.ReadFromJsonAsync<SessionInfo>();
+    context.Items["User"] = session;
+
+    await next();
+});
+```
+
+Or via shared library:
+```csharp
+// Gatekeeper.Client library
+services.AddGatekeeperAuth(options =>
+{
+    options.GatekeeperUrl = "http://localhost:5002";
+});
+
+// Then in endpoints:
+app.MapGet("/fhir/Patient", async (HttpContext ctx) =>
+{
+    var authz = ctx.RequestServices.GetRequiredService<IGatekeeperClient>();
+    if (!await authz.CheckAsync("patient:read"))
+        return Results.Forbid();
+
+    // ... handle request
+});
+```
+
+---
+
+## Default Roles & Permissions
+
+Seeded via `Gatekeeper.Migration` on first run:
+
+| Role | Description | System?
+|------|-------------|---------|
+| `admin` | Full system access | Yes |
+| `user` | Basic authenticated user | Yes |
+
+| Permission Code | Resource | Action | Description |
+|-----------------|----------|--------|-------------|
+| `admin:*` | admin | * | Full admin access |
+| `user:profile` | user | read | View own profile |
+| `user:credentials` | user | manage | Manage own passkeys |
+
+Role-to-permission grants:
+
+| Role | Permissions |
+|------|-------------|
+| `admin` | `admin:*` |
+| `user` | `user:profile`, `user:credentials` |
+
+---
+
+## Sync Support
+
+Gatekeeper integrates with the existing Sync infrastructure for multi-node deployments. Sync triggers are enabled on the permission tables via the `Sync.SQLite` schema extensions, reusing the existing `Sync.Http` infrastructure.
+
+---
+
+## Open Questions
+
+1. **Token Format**: JWT vs opaque session tokens?
+   - Current spec uses opaque tokens for server-side revocation
+   - JWT could be added for stateless verification in edge cases
+
+2. **Cross-Origin Passkeys**: Support for passkeys across subdomains?
+   - Requires careful RP ID configuration
+
+3. **Recovery Flow**: What happens if a user loses all devices?
+   - Admin-initiated account recovery?
+   - Backup codes? 
(against passkey-only philosophy)
+
+---
+
+## Version History
+
+| Version | Date | Changes |
+|---------|------|---------|
+| 1.0.0 | 2025-12-21 | Initial specification |
diff --git a/opencode.json b/opencode.json
index e35e90a..c881b86 100644
--- a/opencode.json
+++ b/opencode.json
@@ -1,5 +1,5 @@
 {
-  "_agent_pmo": "d58c330",
+  "_agent_pmo": "29b9dcf",
   "$schema": "https://opencode.ai/config.json",
   "instructions": ["CLAUDE.md"]
 }
diff --git a/readme.md b/readme.md
index 3c18ee2..d6c709c 100644
--- a/readme.md
+++ b/readme.md
@@ -15,13 +15,13 @@ This sample showcases:
 ```bash
 # Run all APIs locally against Docker Postgres
-./scripts/start-local.sh
+make start-local
 
 # Run everything in Docker containers
-./scripts/start.sh
+make start-docker
 
-# Run APIs + sync workers
-./scripts/start.sh --sync
+# Force rebuild of the docker images
+make start-docker BUILD=1
 ```
 
 | Service | URL |
@@ -102,11 +102,7 @@ Built with H5 transpiler (C#->JavaScript) + React 18.
 ```
 Samples/
-+-- scripts/
-|   +-- start.sh          # Docker startup script
-|   +-- start-local.sh    # Local dev startup script
-|   +-- clean.sh          # Clean Docker environment
-|   +-- clean-local.sh    # Clean local environment
++-- Makefile              # All build/test/dev-stack targets (make help)
 +-- Clinical/
 |   +-- Clinical.Api/       # REST API (PostgreSQL)
 |   +-- Clinical.Api.Tests/ # E2E tests
diff --git a/scripts/clean-local.sh b/scripts/clean-local.sh
deleted file mode 100755
index e844564..0000000
--- a/scripts/clean-local.sh
+++ /dev/null
@@ -1,34 +0,0 @@
-#!/bin/bash
-# Healthcare Samples - Clean local development environment
-# Kills running services and drops the Postgres database volume
-# Usage: ./clean-local.sh
-
-set -e
-
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-SAMPLES_DIR="$(dirname "$SCRIPT_DIR")"
-REPO_ROOT="$(dirname "$SAMPLES_DIR")"
-
-kill_port() {
-  local port=$1
-  local pids
-  pids=$(lsof -ti :"$port" 2>/dev/null || true)
-  if [ -n "$pids" ]; then
-    echo "Killing processes on port $port: $pids"
-    echo "$pids" 
| xargs kill -9 2>/dev/null || true
-    sleep 0.5
-  fi
-}
-
-echo "Clearing ports..."
-kill_port 5002
-kill_port 5080
-kill_port 5001
-kill_port 5090
-kill_port 5173
-
-echo "Removing Postgres volume..."
-cd "$REPO_ROOT"
-docker compose -f docker-compose.postgres.yml down -v 2>/dev/null || true
-
-echo "Clean complete."
diff --git a/scripts/clean.sh b/scripts/clean.sh
deleted file mode 100755
index 9800e94..0000000
--- a/scripts/clean.sh
+++ /dev/null
@@ -1,34 +0,0 @@
-#!/bin/bash
-# Healthcare Samples - Clean Docker environment
-# Kills running services and drops all Docker volumes
-# Usage: ./clean.sh
-
-set -e
-
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-SAMPLES_DIR="$(dirname "$SCRIPT_DIR")"
-
-kill_port() {
-  local port=$1
-  local pids
-  pids=$(lsof -ti :"$port" 2>/dev/null || true)
-  if [ -n "$pids" ]; then
-    echo "Killing processes on port $port: $pids"
-    echo "$pids" | xargs kill -9 2>/dev/null || true
-    sleep 0.5
-  fi
-}
-
-echo "Clearing ports..."
-kill_port 5432
-kill_port 5002
-kill_port 5080
-kill_port 5001
-kill_port 5090
-kill_port 5173
-
-echo "Removing Docker volumes..."
-cd "$SAMPLES_DIR/docker"
-docker compose down -v
-
-echo "Clean complete."
diff --git a/scripts/start-local.sh b/scripts/start-local.sh
deleted file mode 100755
index d5069c2..0000000
--- a/scripts/start-local.sh
+++ /dev/null
@@ -1,177 +0,0 @@
-#!/bin/bash
-# Healthcare Samples - Local Development
-# Runs all 4 APIs locally against docker-compose.postgres.yml
-#
-# Prerequisites:
-#   docker compose -f docker-compose.postgres.yml up -d
-#
-# Usage: ./start-local.sh [--fresh]
-#   --fresh: Drop postgres volume and recreate
-
-set -e
-
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-SAMPLES_DIR="$(dirname "$SCRIPT_DIR")"
-REPO_ROOT="$(dirname "$SAMPLES_DIR")"
-PIDS=()
-
-for arg in "$@"; do
-  case $arg in
-    --fresh) "$SCRIPT_DIR/clean-local.sh" ;;
-  esac
-done
-
-# ── Ensure Postgres is running ──────────────────────────────────────
-cd "$REPO_ROOT"
-
-if ! 
pg_isready -h localhost -p 5432 -q 2>/dev/null; then
-  echo "Postgres not running. Starting via docker-compose.postgres.yml..."
-  docker compose -f docker-compose.postgres.yml up -d
-  echo "Waiting for Postgres..."
-  for i in {1..30}; do
-    if pg_isready -h localhost -p 5432 -q 2>/dev/null; then
-      echo "Postgres ready!"
-      break
-    fi
-    sleep 1
-  done
-fi
-
-# ── Set up Python venv (shared by embedding service + import) ─────
-VENV_DIR="$SAMPLES_DIR/ICD10/.venv"
-EMBED_DIR="$SAMPLES_DIR/ICD10/embedding-service"
-
-echo ""
-echo "Setting up Python environment..."
-if [ ! -d "$VENV_DIR" ]; then
-  python3 -m venv "$VENV_DIR"
-fi
-"$VENV_DIR/bin/pip" install -q \
-  -r "$EMBED_DIR/requirements.txt" \
-  psycopg2-binary click requests
-echo "Python environment ready."
-
-# ── Start Embedding Service ───────────────────────────────────────
-echo "Starting Embedding Service on :8000 (model loading may take a moment)..."
-"$VENV_DIR/bin/python" -m uvicorn main:app --host 0.0.0.0 --port 8000 \
-  --app-dir "$EMBED_DIR" \
-  2>&1 | sed 's/^/ [embedding] /' &
-PIDS+=($!)
-
-# ── ICD10 data population (runs after APIs + embedding are ready) ─
-populate_icd10() {
-  local CONN_STR="Host=localhost;Database=icd10;Username=icd10;Password=$DB_PASS"
-  local SCRIPTS_DIR="$SAMPLES_DIR/ICD10/scripts/CreateDb"
-
-  # Wait for ICD10 API to be ready
-  echo " [icd10-import] Waiting for ICD10 API..."
-  for i in {1..60}; do
-    if curl -sf http://localhost:5090/health >/dev/null 2>&1; then
-      echo " [icd10-import] ICD10 API is up."
-      break
-    fi
-    sleep 2
-  done
-
-  # Wait for embedding service to be ready (needed for AI search)
-  echo " [icd10-import] Waiting for embedding service..."
-  for i in {1..120}; do
-    if curl -sf http://localhost:8000/health >/dev/null 2>&1; then
-      echo " [icd10-import] Embedding service ready."
-      break
-    fi
-    sleep 2
-  done
-
-  # Check if data already exists (query the chapters endpoint)
-  local CHAPTERS
-  CHAPTERS=$(curl -sf http://localhost:5090/api/icd10/chapters 2>/dev/null || echo "[]")
-  if [ "$CHAPTERS" = "[]" ] || [ "$CHAPTERS" = "" ]; then
-    echo " [icd10-import] No ICD10 data found. Running full Postgres import..."
-    EMBEDDING_SERVICE_URL="http://localhost:8000" \
-      "$VENV_DIR/bin/python" "$SCRIPTS_DIR/import_postgres.py" \
-      --connection-string "$CONN_STR" || echo " [icd10-import] Import encountered errors (check logs above)"
-  else
-    echo " [icd10-import] ICD10 codes already populated. Generating missing embeddings..."
-    EMBEDDING_SERVICE_URL="http://localhost:8000" \
-      "$VENV_DIR/bin/python" "$SCRIPTS_DIR/import_postgres.py" \
-      --connection-string "$CONN_STR" --embeddings-only || echo " [icd10-import] Embedding generation encountered errors"
-  fi
-}
-
-# ── Build all projects (avoids parallel build contention) ───────────
-echo ""
-echo "Building all projects..."
-dotnet build "$REPO_ROOT/Gatekeeper/Gatekeeper.Api/Gatekeeper.Api.csproj" --nologo -v q
-dotnet build "$SAMPLES_DIR/Clinical/Clinical.Api/Clinical.Api.csproj" --nologo -v q
-dotnet build "$SAMPLES_DIR/Scheduling/Scheduling.Api/Scheduling.Api.csproj" --nologo -v q
-dotnet build "$SAMPLES_DIR/ICD10/ICD10.Api/ICD10.Api.csproj" --nologo -v q
-dotnet build "$SAMPLES_DIR/Dashboard/Dashboard.Web/Dashboard.Web.csproj" -c Release --nologo -v q
-echo "All projects built."
-
-# ── Cleanup on exit ─────────────────────────────────────────────────
-cleanup() {
-  echo ""
-  echo "Shutting down..."
-  for pid in "${PIDS[@]}"; do
-    kill "$pid" 2>/dev/null || true
-  done
-  wait 2>/dev/null || true
-  echo "All services stopped."
-}
-trap cleanup EXIT INT TERM
-
-# ── Start APIs (--no-build since we pre-built above) ────────────────
-echo ""
-DB_PASS="${DB_PASSWORD:-changeme}"
-
-echo "Starting Gatekeeper.Api on :5002..."
-ConnectionStrings__Postgres="Host=localhost;Database=gatekeeper;Username=gatekeeper;Password=$DB_PASS" \
-  dotnet run --no-build --project "$REPO_ROOT/Gatekeeper/Gatekeeper.Api/Gatekeeper.Api.csproj" --no-launch-profile \
-  --urls "http://localhost:5002" \
-  2>&1 | sed 's/^/ [gatekeeper] /' &
-PIDS+=($!)
-
-echo "Starting Clinical.Api on :5080..."
-ConnectionStrings__Postgres="Host=localhost;Database=clinical;Username=clinical;Password=$DB_PASS" \
-  dotnet run --no-build --project "$SAMPLES_DIR/Clinical/Clinical.Api/Clinical.Api.csproj" --no-launch-profile \
-  --urls "http://localhost:5080" \
-  2>&1 | sed 's/^/ [clinical] /' &
-PIDS+=($!)
-
-echo "Starting Scheduling.Api on :5001..."
-ConnectionStrings__Postgres="Host=localhost;Database=scheduling;Username=scheduling;Password=$DB_PASS" \
-  dotnet run --no-build --project "$SAMPLES_DIR/Scheduling/Scheduling.Api/Scheduling.Api.csproj" --no-launch-profile \
-  --urls "http://localhost:5001" \
-  2>&1 | sed 's/^/ [scheduling] /' &
-PIDS+=($!)
-
-echo "Starting ICD10.Api on :5090..."
-ConnectionStrings__Postgres="Host=localhost;Database=icd10;Username=icd10;Password=$DB_PASS" \
-  dotnet run --no-build --project "$SAMPLES_DIR/ICD10/ICD10.Api/ICD10.Api.csproj" --no-launch-profile \
-  --urls "http://localhost:5090" \
-  2>&1 | sed 's/^/ [icd10] /' &
-PIDS+=($!)
-
-echo "Starting Dashboard on :5173..."
-python3 -m http.server 5173 --directory "$SAMPLES_DIR/Dashboard/Dashboard.Web/wwwroot" \
-  2>&1 | sed 's/^/ [dashboard] /' &
-PIDS+=($!)
-
-# Populate ICD10 data in background (waits for API, then imports if empty)
-populate_icd10 &
-PIDS+=($!)
-
-echo ""
-echo "════════════════════════════════════════"
-echo " Gatekeeper: http://localhost:5002"
-echo " Clinical:   http://localhost:5080"
-echo " Scheduling: http://localhost:5001"
-echo " ICD10:      http://localhost:5090"
-echo " Embedding:  http://localhost:8000"
-echo " Dashboard:  http://localhost:5173"
-echo "════════════════════════════════════════"
-echo " Press Ctrl+C to stop all services"
-echo ""
-
-wait
diff --git a/scripts/start.sh b/scripts/start.sh
deleted file mode 100755
index 2c43ce7..0000000
--- a/scripts/start.sh
+++ /dev/null
@@ -1,39 +0,0 @@
-#!/bin/bash
-# Healthcare Samples - Docker Compose wrapper
-# Usage: ./start.sh [--fresh] [--build]
-#   --fresh: Drop volumes and start clean
-#   --build: Force rebuild containers
-
-set -e
-
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-SAMPLES_DIR="$(dirname "$SCRIPT_DIR")"
-
-BUILD=""
-
-for arg in "$@"; do
-  case $arg in
-    --fresh) "$SCRIPT_DIR/clean.sh" ;;
-    --build) BUILD="--build" ;;
-  esac
-done
-
-# Build Dashboard locally (H5 transpiler doesn't work in Docker Linux)
-echo "Building Dashboard locally (H5 requires native build)..."
-cd "$SAMPLES_DIR/Dashboard/Dashboard.Web"
-dotnet publish -c Release -o "$SAMPLES_DIR/docker/dashboard-build" --nologo -v q
-echo "Dashboard built successfully"
-
-cd "$SAMPLES_DIR/docker"
-
-echo "Starting services..."
-docker compose up $BUILD
-
-# 3 containers:
-#   db:        Postgres with all databases (localhost:5432)
-#   app:       All .NET APIs + sync workers
-#     - Gatekeeper: localhost:5002
-#     - Clinical:   localhost:5080
-#     - Scheduling: localhost:5001
-#     - ICD10:      localhost:5090
-#   dashboard: Static files (localhost:5173)
diff --git a/xunit.runner.json b/xunit.runner.json
new file mode 100644
index 0000000..34723a8
--- /dev/null
+++ b/xunit.runner.json
@@ -0,0 +1,10 @@
+{
+  "$schema": "https://xunit.net/schema/current/xunit.runner.schema.json",
+  "parallelizeAssembly": false,
+  "parallelizeTestCollections": false,
+  "maxParallelThreads": 1,
+  "stopOnFail": true,
+  "diagnosticMessages": true,
+  "longRunningTestSeconds": 30,
+  "methodDisplay": "method"
+}
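Taken together, the layered permission check in the Gatekeeper spec above reduces to a short-circuit OR over the lookup steps, with DENY as the fall-through default. A minimal C# sketch of that decision logic (flag names are illustrative, not actual Gatekeeper types; each flag stands for the outcome of one lookup step):

```csharp
// Sketch of the layered authorization flow from the Gatekeeper spec.
// Each parameter is the result of one lookup step from the diagram.
static bool IsAllowed(
    bool cacheHit,     // Step 2: gk_permission_cache row found and not expired
    bool directGrant,  // Step 3: gk_user_permission, scope 'all' or matching 'record'
    bool roleGrant,    // Step 4: gk_role_permission reached via the role hierarchy
    bool abacAllow)    // Step 5: some ABAC ALLOW policy matches
    => cacheHit || directGrant || roleGrant || abacAllow;
    // Nothing matched: fall through to the default, DENY.

Console.WriteLine(IsAllowed(false, false, false, false)); // default: False
```

Because every branch short-circuits to ALLOW, the ordering only affects cost, not the outcome; the cache lookup presumably runs first because it is the cheapest check.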