fix(security): harden authentication, add rate limiting, CSRF, input validation #60
Helal-maker wants to merge 8 commits into weroperking:main
Conversation
- docker: update MinIO images to latest stable versions
- cli: refactor deploy command argument handling for clarity
- provider: remove redundant WebSocket reconnection logic
- …ME, and SELF_HOSTED files
- …ket state, remove leaked content, fix broken links, sync Docker versions
- …validation

  Critical security fixes addressing multiple vulnerabilities found in codebase review:
  - Fix auth bypass in auth template (removed 'code.length === 6' bypass)
  - Add in-memory rate limiting (5 attempts/15min) on admin login
  - Add WebSocket authentication requiring Bearer token
  - Remove hardcoded S3 credentials (fail fast if not configured)
  - Add input validation for filename/extension in storage endpoint
  - Add CSRF middleware with double-submit cookie pattern
  - Reduce JWT token expiry from 30 days to 7 days
  - Implement RLS role-based policy evaluation (auth.role() support)
  - Add comprehensive Security_Review.md spec document

  All changes verified with typecheck (server + core) and tests pass.
Walkthrough

Pins Bun and container images, tweaks Docker build stages and compose files, centralizes the dashboard setup check, adds CSRF middleware and admin login rate limiting, shortens admin JWT expiry, enforces WebSocket auth, tightens storage filename validation, extends RLS evaluation with userRole, refactors WebSocket reconnect logic and several CLI handlers, and adds large spec/security docs.

Sequence diagram (summary): Client ->> Handler: GET /betterbase/ws?project=... with Authorization: Bearer (or ?token=...)

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 11
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (4)
packages/cli/src/index.ts (1)
32-53: ⚠️ Potential issue | 🟠 Major

Auth gate fails with global options before the command.

Line 49 uses `process.argv[2]`, which breaks when global flags appear before the command (e.g., `bb --debug login`). Since Commander.js v12 parses options anywhere on the command line, `process.argv[2]` becomes `"--debug"` instead of `"login"`, forcing the auth check onto public commands. New users who run `bb --debug login` will hit an authentication failure before login completes.

The `preAction` hook receives `(thisCommand, actionCommand)` as parameters; use `actionCommand.name()` instead:

Fix

```diff
-async function checkAuthHook(): Promise<void> {
-  const commandName = process.argv[2];
+async function checkAuthHook(_thisCommand: Command, actionCommand: Command): Promise<void> {
+  const commandName = actionCommand.name();
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/cli/src/index.ts` around lines 32-53: the auth gate currently reads the command from process.argv[2], which breaks when global flags precede the command; update checkAuthHook to accept the preAction hook parameters (thisCommand, actionCommand) and derive the command via actionCommand.name() (falling back to an empty string or the first arg if necessary), then use PUBLIC_COMMANDS.includes(commandName) for the skip check; update the preAction hook registration to pass those parameters into checkAuthHook so public commands like login are correctly recognized even when global flags appear first.

packages/server/src/routes/betterbase/index.ts (2)
94-105: ⚠️ Potential issue | 🟠 Major

Inconsistent credential handling: `buildStorageCtx` still falls back to hardcoded credentials.

`generate-upload-url` now correctly fails fast when `STORAGE_ACCESS_KEY`/`STORAGE_SECRET_KEY` are missing (lines 206-208), but `buildStorageCtx` still uses `"minioadmin"` fallbacks. This creates an inconsistent security posture: queries/mutations/actions will silently use default credentials while direct uploads fail. Apply the same fail-fast pattern here or throw at startup.

Proposed fix

```diff
 function buildStorageCtx(pool: any, projectSlug: string): StorageCtx {
   const env = validateEnv();
+  if (!env.STORAGE_ACCESS_KEY || !env.STORAGE_SECRET_KEY) {
+    throw new Error("Storage credentials not configured");
+  }
   return new StorageCtx({
     pool,
     projectSlug,
     endpoint: env.STORAGE_ENDPOINT ?? "http://minio:9000",
-    accessKey: env.STORAGE_ACCESS_KEY ?? "minioadmin",
-    secretKey: env.STORAGE_SECRET_KEY ?? "minioadmin",
+    accessKey: env.STORAGE_ACCESS_KEY,
+    secretKey: env.STORAGE_SECRET_KEY,
     bucket: env.STORAGE_BUCKET ?? "betterbase",
     publicBase: env.STORAGE_PUBLIC_BASE,
   });
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/server/src/routes/betterbase/index.ts` around lines 94 - 105, buildStorageCtx currently falls back to hardcoded "minioadmin" credentials which is inconsistent with the fail-fast behavior in generate-upload-url; update buildStorageCtx (function buildStorageCtx and its use of validateEnv) to require STORAGE_ACCESS_KEY and STORAGE_SECRET_KEY instead of using defaults—either validate and throw/exit if those env vars are missing (same pattern as generate-upload-url) or propagate the validated env values directly (remove the "minioadmin" fallbacks for accessKey/secretKey and ensure publicBase/endpoint/bucket use validateEnv outputs).
234-244: ⚠️ Potential issue | 🔴 Critical

SQL injection via unvalidated `projectSlug` in dynamic schema names; affects multiple handlers.

`projectSlug` is read directly from the `X-Project-Slug` header without validation. At lines 56 and 234-235, it's interpolated into SQL query strings. A malicious header value like `foo"; DROP TABLE --` compromises the database.

The coding guidelines forbid string interpolation with user input for dynamic schemas. Additionally, this pattern appears in two handlers:

- Main POST handler (line 56): ``const dbSchema = `project_${projectSlug}`;``
- `/storage/generate-upload-url` handler (line 234): ``INSERT INTO "project_${projectSlug}"._iac_storage``

Fix: Validate `projectSlug` against the allowed pattern `[a-z][a-z0-9_]*` before use, or use `pg-format` with identifier escaping (`%I`). A `validateSchemaName` function exists in `lib/inngest.ts` but is not exported; either export it or define validation inline.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/server/src/routes/betterbase/index.ts` around lines 234 - 244, projectSlug is used unsafely in dynamic schema names (dbSchema and the INSERT into "project_${projectSlug}"._iac_storage) allowing SQL injection; fix by validating/escaping projectSlug before interpolation: either export and reuse the existing validateSchemaName from lib/inngest.ts (or call a new local validator) to assert projectSlug matches /^[a-z][a-z0-9_]*$/ and throw on failure, or use pg-format's identifier escaping (%I) when building schema-qualified identifiers; ensure all occurrences (the dbSchema const and the INSERT into _iac_storage) use the validated/escaped identifier rather than raw projectSlug.packages/server/src/routes/admin/auth.ts (1)
47-71: ⚠️ Potential issue | 🟡 Minor

Rate limit counter increments on successful login.

The counter is incremented before credential validation (lines 48-50) and never reset on success. A legitimate user who fails 4 times then succeeds will hit the limit on their next login attempt within the window.

Reset on successful authentication:

```diff
 const token = await signAdminToken(admin.id);
+loginAttempts.delete(c.req.header("X-Real-IP") || "unknown");
 return c.json({ token, admin: { id: admin.id, email: admin.email } });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/server/src/routes/admin/auth.ts` around lines 47 - 71, The rate limiter is being incremented before credential validation via loginRateLimiter(c) and is never cleared on success; update the handler so the limiter does not penalize successful logins by either (A) using a non-incrementing check variant (e.g., loginRateLimiter(c, { increment: false }) or a dedicated check function) before verifying credentials, or (B) if only an incrementing call exists, call the limiter's reset/clear method after successful authentication (e.g., loginRateLimiter.reset(c) or similar) immediately after verifyPassword(...) returns true and before returning the token from signAdminToken(...); reference loginRateLimiter, verifyPassword, signAdminToken and the admin auth handler when making the change.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/ci.yml:
- Around line 18-19: The bun-version value is duplicated across multiple jobs
(key: bun-version); centralize it by adding a single workflow-level env or input
(e.g., env: BUN_VERSION: "1.3.10" or workflow input bun_version) and update each
job to reference that variable (use env.BUN_VERSION or inputs.bun_version)
instead of repeating bun-version in each job; remove the per-job bun-version
entries so CI upgrades are made in one place.
In `@apps/dashboard/src/lib/api.ts`:
- Around line 98-105: checkSetup() is sending a POST with {_check: true} which
is blocked by the global csrfMiddleware and also doesn't match the auth
endpoint's validator (which expects { email, password }); fix by adding a
CSRF-exempt, read-only endpoint and updating the client call: create a new GET
route like GET /admin/auth/setup/check (or similar) in the same auth handler
file (auth.ts) that returns boolean/setup status without requiring body
validation, ensure app.use("*", csrfMiddleware) continues to exempt this new
path (or explicitly whitelist it in csrfMiddleware), then change the client
function checkSetup() to perform a GET to /admin/auth/setup/check and interpret
the response (instead of POST {_check: true}); alternatively, if you prefer to
keep the POST, update the auth endpoint validator in auth.ts to accept the
_check parameter and add that route to the csrfMiddleware whitelist.
In `@Dockerfile.project`:
- Line 18: Replace the mutable tag in the Dockerfile FROM statement by pinning
the base image to its SHA256 digest: locate the FROM line that references
oven/bun:1.3.10-debian and replace the tag with the resolved digest (e.g., FROM
oven/bun@sha256:<digest>); obtain the digest via docker inspect
oven/bun:1.3.10-debian or by querying the registry/manifest and paste that
SHA256 into the FROM line to ensure immutable, reproducible builds.
In `@packages/client/src/iac/provider.tsx`:
- Line 75: The useEffect hook that depends on config.url and config.projectSlug
should also include config.getToken in its dependency array so the connection
can re-establish when the auth/token changes; update the dependency array for
the effect that references config.url and config.projectSlug (in provider.tsx)
to include config.getToken and ensure any closures using getToken are stable or
re-created accordingly.
- Around line 33-38: The WebSocket connect() does not include the required
Bearer token; call config.getToken() inside connect(), await the token, and
append it as a URL query parameter (e.g., &token=<encoded token>) when building
wsUrl before creating the WebSocket and assigning wsRef.current, so the server
can authenticate the upgrade; ensure connect() still guards with isCleaned and
handles missing/empty tokens gracefully.
In `@packages/server/src/lib/csrf.ts`:
- Around line 23-30: The middleware currently generates a new CSRF token on
every safe request using crypto.randomUUID() and writes it as an HttpOnly cookie
via c.header("Set-Cookie", `${CSRF_COOKIE_NAME}=...; HttpOnly; ...`), but the
verifier expects the token in the x-csrf-token header (which JS cannot read from
HttpOnly cookies) causing desync; change the logic in the isSafeMethod branch so
you only generate and set a token when none exists or it is expired (avoid
rotating on every safe request) and write the cookie without HttpOnly (make it
accessible to client JS) so client code can read CSRF_COOKIE_NAME and include it
in the x-csrf-token header; update the Set-Cookie call (and any token storage
lookup) accordingly and keep using CSRF_COOKIE_NAME, the x-csrf-token header
name, and the existing crypto.randomUUID() token generation.
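The double-submit check described in this item can be reduced to a pure function. This is a minimal sketch under the review's assumptions (names are illustrative, not the actual middleware API); the point is that the cookie must be readable by client JS so its value can be echoed in the `x-csrf-token` header.

```typescript
// Double-submit cookie verification sketch. Safe methods skip the check;
// unsafe methods require the cookie token and the header token to match.
const SAFE_METHODS = new Set(["GET", "HEAD", "OPTIONS"]);

function verifyCsrf(
  method: string,
  cookieToken: string | undefined,
  headerToken: string | undefined,
): boolean {
  if (SAFE_METHODS.has(method.toUpperCase())) return true;
  // Both values must be present and equal for state-changing requests.
  return Boolean(cookieToken && headerToken && cookieToken === headerToken);
}
```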
In `@packages/server/src/routes/admin/auth.ts`:
- Around line 13-32: The in-memory rate limiter (loginAttempts used by
loginRateLimiter, with RATE_LIMIT_WINDOW_MS and RATE_LIMIT_MAX_ATTEMPTS) never
removes expired entries causing unbounded memory growth; fix it by adding
cleanup of expired entries either via a periodic task (setInterval) that
iterates loginAttempts and deletes entries where resetAt <= Date.now(), or by
performing a cleanup pass at the start of loginRateLimiter that removes stale
entries before checking/incrementing the current IP; ensure the cleanup logic
references loginAttempts and respects RATE_LIMIT_WINDOW_MS so only expired
entries are removed.
- Around line 18-31: The loginRateLimiter currently collapses missing X-Real-IP
to "unknown"; update loginRateLimiter to derive a unique client IP instead:
first read X-Real-IP, if absent parse X-Forwarded-For and take the first
non-empty entry, and if that is also missing fall back to the actual remote
address from the request (e.g., connection/socket remote address) or return a
400/deny response; then use that resolved IP when looking up/setting
loginAttempts so anonymous requests do not share a single bucket (references:
loginRateLimiter, loginAttempts, RATE_LIMIT_MAX_ATTEMPTS, RATE_LIMIT_WINDOW_MS,
headers "X-Real-IP" and "X-Forwarded-For").
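The resolution order described above can be sketched as follows (header names come from the review; the function name and the plain-object header shape are illustrative, not the Hono API):

```typescript
// Resolve a client IP: X-Real-IP first, then the first X-Forwarded-For entry,
// then null so the caller can deny instead of sharing one "unknown" bucket.
function resolveClientIp(headers: Record<string, string | undefined>): string | null {
  const realIp = headers["x-real-ip"]?.trim();
  if (realIp) return realIp;
  const forwarded = headers["x-forwarded-for"];
  if (forwarded) {
    const first = forwarded.split(",")[0]?.trim();
    if (first) return first;
  }
  return null; // no trustworthy IP available
}
```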
In `@packages/server/src/routes/betterbase/ws.ts`:
- Around line 144-166: The WebSocket auth currently only reads Authorization
from req.headers (req.headers.get("Authorization")) and rejects browser clients;
update the auth flow in ws.ts to accept a token fallback from the connection URL
(parse the request URL and use url.searchParams.get("token") if
extractBearerToken(req.headers.get("Authorization")) returns no token), then
pass that token into verifyAdminToken(token) (or try subprotocol / first-message
auth if preferred); ensure you still return the same 401 Response when no token
is found or verifyAdminToken fails, but prefer the query-param token before
failing.
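A minimal sketch of the header-then-query-param fallback (the function name is assumed for illustration; it is not the file's actual `extractBearerToken`):

```typescript
// Prefer the Authorization header; browsers cannot set WebSocket upgrade
// headers, so fall back to a ?token=... query parameter on the request URL.
function extractWsToken(authHeader: string | null, requestUrl: string): string | null {
  if (authHeader?.startsWith("Bearer ")) {
    const token = authHeader.slice("Bearer ".length).trim();
    if (token) return token;
  }
  const url = new URL(requestUrl);
  return url.searchParams.get("token");
}
```

Either way, the server should still return the same 401 response when the resolved token is missing or fails verification.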
In `@specs/CodeRabbit_Full_Codebase_review.md`:
- Around line 780-783: Remove the accidental binary/blob payload that was pasted
after the HTML marker "<!-- tips_end -->" and before the closing markdown fence;
locate the long opaque base64-like block following "<!-- tips_end -->" and
delete that entire blob so the file contains only the intended markdown content
and the closing "```" fence remains intact.
In `@templates/auth/src/routes/auth.ts`:
- Around line 134-135: The magic-link verification route (GET
/magic-link/verify) currently accepts any token matching the "dev-token-*"
pattern and issues sessions unconditionally; restrict this behavior by gating
the dev-token acceptance behind an explicit environment check (e.g.,
process.env.NODE_ENV === "development" or a dedicated flag like
process.env.ALLOW_DEV_TOKENS) inside the verify handler, so the dev-token branch
in the verify logic only runs when the environment flag is enabled and otherwise
falls through to the normal verification flow or rejects the token; update the
code paths that call session issuance (the function/method that creates sessions
in the /magic-link/verify handler) to only be invoked from the gated branch.
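The suggested gate can be reduced to a pure predicate. `ALLOW_DEV_TOKENS` is the flag name proposed by the review, not an existing variable:

```typescript
// Dev-token branch runs only when explicitly enabled by the environment.
function isDevTokenAllowed(env: { NODE_ENV?: string; ALLOW_DEV_TOKENS?: string }): boolean {
  return env.NODE_ENV === "development" || env.ALLOW_DEV_TOKENS === "true";
}
```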
---
Outside diff comments:
In `@packages/cli/src/index.ts`:
- Around line 32-53: The auth gate currently reads the command from
process.argv[2] which breaks when global flags precede the command; update
checkAuthHook to accept the preAction hook parameters (thisCommand,
actionCommand) and derive the command via actionCommand.name() (falling back to
an empty string or first arg if necessary), then use
PUBLIC_COMMANDS.includes(commandName) for the skip check; update the preAction
hook registration to pass those parameters into checkAuthHook so public commands
like login are correctly recognized even when global flags appear first.
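The corrected hook can be sketched as below. This is a reduced illustration: the `Command` type is narrowed to the one method the hook uses, and the `PUBLIC_COMMANDS` list is hypothetical, not the project's actual list.

```typescript
// Commander passes (thisCommand, actionCommand) to preAction hooks; deriving
// the name from actionCommand means global flags like --debug cannot shift it.
const PUBLIC_COMMANDS = ["login", "help", "version"];

function isPublicCommand(name: string): boolean {
  return PUBLIC_COMMANDS.includes(name);
}

async function checkAuthHook(
  _thisCommand: { name(): string },
  actionCommand: { name(): string },
): Promise<void> {
  const commandName = actionCommand.name();
  if (isPublicCommand(commandName)) return;
  // ...the real credential check for non-public commands would run here...
}
```

Registered via `program.hook("preAction", checkAuthHook)`, which supplies both arguments.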
In `@packages/server/src/routes/admin/auth.ts`:
- Around line 47-71: The rate limiter is being incremented before credential
validation via loginRateLimiter(c) and is never cleared on success; update the
handler so the limiter does not penalize successful logins by either (A) using a
non-incrementing check variant (e.g., loginRateLimiter(c, { increment: false })
or a dedicated check function) before verifying credentials, or (B) if only an
incrementing call exists, call the limiter's reset/clear method after successful
authentication (e.g., loginRateLimiter.reset(c) or similar) immediately after
verifyPassword(...) returns true and before returning the token from
signAdminToken(...); reference loginRateLimiter, verifyPassword, signAdminToken
and the admin auth handler when making the change.
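Option (B) can be sketched with a plain Map, mirroring the PR's 5 attempts / 15 min window (function and variable names here are hypothetical, not the handler's actual API):

```typescript
// In-memory limiter with reset-on-success. recordFailedAttempt returns
// whether the caller is still under the limit after this failure.
const WINDOW_MS = 15 * 60 * 1000;
const MAX_ATTEMPTS = 5;
const attempts = new Map<string, { count: number; resetAt: number }>();

function recordFailedAttempt(ip: string, now = Date.now()): boolean {
  const entry = attempts.get(ip);
  if (!entry || entry.resetAt <= now) {
    // New or expired window: start a fresh counter.
    attempts.set(ip, { count: 1, resetAt: now + WINDOW_MS });
    return true;
  }
  entry.count += 1;
  return entry.count <= MAX_ATTEMPTS;
}

function resetAttempts(ip: string): void {
  attempts.delete(ip); // call after a successful login
}
```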
In `@packages/server/src/routes/betterbase/index.ts`:
- Around line 94-105: buildStorageCtx currently falls back to hardcoded
"minioadmin" credentials which is inconsistent with the fail-fast behavior in
generate-upload-url; update buildStorageCtx (function buildStorageCtx and its
use of validateEnv) to require STORAGE_ACCESS_KEY and STORAGE_SECRET_KEY instead
of using defaults—either validate and throw/exit if those env vars are missing
(same pattern as generate-upload-url) or propagate the validated env values
directly (remove the "minioadmin" fallbacks for accessKey/secretKey and ensure
publicBase/endpoint/bucket use validateEnv outputs).
- Around line 234-244: projectSlug is used unsafely in dynamic schema names
(dbSchema and the INSERT into "project_${projectSlug}"._iac_storage) allowing
SQL injection; fix by validating/escaping projectSlug before interpolation:
either export and reuse the existing validateSchemaName from lib/inngest.ts (or
call a new local validator) to assert projectSlug matches /^[a-z][a-z0-9_]*$/
and throw on failure, or use pg-format's identifier escaping (%I) when building
schema-qualified identifiers; ensure all occurrences (the dbSchema const and the
INSERT into _iac_storage) use the validated/escaped identifier rather than raw
projectSlug.
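The inline-validation option can be sketched as follows; the regex is the guideline pattern quoted above, and the function name is illustrative:

```typescript
// Validate a project slug before it is ever embedded in a schema identifier.
const PROJECT_SLUG_RE = /^[a-z][a-z0-9_]*$/;

function assertValidProjectSlug(slug: string): string {
  if (!PROJECT_SLUG_RE.test(slug)) {
    throw new Error(`Invalid project slug: ${JSON.stringify(slug)}`);
  }
  return slug; // safe to use in identifiers like "project_<slug>"
}
```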
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 0623e5b8-34a5-4181-ba9a-4e5f5ec54445
⛔ Files ignored due to path filters (1)
`bun.lock` is excluded by `!**/*.lock`, `!**/bun.lock`, `!**/*.lock`

📒 Files selected for processing (38)

- .github/workflows/ci.yml
- Dockerfile.project
- apps/dashboard/src/components/auth/SetupGuard.tsx
- apps/dashboard/src/lib/api.ts
- apps/dashboard/src/vite-env.d.ts
- docker-compose.dev.yml
- docker-compose.self-hosted.yml
- docker-compose.yml
- packages/cli/src/index.ts
- packages/client/src/iac/provider.tsx
- packages/core/src/rls/evaluator.ts
- packages/server/package.json
- packages/server/src/index.ts
- packages/server/src/lib/auth.ts
- packages/server/src/lib/csrf.ts
- packages/server/src/routes/admin/auth.ts
- packages/server/src/routes/betterbase/index.ts
- packages/server/src/routes/betterbase/ws.ts
- specs/03_test_suite.md
- specs/BETTERBASE.md
- specs/BetterBase_Competitive_Plan.md
- specs/BetterBase_Dashboard_Backend_Spec.md
- specs/BetterBase_Dashboard_Frontend_Spec.md
- specs/BetterBase_IaC_Phase2_Spec.md
- specs/BetterBase_IaC_Phase3_Spec.md
- specs/BetterBase_InfraAsCode_Spec.md
- specs/BetterBase_Inngest_Dashboard_Spec.md
- specs/BetterBase_Inngest_Spec.md
- specs/BetterBase_Observability_Spec.docx.md
- specs/BetterBase_SelfHosted_Spec.md
- specs/CODEBASE_MAP.md
- specs/CONTRIBUTING.md
- specs/CodeRabbit_Full_Codebase_review.md
- specs/NOTICE.md
- specs/README.md
- specs/SELF_HOSTED.md
- specs/Security_Review.md
- templates/auth/src/routes/auth.ts
💤 Files with no reviewable changes (1)
- docker-compose.dev.yml
… fix CSRF/setup check, pin Dockerfile digest, add WebSocket token, fix rate limiter, guard dev tokens
Actionable comments posted: 5
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
packages/server/src/lib/inngest.ts (1)
136-143: 🧹 Nitpick | 🔵 Trivial

Direct `process.env` access in lib module.

Per coding guidelines, env access should go through `validateEnv()` in `src/lib/env.ts`. These Inngest client config options read `process.env` directly. Consider adding these env vars to the Zod schema in `env.ts` and using `validateEnv()` here.

```diff
+import { validateEnv } from "./env";
+
 export const inngest = new Inngest({
   id: "betterbase",
   schemas: new EventSchemas().fromRecord<Events>(),
-  baseUrl: process.env.INNGEST_BASE_URL,
-  signingKey: process.env.INNGEST_SIGNING_KEY,
-  eventKey: process.env.INNGEST_EVENT_KEY ?? "betterbase-dev-event-key",
+  baseUrl: validateEnv().INNGEST_BASE_URL,
+  signingKey: validateEnv().INNGEST_SIGNING_KEY,
+  eventKey: validateEnv().INNGEST_EVENT_KEY ?? "betterbase-dev-event-key",
 });
```

As per coding guidelines: "All env access goes through validateEnv() in src/lib/env.ts. Never access process.env directly in route handlers or lib modules outside of env.ts and db.ts."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/server/src/lib/inngest.ts` around lines 136-143: the Inngest client config is reading process.env directly; add INNGEST_BASE_URL, INNGEST_SIGNING_KEY, and INNGEST_EVENT_KEY to the Zod schema in src/lib/env.ts and ensure validateEnv() returns them, then update packages/server/src/lib/inngest.ts to import and call validateEnv() (or use the validated env object) instead of accessing process.env directly so baseUrl, signingKey, and eventKey are sourced from the validated environment values.

packages/server/src/routes/betterbase/index.ts (1)
185-207: ⚠️ Potential issue | 🟡 Minor

Missing Zod validation on request body.

Line 186 reads the body directly via `c.req.json()` without using `@hono/zod-validator`. This violates the HONO PATTERNS guideline. Also, the `contentType` regex rejects valid MIME types containing `+` (e.g., `application/ld+json`, `image/svg+xml`).

Proposed fix with zValidator and broader contentType regex

```diff
+import { zValidator } from "@hono/zod-validator";
+const uploadUrlSchema = z.object({
+  filename: z.string().regex(/^[a-zA-Z0-9_.-]+$/, "Invalid filename").optional(),
+  contentType: z.string().regex(/^[a-zA-Z0-9.+-]+\/[a-zA-Z0-9.+-]+$/, "Invalid content type").optional(),
+});
 // Direct browser upload endpoint: POST /betterbase/storage/generate-upload-url
-betterbaseRouter.post("/storage/generate-upload-url", async (c) => {
-  const { contentType, filename } = await c.req.json();
+betterbaseRouter.post(
+  "/storage/generate-upload-url",
+  zValidator("json", uploadUrlSchema),
+  async (c) => {
+    const { contentType, filename } = c.req.valid("json");
```

As per coding guidelines: "All request bodies validated with `@hono/zod-validator` before handler logic. Never access c.req.json() directly in a handler that should be validated."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/server/src/routes/betterbase/index.ts` around lines 185-207: the handler for betterbaseRouter.post("/storage/generate-upload-url") reads the body directly via c.req.json() and uses a restrictive MIME regex; replace that by applying `@hono/zod-validator` with a Zod schema that validates { contentType?: string, filename?: string } (validate filename with /^[a-zA-Z0-9_.-]+$/, ext with /^[a-zA-Z0-9]{1,10}$/, and contentType with a broader MIME regex that allows "+" like /^[a-zA-Z0-9.+-]+\/[a-zA-Z0-9.+-]+$/), then read the parsed body from the validator (not c.req.json()) and keep the existing filename/extension checks using the parsed values; ensure the route uses zValidator middleware (or zValidator in the route options) before executing the current logic in the POST handler.

templates/auth/src/routes/auth.ts (1)
68-80: ⚠️ Potential issue | 🟠 Major

Development login endpoints return fabricated tokens without creating sessions.

Lines 71, 134, 246, and 322 return synthetic UUIDs, but the file never creates a better-auth session or sets a session cookie. The middleware (`templates/auth/src/middleware/auth.ts`) validates exclusively via `auth.api.getSession({ headers })`, which ignores the returned token. Login succeeds in the response, but any subsequent protected request fails authorization. Route these endpoints through `auth.api.signIn()` or equivalent session creation instead of returning opaque UUIDs.

Also applies to: `/otp/verify`, `/mfa/challenge`, `/phone/verify`

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@templates/auth/src/routes/auth.ts` around lines 68 - 80, The dev-login branches that check isDev and return crypto.randomUUID() produce opaque tokens but never create a real session, so subsequent requests fail auth (middleware uses auth.api.getSession({ headers })). Replace the synthetic response in the dev branches of the login, otp/verify, mfa/challenge, and phone/verify handlers by calling the same session-creation routine used in real flows (e.g., auth.api.signIn() or the project’s signIn handler), persist the session and set the appropriate session cookie/headers on the response instead of returning a UUID, and ensure the returned token/user shape matches what auth.api.getSession expects so middleware will validate the dev session.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/ci.yml:
- Line 8: CI is using BUN_VERSION: "1.3.10" while Dockerfiles are pinned to
"1.3.9-debian" and "1.3.9-alpine", causing a version skew; update the CI
variable BUN_VERSION to match the Dockerfile pins (set BUN_VERSION: "1.3.9") or
instead update the Dockerfile pins to "1.3.10-..." so all places use the same
Bun version (refer to the BUN_VERSION entry and the Dockerfile image tags
"1.3.9-debian" / "1.3.9-alpine" to make the change).
In `@apps/dashboard/src/lib/api.ts`:
- Around line 98-103: The checkSetup function currently calls fetch directly and
lets network errors bubble up; wrap the fetch call in a try/catch so network
failures are handled and do not throw uncaught, and return a safe boolean (e.g.,
false) on error. Update the checkSetup implementation (referencing checkSetup
and API_BASE) to perform the fetch inside a try block, catch any exceptions, log
or handle the error as appropriate, and return false from the catch; if you have
an existing module-level request helper (use that instead of raw fetch), call
that helper to keep all server calls going through the central API abstraction.
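A sketch of the wrapped client call under the review's assumptions: the GET path `/admin/auth/setup/check` is the route proposed earlier in this review, not an existing endpoint, and the response shape is hypothetical.

```typescript
// checkSetup that treats any network or parse failure as "not set up"
// instead of letting the error bubble up to callers.
async function checkSetup(apiBase: string): Promise<boolean> {
  try {
    const res = await fetch(`${apiBase}/admin/auth/setup/check`);
    if (!res.ok) return false;
    const data = (await res.json()) as { needsSetup?: boolean };
    return Boolean(data.needsSetup);
  } catch {
    // Connection refused, DNS failure, invalid JSON, etc.
    return false;
  }
}
```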
- Line 1: API_BASE is assigned directly from import.meta.env.VITE_API_URL and
can be undefined, causing string interpolation like `${API_BASE}/...` to produce
"undefined/..."; update the assignment of API_BASE (or create a small
getApiBase() helper) to validate the env var and either supply a sensible
fallback for dev (e.g., "http://localhost:PORT") or throw a clear error at
startup if missing; ensure all usages that interpolate API_BASE (e.g.,
`${API_BASE}/...`) rely on this validated/fallback value so network requests
never use "undefined" as the origin.
In `@packages/server/src/lib/csrf.ts`:
- Around line 32-43: The code reads X-CSRF-Token-Expiry from the incoming
request (c.req.header("X-CSRF-Token-Expiry")) but that header is only set on the
response, so tokenExpiry will always be undefined and tokens are regenerated
every safe request; fix by removing the request header check and associated
expiry logic in the CSRF flow (leave getCookie(c.req.raw, CSRF_COOKIE_NAME)
check only), compute Max-Age from CSRF_TOKEN_EXPIRY_MS (divide by 1000) when
setting the cookie in c.header("Set-Cookie", ...) and stop setting or reading
the X-CSRF-Token-Expiry header; update references in this block around
getCookie, CSRF_COOKIE_NAME, CSRF_TOKEN_EXPIRY_MS, c.req.header and c.header
accordingly.
In `@packages/server/src/routes/admin/auth.ts`:
- Around line 18-30: The resolveClientIP function must not fall back to the Host
header because Host is a hostname, not a client IP; update
resolveClientIP(Context) to return null/undefined (or throw) when neither
X-Real-IP nor X-Forwarded-For are present instead of returning
c.req.header("Host"), and then update the admin auth route/middleware that calls
resolveClientIP to reject requests with a 4xx (e.g., 400 or 429) when no client
IP is available so sensitive endpoints are not grouped by hostname; reference
resolveClientIP and the admin auth middleware/handler that uses it and ensure
rate-limiter logic treats a missing IP as an error condition rather than using
Host.
---
Outside diff comments:
In `@packages/server/src/lib/inngest.ts`:
- Around line 136-143: The Inngest client config is reading process.env
directly; add INNGEST_BASE_URL, INNGEST_SIGNING_KEY, and INNGEST_EVENT_KEY to
the Zod schema in src/lib/env.ts and ensure validateEnv() returns them, then
update packages/server/src/lib/inngest.ts to import and call validateEnv() (or
use the validated env object) instead of accessing process.env directly so
baseUrl, signingKey, and eventKey are sourced from the validated environment
values.
In `@packages/server/src/routes/betterbase/index.ts`:
- Around line 185-207: The handler for
betterbaseRouter.post("/storage/generate-upload-url") reads the body directly
via c.req.json() and uses a restrictive MIME regex; replace that by applying
`@hono/zod-validator` with a Zod schema that validates { contentType?: string,
filename?: string } (validate filename with /^[a-zA-Z0-9_.-]+$/, ext with
/^[a-zA-Z0-9]{1,10}$/, and contentType with a broader MIME regex that allows "+"
like /^[a-zA-Z0-9.+-]+\/[a-zA-Z0-9.+-]+$/), then read the parsed body from the
validator (not c.req.json()) and keep the existing filename/extension checks
using the parsed values; ensure the route uses zValidator middleware (or
zValidator in the route options) before executing the current logic in the POST
handler.
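The two patterns suggested above can be checked in isolation; note that the broader MIME regex accepts suffixed types like `image/svg+xml` that the stricter one rejected:

```typescript
// Validation patterns from the review: filename allows word chars, dots,
// underscores, and hyphens; contentType allows "+" in both MIME parts.
const FILENAME_RE = /^[a-zA-Z0-9_.-]+$/;
const MIME_RE = /^[a-zA-Z0-9.+-]+\/[a-zA-Z0-9.+-]+$/;
```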
In `@templates/auth/src/routes/auth.ts`:
- Around line 68-80: The dev-login branches that check isDev and return
crypto.randomUUID() produce opaque tokens but never create a real session, so
subsequent requests fail auth (middleware uses auth.api.getSession({ headers
})). Replace the synthetic response in the dev branches of the login,
otp/verify, mfa/challenge, and phone/verify handlers by calling the same
session-creation routine used in real flows (e.g., auth.api.signIn() or the
project’s signIn handler), persist the session and set the appropriate session
cookie/headers on the response instead of returning a UUID, and ensure the
returned token/user shape matches what auth.api.getSession expects so middleware
will validate the dev session.
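The core of the fix described above — mint a real, persisted session rather than returning an opaque UUID — can be shown with a toy in-memory store. Everything here is hypothetical; the real handlers would go through `auth.api.signIn()` so that `auth.api.getSession()` later recognizes the session:

```typescript
import { randomUUID } from "node:crypto";

// Toy session store illustrating why a bare crypto.randomUUID() response
// fails: middleware looks the token up later, so the token must exist in
// the same store that the session middleware reads.
const sessions = new Map<string, { userId: string; expiresAt: number }>();

function createDevSession(userId: string): string {
  const token = randomUUID();
  sessions.set(token, { userId, expiresAt: Date.now() + 60 * 60 * 1000 });
  return token; // the real handler would also set this as a session cookie
}

function getSession(token: string): { userId: string } | null {
  const session = sessions.get(token);
  if (!session || session.expiresAt < Date.now()) return null;
  return { userId: session.userId };
}
```

A UUID returned without the `sessions.set(...)` step is exactly the broken dev-login behavior: `getSession` has nothing to find.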
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: dafc9d68-f0e4-4907-b44a-e114b5d74eea
📒 Files selected for processing (11)
- .github/workflows/ci.yml
- Dockerfile.project
- apps/dashboard/src/lib/api.ts
- packages/client/src/iac/provider.tsx
- packages/server/src/lib/csrf.ts
- packages/server/src/lib/inngest.ts
- packages/server/src/routes/admin/auth.ts
- packages/server/src/routes/betterbase/index.ts
- packages/server/src/routes/betterbase/ws.ts
- specs/CodeRabbit_Full_Codebase_review.md
- templates/auth/src/routes/auth.ts
```yaml
branches: [main, develop]

env:
  BUN_VERSION: "1.3.10"
```
🧩 Analysis chain
🏁 Script executed:

```bash
#!/bin/bash
# Verify Bun version parity across workflow and Dockerfiles (read-only).
set -euo pipefail

echo "== CI workflow Bun version references =="
rg -nP 'BUN_VERSION|bun-version' .github/workflows/ci.yml
echo
echo "== Dockerfile Bun base image references =="
fd -HI 'Dockerfile*' | xargs -I{} rg -nP '^\s*FROM\s+oven/bun(?::|@sha256:)' {}
echo
echo "Expected: same Bun minor/patch across CI and runtime Dockerfiles."
```

Repository: weroperking/Betterbase

Length of output: 636
Synchronize CI Bun version with Dockerfile versions.
CI runs 1.3.10 but all Dockerfiles pin 1.3.9-debian and 1.3.9-alpine (lines 11, 18, 23). This version skew means CI tests won't catch 1.3.9-specific failures that occur in production builds and containers.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.github/workflows/ci.yml at line 8, CI is using BUN_VERSION: "1.3.10" while
Dockerfiles are pinned to "1.3.9-debian" and "1.3.9-alpine", causing a version
skew; update the CI variable BUN_VERSION to match the Dockerfile pins (set
BUN_VERSION: "1.3.9") or instead update the Dockerfile pins to "1.3.10-..." so
all places use the same Bun version (refer to the BUN_VERSION entry and the
Dockerfile image tags "1.3.9-debian" / "1.3.9-alpine" to make the change).
```diff
@@ -1,4 +1,4 @@
-const API_BASE = import.meta.env.VITE_API_URL ?? "http://localhost:3001";
+const API_BASE = import.meta.env.VITE_API_URL;
```
API_BASE can be undefined if env var unset.
Removing the fallback means ${API_BASE}/... will produce "undefined/..." as a URL when VITE_API_URL is not set, causing network failures.
Proposed fix

```diff
 const API_BASE = import.meta.env.VITE_API_URL;
+if (!API_BASE) {
+  throw new Error("VITE_API_URL environment variable is required");
+}
```

Or provide a development fallback if intended:

```diff
-const API_BASE = import.meta.env.VITE_API_URL;
+const API_BASE = import.meta.env.VITE_API_URL ?? "http://localhost:3001";
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
const API_BASE = import.meta.env.VITE_API_URL;
if (!API_BASE) {
  throw new Error("VITE_API_URL environment variable is required");
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/dashboard/src/lib/api.ts` at line 1, API_BASE is assigned directly from
import.meta.env.VITE_API_URL and can be undefined, causing string interpolation
like `${API_BASE}/...` to produce "undefined/..."; update the assignment of
API_BASE (or create a small getApiBase() helper) to validate the env var and
either supply a sensible fallback for dev (e.g., "http://localhost:PORT") or
throw a clear error at startup if missing; ensure all usages that interpolate
API_BASE (e.g., `${API_BASE}/...`) rely on this validated/fallback value so
network requests never use "undefined" as the origin.
```typescript
export async function checkSetup(): Promise<boolean> {
  const res = await fetch(`${API_BASE}/admin/auth/setup/check`, {
    method: "GET",
  });
  return res.status !== 410;
}
```
checkSetup() uses raw fetch and swallows errors.
This bypasses the api helper pattern and doesn't handle network errors. If the fetch throws (e.g., network down), the error propagates uncaught.
Proposed fix using try/catch

```diff
 export async function checkSetup(): Promise<boolean> {
-  const res = await fetch(`${API_BASE}/admin/auth/setup/check`, {
-    method: "GET",
-  });
-  return res.status !== 410;
+  try {
+    const res = await fetch(`${API_BASE}/admin/auth/setup/check`, {
+      method: "GET",
+    });
+    return res.status !== 410;
+  } catch {
+    // Network error - assume setup not complete to allow retry
+    return false;
+  }
 }
```

As per coding guidelines: "All server calls go through src/lib/api.ts."
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
export async function checkSetup(): Promise<boolean> {
  try {
    const res = await fetch(`${API_BASE}/admin/auth/setup/check`, {
      method: "GET",
    });
    return res.status !== 410;
  } catch {
    // Network error - assume setup not complete to allow retry
    return false;
  }
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/dashboard/src/lib/api.ts` around lines 98 - 103, The checkSetup function
currently calls fetch directly and lets network errors bubble up; wrap the fetch
call in a try/catch so network failures are handled and do not throw uncaught,
and return a safe boolean (e.g., false) on error. Update the checkSetup
implementation (referencing checkSetup and API_BASE) to perform the fetch inside
a try block, catch any exceptions, log or handle the error as appropriate, and
return false from the catch; if you have an existing module-level request helper
(use that instead of raw fetch), call that helper to keep all server calls going
through the central API abstraction.
```typescript
let token = getCookie(c.req.raw, CSRF_COOKIE_NAME);
const tokenExpiry = c.req.header("X-CSRF-Token-Expiry");

if (!token || !tokenExpiry || Number(tokenExpiry) < Date.now()) {
  token = crypto.randomUUID();
  const expiry = Date.now() + CSRF_TOKEN_EXPIRY_MS;
  c.header(
    "Set-Cookie",
    `${CSRF_COOKIE_NAME}=${token}; SameSite=Strict; Path=/; Max-Age=86400`,
  );
  c.header("X-CSRF-Token-Expiry", expiry.toString());
}
```
Token expiry check reads non-existent request header.
Line 33 reads c.req.header("X-CSRF-Token-Expiry") from the request, but line 42 sets it as a response header. The client never sends this header back, so tokenExpiry is always undefined, causing token regeneration on every safe request.
Either:
- Store expiry in the cookie value itself (e.g., `token:expiry`)
- Remove expiry check and rely on cookie Max-Age
- Store server-side token state
Proposed fix: Rely on cookie Max-Age only

```diff
 let token = getCookie(c.req.raw, CSRF_COOKIE_NAME);
-const tokenExpiry = c.req.header("X-CSRF-Token-Expiry");
-
-if (!token || !tokenExpiry || Number(tokenExpiry) < Date.now()) {
+
+if (!token) {
   token = crypto.randomUUID();
-  const expiry = Date.now() + CSRF_TOKEN_EXPIRY_MS;
   c.header(
     "Set-Cookie",
     `${CSRF_COOKIE_NAME}=${token}; SameSite=Strict; Path=/; Max-Age=86400`,
   );
-  c.header("X-CSRF-Token-Expiry", expiry.toString());
 }
```

(Note: `token` must stay `let`, since it is reassigned inside the branch.)

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
let token = getCookie(c.req.raw, CSRF_COOKIE_NAME);

if (!token) {
  token = crypto.randomUUID();
  c.header(
    "Set-Cookie",
    `${CSRF_COOKIE_NAME}=${token}; SameSite=Strict; Path=/; Max-Age=86400`,
  );
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/server/src/lib/csrf.ts` around lines 32 - 43, The code reads
X-CSRF-Token-Expiry from the incoming request
(c.req.header("X-CSRF-Token-Expiry")) but that header is only set on the
response, so tokenExpiry will always be undefined and tokens are regenerated
every safe request; fix by removing the request header check and associated
expiry logic in the CSRF flow (leave getCookie(c.req.raw, CSRF_COOKIE_NAME)
check only), compute Max-Age from CSRF_TOKEN_EXPIRY_MS (divide by 1000) when
setting the cookie in c.header("Set-Cookie", ...) and stop setting or reading
the X-CSRF-Token-Expiry header; update references in this block around
getCookie, CSRF_COOKIE_NAME, CSRF_TOKEN_EXPIRY_MS, c.req.header and c.header
accordingly.
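The first alternative listed above — carrying the expiry inside the cookie value as `token:expiry` — can be sketched with two small helpers. The helper names and the injectable `now` parameter are illustrative, not the project's API:

```typescript
// Encode/decode a CSRF cookie value of the form "<token>:<expiry-ms>".
// `now` is injectable so the expiry logic is testable.
function encodeCsrfValue(
  token: string,
  ttlMs: number,
  now: number = Date.now(),
): string {
  return `${token}:${now + ttlMs}`;
}

function decodeCsrfValue(
  value: string,
  now: number = Date.now(),
): string | null {
  const sep = value.lastIndexOf(":");
  if (sep === -1) return null; // malformed: no expiry segment
  const token = value.slice(0, sep);
  const expiry = Number(value.slice(sep + 1));
  if (!token || Number.isNaN(expiry) || expiry < now) return null; // expired
  return token;
}
```

With this shape, the middleware only needs the cookie itself: decode, regenerate on `null`, and never touch a request header that the client is not obligated to send.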
```typescript
function resolveClientIP(c: Context): string {
  const realIP = c.req.header("X-Real-IP");
  if (realIP) return realIP;

  const forwardedFor = c.req.header("X-Forwarded-For");
  if (forwardedFor) {
    const parts = forwardedFor.split(",");
    const firstIP = parts[0]?.trim();
    if (firstIP) return firstIP;
  }

  return c.req.header("Host") ?? "unknown";
}
```
Host header is not a valid IP fallback.
Line 29 falls back to Host header, which contains the requested hostname (e.g., "example.com"), not the client IP. All clients hitting the same host will share a rate limit bucket.
Per the nginx config, /admin/ sets both X-Real-IP and X-Forwarded-For, so this fallback rarely triggers in production. However, direct access or misconfiguration would cause rate limit bypass.
Proposed fix

```diff
 function resolveClientIP(c: Context): string {
   const realIP = c.req.header("X-Real-IP");
   if (realIP) return realIP;

   const forwardedFor = c.req.header("X-Forwarded-For");
   if (forwardedFor) {
     const parts = forwardedFor.split(",");
     const firstIP = parts[0]?.trim();
     if (firstIP) return firstIP;
   }

-  return c.req.header("Host") ?? "unknown";
+  // No IP available - use a unique fallback to avoid collapsing all unknown clients
+  return `unknown-${Date.now()}-${Math.random().toString(36).slice(2)}`;
 }
```

Alternatively, reject requests without identifiable IP for sensitive endpoints.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/server/src/routes/admin/auth.ts` around lines 18 - 30, The
resolveClientIP function must not fall back to the Host header because Host is a
hostname, not a client IP; update resolveClientIP(Context) to return
null/undefined (or throw) when neither X-Real-IP nor X-Forwarded-For are present
instead of returning c.req.header("Host"), and then update the admin auth
route/middleware that calls resolveClientIP to reject requests with a 4xx (e.g.,
400 or 429) when no client IP is available so sensitive endpoints are not
grouped by hostname; reference resolveClientIP and the admin auth
middleware/handler that uses it and ensure rate-limiter logic treats a missing
IP as an error condition rather than using Host.
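The strict variant the prompt asks for — return `null` when no trusted header is present so the caller can reject — might look like the sketch below. The header-lookup-function signature is a simplification of Hono's `Context` for illustration:

```typescript
// Simplified: takes a header lookup function instead of a full Hono Context.
function resolveClientIPStrict(
  header: (name: string) => string | undefined,
): string | null {
  const realIP = header("X-Real-IP");
  if (realIP) return realIP;

  const forwardedFor = header("X-Forwarded-For");
  if (forwardedFor) {
    const firstIP = forwardedFor.split(",")[0]?.trim();
    if (firstIP) return firstIP;
  }

  // Caller should respond 400/429 here rather than bucketing by Host.
  return null;
}
```

The rate-limit middleware then treats `null` as an error condition instead of sharing one bucket across every client behind the same hostname.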
- Use actionCommand.name() to derive command when available
- Fall back to this.args then process.argv for public command detection
- This handles cases where global flags precede the command
Summary

Critical Security Fixes

1. Auth Bypass Fixed (`templates/auth/src/routes/auth.ts`): Removed the `|| code.length === 6` bypass that accepted ANY 6-digit code in production on 5 endpoints (OTP, MFA, phone verify).
2. Rate Limiting (`packages/server/src/routes/admin/auth.ts`): Added in-memory rate limiter: 5 attempts per 15 minutes per IP on `/login`.
3. WebSocket Authentication (`packages/server/src/routes/betterbase/ws.ts`): Requires Bearer token in Authorization header before WebSocket upgrade.
4. Hardcoded Credentials Removed (`packages/server/src/routes/betterbase/index.ts`): Fails fast with 500 if `STORAGE_ACCESS_KEY` or `STORAGE_SECRET_KEY` is not set.
5. Input Validation (`packages/server/src/routes/betterbase/index.ts`): Added validation for filename/extension in the storage endpoint.

Medium Priority Fixes

6. CSRF Middleware (`packages/server/src/lib/csrf.ts`): Double-submit cookie pattern. Skips safe methods, /health, /api/inngest, WebSocket.
7. JWT Expiry Reduced (`packages/server/src/lib/auth.ts`): Changed from 30 days to 7 days.
8. RLS Role-Based Policies (`packages/core/src/rls/evaluator.ts`): Added a `userRole` parameter and implemented `auth.role() = 'value'` evaluation.

Verification

All changes verified with typecheck (server + core); tests pass.

Files Changed

- `templates/auth/src/routes/auth.ts` - Auth bypass fix
- `packages/server/src/routes/admin/auth.ts` - Rate limiting
- `packages/server/src/routes/betterbase/ws.ts` - WebSocket auth
- `packages/server/src/routes/betterbase/index.ts` - Input validation, credential fix
- `packages/server/src/index.ts` - CSRF middleware integration
- `packages/server/src/lib/auth.ts` - JWT expiry reduction
- `packages/server/src/lib/csrf.ts` - New CSRF middleware
- `packages/core/src/rls/evaluator.ts` - Role-based RLS
- `packages/server/package.json` - Added hono-rate-limit dependency
- `specs/Security_Review.md` - Comprehensive security review document
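The in-memory limiter from item 2 above (5 attempts per 15 minutes per IP) can be sketched as a fixed-window counter. This is an illustration of the policy, not the project's actual implementation; the factory name and injectable clock are assumptions:

```typescript
// Fixed-window in-memory rate limiter: allow `limit` hits per `windowMs` per key.
// The clock is injectable so the window behavior is testable.
function createRateLimiter(limit: number, windowMs: number) {
  const buckets = new Map<string, { count: number; windowStart: number }>();
  return function allow(key: string, now: number = Date.now()): boolean {
    const bucket = buckets.get(key);
    if (!bucket || now - bucket.windowStart >= windowMs) {
      buckets.set(key, { count: 1, windowStart: now }); // start a fresh window
      return true;
    }
    bucket.count += 1;
    return bucket.count <= limit;
  };
}

// Policy from the PR: 5 attempts per 15 minutes per IP.
const allowLogin = createRateLimiter(5, 15 * 60 * 1000);
```

One caveat of any in-memory approach: state resets on process restart and is not shared across replicas, which is usually acceptable for a single-instance admin login.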
New Features
Security Improvements
Documentation
Chores