refactor: unify auth and sandbox next proxies (#407)
Conversation
Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request significantly refactors the application's API proxy infrastructure by centralizing common proxy logic into a single, shared utility. The primary goal is to eliminate redundant code, standardize how API requests are forwarded to the backend, and introduce essential features like request timeouts and custom error handling. This consolidation improves consistency, simplifies future development of new proxy routes, and enhances the overall robustness of the application's interaction with its backend services.

Highlights
Changelog
Code Review
This pull request introduces a significant and valuable refactoring by unifying the backend API proxy logic into a shared module. The new `proxyJson` and `proxyText` helpers in `backend-proxy.ts` are well designed, adding important features like request timeouts, custom error handlers, and proper handling of 204 responses. Migrating the auth and sandbox API routes to these helpers has greatly simplified their code and improved consistency.
I have a couple of suggestions to further improve the new proxy module, primarily around security and flexibility. One is a high-severity issue regarding potential information disclosure in the default error handling. The other is a medium-severity suggestion to make the caching behavior more flexible. Overall, this is a solid improvement to the codebase.
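To make the timeout-vs-unreachable distinction discussed here concrete, the 502 fallback can be sketched as a small standalone function. The real helpers live in `backend-proxy.ts` and their exact code is not reproduced here; the function name and the `DOMException` check below are illustrative assumptions. The internal error message is deliberately not echoed back to the client, in line with the information-disclosure concern raised below.

```typescript
// Illustrative sketch only: maps a failed upstream fetch to a 502
// fallback response. AbortSignal.timeout() rejects fetch with a
// DOMException named "TimeoutError", which is how a timeout can be
// told apart from an unreachable host.
function upstreamFailureResponse(error: unknown, upstreamUrl: string): Response {
  const isTimeout =
    error instanceof DOMException && error.name === "TimeoutError"
  return Response.json(
    {
      detail: isTimeout
        ? `Upstream API timed out: ${upstreamUrl}`
        : `Upstream API unreachable: ${upstreamUrl}`,
    },
    { status: 502 },
  )
}
```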
```ts
const detail = error instanceof Error ? error.message : String(error)

return Response.json(
  {
    detail: isTimeout
      ? `Upstream API timed out: ${upstreamUrl}`
      : `Upstream API unreachable: ${upstreamUrl}`,
    error: detail,
  },
  { status: 502 },
)
```
The default error handler exposes internal error details to the client via the `error` field in the JSON response. This is a potential information-disclosure vulnerability, as it could leak sensitive information about the infrastructure. It's safer to return a generic error message to the client and log the specific error on the server for debugging purposes.
```diff
-const detail = error instanceof Error ? error.message : String(error)
-return Response.json(
-  {
-    detail: isTimeout
-      ? `Upstream API timed out: ${upstreamUrl}`
-      : `Upstream API unreachable: ${upstreamUrl}`,
-    error: detail,
-  },
-  { status: 502 },
-)
+// The original error should be logged on the server for debugging.
+// console.error(`[backend-proxy] Upstream fetch to ${upstreamUrl} failed:`, error);
+return Response.json(
+  {
+    detail: isTimeout
+      ? `Upstream API timed out: ${upstreamUrl}`
+      : `Upstream API unreachable: ${upstreamUrl}`,
+  },
+  { status: 502 },
+)
```
```ts
  options: Pick<TextProxyOptions, "responseContentType" | "responseHeaders">,
): Response {
  const headers = new Headers(options.responseHeaders)
  headers.set("Cache-Control", "no-cache")
```
Hardcoding `Cache-Control: no-cache` might be too restrictive. It prevents caching of proxied responses even when the upstream API sends its own caching headers. Consider respecting the upstream's `Cache-Control` header if it's present, and only falling back to `no-cache` as a default. This would make the proxy helper more flexible and allow leveraging backend caching strategies.
```diff
-headers.set("Cache-Control", "no-cache")
+if (upstream.headers.has("Cache-Control")) {
+  headers.set("Cache-Control", upstream.headers.get("Cache-Control")!)
+} else {
+  headers.set("Cache-Control", "no-cache")
+}
```
Pull request overview
Refactors the Next.js API proxy layer to centralize backend URL resolution and request/response handling, then migrates auth and sandbox proxy routes onto the shared helper.
Changes:
- Expand `backend-proxy.ts` into a shared JSON/text proxy helper with timeout + error handling and consistent base URL resolution.
- Migrate multiple auth proxy routes (`forgot-password`, `register`, `reset-password`, `me`, `me/change-password`, `login-check`) to use the shared proxy helper.
- Migrate `sandbox/status` to the shared helper and add focused tests for auth passthrough, 204 responses, and custom 502 fallbacks.
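As a rough illustration of what the migration buys, a hypothetical minimal version of the shared helper and a migrated route could look like the following. The real helper in `web/src/app/api/_utils/backend-proxy.ts` is richer (headers/body resolution, custom error handlers, configurable content types); the `proxyJson` signature and URL below are assumptions for illustration only, not the PR's actual code.

```typescript
// Assumed-shape sketch of a shared JSON proxy helper: one place for the
// timeout + 502-fallback pattern that each route no longer repeats.
async function proxyJson(
  request: Request,
  upstreamUrl: string,
  { timeoutMs = 10_000 }: { timeoutMs?: number } = {},
): Promise<Response> {
  try {
    const upstream = await fetch(upstreamUrl, {
      method: request.method,
      headers: { "Content-Type": "application/json" },
      body: request.method === "GET" ? undefined : await request.text(),
      signal: AbortSignal.timeout(timeoutMs),
    })
    const text = await upstream.text()
    // Empty bodies (e.g. 204) must be forwarded as null, not "".
    return new Response(text.length > 0 ? text : null, {
      status: upstream.status,
      headers: { "Cache-Control": "no-cache" },
    })
  } catch {
    return Response.json(
      { detail: `Upstream API unreachable: ${upstreamUrl}` },
      { status: 502 },
    )
  }
}

// A migrated route file then collapses to a thin delegation
// (the backend URL here is a placeholder):
export function POST(request: Request): Promise<Response> {
  return proxyJson(request, "http://backend.internal/auth/forgot-password")
}
```

The payoff is that per-route files carry only their path and method, while timeout and error semantics stay consistent across `auth/*` and `sandbox/*`.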
Reviewed changes
Copilot reviewed 9 out of 9 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| web/src/app/api/_utils/backend-proxy.ts | Introduces the unified proxy helper layer (base URL, headers/body resolution, timeout + error handling, response building). |
| web/src/app/api/_utils/backend-proxy.test.ts | Adds/updates unit tests covering the new proxy behavior and API. |
| web/src/app/api/sandbox/status/route.ts | Switches sandbox status route to the unified proxy helper. |
| web/src/app/api/auth/forgot-password/route.ts | Switches forgot-password proxy to the unified proxy helper. |
| web/src/app/api/auth/register/route.ts | Switches register proxy to the unified proxy helper. |
| web/src/app/api/auth/reset-password/route.ts | Switches reset-password proxy to the unified proxy helper. |
| web/src/app/api/auth/me/route.ts | Switches auth/me GET/PATCH/DELETE to the unified proxy helper with backend auth. |
| web/src/app/api/auth/me/change-password/route.ts | Switches change-password proxy to the unified proxy helper with backend auth. |
| web/src/app/api/auth/login-check/route.ts | Switches login-check proxy to the unified proxy helper with a custom 502 fallback. |
```ts
const contentType =
  upstream.headers.get("content-type") || options.responseContentType

if (contentType && upstream.status !== 204 && text.length > 0) {
  headers.set("Content-Type", contentType)
}

if (upstream.status === 204 || text.length === 0) {
  return new Response(null, {
    status: upstream.status,
    headers,
  })
}
```



What changed
- Expanded `web/src/app/api/_utils/backend-proxy.ts` into a shared JSON/text proxy layer with timeout handling, upstream error fallbacks, and unified backend base URL resolution
- Migrated all auth proxy routes (`forgot-password`, `register`, `reset-password`, `me`, `me/change-password`, `login-check`) to the shared helper
- Migrated `sandbox/status` to the same helper and added focused helper tests for auth passthrough, empty 204 responses, and custom 502 fallbacks

Why
Validation
- `npx vitest run src/app/api/_utils/backend-proxy.test.ts src/app/api/_utils/auth-headers.test.ts src/app/api/papers/export/route.test.ts`
- `npx eslint src/app/api/_utils/backend-proxy.ts src/app/api/_utils/backend-proxy.test.ts src/app/api/auth/forgot-password/route.ts src/app/api/auth/register/route.ts src/app/api/auth/reset-password/route.ts src/app/api/auth/me/change-password/route.ts src/app/api/auth/me/route.ts src/app/api/auth/login-check/route.ts src/app/api/sandbox/status/route.ts`
- `npx tsc --noEmit`
- `npm run build`

Follow-up