Synapse is an AI-native product definition environment. It turns a raw product idea into a structured PRD, then carries that PRD forward into mockups, downstream engineering artifacts, and annotated visual feedback — all from a single client-side workspace.
A single prompt flows through a four-stage pipeline:
prompt → PRD canvas → Mockups → Artifacts → History
Each stage is backed by Google Gemini, structured JSON schemas where they matter, and a versioned store so nothing you generate is lost.
Start from a rough idea and get back a structured PRD with vision, target users, features (with priority, acceptance criteria, and dependencies), architecture, risks, and non-functional requirements.
- Spine versioning — every regeneration creates a new version; history is preserved.
- Branch-based refinement — highlight any text to spawn a threaded branch and iterate on just that passage.
- Consolidation engine — merge branch decisions back into a new unified PRD iteration.
Generate text-based UI mockups directly from the finalized PRD. Configurable platform (mobile / desktop), fidelity (wireframe / mid-fi / high-fi), and scope (single screen / multi-screen / key workflow).
Every mockup run is saved as a new version so you can diff iterations side-by-side.
Extract structured feedback items directly from generated mockups. Feedback surfaces as actionable cards on the PRD stage — applying one spawns a localized branch to address the critique without regenerating the whole document.
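The feedback-to-branch mechanic can be sketched in a few lines. All type and function names below (`FeedbackItem`, `Branch`, `applyFeedback`) are illustrative, not the project's actual API:

```typescript
// Illustrative shapes for mockup feedback -- not the project's real types.
interface FeedbackItem {
  id: string;
  target: string;   // the PRD passage the critique applies to
  critique: string;
}

interface Branch {
  sourceFeedbackId: string;
  anchorText: string; // the highlighted passage the branch is scoped to
  prompt: string;
}

// Applying a feedback card spawns a localized branch instead of
// regenerating the whole PRD: only the anchored passage is revised.
function applyFeedback(item: FeedbackItem): Branch {
  return {
    sourceFeedbackId: item.id,
    anchorText: item.target,
    prompt: `Revise this passage to address: ${item.critique}`,
  };
}
```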
Seven developer-ready artifact types are generated in parallel from a single finalized PRD:
- Screen Inventory and User Flows
- Component Inventory and Design System
- Data Model schemas with entities, fields, and relationships
- Implementation Plan and Prompt Pack
Three of them (`screen_inventory`, `data_model`, `component_inventory`) use Gemini JSON mode with explicit schemas and render as card grids, entity tables, and categorized component cards rather than raw markdown.
Every artifact tracks staleness against the current spine, supports natural-language refinement ("add error states to each screen"), and surfaces quality warnings if the output looks truncated or malformed.
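The staleness and quality checks described above are straightforward to sketch. A minimal TypeScript illustration, assuming a numeric spine version on each artifact (all type and function names here are hypothetical, not the project's actual code):

```typescript
// Illustrative shapes -- the real artifact types live in the project's store.
interface DataModelEntity {
  name: string;
  fields: { name: string; type: string }[];
}

interface DataModelArtifact {
  spineVersion: number; // spine version the artifact was derived from
  entities: DataModelEntity[];
}

// Staleness: an artifact is stale once the spine has moved past the
// version it was generated against.
function isStale(artifact: DataModelArtifact, currentSpineVersion: number): boolean {
  return artifact.spineVersion < currentSpineVersion;
}

// A crude quality check: JSON-mode output that fails to parse, or parses
// to an empty entity list, gets surfaced as a warning rather than rendered.
function qualityWarning(rawJson: string): string | null {
  try {
    const parsed = JSON.parse(rawJson) as Partial<DataModelArtifact>;
    if (!Array.isArray(parsed.entities) || parsed.entities.length === 0) {
      return "No entities in data model -- output may be truncated";
    }
    return null;
  } catch {
    return "Artifact output is not valid JSON";
  }
}
```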
Five annotation types — screenshot annotations, critique boards,
wireframe callouts, flow annotations, and design feedback boards — all
generated from PRD context as MarkupImageSpec JSON and rendered as
resolution-independent SVG with highlights, arrows, numbered markers, and
text blocks. Exportable as SVG.
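A resolution-independent renderer of this kind reduces to string assembly over the spec. The sketch below covers only numbered markers, and the `MarkupImageSpec` fields shown are illustrative, not the project's actual schema:

```typescript
// Hypothetical subset of a MarkupImageSpec -- field names are illustrative.
interface MarkerSpec {
  kind: "marker";
  x: number;
  y: number;
  label: number; // numbered marker
}

interface MarkupImageSpec {
  width: number;
  height: number;
  items: MarkerSpec[];
}

// Render to SVG: the viewBox carries the logical size, so the output
// scales to any resolution without rasterization.
function renderMarkup(spec: MarkupImageSpec): string {
  const items = spec.items
    .map(
      (m) =>
        `<circle cx="${m.x}" cy="${m.y}" r="12" fill="#ef4444"/>` +
        `<text x="${m.x}" y="${m.y + 4}" text-anchor="middle" fill="#fff" font-size="12">${m.label}</text>`
    )
    .join("");
  return `<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 ${spec.width} ${spec.height}">${items}</svg>`;
}
```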
Save the entire current project — spine versions, branches, artifacts, feedback, history, and the AI-generated mockup images — to Vercel Blob behind a single owner token. Reload from any browser or device, or delete a snapshot when you're done with it.
- Demo viewers never see this panel: it gates on owner-token presence.
- Images bundle along with the project, so a restored snapshot looks identical to the moment it was saved (no re-generation, no missing PNGs).
- Token is stored in your browser's `localStorage`; the server side validates with a constant-time comparison against `SYNAPSE_OWNER_TOKEN`.
Open via the workspace overflow menu → Cloud Snapshots.
Chronological audit log of every spine regeneration, branch consolidation, artifact derivation, and feedback event — with diffs where it matters.
```mermaid
graph TD
A[Initial Prompt] -->|Gemini JSON mode| B(Structured PRD Spine)
B --> C{PRD Canvas}
C -->|Highlight & ask| D[Threaded Branch]
D -->|Consolidation| B
B -->|Mark final| E{Pipeline Stages}
E -->|Mockups| F[UI Mockup Versions]
F -->|Extract feedback| D
E -->|Artifacts| H[7 Core Artifacts]
E -->|Markup| I[Annotated SVGs]
H --> J(Screen Inventory)
H --> K(Data Model)
H --> L(Component Library)
H --> M(Implementation Plan)
```
- Frontend: React 19, TypeScript, Vite 7, Tailwind CSS 3
- Backend: Vercel serverless API routes + MongoDB (for recruiter auth analytics) + Vercel Blob (for owner-only project snapshots)
- State: Zustand 5 with debounced `localStorage` persistence; mockup PNGs in IndexedDB
- LLM: Google Gemini 2.5 Pro / Flash (direct browser calls, streaming support); OpenAI gpt-image-2 for mockup image previews
- Markdown: `react-markdown` + `remark-gfm` + `rehype-raw`
- Routing: React Router v7
- Icons & animation: `lucide-react`, `@formkit/auto-animate`
The product workspace remains browser-first, while recruiter authentication and tracking run through API routes backed by MongoDB.
You'll need a Gemini API key. Get one at Google AI Studio.
```bash
npm install
npm run dev
```

Open http://localhost:5173, click the Settings gear, and paste your key. Workspace state (projects, spines, artifacts) persists to `localStorage`; AI-generated mockup PNGs persist to IndexedDB, which typically has gigabytes of headroom, so high-quality images don't blow the `localStorage` 5-10 MB cap.
To enable the Cloud Snapshots panel for archiving / restoring whole projects (state + images) across devices, set two Vercel environment variables:
| Variable | Where it comes from |
|---|---|
| `SYNAPSE_OWNER_TOKEN` | Any random string ≥ 24 chars. The server compares with `crypto.timingSafeEqual`; the client stores it in `localStorage`. |
| `BLOB_READ_WRITE_TOKEN` | Created automatically when you provision Vercel Blob for the project. No manual setup needed. |
The owner-token gate is single-tenant: there is no signup, no per-user isolation, no demo access. It exists so the project owner can persist real work without exposing an unauthenticated write endpoint to the public demo. Snapshot bundles are subject to Vercel's serverless body limit (~4.5 MB on Hobby), so very large projects with many high-quality images may need to be split or saved at lower image quality.
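A constant-time token check of the kind described above can be sketched with Node's `crypto.timingSafeEqual`. Hashing both sides first gives the buffers equal length, so the comparison never throws on mismatched lengths and leaks nothing about how many leading characters matched. The `isOwner` function name is illustrative, not the project's actual route code:

```typescript
import { createHash, timingSafeEqual } from "node:crypto";

// Constant-time owner-token check, as a snapshot API route might do it.
function isOwner(presented: string, expected: string): boolean {
  const a = createHash("sha256").update(presented).digest();
  const b = createHash("sha256").update(expected).digest();
  return timingSafeEqual(a, b); // equal-length buffers, compared in constant time
}
```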
```bash
npm run build
```

- `docs/architecture.md` — runtime stack, state layer, LLM services, UI composition
- `docs/artifact-flow.md` — file-by-file trace of one end-to-end pipeline run
- `docs/deployment.md` — commands, Vercel setup, self-hosting
- `docs/auth.md` — multi-provider auth (email/password, Google, GitHub, LinkedIn), user record schema, env vars, error codes
- `docs/linkedin-auth.md` — LinkedIn OAuth setup, recruiter capture fields, and compliance note
- `docs/archive/` — historical design notes and audits retained for context
Portfolio project. Single-user by design. Demo visitors run the workspace fully in-browser — spine + artifact state in `localStorage`, mockup PNGs in IndexedDB, no telemetry, no cross-device sync. The owner can opt in to Vercel-Blob-backed Cloud Snapshots (gated by `SYNAPSE_OWNER_TOKEN`) to persist real work across browsers and devices.





