PDF Q&A Platform

A production-ready web application for PDF ingestion and retrieval-augmented generation (RAG): documents are indexed into vector embeddings, and a Next.js App Router frontend orchestrates OpenAI models to serve low-latency, context-aware question answering.
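The repository does not document its internal module layout, so the TypeScript sketch below only illustrates the query path described above: embed the incoming question, rank precomputed chunk embeddings by cosine similarity, and pass the top matches to an OpenAI chat completion. The function name, the in-memory Chunk type, and the model names are assumptions, not the app's actual code.

```ts
// Minimal sketch of the query path (names and models are illustrative).
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

type Chunk = { text: string; embedding: number[] };

// Cosine similarity between the question embedding and a stored chunk embedding.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

export async function answerQuestion(question: string, chunks: Chunk[]): Promise<string> {
  // 1. Embed the question.
  const { data } = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: question,
  });
  const queryEmbedding = data[0].embedding;

  // 2. Retrieve the most relevant chunks by cosine similarity.
  const context = chunks
    .map((c) => ({ ...c, score: cosine(queryEmbedding, c.embedding) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, 4)
    .map((c) => c.text)
    .join("\n---\n");

  // 3. Ask the chat model to answer using only the retrieved context.
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      { role: "system", content: "Answer using only the provided PDF context." },
      { role: "user", content: `Context:\n${context}\n\nQuestion: ${question}` },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```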

Follow the instructions below to provision dependencies, bootstrap the environment, and run the service locally.

Prerequisites

  • Node.js (v18 or later recommended)
  • npm (comes with Node.js) or yarn

Download

Clone the repository from GitHub:

git clone https://github.com/CSroseX/pdf-qa-app.git

Then navigate to the project directory:

cd pdf-qa-app

Installation

Install runtime and build-time dependencies:

npm install

Running the Project

Development

Launch the hot-reload development server:

npm run dev

The app will be available at http://localhost:3000.

Production

Build the project for production with optimized static assets and server bundles:

npm run build

Then start the production server:

npm start

Environment Variables

Create a .env file in the project root with the following variables (replace the placeholder values with your own keys):

OPENAI_API_KEY=your_openai_api_key
API_SECRET_KEY=your_api_secret_key
NEXT_PUBLIC_API_SECRET_KEY=your_public_api_secret_key
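
How these variables are consumed is not documented beyond their names, but a common pattern (and only an assumption here) is that OPENAI_API_KEY stays server-side for the OpenAI client, while NEXT_PUBLIC_API_SECRET_KEY is sent by the browser and checked against API_SECRET_KEY in an API route. The hypothetical route handler below sketches that convention; the path app/api/ask/route.ts and the header name are illustrative.

```ts
// app/api/ask/route.ts (assumed path, not necessarily the repo's actual route)
import { NextRequest, NextResponse } from "next/server";
import OpenAI from "openai";

// OPENAI_API_KEY is read server-side only and is never exposed to the browser.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(req: NextRequest) {
  // NEXT_PUBLIC_API_SECRET_KEY is what the client would send; API_SECRET_KEY is
  // the server-side value it is compared against (assumed convention).
  if (req.headers.get("x-api-key") !== process.env.API_SECRET_KEY) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const { question } = await req.json();
  // ...embed the question, retrieve matching chunks, and generate the answer here.
  return NextResponse.json({ received: question });
}
```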

Additional Information

  • Built with Next.js App Router to leverage server components and edge-ready deployments.
  • PDF ingestion leverages pdf-parse to extract text for downstream embedding (see the ingestion sketch after this list).
  • Retrieval latency is minimized via precomputed embeddings and streamlined OpenAI completion calls.
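
As a rough sketch of that ingestion path, the following TypeScript extracts text with pdf-parse, splits it into chunks, and embeds the chunks in one batched call. The chunk size, model, and function names are illustrative assumptions, not the repository's actual implementation.

```ts
// Rough sketch of the ingestion path (chunk size and model are illustrative choices).
import pdf from "pdf-parse";
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Split extracted text into fixed-size chunks so each fits an embedding request.
function chunkText(text: string, size = 1000): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += size) {
    chunks.push(text.slice(i, i + size));
  }
  return chunks;
}

export async function ingestPdf(fileBuffer: Buffer) {
  // 1. Extract raw text from the uploaded PDF.
  const { text } = await pdf(fileBuffer);

  // 2. Chunk the text and embed every chunk in a single batched request.
  const chunks = chunkText(text);
  const { data } = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: chunks,
  });

  // 3. Pair each chunk with its embedding for later similarity search.
  return chunks.map((c, i) => ({ text: c, embedding: data[i].embedding }));
}
```

Batching all chunks into one embeddings request keeps ingestion cheap and precomputes everything the query path needs, which is consistent with the precomputed-embeddings approach noted above.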

Feel free to open an issue in the repository for any questions or problems.

Getting Started

First, run the development server:

npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev

Open http://localhost:3000 with your browser to see the result.

You can start editing the page by modifying app/page.tsx. The page auto-updates as you edit the file.

This project uses next/font to automatically optimize and load Geist, a new font family for Vercel.

Learn More

To learn more about Next.js, take a look at the Next.js documentation and the interactive Learn Next.js tutorial.

You can also check out the Next.js GitHub repository - your feedback and contributions are welcome!

Deploy on Vercel

The easiest way to deploy your Next.js app is to use the Vercel Platform from the creators of Next.js.

Check out our Next.js deployment documentation for more details.
