Merged (29 commits)
- 087f8b2 Provider abstraction and Tavily extraction (t1ffnyw, Feb 28, 2026)
- 88fbb35 Serper adapter (t1ffnyw, Feb 28, 2026)
- 083e39b Strategy logic and executeSearch refactor (t1ffnyw, Mar 1, 2026)
- eba1215 updated trendSearch tests and added new tests for serper (t1ffnyw, Mar 1, 2026)
- 1003b85 fix errors in lint (t1ffnyw, Mar 3, 2026)
- 4f030df update metadata UI (ezhu15, Mar 7, 2026)
- 5a17c7a optimized metadata extraction + UI changes (ezhu15, Mar 7, 2026)
- ea6d591 feat(marketing): add DNA extraction, competitor analysis, and strateg… (rafchen, Mar 6, 2026)
- 1472d65 fix: marketing pipeline build and type errors (rafchen, Mar 6, 2026)
- 3db9f27 metadata now used for companyDNA (ezhu15, Mar 7, 2026)
- 071ffa8 Add .env.example and update test-trend-search script with Serper docu… (kien-ship-it, Mar 8, 2026)
- 7b1a180 Resolve merge conflicts from main (kien-ship-it, Mar 8, 2026)
- 0beac71 Remove default value for SEARCH_PROVIDER and improve Serper scoring c… (kien-ship-it, Mar 8, 2026)
- 016cc73 Add geolocation and language parameters to Serper API test request (kien-ship-it, Mar 8, 2026)
- a6ec646 Remove unused import and replace type assertions with non-null assert… (kien-ship-it, Mar 9, 2026)
- 340c353 Merge remote-tracking branch 'origin/edward/feature/metadata-edit-ui'… (Mar 9, 2026)
- 878049c Update ServicesSection.tsx (ezhu15, Mar 9, 2026)
- 63024db Merging updates to Metadata section with Edward's branch (Mar 10, 2026)
- 3b84bba Merge branch 'edward/feature/metadata-edit-ui' of https://github.com/… (Mar 10, 2026)
- 4e8af5a Merge origin/main with feature/metadata-display (Mar 10, 2026)
- b78a39e Fixed errors from merge so that build passes for PR (Mar 10, 2026)
- 615d40c Initial plan (Copilot, Mar 10, 2026)
- e8dbf1e Merge origin/main into feature branch, resolving conflicts and mainta… (Copilot, Mar 10, 2026)
- fff372c Merge pull request #249 from Deodat-Lawson/copilot/sub-pr-243 (kien-ship-it, Mar 10, 2026)
- 49dd096 Merge main into tiffany/feature/multi-provider-web-search (Deodat-Lawson, Mar 23, 2026)
- 9552ac3 fixed special character in db (Deodat-Lawson, Mar 23, 2026)
- 198bc98 Merge pull request #243 from Deodat-Lawson/tiffany/feature/multi-prov… (Deodat-Lawson, Mar 23, 2026)
- 1665d9e Merge pull request #248 from Deodat-Lawson/feature/metadata-display (Deodat-Lawson, Mar 23, 2026)
- a569bce Merge branch 'stable' into main (Deodat-Lawson, Mar 23, 2026)
98 changes: 56 additions & 42 deletions .env.example
@@ -1,65 +1,79 @@
# Since the ".env" file is gitignored, you can use the ".env.example" file to
# build a new ".env" file when you clone the repo. Keep this file up-to-date
# when you add new variables to `.env`.
# Server environment variables
DATABASE_URL="postgresql://user:password@localhost:5432/pdr_ai"

# Clerk Authentication
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY="pk_test_..."
CLERK_SECRET_KEY="sk_test_..."

# OpenAI API
OPENAI_API_KEY="sk-..."

# NextJS client environment variables
NEXT_PUBLIC_UPLOADTHING_ENABLED="true"

# Trend / web search (optional — leave keys unset if you do not use trend search)
# When SEARCH_PROVIDER is unset, the default is Tavily-only (same as "tavily").
# tavily   → requires TAVILY_API_KEY
# serper   → requires SERPER_API_KEY
# fallback → tries Serper first, then falls back to Tavily if Serper returns no results; set both keys for full behavior
# parallel → merges Serper and Tavily results; set both keys for full behavior (with only SERPER_API_KEY set, it runs Serper alone)
# If the key for a selected provider is missing, the pipeline skips that provider; providerUsed may be "none" when no key backs the active path.
TAVILY_API_KEY="tvly-..."
SERPER_API_KEY="..."
SEARCH_PROVIDER="tavily" # "tavily" | "serper" | "fallback" | "parallel"

# Platform API Keys for Marketing Pipeline
REDDIT_CLIENT_ID="your_reddit_client_id"
REDDIT_CLIENT_SECRET="your_reddit_client_secret"
REDDIT_USER_AGENT="your_reddit_user_agent"
TWITTER_BEARER_TOKEN="your_twitter_bearer_token"
LINKEDIN_ACCESS_TOKEN="your_linkedin_access_token"
BLUESKY_HANDLE="your_bluesky_handle"
BLUESKY_APP_PASSWORD="your_bluesky_app_password"

# Document Processing (Optional)
AZURE_DOC_INTELLIGENCE_ENDPOINT="https://..."
AZURE_DOC_INTELLIGENCE_KEY="..."
LANDING_AI_API_KEY="..."

# This file will be committed to version control, so make sure not to have any
# secrets in it. If you are cloning this repo, create a copy of this file named
# ".env" and populate it with your secrets.

# When adding additional environment variables, the schema in "/src/env.js"
# should be updated accordingly.

# Database
DATABASE_URL=""

# Docker Compose: password for PostgreSQL (used by db service)
# POSTGRES_PASSWORD=password

# Clerk Authentication (get from https://clerk.com/)

NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=
CLERK_SECRET_KEY=
# Anthropic API (optional — enables Claude models, get from https://console.anthropic.com/)
ANTHROPIC_API_KEY=
ANTHROPIC_MODEL=

# OpenAI API (get from https://platform.openai.com/)
OPENAI_API_KEY=
OPENAI_MODEL="gpt-5-mini"

# Anthropic API (optional — enables Claude models, get from https://console.anthropic.com/)
ANTHROPIC_API_KEY=

# Google AI API (optional — enables Gemini models, get from https://aistudio.google.com/apikey)
GOOGLE_AI_API_KEY=
GOOGLE_MODEL=

# Ollama (optional — enables local models via Ollama, see https://ollama.com/)
OLLAMA_BASE_URL="http://localhost:11434"
OLLAMA_MODEL="llama3.1:8b"

# UploadThing (get from https://uploadthing.com/)
# LangSmith Tracing (Optional)
LANGCHAIN_TRACING_V2="false"
LANGCHAIN_API_KEY="..."
LANGCHAIN_PROJECT="pdr_ai_v2"

# File Uploading (Optional)
UPLOADTHING_SECRET="your_uploadthing_secret"
UPLOADTHING_APP_ID="your_uploadthing_app_id"
UPLOADTHING_TOKEN="..."

# Datalab OCR API (optional - get from https://www.datalab.to/)
# Required only if you want to enable OCR processing for scanned documents
DATALAB_API_KEY="your_datalab_api_key"

# Landing.AI OCR API (optional - get from https://www.landing.ai/)
LANDING_AI_API_KEY="your_landing_ai_api_key"

# Tavily API (optional - get from https://www.tavily.com/)
TAVILY_API_KEY="your_tavily_api_key"

# Azure Document Intelligence OCR API (optional - get from https://learn.microsoft.com/en-us/azure/applied-ai-services/document-intelligence/quickstarts/get-started-with-rest-api?pivots=programming-language-rest-api)
AZURE_DOC_INTELLIGENCE_ENDPOINT="your_azure_doc_intelligence_endpoint"
AZURE_DOC_INTELLIGENCE_KEY="your_azure_doc_intelligence_key"
# Data APIs (Optional)
DATALAB_API_KEY="..."

# Inngest (required for background document processing - https://inngest.com/)
INNGEST_EVENT_KEY="dev_placeholder"
# Background Jobs
JOB_RUNNER="inngest"
INNGEST_EVENT_KEY="local"
INNGEST_SIGNING_KEY="signkey-dev-xxxxx"

# Sidecar (optional - get from https://github.com/pdr-ai/sidecar)
SIDECAR_URL="your_sidecar_url"
# Sidecar ML Compute (Optional)
SIDECAR_URL="http://localhost:8080"

# Neo4j (optional - get from https://neo4j.com/)
# Neo4j (optional)
NEO4J_URI="your_neo4j_uri"
NEO4J_USERNAME="your_neo4j_username"
NEO4J_PASSWORD="your_neo4j_password"
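The SEARCH_PROVIDER rules described in the comments above can be sketched as a small selection function. This is an illustrative sketch, not the PR's actual `executeSearch` code; the names `Strategy` and `resolveProviders` are assumptions made for the example.

```typescript
// Illustrative sketch of the SEARCH_PROVIDER selection rules from .env.example.
// Names (Strategy, resolveProviders) are hypothetical, not from the PR itself.
type Strategy = "tavily" | "serper" | "fallback" | "parallel";

interface SearchEnv {
  SEARCH_PROVIDER?: Strategy;
  TAVILY_API_KEY?: string;
  SERPER_API_KEY?: string;
}

// Returns the providers to consult, in order; an empty list corresponds
// to providerUsed = "none" (no key backs the active path).
function resolveProviders(env: SearchEnv): string[] {
  const strategy: Strategy = env.SEARCH_PROVIDER ?? "tavily"; // unset → Tavily-only
  const hasTavily = Boolean(env.TAVILY_API_KEY);
  const hasSerper = Boolean(env.SERPER_API_KEY);

  switch (strategy) {
    case "tavily":
      return hasTavily ? ["tavily"] : [];
    case "serper":
      return hasSerper ? ["serper"] : [];
    case "fallback":
      // Serper first; Tavily is only consulted when Serper yields no results.
      return [hasSerper ? "serper" : "", hasTavily ? "tavily" : ""].filter(Boolean);
    case "parallel":
      // Query both and merge; a missing key silently drops that provider.
      return [hasSerper ? "serper" : "", hasTavily ? "tavily" : ""].filter(Boolean);
  }
}

// With parallel selected but only a Serper key present, it degrades to Serper-only.
console.log(resolveProviders({ SEARCH_PROVIDER: "parallel", SERPER_API_KEY: "x" }));
```

Modeling the result as an ordered list keeps "fallback" and "parallel" structurally identical here; the real difference is in execution (sequential with early exit vs. concurrent merge), which this sketch does not cover.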
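The file's note that "the schema in /src/env.js should be updated accordingly" refers to startup-time validation of these variables. As a minimal sketch of what such a schema enforces for a required variable (plain TypeScript with a hypothetical `requireEnv` helper, not the repo's actual `src/env.js`):

```typescript
// Hypothetical helper illustrating the fail-fast check an env schema performs
// when a newly added required variable is missing or empty.
function requireEnv(name: string, value: string | undefined): string {
  if (value === undefined || value === "") {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Required variables fail at startup rather than deep inside a request handler.
const dbUrl = requireEnv(
  "DATABASE_URL",
  "postgresql://user:password@localhost:5432/pdr_ai",
);
console.log(dbUrl.startsWith("postgresql://")); // true
```

Optional variables (like the OCR and tracing keys above) would simply be passed through without this check, so leaving them unset stays safe.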