jaiharini004/PromptStream

PromptStream 🚀

Automated Content Orchestration Agent - Transform long-form blog posts into platform-optimized social media content using AI-powered prompt engineering.


🎯 Problem Statement

Content creators and marketing teams spend hours manually "atomizing" blog posts into snippets for different social networks. Each platform requires a different tone, length, and formatting style, making manual repurposing a significant bottleneck in the content lifecycle.

💡 Solution

PromptStream acts as a linguistic translator. Given a blog URL, the agent:

  1. Extracts the core narrative from webpage noise
  2. Analyzes the blog's category, tone, and key themes using an LLM
  3. Synthesizes three unique outputs via platform-specific prompts

✨ Key Features

Automated Content Ingestion - Fetches and cleans web content instantly (no manual copy-pasting)

Context-Aware Summarization - Understands technical vs. lifestyle content, adjusts output accordingly

Multi-Persona Generation - Produces three unique outputs simultaneously, each adhering to platform rules

Zero-Shot Formatting - Delivers copy-paste-ready Markdown with hashtags and CTAs

Modular Architecture - Add new platforms by extending prompts.py


🧠 Core Architecture: Prompt Orchestration

The Three Personas

| Persona | Platform | Output Style |
| --- | --- | --- |
| Thread Architect | Twitter/X | 280-char multi-part hook sequences |
| Professional Curator | LinkedIn | Corporate storytelling, industry insights |
| Visual Narrator | Instagram | Emoji-rich, punchy, mobile-first captions |

Processing Pipeline

Blog URL 
  ↓ (BeautifulSoup)
Extract Raw Content
  ↓ (LLM - Category Detection)
Analyze: Category, Tone, Key Themes
  ↓ (prompts.py)
Apply 3 Platform-Specific Prompts
  ↓
Output: 3 Optimized Social Media Posts
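As a rough illustration, the pipeline above can be sketched as a single orchestration function. The names (`orchestrate`, `Analysis`, the injected callables) are illustrative, not the actual API of orchestrator.py:

```python
from dataclasses import dataclass

@dataclass
class Analysis:
    category: str
    tone: str
    themes: list

def orchestrate(url, fetch, analyze, prompt_templates, generate):
    """Scrape -> analyze -> apply persona prompts -> one post per platform."""
    raw = fetch(url)          # scraper.py (BeautifulSoup extraction)
    analysis = analyze(raw)   # category_detector.py (LLM analysis)
    posts = {}
    for platform, template in prompt_templates.items():
        prompt = template.format(content=raw,
                                 category=analysis.category,
                                 tone=analysis.tone)
        posts[platform] = generate(prompt)  # one LLM call per platform
    return posts
```

Passing the scraper, detector, and LLM client in as callables keeps each pipeline stage independently testable.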

🛠️ Tech Stack

| Component | Technology | Version |
| --- | --- | --- |
| Language | Python | 3.11.0 |
| Backend API | FastAPI + Uvicorn | Latest |
| Frontend UI | Streamlit | 1.28.1 |
| LLM Services | Groq / Google Gemini | Latest |
| Web Scraping | BeautifulSoup4 | 4.12.2 |
| HTTP Client | httpx / requests | Latest |
| Config Management | pydantic / python-dotenv | Latest |

📦 Project Structure

PromptStream/
├── .venv/                          # Virtual environment
├── backend/                        # FastAPI backend
│   ├── main.py                     # Entry point, FastAPI app
│   ├── prompts.py                  # Platform-specific prompts (Thread Architect, Professional Curator, Visual Narrator)
│   ├── scraper.py                  # BeautifulSoup web scraper
│   ├── category_detector.py        # LLM-based category detection
│   ├── orchestrator.py             # Main orchestration logic
│   └── routes/
│       └── api.py                  # API endpoints
├── frontend/                       # Streamlit UI
│   └── app.py                      # Streamlit application
├── config/                         # Configuration
│   └── settings.py                 # Settings & environment config
├── requirements.txt                # Python dependencies
├── .env.example                    # Environment variables template
├── .gitignore                      # Git ignore rules
└── README.md                       # This file

🚀 Installation & Setup

Prerequisites

  • Python 3.11.0
  • pip (or uv for faster installation)
  • API Keys for Groq or Google Gemini

Step 1: Clone & Navigate

git clone https://github.com/jaiharini004/PromptStream.git
cd PromptStream

Step 2: Create & Activate Virtual Environment

python -m venv .venv
.\.venv\Scripts\Activate  # Windows
# or
source .venv/bin/activate  # macOS/Linux

Step 3: Install Dependencies

uv pip install -r requirements.txt
# or
pip install -r requirements.txt

Step 4: Configure Environment Variables

# Copy .env.example to .env
cp .env.example .env

# Edit .env with your API keys
# Add your Groq API Key or Gemini API Key

Step 5: Run the Application

Start FastAPI Backend:

python backend/main.py
# or
uvicorn backend.main:app --reload --host 127.0.0.1 --port 8000

Start Streamlit Frontend (in new terminal):

streamlit run frontend/app.py

🔑 API Configuration

Using Groq (Recommended - Faster)

LLM_PROVIDER=groq
GROQ_API_KEY=your_groq_api_key_here

Get Groq API Key: https://console.groq.com

Using Google Gemini

LLM_PROVIDER=gemini
GEMINI_API_KEY=your_gemini_api_key_here

Get Gemini API Key: https://aistudio.google.com/app/apikey


💻 Usage

Via Streamlit UI (Recommended)

  1. Open browser to http://localhost:8501
  2. Paste blog URL
  3. Click "Generate"
  4. Get three optimized social posts

Via FastAPI

POST http://localhost:8000/api/generate

{
  "blog_url": "https://example.com/blog-post",
  "platforms": ["twitter", "linkedin", "instagram"]
}
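A minimal Python client for this endpoint might look like the following. The field names come from the request body above; `build_payload` and `request_posts` are illustrative helper names, and httpx (from the tech stack) is imported lazily:

```python
def build_payload(blog_url, platforms):
    """Shape the request body the /api/generate endpoint expects."""
    return {"blog_url": blog_url, "platforms": list(platforms)}

def request_posts(blog_url, platforms, base_url="http://localhost:8000"):
    """POST the blog URL to the backend and return the parsed JSON response."""
    import httpx  # requests would work the same way
    resp = httpx.post(f"{base_url}/api/generate",
                      json=build_payload(blog_url, platforms),
                      timeout=60.0)
    resp.raise_for_status()
    return resp.json()
```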

📝 How It Works

Step 1: Content Extraction

  • BeautifulSoup targets article tags, main content areas
  • Removes noise (ads, sidebars, comments)
  • Extracts title, body, metadata
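A simplified sketch of this extraction step, assuming the noise-tag list and the `extract_content` helper name rather than the actual contents of scraper.py:

```python
from bs4 import BeautifulSoup

# Tags treated as noise in this sketch; the real scraper may differ.
NOISE_TAGS = ["script", "style", "nav", "aside", "footer", "form"]

def extract_content(html):
    """Pull the article title and body text out of raw HTML, dropping noise."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(NOISE_TAGS):
        tag.decompose()                       # strip ads, sidebars, chrome
    # Prefer semantic containers, falling back to the whole body.
    root = soup.find("article") or soup.find("main") or soup.body or soup
    title = soup.title.get_text(strip=True) if soup.title else ""
    body = " ".join(root.get_text(separator=" ").split())
    return {"title": title, "body": body}
```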

Step 2: Category Detection

  • LLM analyzes content:
    • Detects category (tech, lifestyle, business, etc.)
    • Identifies tone (formal, casual, educational, etc.)
    • Extracts key themes and insights
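One common way to implement this step is to ask the LLM for structured JSON and parse the reply. The prompt wording and the `detect_category` helper below are assumptions, not the actual category_detector.py:

```python
import json

ANALYSIS_PROMPT = """Classify the blog post below.
Return JSON with keys: category, tone, themes (a list of strings).

POST:
{content}"""

def detect_category(content, llm):
    """Ask the LLM for category/tone/themes and parse its JSON reply."""
    reply = llm(ANALYSIS_PROMPT.format(content=content))
    data = json.loads(reply)
    return data["category"], data["tone"], data["themes"]
```

Requesting JSON keeps the downstream prompt-selection logic simple and lets parsing failures be caught early.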

Step 3: Prompt Application

  • prompts.py contains platform-specific instructions
  • Applies category-aware tone adjustments
  • Generates three distinct outputs
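The persona prompts might be organized as a template dictionary along these lines. The template wording here is illustrative; the real prompts live in prompts.py:

```python
PERSONAS = {
    "twitter": (
        "You are the Thread Architect. Turn the summary below into a "
        "multi-tweet thread; keep every tweet under 280 characters and "
        "open with a strong hook.\n\nTONE: {tone}\nSUMMARY: {summary}"
    ),
    "linkedin": (
        "You are the Professional Curator. Rewrite the summary as a "
        "LinkedIn post using corporate storytelling and one industry "
        "insight.\n\nTONE: {tone}\nSUMMARY: {summary}"
    ),
    "instagram": (
        "You are the Visual Narrator. Produce a punchy, emoji-rich, "
        "mobile-first caption.\n\nTONE: {tone}\nSUMMARY: {summary}"
    ),
}

def render_prompt(platform, summary, tone):
    """Fill a persona template with the analyzed summary and tone."""
    return PERSONAS[platform].format(summary=summary, tone=tone)
```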

Step 4: Formatting

  • Adds platform-appropriate hashtags
  • Includes CTAs (calls to action)
  • Returns copy-paste-ready Markdown
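The formatting step can be as simple as appending hashtags and a CTA to the generated text; `format_post` is a hypothetical helper:

```python
def format_post(text, hashtags, cta):
    """Append hashtags and a call-to-action, returning Markdown ready to paste."""
    tags = " ".join("#" + t.lstrip("#") for t in hashtags)  # normalize '#'
    return f"{text}\n\n{tags}\n\n**{cta}**"
```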

🔄 Adding New Platforms

  1. Open backend/prompts.py
  2. Add new prompt template following existing patterns
  3. Update orchestrator to include new platform
  4. Update API routes to accept new platform

Example:

# backend/prompts.py
PLATFORMS = {
    "twitter": {...},
    "linkedin": {...},
    "instagram": {...},
    "tiktok": {...}  # New platform
}
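A fuller entry for the new platform might carry both a persona name and a prompt template. Everything below (the "Trend Translator" persona, the dict shape) is hypothetical, shown only to suggest what an extension could look like:

```python
# Hypothetical shape of the PLATFORMS registry in backend/prompts.py.
PLATFORMS = {
    "tiktok": {
        "persona": "Trend Translator",
        "template": (
            "You are the Trend Translator. Turn the summary into a "
            "30-second video script: hook first, three quick beats, "
            "trending hashtags at the end.\n\nSUMMARY: {summary}"
        ),
    },
}
```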

🧪 Testing

# Run tests
pytest tests/

# Check code quality
flake8 backend/
black --check backend/
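A unit test in tests/ could look like this; `truncate_tweet` is a hypothetical helper, shown only to illustrate the pytest style:

```python
# tests/test_formatting.py (hypothetical)

def truncate_tweet(text, limit=280):
    """Clip a tweet to the platform limit, adding an ellipsis when cut."""
    return text if len(text) <= limit else text[: limit - 1] + "…"

def test_truncate_tweet_respects_limit():
    assert len(truncate_tweet("x" * 500)) <= 280

def test_short_tweet_unchanged():
    assert truncate_tweet("hello") == "hello"
```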

📄 Environment Variables Reference

| Variable | Default | Description |
| --- | --- | --- |
| LLM_PROVIDER | groq | LLM service (groq/gemini) |
| GROQ_API_KEY | - | Groq API key |
| GEMINI_API_KEY | - | Google Gemini API key |
| FASTAPI_HOST | 127.0.0.1 | FastAPI server host |
| FASTAPI_PORT | 8000 | FastAPI server port |
| LOG_LEVEL | INFO | Logging verbosity |
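settings.py presumably wires these up with pydantic and python-dotenv; the same variables and defaults can be sketched with the standard library alone (`Settings` and `load_settings` are illustrative names):

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    llm_provider: str
    fastapi_host: str
    fastapi_port: int
    log_level: str

def load_settings(env=os.environ):
    """Read the documented variables, falling back to their defaults."""
    return Settings(
        llm_provider=env.get("LLM_PROVIDER", "groq"),
        fastapi_host=env.get("FASTAPI_HOST", "127.0.0.1"),
        fastapi_port=int(env.get("FASTAPI_PORT", "8000")),
        log_level=env.get("LOG_LEVEL", "INFO"),
    )
```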

🚧 Roadmap

  • Database integration for content history
  • User authentication & API keys
  • Batch processing for multiple URLs
  • Custom prompt templates per user
  • Platform-specific media optimization
  • A/B testing framework for posts
  • Analytics dashboard

🤝 Contributing

Contributions welcome! Please follow the existing code style and structure.


📞 Support

  • Issues: Create a GitHub issue
  • Questions: Check documentation or README

📜 License

MIT License - Feel free to use PromptStream in your projects!


Built with ❤️ for content creators and marketing teams

About

PromptStream is an AI-powered content orchestration tool that converts a blog URL into platform-ready social media posts. It scrapes and analyzes article content, detects category and tone, and generates tailored outputs optimized for Twitter, LinkedIn, and Instagram, saving creators time and ensuring platform-appropriate messaging.
