Automated Content Orchestration Agent - Transform long-form blog posts into platform-optimized social media content using AI-powered prompt engineering.
Content creators and marketing teams spend hours manually "atomizing" blog posts into snippets for different social networks. Each platform requires a different tone, length, and formatting style, making manual repurposing a significant bottleneck in the content lifecycle.
PromptStream acts as a linguistic translator. Given a blog URL, the agent:
- Extracts the core narrative from webpage noise
- Analyzes the blog category, tone, and key themes using an LLM
- Synthesizes three unique outputs via platform-specific prompts
✅ Automated Content Ingestion - Fetches and cleans web content instantly (no manual copy-pasting)
✅ Context-Aware Summarization - Understands technical vs. lifestyle content, adjusts output accordingly
✅ Multi-Persona Generation - Produces three unique outputs simultaneously, each adhering to platform rules
✅ Zero-Shot Formatting - Delivers copy-paste-ready Markdown with hashtags and CTAs
✅ Modular Architecture - Add new platforms by extending prompts.py
| Persona | Platform | Output Style |
|---|---|---|
| Thread Architect | Twitter/X | 280-char multi-part hook sequences |
| Professional Curator | LinkedIn | Corporate storytelling, industry insights |
| Visual Narrator | Instagram | Emoji-rich, punchy, mobile-first captions |
```
Blog URL
   ↓ (BeautifulSoup)
Extract Raw Content
   ↓ (LLM - Category Detection)
Analyze: Category, Tone, Key Themes
   ↓ (prompts.py)
Apply 3 Platform-Specific Prompts
   ↓
Output: 3 Optimized Social Media Posts
```
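The flow above can be sketched as a minimal orchestrator. This is an illustrative outline only; the function names (`extract_content`, `detect_category`, `generate_posts`) and the stubbed return values are assumptions, not the actual API of `orchestrator.py`:

```python
# Hypothetical sketch of the pipeline; the real modules
# (scraper.py, category_detector.py, prompts.py) may differ.

def extract_content(url: str) -> str:
    """Placeholder for the BeautifulSoup extraction step."""
    return f"raw content of {url}"

def detect_category(text: str) -> dict:
    """Placeholder for the LLM category/tone/theme analysis."""
    return {"category": "tech", "tone": "educational", "themes": ["AI"]}

def generate_posts(text: str, analysis: dict) -> dict:
    """Apply one platform-specific persona prompt per platform."""
    personas = {
        "twitter": "Thread Architect",
        "linkedin": "Professional Curator",
        "instagram": "Visual Narrator",
    }
    return {p: f"[{name}] {analysis['tone']} post" for p, name in personas.items()}

def run_pipeline(url: str) -> dict:
    text = extract_content(url)
    analysis = detect_category(text)
    return generate_posts(text, analysis)

posts = run_pipeline("https://example.com/blog-post")
```

Each stage depends only on the previous stage's output, which is what makes the architecture modular: adding a platform only touches the persona mapping.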
| Component | Technology | Version |
|---|---|---|
| Language | Python | 3.11.0 |
| Backend API | FastAPI + Uvicorn | Latest |
| Frontend UI | Streamlit | 1.28.1 |
| LLM Services | Groq / Google Gemini | Latest |
| Web Scraping | BeautifulSoup4 | 4.12.2 |
| HTTP Client | httpx / requests | Latest |
| Config Management | pydantic / python-dotenv | Latest |
```
PromptStream/
├── .venv/                    # Virtual environment
├── backend/                  # FastAPI backend
│   ├── main.py               # Entry point, FastAPI app
│   ├── prompts.py            # Platform-specific prompts (Thread Architect, Professional Curator, Visual Narrator)
│   ├── scraper.py            # BeautifulSoup web scraper
│   ├── category_detector.py  # LLM-based category detection
│   ├── orchestrator.py       # Main orchestration logic
│   └── routes/
│       └── api.py            # API endpoints
├── frontend/                 # Streamlit UI
│   └── app.py                # Streamlit application
├── config/                   # Configuration
│   └── settings.py           # Settings & environment config
├── requirements.txt          # Python dependencies
├── .env.example              # Environment variables template
├── .gitignore                # Git ignore rules
└── README.md                 # This file
```
- Python 3.11.0
- pip (or uv for faster installation)
- API Keys for Groq or Google Gemini
```shell
cd PromptStream
python -m venv .venv
.\.venv\Scripts\Activate     # Windows
# or
source .venv/bin/activate    # macOS/Linux
```

```shell
uv pip install -r requirements.txt
# or
pip install -r requirements.txt
```

```shell
# Copy .env.example to .env
cp .env.example .env
# Edit .env with your API keys
# Add your Groq API Key or Gemini API Key
```

Start FastAPI Backend:

```shell
python backend/main.py
# or
uvicorn backend.main:app --reload --host 127.0.0.1 --port 8000
```

Start Streamlit Frontend (in a new terminal):

```shell
streamlit run frontend/app.py
```

```shell
LLM_PROVIDER=groq
GROQ_API_KEY=your_groq_api_key_here
```

Get a Groq API key: https://console.groq.com

```shell
LLM_PROVIDER=gemini
GEMINI_API_KEY=your_gemini_api_key_here
```

Get a Gemini API key: https://aistudio.google.com/app/apikey
- Open browser to http://localhost:8501
- Paste a blog URL
- Click "Generate"
- Get three optimized social posts
```
POST http://localhost:8000/api/generate
```

```json
{
  "blog_url": "https://example.com/blog-post",
  "platforms": ["twitter", "linkedin", "instagram"]
}
```

- BeautifulSoup targets article tags, main content areas
- Removes noise (ads, sidebars, comments)
- Extracts title, body, metadata
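The extraction step described above can be sketched as follows. The tag selectors and function names are assumptions for illustration; the real `scraper.py` may use different heuristics:

```python
import requests
from bs4 import BeautifulSoup

def clean_article(html: str) -> dict:
    """Keep only the main article content from raw HTML (sketch)."""
    soup = BeautifulSoup(html, "html.parser")
    # Strip common noise elements (scripts, navigation, sidebars)
    for tag in soup(["script", "style", "nav", "aside", "footer", "form"]):
        tag.decompose()
    # Prefer semantic containers; fall back to <body>
    container = soup.find("article") or soup.find("main") or soup.body
    title = soup.title.get_text(strip=True) if soup.title else ""
    body = container.get_text(separator="\n", strip=True) if container else ""
    return {"title": title, "body": body}

def extract_article(url: str) -> dict:
    """Fetch a page and clean it."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return clean_article(resp.text)
```

Separating fetching from parsing keeps the cleaning logic testable without network access.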
- LLM analyzes content:
  - Detects category (tech, lifestyle, business, etc.)
  - Identifies tone (formal, casual, educational, etc.)
  - Extracts key themes and insights
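A minimal sketch of this analysis step, with the LLM call stubbed out: the prompt wording, `analyze_content` name, and JSON schema are hypothetical, and a real implementation would pass a Groq or Gemini client in place of the stub.

```python
import json

# Hypothetical analysis prompt; the real category_detector.py wording may differ.
ANALYSIS_PROMPT = """Analyze the blog post below and reply with JSON only:
{{"category": "...", "tone": "...", "themes": ["..."]}}

Blog post:
{content}
"""

def analyze_content(content: str, llm_call) -> dict:
    """Ask the LLM for category/tone/themes; `llm_call` abstracts the provider."""
    reply = llm_call(ANALYSIS_PROMPT.format(content=content))
    return json.loads(reply)

# Stubbed LLM for illustration; a real client would call the provider API.
fake_llm = lambda prompt: '{"category": "tech", "tone": "educational", "themes": ["AI"]}'
result = analyze_content("How transformers work...", fake_llm)
```

Asking for JSON-only output keeps the downstream parsing trivial and makes the detector easy to swap between providers.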
- `prompts.py` contains platform-specific instructions
- Applies category-aware tone adjustments
- Generates three distinct outputs
- Adds platform-appropriate hashtags
- Includes CTAs (Call-to-Action)
- Returns copy-paste-ready Markdown
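The generation step can be sketched by combining a persona template with the analysis results. The template text and `build_prompt` helper below are illustrative assumptions; the real entries in `prompts.py` will be richer:

```python
# Hypothetical persona templates; the real prompts.py entries will differ.
PERSONA_PROMPTS = {
    "twitter": "You are the Thread Architect. Write a multi-part thread, each part under 280 chars.",
    "linkedin": "You are the Professional Curator. Write corporate storytelling with industry insight.",
    "instagram": "You are the Visual Narrator. Write an emoji-rich, punchy, mobile-first caption.",
}

def build_prompt(platform: str, analysis: dict, content: str) -> str:
    """Combine persona instructions with the detected tone and source text."""
    return (
        f"{PERSONA_PROMPTS[platform]}\n"
        f"Detected tone: {analysis['tone']}. Category: {analysis['category']}.\n"
        f"End with relevant hashtags and a call-to-action.\n\n"
        f"Source:\n{content}"
    )

prompt = build_prompt("twitter", {"tone": "casual", "category": "tech"}, "Blog text...")
```

Injecting the detected tone and category into every persona prompt is what makes the output category-aware without duplicating templates.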
- Open `backend/prompts.py`
- Add a new prompt template following the existing patterns
- Update the orchestrator to include the new platform
- Update the API routes to accept the new platform
Example:

```python
# backend/prompts.py
PLATFORMS = {
    "twitter": {...},
    "linkedin": {...},
    "instagram": {...},
    "tiktok": {...},  # New platform
}
```

```shell
# Run tests
pytest tests/

# Check code quality
flake8 backend/
black --check backend/
```

| Variable | Default | Description |
|---|---|---|
| `LLM_PROVIDER` | `groq` | LLM service (`groq`/`gemini`) |
| `GROQ_API_KEY` | - | Groq API key |
| `GEMINI_API_KEY` | - | Google Gemini API key |
| `FASTAPI_HOST` | `127.0.0.1` | FastAPI server host |
| `FASTAPI_PORT` | `8000` | FastAPI server port |
| `LOG_LEVEL` | `INFO` | Logging verbosity |
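The variables above can be loaded with a small settings object. This sketch uses a plain dataclass with `os.getenv` defaults; the actual `config/settings.py` may use pydantic `BaseSettings` instead, as listed in the tech stack:

```python
import os
from dataclasses import dataclass

# Minimal settings loader mirroring the environment variable table.
# Field names are illustrative; the real config/settings.py may differ.

@dataclass
class Settings:
    llm_provider: str = os.getenv("LLM_PROVIDER", "groq")
    groq_api_key: str = os.getenv("GROQ_API_KEY", "")
    gemini_api_key: str = os.getenv("GEMINI_API_KEY", "")
    fastapi_host: str = os.getenv("FASTAPI_HOST", "127.0.0.1")
    fastapi_port: int = int(os.getenv("FASTAPI_PORT", "8000"))
    log_level: str = os.getenv("LOG_LEVEL", "INFO")

settings = Settings()
```

Reading defaults from the environment keeps the same code working locally (via `.env` and python-dotenv) and in deployment.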
- Database integration for content history
- User authentication & API keys
- Batch processing for multiple URLs
- Custom prompt templates per user
- Platform-specific media optimization
- A/B testing framework for posts
- Analytics dashboard
Contributions welcome! Please follow the existing code style and structure.
- Issues: Create a GitHub issue
- Questions: Check documentation or README
MIT License - Feel free to use PromptStream in your projects!
- FastAPI Documentation
- Streamlit Documentation
- BeautifulSoup Documentation
- Groq API Documentation
- Google Gemini API Documentation
Built with ❤️ for content creators and marketing teams