A streamlined article processing platform that works entirely through your browser extension. No accounts, no dashboards - just save articles and get notifications when they're ready.
```bash
cd /path/to/digestible
./manage.sh start
```

- Open Chrome at chrome://extensions/
- Enable "Developer mode"
- Click "Load unpacked"
- Select the `browser-extension/` folder
- Click the extension icon on any webpage
- Click "Save Article"
- Get notified when processing is complete
- View your saved articles in the extension
- One-click article saving from any webpage
- Automatic processing in the background
- Browser notifications when articles are ready
- Local storage - articles saved in your browser
- No accounts required - works immediately
- FastAPI backend for article processing
- PostgreSQL database for data storage
- Redis queue for background processing
- Docker containers for easy deployment
- Chrome extension with modern UI
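The Redis queue's role can be illustrated with an in-process stand-in (a sketch only — the real system uses Redis so queued work survives restarts and is shared across containers):

```python
# In-process stand-in for the Redis-backed background queue.
# Illustration only; field values and the sentinel shutdown are assumptions.
import queue
import threading

jobs: "queue.Queue" = queue.Queue()
results: dict = {}

def worker() -> None:
    """Drain the queue, 'processing' each article URL."""
    while True:
        url = jobs.get()
        if url is None:            # sentinel: shut down the worker
            break
        results[url] = "complete"  # real worker would fetch/parse/summarize
        jobs.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()
jobs.put("https://example.com/post")
jobs.put(None)
t.join()
print(results)  # → {'https://example.com/post': 'complete'}
```

The same producer/consumer shape holds in production: FastAPI enqueues, a background worker dequeues and writes results to PostgreSQL.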
```bash
# Start all services
./manage.sh start

# Stop all services
./manage.sh stop

# Check status
./manage.sh status

# View logs
./manage.sh logs backend
./manage.sh logs

# Restart services
./manage.sh restart
```

```
┌─────────────────┐
│     Browser     │
│    Extension    │  ← Stores articles locally
│    (Chrome)     │  ← Shows notifications
└─────────────────┘
         │
         ▼
┌─────────────────┐      ┌─────────────────┐
│     FastAPI     │─────▶│   PostgreSQL    │
│     Backend     │      │    Database     │
│  (Processing)   │      └─────────────────┘
└─────────────────┘               ▲
         │                        │
         ▼                        │
┌─────────────────┐               │
│   Redis Queue   │───────────────┘
│   (Background   │
│     Tasks)      │
└─────────────────┘
```
- Popup Interface: Clean, modern design
- Article List: View all your saved articles
- Status Tracking: See processing progress
- Local Storage: Articles stored in browser
- `manifest.json` - Extension configuration
- `popup.html/js` - Main interface
- `background.js` - Background processing & notifications
- `styles.css` - Modern UI styling
- `POST /api/v1/articles` - Submit article for processing
- `GET /api/v1/articles` - List all articles
- `GET /api/v1/articles/{id}` - Get specific article
- `GET /health` - Health check
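A minimal client for these endpoints might look like the sketch below. The JSON payload shape (`{"url": ...}`) is an assumption — check the live docs at http://localhost:8000/docs for the real schema:

```python
# Hypothetical client helpers for the API above; payload shape is assumed.
import json
import urllib.request

BASE = "http://localhost:8000"

def submit_request(page_url: str) -> urllib.request.Request:
    """Build the POST request that submits an article for processing."""
    body = json.dumps({"url": page_url}).encode()  # assumed payload shape
    return urllib.request.Request(
        f"{BASE}/api/v1/articles",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def article_url(article_id: int) -> str:
    """URL for fetching one processed article."""
    return f"{BASE}/api/v1/articles/{article_id}"

req = submit_request("https://example.com/post")
print(req.full_url, req.method)  # → http://localhost:8000/api/v1/articles POST
```

Sending the request with `urllib.request.urlopen(req)` requires the backend to be running (`./manage.sh start`).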
- Server: PostgreSQL database stores processed articles
- Browser: Chrome local storage keeps article list and metadata
- Automatic Sync: Extension polls server for updates
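The poll-and-merge step can be sketched as a pure function; the `id`/`status` field names here are assumptions about the article objects the API returns:

```python
# Sketch of the extension's poll-and-merge logic; "id" and "status"
# field names are illustrative, not confirmed against the real API.
def merge_server_state(local: dict, server_articles: list) -> list:
    """Update the local article map from a server poll and return the IDs
    that just finished processing (these would trigger notifications)."""
    newly_ready = []
    for art in server_articles:
        previous = local.get(art["id"])
        just_done = art["status"] == "complete" and (
            previous is None or previous["status"] != "complete"
        )
        if just_done:
            newly_ready.append(art["id"])
        local[art["id"]] = art
    return newly_ready

store: dict = {}
print(merge_server_state(store, [{"id": 1, "status": "complete"}]))  # → [1]
```

Because the function only reports transitions to "complete", repeated polls do not re-notify for articles the user has already been told about.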
- Check the extension is loaded at chrome://extensions/
- Check the API is running: `curl http://localhost:8000/health`
- Check the browser console for errors
```bash
# Check Docker
docker info

# Check logs
./manage.sh logs

# Restart
./manage.sh restart

# Reset database
./manage.sh stop
docker volume rm digestible_postgres_data
./manage.sh start

# Check logs
docker compose logs

# Rebuild containers
docker compose down
docker compose up --build

# Test connection
docker compose exec backend python -c "from backend.database import engine; import asyncio; asyncio.run(engine.connect())"

# Reset migrations
docker compose exec backend alembic downgrade base
docker compose exec backend alembic upgrade head

# Fix file permissions
sudo chown -R $USER:$USER .
```

- Code changes auto-reload
- Check logs: `./manage.sh logs backend`
- API docs: http://localhost:8000/docs
- Edit files in `browser-extension/`
- Reload extension in chrome://extensions/
- Test with live API
- Processing Time: 10-30 seconds per article
- Storage: No fixed article limit (bounded only by server-side disk space)
- Offline: Access saved articles without internet
- Sync: Automatic updates when online
- Research: Save articles for later reading
- Content Creation: Collect sources and references
- Learning: Build personal knowledge base
- Productivity: Quick article processing and summaries
Ready to save your first article?

```bash
./manage.sh start
# Then load the extension and start saving!
```
```bash
# Python linting (backend)
docker compose exec backend black backend/
docker compose exec backend ruff check backend/

# FastAPI tests
docker compose exec backend pytest

# Browser extension tests
./test-ci-local.sh

# FastAPI schema changes
docker compose exec backend alembic revision --autogenerate -m "description"
docker compose exec backend alembic upgrade head
```

Articles submitted to FastAPI go through this async pipeline:
```
User submits URL
       ↓
FastAPI receives → Creates DB record
       ↓
Background task starts
       ↓
┌───────────────────────────────────┐
│ 1. FETCH     → Download HTML      │
│ 2. PARSE     → Extract text       │
│ 3. CHUNK     → Split into parts   │
│ 4. SUMMARIZE → Generate summary   │ ← Phase 1: AI integration
│ 5. RENDER    → Output formats     │ ← Phase 1: TTS/audio
└───────────────────────────────────┘
       ↓
Store in PostgreSQL
       ↓
Browser extension displays results
```
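The middle stages of the pipeline can be sketched in miniature. The tag-stripping parser and the fixed chunk size are stand-ins for illustration, not the backend's real extraction logic, and SUMMARIZE is stubbed pending the Phase 1 AI integration:

```python
# Toy versions of the PARSE / CHUNK / SUMMARIZE stages (illustrative only).
import re

def parse(html: str) -> str:
    """PARSE: crude tag-stripping stand-in for real article extraction."""
    return re.sub(r"<[^>]+>", " ", html).strip()

def chunk(text: str, size: int = 50) -> list:
    """CHUNK: split extracted text into fixed-size word groups."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def summarize(chunks: list) -> str:
    """SUMMARIZE: placeholder until the Phase 1 AI integration lands."""
    return chunks[0][:80] if chunks else ""

text = parse("<h1>Title</h1><p>Body text goes here.</p>")
print(chunk(text))  # → ['Title Body text goes here.']
```

In the real pipeline these run as a background task after the DB record is created, with the rendered output written back to PostgreSQL.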
- `DATABASE_URL` - PostgreSQL connection string
- `REDIS_URL` - Redis connection (defaults to `redis://redis:6379/0`)
- `DEBUG` - Enable debug mode (default: true)
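One way these defaults could be applied in code — a sketch only; the backend's actual settings module may differ:

```python
# Hypothetical settings loader applying the documented defaults.
import os

def load_settings() -> dict:
    return {
        # DATABASE_URL has no default, so a missing value fails fast.
        "database_url": os.environ["DATABASE_URL"],
        "redis_url": os.getenv("REDIS_URL", "redis://redis:6379/0"),
        "debug": os.getenv("DEBUG", "true").lower() == "true",
    }
```

Raising on a missing `DATABASE_URL` at startup is deliberate: a misconfigured container should fail immediately rather than at the first request.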
The project provides pre-built Docker images available on both GitHub Container Registry and DockerHub:
- Repository: `vanistas/digestible-backend`
- Latest image: `docker pull vanistas/digestible-backend:latest`
- Tags: `latest`, branch-specific tags, and commit SHAs

- Repository: `ghcr.io/ASHEN-IX/digestible/backend`
- Latest image: `docker pull ghcr.io/ASHEN-IX/digestible/backend:latest`
Run with Docker Compose (Recommended):

```bash
docker compose up -d
```

Run Backend Only:

```bash
# Using the DockerHub image
docker run -d \
  --name digestible-backend \
  -p 8000:8000 \
  -e DATABASE_URL="postgresql://..." \
  -e REDIS_URL="redis://..." \
  vanistas/digestible-backend:latest

# Using the GHCR image
docker run -d \
  --name digestible-backend \
  -p 8000:8000 \
  -e DATABASE_URL="postgresql://..." \
  -e REDIS_URL="redis://..." \
  ghcr.io/ASHEN-IX/digestible/backend:latest
```

Manual Deployment:

```bash
# Build images locally
docker build -f Dockerfile.api -t digestible-backend .
docker build -f dashboard/Dockerfile -t digestible-dashboard .

# Deploy
docker compose -f docker-compose.prod.yml up -d
```

The CI/CD pipeline automatically:
- Runs tests on every push to the `main` and `develop` branches
- Builds Docker images on successful tests
- Pushes images to both GitHub Container Registry and DockerHub
- Deploys to production environment (when configured)
Workflow Status: GitHub Actions
- Fork the repository
- Create a feature branch
- Make your changes
- Run tests: `docker compose exec backend pytest`
- Submit a pull request