AI-powered tool for generating professional bios, project summaries, and learning reflections using Google's Gemini AI.
- Interactive Chat Interface: Natural conversation with AI to build your portfolio content
- Multiple Content Types: Generate bios, project descriptions, skills summaries, and more
- Context-Aware: Maintains conversation history for coherent, personalized responses
- Modern UI: Clean, responsive interface built with Streamlit
- Python 3.10 or higher
- Google AI API key (get one from Google AI Studio)
- Git (optional)
```bash
git clone https://github.com/dev-api-org/ai-portfolio-assistant.git
cd ai-portfolio-assistant
```

Create and activate a virtual environment.

Windows (PowerShell):

```powershell
py -m venv .venv
.\.venv\Scripts\Activate
```

macOS/Linux:

```bash
python3 -m venv .venv
source .venv/bin/activate
```

Install dependencies:

```bash
# Upgrade pip (recommended)
python -m pip install --upgrade pip

# Install all dependencies
pip install -r requirements.txt
```

Configure your API key:

```bash
# Copy the example file
cp .env.example .env

# Edit .env and add your Google API key
# GOOGLE_API_KEY=your_actual_api_key_here
```

Run the app:

```bash
streamlit run frontend/streamlit_chat_canvas.py
```

The app will open in your browser at http://localhost:8501.
Ensure these files are in your repository:
- `requirements.txt` (unified dependencies)
- `backend/__init__.py` (makes backend a Python package)
- `.streamlit/config.toml` (Streamlit configuration)
- `.env.example` (template for local development)
- `.streamlit/secrets.toml.example` (template for cloud secrets)
- Sign in to Streamlit Cloud
- Click "New app"
- Configure your app:
  - Repository: `dev-api-org/ai-portfolio-assistant`
  - Branch: `main`
  - Main file path: `frontend/streamlit_chat_canvas.py`
- Click "Advanced settings"
- Set Python version: `3.10` or higher
In Streamlit Cloud, go to App Settings > Secrets and add:
```toml
GOOGLE_API_KEY = "your_google_api_key_here"
MODEL_NAME = "gemini-2.0-flash-exp"
MODEL_TEMPERATURE = "0.7"
```

Click "Deploy" and wait for the build to complete.
- Do **not** commit `.env` with real API keys
- Do **not** commit `.streamlit/secrets.toml` with real secrets
- Do use `.env.example` and `.streamlit/secrets.toml.example` as templates
The app uses the following environment variables:
| Variable | Required | Default | Description |
|---|---|---|---|
| `GOOGLE_API_KEY` | Yes | - | Your Google AI API key |
| `MODEL_NAME` | No | `gemini-2.0-flash-exp` | Gemini model to use |
| `MODEL_TEMPERATURE` | No | `0.7` | Model creativity (0.0-1.0) |
| `GLOBAL_SYSTEM_PROMPT` | No | From config | Custom system prompt |
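Environment variables always arrive as strings, so `MODEL_TEMPERATURE` needs to be parsed to a float and the defaults from the table applied. A rough sketch of how such loading could work (`Settings` and `load_settings` are illustrative names, not the actual contents of `backend/config.py`):

```python
import os
from dataclasses import dataclass

@dataclass
class Settings:
    api_key: str
    model_name: str
    temperature: float

def load_settings() -> Settings:
    """Read the variables from the table above, applying the documented
    defaults. MODEL_TEMPERATURE arrives as a string and must be parsed."""
    api_key = os.environ.get("GOOGLE_API_KEY")
    if not api_key:
        raise RuntimeError("GOOGLE_API_KEY is required")
    return Settings(
        api_key=api_key,
        model_name=os.environ.get("MODEL_NAME", "gemini-2.0-flash-exp"),
        temperature=float(os.environ.get("MODEL_TEMPERATURE", "0.7")),
    )

# Demo with a placeholder key; in the real app the key comes from
# .env (local) or Streamlit secrets (cloud)
os.environ["GOOGLE_API_KEY"] = "dummy-key-for-demo"
settings = load_settings()
print(settings.model_name, settings.temperature)
```

Failing fast when `GOOGLE_API_KEY` is missing surfaces a clear error at startup instead of a confusing API failure mid-chat.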
```
ai-portfolio-assistant/
├── backend/
│   ├── __init__.py               # Makes backend a package
│   ├── chat_core.py              # Core chat logic
│   ├── config.py                 # Configuration management
│   ├── session_memory.py         # Session state management
│   ├── prompts.json              # Prompt templates
│   └── systemprompts.json        # System prompts
├── frontend/
│   ├── components/               # Reusable UI components
│   ├── img/                      # Images and assets
│   ├── pages/                    # Additional pages
│   ├── streamlit_chat_canvas.py  # Main app
│   └── utils.py                  # Utility functions
├── .streamlit/
│   ├── config.toml               # Streamlit configuration
│   └── secrets.toml.example      # Secrets template
├── requirements.txt              # Python dependencies
├── .env.example                  # Environment template
├── .gitignore                    # Git ignore rules
└── README.md                     # This file
```
To test the backend independently:
```bash
python backend/llm_service.py
```

This runs a terminal-based chat to verify your API connection.
**Import errors for the `backend` package**

Solution: Ensure `backend/__init__.py` exists (it should be an empty file).
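A generic check like the following (not a script shipped with this repo) can confirm from the repository root that Python will treat `backend/` as a regular package:

```python
from pathlib import Path

def is_package(dir_path: str) -> bool:
    """True if the directory exists and contains an __init__.py,
    i.e. Python will treat it as a regular package."""
    p = Path(dir_path)
    return p.is_dir() and (p / "__init__.py").is_file()

print(is_package("backend"))

# If the marker is missing, an empty file is enough to fix it:
# Path("backend/__init__.py").touch()
```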
**API key errors**

Solution:
- Verify your API key at Google AI Studio
- Check that `GOOGLE_API_KEY` is set in `.env` (local) or Streamlit secrets (cloud)
- Ensure there are no extra spaces or quotes in the key
**Deployment failures on Streamlit Cloud**

Solution:
- Check that `requirements.txt` is in the repository root
- Verify all imports use the package structure (`from backend import ...`)
- Review the build logs in the Streamlit Cloud dashboard
- The app maintains conversation history per session
- Session data is stored in memory (resets on restart)
- For production, consider adding persistent storage
- Rate limits apply based on your Google AI API tier
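The per-session, in-memory history described above could be sketched roughly like this (illustrative only; the actual implementation lives in `backend/session_memory.py` and is not reproduced here):

```python
class SessionMemory:
    """Keeps per-session chat history in a plain dict, so everything is
    lost when the process restarts, matching the behavior noted above."""

    def __init__(self) -> None:
        self._sessions: dict[str, list[dict[str, str]]] = {}

    def append(self, session_id: str, role: str, text: str) -> None:
        self._sessions.setdefault(session_id, []).append(
            {"role": role, "text": text}
        )

    def history(self, session_id: str) -> list[dict[str, str]]:
        # Unknown sessions get an empty history instead of raising
        return self._sessions.get(session_id, [])

memory = SessionMemory()
memory.append("user-1", "user", "Write me a short bio")
memory.append("user-1", "assistant", "Sure, tell me about your background.")
print(len(memory.history("user-1")))  # 2
```

Swapping the dict for a database-backed store would be the natural path to the persistent storage suggested above.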
- Fork the repository
- Create a feature branch
- Make your changes
- Test locally
- Submit a pull request
This project is for educational and portfolio purposes.