# AI Personal Assistant

This project is an AI-powered personal assistant built with Streamlit for the frontend and FastAPI for the backend. The backend calls the OpenAI API to generate responses, and the frontend provides a user-friendly chat interface.
## Features

- Chat Interface: Users can interact with the AI assistant via a chat interface.
- Session Management: Each conversation is associated with a unique session ID.
- Conversation History: The chat history is stored in an SQLite database.
- OpenAI Integration: The backend uses the OpenAI API to generate responses.
## Prerequisites

Before running the project, ensure you have the following installed:

- Docker
- Docker Compose
## Project Structure

```
ai-personal-assistant/
├── backend/
│   ├── app.py              # Backend application code
│   └── Dockerfile          # Dockerfile for the backend
├── frontend/
│   ├── streamlit_app.py    # Frontend application code
│   └── Dockerfile          # Dockerfile for the frontend
├── docker-compose.yml      # Docker Compose configuration
└── README.md               # Project documentation
```
## Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/your-username/ai-personal-assistant.git
   cd ai-personal-assistant
   ```

2. Use Docker Compose to build and run the backend and frontend services:

   ```bash
   docker-compose up --build
   ```

   This will:
   - Build the Docker images for the backend and frontend.
   - Start the backend service on port 8000.
   - Start the frontend service on port 8501.
## Accessing the Application

- Frontend: Open your browser and go to http://localhost:8501.
- Backend: The backend API will be available at http://localhost:8000.
## Usage

1. Enter OpenAI API Key:
   - On the left sidebar of the Streamlit frontend, enter your OpenAI API key.
2. Start Chatting:
   - Type your message in the chat input box and press Enter.
   - The AI assistant will respond to your queries.
3. Reset Conversation:
   - Use the "Start New Conversation" button in the sidebar to reset the chat history.
## Docker Compose Configuration

The docker-compose.yml file defines two services:

- Backend:
  - Built from ./backend/Dockerfile.
  - Exposes port 8000 for the FastAPI backend.
- Frontend:
  - Built from ./frontend/Dockerfile.
  - Exposes port 8501 for the Streamlit frontend.
  - Depends on the backend service.
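The configuration described above might look roughly like the sketch below. Service names, build contexts, and port mappings are assumptions based on this README; check the actual docker-compose.yml for details such as environment variables.

```yaml
version: "3.8"

services:
  backend:
    build: ./backend        # uses ./backend/Dockerfile
    ports:
      - "8000:8000"         # FastAPI backend

  frontend:
    build: ./frontend       # uses ./frontend/Dockerfile
    ports:
      - "8501:8501"         # Streamlit frontend
    depends_on:
      - backend             # start the backend first
```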
## Backend

The backend is built using FastAPI and provides the following functionality:

- API Endpoint: POST /llm
  - Accepts a JSON payload with the user's question, conversation history, session ID, and OpenAI API key.
  - Returns the AI-generated response.
- Database:
  - Conversations are stored in an SQLite database (conversations.db).
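As a concrete illustration, a client call to the endpoint might look like the sketch below. The field names in the payload are assumptions inferred from the description above, not the backend's actual schema; check app.py for the real field names.

```python
import json

def build_llm_payload(question: str, history: list, session_id: str, api_key: str) -> dict:
    """Assemble the JSON body for POST /llm.

    Field names here are illustrative guesses based on the README's
    description of the endpoint (question, history, session ID, API key).
    """
    return {
        "question": question,
        "conversation_history": history,
        "session_id": session_id,
        "openai_api_key": api_key,
    }

payload = build_llm_payload("Hello!", [], "session-123", "sk-your-key")
print(json.dumps(payload))

# To send it against a running backend:
#   import requests
#   resp = requests.post("http://localhost:8000/llm", json=payload)
#   print(resp.json())
```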
## Frontend

The frontend is built using Streamlit and provides the following functionality:

- Chat Interface:
  - Users can enter their OpenAI API key in the sidebar.
  - Users can interact with the AI assistant via a chat interface.
- Session Management:
  - Each conversation is associated with a unique session ID.
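One common way to generate such a unique session ID is with a UUID. This is a sketch of the technique, not necessarily how the app implements it:

```python
import uuid

def new_session_id() -> str:
    """Return a globally unique ID to tag one conversation."""
    return str(uuid.uuid4())

# Each call yields a fresh ID, so two conversations never collide.
print(new_session_id())
```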
## Stopping the Services

To stop the services, run:

```bash
docker-compose down
```

## Troubleshooting

- Docker Compose Fails to Start:
  - Ensure Docker and Docker Compose are installed correctly.
  - Check the logs for errors using `docker-compose logs`.
- Frontend Cannot Connect to Backend:
  - Ensure the backend service is running and accessible at http://localhost:8000.
  - Verify the frontend is correctly configured to call the backend API.
- OpenAI API Key Not Working:
  - Ensure the API key is valid and has sufficient credits.
## License

This project is licensed under the MIT License. See the LICENSE file for details.
## Acknowledgments

- OpenAI: For providing the GPT-3.5-turbo model.
- Streamlit: For the easy-to-use frontend framework.
- FastAPI: For the high-performance backend framework.