A facial expression and meme detection system that uses deep learning to classify emotions in images. Built for CSCI218 - Foundations of AI (UOW).
## Features

- Emotion detection — Classifies faces into 7 emotions: Surprise, Fear, Disgust, Happiness, Sadness, Anger, Neutral
- Face detection — Uses OpenCV Haar Cascade for face localisation
- Hand gesture detection — MediaPipe for hand landmark detection
- Pre-trained model — Emotion classification via ONNX (models in `FER_models/models/`)
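The model's 7-way output can be decoded into a label and per-emotion probabilities along these lines. This is a minimal sketch: the true class-index order is fixed by the training notebooks, and the ordering below simply mirrors the feature list above, so treat it as an assumption.

```python
import math

# Emotion labels in the order listed above. The real index order is
# defined by the training notebooks — this ordering is an assumption.
EMOTIONS = ["Surprise", "Fear", "Disgust", "Happiness", "Sadness", "Anger", "Neutral"]

def softmax(logits):
    """Convert raw model logits into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def decode(logits):
    """Return (label, probability) for the most likely emotion."""
    probs = softmax(logits)
    idx = max(range(len(probs)), key=probs.__getitem__)
    return EMOTIONS[idx], probs[idx]
```

The same probabilities are what a UI would render as per-emotion bars.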
## Requirements

- Python 3.11 (required)
- See `requirements.txt` for package dependencies
## Setup

### Check your Python version

```bash
python3.11 --version
```

If Python 3.11 is not installed:

- macOS (Homebrew): `brew install python@3.11`
- Ubuntu/Debian: `sudo apt install python3.11 python3.11-venv`
- Windows: Download from python.org
### Create a virtual environment

From the project root directory:

```bash
python3.11 -m venv .venv311
```

This creates a virtual environment named `.venv311` using Python 3.11.
### Activate the environment

macOS / Linux / WSL (bash/zsh):

```bash
source .venv311/bin/activate
```

Windows (Command Prompt):

```bat
.venv311\Scripts\activate.bat
```

Windows (PowerShell):

```powershell
.venv311\Scripts\Activate.ps1
```

When activated, your prompt will show `(.venv311)` at the start.
### Install dependencies

```bash
pip install -r requirements.txt
```

To leave the virtual environment later, run `deactivate`.

## Run the web app

From the project root (after Setup):

```bash
python app.py
```

Open http://127.0.0.1:5001 in your browser. Allow camera access to see the webcam feed, emotion probability bars, and the meme result.
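The "meme result" step amounts to mapping the detected emotion to an image. A minimal sketch of that lookup is below; the filenames are hypothetical stand-ins, since the real names depend on the images shipped in `monkey_memes/`.

```python
import os

# Hypothetical emotion-to-meme mapping. The actual filenames depend on
# the images present in monkey_memes/ — these are placeholders.
MEME_DIR = "monkey_memes"
MEME_BY_EMOTION = {
    "Happiness": "happy.jpg",
    "Sadness": "sad.jpg",
    "Anger": "angry.jpg",
    "Surprise": "surprised.jpg",
    "Fear": "scared.jpg",
    "Disgust": "disgusted.jpg",
    "Neutral": "neutral.jpg",
}

def meme_path(emotion):
    """Return the meme image path for an emotion, defaulting to Neutral."""
    filename = MEME_BY_EMOTION.get(emotion, MEME_BY_EMOTION["Neutral"])
    return os.path.join(MEME_DIR, filename)
```

The default-to-Neutral fallback keeps the UI from breaking if the classifier emits an unexpected label.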
## Run the CLI webcam app

A standalone webcam CLI is available in the archive:

```bash
python archive/main.py
```

Press `q` to quit.
## Project structure

```
expression-meme-detector/
├── app.py                       # Web app (Flask)
├── static/
│   ├── index.html               # Web UI (webcam, emotion bars, meme)
│   ├── index_v2.html            # Alternate UI
│   └── index_v3.html            # Alternate UI
├── archive/
│   └── main.py                  # CLI webcam app
├── hand_gesture_classifier.py   # Hand gesture logic
├── FER_models/
│   ├── models/                  # Pre-trained emotion models (.onnx)
│   └── archive_model_files/     # Archived model copies by architecture
├── CSCI218_FT02_NOTEBOOK/       # Training notebooks (Custom CNN, ResNet18, MobileNetV2, EfficientNet-B0)
├── monkey_memes/                # Meme images (if present)
├── requirements.txt             # Dependencies (opencv, mediapipe, onnxruntime, flask)
└── README.md
```
Training notebooks and dataset details are documented in `CSCI218_FT02_NOTEBOOK/README.md`.
## Troubleshooting (macOS)

If macOS blocks access to files in the project or virtual environment:

- Grant Full Disk Access to Terminal/Cursor in System Settings → Privacy & Security
- Or remove the quarantine attribute:

  ```bash
  xattr -rd com.apple.quarantine .venv311
  ```
The app uses ONNX Runtime for emotion inference. It is installed along with the other dependencies via `pip install -r requirements.txt`.
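For reference, a single inference with ONNX Runtime might look like the sketch below. This is a hedged illustration, not the app's actual code: the model filename, the 48×48 grayscale input shape, the 1×1×48×48 NCHW layout, and the single-output assumption all depend on the specific `.onnx` files in `FER_models/models/` and should be checked against the model's metadata.

```python
def preprocess(gray_face):
    """Scale a grayscale crop (rows of 0-255 ints) to floats in [0, 1].

    A pure-Python stand-in for the numpy preprocessing the real app
    would perform before inference.
    """
    return [[px / 255.0 for px in row] for row in gray_face]

def predict(model_path, gray_face):
    """Run one emotion inference with ONNX Runtime.

    Assumes a single 1x1x48x48 float32 input and a single logits
    output — verify both against the model before relying on this.
    Imports are deferred so the helper above works without
    onnxruntime/numpy installed.
    """
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession(model_path)
    x = np.asarray(preprocess(gray_face), dtype=np.float32)[None, None, :, :]
    input_name = session.get_inputs()[0].name
    (logits,) = session.run(None, {input_name: x})  # assumes one output
    return logits[0]
```

In the web app, the returned logits would then be turned into probabilities and a label for display.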