An interactive technical presentation for an industrial-grade face mask detection system, built with YuNet face detection and MobileNetV2 classification.
🐍 Looking for the actual detection system? The Python source code (OpenCV + TensorFlow) lives at
github.com/rebeeh/Face-Mask-Detection-Python
This repository is an interactive slideshow (React/Vite SPA) that documents the architecture, data pipeline, and performance analytics of a production-grade real-time face mask detection system.
Key system metrics:
| Metric | Value |
|---|---|
| Precision | 98.2% |
| Recall | 97.5% |
| F1 Score | 0.978 |
| Inference Latency | 22ms |
| mAP@.5 | 0.962 |
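As a quick sanity check, the reported F1 score follows directly from the precision and recall figures above (a small Python sketch, not code from either repository):

```python
# F1 is the harmonic mean of precision and recall.
precision = 0.982
recall = 0.975

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))  # → 0.978, matching the table
```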
The underlying ML system uses a two-stage pipeline:
```
Camera Frame
      │
      ▼
┌─────────────────────────────┐
│ YuNet Face Detector (ONNX)  │  ~5ms / face
│   Score threshold: 0.60     │
│   NMS threshold:   0.30     │
└────────────┬────────────────┘
             │  Face ROI crops
             ▼
┌──────────────────────────────────────────┐
│ Preprocessing Pipeline                   │
│  1. BGR → RGB color conversion           │
│  2. Aspect-ratio-preserving square pad   │
│  3. Resize to 224×224 + MobileNet norm   │
└────────────┬─────────────────────────────┘
             │  Normalized tensor
             ▼
┌──────────────────────────────────────────┐
│ MobileNetV2 Classifier (TFLite FP16)     │
│   Binary: With Mask / No Mask            │
│   Quantized: 3.4MB → 1.8MB (47% savings) │
└──────────────────────────────────────────┘
```
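The preprocessing stage can be sketched in a few lines of NumPy. This is an illustration of the three listed steps, not the repository's actual code — the real pipeline presumably uses cv2.resize, so a nearest-neighbour index trick stands in here to keep the sketch dependency-light:

```python
import numpy as np

def preprocess(face_bgr: np.ndarray, size: int = 224) -> np.ndarray:
    """Turn a BGR face crop into a MobileNetV2-ready tensor."""
    # 1. BGR -> RGB: reverse the channel axis.
    rgb = face_bgr[:, :, ::-1]

    # 2. Aspect-ratio-preserving square pad: centre the crop
    #    on a black canvas whose side is the longer dimension.
    h, w, _ = rgb.shape
    side = max(h, w)
    canvas = np.zeros((side, side, 3), dtype=rgb.dtype)
    top, left = (side - h) // 2, (side - w) // 2
    canvas[top:top + h, left:left + w] = rgb

    # 3. Resize to size x size (nearest-neighbour stand-in for cv2.resize)
    #    and apply MobileNet normalization: scale [0, 255] -> [-1, 1].
    idx = np.arange(size) * side // size
    resized = canvas[idx][:, idx].astype(np.float32)
    return resized / 127.5 - 1.0

# A fake 120x80 face crop stands in for a real detector ROI.
tensor = preprocess(np.random.randint(0, 256, (120, 80, 3), dtype=np.uint8))
print(tensor.shape)  # (224, 224, 3)
```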
Threading model: Camera I/O runs in a dedicated thread guarded by a threading.Lock, decoupling frame capture from inference. Because the inference loop always reads the most recent frame rather than draining a stale buffer, the pipeline sustains 30+ FPS with near-zero perceived input lag.
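A minimal sketch of that pattern (class and variable names are hypothetical, not taken from the repo): a daemon thread continuously overwrites a single latest-frame slot under a threading.Lock, so the consumer never blocks on camera I/O and never reads a torn frame.

```python
import threading
import time

class FrameGrabber:
    """Continuously captures frames on a background thread."""

    def __init__(self, source):
        self._source = source          # any callable returning a frame
        self._lock = threading.Lock()
        self._frame = None
        self._running = True
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while self._running:
            frame = self._source()     # e.g. cap.read() in the real system
            with self._lock:
                self._frame = frame    # keep only the newest frame

    def read(self):
        with self._lock:
            return self._frame         # consumer always gets the latest

    def stop(self):
        self._running = False

# Demo with a fake camera that just counts frames.
counter = iter(range(10_000_000))
grabber = FrameGrabber(lambda: next(counter))

frame = None
for _ in range(100):                   # poll until the thread has produced
    frame = grabber.read()
    if frame is not None:
        break
    time.sleep(0.01)
grabber.stop()
print(frame is not None)  # True
```

In the real system the consumer is the detection/classification loop; here a simple poll demonstrates that reads never wait on capture.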
Training data & regimen:
- Base images: 3,833 (1,916 masked / 1,917 unmasked) — near-perfect class balance
- Augmentations: Rotation, brightness, blur, zoom
- Training: 8 epochs · Binary Cross-Entropy · Adam optimizer
- Validation accuracy at epoch 8: 97.8%
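For reference, the binary cross-entropy loss used in training is simple to write out. This is a worked numeric example, not the repo's training code:

```python
import math

def bce(y_true: float, y_pred: float, eps: float = 1e-7) -> float:
    """Binary cross-entropy for a single (label, probability) pair."""
    p = min(max(y_pred, eps), 1 - eps)   # clip to avoid log(0)
    return -(y_true * math.log(p) + (1 - y_true) * math.log(1 - p))

# A confident correct prediction is cheap; a confident wrong one is costly.
print(round(bce(1.0, 0.99), 4))   # 0.0101
print(round(bce(1.0, 0.01), 4))   # 4.6052
```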
Slideshow tech stack:
| Technology | Purpose |
|---|---|
| React 19 + TypeScript | UI framework |
| Vite 7 | Build tool & dev server |
| TailwindCSS v4 | Utility-first styling |
| Framer Motion | Slide & element animations |
| Recharts | Training analytics charts |
| Lucide React | Icon system |
Prerequisites:
- Node.js ≥ 18
- npm ≥ 9
```bash
# 1. Clone the repository
git clone https://github.com/rebeeh/Face-Mask-Detection-Overview.git
cd Face-Mask-Detection-Overview

# 2. Install dependencies
npm install

# 3. Start the dev server
npm run dev
```

Open http://localhost:5173 in your browser.

To create a production build:

```bash
npm run build
```

The output is written to ./dist/.
To preview the production build locally:

```bash
npm run preview
```

Project structure:

```
src/
├── types/
│   └── index.ts              # Shared TypeScript interfaces
├── data/
│   └── constants.ts          # Chart data & code snippet constants
├── components/
│   ├── ui/
│   │   ├── GlassCard.tsx       # Reusable frosted-glass card
│   │   ├── CodeBlock.tsx       # Syntax-highlighted code display
│   │   └── SlideContainer.tsx  # Animated slide wrapper
│   ├── layout/
│   │   ├── ProgressBar.tsx     # Top progress indicator
│   │   ├── SideNav.tsx         # Left dot-navigation
│   │   └── NavControls.tsx     # Bottom prev/next navigation
│   └── slides/
│       ├── HeroSlide.tsx
│       ├── DataSlide.tsx
│       ├── StackSlide.tsx
│       ├── PreprocessingSlide.tsx
│       ├── YuNetSlide.tsx
│       ├── AnalyticsSlide.tsx
│       ├── MobileNetSlide.tsx
│       ├── ThreadingSlide.tsx
│       ├── OptimizationSlide.tsx
│       └── ConclusionSlide.tsx
└── App.tsx                   # Root orchestrator (~90 lines)
```
Navigation controls:
| Action | Control |
|---|---|
| Next slide | → Arrow / Space |
| Previous slide | ← Arrow |
| Jump to slide | Click the left sidebar dot |
This project is automatically deployed to GitHub Pages on every push to main via the workflow at .github/workflows/deploy.yaml.
Live URL: https://rebeeh.github.io/Face-Mask-Detection-Overview/
To deploy your own fork:
- Fork this repository
- Go to Settings → Pages → Source and select GitHub Actions
- Push any change to main — the workflow handles the rest
MIT — see LICENSE for details.