This project showcases a Lambda Architecture using FinDrum for ingesting, processing, storing, and visualizing financial data. It combines real-time analytics via Kafka and PostgreSQL with batch processing through scheduled FinDrum pipelines, offering a complete and modular environment for technical stock analysis.
- Kafka ingests stock prices via the `stock_prices` topic.
- A Kafka-based trigger in FinDrum listens for new messages.
- Upon message arrival, the results are saved in a PostgreSQL table (`realtime_stock`).
- Grafana queries PostgreSQL for real-time metrics like volume, open/close prices, etc.
- Kafka Connect pushes all incoming events to MinIO, preserving them for later analysis.
- A scheduled FinDrum pipeline runs hourly:
- Reads historical data from MinIO.
- Aggregates prices (average, volume, etc.).
- Saves the summary into a PostgreSQL table (`hourly_stock_summary`).
This batch layer is ideal for daily reports or technical indicators like moving averages.
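As a rough illustration of what this hourly batch step could look like outside FinDrum, the sketch below reads raw events from MinIO with boto3, aggregates them with pandas, and appends the result to `hourly_stock_summary`. The bucket name, object prefix, JSON-lines object format, column names (`symbol`, `price`, `volume`, `timestamp`), and database credentials are assumptions for the sketch, not taken from the project's pipeline definitions.

```python
import json

import boto3
import pandas as pd
from sqlalchemy import create_engine

# MinIO is S3-compatible, so boto3 works against the local endpoint.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",
    aws_access_key_id="minioadmin",
    aws_secret_access_key="minioadmin",
)

# Assumed bucket/prefix written by the Kafka Connect sink; adjust to your connector config.
BUCKET = "stock-data"
PREFIX = "topics/stock_prices/"

records = []
for obj in s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX).get("Contents", []):
    body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read().decode("utf-8")
    # Assumes the sink writes one JSON event per line.
    records.extend(json.loads(line) for line in body.splitlines() if line.strip())

df = pd.DataFrame(records)
df["timestamp"] = pd.to_datetime(df["timestamp"])

# Hourly aggregation: average price and total volume per symbol.
summary = (
    df.set_index("timestamp")
    .groupby([pd.Grouper(freq="1h"), "symbol"])
    .agg(avg_price=("price", "mean"), total_volume=("volume", "sum"))
    .reset_index()
)

# Persist the summary for Grafana to query; connection details are placeholders.
engine = create_engine("postgresql+psycopg2://postgres:postgres@localhost:5432/postgres")
summary.to_sql("hourly_stock_summary", engine, if_exists="append", index=False)
```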
- Grafana connects to PostgreSQL to visualize both real-time and aggregated metrics.
- Dashboards include:
- Minute-by-minute close prices.
- Volume trends by stock symbol.
- Historical summaries per hour.
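Under the hood these panels are plain PostgreSQL queries. As a sketch of the kind of SQL a minute-by-minute close-price panel might use (the column names `symbol`, `close_price`, and `event_time` are assumptions about the `realtime_stock` schema), the snippet below runs such a query with psycopg2 so you can check the data outside Grafana:

```python
import psycopg2

# Placeholder connection details; match them to the docker-compose PostgreSQL service.
conn = psycopg2.connect(host="localhost", port=5432, dbname="postgres",
                        user="postgres", password="postgres")

# Last close price observed within each minute, per symbol (assumed schema).
QUERY = """
    SELECT DISTINCT ON (date_trunc('minute', event_time), symbol)
           date_trunc('minute', event_time) AS minute,
           symbol,
           close_price
    FROM realtime_stock
    ORDER BY date_trunc('minute', event_time), symbol, event_time DESC;
"""

with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for minute, symbol, close_price in cur.fetchall():
        print(minute, symbol, close_price)

conn.close()
```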
| Service | Port(s) | Purpose |
|---|---|---|
| Zookeeper | 2181 | Kafka coordination |
| Kafka | 9092 | Event streaming |
| MinIO | 9000, 9001 | S3-compatible storage |
| Kafka Connect | 8083 | Sink connector to MinIO |
| PostgreSQL | 5432 | Structured data persistence |
| Grafana | 3000 | Real-time dashboard |
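To confirm the stack is actually up before running the pipelines, a quick reachability check over the ports listed above can help. This is just a convenience sketch (everything is assumed to run on localhost), not part of the project:

```python
import socket

# Service ports from the table above.
SERVICES = {
    "Zookeeper": 2181,
    "Kafka": 9092,
    "MinIO API": 9000,
    "MinIO Console": 9001,
    "Kafka Connect": 8083,
    "PostgreSQL": 5432,
    "Grafana": 3000,
}

for name, port in SERVICES.items():
    try:
        with socket.create_connection(("localhost", port), timeout=2):
            print(f"{name:15} port {port}: reachable")
    except OSError:
        print(f"{name:15} port {port}: NOT reachable")
```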
- Stock events are sent to Kafka (`stock_prices` topic).
- Kafka Connect stores events in MinIO.
- A real-time processing pipeline:
- Consumes messages.
- Saves results in PostgreSQL.
- Grafana connects to PostgreSQL and visualizes the live data.
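To feed this flow with test data, a minimal producer sketch is shown below. It assumes a simple JSON event shape (`symbol`, `open`, `close`, `volume`, `timestamp`); the actual producer and schema used by the project may differ.

```python
import json
import time

from kafka import KafkaProducer  # pip install kafka-python

# Broker address matches the docker-compose Kafka service.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Hypothetical event shape; adjust to whatever schema the pipelines expect.
event = {
    "symbol": "AAPL",
    "open": 189.2,
    "close": 189.5,
    "volume": 12500,
    "timestamp": time.time(),
}

producer.send("stock_prices", event)
producer.flush()  # make sure the message actually leaves the client buffer
```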
```bash
git clone https://github.com/FinDrum/Lambda-Architecture.git
cd Lambda-Architecture/infrastructure
docker-compose up --build
```

Activate your Python environment and install the requirements:

```bash
pip install -r requirements.txt
```

Then run the pipelines:
```python
from findrum import Platform

platform = Platform("./config.yaml")
platform.register_pipeline("./pipelines/realtime_pipeline.yaml")
platform.register_pipeline("./pipelines/batch_pipeline.yaml")
platform.start()
```

- MinIO Console: http://localhost:9001 (user: `minioadmin`, pass: `minioadmin`)
- Grafana: http://localhost:3000 (user: `admin`, pass: `admin`)
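Once both pipelines are running, a quick sanity check that rows are landing in the two tables described above (connection details are placeholders) is a row count per table:

```python
import psycopg2

# Placeholder credentials; use the values from the docker-compose PostgreSQL service.
conn = psycopg2.connect(host="localhost", port=5432, dbname="postgres",
                        user="postgres", password="postgres")

with conn, conn.cursor() as cur:
    for table in ("realtime_stock", "hourly_stock_summary"):
        cur.execute(f"SELECT count(*) FROM {table}")
        print(table, cur.fetchone()[0])

conn.close()
```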
- Docker & Docker Compose
- Python 3.12+ (if running the pipeline manually)
- Automate batch jobs using Spark over MinIO data.
- Integrate Prometheus for extended monitoring.
- Add anomaly detection pipelines using ML.
© 2025 – FinDrum

