
SANDAG Validation Tool


The SANDAG Validation Tool is a Python Dash web application for validating travel model outputs against observed traffic and transit counts. It allows transportation planners and analysts to visualize model performance across multiple scenarios with interactive plots, statistics, and geospatial maps. The app runs locally and is deployed to Azure through automated GitHub Actions workflows.

🔗 Live App: validation-tool

📦 Tech Stack: Python · Dash · Plotly · Dash Leaflet · Azure Web App · Databricks · GitHub Actions


✨ Features:

Scenario Comparison: Select and compare model scenarios for different time periods and vehicle classes.

Volume & VMT Validation: Compare modeled vs. observed flows and VMT by geography, volume category, and road class.

Truck Validation: Analyze light-, medium-, and heavy-duty trucks with year-based filters.

Transit Validation: Evaluate model performance for transit boardings across modes and TODs.

Interactive Maps: Visualize segment-level or route-level performance gaps on styled Leaflet maps.

Automated CI/CD: Changes pushed to dev or main trigger deployment to the Azure dev or production slot, respectively.


📁 File structure:

.
├── .github
│   └── workflows
│       ├── azure_dev_validation-tool.yml
│       └── main_validation-tool.yml
├── .gitignore
├── .python-version
├── README.md
├── app.py
├── config.yaml
├── load_data.py
├── pyproject.toml
├── requirements.txt
└── validation_plot_generator.py
  • app.py: main script defining the layout of the Dash app, including the page layout, scenario selector, menu, page switching, and callbacks.
  • load_data.py: reads data from Databricks or the T drive depending on the environment.
  • validation_plot_generator.py: a set of functions for generating graphs, maps, and layouts.
  • pyproject.toml: project metadata and dependencies (managed by uv).
  • requirements.txt: legacy requirements file (kept for compatibility).
  • config.yaml: configuration file for local use; defines the scenarios the app loads.
  • GitHub workflows: automatically push changes to the Azure web service and redeploy (main branch to the production slot, dev branch to the dev slot).
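The two workflow files typically follow the standard azure/webapps-deploy pattern. A minimal sketch of what the dev-branch workflow might look like (the app name, slot name, and secret name are illustrative placeholders, not taken from this repo):

```yaml
# Sketch of a dev-slot deployment workflow (names are placeholders)
name: Deploy validation tool to Azure dev slot

on:
  push:
    branches:
      - dev

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: azure/webapps-deploy@v3
        with:
          app-name: validation-tool    # placeholder Azure Web App name
          slot-name: dev               # the main-branch workflow targets production
          publish-profile: ${{ secrets.AZURE_PUBLISH_PROFILE }}  # placeholder secret
```

The main-branch workflow would differ only in the trigger branch and the target slot.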

🛠️ Data Pipeline

[Data pipeline diagram]


🔁 Development Process

  1. Clone the repo locally.
  2. On the dev branch, edit the scripts and review changes by running the app locally: python app.py
  3. After checking, push the changes to the remote dev branch.
  4. The GitHub Actions workflow automatically redeploys the Dash app to the dev slot of the Azure web service. Test the updates by reviewing the site.
  5. After testing, merge the changes from the dev branch into the main branch. The update to main triggers the workflow that redeploys the Dash app to the production slot.

⚙️ Local Setup:

  1. Make sure you have access to the T drive. Connect to the VPN if needed.

  2. Install uv if not already installed:

    # On Windows
    powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
    
    # On macOS/Linux
    curl -LsSf https://astral.sh/uv/install.sh | sh
  3. Install dependencies using uv:

    # Option 1: Install from pyproject.toml (recommended)
    uv pip install .
    
    # Option 2: Install from requirements.txt
    uv pip install -r requirements.txt
  4. Set up the scenarios that you want to load in the app via config.yaml:

    LOCAL_FLAG: 1
    LOCAL_SCENARIO_LIST:
      - T:\***

Define LOCAL_SCENARIO_LIST as the data paths of all scenarios that you want to compare on the visualization board.
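Each entry in LOCAL_SCENARIO_LIST must be a reachable T-drive path. A hedged sketch of a pre-flight check you could run before loading (check_scenario_paths is illustrative, not a function from this repo):

```python
from pathlib import Path

def check_scenario_paths(scenario_paths):
    """Return only the scenario paths that exist on disk.

    Warns about missing paths, which usually means the T drive
    is not mounted (connect to the VPN first).
    """
    valid = []
    for raw in scenario_paths:
        path = Path(raw)
        if path.exists():
            valid.append(path)
        else:
            print(f"Warning: scenario path not found: {raw}")
    return valid
```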

  5. Launch the app: activate the virtual environment and run the app:

    # On Windows
    .venv\Scripts\activate
    
    # On macOS/Linux
    source .venv/bin/activate
    
    # Run the app
    python app.py

    Preview the dashboard at http://127.0.0.1:8050/

  6. Press Ctrl+C to stop.


☁️ Azure Deployment:

  • Set up environment variables (the token is used to read data from Databricks):

    DATABRICKS_SERVER_HOSTNAME = ***

    DATABRICKS_HTTP_PATH = ***

    DATABRICKS_TOKEN = your_token

    SCM_DO_BUILD_DURING_DEPLOYMENT=true (required!)

  • Set the startup command under Configuration:

    gunicorn --workers 4 app:server


  • Define the scenarios that you want to compare in the environment variables:

    AZURE_SCENARIO_LIST=1150,272,254

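Putting the two modes together, load_data.py presumably branches on these settings to pick the scenario list. A sketch of that selection logic (the function name and exact config keys are assumptions based on this README):

```python
import os

def get_scenario_list(config):
    """Pick scenarios from config.yaml locally, or from the
    AZURE_SCENARIO_LIST environment variable when deployed on Azure."""
    if config.get("LOCAL_FLAG") == 1:
        # Local mode: T-drive paths listed in config.yaml
        return config.get("LOCAL_SCENARIO_LIST", [])
    # Azure mode: comma-separated scenario IDs, e.g. "1150,272,254"
    raw = os.environ.get("AZURE_SCENARIO_LIST", "")
    return [item.strip() for item in raw.split(",") if item.strip()]
```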
