SQL Server to PostgreSQL Migration Tool

Automated migration tool for converting SQL Server databases to PostgreSQL using hybrid rule-based and AI-assisted conversion. The tool connects directly to a live SQL Server instance to extract the database schema (or works from a DACPAC file), then converts schema and code objects to PostgreSQL-compatible SQL.

📖 For complete workflow, see MIGRATION_WORKFLOW.md

✨ Key Features

  • 🔌 Direct SQL Server Connection: Connects to live SQL Server instances using SMO to extract database schema
  • 📦 DACPAC Support: Alternative extraction from .dacpac or .bacpac files
  • 🤖 Hybrid Migration: Rule-based schema migration (fast, deterministic) + AI-powered code conversion (3-stage pipeline)
  • 📊 Organized Outputs: 10 numbered schema deployment files + individual code objects for version control
  • ⚡ Parallel Processing: Multi-threaded execution for faster migrations
  • 🧩 Extension Detection: Automatically identifies required PostgreSQL extensions (pgcrypto, uuid-ossp, ltree, postgis, etc.)
  • ✅ Production Ready: Proper dependency ordering, UTF-8 encoding, PostgreSQL best practices

📋 Prerequisites

  • PowerShell 7+ (for automation scripts)
  • Python 3.8+ (for AI pipeline)
  • Azure OpenAI access (for code migration)
  • SQL Server (for direct extraction) OR .dacpac file
  • VS Code + GitHub Copilot (optional, for interactive mode)

🚀 Quick Start

1️⃣ Configure Environment

Copy .env.example to .env and configure:

# Azure OpenAI Settings (required for code migration)
DRAFT_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
DRAFT_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_DRAFT=gpt-4o

# SQL Server Connection (for direct extraction)
SQL_INSTANCE=localhost
SQL_SERVER_DATABASE=AdventureWorks2016
SQL_SERVER_USERNAME=sa
SQL_SERVER_PASSWORD=YourPassword
SQL_SERVER_USE_WINDOWS_AUTH=false  # Set to true for Windows Auth

2️⃣ Extract SQL Server Objects

Option A: Direct SQL Server Connection (Recommended)

Connects to a live SQL Server instance and extracts all database objects:

.\scripts\extract_sqlserver_objects.ps1 `
    -Server "localhost" `
    -Database "AdventureWorks2016" `
    -OutputDir "Migrations\AdventureWorks2016\Input\AdventureWorks2016"

Or use credentials from .env:

.\scripts\extract_database.ps1

Option B: From DACPAC File

Extract from a pre-exported .dacpac file:

.\scripts\extract_dacpac_objects.ps1 -Package "path\to\database.dacpac"

⚠️ Important: Let the extraction complete fully. Do NOT run monitoring commands while it's running.

Output Structure:

Migrations/DatabaseName/Input/DatabaseName/
├── Tables/
│   ├── Tables/           # Table definitions
│   ├── Constraints/      # Constraints (PK, FK, Check, Default)
│   └── Indexes/          # Index definitions
└── Programmability/
    ├── Views/            # View definitions
    ├── Functions/        # User-defined functions
    ├── StoredProcedures/ # Stored procedures
    └── Triggers/         # Triggers

3️⃣ Migrate Schema Objects (Automated)

Converts tables, constraints, indexes, and sequences using a hybrid rule-based + AI approach:

.\scripts\migrate_schema_objects.ps1 `
    -InputDir "Migrations\AdventureWorks2016\Input\AdventureWorks2016" `
    -OutputDir "Migrations\AdventureWorks2016\Output\schema" `
    -MaxParallel 4

Output: 10 numbered deployment files in proper dependency order:

  • 01_extensions.sql - Required PostgreSQL extensions
  • 02_schemas.sql - Schema definitions
  • 03_sequences.sql - Identity sequences
  • 04_tables.sql - Table definitions
  • 05_primary_keys.sql - Primary key constraints
  • 06_unique_constraints.sql - Unique constraints
  • 07_check_constraints.sql - Check constraints
  • 08_default_constraints.sql - Default values
  • 09_foreign_keys.sql - Foreign key relationships
  • 10_indexes.sql - Indexes
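
The sequence and table files pair up: a SQL Server IDENTITY column becomes an explicit sequence plus a column default. The snippet below is a hand-written sketch of that pattern (object and sequence names are hypothetical, not actual tool output):

-- Sketch of the IDENTITY-to-sequence pattern split across
-- 03_sequences.sql and 04_tables.sql. Names are hypothetical.

-- 03_sequences.sql
CREATE SEQUENCE sales.salesorderheader_salesorderid_seq;

-- 04_tables.sql
CREATE TABLE sales.salesorderheader (
    salesorderid integer NOT NULL
        DEFAULT nextval('sales.salesorderheader_salesorderid_seq'),
    orderdate    timestamp NOT NULL
);

-- Tie the sequence's lifetime to the column that uses it
ALTER SEQUENCE sales.salesorderheader_salesorderid_seq
    OWNED BY sales.salesorderheader.salesorderid;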

4️⃣ Migrate Code Objects (AI-Powered)

Converts views, functions, stored procedures, and triggers using a 3-stage AI pipeline:

.\scripts\migrate_code_objects.ps1 `
    -InputDir "Migrations\AdventureWorks2016\Input\AdventureWorks2016" `
    -OutputDir "Migrations\AdventureWorks2016\Output\AdventureWorks2016" `
    -MaxParallel 4

⚠️ Important: Start the command and let it run uninterrupted. Do NOT check progress during execution.

AI Pipeline Stages:

  1. Draft - Initial T-SQL → PostgreSQL conversion
  2. Refine - Improves accuracy by comparing with original
  3. Verify - Validates correctness and syntax
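
For intuition, here is the kind of transformation the pipeline performs on a trivial scalar function (an illustrative input/output pair, not captured tool output):

-- T-SQL source:
--   CREATE FUNCTION dbo.ufnLineTotal (@Price money, @Qty int)
--   RETURNS money AS
--   BEGIN
--       RETURN @Price * @Qty;
--   END
-- A plausible PostgreSQL conversion:
CREATE OR REPLACE FUNCTION dbo.ufnlinetotal(price numeric, qty integer)
RETURNS numeric
LANGUAGE sql IMMUTABLE
AS $$ SELECT price * qty; $$;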

Output:

  • 00_code_extensions.sql - Extensions required by code objects
  • 11_views.sql - All views (consolidated)
  • 12_functions.sql - All functions (consolidated)
  • 13_procedures.sql - All stored procedures (consolidated)
  • 14_triggers.sql - All triggers (consolidated)
  • Programmability/Views/*.sql - Individual view files for version control
  • Programmability/Functions/*.sql - Individual function files
  • Programmability/StoredProcedures/*.sql - Individual procedure files
  • Programmability/Triggers/*.sql - Individual trigger files

5️⃣ Deploy to PostgreSQL

Deploy the generated files in order:

# 1. Deploy schema (order matters!)
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/01_extensions.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/02_schemas.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/03_sequences.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/04_tables.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/05_primary_keys.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/06_unique_constraints.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/07_check_constraints.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/08_default_constraints.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/09_foreign_keys.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/10_indexes.sql

# 2. Deploy code objects
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/00_code_extensions.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/11_views.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/12_functions.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/13_procedures.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/14_triggers.sql

Or use a simple loop:

# Deploy all schema files
for f in Migrations/AdventureWorks2016/Output/schema/*.sql; do
  psql -d your_database -f "$f"
done

# Deploy all code files
for f in Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/*.sql; do
  psql -d your_database -f "$f"
done

6️⃣ Validate (Optional)

Check for any conversion issues:

.\scripts\validate_migrated_objects.ps1 `
    -OutputDir "Migrations\AdventureWorks2016\Output\AdventureWorks2016"

Report saved to: Output/AdventureWorks2016/code_validation.json
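
Independent of the report, a quick object count against the target database confirms the deployment landed (plain PostgreSQL catalog query, not part of the tool):

SELECT table_schema, count(*) AS tables
FROM information_schema.tables
WHERE table_type = 'BASE TABLE'
  AND table_schema NOT IN ('pg_catalog', 'information_schema')
GROUP BY table_schema
ORDER BY table_schema;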

🏗️ Architecture

Migration Workflow

SQL Server Instance → Extract → Schema Migration → Code Migration → PostgreSQL
       ↓                 ↓            ↓                  ↓              ↓
  [Live Database]    [Input/]    [01-10.sql]        [11-14.sql]   [Deploy]
   or [DACPAC]

Schema Migration (Hybrid Approach)

  • 90% rule-based for predictable, fast conversion
  • 10% AI assistance for complex edge cases
  • Handles: data types, constraints, indexes, sequences, user-defined types
  • Deterministic and repeatable
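
Typical rule-based type mappings look like the following (a representative sketch; the tool's actual mapping table may differ in detail):

-- T-SQL source:
--   CREATE TABLE dbo.Product (Name NVARCHAR(50), Price MONEY,
--                             Active BIT, Modified DATETIME2);
-- Rule-based PostgreSQL output:
CREATE TABLE dbo.product (
    name     varchar(50),    -- NVARCHAR(50)
    price    numeric(19,4),  -- MONEY
    active   boolean,        -- BIT
    modified timestamp       -- DATETIME2
);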

Code Migration (AI-Powered)

  • 3-stage pipeline (Draft → Refine → Verify)
  • Uses Azure OpenAI for intelligent conversion
  • Handles complex T-SQL logic patterns

Intelligent Pattern Handling:

  • MERGE statements → INSERT ... ON CONFLICT (see the sketch after this list)
  • Cursors → FOR loops or set-based operations
  • SQL Server functions → PostgreSQL equivalents (GETDATE() → CURRENT_TIMESTAMP)
  • Window functions → PostgreSQL window function syntax
  • Temp tables (#temp) → TEMPORARY TABLE
  • Table variables → CTEs or temp tables
  • TRY/CATCH → BEGIN ... EXCEPTION
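
For example, a single-row upsert translates roughly as follows (illustrative only; real MERGE statements with multiple WHEN branches need more care, and ON CONFLICT requires a unique constraint on the conflict column):

-- T-SQL:
--   MERGE INTO dbo.Prices AS t
--   USING (VALUES (1, 9.99)) AS s(Id, Price) ON t.Id = s.Id
--   WHEN MATCHED THEN UPDATE SET Price = s.Price
--   WHEN NOT MATCHED THEN INSERT (Id, Price) VALUES (s.Id, s.Price);
-- PostgreSQL:
INSERT INTO dbo.prices (id, price)
VALUES (1, 9.99)
ON CONFLICT (id) DO UPDATE
SET price = EXCLUDED.price;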

PostgreSQL Extension Detection

Automatically detects which PostgreSQL extensions the migrated objects require:

| Extension | Purpose | SQL Server Equivalent |
|-----------|---------|------------------------|
| uuid-ossp | UUID generation | NEWID(), NEWSEQUENTIALID() |
| pgcrypto | Cryptographic functions | HASHBYTES(), encryption functions |
| ltree | Hierarchical data | hierarchyid |
| postgis | Spatial types | geometry, geography |
| pg_trgm | Text similarity | Full-text search functions |
| tablefunc | Crosstab/pivot | PIVOT, UNPIVOT |
| hstore | Key-value storage | Property bags, JSON |
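
The detector's findings end up as ordinary CREATE EXTENSION statements; a file like 01_extensions.sql might contain lines such as these (exact contents depend on what was detected):

CREATE EXTENSION IF NOT EXISTS "uuid-ossp";  -- NEWID() / NEWSEQUENTIALID() usage
CREATE EXTENSION IF NOT EXISTS pgcrypto;     -- HASHBYTES() usage
CREATE EXTENSION IF NOT EXISTS ltree;        -- hierarchyid columns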

📁 Directory Structure

Migrations/
└── DatabaseName/
    ├── Input/                          # Extracted SQL Server DDL
    │   └── DatabaseName/
    │       ├── Tables/
    │       │   ├── Tables/             # Table definitions
    │       │   ├── Constraints/        # Constraints (PK, FK, etc.)
    │       │   └── Indexes/            # Index definitions
    │       └── Programmability/
    │           ├── Views/              # View definitions
    │           ├── Functions/          # User-defined functions
    │           ├── StoredProcedures/   # Stored procedures
    │           └── Triggers/           # Trigger definitions
    └── Output/
        ├── schema/                     # Schema migration output
        │   ├── 01_extensions.sql
        │   ├── 02_schemas.sql
        │   ├── 03_sequences.sql
        │   ├── 04_tables.sql
        │   ├── 05_primary_keys.sql
        │   ├── 06_unique_constraints.sql
        │   ├── 07_check_constraints.sql
        │   ├── 08_default_constraints.sql
        │   ├── 09_foreign_keys.sql
        │   └── 10_indexes.sql
        └── DatabaseName/               # Code migration output
            └── Programmability/
                ├── 00_code_extensions.sql
                ├── 11_views.sql
                ├── 12_functions.sql
                ├── 13_procedures.sql
                ├── 14_triggers.sql
                ├── Views/              # Individual files
                ├── Functions/          # Individual files
                ├── StoredProcedures/   # Individual files
                └── Triggers/           # Individual files

⚙️ Configuration

Azure OpenAI Setup

Configure different models for each AI stage:

# Draft Stage (initial conversion) - uses GPT-4o
DRAFT_PROVIDER=azure
DRAFT_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
DRAFT_AZURE_OPENAI_API_VERSION=2025-01-01-preview
DRAFT_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_DRAFT=gpt-4o

# Refine Stage (accuracy improvement) - uses GPT-4o
REFINE_PROVIDER=azure
REFINE_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
REFINE_AZURE_OPENAI_API_VERSION=2025-01-01-preview
REFINE_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_REFINE=gpt-4o

# Verify Stage (validation) - can use cheaper model
VERIFY_PROVIDER=azure
VERIFY_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
VERIFY_AZURE_OPENAI_API_VERSION=2025-01-01-preview
VERIFY_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_VERIFY=gpt-4o-mini

Entra ID (Azure AD) Authentication

Leave *_AZURE_OPENAI_KEY blank to use Azure AD authentication:

DRAFT_AZURE_OPENAI_KEY=
# Requires: az login, managed identity, or other DefaultAzureCredential method

SQL Server Connection Options

Windows Authentication:

SQL_INSTANCE=localhost
SQL_SERVER_DATABASE=AdventureWorks2016
SQL_SERVER_USE_WINDOWS_AUTH=true

SQL Server Authentication:

SQL_INSTANCE=localhost
SQL_SERVER_DATABASE=AdventureWorks2016
SQL_SERVER_USERNAME=sa
SQL_SERVER_PASSWORD=YourPassword
SQL_SERVER_USE_WINDOWS_AUTH=false

Connection String (Alternative):

SQL_SERVER_CONNECTION_STRING=Server=localhost;Database=AdventureWorks2016;Integrated Security=True;TrustServerCertificate=True

🎯 Common Scenarios

Scenario 1: Full Database Migration

# 1. Extract from SQL Server
.\scripts\extract_sqlserver_objects.ps1 -Server "localhost" -Database "MyDB"

# 2. Migrate schema
.\scripts\migrate_schema_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\schema"

# 3. Migrate code
.\scripts\migrate_code_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\MyDB"

Scenario 2: Schema-Only Migration

.\scripts\extract_sqlserver_objects.ps1 -Server "localhost" -Database "MyDB"
.\scripts\migrate_schema_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\schema"

Scenario 3: Code-Only Migration

.\scripts\extract_sqlserver_objects.ps1 -Server "localhost" -Database "MyDB"
.\scripts\migrate_code_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\MyDB"

Scenario 4: Using Stored Credentials

# Configure .env file with connection details
# Then run without parameters:
.\scripts\extract_database.ps1

🔧 Troubleshooting

SQL Server Connection Issues

Problem: "Unable to connect to SQL Server"

Solutions:

  • Verify SQL Server is running: sqlcmd -S localhost -Q "SELECT @@VERSION"
  • Check firewall settings (default port 1433)
  • Ensure SQL Server authentication is enabled (mixed mode)
  • For Windows Auth, ensure your Windows user has access
  • Test connection with SSMS first
  • Check connection string format

Common Error Messages:

Login failed for user 'sa'
→ Check username/password in .env

Named Pipes Provider: Could not open a connection
→ Verify SQL Server service is running

A network-related or instance-specific error
→ Check server name and firewall

Azure OpenAI Errors

Problem: "API key invalid" or "Deployment not found"

Solutions:

  • Verify endpoint URL in .env (should end with .openai.azure.com)
  • Check API key is correct and not expired
  • Ensure deployment name exactly matches your Azure OpenAI resource
  • Verify API version is supported (2025-01-01-preview recommended)
  • Check Azure subscription has available quota

Check your configuration:

# Test Azure OpenAI connection (replace placeholders; the endpoint expects a POST)
curl -X POST "YOUR_ENDPOINT/openai/deployments/YOUR_DEPLOYMENT/chat/completions?api-version=2025-01-01-preview" \
  -H "api-key: YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"ping"}]}'

Script Execution Issues

Problem: Script hangs or produces no output

Solutions:

  • Do NOT monitor progress while the script is running
  • Do NOT check terminal output during execution
  • Let scripts complete fully before checking results
  • Check output directory for log files
  • Increase -MaxParallel parameter if too slow
  • Decrease -MaxParallel if experiencing API throttling

Migration Quality Issues

Problem: Converted code has errors

Solutions:

  • Review validation report from validate_migrated_objects.ps1
  • Check Programmability/ individual files for specific issues
  • Verify required extensions are installed in PostgreSQL
  • Test converted SQL in PostgreSQL manually
  • Check for SQL Server-specific features that need manual conversion

⚠️ Known Limitations

This tool focuses on schema and code migration only. The following are intentionally out of scope:

| Feature | Status | Alternative |
|---------|--------|-------------|
| Data migration | ❌ Out of scope | Use pg_dump, ETL tools, or custom scripts |
| CLR assemblies | ❌ Not supported | Rewrite in PL/pgSQL or an external service |
| SQL Server Agent jobs | ❌ Not supported | Use the pg_cron extension or an external scheduler |
| Service Broker | ❌ Not supported | Use message queues (RabbitMQ, Kafka) |
| Linked servers | ❌ Not supported | Use Foreign Data Wrappers (FDW) |
| Full-text search | ⚠️ Limited | Use PostgreSQL FTS or Elasticsearch |
| Replication | ❌ Not supported | Configure PostgreSQL replication separately |
| SSRS/SSIS/SSAS | ❌ Not supported | Requires separate BI tool migration |
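
If you replace SQL Server Agent jobs with pg_cron, scheduling is a single SQL call (assumes a recent pg_cron is installed and configured on the server; the job and table names here are hypothetical):

SELECT cron.schedule(
    'nightly-cleanup',   -- job name
    '0 3 * * *',         -- run daily at 03:00
    $$DELETE FROM audit.event_log
      WHERE logged_at < now() - interval '90 days'$$
);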

🤝 Contributing

Contributions are welcome! To contribute:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Test thoroughly
  5. Commit with clear messages (git commit -m 'Add amazing feature')
  6. Push to your branch (git push origin feature/amazing-feature)
  7. Open a Pull Request

Areas for contribution:

  • Additional SQL Server → PostgreSQL conversion patterns
  • Support for more data types and functions
  • Performance improvements
  • Documentation enhancements
  • Bug fixes

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Support

For issues, questions, or feature requests, please open an issue in this repository.

🙏 Acknowledgments

  • Built with Azure OpenAI
  • Uses SQL Server SMO for extraction
  • Inspired by real-world SQL Server → PostgreSQL migration challenges
