Automated migration tool for converting SQL Server databases to PostgreSQL using hybrid rule-based and AI-assisted conversion. This tool connects directly to SQL Server instances to extract live database schemas, or alternatively works with DACPAC files, then intelligently converts schema and code objects to PostgreSQL-compatible SQL.
📖 For complete workflow, see MIGRATION_WORKFLOW.md
- 🔌 Direct SQL Server Connection: Connects to live SQL Server instances using SMO to extract database schema
- 📦 DACPAC Support: Alternative extraction from `.dacpac` or `.bacpac` files
- 🤖 Hybrid Migration: Rule-based schema migration (fast, deterministic) + AI-powered code conversion (3-stage pipeline)
- 📊 Organized Outputs: 10 numbered schema deployment files + individual code objects for version control
- ⚡ Parallel Processing: Multi-threaded execution for faster migrations
- 🧩 Extension Detection: Automatically identifies required PostgreSQL extensions (pgcrypto, uuid-ossp, ltree, postgis, etc.)
- ✅ Production Ready: Proper dependency ordering, UTF-8 encoding, PostgreSQL best practices
- PowerShell 7+ (for automation scripts)
- Python 3.8+ (for AI pipeline)
- Azure OpenAI access (for code migration) - Get access
- SQL Server (for direct extraction) OR a `.dacpac` file
- VS Code + GitHub Copilot (optional, for interactive mode)
Copy .env.example to .env and configure:
# Azure OpenAI Settings (required for code migration)
DRAFT_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
DRAFT_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_DRAFT=gpt-4o
# SQL Server Connection (for direct extraction)
SQL_INSTANCE=localhost
SQL_SERVER_DATABASE=AdventureWorks2016
SQL_SERVER_USERNAME=sa
SQL_SERVER_PASSWORD=YourPassword
SQL_SERVER_USE_WINDOWS_AUTH=false  # Set to true for Windows Auth

Option A: Direct SQL Server Connection (Recommended)
Connects to a live SQL Server instance and extracts all database objects:
.\scripts\extract_sqlserver_objects.ps1 `
-Server "localhost" `
-Database "AdventureWorks2016" `
-OutputDir "Migrations\AdventureWorks2016\Input\AdventureWorks2016"

Or use credentials from .env:
.\scripts\extract_database.ps1

Option B: From DACPAC File
Extract from a pre-exported .dacpac file:
.\scripts\extract_dacpac_objects.ps1 -Package "path\to\database.dacpac"
⚠️ Important: Let the extraction complete fully. Do NOT run monitoring commands while it's running.
Output Structure:
Migrations/DatabaseName/Input/DatabaseName/
├── Tables/
│ ├── Tables/ # Table definitions
│ ├── Constraints/ # Constraints (PK, FK, Check, Default)
│ └── Indexes/ # Index definitions
└── Programmability/
├── Views/ # View definitions
├── Functions/ # User-defined functions
├── StoredProcedures/ # Stored procedures
└── Triggers/ # Triggers
Converts tables, constraints, indexes, and sequences using a hybrid rule-based + AI approach:
.\scripts\migrate_schema_objects.ps1 `
-InputDir "Migrations\AdventureWorks2016\Input\AdventureWorks2016" `
-OutputDir "Migrations\AdventureWorks2016\Output\schema" `
-MaxParallel 4

Output: 10 numbered deployment files in proper dependency order:
- `01_extensions.sql` - Required PostgreSQL extensions
- `02_schemas.sql` - Schema definitions
- `03_sequences.sql` - Identity sequences
- `04_tables.sql` - Table definitions
- `05_primary_keys.sql` - Primary key constraints
- `06_unique_constraints.sql` - Unique constraints
- `07_check_constraints.sql` - Check constraints
- `08_default_constraints.sql` - Default values
- `09_foreign_keys.sql` - Foreign key relationships
- `10_indexes.sql` - Indexes
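Because the numeric prefixes encode dependency order, a deploy helper only needs to sort by filename before running each file. A minimal sketch of that idea (a hypothetical helper, not part of the tool's scripts):

```python
import subprocess
from pathlib import Path

def deploy_schema(schema_dir: str, database: str, dry_run: bool = False) -> list:
    """Run every numbered .sql file in dependency order via psql."""
    # Lexicographic sort is enough here: prefixes 01..10 are zero-padded.
    files = sorted(Path(schema_dir).glob("*.sql"))
    for f in files:
        if not dry_run:
            # ON_ERROR_STOP makes psql exit non-zero on the first failed
            # statement, so a broken file halts the deploy instead of
            # letting later files fail in confusing ways.
            subprocess.run(
                ["psql", "-v", "ON_ERROR_STOP=1", "-d", database, "-f", str(f)],
                check=True,
            )
    return [f.name for f in files]
```

Passing `-v ON_ERROR_STOP=1` is worth keeping even in a plain shell loop, since psql otherwise continues past errors.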
Converts views, functions, stored procedures, and triggers using 3-stage AI pipeline:
.\scripts\migrate_code_objects.ps1 `
-InputDir "Migrations\AdventureWorks2016\Input\AdventureWorks2016" `
-OutputDir "Migrations\AdventureWorks2016\Output\AdventureWorks2016" `
-MaxParallel 4
⚠️ Important: Start the command and let it run uninterrupted. Do NOT check progress during execution.
AI Pipeline Stages:
- Draft - Initial T-SQL → PostgreSQL conversion
- Refine - Improves accuracy by comparing with original
- Verify - Validates correctness and syntax
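Conceptually, the three stages chain model calls, with the original T-SQL kept in view for the refine pass. A minimal sketch, with a stubbed `call_model` standing in for the Azure OpenAI chat call (prompts and function names here are illustrative, not the tool's actual implementation):

```python
from typing import Callable

def migrate_object(tsql: str, call_model: Callable[[str, str], str]) -> str:
    """Three-stage conversion: draft, refine against the source, verify."""
    draft = call_model(
        "draft", f"Convert this T-SQL to PostgreSQL:\n{tsql}"
    )
    refined = call_model(
        "refine",
        f"Original T-SQL:\n{tsql}\n\nDraft conversion:\n{draft}\n"
        "Fix any semantic differences.",
    )
    verified = call_model(
        "verify",
        "Check this PostgreSQL for syntax and correctness issues and "
        f"return the corrected SQL:\n{refined}",
    )
    return verified
```

Because the verify pass only checks the already-refined output, it tolerates a cheaper model than the draft and refine passes.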
Output:
- `00_code_extensions.sql` - Extensions required by code objects
- `11_views.sql` - All views (consolidated)
- `12_functions.sql` - All functions (consolidated)
- `13_procedures.sql` - All stored procedures (consolidated)
- `14_triggers.sql` - All triggers (consolidated)
- `Programmability/Views/*.sql` - Individual view files for version control
- `Programmability/Functions/*.sql` - Individual function files
- `Programmability/StoredProcedures/*.sql` - Individual procedure files
- `Programmability/Triggers/*.sql` - Individual trigger files
Deploy the generated files in order:
# 1. Deploy schema (order matters!)
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/01_extensions.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/02_schemas.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/03_sequences.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/04_tables.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/05_primary_keys.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/06_unique_constraints.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/07_check_constraints.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/08_default_constraints.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/09_foreign_keys.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/schema/10_indexes.sql
# 2. Deploy code objects
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/00_code_extensions.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/11_views.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/12_functions.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/13_procedures.sql
psql -d your_database -f Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/14_triggers.sql

Or use a simple loop:
# Deploy all schema files
for f in Migrations/AdventureWorks2016/Output/schema/*.sql; do
psql -d your_database -f "$f"
done
# Deploy all code files
for f in Migrations/AdventureWorks2016/Output/AdventureWorks2016/Programmability/*.sql; do
psql -d your_database -f "$f"
done

Check for any conversion issues:
.\scripts\validate_migrated_objects.ps1 `
-OutputDir "Migrations\AdventureWorks2016\Output\AdventureWorks2016"

Report saved to: Output/AdventureWorks2016/code_validation.json
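The exact schema of `code_validation.json` isn't documented in this README. Assuming it carries a list of per-object entries with a status field, a quick triage script might look like this (the `"objects"` and `"status"` keys are assumptions to adjust against the real report):

```python
import json
from collections import Counter

def summarize_validation(report: dict) -> Counter:
    """Tally per-object statuses from a validation report.

    Assumes a top-level "objects" list whose entries each have a
    "status" key -- adjust the keys to match the actual report layout.
    """
    return Counter(obj.get("status", "unknown") for obj in report.get("objects", []))

if __name__ == "__main__":
    with open("code_validation.json") as fh:
        print(summarize_validation(json.load(fh)))
```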
SQL Server Instance → Extract → Schema Migration → Code Migration → PostgreSQL
↓ ↓ ↓ ↓ ↓
[Live Database] [Input/] [01-10.sql] [11-14.sql] [Deploy]
or [DACPAC]
- 90% rule-based for predictable, fast conversion
- 10% AI assistance for complex edge cases
- Handles: data types, constraints, indexes, sequences, user-defined types
- Deterministic and repeatable
- 3-stage pipeline (Draft → Refine → Verify)
- Uses Azure OpenAI for intelligent conversion
- Handles complex T-SQL logic patterns
Intelligent Pattern Handling:
- `MERGE` statements → `INSERT ... ON CONFLICT`
- Cursors → `FOR` loops or set-based operations
- SQL Server functions → PostgreSQL equivalents (`GETDATE()` → `CURRENT_TIMESTAMP`)
- Window functions → PostgreSQL window function syntax
- Temp tables (`#temp`) → `TEMPORARY TABLE`
- Table variables → CTEs or temp tables
- `TRY/CATCH` → `BEGIN ... EXCEPTION`
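Several of these substitutions are mechanical, which is what makes the rule-based share of the work deterministic and repeatable. A toy illustration of the idea (the tool's real rules are necessarily more careful about context than plain regex):

```python
import re

# A few illustrative T-SQL -> PostgreSQL rewrites. Real rules must respect
# string literals, comments, and quoted identifiers, which regex alone
# does not -- this sketch only shows the shape of the technique.
RULES = [
    (re.compile(r"\bGETDATE\(\)", re.IGNORECASE), "CURRENT_TIMESTAMP"),
    (re.compile(r"\bISNULL\(", re.IGNORECASE), "COALESCE("),
    (re.compile(r"\bNEWID\(\)", re.IGNORECASE), "uuid_generate_v4()"),
    (re.compile(r"\[(\w+)\]"), r'"\1"'),  # [bracketed] -> "quoted" identifiers
]

def apply_rules(sql: str) -> str:
    """Apply each rewrite rule in order to a T-SQL fragment."""
    for pattern, replacement in RULES:
        sql = pattern.sub(replacement, sql)
    return sql
```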
Automatically detects and generates extension requirements:
| Extension | Purpose | SQL Server Equivalent |
|---|---|---|
| `uuid-ossp` | UUID generation | `NEWID()`, `NEWSEQUENTIALID()` |
| `pgcrypto` | Cryptographic functions | `HASHBYTES()`, encryption functions |
| `ltree` | Hierarchical data | `hierarchyid` |
| `postgis` | Spatial types | `geometry`, `geography` |
| `pg_trgm` | Text similarity | Full-text search functions |
| `tablefunc` | Crosstab/pivot | `PIVOT`, `UNPIVOT` |
| `hstore` | Key-value storage | Property bags, JSON |
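Detection of this kind can be as simple as scanning the extracted DDL for tell-tale identifiers and emitting the matching `CREATE EXTENSION` statements. A simplified sketch of the idea (the marker lists are a small illustrative subset, not the tool's actual tables):

```python
# Map tell-tale SQL Server identifiers to the PostgreSQL extension the
# converted code will need.
EXTENSION_MARKERS = {
    "uuid-ossp": ["NEWID(", "NEWSEQUENTIALID("],
    "pgcrypto": ["HASHBYTES(", "ENCRYPTBYKEY("],
    "ltree": ["hierarchyid"],
    "postgis": ["geometry", "geography"],
}

def detect_extensions(sql: str) -> set:
    """Return the extensions whose markers appear in the given SQL."""
    upper = sql.upper()
    return {
        ext
        for ext, markers in EXTENSION_MARKERS.items()
        if any(m.upper() in upper for m in markers)
    }

def extension_ddl(extensions: set) -> str:
    """Emit idempotent CREATE EXTENSION statements in stable order."""
    return "\n".join(
        f'CREATE EXTENSION IF NOT EXISTS "{ext}";' for ext in sorted(extensions)
    )
```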
Migrations/
└── DatabaseName/
├── Input/ # Extracted SQL Server DDL
│ └── DatabaseName/
│ ├── Tables/
│ │ ├── Tables/ # Table definitions
│ │ ├── Constraints/ # Constraints (PK, FK, etc.)
│ │ └── Indexes/ # Index definitions
│ └── Programmability/
│ ├── Views/ # View definitions
│ ├── Functions/ # User-defined functions
│ ├── StoredProcedures/ # Stored procedures
│ └── Triggers/ # Trigger definitions
└── Output/
├── schema/ # Schema migration output
│ ├── 01_extensions.sql
│ ├── 02_schemas.sql
│ ├── 03_sequences.sql
│ ├── 04_tables.sql
│ ├── 05_primary_keys.sql
│ ├── 06_unique_constraints.sql
│ ├── 07_check_constraints.sql
│ ├── 08_default_constraints.sql
│ ├── 09_foreign_keys.sql
│ └── 10_indexes.sql
└── DatabaseName/ # Code migration output
└── Programmability/
├── 00_code_extensions.sql
├── 11_views.sql
├── 12_functions.sql
├── 13_procedures.sql
├── 14_triggers.sql
├── Views/ # Individual files
├── Functions/ # Individual files
├── StoredProcedures/ # Individual files
└── Triggers/ # Individual files
Configure different models for each AI stage:
# Draft Stage (initial conversion) - uses GPT-4o
DRAFT_PROVIDER=azure
DRAFT_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
DRAFT_AZURE_OPENAI_API_VERSION=2025-01-01-preview
DRAFT_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_DRAFT=gpt-4o
# Refine Stage (accuracy improvement) - uses GPT-4o
REFINE_PROVIDER=azure
REFINE_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
REFINE_AZURE_OPENAI_API_VERSION=2025-01-01-preview
REFINE_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_REFINE=gpt-4o
# Verify Stage (validation) - can use cheaper model
VERIFY_PROVIDER=azure
VERIFY_AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
VERIFY_AZURE_OPENAI_API_VERSION=2025-01-01-preview
VERIFY_AZURE_OPENAI_KEY=your-key-here
AZURE_OPENAI_DEPLOYMENT_VERIFY=gpt-4o-mini

Leave *_AZURE_OPENAI_KEY blank to use Azure AD authentication:
DRAFT_AZURE_OPENAI_KEY=
# Requires: az login, managed identity, or other DefaultAzureCredential method

Windows Authentication:
SQL_INSTANCE=localhost
SQL_SERVER_DATABASE=AdventureWorks2016
SQL_SERVER_USE_WINDOWS_AUTH=true

SQL Server Authentication:
SQL_INSTANCE=localhost
SQL_SERVER_DATABASE=AdventureWorks2016
SQL_SERVER_USERNAME=sa
SQL_SERVER_PASSWORD=YourPassword
SQL_SERVER_USE_WINDOWS_AUTH=false

Connection String (Alternative):
SQL_SERVER_CONNECTION_STRING=Server=localhost;Database=AdventureWorks2016;Integrated Security=True;TrustServerCertificate=True

# 1. Extract from SQL Server
.\scripts\extract_sqlserver_objects.ps1 -Server "localhost" -Database "MyDB"
# 2. Migrate schema
.\scripts\migrate_schema_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\schema"
# 3. Migrate code
.\scripts\migrate_code_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\MyDB"

.\scripts\extract_sqlserver_objects.ps1 -Server "localhost" -Database "MyDB"
.\scripts\migrate_schema_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\schema"

.\scripts\extract_sqlserver_objects.ps1 -Server "localhost" -Database "MyDB"
.\scripts\migrate_code_objects.ps1 -InputDir "Migrations\MyDB\Input\MyDB" -OutputDir "Migrations\MyDB\Output\MyDB"

# Configure .env file with connection details
# Then run without parameters:
.\scripts\extract_database.ps1

Problem: "Unable to connect to SQL Server"
Solutions:
- Verify SQL Server is running: `sqlcmd -S localhost -Q "SELECT @@VERSION"`
- Check firewall settings (default port 1433)
- Ensure SQL Server authentication is enabled (mixed mode)
- For Windows Auth, ensure your Windows user has access
- Test connection with SSMS first
- Check connection string format
Common Error Messages:
Login failed for user 'sa'
→ Check username/password in .env
Named Pipes Provider: Could not open a connection
→ Verify SQL Server service is running
A network-related or instance-specific error
→ Check server name and firewall
Problem: "API key invalid" or "Deployment not found"
Solutions:
- Verify endpoint URL in `.env` (should end with `.openai.azure.com`)
- Check API key is correct and not expired
- Ensure deployment name exactly matches your Azure OpenAI resource
- Verify API version is supported (2025-01-01-preview recommended)
- Check Azure subscription has available quota
Check your configuration:
# Test Azure OpenAI connection
curl -H "api-key: YOUR_KEY" "YOUR_ENDPOINT/openai/deployments/YOUR_DEPLOYMENT/chat/completions?api-version=2025-01-01-preview"

Problem: Script hangs or produces no output
Solutions:
- Do NOT monitor progress while script is running
- Do NOT check terminal output during execution
- Let scripts complete fully before checking results
- Check output directory for log files
- Increase `-MaxParallel` parameter if too slow
- Decrease `-MaxParallel` if experiencing API throttling
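If throttling persists even at lower parallelism, wrapping each model call in exponential backoff is the usual remedy. A stdlib-only sketch (the retry count and delays are arbitrary defaults, not values the tool uses):

```python
import random
import time

def with_backoff(fn, max_attempts=5, base_delay=1.0):
    """Retry fn() on exception, doubling the delay each attempt."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the last error
            # 1s, 2s, 4s, ... plus jitter so parallel workers don't
            # retry in lockstep and re-trigger the throttle.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```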
Problem: Converted code has errors
Solutions:
- Review validation report from `validate_migrated_objects.ps1`
- Check `Programmability/` individual files for specific issues
- Verify required extensions are installed in PostgreSQL
- Test converted SQL in PostgreSQL manually
- Check for SQL Server-specific features that need manual conversion
This tool focuses on schema and code migration only. The following are intentionally out of scope:
| Feature | Status | Alternative |
|---|---|---|
| Data migration | ❌ Out of scope | Use pg_dump, ETL tools, or custom scripts |
| CLR assemblies | ❌ Not supported | Rewrite in PL/pgSQL or external service |
| SQL Server Agent jobs | ❌ Not supported | Use pg_cron extension or external scheduler |
| Service Broker | ❌ Not supported | Use message queues (RabbitMQ, Kafka) |
| Linked servers | ❌ Not supported | Use Foreign Data Wrappers (FDW) |
| Full-text search | ❌ Not supported | Use PostgreSQL FTS or Elasticsearch |
| Replication | ❌ Not supported | Configure PostgreSQL replication separately |
| SSRS/SSIS/SSAS | ❌ Not supported | Requires separate BI tool migration |
Contributions are welcome! To contribute:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Test thoroughly
- Commit with clear messages (`git commit -m 'Add amazing feature'`)
- Push to your branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Areas for contribution:
- Additional SQL Server → PostgreSQL conversion patterns
- Support for more data types and functions
- Performance improvements
- Documentation enhancements
- Bug fixes
This project is licensed under the MIT License - see the LICENSE file for details.
For issues, questions, or feature requests:
- 🐛 Bug Reports: Open an issue
- 💡 Feature Requests: Open an issue
- 📖 Documentation: See MIGRATION_WORKFLOW.md and MIGRATION_GUIDE.md
- 💬 Discussions: GitHub Discussions
- Built with Azure OpenAI
- Uses SQL Server SMO for extraction
- Inspired by real-world SQL Server → PostgreSQL migration challenges