#TypeScript #TypeORM #Database #Migrations #AI #Cursor #MCP #Optimization

AI-Powered TypeORM Migration Squashing

Transform your TypeORM performance with AI-assisted migration consolidation. Learn how Cursor, MCP tools, and smart schema validation can reduce startup times while maintaining perfect database integrity.

Dennis Gaidel
September 7, 2025

Over time, successful projects accumulate hundreds or even thousands of database migrations. While each migration serves its purpose during development, this growth creates significant performance problems. Real-world evidence shows that migration execution times can increase dramatically: one documented case saw startup times climb from under 10 seconds to 120 seconds once the project reached just 50 migration files (GitHub Issue #4136).

In practice, a mature codebase with over 400 migrations can take more than 4 minutes to run the migration command, creating a substantial burden during development and deployment cycles.

Migration squashing offers a solution by consolidating historical migrations into clean, manageable snapshots while preserving the final database state.

What is Migration Squashing?

Migration squashing is the process of consolidating multiple historical migrations into fewer, cleaner migrations while preserving the final database state. Instead of running 400+ individual migration files, you create a single "base" migration that represents the current schema, then archive the historical migrations.

Think of it like squashing commits in Git: you create a cleaner history while keeping the same end result.

Why Squash Migrations?

Performance Benefits

# Before squashing - 400+ migrations
npm run migration:run  # Takes 4+ minutes
 
# After squashing - 3 base migrations  
npm run migration:run  # Takes 3 seconds

Reduced Complexity

Historical migrations often contain:

  • Outdated business logic
  • Dead code paths
  • Complex interdependencies
  • Type safety issues from evolved entities

Easier Onboarding

New developers can understand the current schema from a few clean migrations rather than piecing together hundreds of historical changes.

The Migration Squashing Process

Step 1: Create Migration Archives

First, move historical migrations to an archive directory:

# Create archive structure
mkdir -p src/db/migrations-archive/20240301
mv src/db/migrations/*.ts src/db/migrations-archive/20240301/
 
# Keep archive organized by squash date
src/db/migrations-archive/
├── 20240301/           # First squash
│   ├── 1640995200000-create-users-table.ts
│   ├── 1641081600000-add-email-index.ts
│   └── ... (400+ files)
├── 20240601/           # Future squash
└── 20240901/           # Another squash

Step 2: Generate Base Schema Migration

Use TypeORM's built-in command to generate a migration from your current schema:

# Generate initial schema from entities
npm run typeorm migration:generate src/db/migrations/1700000000000-initial-schema
 
# This creates a complete schema representation
# of all your current entities, indexes, constraints

The generated migration captures your current database state:

import { MigrationInterface, QueryRunner } from "typeorm"
 
export class InitialSchema1700000000000 implements MigrationInterface {
    name = 'InitialSchema1700000000000'
 
    public async up(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`
            CREATE TABLE "users" (
                "id" uuid NOT NULL DEFAULT uuid_generate_v4(),
                "email" character varying NOT NULL,
                "firstName" character varying,
                "lastName" character varying,
                "createdAt" TIMESTAMP NOT NULL DEFAULT now(),
                "updatedAt" TIMESTAMP NOT NULL DEFAULT now(),
                CONSTRAINT "PK_users" PRIMARY KEY ("id")
            )
        `)
        
        await queryRunner.query(`
            CREATE TABLE "orders" (
                "id" uuid NOT NULL DEFAULT uuid_generate_v4(),
                "userId" uuid NOT NULL,
                "status" character varying NOT NULL DEFAULT 'pending',
                "totalAmount" decimal(10,2) NOT NULL,
                "createdAt" TIMESTAMP NOT NULL DEFAULT now(),
                CONSTRAINT "PK_orders" PRIMARY KEY ("id"),
                CONSTRAINT "FK_orders_user" FOREIGN KEY ("userId") REFERENCES "users"("id")
            )
        `)
        
        await queryRunner.query(`
            CREATE TABLE "products" (
                "id" uuid NOT NULL DEFAULT uuid_generate_v4(),
                "name" character varying NOT NULL,
                "description" text,
                "price" decimal(10,2) NOT NULL,
                "search_vector" tsvector,
                "createdAt" TIMESTAMP NOT NULL DEFAULT now(),
                CONSTRAINT "PK_products" PRIMARY KEY ("id")
            )
        `)
        
        await queryRunner.query(`
            CREATE UNIQUE INDEX "IDX_users_email" ON "users" ("email")
        `)
        
        await queryRunner.query(`
            CREATE INDEX "IDX_orders_status" ON "orders" ("status")
        `)
        
        await queryRunner.query(`
            CREATE INDEX "IDX_products_name" ON "products" ("name")
        `)
        
        // ... hundreds more lines for complete schema
    }
 
    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`DROP INDEX "IDX_products_name"`)
        await queryRunner.query(`DROP INDEX "IDX_orders_status"`)
        await queryRunner.query(`DROP INDEX "IDX_users_email"`)
        await queryRunner.query(`DROP TABLE "orders"`)
        await queryRunner.query(`DROP TABLE "products"`)
        await queryRunner.query(`DROP TABLE "users"`)
        // ... corresponding drops
    }
}
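
Both the generate command above and the validation steps later assume your TypeORM data source points its migrations glob at the active folder only, so the archive never gets picked up. A minimal sketch (file paths, entity glob, and connection details are assumptions about your project):

// src/db/data-source.ts (hypothetical paths and connection details)
import { DataSource } from "typeorm"
 
export const AppDataSource = new DataSource({
    type: "postgres",
    url: process.env.DATABASE_URL,
    // Point only at the active folder; the archive is deliberately excluded
    migrations: ["src/db/migrations/*.ts"],
    entities: ["src/**/*.entity.ts"],
})

Note that with TypeORM 0.3+, the CLI needs this data source passed explicitly. Depending on your npm scripts, that typically looks like appending -- -d src/db/data-source.ts to the commands shown in this article.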

Step 3: Handle Advanced Database Objects

The auto-generated migration won't capture everything. Create additional migrations for:

Database Functions and Triggers

npm run typeorm migration:create src/db/migrations/1700000001000-product-search

import { MigrationInterface, QueryRunner } from "typeorm"
 
export class ProductSearch1700000001000 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        // Custom search functions
        await queryRunner.query(`
            CREATE OR REPLACE FUNCTION update_product_search_vector()
            RETURNS TRIGGER AS $$
            BEGIN
                NEW.search_vector = to_tsvector('english', 
                    COALESCE(NEW.name, '') || ' ' || 
                    COALESCE(NEW.description, '')
                );
                RETURN NEW;
            END;
            $$ LANGUAGE plpgsql;
        `)
        
        // Triggers
        await queryRunner.query(`
            CREATE TRIGGER update_products_search_vector
            BEFORE INSERT OR UPDATE ON products
            FOR EACH ROW EXECUTE FUNCTION update_product_search_vector();
        `)
    }
 
    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`DROP TRIGGER update_products_search_vector ON products`)
        await queryRunner.query(`DROP FUNCTION update_product_search_vector()`)
    }
}
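
One caveat with trigger-backed columns: the trigger only fires for rows written after it exists, so a squashed migration should also backfill search_vector for existing data. A possible addition to the up() method above (hypothetical, assuming the column layout shown earlier):

// Backfill existing rows; the trigger only covers future INSERTs/UPDATEs
await queryRunner.query(`
    UPDATE products
    SET search_vector = to_tsvector('english',
        COALESCE(name, '') || ' ' || COALESCE(description, '')
    )
`)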

Views and Computed Columns

npm run typeorm migration:create src/db/migrations/1700000002000-order-analytics
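
migration:create only scaffolds an empty file; the body depends on your schema. As an illustration, a sketch of what this analytics migration might contain (the view definition itself is an assumption):

import { MigrationInterface, QueryRunner } from "typeorm"
 
export class OrderAnalytics1700000002000 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        // Hypothetical reporting view aggregating orders per user
        await queryRunner.query(`
            CREATE VIEW "order_analytics" AS
            SELECT "userId",
                   COUNT(*) AS "orderCount",
                   SUM("totalAmount") AS "lifetimeValue"
            FROM "orders"
            GROUP BY "userId"
        `)
    }
 
    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`DROP VIEW "order_analytics"`)
    }
}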

Custom Extensions and Types

import { MigrationInterface, QueryRunner } from "typeorm"
 
// Note the timestamp: this migration must sort BEFORE the initial schema,
// which already relies on uuid_generate_v4() from the uuid-ossp extension
export class Extensions1699999999000 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`CREATE EXTENSION IF NOT EXISTS "uuid-ossp"`)
        await queryRunner.query(`CREATE EXTENSION IF NOT EXISTS "pg_trgm"`)
        
        // Custom enum types
        await queryRunner.query(`
            CREATE TYPE "order_status_enum" AS ENUM (
                'pending', 'processing', 'shipped', 'delivered', 'cancelled'
            )
        `)
        
        await queryRunner.query(`
            CREATE TYPE "user_role_enum" AS ENUM (
                'admin', 'user', 'guest'
            )
        `)
    }
 
    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`DROP TYPE "user_role_enum"`)
        await queryRunner.query(`DROP TYPE "order_status_enum"`)
        await queryRunner.query(`DROP EXTENSION IF EXISTS "pg_trgm"`)
        await queryRunner.query(`DROP EXTENSION IF EXISTS "uuid-ossp"`)
    }
}

Essential TypeORM Commands

The migration squashing process only requires three core TypeORM commands:

# Generate migration from current entities (for base schema)
npm run typeorm migration:generate src/db/migrations/1700000000000-initial-schema
 
# Create empty migration file (for custom functions/triggers)
npm run typeorm migration:create src/db/migrations/1700000001000-product-search
 
# Run migrations (for testing validation)
npm run typeorm migration:run

That's it! The squashing process doesn't need status checking, reverting, or schema synchronization commands.

Managing the Migration Table

After squashing, you need to manually manage the migrations table to reflect the new structure.

Option 1: Existing Database Migration

For production databases with existing migration history:

-- 1. Backup current migrations table
CREATE TABLE migrations_backup AS SELECT * FROM migrations;
 
-- 2. Clear migrations table
DELETE FROM migrations;
 
-- 3. Add new base migration entries (without running them)
INSERT INTO migrations (timestamp, name) VALUES 
    (1699999999000, 'Extensions1699999999000'),
    (1700000000000, 'InitialSchema1700000000000'),
    (1700000001000, 'ProductSearch1700000001000'),
    (1700000002000, 'OrderAnalytics1700000002000');
    
-- 4. Verify schema matches exactly
-- Run schema comparison between production and development
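
A failure between steps 2 and 3 would leave the migrations table empty, so it is safest to run the whole rewrite as a single transaction. A sketch using the pg client (the connection string is an assumption; the entries match the squashed migrations above):

import { Client } from "pg"
 
// Rewrite the migrations bookkeeping atomically: either everything
// succeeds or the table is left exactly as it was
async function rewriteMigrationsTable(connectionString: string): Promise<void> {
    const client = new Client({ connectionString })
    await client.connect()
    try {
        await client.query("BEGIN")
        await client.query("CREATE TABLE migrations_backup AS SELECT * FROM migrations")
        await client.query("DELETE FROM migrations")
        await client.query(`
            INSERT INTO migrations (timestamp, name) VALUES
                (1699999999000, 'Extensions1699999999000'),
                (1700000000000, 'InitialSchema1700000000000'),
                (1700000001000, 'ProductSearch1700000001000'),
                (1700000002000, 'OrderAnalytics1700000002000')
        `)
        await client.query("COMMIT") // no window where the table sits empty
    } catch (err) {
        await client.query("ROLLBACK")
        throw err
    } finally {
        await client.end()
    }
}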

Option 2: Gradual Migration

-- Restore the original entries from the backup (marking them as already run)
INSERT INTO migrations (timestamp, name) 
SELECT timestamp, name FROM migrations_backup 
WHERE timestamp < 1700000000000;
 
-- Add new squashed migrations
INSERT INTO migrations (timestamp, name) VALUES 
    (1700000000000, 'InitialSchema1700000000000'),
    (1700000001000, 'ProductSearch1700000001000');

Validation and Testing

Schema Validation Process

The most critical step in migration squashing is ensuring your new migrations produce identical database schemas to the old ones. Here's the comprehensive validation process:

Step 1: Create Test Databases

# Create two clean test databases (PostgreSQL)
createdb migration_test_old
createdb migration_test_new
 
# For other databases, use equivalent commands:
# MySQL: CREATE DATABASE migration_test_old;
# SQLite: Will be created automatically

Step 2: Apply Old Migrations and Dump Schema

# Temporarily point TypeORM to archived migrations folder
# Update your TypeORM config's migrations path to: "src/db/migrations-archive/20240301/*.ts"
 
# Run original migrations against first database
export DATABASE_URL="postgresql://username:password@localhost/migration_test_old"
npm run migration:run  # Apply all historical migrations
 
# Generate schema dump using pg_dump
pg_dump -U username -h localhost -d migration_test_old -s -f old_schema.sql

Step 3: Apply New Squashed Migrations and Dump Schema

# Point TypeORM back to your new squashed migrations folder
# Update your TypeORM config's migrations path to: "src/db/migrations/*.ts"
 
# Run squashed migrations against second database
export DATABASE_URL="postgresql://username:password@localhost/migration_test_new"
npm run migration:run  # Apply squashed migrations
 
# Generate schema dump
pg_dump -U username -h localhost -d migration_test_new -s -f new_schema.sql

Step 4: AI-Assisted Schema Comparison

Modern AI tools like Cursor, especially with MCP (Model Context Protocol) tools, can significantly accelerate the validation process:

Using MCP Postgres Tool for Analysis:

  • Connect to both databases simultaneously
  • Query table counts, column definitions, and constraints
  • Compare function implementations line by line
  • Validate index structures and foreign key relationships

-- Example queries for comparison (can be run via MCP)
SELECT COUNT(*) as table_count FROM information_schema.tables 
WHERE table_schema = 'public';
 
SELECT table_name, column_name, data_type, is_nullable
FROM information_schema.columns 
WHERE table_schema = 'public'
ORDER BY table_name, ordinal_position;

AI-Powered Diff Analysis:

# Generate diff file
diff old_schema.sql new_schema.sql > schema_diff.txt
 
# Use AI to analyze the diff
# - Identify genuine differences vs. formatting changes
# - Spot missing tables, columns, or indexes
# - Validate function and trigger implementations
# - Check constraint definitions
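
Before handing the diff to an AI tool, it helps to normalize both dumps so cosmetic differences (comments, whitespace, statement order) don't drown out real ones. A small sketch along these lines (file names match the pg_dump commands above; the normalization rules are assumptions):

import { readFileSync } from "node:fs"
 
// Normalize a pg_dump schema file: split into statements,
// strip comments, collapse whitespace, and sort
function normalize(path: string): Set<string> {
    const statements = readFileSync(path, "utf8")
        .split(";")
        .map((stmt) => stmt.replace(/--.*$/gm, "").replace(/\s+/g, " ").trim())
        .filter((stmt) => stmt.length > 0)
    return new Set(statements)
}
 
const oldStatements = normalize("old_schema.sql")
const newStatements = normalize("new_schema.sql")
 
// Report statements present in one schema but not the other
for (const stmt of oldStatements) {
    if (!newStatements.has(stmt)) console.log("- missing in new:", stmt)
}
for (const stmt of newStatements) {
    if (!oldStatements.has(stmt)) console.log("+ only in new:", stmt)
}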

AI tools excel at:

  • Pattern Recognition: Spotting systematic differences across multiple tables
  • Noise Filtering: Distinguishing between cosmetic and structural changes
  • Comprehensive Analysis: Checking hundreds of objects systematically
  • Error Detection: Identifying missing foreign keys, indexes, or constraints

Data Consistency Testing

-- Test data migration integrity
-- Compare key metrics before/after squashing
SELECT 
    COUNT(*) as total_users,
    COUNT(DISTINCT email) as unique_emails,
    MIN("createdAt") as oldest_user,
    MAX("createdAt") as newest_user
FROM users;
 
-- Compare order statistics
SELECT 
    COUNT(*) as total_orders,
    COUNT(DISTINCT status) as unique_statuses,
    SUM("totalAmount") as total_revenue
FROM orders;
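
These spot checks can also be scripted so the same query runs against both databases and any mismatch is flagged. A sketch (the connection strings are assumptions; point them at whichever pre- and post-squash databases you are comparing):

import { Client } from "pg"
 
// Run the same aggregate query against both databases and compare the rows
async function compareMetric(sql: string): Promise<void> {
    const connections = [
        "postgresql://username:password@localhost/migration_test_old",
        "postgresql://username:password@localhost/migration_test_new",
    ]
    const results: string[] = []
    for (const connectionString of connections) {
        const client = new Client({ connectionString })
        await client.connect()
        results.push(JSON.stringify((await client.query(sql)).rows))
        await client.end()
    }
    console.log(results[0] === results[1] ? "OK      " : "MISMATCH", sql)
}
 
async function main(): Promise<void> {
    await compareMetric(`SELECT COUNT(*) AS total_users FROM users`)
    await compareMetric(`SELECT COUNT(*) AS total_orders, SUM("totalAmount") AS total_revenue FROM orders`)
}
 
main().catch(console.error)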

Performance Benchmarking

# Before squashing
time npm run migration:run  # Record baseline
 
# After squashing  
time npm run migration:run  # Compare improvement

Conclusion

Migration squashing turns 400+ migrations that take 4+ minutes into a handful of clean migrations that run in seconds. The process itself is straightforward: archive the old migrations, generate a new base schema, and validate with a pg_dump comparison.

The critical step is schema validation: use AI tools like Cursor with MCP connections to systematically compare database dumps and catch any discrepancies.

Test on staging first, always backup your migrations table, and your deployment cycles will thank you.