
Database/Code Migrations

Migrations are the backbone of evolving software. Whether you’re altering database schemas, upgrading frameworks, or transforming data formats, migrations must be precise, reversible, and safe. Claude Code transforms migration development from a nerve-wracking process into a systematic, confidence-inspiring workflow. This lesson explores how to leverage AI assistance for all types of migrations.

Scenario: Your application has grown from startup to scale-up. The database schema that worked for 100 users is creaking under 100,000. The framework you chose three years ago has breaking changes in its latest version. The data format that seemed simple now needs complete restructuring. Traditional approach: months of planning, crossing fingers, and late-night rollbacks. With Claude Code: systematic migrations with confidence.

The traditional approach plays out over weeks:

Week 1: Planning
- Manually map all changes
- Hope nothing is missed
- Create migration scripts
Week 2: Testing
- Run on staging
- Find edge cases
- Fix and repeat
Week 3: Deployment
- Schedule downtime
- Run migration
- Pray it works
Week 4: Firefighting
- Handle rollbacks
- Fix data corruption
- Apologize to users

With Claude Code, you instead describe the migration goals in a single prompt:

> Analyze our database schema and create a migration to:
> - Add user roles and permissions system
> - Implement soft deletes across all tables
> - Add audit logging columns
> - Optimize indexes for current query patterns
> Include rollback procedures and data preservation

Claude creates a complete migration strategy:

  1. Schema Analysis

    schema-analyzer.js

    ```javascript
    const { Client } = require('pg');
    class SchemaAnalyzer {
    constructor(connectionString) {
    this.client = new Client({ connectionString });
    }
    async analyze() {
    await this.client.connect();
    const analysis = {
    tables: await this.getTables(),
    indexes: await this.getIndexes(),
    constraints: await this.getConstraints(),
    dependencies: await this.getDependencies(),
    currentQueries: await this.analyzeQueryPatterns()
    };
    await this.client.end();
    return analysis;
    }
    async getTables() {
    const result = await this.client.query(`
    SELECT
    table_name,
    column_name,
    data_type,
    is_nullable,
    column_default
    FROM information_schema.columns
    WHERE table_schema = 'public'
    ORDER BY table_name, ordinal_position
    `);
    // Group by table
    const tables = {};
    for (const row of result.rows) {
    if (!tables[row.table_name]) {
    tables[row.table_name] = {
    name: row.table_name,
    columns: []
    };
    }
    tables[row.table_name].columns.push({
    name: row.column_name,
    type: row.data_type,
    nullable: row.is_nullable === 'YES',
    default: row.column_default
    });
    }
    return tables;
    }
    async getIndexes() {
    const result = await this.client.query(`
    SELECT
    schemaname,
    tablename,
    indexname,
    indexdef
    FROM pg_indexes
    WHERE schemaname = 'public'
    `);
    return result.rows;
    }
    async analyzeQueryPatterns() {
    // Analyze pg_stat_statements if available
    try {
    const result = await this.client.query(`
    SELECT
    query,
    calls,
    mean_exec_time,
    total_exec_time
    FROM pg_stat_statements
    WHERE query NOT LIKE '%pg_%'
    ORDER BY total_exec_time DESC
    LIMIT 50
    `);
    return result.rows;
    } catch (error) {
    // Extension might not be installed
    return [];
    }
    }
    }
    ```
  2. Migration Generator

    migration-generator.js

    ```javascript
    class MigrationGenerator {
    constructor(analysis) {
    this.analysis = analysis;
    this.timestamp = new Date().toISOString().replace(/[^\d]/g, '').slice(0, 14);
    }
    generateRolesMigration() {
    const upQueries = [];
    const downQueries = [];
    // Create roles table
    upQueries.push(`
    CREATE TABLE IF NOT EXISTS roles (
    id SERIAL PRIMARY KEY,
    name VARCHAR(50) UNIQUE NOT NULL,
    description TEXT,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    );
    `);
    downQueries.push('DROP TABLE IF EXISTS roles CASCADE;');
    // Create permissions table
    upQueries.push(`
    CREATE TABLE IF NOT EXISTS permissions (
    id SERIAL PRIMARY KEY,
    resource VARCHAR(100) NOT NULL,
    action VARCHAR(50) NOT NULL,
    description TEXT,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    UNIQUE(resource, action)
    );
    `);
    downQueries.push('DROP TABLE IF EXISTS permissions CASCADE;');
    // Create role_permissions junction table
    upQueries.push(`
    CREATE TABLE IF NOT EXISTS role_permissions (
    role_id INTEGER REFERENCES roles(id) ON DELETE CASCADE,
    permission_id INTEGER REFERENCES permissions(id) ON DELETE CASCADE,
    granted_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (role_id, permission_id)
    );
    `);
    downQueries.push('DROP TABLE IF EXISTS role_permissions CASCADE;');
    // Add role_id to users table
    upQueries.push(`
    ALTER TABLE users
    ADD COLUMN role_id INTEGER REFERENCES roles(id);
    `);
    downQueries.push('ALTER TABLE users DROP COLUMN IF EXISTS role_id;');
    // Create default roles
    upQueries.push(`
    INSERT INTO roles (name, description) VALUES
    ('admin', 'Full system access'),
    ('user', 'Standard user access'),
    ('guest', 'Limited read-only access')
    ON CONFLICT (name) DO NOTHING;
    `);
    // Create indexes
    upQueries.push(`
    CREATE INDEX idx_users_role_id ON users(role_id);
    CREATE INDEX idx_role_permissions_role_id ON role_permissions(role_id);
    CREATE INDEX idx_role_permissions_permission_id ON role_permissions(permission_id);
    `);
    return {
    up: upQueries,
    down: downQueries.reverse()
    };
    }
    generateSoftDeletesMigration() {
    const upQueries = [];
    const downQueries = [];
    // Add soft delete columns to all tables
    for (const tableName of Object.keys(this.analysis.tables)) {
    const table = this.analysis.tables[tableName];
    // Skip system tables and already soft-deletable tables
    if (tableName.startsWith('pg_') ||
    table.columns.some(c => c.name === 'deleted_at')) {
    continue;
    }
    upQueries.push(`
    ALTER TABLE ${tableName}
    ADD COLUMN deleted_at TIMESTAMP DEFAULT NULL,
    ADD COLUMN deleted_by INTEGER REFERENCES users(id);
    `);
    downQueries.push(`
    ALTER TABLE ${tableName}
    DROP COLUMN IF EXISTS deleted_at,
    DROP COLUMN IF EXISTS deleted_by;
    `);
    // Create partial index for non-deleted records
    upQueries.push(`
    CREATE INDEX idx_${tableName}_not_deleted
    ON ${tableName}(id)
    WHERE deleted_at IS NULL;
    `);
    downQueries.push(`
    DROP INDEX IF EXISTS idx_${tableName}_not_deleted;
    `);
    // Create view for active records
    upQueries.push(`
    CREATE OR REPLACE VIEW ${tableName}_active AS
    SELECT * FROM ${tableName}
    WHERE deleted_at IS NULL;
    `);
    downQueries.push(`
    DROP VIEW IF EXISTS ${tableName}_active;
    `);
    }
    return {
    up: upQueries,
    down: downQueries.reverse()
    };
    }
    generateAuditMigration() {
    const upQueries = [];
    const downQueries = [];
    // Create audit log table
    upQueries.push(`
    CREATE TABLE IF NOT EXISTS audit_logs (
    id BIGSERIAL PRIMARY KEY,
    table_name VARCHAR(100) NOT NULL,
    record_id INTEGER NOT NULL,
    action VARCHAR(20) NOT NULL,
    user_id INTEGER REFERENCES users(id),
    changes JSONB,
    metadata JSONB,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    );
    `);
    downQueries.push('DROP TABLE IF EXISTS audit_logs CASCADE;');
    // Create indexes
    upQueries.push(`
    CREATE INDEX idx_audit_logs_table_record ON audit_logs(table_name, record_id);
    CREATE INDEX idx_audit_logs_user_id ON audit_logs(user_id);
    CREATE INDEX idx_audit_logs_created_at ON audit_logs(created_at);
    CREATE INDEX idx_audit_logs_action ON audit_logs(action);
    `);
    // Add audit columns to all tables
    for (const tableName of Object.keys(this.analysis.tables)) {
    if (tableName.startsWith('pg_') || tableName === 'audit_logs') continue;
    const table = this.analysis.tables[tableName];
    // Add created_by if not exists
    if (!table.columns.some(c => c.name === 'created_by')) {
    upQueries.push(`
    ALTER TABLE ${tableName}
    ADD COLUMN created_by INTEGER REFERENCES users(id);
    `);
    downQueries.push(`
    ALTER TABLE ${tableName}
    DROP COLUMN IF EXISTS created_by;
    `);
    }
    // Add updated_by if not exists
    if (!table.columns.some(c => c.name === 'updated_by')) {
    upQueries.push(`
    ALTER TABLE ${tableName}
    ADD COLUMN updated_by INTEGER REFERENCES users(id);
    `);
    downQueries.push(`
    ALTER TABLE ${tableName}
    DROP COLUMN IF EXISTS updated_by;
    `);
    }
    // Add updated_at with trigger if not exists
    if (!table.columns.some(c => c.name === 'updated_at')) {
    upQueries.push(`
    ALTER TABLE ${tableName}
    ADD COLUMN updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP;
    CREATE OR REPLACE FUNCTION update_${tableName}_updated_at()
    RETURNS TRIGGER AS $$
    BEGIN
    NEW.updated_at = CURRENT_TIMESTAMP;
    RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;
    CREATE TRIGGER trigger_update_${tableName}_updated_at
    BEFORE UPDATE ON ${tableName}
    FOR EACH ROW
    EXECUTE FUNCTION update_${tableName}_updated_at();
    `);
    downQueries.push(`
    DROP TRIGGER IF EXISTS trigger_update_${tableName}_updated_at ON ${tableName};
    DROP FUNCTION IF EXISTS update_${tableName}_updated_at();
    ALTER TABLE ${tableName} DROP COLUMN IF EXISTS updated_at;
    `);
    }
    }
    return {
    up: upQueries,
    down: downQueries.reverse()
    };
    }
    generateIndexOptimization() {
    const upQueries = [];
    const downQueries = [];
    // Analyze query patterns to suggest new indexes
    const queryPatterns = this.analysis.currentQueries;
    for (const pattern of queryPatterns) {
    // Simple pattern matching for WHERE clauses
    const whereMatch = pattern.query.match(/WHERE\s+(\w+)\.?(\w+)\s*=/i);
    if (whereMatch) {
    const [_, tableAlias, column] = whereMatch;
    // Check if index exists
    const indexExists = this.analysis.indexes.some(idx =>
    idx.indexdef.includes(column)
    );
    if (!indexExists && pattern.mean_exec_time > 100) {
    // Suggest index for slow queries
     upQueries.push(`
     -- Query averaging ${pattern.mean_exec_time}ms
     -- ${pattern.query.substring(0, 100)}...
     -- CONCURRENTLY is omitted because this migration runs inside one transaction
     CREATE INDEX idx_${tableAlias}_${column}_perf
     ON ${tableAlias}(${column});
     `);
    downQueries.push(`
    DROP INDEX IF EXISTS idx_${tableAlias}_${column}_perf;
    `);
    }
    }
    }
    return {
    up: upQueries,
    down: downQueries.reverse()
    };
    }
    generateCompleteMigration() {
    const migrations = [
    this.generateRolesMigration(),
    this.generateSoftDeletesMigration(),
    this.generateAuditMigration(),
    this.generateIndexOptimization()
    ];
    const combined = {
    up: [],
    down: []
    };
    // Combine all migrations
    for (const migration of migrations) {
    combined.up.push(...migration.up);
    combined.down.push(...migration.down);
    }
    return this.wrapInTransaction(combined);
    }
    wrapInTransaction(migration) {
    return {
    filename: `${this.timestamp}_comprehensive_schema_update.sql`,
    up: `
-- Migration: Comprehensive Schema Update
-- Generated: ${new Date().toISOString()}
-- Description: Add roles, permissions, soft deletes, audit logging, and optimize indexes

BEGIN;

${migration.up.join('\n\n')}

-- Verify migration
DO $$
BEGIN
  -- Check roles table exists
  IF NOT EXISTS (SELECT 1 FROM information_schema.tables WHERE table_name = 'roles') THEN
    RAISE EXCEPTION 'Migration failed: roles table not created';
  END IF;

  -- Check soft delete columns
  IF NOT EXISTS (
    SELECT 1 FROM information_schema.columns
    WHERE table_name = 'users' AND column_name = 'deleted_at'
  ) THEN
    RAISE EXCEPTION 'Migration failed: soft delete columns not added';
  END IF;

  RAISE NOTICE 'Migration verification passed';
END $$;

COMMIT;
`,
    down: `
-- Rollback: Comprehensive Schema Update
-- Generated: ${new Date().toISOString()}

BEGIN;

${migration.down.join('\n\n')}

COMMIT;
`
    };
    }
    }
    ```

  3. Migration Executor

    migration-executor.js

```javascript
const { Client } = require('pg');

class MigrationExecutor {
constructor(connectionString) {
this.connectionString = connectionString;
}
async execute(migration, direction = 'up') {
const client = new Client({ connectionString: this.connectionString });
try {
await client.connect();
console.log(`Executing migration: ${migration.filename} (${direction})`);
// Start timing
const startTime = Date.now();
// Execute migration
await client.query(migration[direction]);
// Record in migrations table
if (direction === 'up') {
await client.query(`
INSERT INTO schema_migrations (version, executed_at)
VALUES ($1, CURRENT_TIMESTAMP)
`, [migration.filename]);
} else {
await client.query(`
DELETE FROM schema_migrations
WHERE version = $1
`, [migration.filename]);
}
const duration = Date.now() - startTime;
console.log(`Migration completed in ${duration}ms`);
return { success: true, duration };
} catch (error) {
console.error(`Migration failed: ${error.message}`);
throw error;
} finally {
await client.end();
}
}
async dryRun(migration) {
console.log('=== DRY RUN MODE ===');
console.log('UP Migration:');
console.log(migration.up);
console.log('\nDOWN Migration:');
console.log(migration.down);
// Connect to verify credentials, then list the statements that would run.
// There is no cheap server-side "parse only" check for arbitrary DDL, so
// validate against a staging copy of the database before touching production.
const client = new Client({ connectionString: this.connectionString });
try {
await client.connect();
const statements = migration.up.split(';').filter(s => s.trim());
for (const statement of statements) {
if (statement.trim() && !statement.trim().startsWith('--')) {
console.log(`Would run: ${statement.trim().substring(0, 60)}...`);
}
}
return { valid: true };
} catch (error) {
console.error(`Validation failed: ${error.message}`);
return { valid: false, error };
} finally {
await client.end();
}
}
}
```
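
A short driver script can tie the three pieces together. This is a sketch rather than part of the generated output: it assumes the classes above are exported from their files, that DATABASE_URL points at the target database, and that the schema_migrations bookkeeping table the executor writes to already exists.

```javascript
// run-migration.js -- illustrative wiring of the pieces above
const { SchemaAnalyzer } = require('./schema-analyzer');
const { MigrationGenerator } = require('./migration-generator');
const { MigrationExecutor } = require('./migration-executor');

async function main() {
  const connectionString = process.env.DATABASE_URL;

  // 1. Inspect the current schema and query patterns
  const analysis = await new SchemaAnalyzer(connectionString).analyze();

  // 2. Build the combined up/down migration from the analysis
  const migration = new MigrationGenerator(analysis).generateCompleteMigration();

  // 3. Review the plan, then execute
  const executor = new MigrationExecutor(connectionString);
  const { valid } = await executor.dryRun(migration);
  if (!valid) throw new Error('Dry run failed; aborting');

  await executor.execute(migration, 'up');
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```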

For schema changes that cannot afford downtime, ask Claude Code for phased patterns:

> Implement a zero-downtime migration pattern for:
> - Renaming a heavily-used column
> - Changing column data types
> - Splitting one table into multiple tables
> Include compatibility layers and gradual rollout
zero-downtime-migration.js

```javascript
class ZeroDowntimeMigration {
constructor(tableName, client) {
this.tableName = tableName;
this.client = client;
}
// Pattern 1: Column Rename with Backward Compatibility
async renameColumn(oldName, newName) {
const steps = [];
// Step 1: Add new column as computed column
steps.push({
name: 'Add computed column',
up: `
ALTER TABLE ${this.tableName}
ADD COLUMN ${newName} ${await this.getColumnType(oldName)}
GENERATED ALWAYS AS (${oldName}) STORED;
`,
down: `
ALTER TABLE ${this.tableName}
DROP COLUMN ${newName};
`,
validation: async () => {
// Verify new column exists and has same values
const result = await this.client.query(`
SELECT COUNT(*) FROM ${this.tableName}
WHERE ${oldName} IS DISTINCT FROM ${newName}
`);
return result.rows[0].count === '0';
}
});
// Step 2: Deploy application changes to use new column
steps.push({
name: 'Deploy app using new column',
manual: true,
instructions: `
1. Update application code to read from ${newName}
2. Keep writing to ${oldName}
3. Deploy and monitor for issues
`
});
// Step 3: Start dual writes
steps.push({
name: 'Create dual-write trigger',
up: `
-- A stored generated column cannot be assigned in a BEFORE trigger,
-- so first convert it to a regular column (PostgreSQL 13+)
ALTER TABLE ${this.tableName}
ALTER COLUMN ${newName} DROP EXPRESSION;
CREATE OR REPLACE FUNCTION sync_${oldName}_to_${newName}()
RETURNS TRIGGER AS $$
BEGIN
NEW.${newName} := NEW.${oldName};
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER trigger_sync_${oldName}_to_${newName}
BEFORE INSERT OR UPDATE ON ${this.tableName}
FOR EACH ROW
EXECUTE FUNCTION sync_${oldName}_to_${newName}();
`,
down: `
DROP TRIGGER IF EXISTS trigger_sync_${oldName}_to_${newName} ON ${this.tableName};
DROP FUNCTION IF EXISTS sync_${oldName}_to_${newName}();
`
});
// Step 4: Backfill any differences
steps.push({
name: 'Backfill data',
up: `
UPDATE ${this.tableName}
SET ${newName} = ${oldName}
WHERE ${newName} IS DISTINCT FROM ${oldName};
`,
validation: async () => {
const result = await this.client.query(`
SELECT COUNT(*) FROM ${this.tableName}
WHERE ${oldName} IS DISTINCT FROM ${newName}
`);
return result.rows[0].count === '0';
}
});
// Step 5: Switch primary column
steps.push({
name: 'Make new column primary',
up: `
-- Ensure the column is no longer generated (already handled during dual-write setup)
ALTER TABLE ${this.tableName}
ALTER COLUMN ${newName} DROP EXPRESSION IF EXISTS;
-- Drop old column
ALTER TABLE ${this.tableName}
DROP COLUMN ${oldName};
-- Cleanup trigger
DROP TRIGGER IF EXISTS trigger_sync_${oldName}_to_${newName} ON ${this.tableName};
DROP FUNCTION IF EXISTS sync_${oldName}_to_${newName}();
`,
down: `
-- Re-add old column
ALTER TABLE ${this.tableName}
ADD COLUMN ${oldName} ${await this.getColumnType(newName)};
-- Copy data back
UPDATE ${this.tableName}
SET ${oldName} = ${newName};
`
});
return steps;
}
// Pattern 2: Data Type Change with Validation
async changeColumnType(columnName, newType, converter) {
const steps = [];
const tempColumn = `${columnName}_new`;
// Step 1: Add temporary column
steps.push({
name: 'Add temporary column',
up: `
ALTER TABLE ${this.tableName}
ADD COLUMN ${tempColumn} ${newType};
`,
down: `
ALTER TABLE ${this.tableName}
DROP COLUMN IF EXISTS ${tempColumn};
`
});
// Step 2: Populate with converted data
steps.push({
name: 'Convert and populate data',
up: converter || `
UPDATE ${this.tableName}
SET ${tempColumn} = ${columnName}::${newType};
`,
validation: async () => {
// Check for conversion errors
const result = await this.client.query(`
SELECT COUNT(*) FROM ${this.tableName}
WHERE ${columnName} IS NOT NULL
AND ${tempColumn} IS NULL
`);
return result.rows[0].count === '0';
}
});
// Step 3: Create sync trigger
steps.push({
name: 'Sync new writes',
up: `
CREATE OR REPLACE FUNCTION sync_${columnName}_type_change()
RETURNS TRIGGER AS $$
BEGIN
NEW.${tempColumn} := NEW.${columnName}::${newType};
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER trigger_sync_${columnName}_type
BEFORE INSERT OR UPDATE ON ${this.tableName}
FOR EACH ROW
EXECUTE FUNCTION sync_${columnName}_type_change();
`,
down: `
DROP TRIGGER IF EXISTS trigger_sync_${columnName}_type ON ${this.tableName};
DROP FUNCTION IF EXISTS sync_${columnName}_type_change();
`
});
// Step 4: Atomic swap
steps.push({
name: 'Swap columns',
up: `
BEGIN;
-- Drop the old column
ALTER TABLE ${this.tableName}
DROP COLUMN ${columnName};
-- Rename new column
ALTER TABLE ${this.tableName}
RENAME COLUMN ${tempColumn} TO ${columnName};
-- Cleanup
DROP TRIGGER IF EXISTS trigger_sync_${columnName}_type ON ${this.tableName};
DROP FUNCTION IF EXISTS sync_${columnName}_type_change();
COMMIT;
`,
down: `
-- Would need to reverse the type change
ALTER TABLE ${this.tableName}
ALTER COLUMN ${columnName} TYPE ${await this.getColumnType(columnName)}
USING ${columnName}::${await this.getColumnType(columnName)};
`
});
return steps;
}
// Pattern 3: Table Split
async splitTable(newTables) {
const steps = [];
// Step 1: Create new tables
for (const newTable of newTables) {
steps.push({
name: `Create table ${newTable.name}`,
up: newTable.createStatement,
down: `DROP TABLE IF EXISTS ${newTable.name} CASCADE;`
});
}
// Step 2: Create views for backward compatibility
steps.push({
name: 'Create compatibility view',
up: `
CREATE OR REPLACE VIEW ${this.tableName}_compat AS
SELECT
${newTables.map(t => t.columns.map(c => `${t.alias}.${c}`).join(', ')).join(',\n ')}
FROM ${newTables[0].name} ${newTables[0].alias}
${newTables.slice(1).map(t => `
JOIN ${t.name} ${t.alias} ON ${t.alias}.${t.joinColumn} = ${newTables[0].alias}.id
`).join('')};
`,
down: `DROP VIEW IF EXISTS ${this.tableName}_compat;`
});
// Step 3: Create instead-of triggers
steps.push({
name: 'Create write triggers',
up: `
CREATE OR REPLACE FUNCTION ${this.tableName}_insert_trigger()
RETURNS TRIGGER AS $$
DECLARE
${newTables.map(t => `${t.name}_id INTEGER;`).join('\n ')}
BEGIN
-- Insert into split tables
${newTables.map((t, i) => `
INSERT INTO ${t.name} (${t.columns.join(', ')})
VALUES (${t.columns.map(c => `NEW.${c}`).join(', ')})
RETURNING id INTO ${t.name}_id;
`).join('\n')}
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER ${this.tableName}_instead_insert
INSTEAD OF INSERT ON ${this.tableName}_compat
FOR EACH ROW
EXECUTE FUNCTION ${this.tableName}_insert_trigger();
`,
down: `
DROP TRIGGER IF EXISTS ${this.tableName}_instead_insert ON ${this.tableName}_compat;
DROP FUNCTION IF EXISTS ${this.tableName}_insert_trigger();
`
});
return steps;
}
async getColumnType(columnName) {
const result = await this.client.query(`
SELECT data_type, character_maximum_length, numeric_precision, numeric_scale
FROM information_schema.columns
WHERE table_name = $1 AND column_name = $2
`, [this.tableName, columnName]);
const col = result.rows[0];
if (col.character_maximum_length) {
return `${col.data_type}(${col.character_maximum_length})`;
} else if (col.numeric_precision) {
return `${col.data_type}(${col.numeric_precision},${col.numeric_scale})`;
}
return col.data_type;
}
}
```
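
The step objects above are plain data; nothing in the class actually runs them. A small runner, sketched here as an assumption rather than part of the original tooling, could apply each SQL step, pause on manual ones, and undo a step whose validation fails:

```javascript
// step-runner.js -- hypothetical helper for the step arrays produced above
async function runSteps(client, steps) {
  for (const step of steps) {
    console.log(`Step: ${step.name}`);

    // Manual steps (application deploys, monitoring checks) are left to a human
    if (step.manual) {
      console.log(step.instructions);
      continue;
    }

    await client.query(step.up);

    // Per-step validation; roll the step back and stop if it fails
    if (step.validation && !(await step.validation())) {
      if (step.down) await client.query(step.down);
      throw new Error(`Zero-downtime step failed validation: ${step.name}`);
    }
  }
}

// Example (column names are illustrative):
// const steps = await new ZeroDowntimeMigration('users', client).renameColumn('username', 'handle');
// await runSteps(client, steps);
```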

Data restructuring follows the same pattern; describe the transformation you need:

> Create a data migration to:
> - Normalize denormalized data into proper tables
> - Merge duplicate records intelligently
> - Transform legacy formats to new structure
> - Preserve data integrity and relationships
data-migration.js

```javascript
class DataMigration {
constructor(client) {
this.client = client;
this.batchSize = 1000;
}
// Normalize denormalized data
async normalizeDenormalizedData() {
console.log('Starting data normalization...');
// Example: Extract addresses from users table
await this.client.query('BEGIN');
try {
// Create addresses table
await this.client.query(`
CREATE TABLE IF NOT EXISTS addresses (
id SERIAL PRIMARY KEY,
street_address VARCHAR(255),
city VARCHAR(100),
state VARCHAR(50),
postal_code VARCHAR(20),
country VARCHAR(2),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
UNIQUE(street_address, city, state, postal_code, country)
);
`);
// Create user_addresses junction table
await this.client.query(`
CREATE TABLE IF NOT EXISTS user_addresses (
user_id INTEGER REFERENCES users(id),
address_id INTEGER REFERENCES addresses(id),
address_type VARCHAR(20) DEFAULT 'primary',
is_primary BOOLEAN DEFAULT false,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (user_id, address_id, address_type)
);
`);
// Migrate addresses in batches
let offset = 0;
let hasMore = true;
while (hasMore) {
const users = await this.client.query(`
SELECT id, street_address, city, state, postal_code, country
FROM users
WHERE street_address IS NOT NULL
ORDER BY id
LIMIT $1 OFFSET $2
`, [this.batchSize, offset]);
if (users.rows.length === 0) {
hasMore = false;
break;
}
for (const user of users.rows) {
// Insert or get address
const addressResult = await this.client.query(`
INSERT INTO addresses (street_address, city, state, postal_code, country)
VALUES ($1, $2, $3, $4, $5)
ON CONFLICT (street_address, city, state, postal_code, country)
DO UPDATE SET street_address = EXCLUDED.street_address
RETURNING id
`, [user.street_address, user.city, user.state, user.postal_code, user.country]);
// Link to user
await this.client.query(`
INSERT INTO user_addresses (user_id, address_id, address_type, is_primary)
VALUES ($1, $2, 'primary', true)
ON CONFLICT DO NOTHING
`, [user.id, addressResult.rows[0].id]);
}
console.log(`Processed ${offset + users.rows.length} users...`);
offset += this.batchSize;
}
// Add foreign key to users table
await this.client.query(`
ALTER TABLE users
ADD COLUMN primary_address_id INTEGER REFERENCES addresses(id);
`);
// Update primary address reference
await this.client.query(`
UPDATE users u
SET primary_address_id = (
SELECT address_id
FROM user_addresses ua
WHERE ua.user_id = u.id
AND ua.is_primary = true
LIMIT 1
);
`);
await this.client.query('COMMIT');
console.log('Data normalization completed successfully');
} catch (error) {
await this.client.query('ROLLBACK');
throw error;
}
}
// Merge duplicate records
async mergeDuplicates() {
console.log('Starting duplicate merge...');
// Find duplicates by email
const duplicates = await this.client.query(`
WITH duplicate_groups AS (
SELECT
email,
array_agg(id ORDER BY created_at ASC) as user_ids,
array_agg(created_at ORDER BY created_at ASC) as created_dates,
COUNT(*) as duplicate_count
FROM users
WHERE deleted_at IS NULL
GROUP BY email
HAVING COUNT(*) > 1
)
SELECT * FROM duplicate_groups
ORDER BY duplicate_count DESC
`);
console.log(`Found ${duplicates.rows.length} duplicate groups`);
await this.client.query('BEGIN');
try {
for (const group of duplicates.rows) {
const [primaryId, ...duplicateIds] = group.user_ids;
console.log(`Merging ${duplicateIds.length} duplicates into user ${primaryId}`);
// Merge strategy: Keep oldest account, merge data from others
await this.mergeUserData(primaryId, duplicateIds);
}
await this.client.query('COMMIT');
console.log('Duplicate merge completed successfully');
} catch (error) {
await this.client.query('ROLLBACK');
throw error;
}
}
async mergeUserData(primaryId, duplicateIds) {
// Merge orders
await this.client.query(`
UPDATE orders
SET user_id = $1
WHERE user_id = ANY($2::int[])
`, [primaryId, duplicateIds]);
// Merge addresses (avoiding duplicates)
await this.client.query(`
INSERT INTO user_addresses (user_id, address_id, address_type, is_primary)
SELECT $1, address_id, address_type, false
FROM user_addresses
WHERE user_id = ANY($2::int[])
ON CONFLICT DO NOTHING
`, [primaryId, duplicateIds]);
// Merge profile data (keep non-null values)
const profiles = await this.client.query(`
SELECT * FROM users
WHERE id = ANY($1::int[])
ORDER BY created_at DESC
`, [[primaryId, ...duplicateIds]]);
const mergedProfile = this.mergeProfiles(profiles.rows);
await this.client.query(`
UPDATE users
SET
first_name = COALESCE($2, first_name),
last_name = COALESCE($3, last_name),
phone = COALESCE($4, phone),
bio = COALESCE($5, bio),
preferences = COALESCE($6, preferences)
WHERE id = $1
`, [
primaryId,
mergedProfile.first_name,
mergedProfile.last_name,
mergedProfile.phone,
mergedProfile.bio,
mergedProfile.preferences
]);
// Soft delete duplicates
await this.client.query(`
UPDATE users
SET
deleted_at = CURRENT_TIMESTAMP,
deleted_by = $1,
email = email || '_deleted_' || id
WHERE id = ANY($2::int[])
`, [primaryId, duplicateIds]);
// Create audit log
await this.client.query(`
INSERT INTO audit_logs (table_name, record_id, action, user_id, changes, metadata)
VALUES ('users', $1, 'merge', $1, $2, $3)
`, [
primaryId,
JSON.stringify({ merged_ids: duplicateIds }),
JSON.stringify({
reason: 'duplicate_email_merge',
duplicate_count: duplicateIds.length
})
]);
}
mergeProfiles(profiles) {
const merged = {};
// Priority: non-null values from newest to oldest
const fields = ['first_name', 'last_name', 'phone', 'bio'];
for (const field of fields) {
for (const profile of profiles) {
if (profile[field] && !merged[field]) {
merged[field] = profile[field];
break;
}
}
}
// Merge JSON preferences
merged.preferences = {};
for (const profile of profiles.reverse()) {
if (profile.preferences) {
merged.preferences = { ...merged.preferences, ...profile.preferences };
}
}
return merged;
}
// Transform legacy formats
async transformLegacyData() {
console.log('Starting legacy data transformation...');
// Example: Transform old JSON structure to new normalized structure
const legacyRecords = await this.client.query(`
SELECT id, legacy_data
FROM legacy_table
WHERE migrated = false
ORDER BY id
`);
const transformer = new LegacyDataTransformer();
await this.client.query('BEGIN');
try {
for (const record of legacyRecords.rows) {
const transformed = transformer.transform(record.legacy_data);
// Insert into new structure
await this.insertTransformedData(transformed);
// Mark as migrated
await this.client.query(`
UPDATE legacy_table
SET migrated = true, migrated_at = CURRENT_TIMESTAMP
WHERE id = $1
`, [record.id]);
}
await this.client.query('COMMIT');
console.log('Legacy data transformation completed');
} catch (error) {
await this.client.query('ROLLBACK');
throw error;
}
}
}
class LegacyDataTransformer {
transform(legacyData) {
// Example transformation logic
const transformed = {
user: {
email: legacyData.email || legacyData.user_email,
name: this.parseName(legacyData.name || legacyData.full_name),
created_at: this.parseDate(legacyData.date_created || legacyData.created)
},
profile: {
bio: legacyData.description || legacyData.about,
avatar_url: this.normalizeUrl(legacyData.avatar || legacyData.profile_image),
preferences: this.parsePreferences(legacyData.settings || {})
},
addresses: this.parseAddresses(legacyData.addresses || legacyData.locations || [])
};
return transformed;
}
parseName(nameString) {
if (!nameString) return { first_name: null, last_name: null };
const parts = nameString.trim().split(/\s+/);
if (parts.length === 1) {
return { first_name: parts[0], last_name: null };
}
return {
first_name: parts[0],
last_name: parts.slice(1).join(' ')
};
}
parseDate(dateValue) {
if (!dateValue) return null;
// Handle various legacy date formats
const date = new Date(dateValue);
return isNaN(date.getTime()) ? null : date.toISOString();
}
normalizeUrl(url) {
if (!url) return null;
// Add protocol if missing
if (!url.match(/^https?:\/\//)) {
url = 'https://' + url;
}
try {
new URL(url); // Validate
return url;
} catch {
return null;
}
}
parsePreferences(settings) {
// Transform old settings structure to new preferences
return {
notifications: {
email: settings.email_notifications !== false,
push: settings.push_notifications === true,
sms: settings.sms_notifications === true
},
privacy: {
profile_visible: settings.public_profile !== false,
show_email: settings.show_email === true
},
theme: settings.theme || 'light'
};
}
parseAddresses(addresses) {
if (!Array.isArray(addresses)) {
addresses = [addresses].filter(Boolean);
}
return addresses.map((addr, index) => ({
street_address: addr.street || addr.address1,
city: addr.city,
state: addr.state || addr.province,
postal_code: addr.zip || addr.postal_code,
country: addr.country || 'US',
is_primary: index === 0
}));
}
}
```
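
One gap worth noting: transformLegacyData calls insertTransformedData, which is never shown. A plausible shape for that missing DataMigration method, assuming the users, addresses, and user_addresses tables created during normalization and a unique constraint on users.email, might be:

```javascript
// Hypothetical sketch of the missing DataMigration method; profile fields
// (bio, avatar_url, preferences) are omitted for brevity
async insertTransformedData(transformed) {
  const userResult = await this.client.query(`
    INSERT INTO users (email, first_name, last_name, created_at)
    VALUES ($1, $2, $3, COALESCE($4, CURRENT_TIMESTAMP))
    ON CONFLICT (email) DO UPDATE SET email = EXCLUDED.email
    RETURNING id
  `, [
    transformed.user.email,
    transformed.user.name.first_name,
    transformed.user.name.last_name,
    transformed.user.created_at
  ]);
  const userId = userResult.rows[0].id;

  for (const addr of transformed.addresses) {
    // Reuse an existing address row if one matches, otherwise insert it
    const addressResult = await this.client.query(`
      INSERT INTO addresses (street_address, city, state, postal_code, country)
      VALUES ($1, $2, $3, $4, $5)
      ON CONFLICT (street_address, city, state, postal_code, country)
      DO UPDATE SET street_address = EXCLUDED.street_address
      RETURNING id
    `, [addr.street_address, addr.city, addr.state, addr.postal_code, addr.country]);

    await this.client.query(`
      INSERT INTO user_addresses (user_id, address_id, is_primary)
      VALUES ($1, $2, $3)
      ON CONFLICT DO NOTHING
    `, [userId, addressResult.rows[0].id, addr.is_primary]);
  }
}
```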

Framework upgrades are migrations too. Describe the versions involved:

> Migrate our application from:
> - React 17 to React 18 with concurrent features
> - Express 4 to Express 5
> - Webpack 4 to Webpack 5
> Include codemods, compatibility fixes, and testing
framework-migration.js

```javascript
const fs = require('fs').promises;
const path = require('path');

class FrameworkMigration {
constructor(projectPath) {
this.projectPath = projectPath;
}
// React 17 to 18 Migration
async migrateReact17to18() {
const steps = [];
// Step 1: Update dependencies
steps.push({
name: 'Update React packages',
command: `
npm install react@^18.0.0 react-dom@^18.0.0
npm install --save-dev @types/react@^18.0.0 @types/react-dom@^18.0.0
`,
validation: async () => {
const pkg = require(path.join(this.projectPath, 'package.json'));
return /^[\^~]?18\./.test(pkg.dependencies.react);
}
});
// Step 2: Update ReactDOM.render calls
steps.push({
name: 'Update root rendering',
codemod: async () => {
const files = await this.findFiles('**/*.{js,jsx,ts,tsx}');
for (const file of files) {
let content = await fs.readFile(file, 'utf8');
let modified = false;
// Update ReactDOM.render to createRoot
if (content.includes('ReactDOM.render')) {
content = content.replace(
/ReactDOM\.render\s*\(\s*<(.+?)\/>,\s*document\.getElementById\(['"](.+?)['"]\)\s*\)/g,
(match, component, id) => {
modified = true;
return `
const root = ReactDOM.createRoot(document.getElementById('${id}'));
root.render(<${component}/>);`;
}
);
}
// Update hydrate to hydrateRoot
if (content.includes('ReactDOM.hydrate')) {
content = content.replace(
/ReactDOM\.hydrate\s*\(\s*<(.+?)\/>,\s*document\.getElementById\(['"](.+?)['"]\)\s*\)/g,
(match, component, id) => {
modified = true;
return `
const root = ReactDOM.hydrateRoot(document.getElementById('${id}'), <${component}/>);`;
}
);
}
if (modified) {
await fs.writeFile(file, content);
console.log(`Updated: ${file}`);
}
}
}
});
// Step 3: Add Suspense boundaries
steps.push({
name: 'Add Suspense boundaries for concurrent features',
codemod: async () => {
const componentFiles = await this.findFiles('**/components/**/*.{jsx,tsx}');
for (const file of componentFiles) {
let content = await fs.readFile(file, 'utf8');
// Find lazy-loaded components
if (content.includes('React.lazy')) {
// Check if Suspense is already imported
if (!content.includes('Suspense')) {
content = content.replace(
/import\s+React(.+?)from\s+['"]react['"]/,
"import React$1, { Suspense } from 'react'"
);
}
// Wrap lazy components with Suspense
content = content.replace(
/<(\w+)\s+\/>/g,
(match, component) => {
if (content.includes(`const ${component} = React.lazy`)) {
return `<Suspense fallback={<div>Loading...</div>}>
${match}
</Suspense>`;
}
return match;
}
);
await fs.writeFile(file, content);
}
}
}
});
// Step 4: Update StrictMode usage
steps.push({
name: 'Update StrictMode for React 18',
notes: `
React 18's StrictMode has additional checks:
- Components will remount in development
- Effects will re-run
- Update any code that relies on single mounting
`
});
return steps;
}
// Express 4 to 5 Migration
async migrateExpress4to5() {
const steps = [];
// Step 1: Update dependencies
steps.push({
name: 'Update Express',
command: 'npm install express@^5.0.0',
validation: async () => {
const pkg = require(path.join(this.projectPath, 'package.json'));
return /^[\^~]?5\./.test(pkg.dependencies.express);
}
});
// Step 2: Update middleware
steps.push({
name: 'Update removed middleware',
codemod: async () => {
const files = await this.findFiles('**/*.js');
for (const file of files) {
let content = await fs.readFile(file, 'utf8');
let modified = false;
// Replace app.param with router.param
if (content.includes('app.param')) {
content = content.replace(/app\.param/g, 'router.param');
modified = true;
}
// Update error handling
if (content.includes('app.use(function(err')) {
console.log(`Note: Error handlers in ${file} may need updating for Express 5`);
}
if (modified) {
await fs.writeFile(file, content);
}
}
}
});
// Step 3: Update removed methods
steps.push({
name: 'Update deprecated methods',
codemod: async () => {
const replacements = {
'req.param()': 'req.params, req.body, or req.query',
'res.send(status)': 'res.sendStatus(status)',
'res.redirect(url, status)': 'res.redirect(status, url)'
};
const files = await this.findFiles('**/routes/**/*.js');
for (const file of files) {
let content = await fs.readFile(file, 'utf8');
let modified = false;
for (const [old, replacement] of Object.entries(replacements)) {
if (content.includes(old)) {
console.log(`Warning: ${file} uses ${old}, replace with ${replacement}`);
modified = true;
}
}
if (modified) {
// Add TODO comments
content = '// TODO: Update for Express 5 compatibility\n' + content;
await fs.writeFile(file, content);
}
}
}
});
return steps;
}
// Webpack 4 to 5 Migration
async migrateWebpack4to5() {
const steps = [];
// Step 1: Update webpack and related packages
steps.push({
name: 'Update Webpack packages',
command: `
npm install webpack@^5.0.0 webpack-cli@^4.0.0 webpack-dev-server@^4.0.0
npm install html-webpack-plugin@^5.0.0 mini-css-extract-plugin@^2.0.0
`
});
// Step 2: Update webpack configuration
steps.push({
name: 'Update webpack.config.js',
codemod: async () => {
const configPath = path.join(this.projectPath, 'webpack.config.js');
let config = await fs.readFile(configPath, 'utf8');
// Update node polyfills
if (config.includes('node:')) {
config = config.replace(
/node:\s*{[^}]+}/,
`node: false // Node polyfills removed in Webpack 5`
);
// Add fallbacks if needed
config = config.replace(
/module\.exports\s*=\s*{/,
`module.exports = {
resolve: {
fallback: {
"path": require.resolve("path-browserify"),
"fs": false,
"crypto": require.resolve("crypto-browserify"),
"stream": require.resolve("stream-browserify"),
"buffer": require.resolve("buffer/")
}
},`
);
}
// Update asset modules
config = config.replace(
/file-loader|url-loader/g,
'asset/resource'
);
// Update optimization
if (!config.includes('optimization')) {
config = config.replace(
/module\.exports\s*=\s*{/,
`module.exports = {
optimization: {
usedExports: true,
sideEffects: false,
splitChunks: {
chunks: 'all',
cacheGroups: {
vendor: {
test: /[\\/]node_modules[\\/]/,
name: 'vendors',
priority: 10
}
}
}
},`
);
}
await fs.writeFile(configPath, config);
}
});
// Step 3: Update imports
steps.push({
name: 'Update dynamic imports',
codemod: async () => {
const files = await this.findFiles('**/*.{js,jsx,ts,tsx}');
for (const file of files) {
let content = await fs.readFile(file, 'utf8');
// Update magic comments
content = content.replace(
/\/\*\s*webpackChunkName:\s*"([^"]+)"\s*\*\//g,
'/* webpackChunkName: "$1", webpackPrefetch: true */'
);
await fs.writeFile(file, content);
}
}
});
return steps;
}
async findFiles(pattern) {
const glob = require('glob');
return new Promise((resolve, reject) => {
glob(pattern, { cwd: this.projectPath }, (err, files) => {
if (err) reject(err);
else resolve(files.map(f => path.join(this.projectPath, f)));
});
});
}
}
```
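
Each migrate* method returns an array of steps with command, codemod, validation, or notes fields, but nothing above executes them. A minimal driver, assumed here rather than taken from the lesson, might look like:

```javascript
// Hypothetical runner for the framework migration steps above
const { execSync } = require('child_process');

async function runFrameworkSteps(steps) {
  for (const step of steps) {
    console.log(`\n=== ${step.name} ===`);

    // Shell commands, codemods, and notes are all optional per step
    if (step.command) execSync(step.command, { stdio: 'inherit' });
    if (step.codemod) await step.codemod();
    if (step.notes) console.log(step.notes);

    if (step.validation && !(await step.validation())) {
      throw new Error(`Validation failed for step: ${step.name}`);
    }
  }
}

// Example:
// const migration = new FrameworkMigration(process.cwd());
// await runFrameworkSteps(await migration.migrateReact17to18());
```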

Wrap the migration in a disciplined Git workflow:

> Create a Git workflow for safe migrations:
> - Feature branches for each migration step
> - Automated testing at each stage
> - Rollback procedures
> - Gradual rollout strategy
migration-git-workflow.js

```javascript
const simpleGit = require('simple-git');
const fs = require('fs').promises;
const path = require('path');

class MigrationGitWorkflow {
constructor(repoPath) {
this.repoPath = repoPath;
this.git = simpleGit(repoPath);
}
async createMigrationWorkflow(migrationName) {
this.migrationName = migrationName;
const workflow = {
branches: [],
pullRequests: [],
rollbackPlan: []
};
// Create main migration branch
const mainBranch = `migration/${migrationName}`;
await this.git.checkoutBranch(mainBranch, 'main');
workflow.branches.push(mainBranch);
// Create step branches
const steps = [
'schema-changes',
'compatibility-layer',
'data-migration',
'code-updates',
'cleanup'
];
for (const step of steps) {
const stepBranch = `${mainBranch}/${step}`;
await this.git.checkoutBranch(stepBranch, mainBranch);
workflow.branches.push(stepBranch);
// Create PR template
workflow.pullRequests.push({
branch: stepBranch,
target: mainBranch,
title: `[${migrationName}] ${step}`,
template: this.generatePRTemplate(step),
checks: this.getRequiredChecks(step)
});
}
// Create rollback branches
for (const step of steps) {
const rollbackBranch = `${mainBranch}/rollback-${step}`;
workflow.rollbackPlan.push({
step,
branch: rollbackBranch,
procedure: this.generateRollbackProcedure(step)
});
}
return workflow;
}
generatePRTemplate(step) {
return `
## Migration Step: ${step}
### Changes
- [ ] List all changes made in this step
### Testing
- [ ] Unit tests pass
- [ ] Integration tests pass
- [ ] Migration tested on staging
- [ ] Rollback tested
### Verification
- [ ] Data integrity verified
- [ ] Performance impact measured
- [ ] No breaking changes for running systems
### Rollback Plan
Describe how to rollback this specific step if needed
### Dependencies
- Previous steps that must be completed
- Systems that need to be notified
### Monitoring
- Metrics to watch during deployment
- Alerts configured
`;
}
getRequiredChecks(step) {
const baseChecks = [
'ci/tests',
'ci/lint',
'ci/security'
];
const stepSpecificChecks = {
'schema-changes': ['db/migration-test', 'db/compatibility'],
'data-migration': ['data/integrity', 'data/performance'],
'code-updates': ['app/integration', 'app/smoke-tests']
};
return [
...baseChecks,
...(stepSpecificChecks[step] || [])
];
}
generateRollbackProcedure(step) {
const procedures = {
'schema-changes': `
1. Run down migration: npm run migrate:down -- --step=schema-changes
2. Deploy previous version of database schema
3. Verify all applications can connect
4. Check data integrity
`,
'data-migration': `
1. Stop data migration process
2. Restore from backup taken before migration
3. Or run reverse data migration script
4. Verify data consistency
`,
'code-updates': `
1. Revert code deployment
2. Clear caches
3. Restart services
4. Verify functionality
`
};
return procedures[step] || 'Standard rollback procedure';
}
async createMigrationTests() {
const testFile = `
// migration.test.js
const { MigrationRunner } = require('./migration-runner');
describe('Migration: ${this.migrationName}', () => {
let runner;
let testDb;
beforeAll(async () => {
testDb = await createTestDatabase();
runner = new MigrationRunner(testDb);
});
afterAll(async () => {
await testDb.close();
});
describe('Forward Migration', () => {
test('should complete successfully', async () => {
const result = await runner.migrate('up');
expect(result.success).toBe(true);
expect(result.errors).toHaveLength(0);
});
test('should maintain data integrity', async () => {
const before = await testDb.query('SELECT COUNT(*) FROM users');
await runner.migrate('up');
const after = await testDb.query('SELECT COUNT(*) FROM users');
expect(after.rows[0].count).toBe(before.rows[0].count);
});
test('should be idempotent', async () => {
await runner.migrate('up');
const result = await runner.migrate('up');
expect(result.alreadyMigrated).toBe(true);
});
});
describe('Rollback', () => {
test('should rollback successfully', async () => {
await runner.migrate('up');
const result = await runner.migrate('down');
expect(result.success).toBe(true);
});
test('should restore original state', async () => {
const snapshot = await createSnapshot(testDb);
await runner.migrate('up');
await runner.migrate('down');
const restored = await createSnapshot(testDb);
expect(restored).toEqual(snapshot);
});
});
describe('Performance', () => {
test('should complete within time limit', async () => {
const start = Date.now();
await runner.migrate('up');
const duration = Date.now() - start;
expect(duration).toBeLessThan(300000); // 5 minutes
});
test('should not lock tables excessively', async () => {
const locks = await monitorLocks(async () => {
await runner.migrate('up');
});
expect(locks.maxDuration).toBeLessThan(1000); // 1 second
});
});
});
`;
await fs.writeFile(
path.join(this.repoPath, 'tests', 'migration.test.js'),
testFile
);
}
}
```
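
Assuming a repository with a main branch and the simple-git package installed, wiring the workflow up is straightforward; opening the pull requests themselves would still go through your Git host's API or CLI:

```javascript
// Hypothetical usage of MigrationGitWorkflow
async function planMigrationBranches() {
  const workflow = new MigrationGitWorkflow('/path/to/repo');
  const plan = await workflow.createMigrationWorkflow('add-user-roles');

  console.log('Branches created:', plan.branches);

  // Each entry describes a PR to open: branch, target, title, template, and required checks
  for (const pr of plan.pullRequests) {
    console.log(`PR: ${pr.title} (${pr.branch} -> ${pr.target})`);
  }
}
```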

Finally, instrument the migration itself so you can watch it run:

> Create comprehensive monitoring for migrations:
> - Pre-migration validation
> - Real-time progress tracking
> - Post-migration verification
> - Performance impact analysis
migration-monitor.js

```javascript
const fs = require('fs').promises;

class MigrationMonitor {
constructor(config) {
this.config = config;
this.metrics = {
startTime: null,
endTime: null,
rowsProcessed: 0,
errors: [],
warnings: [],
performance: []
};
}
async preMigrationChecks() {
const checks = {
database: await this.checkDatabaseHealth(),
diskSpace: await this.checkDiskSpace(),
connections: await this.checkActiveConnections(),
backup: await this.verifyBackup(),
dependencies: await this.checkDependencies()
};
const failed = Object.entries(checks)
.filter(([_, result]) => !result.passed)
.map(([name, result]) => ({ name, ...result }));
if (failed.length > 0) {
throw new Error(`Pre-migration checks failed: ${JSON.stringify(failed)}`);
}
return checks;
}
async checkDatabaseHealth() {
try {
// Check connection
await this.config.db.query('SELECT 1');
// Check replication lag
const lag = await this.config.db.query(`
SELECT
EXTRACT(EPOCH FROM (now() - pg_last_xact_replay_timestamp())) as lag_seconds
`);
const lagSeconds = lag.rows[0]?.lag_seconds || 0;
return {
passed: lagSeconds < 10,
details: { replicationLag: lagSeconds }
};
} catch (error) {
return {
passed: false,
error: error.message
};
}
}
async checkDiskSpace() {
const { execSync } = require('child_process');
const output = execSync('df -h /var/lib/postgresql').toString();
const lines = output.split('\n');
const dataLine = lines[1];
const usage = parseInt(dataLine.split(/\s+/)[4]);
return {
passed: usage < 80,
details: { diskUsagePercent: usage }
};
}
startMigration(name) {
this.metrics.startTime = Date.now();
this.metrics.name = name;
// Start monitoring
this.monitoringInterval = setInterval(() => {
this.collectMetrics();
}, 5000);
console.log(`Migration started: ${name}`);
}
async collectMetrics() {
// Collect database metrics
const stats = await this.config.db.query(`
SELECT
(SELECT count(*) FROM pg_stat_activity) as connections,
(SELECT count(*) FROM pg_stat_activity WHERE state = 'active') as active_queries,
(SELECT count(*) FROM pg_locks WHERE granted = false) as waiting_locks,
pg_database_size(current_database()) as database_size
`);
this.metrics.performance.push({
timestamp: Date.now(),
...stats.rows[0]
});
// Check for long-running queries
const longQueries = await this.config.db.query(`
SELECT
pid,
now() - pg_stat_activity.query_start AS duration,
query
FROM pg_stat_activity
WHERE (now() - pg_stat_activity.query_start) > interval '5 minutes'
AND state = 'active'
`);
if (longQueries.rows.length > 0) {
this.metrics.warnings.push({
type: 'long_running_query',
timestamp: Date.now(),
details: longQueries.rows
});
}
}
updateProgress(processed, total) {
this.metrics.rowsProcessed = processed;
this.metrics.totalRows = total;
const percentage = ((processed / total) * 100).toFixed(2);
const elapsed = (Date.now() - this.metrics.startTime) / 1000;
const rate = processed / elapsed;
const eta = (total - processed) / rate;
console.log(
`Progress: ${processed}/${total} (${percentage}%) - ` +
`Rate: ${rate.toFixed(0)}/sec - ETA: ${this.formatTime(eta)}`
);
}
recordError(error) {
this.metrics.errors.push({
timestamp: Date.now(),
message: error.message,
stack: error.stack,
context: error.context
});
console.error(`Migration error: ${error.message}`);
}
async finishMigration() {
this.metrics.endTime = Date.now();
clearInterval(this.monitoringInterval);
// Run post-migration verification
const verification = await this.postMigrationVerification();
// Generate report
const report = this.generateReport(verification);
// Save report
await this.saveReport(report);
return report;
}
async postMigrationVerification() {
const checks = {
dataIntegrity: await this.verifyDataIntegrity(),
constraints: await this.verifyConstraints(),
indexes: await this.verifyIndexes(),
performance: await this.verifyPerformance(),
application: await this.verifyApplicationHealth()
};
return checks;
}
async verifyDataIntegrity() {
// Run checksums or count validations
const checks = [];
// Example: Verify row counts match
if (this.config.validations?.rowCounts) {
for (const [table, expectedCount] of Object.entries(this.config.validations.rowCounts)) {
const result = await this.config.db.query(
`SELECT COUNT(*) as count FROM ${table}`
);
checks.push({
table,
expected: expectedCount,
actual: result.rows[0].count,
passed: result.rows[0].count === expectedCount
});
}
}
return {
passed: checks.every(c => c.passed),
details: checks
};
}
generateReport(verification) {
const duration = this.metrics.endTime - this.metrics.startTime;
return {
migration: this.metrics.name,
status: this.metrics.errors.length === 0 ? 'SUCCESS' : 'FAILED',
startTime: new Date(this.metrics.startTime).toISOString(),
endTime: new Date(this.metrics.endTime).toISOString(),
duration: this.formatTime(duration / 1000),
rowsProcessed: this.metrics.rowsProcessed,
errors: this.metrics.errors,
warnings: this.metrics.warnings,
verification,
performance: {
averageRate: this.metrics.rowsProcessed / (duration / 1000),
peakConnections: Math.max(...this.metrics.performance.map(p => p.connections)),
peakActiveQueries: Math.max(...this.metrics.performance.map(p => p.active_queries))
}
};
}
formatTime(seconds) {
const hours = Math.floor(seconds / 3600);
const minutes = Math.floor((seconds % 3600) / 60);
const secs = Math.floor(seconds % 60);
return `${hours}h ${minutes}m ${secs}s`;
}
async saveReport(report) {
const filename = `migration-report-${report.migration}-${Date.now()}.json`;
await fs.writeFile(filename, JSON.stringify(report, null, 2));
console.log(`Report saved to: ${filename}`);
}
}
// Usage example
async function runMigrationWithMonitoring() {
const monitor = new MigrationMonitor({
db: databaseConnection,
validations: {
rowCounts: {
users: 10000,
orders: 50000
}
}
});
try {
// Pre-checks
await monitor.preMigrationChecks();
// Start monitoring
monitor.startMigration('add-user-roles');
// Run migration
const migrator = new DatabaseMigrator(databaseConnection);
await migrator.runMigration((progress, total) => {
monitor.updateProgress(progress, total);
});
// Finish and verify
const report = await monitor.finishMigration();
if (report.status === 'SUCCESS') {
console.log('Migration completed successfully!');
} else {
console.error('Migration failed!', report.errors);
}
} catch (error) {
monitor.recordError(error);
const report = await monitor.finishMigration();
throw error;
}
}
```
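
Several of the check methods referenced above (verifyBackup, checkActiveConnections, checkDependencies, verifyConstraints, verifyIndexes, verifyPerformance, verifyApplicationHealth) are not shown. Each follows the same shape as checkDatabaseHealth: return { passed, details } rather than throwing. As one illustrative sketch, with an arbitrary connection ceiling:

```javascript
// Sketch of one omitted check; the 200-connection threshold is an assumption
async checkActiveConnections() {
  try {
    const result = await this.config.db.query(`
      SELECT count(*) AS active
      FROM pg_stat_activity
      WHERE state = 'active'
    `);
    const active = parseInt(result.rows[0].active, 10);
    return {
      passed: active < 200, // tune to your connection pool and workload
      details: { activeConnections: active }
    };
  } catch (error) {
    return { passed: false, error: error.message };
  }
}
```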

You’ve learned how to leverage Claude Code for comprehensive migration development and execution. The key is treating migrations as first-class code: thoroughly planned, carefully tested, and safely reversible. Build migrations that give you confidence to evolve your systems without fear.

Remember: The best migration is one you can run on Friday afternoon without losing sleep over the weekend. Use Claude Code to build migrations that are predictable, monitorable, and reversible, ensuring your data and systems evolve smoothly with your business needs.