Performance Analysis
Performance is the difference between software that delights and software that frustrates. Whether you’re optimizing response times, reducing memory usage, or scaling to millions of users, Claude Code transforms performance optimization from guesswork into data-driven engineering. This lesson explores how to leverage AI assistance for comprehensive performance analysis and optimization.
The Performance Revolution
Scenario: Your application was fast with 100 users. Now with 10,000 users, it’s crawling. Database queries time out, APIs respond slowly, and users are complaining. Traditional approach: throw more servers at it and hope. With Claude Code: systematic analysis that finds and fixes the real bottlenecks.
Traditional vs AI-Powered Performance Optimization
The traditional approach:

Week 1: Complaints Start
- "The app is slow"
- Add more servers
- Still slow

Week 2: Random Optimization
- Cache everything
- Add indexes everywhere
- Break something else

Week 3: Profiling Attempt
- Complex profiler setup
- Overwhelming data
- Not sure what to fix

Week 4: Consultants
- Expensive analysis
- Generic recommendations
- Marginal improvements

The Claude Code approach:

Day 1: Systematic Analysis
> Profile application behavior
> Identify bottlenecks
> Measure baseline metrics

Day 2: Targeted Optimization
> Fix identified bottlenecks
> Optimize critical paths
> Validate improvements

Day 3: Scale Testing
> Load test optimizations
> Fine-tune based on data
> Deploy with confidence

Ongoing: Continuous Optimization
> Monitor performance
> Proactive optimization
> Prevent degradation
Performance Profiling
Comprehensive Application Profiling
> Set up comprehensive performance profiling for our application:
> - CPU profiling
> - Memory profiling
> - Database query analysis
> - Network request timing
> - Rendering performance
> Create visualization dashboard
Claude creates a complete profiling setup:
Application Performance Monitoring
performance-monitor.js

const { performance, PerformanceObserver } = require('perf_hooks');
const v8 = require('v8');
const os = require('os');

class PerformanceMonitor {
  constructor() {
    this.metrics = {
      requests: new Map(),
      database: new Map(),
      cache: new Map(),
      cpu: [],
      memory: []
    };
    this.startMonitoring();
  }

  startMonitoring() {
    // Request timing
    this.requestObserver = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        if (entry.name.startsWith('request-')) {
          this.recordRequest(entry);
        }
      }
    });
    this.requestObserver.observe({ entryTypes: ['measure'] });

    // System metrics
    this.systemInterval = setInterval(() => {
      this.recordSystemMetrics();
    }, 1000);

    // V8 heap statistics
    this.heapInterval = setInterval(() => {
      this.recordHeapStatistics();
    }, 5000);
  }

  recordRequest(entry) {
    const route = entry.name.replace('request-', '');
    if (!this.metrics.requests.has(route)) {
      this.metrics.requests.set(route, {
        count: 0, totalTime: 0, minTime: Infinity, maxTime: 0,
        p50: 0, p95: 0, p99: 0, times: []
      });
    }

    const stats = this.metrics.requests.get(route);
    stats.count++;
    stats.totalTime += entry.duration;
    stats.minTime = Math.min(stats.minTime, entry.duration);
    stats.maxTime = Math.max(stats.maxTime, entry.duration);
    stats.times.push(entry.duration);

    // Calculate percentiles
    if (stats.times.length > 100) {
      const sorted = [...stats.times].sort((a, b) => a - b);
      stats.p50 = sorted[Math.floor(sorted.length * 0.5)];
      stats.p95 = sorted[Math.floor(sorted.length * 0.95)];
      stats.p99 = sorted[Math.floor(sorted.length * 0.99)];
      // Keep only recent 1000 samples
      stats.times = stats.times.slice(-1000);
    }
  }

  recordSystemMetrics() {
    const cpus = os.cpus();
    const totalIdle = cpus.reduce((acc, cpu) => acc + cpu.times.idle, 0);
    const totalTick = cpus.reduce((acc, cpu) =>
      acc + Object.values(cpu.times).reduce((a, b) => a + b), 0);
    const cpuUsage = 100 - ~~(100 * totalIdle / totalTick);

    this.metrics.cpu.push({ timestamp: Date.now(), usage: cpuUsage, loadAvg: os.loadavg() });

    // Memory usage
    const memUsage = process.memoryUsage();
    this.metrics.memory.push({
      timestamp: Date.now(),
      rss: memUsage.rss,
      heapTotal: memUsage.heapTotal,
      heapUsed: memUsage.heapUsed,
      external: memUsage.external,
      arrayBuffers: memUsage.arrayBuffers
    });

    // Keep only last hour of data
    const hourAgo = Date.now() - 3600000;
    this.metrics.cpu = this.metrics.cpu.filter(m => m.timestamp > hourAgo);
    this.metrics.memory = this.metrics.memory.filter(m => m.timestamp > hourAgo);
  }

  recordHeapStatistics() {
    const heapStats = v8.getHeapStatistics();
    const heapSpaces = v8.getHeapSpaceStatistics();

    // Detect memory leaks
    if (heapStats.used_heap_size > heapStats.heap_size_limit * 0.9) {
      console.warn('WARNING: Heap usage above 90%');
      this.takeHeapSnapshot();
    }
  }

  async takeHeapSnapshot() {
    const { writeHeapSnapshot } = require('v8');
    const filename = `heap-${Date.now()}.heapsnapshot`;
    writeHeapSnapshot(filename);
    console.log(`Heap snapshot written to ${filename}`);
  }

  getReport() {
    const report = {
      timestamp: new Date().toISOString(),
      requests: {},
      system: {
        cpu: this.calculateStats(this.metrics.cpu.map(m => m.usage)),
        memory: this.calculateStats(this.metrics.memory.map(m => m.heapUsed))
      }
    };

    // Convert request metrics to report format
    for (const [route, stats] of this.metrics.requests) {
      report.requests[route] = {
        count: stats.count,
        avgTime: stats.totalTime / stats.count,
        minTime: stats.minTime,
        maxTime: stats.maxTime,
        p50: stats.p50,
        p95: stats.p95,
        p99: stats.p99
      };
    }

    return report;
  }

  calculateStats(values) {
    if (values.length === 0) return {};
    const sorted = [...values].sort((a, b) => a - b);
    return {
      min: sorted[0],
      max: sorted[sorted.length - 1],
      avg: values.reduce((a, b) => a + b) / values.length,
      p50: sorted[Math.floor(sorted.length * 0.5)],
      p95: sorted[Math.floor(sorted.length * 0.95)],
      p99: sorted[Math.floor(sorted.length * 0.99)]
    };
  }
}

// Express middleware for request timing
function performanceMiddleware(monitor) {
  return (req, res, next) => {
    const start = performance.now();
    const route = req.route?.path || req.path;

    res.on('finish', () => {
      const duration = performance.now() - start;
      performance.measure(`request-${route}`, { start, duration });
    });

    next();
  };
}

module.exports = { PerformanceMonitor, performanceMiddleware };
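To wire the monitor into a service, the middleware and report can hang off an existing Express app. A minimal sketch, assuming Express is already in the project; the /health/performance route and port are illustrative:

const express = require('express');
const { PerformanceMonitor, performanceMiddleware } = require('./performance-monitor');

const app = express();
const monitor = new PerformanceMonitor();

// Time every request so the monitor can aggregate per-route percentiles
app.use(performanceMiddleware(monitor));

// Expose the aggregated report for dashboards or alerting
app.get('/health/performance', (req, res) => {
  res.json(monitor.getReport());
});

app.listen(3000);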
Database Query Profiling
db-profiler.js

class DatabaseProfiler {
  constructor(db) {
    this.db = db;
    this.queries = new Map();
    this.slowQueryThreshold = 100; // ms
    this.wrapDatabase();
  }

  wrapDatabase() {
    const originalQuery = this.db.query.bind(this.db);

    this.db.query = async (sql, params) => {
      const start = performance.now();
      const stack = new Error().stack;

      try {
        const result = await originalQuery(sql, params);
        const duration = performance.now() - start;
        this.recordQuery(sql, duration, true, stack);

        if (duration > this.slowQueryThreshold) {
          this.handleSlowQuery(sql, duration, params);
        }

        return result;
      } catch (error) {
        const duration = performance.now() - start;
        this.recordQuery(sql, duration, false, stack);
        throw error;
      }
    };
  }

  recordQuery(sql, duration, success, stack) {
    const queryKey = this.normalizeQuery(sql);

    if (!this.queries.has(queryKey)) {
      this.queries.set(queryKey, {
        query: queryKey, count: 0, totalTime: 0, minTime: Infinity,
        maxTime: 0, failures: 0, locations: new Set()
      });
    }

    const stats = this.queries.get(queryKey);
    stats.count++;
    stats.totalTime += duration;
    stats.minTime = Math.min(stats.minTime, duration);
    stats.maxTime = Math.max(stats.maxTime, duration);
    if (!success) stats.failures++;

    // Extract calling location
    const caller = this.extractCaller(stack);
    if (caller) stats.locations.add(caller);
  }

  normalizeQuery(sql) {
    // Remove specific values to group similar queries
    return sql
      .replace(/\b\d+\b/g, '?')
      .replace(/'[^']*'/g, '?')
      .replace(/\s+/g, ' ')
      .trim();
  }

  extractCaller(stack) {
    const lines = stack.split('\n');
    // Find first non-framework line
    for (const line of lines.slice(2)) {
      if (!line.includes('node_modules') && line.includes('.js')) {
        const match = line.match(/at .* \((.*:\d+:\d+)\)/);
        return match ? match[1] : null;
      }
    }
    return null;
  }

  async handleSlowQuery(sql, duration, params) {
    console.warn(`Slow query detected (${duration.toFixed(2)}ms):`);
    console.warn(sql);

    // Get query execution plan
    try {
      const explainResult = await this.db.query(`EXPLAIN ANALYZE ${sql}`, params);
      console.warn('Execution plan:', explainResult.rows);

      // Check for missing indexes
      const indexSuggestions = this.suggestIndexes(explainResult.rows);
      if (indexSuggestions.length > 0) {
        console.warn('Suggested indexes:', indexSuggestions);
      }
    } catch (error) {
      // EXPLAIN might fail for some queries
    }
  }

  suggestIndexes(explainRows) {
    const suggestions = [];

    for (const row of explainRows) {
      const plan = row['QUERY PLAN'] || row.plan || '';

      // Look for sequential scans on large tables
      if (plan.includes('Seq Scan') && plan.includes('rows=')) {
        const rowMatch = plan.match(/rows=(\d+)/);
        if (rowMatch && parseInt(rowMatch[1]) > 1000) {
          const tableMatch = plan.match(/on (\w+)/);
          if (tableMatch) {
            suggestions.push(`Consider index on ${tableMatch[1]}`);
          }
        }
      }

      // Look for expensive sorts
      if (plan.includes('Sort') && plan.includes('Sort Method: external')) {
        suggestions.push('Consider index for ORDER BY columns');
      }
    }

    return [...new Set(suggestions)];
  }

  getSlowQueries(limit = 10) {
    const queries = Array.from(this.queries.values())
      .filter(q => q.maxTime > this.slowQueryThreshold)
      .sort((a, b) => b.totalTime - a.totalTime)
      .slice(0, limit);

    return queries.map(q => ({
      query: q.query,
      count: q.count,
      avgTime: q.totalTime / q.count,
      maxTime: q.maxTime,
      totalTime: q.totalTime,
      locations: Array.from(q.locations)
    }));
  }
}

module.exports = DatabaseProfiler;
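The profiler attaches by wrapping the application's database handle at startup. A sketch assuming a node-postgres Pool; any client exposing an async query(sql, params) method would work the same way:

const { Pool } = require('pg');
const DatabaseProfiler = require('./db-profiler');

const db = new Pool({ connectionString: process.env.DATABASE_URL });
const dbProfiler = new DatabaseProfiler(db); // wraps db.query with timing and slow-query logging

// Later, e.g. from a diagnostics endpoint or a scheduled job:
console.table(dbProfiler.getSlowQueries(10));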
Memory Leak Detection
memory-leak-detector.js

const { writeHeapSnapshot } = require('v8');
const fs = require('fs').promises;

class MemoryLeakDetector {
  constructor(options = {}) {
    this.threshold = options.threshold || 100 * 1024 * 1024; // 100MB
    this.checkInterval = options.checkInterval || 60000; // 1 minute
    this.samples = [];
    this.leaks = [];
    this.startMonitoring();
  }

  startMonitoring() {
    this.interval = setInterval(() => {
      this.checkMemory();
    }, this.checkInterval);

    // Take initial baseline
    this.takeBaseline();
  }

  async takeBaseline() {
    if (global.gc) {
      global.gc();
    }

    await new Promise(resolve => setTimeout(resolve, 1000));

    const baseline = process.memoryUsage();
    this.samples.push({ timestamp: Date.now(), ...baseline });
  }

  checkMemory() {
    const current = process.memoryUsage();
    const sample = { timestamp: Date.now(), ...current };
    this.samples.push(sample);

    // Keep only last hour of samples
    const hourAgo = Date.now() - 3600000;
    this.samples = this.samples.filter(s => s.timestamp > hourAgo);

    // Analyze for leaks
    const leak = this.detectLeak();
    if (leak) {
      this.handleLeak(leak);
    }
  }

  detectLeak() {
    if (this.samples.length < 5) return null;

    // Calculate growth rate
    const recent = this.samples.slice(-5);
    const old = this.samples.slice(0, 5);

    const recentAvg = recent.reduce((sum, s) => sum + s.heapUsed, 0) / recent.length;
    const oldAvg = old.reduce((sum, s) => sum + s.heapUsed, 0) / old.length;

    const growth = recentAvg - oldAvg;
    const growthRate = growth / oldAvg;

    // Detect sustained growth
    if (growth > this.threshold && growthRate > 0.2) {
      return { growth, growthRate, currentHeap: recentAvg, samples: this.samples.slice(-10) };
    }

    return null;
  }

  async handleLeak(leak) {
    console.error('Memory leak detected!', {
      growth: `${(leak.growth / 1024 / 1024).toFixed(2)} MB`,
      rate: `${(leak.growthRate * 100).toFixed(2)}%`,
      currentHeap: `${(leak.currentHeap / 1024 / 1024).toFixed(2)} MB`
    });

    this.leaks.push({ timestamp: Date.now(), ...leak });

    // Take heap snapshot
    await this.captureHeapSnapshot();

    // Analyze allocation sources
    const sources = await this.findAllocationSources();
    if (sources.length > 0) {
      console.error('Potential leak sources:', sources);
    }
  }

  async captureHeapSnapshot() {
    const filename = `leak-${Date.now()}.heapsnapshot`;
    writeHeapSnapshot(filename);
    console.log(`Heap snapshot saved: ${filename}`);

    // Compare with previous snapshot if exists
    if (this.previousSnapshot) {
      await this.compareSnapshots(this.previousSnapshot, filename);
    }

    this.previousSnapshot = filename;
  }

  async findAllocationSources() {
    // Check common leak sources
    const sources = [];

    // Event listeners
    if (process._events) {
      const eventCounts = {};
      for (const event in process._events) {
        const listeners = process._events[event];
        const count = Array.isArray(listeners) ? listeners.length : 1;
        if (count > 10) {
          sources.push(`High listener count for '${event}': ${count}`);
        }
      }
    }

    // Global variables
    const globalVars = Object.keys(global).length;
    if (globalVars > 100) {
      sources.push(`High global variable count: ${globalVars}`);
    }

    // Timers
    const activeTimers = process._getActiveHandles
      ? process._getActiveHandles().filter(h => h.constructor.name === 'Timer').length
      : 0;
    if (activeTimers > 50) {
      sources.push(`High active timer count: ${activeTimers}`);
    }

    return sources;
  }
}

module.exports = MemoryLeakDetector;
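The detector takes a cleaner baseline when the process can trigger garbage collection, so running Node with --expose-gc helps. A minimal bootstrap, with illustrative threshold and interval values:

// node --expose-gc app.js
const MemoryLeakDetector = require('./memory-leak-detector');

const leakDetector = new MemoryLeakDetector({
  threshold: 50 * 1024 * 1024, // flag 50 MB of sustained heap growth
  checkInterval: 30000         // sample memory every 30 seconds
});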
Frontend Performance Profiling
> Create frontend performance profiling:
> - React component render times
> - Bundle size analysis
> - Network waterfall
> - Core Web Vitals monitoring
class FrontendProfiler { constructor() { this.metrics = { components: new Map(), vitals: [], resources: [] };
this.initializeObservers(); }
initializeObservers() { // Performance Observer for various metrics if ('PerformanceObserver' in window) { // Largest Contentful Paint new PerformanceObserver((entryList) => { for (const entry of entryList.getEntries()) { this.metrics.vitals.push({ name: 'LCP', value: entry.renderTime || entry.loadTime, timestamp: Date.now() }); } }).observe({ entryTypes: ['largest-contentful-paint'] });
// First Input Delay new PerformanceObserver((entryList) => { for (const entry of entryList.getEntries()) { this.metrics.vitals.push({ name: 'FID', value: entry.processingStart - entry.startTime, timestamp: Date.now() }); } }).observe({ entryTypes: ['first-input'] });
// Cumulative Layout Shift let clsValue = 0; new PerformanceObserver((entryList) => { for (const entry of entryList.getEntries()) { if (!entry.hadRecentInput) { clsValue += entry.value; } } this.metrics.vitals.push({ name: 'CLS', value: clsValue, timestamp: Date.now() }); }).observe({ entryTypes: ['layout-shift'] }); } }
// React component profiling profileComponent(Component) { const WrappedComponent = React.forwardRef((props, ref) => { const renderStart = performance.now();
React.useEffect(() => { const renderEnd = performance.now(); const componentName = Component.displayName || Component.name || 'Unknown';
if (!this.metrics.components.has(componentName)) { this.metrics.components.set(componentName, { renders: 0, totalTime: 0, avgTime: 0, maxTime: 0 }); }
const stats = this.metrics.components.get(componentName); stats.renders++; stats.totalTime += renderEnd - renderStart; stats.avgTime = stats.totalTime / stats.renders; stats.maxTime = Math.max(stats.maxTime, renderEnd - renderStart); });
return <Component ref={ref} {...props} />; });
WrappedComponent.displayName = `Profiled(${Component.displayName || Component.name})`; return WrappedComponent; }
// Bundle size analysis async analyzeBundleSize() { const resources = performance.getEntriesByType('resource'); const scripts = resources.filter(r => r.name.endsWith('.js'));
const analysis = { totalSize: 0, scripts: [], largestScripts: [] };
for (const script of scripts) { const size = script.transferSize || script.encodedBodySize || 0; analysis.totalSize += size;
const scriptInfo = { name: script.name.split('/').pop(), size, duration: script.duration, compressed: script.transferSize < script.decodedBodySize };
analysis.scripts.push(scriptInfo); }
analysis.largestScripts = analysis.scripts .sort((a, b) => b.size - a.size) .slice(0, 5);
return analysis; }
// Network waterfall getNetworkWaterfall() { const resources = performance.getEntriesByType('resource');
return resources .map(resource => ({ name: resource.name, type: this.getResourceType(resource.name), startTime: resource.startTime, duration: resource.duration, size: resource.transferSize || 0, timeline: { dns: resource.domainLookupEnd - resource.domainLookupStart, tcp: resource.connectEnd - resource.connectStart, ssl: resource.secureConnectionStart > 0 ? resource.connectEnd - resource.secureConnectionStart : 0, ttfb: resource.responseStart - resource.requestStart, download: resource.responseEnd - resource.responseStart } })) .sort((a, b) => a.startTime - b.startTime); }
getResourceType(url) { if (url.match(/\.(js|mjs)$/)) return 'script'; if (url.match(/\.css$/)) return 'stylesheet'; if (url.match(/\.(jpg|jpeg|png|gif|webp|svg)$/)) return 'image'; if (url.match(/\.(woff|woff2|ttf|eot)$/)) return 'font'; if (url.match(/\.json$/)) return 'json'; return 'other'; }
  // Generate performance report
  async generateReport() {
    return {
      timestamp: Date.now(),
      vitals: this.getWebVitals(),
      components: this.getSlowComponents(),
      bundle: await this.analyzeBundleSize(),
      network: this.getNetworkWaterfall()
    };
  }
getWebVitals() { const vitals = {}; ['LCP', 'FID', 'CLS'].forEach(metric => { const values = this.metrics.vitals .filter(v => v.name === metric) .map(v => v.value);
if (values.length > 0) { vitals[metric] = { value: values[values.length - 1], rating: this.getVitalRating(metric, values[values.length - 1]) }; } });
return vitals; }
getVitalRating(metric, value) { const thresholds = { LCP: { good: 2500, poor: 4000 }, FID: { good: 100, poor: 300 }, CLS: { good: 0.1, poor: 0.25 } };
const threshold = thresholds[metric]; if (value <= threshold.good) return 'good'; if (value <= threshold.poor) return 'needs-improvement'; return 'poor'; }
getSlowComponents() { return Array.from(this.metrics.components.entries()) .map(([name, stats]) => ({ name, ...stats })) .filter(c => c.avgTime > 16) // Slower than 60fps .sort((a, b) => b.avgTime - a.avgTime) .slice(0, 10); }}
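A typical integration wraps the components worth measuring and ships the collected vitals to the backend when the page unloads. In this sketch, ProductList and the /metrics/frontend endpoint are placeholders:

const profiler = new FrontendProfiler();
const ProfiledProductList = profiler.profileComponent(ProductList); // render in place of ProductList

window.addEventListener('beforeunload', () => {
  // sendBeacon survives page unload; '/metrics/frontend' is an assumed collection endpoint
  navigator.sendBeacon('/metrics/frontend', JSON.stringify({
    vitals: profiler.getWebVitals(),
    components: profiler.getSlowComponents()
  }));
});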
Bottleneck Detection
Automated Bottleneck Analysis
> Analyze our application for performance bottlenecks:
> - Identify slow database queries
> - Find CPU-intensive operations
> - Detect memory leaks
> - Analyze network latency
> - Profile rendering performance
> Provide specific optimization recommendations
class BottleneckAnalyzer { constructor(performanceData) { this.data = performanceData; this.bottlenecks = []; }
async analyze() { // Analyze different aspects await Promise.all([ this.analyzeDatabaseQueries(), this.analyzeCPUUsage(), this.analyzeMemoryUsage(), this.analyzeNetworkLatency(), this.analyzeRenderingPerformance() ]);
// Prioritize bottlenecks by impact this.bottlenecks.sort((a, b) => b.impact - a.impact);
return this.generateReport(); }
async analyzeDatabaseQueries() { const slowQueries = this.data.database.queries .filter(q => q.avgTime > 100) // > 100ms average .sort((a, b) => b.totalTime - a.totalTime);
for (const query of slowQueries.slice(0, 5)) { const impact = (query.totalTime / this.data.totalTime) * 100;
this.bottlenecks.push({ type: 'database', severity: impact > 20 ? 'critical' : impact > 10 ? 'high' : 'medium', impact, details: { query: query.query, avgTime: query.avgTime, count: query.count, totalTime: query.totalTime }, recommendations: await this.getDatabaseRecommendations(query) }); } }
async getDatabaseRecommendations(query) { const recommendations = [];
// Check for missing indexes if (query.query.includes('WHERE') && query.avgTime > 200) { recommendations.push({ title: 'Add Index', description: 'Consider adding an index on the WHERE clause columns', example: this.generateIndexSuggestion(query.query) }); }
// Check for N+1 queries if (query.count > 100 && query.query.includes('SELECT')) { recommendations.push({ title: 'Batch Queries', description: 'This query is executed many times. Consider batching or using JOIN', example: 'Use a single query with JOIN or IN clause instead of multiple queries' }); }
// Check for full table scans if (query.plan && query.plan.includes('Seq Scan')) { recommendations.push({ title: 'Avoid Full Table Scan', description: 'Query is performing a sequential scan', example: 'Add appropriate indexes or limit the result set' }); }
return recommendations; }
generateIndexSuggestion(query) { // Extract table and column names from WHERE clause const whereMatch = query.match(/FROM\s+(\w+).*WHERE\s+(\w+)/i); if (whereMatch) { return `CREATE INDEX idx_${whereMatch[1]}_${whereMatch[2]} ON ${whereMatch[1]}(${whereMatch[2]});`; } return 'CREATE INDEX idx_table_column ON table(column);'; }
async analyzeCPUUsage() { const cpuSpikes = this.data.cpu.samples.filter(s => s.usage > 80);
if (cpuSpikes.length > 0) { const avgHighCPU = cpuSpikes.reduce((sum, s) => sum + s.usage, 0) / cpuSpikes.length;
this.bottlenecks.push({ type: 'cpu', severity: avgHighCPU > 90 ? 'critical' : 'high', impact: cpuSpikes.length / this.data.cpu.samples.length * 100, details: { avgHighCPU, spikeCount: cpuSpikes.length, duration: cpuSpikes.length * this.data.cpu.sampleInterval }, recommendations: [ { title: 'Profile CPU Usage', description: 'Use CPU profiler to identify hot functions', example: 'node --prof app.js && node --prof-process isolate-*.log' }, { title: 'Optimize Algorithms', description: 'Review algorithmic complexity in hot paths', example: 'Replace O(n²) operations with O(n log n) alternatives' }, { title: 'Add Caching', description: 'Cache computation results to avoid repeated calculations', example: 'Implement memoization for expensive functions' } ] }); } }
async analyzeMemoryUsage() { const memoryGrowth = this.calculateMemoryGrowth();
if (memoryGrowth.rate > 0.1) { // 10% growth this.bottlenecks.push({ type: 'memory', severity: memoryGrowth.rate > 0.5 ? 'critical' : 'high', impact: memoryGrowth.rate * 100, details: { growthRate: memoryGrowth.rate, totalGrowth: memoryGrowth.total, suspectedLeaks: memoryGrowth.leaks }, recommendations: [ { title: 'Fix Memory Leaks', description: 'Remove event listeners and clear references', example: `// Remove listeners when doneemitter.removeListener('event', handler);
// Clear timersclearInterval(intervalId);
// Nullify referenceslargeObject = null;` }, { title: 'Use WeakMap/WeakSet', description: 'Use weak references for caches', example: 'const cache = new WeakMap(); // Allows garbage collection' }, { title: 'Stream Large Data', description: 'Process large datasets in chunks', example: 'Use streams instead of loading entire files into memory' } ] }); } }
calculateMemoryGrowth() { const samples = this.data.memory.samples; if (samples.length < 2) return { rate: 0, total: 0, leaks: [] };
const first = samples[0]; const last = samples[samples.length - 1];
const totalGrowth = last.heapUsed - first.heapUsed; const rate = totalGrowth / first.heapUsed;
// Detect potential leak sources const leaks = []; if (this.data.memory.eventListeners > 100) { leaks.push('Excessive event listeners'); } if (this.data.memory.timers > 50) { leaks.push('Many active timers'); }
return { rate, total: totalGrowth, leaks }; }
async analyzeNetworkLatency() { const slowRequests = this.data.network.requests .filter(r => r.duration > 1000) // > 1 second .sort((a, b) => b.duration - a.duration);
if (slowRequests.length > 0) { const avgLatency = slowRequests.reduce((sum, r) => sum + r.duration, 0) / slowRequests.length;
this.bottlenecks.push({ type: 'network', severity: avgLatency > 3000 ? 'critical' : 'high', impact: slowRequests.length / this.data.network.requests.length * 100, details: { slowRequests: slowRequests.slice(0, 5), avgLatency }, recommendations: [ { title: 'Implement Caching', description: 'Cache frequently accessed resources', example: 'Add Cache-Control headers and implement service worker caching' }, { title: 'Use CDN', description: 'Serve static assets from CDN', example: 'Move images, CSS, and JS to a CDN for faster delivery' }, { title: 'Enable Compression', description: 'Compress responses with gzip/brotli', example: 'app.use(compression()); // Express compression middleware' }, { title: 'Optimize Payloads', description: 'Reduce response sizes', example: 'Use pagination, GraphQL, or sparse fieldsets' } ] }); } }
async analyzeRenderingPerformance() { if (!this.data.frontend) return;
const slowComponents = this.data.frontend.components .filter(c => c.avgRenderTime > 16) // Slower than 60fps .sort((a, b) => b.totalTime - a.totalTime);
if (slowComponents.length > 0) { this.bottlenecks.push({ type: 'rendering', severity: slowComponents[0].avgRenderTime > 50 ? 'critical' : 'high', impact: slowComponents[0].totalTime / this.data.totalTime * 100, details: { slowComponents: slowComponents.slice(0, 5) }, recommendations: [ { title: 'Optimize Re-renders', description: 'Use React.memo and useMemo', example: `const MemoizedComponent = React.memo(Component, (prevProps, nextProps) => { return prevProps.id === nextProps.id;});` }, { title: 'Virtualize Lists', description: 'Use virtual scrolling for long lists', example: 'Implement react-window or react-virtualized' }, { title: 'Code Split', description: 'Lazy load components', example: `const HeavyComponent = React.lazy(() => import('./HeavyComponent'));` } ] }); } }
generateReport() { return { summary: { totalBottlenecks: this.bottlenecks.length, criticalCount: this.bottlenecks.filter(b => b.severity === 'critical').length, estimatedImpact: this.bottlenecks.reduce((sum, b) => sum + b.impact, 0) }, bottlenecks: this.bottlenecks, actionPlan: this.generateActionPlan() }; }
generateActionPlan() { const plan = [];
// Group by type and severity const critical = this.bottlenecks.filter(b => b.severity === 'critical'); const high = this.bottlenecks.filter(b => b.severity === 'high');
if (critical.length > 0) { plan.push({ phase: 'Immediate', tasks: critical.map(b => ({ type: b.type, action: b.recommendations[0].title, impact: `${b.impact.toFixed(1)}% improvement expected` })) }); }
if (high.length > 0) { plan.push({ phase: 'Short-term', tasks: high.map(b => ({ type: b.type, action: b.recommendations[0].title, impact: `${b.impact.toFixed(1)}% improvement expected` })) }); }
return plan; }}
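The analyzer consumes the data gathered by the profilers above. A hedged sketch of running it, where cpuSamples, memorySamples, and networkSamples are hypothetical arrays shaped like the fields the analyzer reads:

// Inside an async function; the sample arrays are gathered from the profilers above.
const analyzer = new BottleneckAnalyzer({
  totalTime: 60000, // length of the profiling window in ms (illustrative)
  database: { queries: dbProfiler.getSlowQueries(50) },
  cpu: { samples: cpuSamples, sampleInterval: 1000 },
  memory: { samples: memorySamples, eventListeners: 0, timers: 0 },
  network: { requests: networkSamples },
  frontend: null // skip rendering analysis on the server
});

const report = await analyzer.analyze();
console.log(`${report.summary.totalBottlenecks} bottlenecks found, ${report.summary.criticalCount} critical`);
console.dir(report.actionPlan, { depth: null });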
Optimization Strategies
Query Optimization
> Optimize our database queries:
> - Analyze slow queries
> - Add appropriate indexes
> - Rewrite inefficient queries
> - Implement query caching
> Show before/after performance metrics
class QueryOptimizer { constructor(db) { this.db = db; }
async optimizeQueries(slowQueries) { const optimizations = [];
for (const query of slowQueries) { const optimization = await this.optimizeQuery(query); if (optimization) { optimizations.push(optimization); } }
return optimizations; }
async optimizeQuery(queryInfo) { const { query, avgTime, count } = queryInfo;
    // Analyze query structure
    const analysis = await this.analyzeQuery(query);
// Generate optimization strategies const strategies = [];
// 1. Index optimization if (analysis.missingIndexes.length > 0) { strategies.push({ type: 'index', description: 'Add missing indexes', implementation: analysis.missingIndexes.map(idx => `CREATE INDEX ${idx.name} ON ${idx.table}(${idx.columns.join(', ')});` ), estimatedImprovement: 70 }); }
// 2. Query rewriting if (analysis.inefficientPatterns.length > 0) { strategies.push({ type: 'rewrite', description: 'Rewrite inefficient query patterns', implementation: this.rewriteQuery(query, analysis.inefficientPatterns), estimatedImprovement: 50 }); }
// 3. Caching if (count > 10 && analysis.cacheable) { strategies.push({ type: 'cache', description: 'Implement query result caching', implementation: this.generateCachingCode(query), estimatedImprovement: 90 }); }
// Test optimizations const results = await this.testOptimizations(query, strategies);
return { original: { query, avgTime, count }, optimizations: results, bestStrategy: results.reduce((best, current) => current.improvement > best.improvement ? current : best ) }; }
  async analyzeQuery(query) { const analysis = { tables: [], joins: [], whereConditions: [], orderBy: [], missingIndexes: [], inefficientPatterns: [], cacheable: true };
// Extract query components const tableMatches = query.match(/FROM\s+(\w+)/gi); if (tableMatches) { analysis.tables = tableMatches.map(m => m.replace(/FROM\s+/i, '')); }
// Check for missing indexes on WHERE conditions const whereMatch = query.match(/WHERE\s+(.+?)(?:ORDER|GROUP|LIMIT|$)/i); if (whereMatch) { const conditions = whereMatch[1].split(/\s+AND\s+/i); for (const condition of conditions) { const columnMatch = condition.match(/(\w+)\s*=|(\w+)\s+IN|(\w+)\s+LIKE/i); if (columnMatch) { const column = columnMatch[1] || columnMatch[2] || columnMatch[3]; analysis.whereConditions.push(column);
// Check if index exists (simplified check) const indexExists = await this.checkIndexExists(analysis.tables[0], column); if (!indexExists) { analysis.missingIndexes.push({ name: `idx_${analysis.tables[0]}_${column}`, table: analysis.tables[0], columns: [column] }); } } } }
// Check for inefficient patterns if (query.includes('SELECT *')) { analysis.inefficientPatterns.push('select_all'); }
if (query.match(/NOT\s+IN\s*\(/i)) { analysis.inefficientPatterns.push('not_in'); }
if (query.match(/LIKE\s+'%[^']+'/i)) { analysis.inefficientPatterns.push('leading_wildcard'); }
// Check if cacheable if (query.match(/NOW\(\)|CURRENT_|RAND\(\)/i)) { analysis.cacheable = false; }
return analysis; }
async checkIndexExists(table, column) { // Simplified check - in practice, query system catalogs try { const result = await this.db.query(` SELECT 1 FROM pg_indexes WHERE tablename = $1 AND indexdef LIKE '%${column}%' `, [table]); return result.rows.length > 0; } catch { return false; } }
rewriteQuery(query, patterns) { let optimized = query;
for (const pattern of patterns) { switch (pattern) { case 'select_all': // Replace SELECT * with specific columns optimized = optimized.replace( /SELECT\s+\*/i, 'SELECT id, name, created_at' // Example columns ); break;
case 'not_in': // Replace NOT IN with LEFT JOIN optimized = optimized.replace( /WHERE\s+(\w+)\s+NOT\s+IN\s*\(([^)]+)\)/i, (match, column, subquery) => `LEFT JOIN (${subquery}) excluded ON main.${column} = excluded.${column} WHERE excluded.${column} IS NULL` ); break;
case 'leading_wildcard': // Suggest full-text search optimized = '-- Consider using full-text search instead of LIKE with leading wildcard\n' + optimized; break; } }
return optimized; }
generateCachingCode(query) { const cacheKey = this.generateCacheKey(query);
return `// Redis caching implementationconst cacheKey = '${cacheKey}';const cacheTTL = 3600; // 1 hour
async function getCachedQuery(params) { // Check cache first const cached = await redis.get(cacheKey); if (cached) { return JSON.parse(cached); }
// Execute query const result = await db.query(\`${query}\`, params);
// Cache result await redis.setex(cacheKey, cacheTTL, JSON.stringify(result.rows));
return result.rows;}
// Invalidate cache on data changesasync function invalidateCache() { await redis.del(cacheKey);}`; }
generateCacheKey(query) { // Generate a stable cache key from query const normalized = query .replace(/\s+/g, ' ') .trim() .toLowerCase();
return `query:${crypto.createHash('md5').update(normalized).digest('hex')}`; }
async testOptimizations(originalQuery, strategies) { const results = [];
for (const strategy of strategies) { try { // Test the optimization const testResult = await this.benchmarkQuery( strategy.type === 'rewrite' ? strategy.implementation : originalQuery, strategy );
results.push({ ...strategy, actualImprovement: testResult.improvement, newAvgTime: testResult.avgTime, successful: true }); } catch (error) { results.push({ ...strategy, error: error.message, successful: false }); } }
return results; }
async benchmarkQuery(query, optimization) { const iterations = 10; const times = [];
// Run query multiple times for (let i = 0; i < iterations; i++) { const start = performance.now(); await this.db.query(query); times.push(performance.now() - start); }
const avgTime = times.reduce((a, b) => a + b) / times.length; const improvement = optimization.estimatedImprovement || 0;
return { avgTime, improvement }; }}
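The optimizer pairs naturally with the database profiler from earlier: its getSlowQueries() output is exactly the shape optimizeQueries() expects. A sketch, assuming both share the same connection and run inside an async setup function:

const optimizer = new QueryOptimizer(db);

// Feed the profiler's slow-query list straight into the optimizer
const slowQueries = dbProfiler.getSlowQueries(10);
const optimizations = await optimizer.optimizeQueries(slowQueries);

for (const o of optimizations) {
  console.log(`Query: ${o.original.query}`);
  console.log(`Best strategy: ${o.bestStrategy.type} (${o.bestStrategy.description})`);
}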
Code Optimization
> Optimize our application code for performance:
> - Identify CPU-intensive functions
> - Optimize algorithms
> - Implement caching
> - Add parallelization
> - Reduce memory allocations
const os = require('os'); // used below when reporting available CPU cores

class CodeOptimizer { analyzeAndOptimize(code, profilerData) { const optimizations = [];
// 1. Algorithm optimization const algorithmOptimizations = this.optimizeAlgorithms(code, profilerData); optimizations.push(...algorithmOptimizations);
// 2. Caching opportunities const cachingOptimizations = this.addCaching(code, profilerData); optimizations.push(...cachingOptimizations);
// 3. Parallelization const parallelOptimizations = this.addParallelization(code); optimizations.push(...parallelOptimizations);
// 4. Memory optimization const memoryOptimizations = this.optimizeMemory(code); optimizations.push(...memoryOptimizations);
return optimizations; }
optimizeAlgorithms(code, profilerData) { const optimizations = [];
// Example: Replace O(n²) with O(n log n) if (code.includes('filter') && code.includes('includes')) { optimizations.push({ type: 'algorithm', pattern: 'nested array search', original: `// O(n²) complexityconst filtered = array1.filter(item => array2.includes(item.id));`, optimized: `// O(n) complexity using Setconst array2Set = new Set(array2);const filtered = array1.filter(item => array2Set.has(item.id));`, improvement: 'From O(n²) to O(n)' }); }
// Example: Optimize sorting if (code.match(/sort\([^)]*\).*sort\([^)]*\)/)) { optimizations.push({ type: 'algorithm', pattern: 'multiple sorts', original: `// Multiple sortsdata.sort((a, b) => a.date - b.date) .sort((a, b) => a.priority - b.priority);`, optimized: `// Single combined sortdata.sort((a, b) => { if (a.priority !== b.priority) { return a.priority - b.priority; } return a.date - b.date;});`, improvement: 'Reduced from 2 passes to 1' }); }
return optimizations; }
addCaching(code, profilerData) { const optimizations = [];
// Memoization for expensive functions const expensiveFunctions = profilerData.functions .filter(f => f.avgTime > 10 && f.calls > 10);
for (const func of expensiveFunctions) { optimizations.push({ type: 'caching', pattern: 'memoization', original: `function ${func.name}(input) { // Expensive computation return result;}`, optimized: `const ${func.name} = (() => { const cache = new Map();
return function(input) { const key = JSON.stringify(input); if (cache.has(key)) { return cache.get(key); }
// Expensive computation const result = computeResult(input); cache.set(key, result);
// LRU eviction if (cache.size > 1000) { const firstKey = cache.keys().next().value; cache.delete(firstKey); }
return result; };})();`, improvement: 'Avoid repeated calculations' }); }
return optimizations; }
addParallelization(code) { const optimizations = [];
// Look for sequential async operations if (code.match(/await.*\n.*await/)) { optimizations.push({ type: 'parallelization', pattern: 'sequential awaits', original: `// Sequential executionconst user = await fetchUser(id);const posts = await fetchPosts(id);const comments = await fetchComments(id);`, optimized: `// Parallel executionconst [user, posts, comments] = await Promise.all([ fetchUser(id), fetchPosts(id), fetchComments(id)]);`, improvement: '3x faster for independent operations' }); }
// Worker threads for CPU-intensive tasks if (code.includes('for') && code.includes('compute')) { optimizations.push({ type: 'parallelization', pattern: 'cpu-intensive loop', original: `// Single-threaded processingconst results = [];for (const item of largeArray) { results.push(expensiveComputation(item));}`, optimized: `// Multi-threaded processingconst { Worker } = require('worker_threads');const os = require('os');
async function parallelProcess(items) { const numWorkers = os.cpus().length; const chunkSize = Math.ceil(items.length / numWorkers); const chunks = [];
for (let i = 0; i < items.length; i += chunkSize) { chunks.push(items.slice(i, i + chunkSize)); }
const workers = chunks.map(chunk => new Promise((resolve, reject) => { const worker = new Worker('./computation-worker.js'); worker.postMessage(chunk); worker.on('message', resolve); worker.on('error', reject); }) );
const results = await Promise.all(workers); return results.flat();}`, improvement: `${os.cpus().length}x speedup for CPU-bound tasks` }); }
return optimizations; }
optimizeMemory(code) { const optimizations = [];
// Object pooling if (code.includes('new') && code.includes('loop')) { optimizations.push({ type: 'memory', pattern: 'object allocation in loop', original: `// Creates garbagefor (let i = 0; i < 1000000; i++) { const obj = new ExpensiveObject(); process(obj);}`, optimized: `// Object poolclass ObjectPool { constructor(createFn, resetFn, size = 100) { this.createFn = createFn; this.resetFn = resetFn; this.pool = []; this.available = [];
// Pre-populate pool for (let i = 0; i < size; i++) { const obj = createFn(); this.pool.push(obj); this.available.push(obj); } }
acquire() { if (this.available.length > 0) { return this.available.pop(); } return this.createFn(); }
release(obj) { this.resetFn(obj); if (this.available.length < this.pool.length) { this.available.push(obj); } }}
const objectPool = new ObjectPool( () => new ExpensiveObject(), (obj) => obj.reset());
for (let i = 0; i < 1000000; i++) { const obj = objectPool.acquire(); process(obj); objectPool.release(obj);}`, improvement: 'Reduced GC pressure and allocation overhead' }); }
// String concatenation if (code.match(/\+=.*string|str\s*\+/)) { optimizations.push({ type: 'memory', pattern: 'string concatenation', original: `// Inefficient string buildinglet result = '';for (const item of items) { result += item + ',';}`, optimized: `// Efficient string buildingconst parts = [];for (const item of items) { parts.push(item);}const result = parts.join(',');
// Or for very large stringsconst { StringDecoder } = require('string_decoder');const decoder = new StringDecoder('utf8');const buffers = [];// ... add buffersconst result = Buffer.concat(buffers).toString();`, improvement: 'Avoid intermediate string allocations' }); }
return optimizations; }}
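The code optimizer works from source text plus profiler output. A sketch under the assumption that profilerData.functions carries name, avgTime, and calls for each profiled function; the file path and function name are illustrative:

const fs = require('fs');

const sourceCode = fs.readFileSync('./src/report-service.js', 'utf8'); // path is a placeholder
const profilerData = {
  functions: [
    { name: 'buildMonthlyReport', avgTime: 42, calls: 180 } // hypothetical hot function
  ]
};

const optimizer = new CodeOptimizer();
for (const suggestion of optimizer.analyzeAndOptimize(sourceCode, profilerData)) {
  console.log(`[${suggestion.type}] ${suggestion.pattern}: ${suggestion.improvement}`);
}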
Load Testing
Automated Load Testing
> Create comprehensive load testing suite:
> - Simulate realistic user behavior
> - Test under various load conditions
> - Identify breaking points
> - Generate performance reports
const autocannon = require('autocannon');
const { Worker } = require('worker_threads');
class LoadTester { constructor(config) { this.config = { baseUrl: config.baseUrl || 'http://localhost:3000', duration: config.duration || 60, scenarios: config.scenarios || [] };
this.results = []; }
async runLoadTests() { console.log('Starting load tests...\n');
// Run different load scenarios for (const scenario of this.config.scenarios) { console.log(`Running scenario: ${scenario.name}`); const result = await this.runScenario(scenario); this.results.push(result);
// Cool down between scenarios await this.coolDown(10); }
return this.generateReport(); }
  async runScenario(scenario) {
    const instance = autocannon({
      url: this.config.baseUrl + scenario.endpoint,
      connections: scenario.connections || 10,
      pipelining: scenario.pipelining || 1,
      duration: scenario.duration || this.config.duration,
      setupClient: this.setupClient.bind(this),
      // Custom scenario logic: explicit requests win, otherwise generate them from the scenario type
      requests: scenario.requests || this.generateRequests(scenario)
    });
return new Promise((resolve) => { instance.on('done', (results) => { resolve({ scenario: scenario.name, results: this.processResults(results), metrics: this.calculateMetrics(results) }); }); }); }
setupClient(client) { // Add authentication if needed client.on('headers', (headers) => { headers['Authorization'] = 'Bearer ' + this.config.authToken; }); }
generateRequests(scenario) { const requests = [];
// Generate realistic request patterns if (scenario.type === 'user-journey') { requests.push( // Login { method: 'POST', path: '/api/auth/login', headers: { 'Content-Type': 'application/json' }, body: JSON.stringify({ email: 'test@example.com', password: 'password123' }) }, // Browse products { method: 'GET', path: '/api/products?page=1&limit=20', setupRequest: (req, context) => { req.headers['Authorization'] = `Bearer ${context.token}`; } }, // Add to cart { method: 'POST', path: '/api/cart/add', headers: { 'Content-Type': 'application/json' }, body: JSON.stringify({ productId: '123', quantity: 1 }) } ); }
return requests; }
processResults(results) { return { requests: { total: results.requests.total, average: results.requests.average, mean: results.requests.mean, stddev: results.requests.stddev, min: results.requests.min, max: results.requests.max, p99: results.requests.p99, p95: results.requests.p95, p50: results.requests.p50 }, throughput: { average: results.throughput.average, mean: results.throughput.mean, stddev: results.throughput.stddev, min: results.throughput.min, max: results.throughput.max, p99: results.throughput.p99 }, latency: { average: results.latency.average, mean: results.latency.mean, stddev: results.latency.stddev, min: results.latency.min, max: results.latency.max, p99: results.latency.p99, p95: results.latency.p95, p50: results.latency.p50 }, errors: results.errors, timeouts: results.timeouts, duration: results.duration, connections: results.connections, pipelining: results.pipelining }; }
calculateMetrics(results) { const metrics = { // Requests per second rps: results.requests.average,
// Success rate successRate: ((results.requests.total - results.errors) / results.requests.total) * 100,
// Average response time avgResponseTime: results.latency.mean,
// Percentile response times p50ResponseTime: results.latency.p50, p95ResponseTime: results.latency.p95, p99ResponseTime: results.latency.p99,
// Throughput throughputMBps: results.throughput.mean / 1024 / 1024,
// Error rate errorRate: (results.errors / results.requests.total) * 100,
// Timeout rate timeoutRate: (results.timeouts / results.requests.total) * 100 };
// Calculate SLA compliance metrics.slaCompliance = this.calculateSLACompliance(metrics);
return metrics; }
calculateSLACompliance(metrics) { const sla = { maxResponseTime: 500, // ms minSuccessRate: 99.9, // % maxErrorRate: 0.1 // % };
const compliance = { responseTime: metrics.p95ResponseTime <= sla.maxResponseTime, successRate: metrics.successRate >= sla.minSuccessRate, errorRate: metrics.errorRate <= sla.maxErrorRate, overall: true };
compliance.overall = compliance.responseTime && compliance.successRate && compliance.errorRate;
return compliance; }
async coolDown(seconds) { console.log(`Cooling down for ${seconds} seconds...`); await new Promise(resolve => setTimeout(resolve, seconds * 1000)); }
generateReport() { const report = { timestamp: new Date().toISOString(), summary: this.generateSummary(), scenarios: this.results, recommendations: this.generateRecommendations() };
// Generate visualizations report.charts = this.generateCharts();
return report; }
generateSummary() { const summary = { totalScenarios: this.results.length, overallSuccess: true, failedScenarios: [], performance: { avgRPS: 0, avgResponseTime: 0, avgSuccessRate: 0 } };
let totalRPS = 0; let totalResponseTime = 0; let totalSuccessRate = 0;
for (const result of this.results) { totalRPS += result.metrics.rps; totalResponseTime += result.metrics.avgResponseTime; totalSuccessRate += result.metrics.successRate;
if (!result.metrics.slaCompliance.overall) { summary.overallSuccess = false; summary.failedScenarios.push(result.scenario); } }
summary.performance.avgRPS = totalRPS / this.results.length; summary.performance.avgResponseTime = totalResponseTime / this.results.length; summary.performance.avgSuccessRate = totalSuccessRate / this.results.length;
return summary; }
generateRecommendations() { const recommendations = [];
for (const result of this.results) { const metrics = result.metrics;
// Response time recommendations if (metrics.p95ResponseTime > 1000) { recommendations.push({ scenario: result.scenario, issue: 'High response time', recommendation: 'Optimize slow endpoints, add caching, or scale horizontally', severity: 'high' }); }
// Error rate recommendations if (metrics.errorRate > 1) { recommendations.push({ scenario: result.scenario, issue: 'High error rate', recommendation: 'Investigate error logs, fix bugs, add retry logic', severity: 'critical' }); }
// Throughput recommendations if (metrics.rps < 100) { recommendations.push({ scenario: result.scenario, issue: 'Low throughput', recommendation: 'Increase connection pool, optimize database queries, add caching', severity: 'medium' }); } }
return recommendations; }
generateCharts() { // Generate chart data for visualization return { responseTimeChart: { type: 'line', data: this.results.map(r => ({ scenario: r.scenario, p50: r.results.latency.p50, p95: r.results.latency.p95, p99: r.results.latency.p99 })) }, throughputChart: { type: 'bar', data: this.results.map(r => ({ scenario: r.scenario, rps: r.metrics.rps })) }, errorRateChart: { type: 'bar', data: this.results.map(r => ({ scenario: r.scenario, errorRate: r.metrics.errorRate })) } }; }}
// Usage
const loadTester = new LoadTester({
  baseUrl: 'http://localhost:3000',
  duration: 60,
  scenarios: [
    { name: 'Light Load', endpoint: '/api/products', connections: 10, duration: 30 },
    { name: 'Normal Load', endpoint: '/api/products', connections: 100, duration: 60 },
    { name: 'Heavy Load', endpoint: '/api/products', connections: 500, duration: 60 },
    { name: 'Spike Test', endpoint: '/api/products', connections: 1000, duration: 30 },
    { name: 'User Journey', type: 'user-journey', connections: 50, duration: 120 }
  ]
});
loadTester.runLoadTests().then(report => {
  console.log(JSON.stringify(report, null, 2));
});
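The same report can gate a CI pipeline by failing the build when any scenario misses its SLA. A small variation on the call above, using the process exit code that most CI systems treat as failure:

loadTester.runLoadTests().then(report => {
  if (!report.summary.overallSuccess) {
    console.error('SLA violations in scenarios:', report.summary.failedScenarios.join(', '));
    process.exit(1); // fail the CI job
  }
  console.log('All load scenarios within SLA');
});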
Performance Monitoring
Continuous Performance Monitoring
> Set up continuous performance monitoring:
> - Real-time metrics collection
> - Performance budgets
> - Automated alerts
> - Trend analysis
> - Regression detection
class PerformanceDashboard { constructor() { this.metrics = { realtime: new Map(), historical: [], budgets: this.loadBudgets(), alerts: [] };
this.startMonitoring(); }
loadBudgets() { return { responseTime: { p50: 200, p95: 500, p99: 1000 }, errorRate: 0.1, // 0.1% cpu: 70, // 70% memory: 80, // 80% bundleSize: { js: 500 * 1024, // 500KB css: 100 * 1024, // 100KB total: 1024 * 1024 // 1MB }, webVitals: { LCP: 2500, FID: 100, CLS: 0.1 } }; }
startMonitoring() { // Collect metrics every minute setInterval(() => { this.collectMetrics(); this.checkBudgets(); this.detectRegressions(); }, 60000);
// Real-time monitoring this.setupRealtimeMonitoring(); }
async collectMetrics() { const timestamp = Date.now(); const metrics = { timestamp, server: await this.collectServerMetrics(), database: await this.collectDatabaseMetrics(), frontend: await this.collectFrontendMetrics(), business: await this.collectBusinessMetrics() };
this.metrics.historical.push(metrics);
// Keep only last 24 hours const dayAgo = timestamp - 24 * 60 * 60 * 1000; this.metrics.historical = this.metrics.historical .filter(m => m.timestamp > dayAgo); }
async collectServerMetrics() { // Collect from monitoring endpoints const response = await fetch('/metrics'); const data = await response.json();
return { responseTime: { p50: data.http_request_duration_p50, p95: data.http_request_duration_p95, p99: data.http_request_duration_p99 }, requestRate: data.http_requests_per_second, errorRate: data.http_error_rate, cpu: data.cpu_usage_percent, memory: data.memory_usage_percent, activeConnections: data.active_connections }; }
async collectDatabaseMetrics() { const response = await fetch('/metrics/database'); const data = await response.json();
return { queryTime: { avg: data.avg_query_time, p95: data.p95_query_time }, activeConnections: data.active_connections, slowQueries: data.slow_query_count, deadlocks: data.deadlock_count }; }
checkBudgets() { const latest = this.metrics.historical[this.metrics.historical.length - 1]; if (!latest) return;
const violations = [];
// Check response time budgets if (latest.server.responseTime.p95 > this.metrics.budgets.responseTime.p95) { violations.push({ metric: 'Response Time P95', value: latest.server.responseTime.p95, budget: this.metrics.budgets.responseTime.p95, severity: 'high' }); }
// Check error rate if (latest.server.errorRate > this.metrics.budgets.errorRate) { violations.push({ metric: 'Error Rate', value: latest.server.errorRate, budget: this.metrics.budgets.errorRate, severity: 'critical' }); }
// Check resource usage if (latest.server.cpu > this.metrics.budgets.cpu) { violations.push({ metric: 'CPU Usage', value: latest.server.cpu, budget: this.metrics.budgets.cpu, severity: 'medium' }); }
if (violations.length > 0) { this.handleBudgetViolations(violations); } }
handleBudgetViolations(violations) { for (const violation of violations) { // Create alert const alert = { id: Date.now() + Math.random(), timestamp: Date.now(), type: 'budget_violation', ...violation };
this.metrics.alerts.push(alert);
// Send notifications if (violation.severity === 'critical') { this.sendAlert(alert); } } }
detectRegressions() { if (this.metrics.historical.length < 10) return;
// Compare current performance with historical baseline const recent = this.metrics.historical.slice(-5); const baseline = this.metrics.historical.slice(-50, -10);
const recentAvg = this.calculateAverages(recent); const baselineAvg = this.calculateAverages(baseline);
// Detect significant regressions const regressions = [];
// Response time regression const responseTimeIncrease = (recentAvg.responseTime - baselineAvg.responseTime) / baselineAvg.responseTime;
if (responseTimeIncrease > 0.2) { // 20% increase regressions.push({ metric: 'Response Time', baseline: baselineAvg.responseTime, current: recentAvg.responseTime, change: `+${(responseTimeIncrease * 100).toFixed(1)}%`, severity: responseTimeIncrease > 0.5 ? 'high' : 'medium' }); }
// Error rate regression if (recentAvg.errorRate > baselineAvg.errorRate * 2) { regressions.push({ metric: 'Error Rate', baseline: baselineAvg.errorRate, current: recentAvg.errorRate, change: `+${((recentAvg.errorRate - baselineAvg.errorRate) * 100).toFixed(2)}%`, severity: 'high' }); }
if (regressions.length > 0) { this.handleRegressions(regressions); } }
calculateAverages(metrics) { const avg = { responseTime: 0, errorRate: 0, cpu: 0, memory: 0 };
for (const metric of metrics) { avg.responseTime += metric.server.responseTime.p95; avg.errorRate += metric.server.errorRate; avg.cpu += metric.server.cpu; avg.memory += metric.server.memory; }
const count = metrics.length; avg.responseTime /= count; avg.errorRate /= count; avg.cpu /= count; avg.memory /= count;
return avg; }
setupRealtimeMonitoring() { // WebSocket connection for real-time metrics const ws = new WebSocket('ws://localhost:3000/metrics/stream');
ws.on('message', (data) => { const metric = JSON.parse(data);
// Update real-time metrics this.metrics.realtime.set(metric.type, metric);
// Check for immediate issues if (metric.type === 'error_spike') { this.handleErrorSpike(metric); } }); }
generateDashboard() { const latest = this.metrics.historical[this.metrics.historical.length - 1]; const realtime = Object.fromEntries(this.metrics.realtime);
return { timestamp: Date.now(), current: { server: latest?.server || {}, database: latest?.database || {}, realtime }, trends: this.calculateTrends(), alerts: this.metrics.alerts.slice(-10), recommendations: this.generateRecommendations() }; }
calculateTrends() { if (this.metrics.historical.length < 2) return {};
const current = this.metrics.historical[this.metrics.historical.length - 1]; const hourAgo = this.metrics.historical[this.metrics.historical.length - 60] || this.metrics.historical[0];
return { responseTime: { current: current.server.responseTime.p95, hourAgo: hourAgo.server.responseTime.p95, trend: current.server.responseTime.p95 > hourAgo.server.responseTime.p95 ? 'up' : 'down' }, errorRate: { current: current.server.errorRate, hourAgo: hourAgo.server.errorRate, trend: current.server.errorRate > hourAgo.server.errorRate ? 'up' : 'down' }, traffic: { current: current.server.requestRate, hourAgo: hourAgo.server.requestRate, trend: current.server.requestRate > hourAgo.server.requestRate ? 'up' : 'down' } }; }
generateRecommendations() { const recommendations = []; const latest = this.metrics.historical[this.metrics.historical.length - 1];
if (!latest) return recommendations;
// High CPU usage if (latest.server.cpu > 80) { recommendations.push({ type: 'scaling', priority: 'high', message: 'CPU usage is high. Consider horizontal scaling or code optimization.', action: 'Scale up instances or optimize CPU-intensive operations' }); }
// Slow database queries if (latest.database.slowQueries > 10) { recommendations.push({ type: 'database', priority: 'medium', message: `${latest.database.slowQueries} slow queries detected.`, action: 'Review and optimize slow database queries' }); }
// Memory pressure if (latest.server.memory > 85) { recommendations.push({ type: 'memory', priority: 'high', message: 'Memory usage is approaching limits.', action: 'Investigate memory leaks or increase memory allocation' }); }
return recommendations; }}
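To make the dashboard consumable, its snapshot can be exposed over HTTP for a monitoring UI or an alerting job to poll. A sketch assuming an existing Express app; the route is illustrative:

const dashboard = new PerformanceDashboard();

app.get('/admin/performance', (req, res) => {
  res.json(dashboard.generateDashboard());
});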
Next Steps
You’ve learned how to leverage Claude Code for comprehensive performance analysis and optimization. The key is treating performance as a continuous process, not a one-time optimization. Build performance awareness into every line of code, every architecture decision, every deployment.
Remember: Performance is a feature. Users don’t care how clever your code is if it’s slow. Use Claude Code to find and fix bottlenecks systematically, ensuring your application delights users with its speed and responsiveness.