Debugging is where AI assistance transforms from helpful to revolutionary. By combining systematic approaches with AI’s pattern recognition and code analysis capabilities, you can track down even the most elusive bugs in minutes rather than hours. This guide provides battle-tested debugging patterns for real-world scenarios.
AI changes debugging from a frustrating hunt to a systematic investigation:
Pattern Recognition
AI identifies error patterns across millions of codebases
Strategic Logging
Automatically adds optimal debug statements
Root Cause Analysis
Traces errors through complex call stacks
Fix Validation
Ensures fixes don’t introduce new bugs
The most powerful debugging pattern - let AI instrument your code intelligently.
Describe the Problem
Bug: User authentication fails intermittently
Error: "Token validation failed"
Frequency: ~30% of login attempts
Pattern: No clear correlation with user type or time
AI Adds Strategic Logging
// Cursor Agent mode
Agent: "Add debug logging to trace the authentication flow.
Focus on token generation, validation timing, and state changes"
// AI adds targeted logging
async function validateToken(token) {
  console.log('[AUTH] Token validation started', {
    tokenLength: token?.length,
    timestamp: Date.now(),
    tokenPrefix: token?.substring(0, 10) + '...'
  });

  try {
    const decoded = jwt.verify(token, SECRET);
    console.log('[AUTH] Token decoded', {
      userId: decoded.userId,
      exp: decoded.exp,
      timeUntilExpiry: decoded.exp * 1000 - Date.now()
    });

    // Check if token is about to expire
    if (decoded.exp * 1000 - Date.now() < 60000) {
      console.warn('[AUTH] Token expiring soon', {
        expiresIn: Math.floor((decoded.exp * 1000 - Date.now()) / 1000) + 's'
      });
    }

    return decoded;
  } catch (error) {
    console.error('[AUTH] Token validation failed', {
      error: error.message,
      tokenAge: token ? Date.now() - jwt.decode(token)?.iat * 1000 : null
    });
    throw error;
  }
}
# Claude analyzes and adds logging
claude "Add strategic debug logging to auth.js to diagnose
intermittent token validation failures. Focus on timing issues
and race conditions"

# Run with enhanced logging
NODE_ENV=debug npm start 2>&1 | tee debug.log

# Analyze collected logs
claude "Analyze debug.log for patterns in authentication failures"
Collect and Analyze
# Run tests to trigger the issue
npm test -- --grep "authentication" --repeat 100

# Feed logs to AI
@debug.log "Analyze these logs and identify the root cause
of intermittent failures"
AI Identifies Root Cause
Analysis: Token validation fails when:
1. Request takes >5s (token expires during validation)
2. System clock drift between services
3. Race condition when refreshing near expiry

Solution: Implement clock skew tolerance and pre-emptive refresh
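The two fixes above can be sketched as small pure helpers. This is a minimal illustration, not a library API; the names (`SKEW_MS`, `isExpiredWithSkew`, `shouldRefresh`) and thresholds are assumptions to tune for your environment:

```javascript
// Hypothetical constants - tune for your environment
const SKEW_MS = 30 * 1000;           // tolerated clock drift between services
const REFRESH_MARGIN_MS = 60 * 1000; // refresh tokens this long before expiry

// Treat a token as valid if it expired less than SKEW_MS ago,
// absorbing small clock differences between services
function isExpiredWithSkew(expSeconds, nowMs = Date.now()) {
  return expSeconds * 1000 + SKEW_MS < nowMs;
}

// Pre-emptively refresh when the token would expire during a slow request
function shouldRefresh(expSeconds, nowMs = Date.now()) {
  return expSeconds * 1000 - nowMs < REFRESH_MARGIN_MS;
}
```

Validation then calls `isExpiredWithSkew` instead of a strict `exp` comparison, and the client refreshes whenever `shouldRefresh` is true, so tokens never expire mid-request.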
Debug production issues without breaking things.
// Feed production error to AI
"Production error from Sentry:
TypeError: Cannot read property 'id' of undefined
    at UserService.processOrder (/app/services/user.js:145:23)
    at async OrderController.create (/app/controllers/order.js:67:18)
    at async /app/middleware/errorHandler.js:12:5

Analyze and provide fix that handles this edge case safely"
// AI response with defensive fix
class UserService {
  async processOrder(userId, orderData) {
    // Add defensive checks
    if (!userId) {
      throw new ValidationError('User ID required for order processing');
    }

    const user = await this.getUser(userId);
    if (!user) {
      throw new NotFoundError(`User ${userId} not found`);
    }

    // Safe property access
    const customerId = user?.stripeCustomer?.id;
    if (!customerId) {
      // Handle missing Stripe customer gracefully
      logger.warn('User missing Stripe customer', { userId });
      const customer = await this.createStripeCustomer(user);
      user.stripeCustomer = customer;
    }

    // Continue with order processing
    return this.createOrder(user, orderData);
  }
}
// Diagnose memory issues
"App memory usage grows continuously. Heap snapshot shows:
- Arrays: 2.3GB (45%)
- Strings: 1.8GB (35%)
- Objects: 1.0GB (20%)

Help identify memory leaks"
// AI identifies common patterns
class EventManager {
  constructor() {
    // ❌ LEAK: Event listeners never removed
    this.listeners = [];
  }

  // ✅ FIX: Implement cleanup
  addListener(event, callback) {
    const listener = { event, callback };
    this.listeners.push(listener);

    // Return cleanup function
    return () => {
      const index = this.listeners.indexOf(listener);
      if (index > -1) {
        this.listeners.splice(index, 1);
      }
    };
  }

  // Add lifecycle cleanup
  destroy() {
    // Drop every tracked listener so nothing keeps references alive
    this.listeners = [];
  }
}
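In use, the caller holds on to the returned cleanup function and invokes it on teardown. A self-contained sketch of the pattern (the class here is a stripped-down restatement, and the event name is invented for illustration):

```javascript
// Stripped-down sketch of the cleanup-function pattern
class Subscriptions {
  constructor() { this.listeners = []; }

  addListener(event, callback) {
    const listener = { event, callback };
    this.listeners.push(listener);
    // Return an unsubscribe function instead of leaking the entry
    return () => {
      const index = this.listeners.indexOf(listener);
      if (index > -1) this.listeners.splice(index, 1);
    };
  }
}

const subs = new Subscriptions();
// Keep the cleanup function when subscribing...
const cleanup = subs.addListener('user:login', () => {});
// ...and call it on unmount, so the listener array cannot grow forever
cleanup();
```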
Identify and fix timing-related bugs.
// Describe the symptoms
"Users report duplicate charges when rapidly clicking submit.
Happens ~5% of the time under load"
// AI creates race condition test
describe('Payment Processing Race Conditions', () => {
  it('should handle concurrent submissions', async () => {
    const userId = 'test-user';
    const paymentData = { amount: 100, currency: 'USD' };

    // Simulate rapid clicks
    const promises = Array(5).fill(null).map(() =>
      processPayment(userId, paymentData)
    );

    const results = await Promise.allSettled(promises);

    // Only one should succeed
    const successful = results.filter(r => r.status === 'fulfilled');
    expect(successful).toHaveLength(1);

    // Others should be rejected with idempotency error
    const rejected = results.filter(r => r.status === 'rejected');
    expect(rejected).toHaveLength(4);
    rejected.forEach(r => {
      expect(r.reason.message).toContain('Payment already processing');
    });
  });
});
// AI suggests idempotency solution
class PaymentService {
  constructor() {
    this.processingPayments = new Map();
  }

  async processPayment(userId, paymentData) {
    const idempotencyKey = `${userId}-${Date.now()}`;

    // Check if already processing
    if (this.processingPayments.has(userId)) {
      throw new ConflictError('Payment already processing');
    }

    // Mark as processing
    this.processingPayments.set(userId, idempotencyKey);

    try {
      // Process payment
      const result = await this.chargeCard(paymentData);
      return result;
    } finally {
      // Always cleanup
      this.processingPayments.delete(userId);
    }
  }
}
AI-Powered Performance Analysis
// Profile slow endpoint
"POST /api/reports/generate takes 45s on average.
Profile and optimize this endpoint"
// AI adds detailed timing
async function generateReport(params) {
  const metrics = new PerformanceMetrics('report-generation');

  try {
    // Time each phase
    metrics.start('data-fetch');
    const data = await fetchReportData(params);
    metrics.end('data-fetch');

    metrics.start('processing');
    const processed = await processData(data);
    metrics.end('processing');

    metrics.start('rendering');
    const report = await renderReport(processed);
    metrics.end('rendering');

    // Log performance breakdown
    console.log('Performance breakdown:', metrics.getSummary());

    return report;
  } catch (error) {
    metrics.recordError(error);
    throw error;
  }
}
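`PerformanceMetrics` is not a built-in; any small timing helper with `start`/`end`/`getSummary`/`recordError` works. A minimal sketch under those assumptions:

```javascript
// Minimal sketch of the PerformanceMetrics helper assumed above
class PerformanceMetrics {
  constructor(name) {
    this.name = name;
    this.timings = {};  // phase -> { start, durationMs }
    this.errors = [];
  }

  start(phase) {
    this.timings[phase] = { start: performance.now() };
  }

  end(phase) {
    const t = this.timings[phase];
    if (t) t.durationMs = performance.now() - t.start;
  }

  recordError(error) {
    this.errors.push({ message: error.message, at: Date.now() });
  }

  getSummary() {
    // phase -> "123ms", ready for structured logging
    return Object.fromEntries(
      Object.entries(this.timings)
        .map(([phase, t]) => [phase, `${Math.round(t.durationMs)}ms`])
    );
  }
}
```

In production you would typically ship these numbers to your metrics backend instead of `console.log`, but the shape of the instrumentation is the same.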
// AI identifies bottleneck
"Analysis:
- Data fetch: 2s (OK)
- Processing: 40s (PROBLEM - N+1 queries)
- Rendering: 3s (OK)

Solution: Batch database queries in processData()"
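The N+1 fix almost always has the same shape: replace one query per item with a single batched lookup, then join in memory. A sketch with a hypothetical `db.findUsersByIds` (your ORM's batch method will differ):

```javascript
// ❌ N+1: one database round-trip per order
async function processDataSlow(orders, db) {
  const results = [];
  for (const order of orders) {
    const user = await db.findUserById(order.userId); // N round-trips
    results.push({ ...order, userName: user.name });
  }
  return results;
}

// ✅ Batched: one round-trip for all users, then an in-memory join
async function processData(orders, db) {
  const userIds = [...new Set(orders.map(o => o.userId))];
  const users = await db.findUsersByIds(userIds);     // 1 round-trip
  const byId = new Map(users.map(u => [u.id, u]));
  return orders.map(o => ({ ...o, userName: byId.get(o.userId).name }));
}
```

With SQL this is a `WHERE id IN (...)` query; most ORMs expose an equivalent batch finder.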
Debug issues spanning multiple services.
Trace Request Flow
// Add correlation IDs
"Add distributed tracing to track request flow:
Frontend → API Gateway → User Service → Payment Service → Database"
Correlate Logs
# Gather logs from all services
kubectl logs -l app=user-service --since=1h > user.log
kubectl logs -l app=payment-service --since=1h > payment.log

# AI correlates by request ID
claude "Correlate these logs by requestId and trace the
failed payment flow for request-id: abc-123"
Identify Service Issues
Timeline for request abc-123:
- 10:15:23.100 - Frontend: Payment initiated
- 10:15:23.150 - API Gateway: Request received
- 10:15:23.200 - User Service: User validated
- 10:15:23.250 - Payment Service: Processing started
- 10:15:28.250 - Payment Service: Timeout waiting for DB
- 10:15:28.300 - All services: Cascade failure

Root cause: Database connection pool exhausted
For bugs that disappear when you try to observe them.
// Non-invasive debugging
"Add minimal logging that won't change timing behavior
for this race condition in WebSocket handling"
// AI suggests async logging
class AsyncDebugger {
  constructor() {
    this.buffer = [];
    this.flushInterval = setInterval(() => this.flush(), 1000);
  }

  log(message, data) {
    // Non-blocking push to buffer
    this.buffer.push({
      timestamp: process.hrtime.bigint(),
      message,
      data: JSON.stringify(data)
    });
  }

  flush() {
    if (this.buffer.length === 0) return;

    // Async write to avoid blocking
    const logs = this.buffer.splice(0);
    process.nextTick(() => {
      logs.forEach(log => console.log(log));
    });
  }
}
Track down memory leaks systematically.
// Memory leak detection workflow
"Help me find memory leaks in our Express app.
Memory grows by 100MB/hour"
// AI creates memory profiling setup
class MemoryProfiler {
  constructor() {
    this.baseline = process.memoryUsage();
    this.snapshots = [];
  }

  captureSnapshot(label) {
    const current = process.memoryUsage();
    const diff = {
      heapUsed: current.heapUsed - this.baseline.heapUsed,
      external: current.external - this.baseline.external,
      arrayBuffers: current.arrayBuffers - this.baseline.arrayBuffers
    };

    this.snapshots.push({
      label,
      timestamp: Date.now(),
      memory: current,
      diff
    });

    // Alert on suspicious growth
    if (diff.heapUsed > 50 * 1024 * 1024) { // 50MB
      console.warn('Memory leak suspected:', {
        label,
        growth: `${Math.round(diff.heapUsed / 1024 / 1024)}MB`
      });
    }
  }

  findLeaks() {
    // Analyze growth patterns
    const growth = this.snapshots.map((s, i) => {
      if (i === 0) return null;
      return {
        label: s.label,
        heapGrowth: s.memory.heapUsed - this.snapshots[i - 1].memory.heapUsed
      };
    }).filter(Boolean);

    return growth.filter(g => g.heapGrowth > 10 * 1024 * 1024);
  }
}
Reproduce First
Always create a minimal reproduction before fixing
Test the Fix
Write a test that fails without the fix, passes with it
Document Findings
Create runbooks for similar issues in the future
Monitor Recurrence
Add alerts to catch if the issue returns
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug with AI Assistance",
      "type": "node",
      "request": "launch",
      "program": "${workspaceFolder}/app.js",
      "env": {
        "DEBUG": "*",
        "LOG_LEVEL": "trace"
      },
      "outputCapture": "std",
      "skipFiles": ["<node_internals>/**"]
    }
  ]
}
// Sentry + AI debugging
Sentry.init({
  beforeSend(event, hint) {
    if (event.level === 'error') {
      // Send complex errors to AI for analysis
      analyzeWithAI({
        error: event,
        context: hint.originalException,
        breadcrumbs: hint.breadcrumbs
      });
    }
    return event;
  }
});
Master debugging with: