Code migration documentation and technical debt assessment
Code migration is often treated as a one-off project: you move systems, document what you've done, then declare victory. In reality, it's a process that reveals technical debt faster than almost anything else. As you pull apart legacy code and rebuild it elsewhere, you discover undocumented dependencies, inconsistent patterns, and architectural decisions that nobody remembers making.
The problem is that assessing and documenting this debt is tedious work. You need to track what's changing, identify problematic patterns in the old codebase, generate documentation for the new one, and somehow keep it all synchronised as migration progresses. Most teams do this manually: engineers write migration logs, paste them into Slack, someone copies them into a spreadsheet, and documentation lags weeks behind reality.
This workflow shows how to eliminate that friction entirely. When code changes trigger migrations, this system automatically assesses technical debt using burnrate, generates fresh documentation with Mintlify, validates the work with Windsurf, and keeps everything updated without human intervention. You move from scattered notes to a live, evolving migration record and debt inventory in hours, not weeks.
The Automated Workflow
This workflow uses n8n as the orchestration backbone because it handles complex data transformations well and sits happily in self-hosted or cloud environments. The flow works like this: a Git webhook detects migration commits, burnrate analyses the legacy code for debt signals, Windsurf validates the migrated code against quality standards, Mintlify generates or updates documentation, and everything feeds into a shared database for tracking.
Architecture Overview
The workflow sits between your Git repository and three specialised tools. When a developer pushes migration work, n8n receives the webhook and initiates a four-step process: it extracts the changeset from Git, passes the changed files to burnrate for technical debt analysis, sends the migrated code to Windsurf for validation, and finally uses Mintlify to generate or refresh documentation. Results feed into a database that becomes your source of truth for migration progress and debt inventory.
This is an advanced setup because it requires managing API credentials, parsing complex Git data structures, and handling scenarios where one tool's output becomes another tool's input. If you're new to this kind of orchestration, start by getting one tool working end-to-end before adding the others.
Setting Up the n8n Workflow
Create a new workflow in n8n and add a Webhook node first. This will receive POST requests from your Git provider. A typical push payload looks like this:
{
"event": "push",
"repository": "acme-core-migration",
"branch": "main",
"commits": [
{
"id": "abc123def",
"message": "Migrate auth service to new architecture",
"files": ["src/auth/legacy.ts", "src/auth/new.ts"],
"timestamp": "2024-01-15T14:32:00Z"
}
]
}
Add a Git node to fetch the actual file contents from your repository. Configure it with your repository URL and branch name. This needs authentication via SSH key or personal access token.
Repository URL: https://github.com/yourorg/repo.git
Branch: main
Authentication: Personal Access Token
The output will be a structured object containing file contents. Use a Function node to extract only the migration-related files and format them for burnrate analysis.
const commits = $input.first().json.commits;
const migrationFiles = commits.flatMap(commit =>
  commit.files.map(file => ({
    filename: file,
    commit_id: commit.id,
    commit_message: commit.message,
    timestamp: commit.timestamp
  }))
);
// Code nodes must return an array of items
return [{ json: { migrationFiles } }];
Sending Code to burnrate for Debt Analysis
burnrate is a tool that analyses code for technical debt signals: complexity, test coverage gaps, refactoring opportunities, and architectural problems. Use the HTTP node to call its API with your migrated code.
POST https://api.burnrate.io/v1/analyse
Headers:
Authorization: Bearer YOUR_BURNRATE_API_KEY
Content-Type: application/json
Body:
{
"repository": "acme-core-migration",
"branch": "main",
"files": [
{
"path": "src/auth/new.ts",
"content": "<file contents from Git>",
"language": "typescript"
}
],
"context": "migration",
"compare_against": "src/auth/legacy.ts"
}
burnrate returns a detailed report. Store the key insights in a database via the PostgreSQL node (or MySQL, depending on your setup).
INSERT INTO migration_debt_report (
repository, branch, file_path, debt_score,
complexity_rating, test_coverage, issues_found,
analysis_timestamp
) VALUES (
'acme-core-migration', 'main', 'src/auth/new.ts',
{{ $json.debt_score }},
{{ $json.complexity }},
{{ $json.test_coverage }},
'{{ JSON.stringify($json.issues) }}',
NOW()
);
Running Windsurf Code Validation
Windsurf performs static analysis and suggests improvements. It's particularly useful during migration because it can compare old and new code patterns. Call it after burnrate completes.
POST https://api.windsurf.dev/v1/validate
Headers:
X-API-Key: YOUR_WINDSURF_KEY
Content-Type: application/json
Body:
{
"files": [
{
"name": "src/auth/new.ts",
"source": "<migrated code>",
"legacy_source": "<original code>",
"language": "typescript"
}
],
"rules": ["security", "performance", "patterns"],
"migration_context": true
}
Windsurf returns suggestions. Map high-priority issues back to your team and store medium/low priority ones for later review.
const validationResult = $input.first().json;
const issues = validationResult.suggestions || [];
const highPriority = issues.filter(i => i.severity === 'high');
const lowPriority = issues.filter(i => i.severity !== 'high');
// Code nodes must return an array of items
return [{
  json: {
    blocking_issues: highPriority.length,
    warnings: lowPriority.length,
    issues: issues
  }
}];
Generating Documentation with Mintlify
Once code is validated, use Mintlify to generate or update your documentation. Mintlify can ingest code comments, extract function signatures, and build reference docs automatically.
POST https://api.mintlify.com/v1/generate
Headers:
Authorization: Bearer YOUR_MINTLIFY_API_KEY
Content-Type: application/json
Body:
{
"project": "acme-core",
"type": "code_reference",
"language": "typescript",
"source_files": [
{
"path": "src/auth/new.ts",
"content": "<migrated code>"
}
],
"metadata": {
"migration_from": "src/auth/legacy.ts",
"completion_date": "2024-01-15",
"status": "migrated"
}
}
Mintlify's response includes a documentation URL and updated file structure. Store this reference in your tracking database.
INSERT INTO migration_documentation (
file_path, doc_url, generated_at, mintlify_project_id
) VALUES (
'src/auth/new.ts',
{{ $json.doc_url }},
NOW(),
'acme-core'
);
Conditional Logic and Error Handling
Add an If node after burnrate to conditionally proceed. If debt score exceeds your threshold, send an alert instead of moving forward immediately.
Condition: {{ $json.debt_score }} > 65
If true: Send Slack notification and pause workflow
If false: Continue to Windsurf validation
For Windsurf, add another If node to check for blocking issues.
Condition: {{ $json.blocking_issues }} > 0
If true: Send alert with issue list; don't proceed to documentation
If false: Continue to Mintlify
Handle network failures gracefully around each API call. n8n has no literal Try/Catch node; instead, enable Retry On Fail in each HTTP Request node's settings and route the node's error output to a dedicated handling branch.
On the HTTP Request node:
Retry On Fail: enabled, max 3 tries with a wait between attempts
On the error output branch:
Log error to database
Send alert to Slack
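If you prefer to implement the backoff yourself in a Code node, here's a minimal exponential-backoff sketch; callBurnrate is a hypothetical stand-in for whatever HTTP call you wrap:

```javascript
// Retry an async function up to maxAttempts times, doubling the wait
// between attempts (1s, 2s, 4s by default). Rethrows the last error
// if every attempt fails.
async function withRetry(fn, maxAttempts = 3, baseDelayMs = 1000) {
  let lastError;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}

// Usage (callBurnrate is hypothetical):
// const report = await withRetry(() => callBurnrate(payload));
```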
Putting It Together in n8n
Your complete workflow structure should look like this:
- Webhook node receives Git push
- Git node fetches repository contents
- Function node extracts migration files
- HTTP node calls burnrate API
- If node checks debt score
- HTTP node calls Windsurf API
- If node checks for blocking issues
- HTTP node calls Mintlify API
- PostgreSQL node stores all results
- Slack node sends summary notification
Connect the nodes in sequence with error handlers branching off from each HTTP call. Use a final Slack notification to confirm completion and provide links to the new documentation and debt report.
The Manual Alternative
If orchestration feels like overkill for your current workflow, you can run these tools manually with a much simpler process. Clone the repository locally, run burnrate from the command line against your migrated files, then review Windsurf's suggestions in your IDE (it has VSCode and JetBrains plugins). Finally, run Mintlify's documentation generator and commit the output.
This works fine if migrations happen infrequently or if you prefer human review at each step. The downside is that you'll spend 2-3 hours per migration doing the same repetitive work: formatting code, running analysis, copying results around, and manually updating tracking spreadsheets. For teams doing multiple migrations over several months, automation pays for itself in the first week.
If you want something in between, use Zapier instead of n8n. Zapier has pre-built connectors for many tools and doesn't require self-hosting, though it has tighter rate limits on the free tier and less flexibility for complex logic.
Pro Tips
Debt Score Thresholds are Context-Dependent. A debt score of 60 might be acceptable for an internal admin tool but unacceptable for customer-facing code. Define different thresholds per service and store them in a config file that your workflow reads. This lets you adjust tolerances without rebuilding the workflow.
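A minimal sketch of such a lookup, with hypothetical service names and threshold values; the default matches the 65 used in the If node earlier:

```javascript
// Per-service debt thresholds; in practice, load these from a config
// file rather than hard-coding them. Names and values are examples.
const DEBT_THRESHOLDS = {
  'customer-checkout': 40, // customer-facing: strict
  'internal-admin': 70     // internal tooling: lenient
};
const DEFAULT_THRESHOLD = 65;

function debtThresholdFor(service) {
  return DEBT_THRESHOLDS[service] ?? DEFAULT_THRESHOLD;
}

function exceedsThreshold(service, debtScore) {
  return debtScore > debtThresholdFor(service);
}
```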
Cache Windsurf Results to Save API Calls. If the same file hasn't changed since the last run, skip validation. Use an MD5 hash of the file content and store it in your tracking database. Compare hashes before calling Windsurf.
Throttle API Calls Explicitly in n8n. burnrate and Mintlify have per-minute API quotas. n8n has no dedicated rate-limit node, so throttle requests with a Loop Over Items (Split in Batches) node paired with a Wait node; 10 requests per minute is a safe starting point. Monitor your usage in the tool dashboards and adjust as needed.
Generate Slack Notifications with Debt Summaries. Instead of a generic "migration complete" message, parse the burnrate output and include specific debt findings. This keeps your team informed without requiring them to check a separate tool.
const debt = $input.first().json.debt_analysis;
const message = `
*Migration Complete*
Debt Score: ${debt.debt_score}/100
Issues Found: ${debt.issues.length}
Test Coverage: ${debt.test_coverage}%
Next: Review at <${debt.report_url}|Burnrate Dashboard>
`;
// Code nodes must return an array of items
return [{ json: { message } }];
Archive Old Reports for Trend Analysis. Keep all historical debt reports in a database table with timestamps. After six months, you'll have enough data to chart debt trends and see whether migrations actually reduce technical debt long-term or just move it elsewhere.
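As a sketch of the kind of trend analysis this enables, here's a Code-node-style function that buckets stored debt scores by month; the field names mirror the migration_debt_report table above:

```javascript
// Average debt score per month from archived reports. `rows` is the
// result of querying migration_debt_report (timestamps as ISO strings).
function monthlyAverageDebt(rows) {
  const buckets = {};
  for (const { analysis_timestamp, debt_score } of rows) {
    const month = analysis_timestamp.slice(0, 7); // "YYYY-MM"
    (buckets[month] ??= []).push(debt_score);
  }
  return Object.fromEntries(
    Object.entries(buckets).map(([month, scores]) =>
      [month, scores.reduce((a, b) => a + b, 0) / scores.length])
  );
}
```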
Cost Breakdown
| Tool | Plan Needed | Monthly Cost | Notes |
|---|---|---|---|
| n8n | Cloud Pro or Self-Hosted Open | £25 or £0 | Self-hosted requires infrastructure; Cloud Pro includes 100k executions/month. Each workflow run costs 1-2 executions per file analysed. |
| burnrate | Team Plan | £180 | Unlimited analyses; API access included. 10 concurrent analyses. |
| Windsurf | Professional | £120 | API access to validation engine. Includes 1M analysis units/month; each file costs 1-10 units depending on size. |
| Mintlify | Pro | £150 | Unlimited documentation projects and API generation. 10 concurrent generations. |
| PostgreSQL Database | Managed service (AWS RDS) or Self-Hosted | £20-50 | Minimal storage for migration tracking; only a few GB needed. |
| Total for small team | — | £465-545 | One-time setup; costs remain stable regardless of migration frequency. |
This covers unlimited migrations across all your repositories. For a team doing 20 migrations per year, the annual tooling spend of roughly £5,600-6,500 works out to about £280-330 per migration; doing it manually would cost significantly more in engineering time.