Introduction
You receive bank statements spread across multiple PDFs, investment reports from different brokers, and loose transaction data. Every month, you spend an evening copying numbers into a spreadsheet, cross-referencing account balances, and trying to spot spending patterns that might matter. It is tedious, error-prone, and takes time you could spend on actual financial decisions.
This workflow automates that entire process. Instead of manual copy-paste, your bank statements and investment reports flow directly into a unified dashboard that updates itself. No spreadsheet formulas to maintain, no manual data entry, no wondering if you miscalculated something important.
The workflow uses four AI tools connected through an orchestration layer to turn messy documents into actionable financial intelligence. Once set up, it runs on a schedule with almost no human intervention: you upload new documents, or connect directly to your bank's API, and the system handles the rest.
The Automated Workflow
The workflow has four main steps: extract transaction data from bank statements, analyse investment performance, aggregate everything into a single view, and generate a visual dashboard. An orchestration tool ties them together and handles the data flow between each step.
Which Orchestration Tool to Use
For this workflow, n8n is the best choice. It runs on your own server or cloud environment, handles file uploads directly, and integrates cleanly with all four tools without rate limit issues. Zapier would work but costs more for API calls; Make has less flexible file handling; Claude Code is excellent for one-off tasks but not for recurring automation.
n8n's self-hosted option means your financial data stays on your infrastructure, which matters for bank statements and investment reports.
Architecture Overview
The workflow follows this sequence:
- Trigger: A webhook receives new bank statement PDFs, or a scheduled job checks for new files in cloud storage (Google Drive, Dropbox, etc.)
- Extraction: ChatGPT-Writer parses the PDF to extract transactions, balances, and account information
- Investment Analysis: Finster-AI processes investment reports to calculate returns, holdings, and allocation percentages
- Aggregation: Terrakotta-AI combines the extracted data into structured JSON, identifies spending categories, and flags anomalies
- Visualisation: Text2Infographic converts the aggregated data into a dashboard image or HTML template
- Storage: Results save to a database, email, or cloud storage for access
Let us walk through each step with actual configuration.
Step 1: Set Up the Webhook and File Handler
First, create an n8n workflow that listens for new files. Start with a webhook trigger that receives file uploads or connects to Google Drive/Dropbox.
POST https://your-n8n-instance.com/webhook/bank-statements
Content-Type: application/json

{
  "filename": "statement_march_2024.pdf",
  "file_id": "abc123xyz",
  "file_url": "https://drive.google.com/file/d/abc123xyz/view",
  "account_type": "checking"
}
In n8n, add a Webhook node set to POST. Then add a Google Drive node (or Dropbox) to retrieve the file binary. Configure it like this:
{
  "operation": "download",
  "fileId": "{{ $json.file_id }}",
  "googleDriveApiVersion": "v3"
}
The node outputs the PDF as binary data that passes to the next step.
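Before the download step, it helps to reject malformed webhook payloads early. A minimal validation sketch in plain JavaScript, suitable for an n8n Code node (the field names follow the example payload above; adapt them to your own schema):

```javascript
// Reject malformed webhook payloads before the Google Drive download step.
// Field names follow the example payload above; adapt them to your schema.
function validateStatementPayload(payload) {
  const errors = [];
  if (typeof payload.filename !== 'string' || !payload.filename.toLowerCase().endsWith('.pdf')) {
    errors.push('filename must end in .pdf');
  }
  if (typeof payload.file_url !== 'string' || !payload.file_url.startsWith('https://')) {
    errors.push('file_url must be an https URL');
  }
  const allowedTypes = ['checking', 'investment'];
  if (!allowedTypes.includes(payload.account_type)) {
    errors.push(`account_type must be one of: ${allowedTypes.join(', ')}`);
  }
  return { valid: errors.length === 0, errors };
}
```

Route items with `valid: false` to an error-handling branch rather than letting a bad file reach the extraction API.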
Step 2: Extract Transactions with ChatGPT-Writer
ChatGPT-Writer's API accepts the PDF binary and a prompt asking for structured extraction. Configure an HTTP request node in n8n:
POST https://api.chatgpt-writer.com/v1/extract
Authorization: Bearer YOUR_API_KEY
Content-Type: application/json

{
  "document": "{{ $binary.data }}",
  "document_type": "application/pdf",
  "prompt": "Extract all transactions from this bank statement. Return JSON with: transaction_date (YYYY-MM-DD), description, amount (negative for debits, positive for credits), balance_after, category (infer from description). Also extract: account_number, statement_period_start, statement_period_end, opening_balance, closing_balance.",
  "response_format": "json",
  "temperature": 0.2
}
Store the response in a variable called extractedTransactions. The API returns structured JSON:
{
  "account_number": "****1234",
  "statement_period_start": "2024-03-01",
  "statement_period_end": "2024-03-31",
  "opening_balance": 5420.50,
  "closing_balance": 6180.75,
  "transactions": [
    {
      "transaction_date": "2024-03-02",
      "description": "STARBUCKS #1234 NYC",
      "amount": -5.40,
      "balance_after": 5415.10,
      "category": "Dining"
    },
    {
      "transaction_date": "2024-03-05",
      "description": "DEPOSIT CHK #5678",
      "amount": 2500.00,
      "balance_after": 7915.10,
      "category": "Income"
    }
  ]
}
Set temperature to 0.2 so the extraction is consistent; higher values introduce variability that breaks downstream parsing.
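Even at low temperature, extraction can silently drop or misread rows, so verify the arithmetic before trusting the output: the opening balance plus the sum of all transaction amounts should equal the closing balance. A sketch of that check in plain JavaScript for an n8n Code node (field names match the response format above):

```javascript
// Check that extracted transactions reconcile with the statement balances:
// opening_balance + sum(amounts) should equal closing_balance to the penny.
function reconciles(statement) {
  const sum = statement.transactions.reduce((total, t) => total + t.amount, 0);
  // Compare in integer cents to avoid floating-point rounding surprises.
  const expected = Math.round((statement.opening_balance + sum) * 100);
  return expected === Math.round(statement.closing_balance * 100);
}
```

If the check fails, route the file to manual review instead of aggregation; a mismatch usually means a missed or duplicated row.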
Step 3: Analyse Investments with Finster-AI
If the uploaded file is an investment report, route it to Finster-AI in a separate branch. Use an n8n Switch node to check the file type:
{
  "condition": "{{ $json.account_type === 'investment' }}",
  "true_path": "finster_analysis",
  "false_path": "continue_with_checking"
}
For the investment branch, call Finster-AI:
POST https://api.finster-ai.com/v1/portfolio-analysis
Authorization: Bearer YOUR_FINSTER_API_KEY
Content-Type: application/json

{
  "document": "{{ $binary.data }}",
  "document_type": "application/pdf",
  "analysis_type": "holdings_and_performance",
  "metrics": [
    "total_value",
    "total_cost_basis",
    "unrealised_gains",
    "ytd_return_percent",
    "allocation_by_sector",
    "top_holdings"
  ]
}
Finster-AI returns portfolio metrics:
{
  "total_portfolio_value": 127450.00,
  "total_cost_basis": 95200.00,
  "unrealised_gains": 32250.00,
  "unrealised_gains_percent": 33.87,
  "ytd_return_percent": 8.2,
  "allocation_by_sector": {
    "Technology": 42.5,
    "Healthcare": 28.3,
    "Consumer Discretionary": 15.2,
    "Other": 14.0
  },
  "top_holdings": [
    {
      "symbol": "AAPL",
      "shares": 45,
      "value": 7650.00,
      "cost_basis": 5400.00,
      "gain_percent": 41.67
    }
  ]
}
Store this as investmentMetrics.
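Before the portfolio numbers flow downstream, a quick arithmetic cross-check catches garbled extractions: unrealised gains should equal total value minus cost basis, and each holding's gain percentage should follow from its value and cost basis. A sketch (plain JavaScript; field names match the response above):

```javascript
// Cross-check the portfolio arithmetic in the analysis response.
function portfolioChecksOut(p) {
  const gains = p.total_portfolio_value - p.total_cost_basis;
  const gainsOk =
    Math.round(gains * 100) === Math.round(p.unrealised_gains * 100);
  const holdingsOk = p.top_holdings.every((h) => {
    const pct = ((h.value - h.cost_basis) / h.cost_basis) * 100;
    return Math.abs(pct - h.gain_percent) < 0.01; // tolerate rounding to 2 dp
  });
  return gainsOk && holdingsOk;
}
```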
Step 4: Aggregate and Categorise with Terrakotta-AI
Now combine the extracted checking account data and investment metrics into a unified financial snapshot. Terrakotta-AI excels at structured data aggregation and anomaly detection:
POST https://api.terrakotta-ai.com/v1/aggregate
Authorization: Bearer YOUR_TERRAKOTTA_API_KEY
Content-Type: application/json

{
  "checking_transactions": {{ extractedTransactions.transactions }},
  "checking_balance": {{ extractedTransactions.closing_balance }},
  "investment_data": {{ investmentMetrics }},
  "aggregation_rules": {
    "categorise_spending": true,
    "group_by_merchant": true,
    "identify_recurring": true,
    "flag_anomalies": true
  },
  "anomaly_sensitivity": 0.15
}
The anomaly sensitivity parameter flags transactions that deviate from your typical pattern by 15 percent or more. Terrakotta-AI returns:
{
  "total_liquid_assets": 6180.75,
  "total_investment_assets": 127450.00,
  "total_net_worth": 133630.75,
  "spending_by_category": {
    "Dining": 145.30,
    "Groceries": 320.50,
    "Utilities": 89.99,
    "Transportation": 127.45,
    "Shopping": 234.67,
    "Other": 69.34
  },
  "total_spending": 987.25,
  "net_monthly_inflow": 1512.75,
  "recurring_transactions": [
    {
      "description": "SPOTIFY SUBSCRIPTION",
      "amount": -12.99,
      "frequency": "monthly"
    },
    {
      "description": "GYM MEMBERSHIP",
      "amount": -45.00,
      "frequency": "monthly"
    }
  ],
  "anomalies": [
    {
      "transaction": "ELECTRONICS STORE #999",
      "amount": -680.00,
      "reason": "Amount is 340% above typical shopping transaction",
      "severity": "high"
    }
  ]
}
Store this as aggregatedData.
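Terrakotta-AI's anomaly model isn't public, so treat the following as an illustration of what a relative-deviation threshold like anomaly_sensitivity means, not as the service's actual algorithm: compare each transaction against a typical (here, median) amount for its category and flag anything that deviates by more than the threshold.

```javascript
// Illustration only: not Terrakotta-AI's actual algorithm. Flags transactions
// whose absolute amount deviates from the category median by more than
// `sensitivity` (e.g. 0.15 = 15 percent).
function median(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

function flagAnomalies(transactions, sensitivity) {
  const byCategory = {};
  for (const t of transactions) {
    (byCategory[t.category] = byCategory[t.category] || []).push(Math.abs(t.amount));
  }
  return transactions.filter((t) => {
    const baseline = median(byCategory[t.category]);
    if (baseline === 0) return false; // nothing to compare against
    return Math.abs(Math.abs(t.amount) - baseline) / baseline > sensitivity;
  });
}
```

Under this rule, a £680 purchase in a category where transactions normally sit around £20 stands out immediately, which mirrors the flagged electronics purchase in the response above.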
Step 5: Generate Dashboard with Text2Infographic
Convert the aggregated data into a visual dashboard. Text2Infographic accepts structured data and renders it as HTML or image:
POST https://api.text2infographic.com/v1/generate
Authorization: Bearer YOUR_TEXT2INFOGRAPHIC_API_KEY
Content-Type: application/json

{
  "data": {{ aggregatedData }},
  "template": "financial_dashboard",
  "layout": {
    "header": "Monthly Financial Summary - March 2024",
    "sections": [
      {
        "type": "metric_cards",
        "metrics": [
          {
            "label": "Net Worth",
            "value": "{{ aggregatedData.total_net_worth }}",
            "currency": "USD"
          },
          {
            "label": "Monthly Spending",
            "value": "{{ aggregatedData.total_spending }}",
            "currency": "USD"
          },
          {
            "label": "Net Inflow",
            "value": "{{ aggregatedData.net_monthly_inflow }}",
            "currency": "USD"
          }
        ]
      },
      {
        "type": "pie_chart",
        "title": "Spending by Category",
        "data": "{{ aggregatedData.spending_by_category }}"
      },
      {
        "type": "bar_chart",
        "title": "Portfolio Allocation",
        "data": "{{ investmentMetrics.allocation_by_sector }}"
      },
      {
        "type": "alerts",
        "title": "Anomalies Detected",
        "items": "{{ aggregatedData.anomalies }}"
      }
    ]
  },
  "output_format": "html",
  "styling": {
    "colour_scheme": "professional",
    "responsive": true
  }
}
The API returns HTML that renders a dashboard. You can also request output_format: "png" for a static image suitable for email or Slack notifications.
Step 6: Save and Distribute
Add final nodes in n8n to save the results:
Option A: Database Storage (PostgreSQL, MongoDB, etc.)
{
  "operation": "insert",
  "table": "financial_snapshots",
  "data": {
    "snapshot_date": "{{ $now.toISOString() }}",
    "net_worth": "{{ aggregatedData.total_net_worth }}",
    "spending": "{{ aggregatedData.total_spending }}",
    "raw_data": "{{ JSON.stringify(aggregatedData) }}",
    "dashboard_html": "{{ $json.html }}"
  }
}
Option B: Email with Dashboard Attachment
{
  "to": "you@example.com",
  "subject": "Your Monthly Financial Dashboard - March 2024",
  "text": "Your financial summary is ready. See attached.",
  "html": "{{ $json.html }}",
  "attachments": [
    {
      "filename": "financial_dashboard.png",
      "content": "{{ $binary.data }}"
    }
  ]
}
Option C: Cloud Storage (Google Drive, S3)
{
  "operation": "upload",
  "bucket": "financial-dashboards",
  "key": "dashboards/{{ $now.toISOString().split('T')[0] }}/dashboard.html",
  "body": "{{ $json.html }}"
}
Configure the n8n workflow to run on a schedule (daily, weekly, or monthly) using a Cron trigger node:
{
  "trigger_type": "cron",
  "cron_expression": "0 9 1 * *",
  "timezone": "Europe/London"
}
This runs the workflow at 9 AM on the first day of each month.
The Manual Alternative
If you prefer more control over each step, or need custom logic, you can run parts of this workflow manually using Claude Code or n8n's script nodes.
For example, after extracting transactions with ChatGPT-Writer, you might want to review and adjust categories before moving to aggregation. Create an n8n node that pauses the workflow and sends you an email with extracted data for approval. After you review and make changes, a webhook resumes the workflow with your corrections.
{
  "node_type": "pause",
  "pause_reason": "Manual review required",
  "notification": {
    "send_email": true,
    "email_body": "Review extracted transactions: {{ $json }}"
  }
}
This approach trades some automation for accuracy, which is useful for high-stakes financial data where you want human judgment in the loop.
Alternatively, use Claude Code for one-off analysis. After your dashboard generates, you could paste the aggregated data into Claude and ask custom questions like, "Which spending category increased most compared to last month?" or "What is my projected net worth if I maintain this savings rate for a year?"
Pro Tips
1. Handle Errors Gracefully
Bank statements vary wildly in format. Some PDFs are scanned images, others are native digital. Add error handling in n8n to catch parsing failures:
{
  "node_type": "error_handler",
  "on_error": "send_alert",
  "alert": {
    "type": "email",
    "to": "you@example.com",
    "subject": "Financial extraction failed for {{ $json.filename }}",
    "body": "ChatGPT-Writer could not parse this statement. It may be in an unexpected format. Please review manually."
  }
}
Set ChatGPT-Writer's temperature slightly higher (0.3 instead of 0.2) for more adaptive parsing if initial extraction fails.
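That escalation can be scripted in an n8n Code node. A sketch of the pattern, where `callExtractor` is a hypothetical stand-in for the ChatGPT-Writer HTTP request:

```javascript
// Retry extraction with progressively higher temperature when the response
// fails to parse. `callExtractor(document, temperature)` is a stand-in for
// the actual HTTP request to the extraction API.
async function extractWithRetry(callExtractor, document, temperatures = [0.2, 0.3, 0.4]) {
  let lastError;
  for (const temperature of temperatures) {
    try {
      const raw = await callExtractor(document, temperature);
      return JSON.parse(raw); // unparseable output counts as a failed attempt
    } catch (err) {
      lastError = err;
    }
  }
  throw new Error(`All extraction attempts failed: ${lastError.message}`);
}
```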
2. Respect Rate Limits
Finster-AI and Terrakotta-AI have rate limits. If you process multiple statements at once, add delays between API calls:
{
  "node_type": "delay",
  "milliseconds": 2000
}
A 2-second delay between requests avoids hitting rate limits. Check each API's documentation for specific thresholds.
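In n8n a Wait node between the HTTP Request nodes does the same job; in a Code node, the throttling looks like this sketch (`processStatement` is a hypothetical stand-in for one API call):

```javascript
// Process files one at a time with a fixed pause between API calls to stay
// under per-minute rate limits. `processStatement` stands in for one call.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function processSequentially(files, processStatement, delayMs = 2000) {
  const results = [];
  for (let i = 0; i < files.length; i++) {
    results.push(await processStatement(files[i]));
    if (i < files.length - 1) {
      await sleep(delayMs); // pause between calls, but not after the last one
    }
  }
  return results;
}
```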
3. Test with Dummy Data
Before connecting live bank statements, test the entire workflow with fabricated data. Create a test bank statement PDF and run it through the extraction step to verify ChatGPT-Writer handles your bank's format correctly.
4. Use Environment Variables for Secrets
Store all API keys in n8n's environment variables, not hardcoded in the workflow:
CHATGPT_WRITER_KEY=sk-...
FINSTER_AI_KEY=fk-...
TERRAKOTTA_AI_KEY=tk-...
TEXT2INFOGRAPHIC_KEY=ti-...
Reference them in API nodes as {{ $env.CHATGPT_WRITER_KEY }}.
5. Archive Raw PDFs
Keep original bank statements and investment reports in cloud storage for compliance and reference. Configure an n8n node to move processed files to an archive folder:
{
  "operation": "move",
  "source": "Google Drive/Inbox",
  "destination": "Google Drive/Archive/{{ $now.toISOString().split('T')[0] }}"
}
This ensures you have an audit trail without cluttering active folders.
Cost Breakdown
| Tool | Plan Needed | Monthly Cost | Notes |
|---|---|---|---|
| ChatGPT-Writer | Pay-as-you-go | £5–£15 | ~3000 tokens per extraction; process 2–4 statements monthly |
| Finster-AI | Professional | £25–£40 | Includes advanced metrics; 50 analyses per month included |
| Terrakotta-AI | Starter | £20–£30 | Aggregation and anomaly detection; 100 aggregations included |
| Text2Infographic | Standard | £15–£25 | HTML and image rendering; 50 dashboards monthly |
| n8n | Self-hosted | £0 (server costs only) | Run on your own infrastructure; minimal CPU/memory for monthly workflows |
| Cloud Storage (Google Drive, S3) | Standard | £0–£10 | Store PDFs and dashboards; rarely exceeds free tier for personal use |
| Total | — | £65–£120 | One-time setup; runs unattended after configuration |
The workflow pays for itself if you currently pay for accounting software or spend more than a couple of hours monthly on financial spreadsheets.