Introduction
Real estate professionals waste enormous amounts of time compiling market analysis reports. A typical workflow involves downloading listings from multiple sources, cleaning the data in a spreadsheet, running calculations in Python, then manually formatting everything into a presentable document. If your market data updates weekly or daily, this becomes an endless cycle of repetitive work.
The problem gets worse when you need to track trends over time. You're stuck choosing between outdated reports and spending hours each week regenerating the same analysis. Most agents and analysts end up with dozens of half-finished spreadsheets, inconsistent data formats, and analysis that's weeks out of date by the time anyone reads it.
This workflow automates the entire process. You'll pull live listing data, clean and analyse it, generate visualisation charts, and produce a finished report without touching a single spreadsheet. Run it once weekly or daily; the system handles everything else.
The Automated Workflow
We'll build this using n8n as the orchestration backbone. n8n gives you better visibility than Zapier for complex workflows, works well with Python-based tools, and offers a generous free tier. The workflow follows this path: fetch listings from Accio-AI, process and analyse them in Deepnote, generate charts with Terrakotta-AI, then compile everything into a final report.
Architecture Overview
The workflow runs on a schedule (weekly recommended) and executes these steps in sequence:
- Accio-AI fetches current listings matching your market criteria
- Data lands in Deepnote for cleaning, filtering, and statistical analysis
- Terrakotta-AI generates market trend charts and visualisations
- Final report compiles and distributes automatically
Each step feeds into the next with zero manual intervention. Errors are logged and you'll receive notifications if anything breaks.
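Stripped to its control flow, the pipeline is a linear chain where any stage failure triggers a notification. A minimal Python sketch (the stage functions are placeholders for the n8n nodes described in the sections below):

```python
def run_pipeline(fetch, analyse, chart, report, notify):
    """Linear chain: each stage's output feeds the next; any failure notifies and re-raises."""
    try:
        listings = fetch()
        results = analyse(listings)
        images = chart(results)
        return report(results, images)
    except Exception as exc:
        notify(f"Pipeline failed: {exc}")
        raise

# Wiring it up with stub stages:
status = run_pipeline(
    fetch=lambda: [{"price": 450_000}],
    analyse=lambda listings: {"average_price": listings[0]["price"]},
    chart=lambda results: ["https://example.com/chart.png"],
    report=lambda results, images: "report sent",
    notify=print,
)
```

The point is the shape: one straight line of dependencies, with a single notification hook rather than per-stage error handling scattered everywhere.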
Setting Up n8n
First, deploy n8n. You can self-host it, use n8n Cloud, or run Docker locally. For production use, n8n Cloud handles updates and security patches for you.
docker run -it --rm \
  -p 5678:5678 \
  -v n8n_data:/home/node/.n8n \
  docker.n8n.io/n8nio/n8n
Access the editor at http://localhost:5678. Create a new workflow and name it "Real Estate Market Analysis".
Step 1: Trigger and Accio-AI Data Fetch
Start with a Cron trigger that runs weekly. Set it for Monday at 6 AM so reports are ready before the work week starts.
In n8n, add a Cron node:
0 6 * * 1
This translates to 06:00 every Monday, in the timezone configured for your n8n instance.
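If cron syntax is unfamiliar, the five fields of `0 6 * * 1` are minute, hour, day-of-month, month, and day-of-week. A quick sanity check in plain Python (a hypothetical helper, not part of the workflow):

```python
from datetime import datetime

def matches_monday_6am(dt: datetime) -> bool:
    # "0 6 * * 1": minute 0, hour 6, any day-of-month, any month, weekday 1 (Monday)
    return dt.minute == 0 and dt.hour == 6 and dt.isoweekday() == 1

print(matches_monday_6am(datetime(2024, 1, 15, 6, 0)))  # True: 2024-01-15 was a Monday
print(matches_monday_6am(datetime(2024, 1, 16, 6, 0)))  # False: Tuesday, no run
```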
Next, add an HTTP Request node to call Accio-AI's listing endpoint. You'll need an API key from your Accio-AI account.
GET https://api.accio-ai.com/v1/listings
Headers:
Authorization: Bearer YOUR_ACCIO_API_KEY
Content-Type: application/json
Query Parameters:
location: your_market_city
status: active
limit: 500
sort: date_listed:desc
Accio-AI returns paginated JSON. If you need more than 500 listings, add pagination logic to the HTTP node. Set "Pagination Type" to "Offset" and configure it for 500 items per page.
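The offset pagination the HTTP node performs can be sketched in plain Python; here `fetch_page` stands in for one call to the listings endpoint:

```python
def fetch_all(fetch_page, limit=500):
    """Keep requesting pages until one comes back short, advancing the offset each time."""
    listings, offset = [], 0
    while True:
        page = fetch_page(offset=offset, limit=limit)
        listings.extend(page)
        if len(page) < limit:  # a short (or empty) page means we've reached the end
            return listings
        offset += limit

# Simulate an API holding 1,247 listings
fake_db = [{"id": f"listing_{i}"} for i in range(1247)]
all_listings = fetch_all(lambda offset, limit: fake_db[offset:offset + limit])
print(len(all_listings))  # 1247
```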
The response looks like this:
{
"data": [
{
"id": "listing_12345",
"address": "123 Market Street",
"price": 450000,
"bedrooms": 3,
"bathrooms": 2,
"sqft": 2100,
"days_on_market": 14,
"list_date": "2024-01-15T00:00:00Z",
"latitude": 40.7128,
"longitude": -74.0060
}
],
"total": 247,
"page": 1
}
Map the output to a clean object with a Set node (n8n exposes the previous node's output as $json and the current time as $now):
{
"listings": {{ $json.data }},
"total_count": {{ $json.total }},
"fetch_timestamp": {{ $now }}
}
Step 2: Transform and Send to Deepnote
Deepnote is where the real analysis happens. You'll create a notebook that accepts data via API, processes it, and outputs cleaned results.
First, create a new Deepnote notebook. In the first cell, add the processing code below; n8n will pass the listing payload in as a notebook variable when it triggers a run:
import pandas as pd
import numpy as np
from datetime import datetime, timedelta
import json
# `incoming_data` is injected as a notebook variable by the n8n request below;
# fall back to an empty payload so the notebook also runs interactively
try:
    incoming_data
except NameError:
    incoming_data = {'listings': []}
incoming_listings = incoming_data.get('listings', [])
# Convert to DataFrame for analysis
df = pd.DataFrame(incoming_listings)
# Data cleaning and preparation
df['list_date'] = pd.to_datetime(df['list_date'])
df['price_per_sqft'] = df['price'] / df['sqft']
df['list_week'] = df['list_date'].dt.isocalendar().week
# Remove obvious outliers (prices beyond 3 standard deviations)
price_mean = df['price'].mean()
price_std = df['price'].std()
df = df[(df['price'] > price_mean - 3*price_std) &
(df['price'] < price_mean + 3*price_std)]
# Calculate market metrics (cast numpy scalars to plain floats so they serialise to JSON)
metrics = {
    'total_listings': len(df),
    'average_price': round(float(df['price'].mean()), 2),
    'median_price': float(df['price'].median()),
    'average_dom': round(float(df['days_on_market'].mean()), 2),
    'average_price_per_sqft': round(float(df['price_per_sqft'].mean()), 2),
    'price_range': {
        'min': float(df['price'].min()),
        'max': float(df['price'].max())
    },
    # Individual prices are needed downstream for the histogram chart
    'prices': df['price'].astype(float).tolist()
}
# Segment by bedrooms; build a plain nested dict, since the MultiIndex columns
# produced by groupby().agg() give tuple keys that json can't encode
bedroom_analysis = {}
for beds, group in df.groupby('bedrooms'):
    bedroom_analysis[str(int(beds))] = {
        'price': {
            'mean': round(float(group['price'].mean()), 2),
            'median': round(float(group['price'].median()), 2),
            'count': int(len(group))
        },
        'days_on_market': {'mean': round(float(group['days_on_market'].mean()), 2)},
        'price_per_sqft': {'mean': round(float(group['price_per_sqft'].mean()), 2)}
    }
# Save results to CSV for the next step
results_df = df[['id', 'address', 'price', 'bedrooms', 'bathrooms',
                 'sqft', 'days_on_market', 'price_per_sqft']]
results_df.to_csv('/tmp/cleaned_listings.csv', index=False)
# Return summary for the workflow
output = {
    'status': 'success',
    'metrics': metrics,
    'bedroom_analysis': bedroom_analysis,
    'records_processed': len(df),
    'cleaned_data_url': '/tmp/cleaned_listings.csv'
}
output
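The 3-standard-deviation filter earlier in the notebook is worth sanity-checking: with only a handful of rows, a single extreme value inflates the standard deviation enough to survive the cut, so the filter only works reliably once you have a few dozen listings. A small check with the stdlib statistics module (same ddof=1 sample deviation pandas uses by default; the prices are hypothetical):

```python
import statistics

# 20 plausible listing prices plus one data-entry error
prices = [400_000 + 5_000 * i for i in range(20)] + [50_000_000]

mean = statistics.mean(prices)
std = statistics.stdev(prices)  # sample standard deviation, matching pandas' default
cleaned = [p for p in prices if mean - 3 * std < p < mean + 3 * std]

print(len(cleaned))           # 20 -> the 50M typo is gone
print(50_000_000 in cleaned)  # False
```

With only five rows, the same typo would pull the mean and deviation up so far that it falls inside its own 3-sigma band and survives.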
In n8n, add an HTTP Request node to call Deepnote:
POST https://api.deepnote.com/v1/run_notebook
Headers:
Authorization: Bearer YOUR_DEEPNOTE_API_KEY
Content-Type: application/json
Body:
{
"notebook_id": "your_notebook_id",
"variables": {
"incoming_data": {{ $json }}
}
}
To get your notebook ID, open Deepnote, navigate to Settings, and copy the notebook ID from the URL or settings panel.
Deepnote returns execution results including all output variables. Map this to pass analysis results forward:
{
"analysis_metrics": {{ $json.output.metrics }},
"bedroom_breakdown": {{ $json.output.bedroom_analysis }},
"processed_records": {{ $json.output.records_processed }}
}
Step 3: Generate Charts with Terrakotta-AI
Terrakotta-AI specialises in creating publication-quality charts from data. You'll send it the analysed metrics and get back image URLs.
Add another HTTP Request node in n8n:
POST https://api.terrakotta-ai.com/v1/charts/generate
Headers:
Authorization: Bearer YOUR_TERRAKOTTA_API_KEY
Content-Type: application/json
Body:
{
"chart_type": "bar",
"title": "Average Price by Bedrooms",
"data": [
{
"label": "1 Bed",
"value": {{ $json.bedroom_breakdown['1']['price']['mean'] }}
},
{
"label": "2 Bed",
"value": {{ $json.bedroom_breakdown['2']['price']['mean'] }}
},
{
"label": "3 Bed",
"value": {{ $json.bedroom_breakdown['3']['price']['mean'] }}
},
{
"label": "4 Bed",
"value": {{ $json.bedroom_breakdown['4']['price']['mean'] }}
}
],
"style": "professional",
"width": 1200,
"height": 600
}
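Hardcoding four bedroom buckets breaks the request whenever a market has no 4-bed listings: the expression resolves to nothing and the JSON becomes invalid. If you prefer, build the payload dynamically in a Code node instead; a Python sketch, assuming the bedroom breakdown shape produced in Step 2 (the numbers are hypothetical):

```python
# Hypothetical breakdown, shaped like the Deepnote output from Step 2
bedroom_breakdown = {
    "1": {"price": {"mean": 310_000.0, "median": 305_000.0, "count": 42}},
    "2": {"price": {"mean": 395_500.0, "median": 389_000.0, "count": 87}},
    "3": {"price": {"mean": 452_250.0, "median": 449_000.0, "count": 103}},
}

payload = {
    "chart_type": "bar",
    "title": "Average Price by Bedrooms",
    # One data point per bucket that actually exists, sorted numerically
    "data": [
        {"label": f"{beds} Bed", "value": stats["price"]["mean"]}
        for beds, stats in sorted(bedroom_breakdown.items(), key=lambda kv: int(kv[0]))
    ],
    "style": "professional",
    "width": 1200,
    "height": 600,
}
print([d["label"] for d in payload["data"]])  # ['1 Bed', '2 Bed', '3 Bed']
```

A market with no 4-bed inventory now simply produces a three-bar chart instead of a failed request.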
Generate a second chart for the price distribution. A histogram needs the individual price values rather than just the min/max range, so reference the prices array from the Deepnote output here:
POST https://api.terrakotta-ai.com/v1/charts/generate
Headers:
Authorization: Bearer YOUR_TERRAKOTTA_API_KEY
Content-Type: application/json
Body:
{
"chart_type": "histogram",
"title": "Price Distribution",
"data": {{ $json.analysis_metrics.prices }},
"bins": 20,
"style": "professional"
}
Both calls return image URLs. Use a Set node to collect them, referencing each earlier node by name (adjust the node names below to match your workflow) and carrying the analysis results forward for the report:
{
"analysis_metrics": {{ $('Map Analysis Results').item.json.analysis_metrics }},
"processed_records": {{ $('Map Analysis Results').item.json.processed_records }},
"chart_by_bedrooms": {{ $('Bedroom Chart').item.json.image_url }},
"price_distribution_chart": {{ $('Histogram Chart').item.json.image_url }},
"charts_generated_at": {{ $now }}
}
Step 4: Compile and Distribute Report
The final step pulls everything together into a formatted document. Use the Gmail or Slack node to distribute, or save to cloud storage.
For email distribution, add a Gmail node:
To: your_team@company.com
Subject: Market Analysis Report - {{ $now.toFormat('MMMM dd, yyyy') }}
Body:
<h2>Real Estate Market Report</h2>
<p>Generated: {{ $now.toFormat('yyyy-MM-dd HH:mm') }}</p>
<h3>Market Overview</h3>
<ul>
<li>Total Active Listings: {{ $json.processed_records }}</li>
<li>Average Price: ${{ $json.analysis_metrics.average_price.toLocaleString() }}</li>
<li>Median Price: ${{ $json.analysis_metrics.median_price.toLocaleString() }}</li>
<li>Average Days on Market: {{ $json.analysis_metrics.average_dom.toFixed(1) }}</li>
<li>Average Price per Sqft: ${{ $json.analysis_metrics.average_price_per_sqft.toFixed(2) }}</li>
</ul>
<h3>Price Range</h3>
<p>Lowest: ${{ $json.analysis_metrics.price_range.min.toLocaleString() }}</p>
<p>Highest: ${{ $json.analysis_metrics.price_range.max.toLocaleString() }}</p>
<h3>Charts</h3>
<p><img src="{{ $json.chart_by_bedrooms }}" width="800" /></p>
<p><img src="{{ $json.price_distribution_chart }}" width="800" /></p>
Alternatively, save to Google Drive using the Google Drive node, or to Dropbox for version control. This gives you a permanent archive of reports over time, useful for spotting seasonal trends.
Add error handling at every step. In n8n, enable each node's error output or attach an Error Workflow: if the Accio-AI call fails, send a Slack message instead of failing silently.
The Manual Alternative
If you want more granular control over each step, you can break this into manual checkpoints. After Accio-AI fetches data, download the CSV and review it before sending to Deepnote. After Deepnote completes analysis, manually verify the metrics make sense before generating charts.
This approach trades automation for oversight. You'll catch data issues early but lose the hands-off benefits. Reasonable for monthly reports where errors are more costly than time investment.
To implement manual review steps, add Wait nodes in n8n between steps, configured to resume on a webhook call or form submission. The workflow pauses until you approve before proceeding.
Pro Tips
Watch API rate limits. Accio-AI typically allows 100 requests per minute on standard plans. If you're analysing large geographies with thousands of listings, paginate results across multiple workflow runs or request a higher tier. Check your API dashboard weekly.
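For scripts that hit the API directly, a simple client-side throttle keeps you under the limit without tracking state in n8n. A sketch, using the 100 requests/minute figure assumed above:

```python
import time

class RateLimiter:
    """Spaces calls at least period/max_calls seconds apart."""
    def __init__(self, max_calls=100, period=60.0):
        self.min_interval = period / max_calls
        self.last_call = float("-inf")  # first call goes through immediately

    def wait(self):
        delay = self.min_interval - (time.monotonic() - self.last_call)
        if delay > 0:
            time.sleep(delay)
        self.last_call = time.monotonic()

limiter = RateLimiter(max_calls=100, period=60.0)  # at most one call every 0.6 s
# for page in pages: limiter.wait(); fetch(page)
```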
Cache Deepnote results. If you run the same analysis twice in one day, Deepnote might bill you twice. Add a conditional node that checks whether data has changed since the last run. Only call Deepnote if new listings arrived since yesterday.
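One way to implement the "has anything changed" check is to fingerprint the listing data and compare it against the previous run's hash (stored in a file, a database, or n8n's workflow static data). A sketch:

```python
import hashlib
import json

def data_fingerprint(listings):
    """Order-independent hash of the fields that affect the analysis."""
    key = sorted((l["id"], l["price"], l.get("days_on_market")) for l in listings)
    return hashlib.sha256(json.dumps(key).encode()).hexdigest()

a = [{"id": "listing_1", "price": 450_000}, {"id": "listing_2", "price": 380_000}]
b = list(reversed(a))  # same data, different order
print(data_fingerprint(a) == data_fingerprint(b))  # True -> skip the Deepnote run
```

Sorting before hashing means the API returning listings in a different order doesn't trigger a spurious re-run.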
Use n8n's built-in error notifications. Every node can send Slack or email alerts on failure. Set these up for production workflows; a silent failure is worse than a loud one. Include the error message and node name so you know exactly what broke.
Store raw data for auditing. Before any transformation, save the raw Accio-AI response to cloud storage or a database. This lets you reprocess historical data if you later discover a calculation error, without re-fetching everything.
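Archiving the raw response takes only a few lines in a Code node or companion script. A sketch that writes timestamped JSON files locally (swap the directory for your cloud-storage upload of choice):

```python
import json
import pathlib
import time

def archive_raw_response(response_body, root="raw_archive"):
    """Write the untouched API payload to a timestamped file before any transformation."""
    folder = pathlib.Path(root)
    folder.mkdir(parents=True, exist_ok=True)
    target = folder / (time.strftime("%Y-%m-%d_%H%M%S") + ".json")
    target.write_text(json.dumps(response_body, indent=2))
    return target

# archive_raw_response(raw_payload)  # -> raw_archive/<timestamp>.json
```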
Terrakotta charts are cacheable. Charts don't change unless the underlying data changes. Store image URLs in a database and only regenerate if metrics shift by more than a threshold (e.g. more than 5% change in average price).
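The threshold check is a one-line expression in an IF node; written out as plain Python:

```python
def should_regenerate(old_avg, new_avg, threshold=0.05):
    """Regenerate charts only when the average moved more than the threshold (5% here)."""
    if old_avg == 0:
        return True  # no previous value: always generate
    return abs(new_avg - old_avg) / old_avg > threshold

print(should_regenerate(450_000, 460_000))  # False: ~2.2% shift, reuse the cached chart
print(should_regenerate(450_000, 480_000))  # True: ~6.7% shift, regenerate
```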
Cost Breakdown
| Tool | Plan Needed | Monthly Cost | Notes |
|---|---|---|---|
| Accio-AI | Professional | $99 | 5,000 API calls/month, usually sufficient for weekly analysis of 500-1000 listings |
| Deepnote | Starter | $0 | Free tier includes 1 notebook and monthly execution limits; upgrade to Professional ($50/month) if running daily |
| Terrakotta-AI | Standard | $49 | Covers up to 500 chart generations monthly |
| n8n Cloud | Basic | $0 | Free tier sufficient for weekly execution; Professional ($30/month) for additional features and execution history |
| Gmail or Slack | Existing | $0 | Uses accounts you likely have already |
| Total | — | $148–$228 | Scales down significantly if running monthly instead of weekly |
The lowest-cost version runs on free tiers and costs nothing until you exceed their limits. Most real estate teams analysing a single market weekly can keep costs near the low end of that range.
To go live: set up API keys for each service, configure n8n with those credentials, test the workflow with sample data, then enable the Cron trigger. Monitor the first run closely, then check weekly to verify the email arrives on schedule. After a month of consistent reports, you'll notice gaps in manual reporting disappear and your team's market knowledge stays current.