Real estate market analysis report from listing data
Real estate professionals spend hours each week manually collecting listing data from multiple sources, copying it into spreadsheets, and then writing analysis reports. A property manager might pull data from their MLS feed, cross-reference it with market comps, and try to spot trends. Meanwhile, that same data sits in their email inbox, their CRM, and three different web portals. By the time the report is written, the market has already moved.
What if the entire process ran on a schedule, without any of your team touching a spreadsheet? New listings come in, get analysed, and a finished report lands in your inbox every morning. This is exactly what an automated workflow using accio-ai, Deepnote, and Terrakotta-ai can do. You define the rules once, and the system keeps working.
This guide walks you through building a real estate market analysis pipeline that pulls listing data, processes it with AI, and generates a polished report every single time. We'll focus on zero manual handoff; every step feeds directly into the next.
The Automated Workflow
Why This Combination of Tools
Accio-ai excels at extracting structured data from unstructured sources. Deepnote provides a notebook environment that talks to APIs and databases. Terrakotta-ai specialises in generating natural language reports from data tables. Together, they form a three-stage pipeline: extract, analyse, report.
The orchestration layer (Zapier, n8n, Make, or Claude Code) acts as the conductor. It schedules jobs, passes data between tools, and monitors for errors.
Choosing Your Orchestration Tool
For this workflow, n8n is the best fit. It handles conditional logic well, runs on a schedule, and, when self-hosted, isn't constrained by the monthly operation limits that Make imposes. If you're already in the Zapier ecosystem, that works too. Claude Code is excellent if you want a single Python script that handles everything.
Let's build with n8n.
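If the single-script route appeals to you instead, the three stages can be sketched in plain Python. The extraction step below is stubbed with sample data; in a real script it would call the Accio-ai API (the sample fields mirror the extraction request later in this guide, but the function bodies are illustrative stand-ins, not a working integration):

```python
# Sketch of the pipeline as one script: extract -> analyse -> report.

def extract_listings():
    # Stand-in for the Accio-ai extraction call; returns sample rows.
    return [
        {"property_address": "12 Oak St", "listing_price": 450000, "days_on_market": 14},
        {"property_address": "9 Elm Ave", "listing_price": 390000, "days_on_market": 41},
    ]

def analyse(listings):
    # Compute the same headline metrics the Deepnote notebook produces.
    prices = [row["listing_price"] for row in listings]
    dom = [row["days_on_market"] for row in listings]
    return {
        "total_listings": len(listings),
        "average_price": sum(prices) / len(prices),
        "average_days_on_market": sum(dom) / len(dom),
    }

def report(analysis):
    # Stand-in for the Terrakotta-ai report generation step.
    return (
        f"{analysis['total_listings']} new listings; "
        f"average price ${analysis['average_price']:,.0f}, "
        f"average {analysis['average_days_on_market']:.0f} days on market."
    )

print(report(analyse(extract_listings())))
```

Swap the stub bodies for real API calls and schedule the script with cron, and you have the same pipeline without an orchestration UI.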
Step 1: Schedule and Trigger
Create a new workflow in n8n and add a Cron trigger to run daily at 08:00 UTC.
{
"type": "n8n-nodes-base.cron",
"typeVersion": 1,
"position": [250, 300],
"parameters": {
"mode": "recurrence",
"triggerAtHour": 8,
"triggerAtMinute": 0,
"triggerAtDayOfWeek": [1, 2, 3, 4, 5]
}
}
This fires Monday through Friday at 08:00 UTC. Adjust the triggerAtDayOfWeek array if you need different days.
Step 2: Fetch Listing Data via Accio-ai
Accio-ai's HTTP node in n8n lets you call their extraction API. First, set up an Accio-ai account and generate an API key from your dashboard. Then add an HTTP request node to n8n.
POST https://api.accio-ai.com/v1/extract
Content-Type: application/json
Authorization: Bearer YOUR_ACCIO_API_KEY
{
"source_url": "https://your-mls-feed.example.com/listings",
"extract_fields": [
"property_address",
"listing_price",
"square_footage",
"bedrooms",
"bathrooms",
"days_on_market",
"listing_agent",
"listing_date"
],
"output_format": "json",
"pagination": {
"max_pages": 5
}
}
Replace your-mls-feed.example.com with your actual MLS or listing data source. Accio-ai will extract all listings matching your fields and return a clean JSON array.
Store the response in an n8n variable called listing_data:
{
"variable_name": "listing_data",
"value": "{{ $node.Accio_Extract.json }}"
}
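Before handing listing_data to Deepnote, it's worth filtering out rows with missing or malformed numbers; listing feeds are rarely clean. A minimal sketch (the field names match the extraction request above; the helper itself is an illustration, not part of any tool's API):

```python
# Drop listings that are missing the numeric fields the analysis relies on.
REQUIRED = ("listing_price", "square_footage", "days_on_market")

def clean_listings(listings):
    valid = []
    for row in listings:
        try:
            # Coerce required fields to float; skip the row if any is absent or non-numeric.
            cleaned = {**row, **{k: float(row[k]) for k in REQUIRED}}
        except (KeyError, TypeError, ValueError):
            continue
        valid.append(cleaned)
    return valid

sample = [
    {"listing_price": "450000", "square_footage": "1800", "days_on_market": "14"},
    {"listing_price": None, "square_footage": "900", "days_on_market": "3"},
]
print(len(clean_listings(sample)))  # only the complete row survives
```

Running this as an n8n Code node between the extraction and analysis steps keeps bad rows from skewing the averages.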
Step 3: Prepare Data and Call Deepnote for Analysis
Deepnote is a collaborative notebook tool with a built-in API. You'll create a Deepnote notebook that accepts listing data as input, runs analysis, and returns results.
First, create a new Deepnote notebook and add this Python code block:
import pandas as pd
import json
from datetime import datetime, timedelta
listings_json = json.loads(incoming_data)
df = pd.DataFrame(listings_json)
# Data cleaning
df['listing_price'] = pd.to_numeric(df['listing_price'], errors='coerce')
df['square_footage'] = pd.to_numeric(df['square_footage'], errors='coerce')
df['days_on_market'] = pd.to_numeric(df['days_on_market'], errors='coerce')
# Calculate key metrics
analysis = {
"report_date": datetime.now().isoformat(),
"total_listings": len(df),
"average_price": float(df['listing_price'].mean()),
"median_price": float(df['listing_price'].median()),
"price_per_sqft": float((df['listing_price'] / df['square_footage']).mean()),
"average_days_on_market": float(df['days_on_market'].mean()),
"listings_by_bedroom": df.groupby('bedrooms')['property_address'].count().to_dict(),
"price_range": {
"min": float(df['listing_price'].min()),
"max": float(df['listing_price'].max())
},
"market_trend": "stable" if df['days_on_market'].mean() < 30 else "slow"
}
# Generate summary insights
insights = []
if analysis['average_days_on_market'] < 20:
insights.append("Market is moving quickly; buyer demand is strong.")
elif analysis['average_days_on_market'] > 60:
insights.append("Listings are staying on market longer; inventory may be high.")
if len(df) > 50:
insights.append(f"High inventory week with {len(df)} new listings.")
analysis['insights'] = insights
print(json.dumps(analysis, indent=2))
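Before wiring the notebook into n8n, you can exercise the same cleaning logic locally by setting incoming_data to a small sample yourself (the listings here are made up for the test):

```python
import json
import pandas as pd

# Two sample listings standing in for the payload n8n will inject as `incoming_data`.
incoming_data = json.dumps([
    {"property_address": "12 Oak St", "listing_price": "450000",
     "square_footage": "1800", "bedrooms": 3, "days_on_market": "14"},
    {"property_address": "9 Elm Ave", "listing_price": "390000",
     "square_footage": "1500", "bedrooms": 2, "days_on_market": "41"},
])

df = pd.DataFrame(json.loads(incoming_data))
df["listing_price"] = pd.to_numeric(df["listing_price"], errors="coerce")
df["days_on_market"] = pd.to_numeric(df["days_on_market"], errors="coerce")

# Spot-check the headline metrics before trusting the full notebook run.
print(len(df), df["listing_price"].median(), df["days_on_market"].mean())
```

If the medians and means look right on known data, the notebook is safe to hook up to the live feed.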
Save this notebook and note its ID (found in the URL: deepnote.com/workspace/your-workspace/notebook/abc123xyz).
Now, back in n8n, add an HTTP request node to call the Deepnote notebook API:
POST https://api.deepnote.com/v1/notebooks/abc123xyz/run
Authorization: Bearer YOUR_DEEPNOTE_API_KEY
Content-Type: application/json
{
"cells": {
"cell_python_1": {
"code": "incoming_data = '" + {{ $node.Accio_Extract.json }} + "'"
}
}
}
When the notebook finishes, store its printed output (the JSON from the final print statement) in an n8n variable called analysis_results.
Step 4: Generate the Report with Terrakotta-ai
Terrakotta-ai takes structured data and writes prose. Call its API with your analysis results:
POST https://api.terrakotta-ai.com/v1/generate-report
Authorization: Bearer YOUR_TERRAKOTTA_API_KEY
Content-Type: application/json
{
"data": {
"report_type": "market_analysis",
"metrics": {
"total_listings": {{ $node.Deepnote_Analysis.json.total_listings }},
"average_price": {{ $node.Deepnote_Analysis.json.average_price }},
"median_price": {{ $node.Deepnote_Analysis.json.median_price }},
"price_per_sqft": {{ $node.Deepnote_Analysis.json.price_per_sqft }},
"average_days_on_market": {{ $node.Deepnote_Analysis.json.average_days_on_market }},
"market_trend": "{{ $node.Deepnote_Analysis.json.market_trend }}"
},
"insights": {{ $node.Deepnote_Analysis.json.insights }},
"report_date": "{{ $node.Deepnote_Analysis.json.report_date }}"
},
"style": "professional",
"tone": "executive_summary",
"include_sections": [
"executive_summary",
"market_metrics",
"insights_and_trends",
"recommendations"
]
}
Terrakotta-ai returns a formatted report as a string. Store it in final_report.
Step 5: Send the Report
Add a Gmail node (or any email service) to send the report to your stakeholders:
To: your-team@example.com
Subject: Daily Real Estate Market Analysis Report for {{ $now.toFormat("yyyy-MM-dd") }}
Body: {{ $node.Terrakotta_Report.json.report_text }}
Putting It All Together
Your n8n workflow now looks like this:
- Cron trigger (daily at 08:00 UTC, Monday to Friday)
- HTTP request to Accio-ai (extract listing data)
- Set variable listing_data
- HTTP request to Deepnote (analyse data)
- Set variable analysis_results
- HTTP request to Terrakotta-ai (generate report)
- Gmail send
When the workflow runs, each step waits for the previous one to complete. No manual handoff anywhere.
The Manual Alternative
If you want more control over the analysis or prefer not to rely on scheduled automation, you can run this workflow on demand:
- Create a Deepnote notebook with the same analysis code, but leave it open in your browser.
- Manually paste listing data into a cell (or read it from a CSV file you upload).
- Run the notebook and review the analysis.
- Copy the results and paste them into Terrakotta-ai's web interface.
- Copy the generated report and email it.
This takes about 15 to 20 minutes per report and introduces several manual steps where errors can creep in. The automated workflow does the same thing in seconds and is reproducible every time.
Pro Tips
Handle Accio-ai Rate Limits Gracefully
Accio-ai allows 100 API calls per hour on a free plan. If your MLS feed is large, you might hit this limit. Add a retry node in n8n:
{
"type": "n8n-nodes-base.httpRequest",
"parameters": {
"url": "https://api.accio-ai.com/v1/extract",
"method": "POST"
},
"retryOnFail": true,
"maxTries": 3,
"waitBetweenTries": 2000
}
If Accio-ai returns a 429 (rate limit), n8n waits 2 seconds between each of up to 3 retries. Note that n8n's built-in retry uses a fixed wait; for exponential backoff you'd handle the retries yourself in a Code node.
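If you want true exponential backoff rather than a fixed wait, the pattern is a few lines of Python in a Code node. This sketch simulates the API with a stub that returns 429 twice before succeeding, and the sleeps are scaled down so the demo runs instantly (scale them back up in production):

```python
import time

def call_with_backoff(call, max_retries=3, base_delay=2.0):
    """Retry `call` on 429 responses with exponential backoff: 2s, 4s, 8s."""
    for attempt in range(max_retries + 1):
        status, body = call()
        if status != 429:
            return status, body
        if attempt < max_retries:
            # Scaled down by 1000x so this example runs instantly.
            time.sleep(base_delay * (2 ** attempt) * 0.001)
    return status, body

# Stub standing in for the Accio-ai request: 429 twice, then 200.
attempts = {"n": 0}
def fake_call():
    attempts["n"] += 1
    return (429, None) if attempts["n"] < 3 else (200, {"listings": []})

status, _ = call_with_backoff(fake_call)
print(status, attempts["n"])  # -> 200 3
```

Replace fake_call with your real HTTP request and the workflow degrades gracefully under rate limiting.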
Cache Data Between Runs
Deepnote can take 10 to 15 seconds to spin up and execute. If you're running your workflow frequently, cache the listing data in n8n using a Google Sheet or a simple database table. Only fetch fresh data from Accio-ai once per hour, and reuse it for multiple analyses.
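A minimal sketch of that freshness check, assuming you persist the last fetch timestamp somewhere between runs (a Google Sheet cell or a key-value row both work; the function and TTL here are illustrative):

```python
from datetime import datetime, timedelta

CACHE_TTL = timedelta(hours=1)

def needs_refresh(last_fetched, now=None):
    """Return True when the cached listing data is older than the TTL."""
    now = now or datetime.utcnow()
    return last_fetched is None or now - last_fetched > CACHE_TTL

now = datetime(2025, 1, 6, 9, 0)
print(needs_refresh(datetime(2025, 1, 6, 8, 30), now))  # fetched 30 min ago -> False
print(needs_refresh(datetime(2025, 1, 6, 7, 30), now))  # fetched 90 min ago -> True
```

In n8n this becomes an IF node: refresh from Accio-ai when the check returns True, otherwise read the cached copy.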
Monitor Terrakotta-ai Output Quality
Terrakotta-ai's output is good, but it sometimes hallucinates numbers or invents trends. Add a sanity-check step in n8n:
# Values injected by n8n expressions before the code runs
report_text = "{{ $node.Terrakotta_Report.json.report_text }}"
average_price = "{{ $node.Deepnote_Analysis.json.average_price }}"
# Check that the report mentions the actual average price from our analysis.
# (Naive check: it fails if the report reformats the number, e.g. adds commas.)
if average_price and average_price in report_text:
    print("Report references actual data: OK")
else:
    print("WARNING: Report may not reference real metrics")
If the check fails, send an alert email instead of distributing the report.
Save Reports to Cloud Storage
Add a Google Drive or S3 upload step at the end of your workflow. Terrakotta-ai outputs plain text, but you can pipe it through Pandoc to convert it to PDF:
pandoc input.txt -f markdown -o report.pdf
(PDF output requires a LaTeX engine such as pdflatex on the machine running pandoc.) Then upload the PDF to Google Drive. You now have a historical archive of all reports.
Cost Optimisation
Terrakotta-ai charges per report generated. If you run this daily for 30 days, that's 30 reports per month minimum. Check if they offer a monthly subscription plan; it's often cheaper than pay-per-use. The same goes for Accio-ai and Deepnote.
Cost Breakdown
| Tool | Plan Needed | Monthly Cost | Notes |
|---|---|---|---|
| Accio-ai | Starter | $29 | 100 API calls/hour; includes 10,000 extractions/month |
| Deepnote | Professional | $49 | Unlimited notebooks and API calls; needed for scheduled runs |
| Terrakotta-ai | Pay-as-you-go | $9–30 | Roughly $0.30 per report; 30 reports/month ≈ $9 |
| n8n | Community (self-hosted) | $0 | Free on your own server; the cloud tier is about $20/month for managed hosting |
| Gmail | Free | $0 | Included with Google Workspace (around $7/month if purchased separately) |
| Total (minimum) | — | $87–108/month | Assumes self-hosted n8n and the free Gmail tier |
If you're already paying for Google Workspace and Deepnote for other projects, the incremental cost is just Accio-ai ($29) plus Terrakotta-ai (roughly $9 at 30 reports) plus any additional Deepnote API usage.
Final Thought
Once this workflow is running, your team stops thinking about data collection and reporting. They focus on acting on insights. A partner sees the report arrive in their inbox, spots a trend, and makes a decision. That's the point of automation: not to remove people from the process, but to remove the busy work so people can do what they're actually hired to do.