Alchemy Recipe · Intermediate · Automation

Automated restaurant menu engineering from sales data and supplier costs

24 March 2026

Introduction

Running a restaurant means balancing two opposing forces: customer demand and supplier constraints. Your bestselling dish might become unprofitable overnight if ingredient costs spike. Conversely, a high-margin plate nobody orders is just inventory sitting in your cooler. Menu engineers have always done this analysis manually, pulling sales reports from their POS system, checking supplier invoices, calculating margins, and updating menus in a spreadsheet. The entire process takes hours each week and usually happens only once a month, if at all.

What if your menu adjusted itself? Not dramatically, but intelligently. What if every time you received a new supplier quote or your sales data shifted meaningfully, your system recalculated which dishes should be featured, repriced, or quietly retired? That's what this workflow does. It connects your sales data, supplier costs, and menu decisions into a single automated pipeline that runs on a schedule or triggers whenever new information arrives.

This isn't about replacing your menu planning expertise. It's about removing the busywork so you actually have time to think strategically. The workflow we'll build here shows you how to combine Deepnote for data analysis, Smmry for distilling insights into actionable recommendations, and Terrakotta AI for menu engineering decisions, all coordinated by an orchestration tool that handles the plumbing.

The Automated Workflow

The workflow has five main stages: collect sales data, fetch supplier costs, perform margin analysis, generate recommendations, and update your menu documentation. We'll walk through each step and show you how to wire them together.

Architecture Overview

The entire system runs on a schedule (daily or weekly, depending on your needs) or triggers when new data arrives. Here's the data flow:

  1. Your orchestration tool (Zapier, n8n, or Make) pulls yesterday's sales data from your POS system.
  2. The same orchestration tool fetches updated supplier costs from your vendor APIs or a shared spreadsheet.
  3. These datasets flow into Deepnote, which runs a Python notebook that calculates food costs, margins, and item performance metrics.
  4. The analysis results are summarised by Smmry to extract the most important insights.
  5. Terrakotta AI reviews those insights and makes specific menu recommendations (adjust pricing, promote items, reduce portion sizes, etc.).
  6. The orchestration tool formats these recommendations and saves them to a shared document, a Slack message, or your PMS.

Let's start building.

Step 1: Set Up Your Orchestration Tool

We'll use n8n for this example because it offers a good balance between power and ease of use, plus it can run on your own infrastructure if you prefer. The same logic applies to Zapier or Make, just with different naming conventions.

In n8n, create a new workflow. Start with a Cron trigger set to run daily at 6:00 AM.


{
  "nodes": [
    {
      "parameters": {
        "rule": "0 6 * * *"
      },
      "name": "Daily Menu Analysis",
      "type": "n8n-nodes-base.cron",
      "typeVersion": 1,
      "position": [250, 300]
    }
  ]
}

This trigger will fire every morning. From here, your workflow branches into two parallel data collection tasks.

Step 2: Pull Sales Data from Your POS System

Most modern POS systems (Toast, Square, Lightspeed, TouchBistro) have APIs. We'll use a generic HTTP request node to fetch the data. Adjust the endpoint and parameters to match your system.

Here's an example using Square's POS API:


GET https://connect.squareup.com/v2/orders

Headers:
Authorization: Bearer YOUR_SQUARE_ACCESS_TOKEN
Content-Type: application/json

Query Parameters:
begin_time=2024-01-14T00:00:00Z
end_time=2024-01-14T23:59:59Z
limit=200

In n8n, add an HTTP Request node after your Cron trigger. Configure it like this:


{
  "parameters": {
    "authentication": "oAuth2",
    "method": "GET",
    "url": "https://connect.squareup.com/v2/orders",
    "options": {
      "params": {
        "begin_time": "{{ $now.minus(1, 'day').toISO() }}",
        "end_time": "{{ $now.toISO() }}",
        "limit": "200"
      }
    }
  },
  "name": "Fetch POS Sales Data",
  "type": "n8n-nodes-base.httpRequest",
  "typeVersion": 4.1,
  "position": [450, 150]
}

This pulls all orders from the last 24 hours. The response includes line items, prices, and quantities. Save this data to a variable so you can pass it to Deepnote later.

If your POS doesn't have an API, you can automate CSV export to Google Drive or Dropbox, then have n8n download and process it. It's slower but workable.
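The CSV fallback can be sketched in a few lines of Python. The column names here (`Item`, `Qty`, `Gross Sales`) are assumptions; match them to whatever your POS actually exports. The output uses the same field names the analysis notebook expects, so the rest of the pipeline doesn't care which path the data took.

```python
import csv
import io

def parse_sales_csv(csv_text):
    """Normalise a POS CSV export into the list-of-dicts the analysis expects."""
    reader = csv.DictReader(io.StringIO(csv_text))
    orders = []
    for row in reader:
        orders.append({
            "line_item_name": row["Item"],               # assumed export column
            "quantity_sold": int(row["Qty"]),            # assumed export column
            "gross_sales_amount": float(row["Gross Sales"]),
        })
    return orders

# Tiny example export
sample = "Item,Qty,Gross Sales\nRibeye Steak,3,135.00\nSeasonal Salad,2,24.00\n"
print(parse_sales_csv(sample))
```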

Step 3: Fetch Current Supplier Costs

Simultaneously, fetch your supplier costs. If your suppliers maintain a shared sheet or API, great. Otherwise, you can maintain a simple Google Sheet that you update manually, and n8n will read from it.

Using Google Sheets API:


GET https://sheets.googleapis.com/v4/spreadsheets/YOUR_SHEET_ID/values/SupplierCosts!A:F

Headers:
Authorization: Bearer YOUR_GOOGLE_ACCESS_TOKEN

In n8n:


{
  "parameters": {
    "authentication": "oAuth2",
    "method": "GET",
    "url": "https://sheets.googleapis.com/v4/spreadsheets/{{ $env.GOOGLE_SHEET_ID }}/values/SupplierCosts!A:F"
  },
  "name": "Fetch Supplier Costs",
  "type": "n8n-nodes-base.httpRequest",
  "typeVersion": 4.1,
  "position": [450, 450]
}

This returns rows like: Chicken Breast | 12.50/kg | Updated: 2024-01-14. Parse this into a structured format (JSON array) that Deepnote can consume.
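A minimal sketch of that parsing step, based on the example row above: the `12.50/kg` cost format and three-column layout are assumptions, and the output field names (`ingredient_item`, `cost_per_unit`) match the merge keys used in the Deepnote notebook later.

```python
def parse_cost_rows(rows):
    """rows is the 'values' array from the Sheets API (first row = headers)."""
    costs = []
    for item, cost, updated in rows[1:]:        # skip the header row
        amount, _, unit = cost.partition("/")   # "12.50/kg" -> ("12.50", "kg")
        costs.append({
            "ingredient_item": item,            # matches the merge key in the notebook
            "cost_per_unit": float(amount),
            "unit": unit,
            "last_updated": updated.replace("Updated: ", ""),
        })
    return costs

rows = [
    ["Item", "Cost", "Updated"],
    ["Chicken Breast", "12.50/kg", "Updated: 2024-01-14"],
]
print(parse_cost_rows(rows))
```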

Step 4: Run Analysis in Deepnote

Deepnote is a collaborative notebook environment. You'll create a notebook that accepts two inputs: sales data and supplier costs. Then it calculates margins, velocity, and profitability for each menu item.

First, create a new Deepnote notebook. Use environment variables or Deepnote's built-in parameter input to accept data from n8n.

In your Deepnote notebook:

import pandas as pd
import json
from datetime import datetime

# Deepnote can receive JSON via webhook or environment variables
# For this example, we'll assume data arrives as JSON

sales_data = json.loads('''{{ sales_json }}''')
supplier_costs = json.loads('''{{ costs_json }}''')

# Convert to DataFrames
sales_df = pd.DataFrame(sales_data)
costs_df = pd.DataFrame(supplier_costs)

# Parse sales data
sales_df['item_name'] = sales_df['line_item_name']
sales_df['quantity'] = sales_df['quantity_sold']
sales_df['revenue'] = sales_df['gross_sales_amount']

# Calculate per-item metrics
item_performance = sales_df.groupby('item_name').agg({
    'quantity': 'sum',
    'revenue': 'sum'
}).reset_index()

item_performance['price_per_unit'] = item_performance['revenue'] / item_performance['quantity']

# Merge with supplier costs
merged = item_performance.merge(costs_df, left_on='item_name', right_on='ingredient_item', how='left')

# Calculate food cost
merged['food_cost'] = merged['quantity'] * merged['cost_per_unit']
merged['gross_margin'] = merged['revenue'] - merged['food_cost']
merged['margin_percentage'] = (merged['gross_margin'] / merged['revenue'] * 100).round(2)

# Rank by profitability
merged['profit_rank'] = merged['gross_margin'].rank(ascending=False)

# Generate insights
top_performers = merged.nlargest(5, 'gross_margin')[['item_name', 'quantity', 'gross_margin', 'margin_percentage']]
bottom_performers = merged.nsmallest(5, 'gross_margin')[['item_name', 'quantity', 'gross_margin', 'margin_percentage']]

# Format output
insights = {
    'analysis_date': datetime.now().isoformat(),
    'total_items_analyzed': len(merged),
    'top_performers': top_performers.to_dict('records'),
    'bottom_performers': bottom_performers.to_dict('records'),
    'margin_trends': merged[['item_name', 'margin_percentage']].to_dict('records')
}

# Output as JSON for next step
print(json.dumps(insights, indent=2, default=str))

To trigger this notebook from n8n, use Deepnote's webhook API. Create a notebook that accepts parameters, then call it:


POST https://api.deepnote.com/v0/notebooks/YOUR_NOTEBOOK_ID/run

Headers:
Authorization: Bearer YOUR_DEEPNOTE_API_KEY
Content-Type: application/json

Body:
{
  "variables": {
    "sales_json": "{{ JSON.stringify($node['Fetch POS Sales Data'].json.orders) }}",
    "costs_json": "{{ JSON.stringify($node['Fetch Supplier Costs'].json.values) }}"
  }
}

Wait for the notebook to complete, then capture its output.
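The waiting step is worth getting right: notebook runs can take minutes, and a naive single request will time out. Here's a generic polling-loop sketch. The `fetch_status` callable and the `state`/`output` field names are assumptions (wrap whatever status endpoint Deepnote's current API actually exposes); the retry-until-done logic is the part that carries over.

```python
import time

def wait_for_run(fetch_status, timeout=300, interval=10):
    """Poll fetch_status() until the run succeeds, fails, or times out.

    fetch_status is any callable returning a dict like {"state": ..., "output": ...};
    in practice it would wrap an HTTP GET against the run-status endpoint.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = fetch_status()
        if status.get("state") == "SUCCEEDED":   # assumed field values
            return status["output"]
        if status.get("state") == "FAILED":
            raise RuntimeError(f"Notebook run failed: {status}")
        time.sleep(interval)
    raise TimeoutError("Notebook run did not finish in time")

# Example with a stubbed status feed instead of real HTTP calls:
states = iter([{"state": "RUNNING"}, {"state": "SUCCEEDED", "output": "analysis-json"}])
print(wait_for_run(lambda: next(states), timeout=5, interval=0))
```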

Step 5: Summarise Insights with Smmry

The Deepnote analysis produces a lot of data. Smmry distils this into plain language recommendations. Send your analysis JSON to Smmry's API:


POST https://api.smmry.com/SM_API

Content-Type: application/x-www-form-urlencoded

sm_api_key=YOUR_SMMRY_KEY
sm_length=5
sm_with_break=1
sm_query={{ base64_encode(JSON.stringify(deepnote_insights)) }}

Smmry will respond with a condensed summary. For example:

"Top performer: Ribeye Steak (£45 per unit, 58% margin). Bottom performer: Seasonal Salad (£12 per unit, 12% margin). Recommendation: Review salad ingredients or increase price. Chicken breast margins declining due to supplier cost increase."

Store this summary for the next step.

Step 6: Get Menu Recommendations from Terrakotta AI

Terrakotta AI specialises in restaurant menu optimisation. Send your sales analysis and Smmry summary to Terrakotta:


POST https://api.terrakotta.ai/v1/menu-optimisation

Headers:
Authorization: Bearer YOUR_TERRAKOTTA_API_KEY
Content-Type: application/json

Body:
{
  "restaurant_id": "YOUR_RESTAURANT_ID",
  "analysis_data": {
    "top_items": [
      {
        "name": "Ribeye Steak",
        "units_sold": 145,
        "margin_percentage": 58,
        "current_price": 45.00
      }
    ],
    "bottom_items": [
      {
        "name": "Seasonal Salad",
        "units_sold": 32,
        "margin_percentage": 12,
        "current_price": 12.00
      }
    ],
    "market_summary": "Top performer: Ribeye Steak..."
  },
  "constraints": {
    "min_price_increase": 0.5,
    "max_price_increase": 2.0,
    "exclude_items": []
  }
}

Terrakotta responds with specific recommendations:

{
  "recommendations": [
    {
      "item_name": "Ribeye Steak",
      "action": "PROMOTE",
      "reason": "High margin and strong sales velocity",
      "suggested_menu_position": "FEATURED",
      "suggested_price_increase": null
    },
    {
      "item_name": "Seasonal Salad",
      "action": "ADJUST_RECIPE",
      "reason": "Low margin despite reasonable sales",
      "suggested_price_increase": 1.50,
      "suggested_cost_reduction": "Switch to seasonal supplier for lettuce"
    },
    {
      "item_name": "Pasta Carbonara",
      "action": "INCREASE_VISIBILITY",
      "reason": "Good margin trending upward",
      "suggested_price_increase": 0.75
    }
  ]
}
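Before delivering these, you'll usually want to flatten the response into one readable line per action. A sketch, using the field names from the example response above (anything beyond those is an assumption):

```python
def format_recommendations(payload):
    """Turn a recommendations payload into one plain-text line per action."""
    lines = []
    for rec in payload["recommendations"]:
        line = f"{rec['action']}: {rec['item_name']} ({rec['reason']})"
        if rec.get("suggested_price_increase"):
            line += f" -- raise price by £{rec['suggested_price_increase']:.2f}"
        lines.append(line)
    return "\n".join(lines)

payload = {"recommendations": [
    {"item_name": "Ribeye Steak", "action": "PROMOTE",
     "reason": "High margin and strong sales velocity",
     "suggested_price_increase": None},
    {"item_name": "Seasonal Salad", "action": "ADJUST_RECIPE",
     "reason": "Low margin despite reasonable sales",
     "suggested_price_increase": 1.50},
]}
print(format_recommendations(payload))
```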

Step 7: Format and Deliver Recommendations

Finally, format these recommendations and send them to your team. You have several options:

Option A: Save to Google Docs

from datetime import datetime, timedelta

from googleapiclient.discovery import build

# Create a formatted document with today's recommendations
doc_content = f"""
MENU ANALYSIS REPORT - {datetime.now().strftime('%Y-%m-%d')}

ACTIONS REQUIRED:
- Feature Ribeye Steak (promote in menu and on website)
- Increase Seasonal Salad price by £1.50 (consider sourcing lettuce from local supplier)
- Increase Pasta Carbonara price by £0.75

MARGIN TRENDS:
Top Performers: Ribeye Steak (58%), Grilled Salmon (52%)
Bottom Performers: Seasonal Salad (12%), Vegetarian Pasta (18%)

Next review: {(datetime.now() + timedelta(days=7)).strftime('%Y-%m-%d')}
"""

# Use the Google Docs API to insert this at the top of a document.
# `creds` and `doc_id` come from your own OAuth setup and target document.
service = build('docs', 'v1', credentials=creds)
doc_requests = [
    {
        'insertText': {
            'text': doc_content,
            'location': {'index': 1}
        }
    }
]
service.documents().batchUpdate(documentId=doc_id, body={'requests': doc_requests}).execute()

In n8n, add a Google Docs node pointing to your shared menu analysis document.

Option B: Send to Slack

{
  "parameters": {
    "method": "POST",
    "url": "https://hooks.slack.com/services/YOUR/WEBHOOK/URL",
    "body": {
      "text": "Daily Menu Analysis Complete",
      "blocks": [
        {
          "type": "section",
          "text": {
            "type": "mrkdwn",
            "text": "*Menu Recommendations*\n• Feature: Ribeye Steak\n• Adjust: Seasonal Salad (+£1.50)\n• Increase visibility: Pasta Carbonara"
          }
        }
      ]
    }
  },
  "name": "Send Slack Notification",
  "type": "n8n-nodes-base.httpRequest",
  "typeVersion": 4.1
}

Note that an incoming webhook posts to the channel it was created for, so there's no need to set a channel in the payload.

Option C: Update Your PMS Directly

If your PMS supports it, send recommendations directly back:


PATCH https://api.yourpms.com/v1/menu-items/ribeye-steak

Headers:
Authorization: Bearer YOUR_PMS_TOKEN

Body:
{
  "featured": true,
  "featured_until": "2024-01-21",
  "notes": "High margin item. Promote on website."
}

Complete n8n Workflow Summary

Your final workflow should look like this in n8n:

  1. Cron trigger (6:00 AM daily)
  2. Parallel fetch: POS sales data
  3. Parallel fetch: Supplier costs
  4. Deepnote HTTP request (run analysis notebook)
  5. Wait for Deepnote completion
  6. Smmry HTTP request (summarise insights)
  7. Terrakotta AI HTTP request (get recommendations)
  8. Format and send recommendations (Slack, Docs, or PMS)

Set error handling on each node so that if one step fails, you get a Slack alert rather than a silent failure.

The Manual Alternative

If you're not ready to fully automate, you can use these tools individually. Download your POS data as CSV, upload it to Deepnote, run the analysis notebook manually each week, copy the top and bottom performers into an email to Smmry for summarisation, then manually enter those insights into Terrakotta for recommendations. It's slower, but you maintain complete control and only run analysis when you choose. Most restaurants start here, then automate once they're confident in the workflow.

Pro Tips

1. Rate Limiting and API Quotas

Most APIs have rate limits. Smmry allows 100 requests per day on free plans; a single daily run stays well inside that, but per-item summaries, multiple locations, or ad-hoc reruns can burn through it quickly. Either upgrade to a paid Smmry plan or cache results: run the full workflow every three days and reuse cached summaries on off days. n8n has built-in rate limiter nodes.


{
  "name": "Rate Limiter",
  "type": "n8n-nodes-base.rateLimit",
  "parameters": {
    "maxRequests": 100,
    "timeWindow": "1 day"
  }
}
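The caching half of that advice can be sketched simply: reuse a saved summary unless it's gone stale. The file path and three-day threshold here are arbitrary choices, and `fresh_fetch` stands in for whatever call actually hits the Smmry API.

```python
import json
import os
import time

CACHE_PATH = "smmry_cache.json"
MAX_AGE = 3 * 24 * 3600  # three days, in seconds

def get_summary(fresh_fetch):
    """Return a cached summary if it's recent enough, otherwise call fresh_fetch()."""
    if os.path.exists(CACHE_PATH):
        with open(CACHE_PATH) as f:
            cached = json.load(f)
        if time.time() - cached["fetched_at"] < MAX_AGE:
            return cached["summary"]          # still fresh: skip the API call
    summary = fresh_fetch()                   # stale or missing: fetch and re-cache
    with open(CACHE_PATH, "w") as f:
        json.dump({"fetched_at": time.time(), "summary": summary}, f)
    return summary

print(get_summary(lambda: "fresh summary"))
```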

2. Error Handling

Add a catch-all error handler to your n8n workflow. If any step fails, log it and notify you via Slack. Never let a failed workflow run silently; your team won't update the menu if they never know the analysis ran.


{
  "name": "Error Handler",
  "type": "n8n-nodes-base.slack",
  "onError": "continueOnFail",
  "parameters": {
    "text": "Menu analysis failed: {{ $error }}"
  }
}

3. Test with Dummy Data First

Before connecting to live POS data, run the entire workflow with test datasets. Create a small JSON file with 10 menu items and dummy sales/costs. Verify every step produces output before switching to live data.
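A quick way to generate that test dataset, using the same field names the notebook expects (the dish names and value ranges are made up, and the seed just keeps runs reproducible):

```python
import json
import random

random.seed(42)  # reproducible dummy data

items = [f"Test Dish {i}" for i in range(1, 11)]

# Fake sales: 5-50 units and £50-£500 gross per item
sales = [{"line_item_name": name,
          "quantity_sold": random.randint(5, 50),
          "gross_sales_amount": round(random.uniform(50, 500), 2)}
         for name in items]

# Fake supplier costs: £1-£10 per unit
costs = [{"ingredient_item": name,
          "cost_per_unit": round(random.uniform(1, 10), 2)}
         for name in items]

# Write files the workflow can pick up in place of live API responses
with open("test_sales.json", "w") as f:
    json.dump(sales, f, indent=2)
with open("test_costs.json", "w") as f:
    json.dump(costs, f, indent=2)

print(f"Wrote {len(sales)} sales rows and {len(costs)} cost rows")
```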

4. Schedule During Quiet Hours

If your workflow takes 5 minutes to run, don't schedule it for 9 AM when your staff are busy. Pick 6 AM or 11 PM. Use n8n's Cron editor to find off-peak hours.

5. Monitor Supplier API Changes

Suppliers update their APIs without warning. If Terrakotta or Deepnote change their endpoints, your workflow breaks. Set monthly calendar reminders to check each API's changelog. Better yet, subscribe to API status pages (most services offer email alerts).

Cost Breakdown

| Tool | Plan Needed | Monthly Cost | Notes |
| --- | --- | --- | --- |
| Deepnote | Free or Pro (£20) | £0–20 | Free tier supports basic notebooks; Pro adds collaboration and longer execution. For this workflow, Free is usually adequate. |
| Smmry | Free or Starter | £0–20 | Free allows 100 API calls daily; the Starter plan (£20/month) gives 1,000. A single daily run fits the free tier; heavier use needs Starter. |
| Terrakotta AI | Standard | £50 | Flat rate for up to 100 menu items and daily optimisation. Add £25 per month for advanced features like seasonal modelling. |
| n8n | Cloud Basic or Self-Hosted | £20–50 or £0 | Cloud Basic (£20/month) covers up to 100k executions; self-hosted (open source) is free but requires your own server. For most restaurants, Cloud Basic is simpler. |
| Google Sheets API | Free | £0 | Included in Google Workspace; no additional cost. |
| Square/Toast/your POS API | Included | £0 | Most modern POS systems include API access in standard plans. Check your existing contract. |
| Total Monthly | | £70–140 | Assumes daily scheduling, modest automation, no custom development. Adjust based on your POS and supplier APIs. |

This cost is roughly equivalent to paying a menu analyst 3–4 hours per week. Most restaurants save time and money within the first month.

By connecting these tools through n8n (or Zapier or Make), you've built a system that continuously monitors your profitability and nudges your team toward better decisions. The workflow runs while you sleep, and your team wakes up to actionable recommendations instead of raw data. That's what automated menu engineering looks like.