
E-commerce product description generation and image enhancement at scale

The workflow assumes your product data starts as a simple CSV, for example:

```csv
sku,product_name,category,brand,key_features,image_url
SKU-001,Wireless Earbuds,Electronics,TechSound,"noise cancellation, 48-hour battery, IPX7 waterproof",https://your-cdn.com/sku-001.jpg
SKU-002,Cotton T-Shirt,Apparel,ComfortWear,"organic cotton, machine washable, crew neck",https://your-cdn.com/sku-002.jpg
```

 In n8n, create a new workflow and start with a "Webhook" trigger node. Configure it to accept POST requests containing your product batch. This endpoint becomes your entry point; you can call it from your inventory system, a scheduled task, or manually upload a CSV. 

```http
POST /webhook/product-batch
Content-Type: application/json

{
  "products": [
    {
      "sku": "SKU-001",
      "product_name": "Wireless Earbuds",
      "category": "Electronics",
      "brand": "TechSound",
      "key_features": "noise cancellation, 48-hour battery, IPX7 waterproof",
      "image_url": "https://your-cdn.com/sku-001.jpg"
    }
  ]
}
```
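To exercise the trigger end to end, the same batch can be posted from a short Node.js script. This is a sketch: the webhook URL is whatever n8n assigns your workflow, and it assumes the global `fetch` available in Node 18+.

```javascript
// Sample batch matching the webhook's expected shape.
const payload = {
  products: [
    {
      sku: 'SKU-001',
      product_name: 'Wireless Earbuds',
      category: 'Electronics',
      brand: 'TechSound',
      key_features: 'noise cancellation, 48-hour battery, IPX7 waterproof',
      image_url: 'https://your-cdn.com/sku-001.jpg',
    },
  ],
};

// POST the batch to the workflow's webhook endpoint.
async function submitBatch(webhookUrl) {
  const res = await fetch(webhookUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`Webhook rejected batch: ${res.status}`);
  return res.json();
}
```

The same script works from a cron job or your inventory system's export hook.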

Once data arrives, use an "Item Lists" node to iterate over each product. For each one, you'll run two processes in parallel: description generation and image enhancement. Start with the description side. Add a Copy.ai node configured with your API key. The request should look like this:

```json
{
  "model": "gpt-4o-mini",
  "prompt": "Write a compelling e-commerce product description for: {{$json.product_name}} by {{$json.brand}}. Category: {{$json.category}}. Key features: {{$json.key_features}}. The description should be 80-120 words, highlight benefits over features, and include a call-to-action. Return only the description text.",
  "temperature": 0.7,
  "max_tokens": 200
}
```

 Copy.ai will generate a unique description for each product. Store the output in a variable so you can reference it later. In parallel, process the image. Add a Pixelcut AI node to remove the background from the source image. Configure it like this: 

```json
{
  "image_url": "{{$json.image_url}}",
  "operation": "remove_background",
  "output_format": "png"
}
```

 Pixelcut returns a cleaned image URL. Pass that immediately into an AI Boost node for upscaling and enhancement: 

```json
{
  "image_url": "{{$node.Pixelcut.data.enhanced_image_url}}",
  "operation": "upscale",
  "upscale_factor": 2,
  "enhancement_type": "auto_enhance"
}
```

 This gives you a high-resolution, enhanced image ready for catalogue display. Now wire everything together. After both the Copy.ai and AI Boost nodes complete, use a "Function" node to construct your final output object: 

```javascript
return {
  sku: $json.sku,
  product_name: $json.product_name,
  description: $node.CopyAI.data.description,
  enhanced_image_url: $node.AIBoost.data.image_url,
  category: $json.category,
  brand: $json.brand,
  timestamp: new Date().toISOString()
};
```

From here, you have two choices for final output. Option 1: write to a database table that your e-commerce platform's import process pulls from. Option 2: format the output as a CSV or JSON file, upload it to your storage system, and trigger your platform's batch import endpoint. The exact shape differs for Shopify, WooCommerce, and custom platforms, so adjust accordingly. If you're using a database, add a "Database" node (configured for your system: PostgreSQL, MySQL, etc.) with an INSERT or UPSERT statement:

```sql
INSERT INTO product_catalogue (sku, product_name, description, image_url, category, brand, generated_at)
VALUES ($1, $2, $3, $4, $5, $6, $7)
ON CONFLICT (sku) DO UPDATE SET
  description = EXCLUDED.description,
  image_url = EXCLUDED.image_url,
  generated_at = EXCLUDED.generated_at;
```

Add error handling at each stage. n8n nodes have built-in "Retry On Fail" and "Continue On Fail" settings; use them to capture failures from API calls. If Copy.ai times out, retry twice before logging the product SKU to a failure list. If image processing fails, fall back to the original image and flag it for manual review. Store failures in a separate table so you can audit what went wrong. Finally, add a "Notification" node at the end. Send yourself an email summary:
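The retry-then-fallback pattern can be sketched in an n8n Code node as plain JavaScript. Here `callCopyAi` and the `failures` list are stand-ins for your actual API call and failure store, not real node outputs:

```javascript
// Retry a flaky async call a fixed number of times before giving up.
async function withRetries(fn, attempts = 3) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}

// Try the (hypothetical) Copy.ai call up to three times; on total failure,
// record the SKU for the failure list and let the caller substitute a fallback.
async function describeOrFlag(product, callCopyAi, failures) {
  try {
    return await withRetries(() => callCopyAi(product), 3);
  } catch (err) {
    failures.push({ sku: product.sku, error: String(err) });
    return null;
  }
}
```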

```
Batch processing complete.
Total products: {{$json.total_count}}
Successfully processed: {{$json.success_count}}
Failed: {{$json.failure_count}}
Processing time: {{$json.duration_seconds}}s
```


## The Manual Alternative

If you prefer more control over descriptions or want to cherry-pick which images get enhanced, use a semi-automated approach. Generate descriptions in Copy.ai first, review them, then export approved ones. Upload those SKUs to a separate n8n workflow that only handles image processing. This slows things down but gives you checkpoints to catch quality issues before they hit your live catalogue. For more on this, see [Automated legal document review and client summary generation](/blog/automated-legal-document-review-and-client-summary-generation).

## Pro Tips 

### Rate limiting and costs.

Copy.ai and AI Boost both have rate limits: Copy.ai allows around 100 requests per minute on standard plans, and AI Boost is similar. If you're processing 500 products, space requests out (in n8n, a "Loop Over Items" node paired with a "Wait" node can pace calls at roughly 50 requests per minute) to avoid hitting limits and incurring overage charges. This extends processing time from 5 minutes to 10, but it's cheaper.
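If you need the same pacing outside n8n, the arithmetic is simple: 50 requests per minute means one call every 1,200 ms. A minimal sketch (`processPaced` and its handler are illustrative names, not part of any API):

```javascript
// Pace sequential API calls to stay under a per-minute rate limit.
const REQUESTS_PER_MINUTE = 50;
const DELAY_MS = Math.ceil(60000 / REQUESTS_PER_MINUTE); // 1200 ms between calls

async function processPaced(items, handler) {
  const results = [];
  for (const item of items) {
    results.push(await handler(item));
    // Sleep between calls so throughput never exceeds the limit.
    await new Promise((resolve) => setTimeout(resolve, DELAY_MS));
  }
  return results;
}
```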

### Image file size.

Original product images often exceed 5 MB. Pixelcut and AI Boost work faster with images under 3 MB. Add an "HTTP Request" node before Pixelcut to download and compress the original using a free service like TinyPNG's API, then pass the optimised version into Pixelcut. This saves both API processing time and bandwidth costs.
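Tinify (TinyPNG's API) accepts a source image URL and returns the compressed result's location in the response's `Location` header. A sketch of building that request; the environment variable name is an assumption, and you'd follow up with a `fetch` using these options:

```javascript
// Build a Tinify compression request for an image hosted at a URL.
// Auth is HTTP Basic with username "api" and your API key as the password.
function buildShrinkRequest(imageUrl, apiKey) {
  return {
    url: 'https://api.tinify.com/shrink',
    options: {
      method: 'POST',
      headers: {
        Authorization: 'Basic ' + Buffer.from('api:' + apiKey).toString('base64'),
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ source: { url: imageUrl } }),
    },
  };
}

// Usage sketch (network call omitted):
//   const req = buildShrinkRequest($json.image_url, process.env.TINIFY_API_KEY);
//   const res = await fetch(req.url, req.options);
//   const compressedUrl = res.headers.get('location'); // pass this to Pixelcut
```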

### Batch scheduling.

Don't process all 347 products at once. Run workflows in batches of 50 during off-peak hours (late evening or early morning) so API calls don't compete with live traffic. n8n's "Schedule Trigger" node lets you set this up; run the webhook at 2 AM daily with the next 50 SKUs. 
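Picking "the next 50 SKUs" each night reduces to filtering out what's already done. A small helper you could drop into a Code node (the function and its arguments are illustrative, not an n8n API):

```javascript
// Select the next batch of unprocessed SKUs for the nightly run.
function nextBatch(allSkus, processedSkus, batchSize = 50) {
  const done = new Set(processedSkus);
  return allSkus.filter((sku) => !done.has(sku)).slice(0, batchSize);
}
```

Feed the result into the webhook payload; once the batch succeeds, append its SKUs to the processed list.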

### Fallback descriptions.

If Copy.ai fails to generate a description (API outage, rate limit hit), have a fallback. Add a conditional node that checks the response; if it's empty, populate the description field with a template like "{{brand}} {{product_name}} in {{category}}. {{key_features}}." It's basic, but it ensures no product is left without text. 
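The conditional check plus template can be expressed directly in a Code node. A minimal sketch using the field names from the CSV above:

```javascript
// Basic templated fallback when no generated description is available.
function fallbackDescription({ brand, product_name, category, key_features }) {
  return `${brand} ${product_name} in ${category}. ${key_features}.`;
}

// Use the generated text if it's non-empty, otherwise the template.
function descriptionFor(product, generated) {
  return generated && generated.trim() ? generated : fallbackDescription(product);
}
```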

### Audit trail.

Log every API call and response to a spreadsheet or database. Include the prompt sent, the response received, the timestamp, and any errors. This helps you spot if a particular tool is producing poor results and adjust your approach before it goes live.
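One row per API call is enough; a sketch of the record shape (field names are suggestions, matching the fields listed above):

```javascript
// Build one audit row per API call: what was sent, what came back, and when.
function auditRecord({ sku, tool, prompt, response, error = null }) {
  return {
    sku,
    tool,
    prompt,
    response,
    error,
    logged_at: new Date().toISOString(),
  };
}
```

Write these rows to the same database as the catalogue, or append them to a spreadsheet via an n8n node.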

## Cost Breakdown

| Tool | Plan Needed | Monthly Cost | Notes |
|------|-------------|--------------|-------|
| Copy.ai | Pro | £49 | 100k words/month; overage £0.001 per word |
| Pixelcut AI | Professional | £29 | 500 image operations/month; each background removal counts as 1 operation |
| AI Boost | Growth | £99 | Unlimited upscaling and enhancement within fair-use limits |
| n8n | Pro Cloud | £30 | 5k workflow executions/month; 500 SKUs × 2 operations = 1k executions, plenty of headroom |
| **Total** | | **£207** | Enough capacity for ~1,500 products/month |

If your volume grows beyond 1,500 products monthly, Pixelcut becomes the bottleneck. Upgrade to their Agency plan (£299/month) for 5,000 operations. At that point, consider self-hosting n8n to save the cloud fee and redirect those savings toward image processing.