Alchemy Recipe · Beginner · Workflow

Podcast episode to social clips, show notes, and email newsletter in one workflow

Your latest podcast episode is recorded, edited, and uploaded. Now comes the work that doesn't show in the RSS feed: transcribing 90 minutes of audio, identifying the five best moments for short-form clips, writing show notes that actually get read, and crafting social media posts that don't sound like every other podcast creator's. Most shows do this work manually, which means the creator loses 8 to 12 hours per episode to administrative work that machines should handle. You can cut that time to under an hour by chaining together four specialist tools and one orchestration platform. The workflow runs almost entirely without human intervention: you upload the episode file, and within the hour you have clips, show notes, an email-ready summary, and social media carousels ready to post. This post walks through the setup using n8n, which offers the best balance of free-tier usability and native integrations for podcast workflows.

The Automated Workflow

The workflow operates in four phases: transcription and analysis, clip generation, content creation, and distribution preparation. Data flows in one direction; each tool passes its output as input to the next.

Phase 1: Transcript and Summary

Start with your podcast file in cloud storage. Trigger the workflow via webhook when a new file lands in a designated folder (Google Drive, Dropbox, or S3 work equally well). Shownotes AI handles transcription and generates a structured summary in one call.

POST https://api.shownotes.ai/v1/transcribe
Authorization: Bearer YOUR_SHOWNOTES_API_KEY
Content-Type: application/json

{
  "audio_url": "https://storage.googleapis.com/your-bucket/episode-142.mp3",
  "language": "en",
  "format": "detailed",
  "include_timestamps": true
}

Shownotes AI returns a JSON object containing the full transcript with timestamps, a 200-word summary, and key talking points flagged by topic. Store this response in n8n as a variable; you'll reference it across the next three phases.
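The exact shape of that response isn't documented here, so treat the field names below as assumptions to adjust against what Shownotes AI actually returns. An n8n Code node along these lines can pull the key moments out of the stored response and shape them for the next phase:

```javascript
// Sketch of an n8n Code node. The response shape below is an assumption --
// adjust the field names to match what Shownotes AI actually returns.
const transcriptData = {
  transcript: [
    { start: 12.4, end: 31.0, text: "So the first thing sponsors ask..." },
    { start: 105.2, end: 140.8, text: "Dynamic ad insertion changed..." },
  ],
  summary: "A 200-word episode summary...",
  key_points: [
    { timestamp: 12.4, topic: "sponsorship" },
    { timestamp: 105.2, topic: "ad tech" },
  ],
};

// Format the flagged talking points as a timestamp array for clip extraction.
function extractKeyMoments(data) {
  return data.key_points.map((point) => ({
    start_seconds: point.timestamp,
    label: point.topic,
  }));
}

const moments = extractKeyMoments(transcriptData);
console.log(moments);
```

In the real workflow, `transcriptData` would come from the previous HTTP Request node's output rather than being hard-coded.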

Phase 2: Clip Extraction

Clipwing works best with video files, but podcast creators typically export an MP4 with a static image. If your episode is audio-only, convert it to video using FFmpeg (a free command-line tool) before sending to Clipwing. Feed Clipwing the video file URL and the transcript from Phase 1. Clipwing's API accepts a list of highlight moments; you can either let it auto-detect interesting segments or pass specific timestamps from the Shownotes AI output.
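For the FFmpeg conversion step, a one-liner like this turns an audio file plus a static cover image into an MP4 (file names here are placeholders; swap in your own paths):

```shell
# Loop a single cover image for the full length of the audio track.
# -tune stillimage optimises x264 for a static frame; -shortest stops
# the video when the audio ends.
ffmpeg -loop 1 -framerate 1 -i cover.png -i episode-142.mp3 \
  -c:v libx264 -tune stillimage -pix_fmt yuv420p \
  -c:a aac -b:a 192k -shortest episode-142.mp4
```

You can run this locally or in an n8n Execute Command node before the Clipwing call.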

POST https://api.clipwing.com/v1/clips/generate
Authorization: Bearer YOUR_CLIPWING_API_KEY
Content-Type: application/json

{
  "video_url": "https://storage.googleapis.com/your-bucket/episode-142.mp4",
  "transcript": "Full transcript text from Shownotes AI",
  "auto_detect": true,
  "max_clips": 6,
  "clip_duration_seconds": [30, 45, 60],
  "output_format": "mp4"
}

Clipwing generates 4 to 8 short clips (you set the target count) and uploads them to your storage bucket. It returns URLs and timecode data for each clip.

Phase 3: Social Media Content

This is where the workflow multiplies value. You now have clips; Copy.ai and Mirra turn them into distribution-ready social posts. First, send the episode summary and clip metadata to Copy.ai. Ask it to generate five captions suited to different platforms: LinkedIn (professional angle), Twitter (punchy thread), Instagram (storytelling), TikTok (trend-forward), and email subject lines.

POST https://api.copy.ai/v1/generate
Authorization: Bearer YOUR_COPYAI_API_KEY
Content-Type: application/json

{
  "input": {
    "episode_title": "The Future of Podcast Monetisation",
    "summary": "200-word summary from Shownotes AI",
    "clip_durations": [45, 60, 30],
    "tone": "informative, conversational"
  },
  "templates": [
    "social_caption_linkedin",
    "social_caption_twitter_thread",
    "social_caption_instagram",
    "social_caption_tiktok",
    "email_subject_lines"
  ]
}

Copy.ai returns platform-specific captions. Store these in a spreadsheet or content calendar tool for your review (this is your single manual checkpoint; you'll spend 10 minutes approving or tweaking captions). Next, feed your clips to Mirra to generate carousel videos and short-form content variations. Mirra excels at taking one clip and adapting it for different formats.

POST https://api.mirra.ai/v1/create_carousel
Authorization: Bearer YOUR_MIRRA_API_KEY
Content-Type: application/json

{
  "clip_url": "https://storage.googleapis.com/your-bucket/clip-001.mp4",
  "platforms": ["instagram", "tiktok", "youtube_shorts"],
  "style": "dynamic_captions",
  "music": "energetic_background_track"
}

Mirra returns multiple versions of each clip, optimised for aspect ratio, caption placement, and audio levels per platform.

Phase 4: Orchestration in n8n

Wire everything together using n8n. Start with a webhook trigger that fires whenever a new audio file appears in your cloud storage.

HTTP Webhook Trigger:
Event: File created in Google Drive folder "Podcast Episodes"
Extracted data: file_url, file_name, upload_timestamp

Create n8n nodes in this sequence:

1. HTTP Request node: Call the Shownotes AI transcription endpoint; store the response in variable transcript_data
2. Code node: Extract timestamps and key moments from transcript_data; format them as a JSON array for Clipwing
3. HTTP Request node: Call the Clipwing API with the transcript and timestamps; store clip URLs in variable clip_urls
4. HTTP Request node: Call Copy.ai with the episode summary; collect captions in variable social_captions
5. Loop node: Iterate over each clip URL in clip_urls
6. HTTP Request node (inside loop): Call Mirra for each clip; collect carousel URLs in array carousel_outputs
7. Email node: Draft an email with the Shownotes AI summary, Copy.ai captions, and Mirra carousel links; send it to the podcast creator for approval
8. Google Sheets node: Log all output URLs, captions, and timestamps to a master content calendar spreadsheet

Here's a simplified pseudocode structure for the n8n workflow:

Webhook Trigger (new file in Google Drive)
  ↓
Get File URL from Trigger
  ↓
Call Shownotes AI (transcribe + summarise)
  ↓
Extract Key Moments from Transcript
  ↓
Call Clipwing (generate clips)
  ↓
For Each Clip URL:
  ├─ Call Mirra (create carousel variations)
  └─ Store carousel URLs
  ↓
Call Copy.ai (generate captions)
  ↓
Send Email with All Assets (summary, clips, captions, carousels)
  ↓
Log Everything to Google Sheets Content Calendar
  ↓
Done

The entire sequence runs asynchronously; you don't wait for it to complete. The email arrives 30 to 45 minutes after you upload the episode file. You review the captions (optional but recommended), approve with one click in the email, and everything publishes to your content calendar.

The Manual Alternative

If you prefer more control or need custom edits at each stage, replace the n8n orchestration with individual tool calls. This approach takes 2 to 3 hours per episode but lets you inspect every clip, rewrite every caption, and adjust Mirra's carousel design. Process: Transcribe with Shownotes AI, manually identify clip moments in the transcript, upload clips to Clipwing separately, write captions in Copy.ai, generate carousels in Mirra, then collate everything into a spreadsheet. You gain flexibility and lose time; most successful shows move to full automation after one or two manual runs.

Pro Tips

Webhook stability and retries.

n8n's webhook can occasionally miss events if your cloud storage has latency spikes.

Enable retry logic on all HTTP requests; set exponential backoff with a maximum of 3 retries. Test your webhook with a sample file before going live.
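n8n's HTTP Request node has built-in retry options, but if you implement the backoff yourself in a Code node, the schedule is simple to compute. A minimal sketch (the 1-second base delay is an illustrative choice, not a tool default):

```javascript
// Exponential backoff schedule: the wait doubles on each retry,
// capped at maxRetries attempts (matching the 3-retry cap above).
function backoffDelays(baseMs, maxRetries) {
  const delays = [];
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    delays.push(baseMs * 2 ** attempt);
  }
  return delays;
}

// With a 1-second base: wait 1s, then 2s, then 4s before giving up.
const schedule = backoffDelays(1000, 3);
console.log(schedule);
```

Pair each delay with a re-issued HTTP request, and surface a failure notification once the schedule is exhausted.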

Rate limiting.

Shownotes AI and Copy.ai both enforce rate limits on their free tiers. If you publish two episodes in a week, you'll hit Copy.ai's limit. Use n8n's built-in delay nodes to stagger requests across a day, or upgrade to their paid plans if you produce more than two episodes weekly.

Clip quality filtering.

Clipwing sometimes flags moments you don't want clipped (dead air, audio glitches, off-topic tangents). Add a Code node after Clipwing that filters out clips under 20 seconds or without meaningful speech; this reduces false positives.
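That filtering Code node might look like the sketch below; the `duration_seconds` and `transcript_text` field names are assumptions to map onto Clipwing's actual response, and the 15-word floor is an illustrative proxy for "meaningful speech":

```javascript
// Drop clips that are too short or contain little actual speech.
// Field names are assumptions; adjust to Clipwing's real response shape.
const MIN_DURATION_SECONDS = 20;
const MIN_WORDS = 15;

function filterClips(clips) {
  return clips.filter((clip) => {
    const wordCount = clip.transcript_text.trim().split(/\s+/).length;
    return clip.duration_seconds >= MIN_DURATION_SECONDS && wordCount >= MIN_WORDS;
  });
}

const clips = [
  { url: "clip-001.mp4", duration_seconds: 45, transcript_text: "Here is a moment where the guest explains exactly how dynamic ad insertion reshaped podcast revenue over the last five years." },
  { url: "clip-002.mp4", duration_seconds: 12, transcript_text: "Um, yeah." },
  { url: "clip-003.mp4", duration_seconds: 30, transcript_text: "Right. Okay. Sure. Yeah." },
];

const keep = filterClips(clips);
console.log(keep.map((c) => c.url)); // ["clip-001.mp4"]
```

Only clips passing both checks continue to the Mirra loop, which also trims your Mirra processing minutes.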

Mirror captions to multiple formats.

Copy.ai generates captions once; use n8n's duplicate node feature to create variants for LinkedIn, Twitter, and email simultaneously rather than making separate API calls. This saves quota and speeds execution.

Cost control.

The per-minute cost of transcription (Shownotes AI) dominates your spend. If budget is tight, process one episode per week rather than daily. n8n's free tier allows up to 10 executions daily, which comfortably covers two to three episodes per week.

Cost Breakdown

| Tool | Plan Needed | Monthly Cost | Notes |
| --- | --- | --- | --- |
| Shownotes AI | Starter | £25 | Includes 10 hours of transcription; overage £2.50 per hour |
| Clipwing | Pro | £45 | Unlimited clips; 1,000 minutes of video processing per month |
| Copy.ai | Standard | £49 | 100,000 words generated monthly; includes API access |
| Mirra | Creator | £39 | Unlimited carousels; 500 minutes of processing per month |
| n8n Cloud | Free tier | £0 | Sufficient for 2–3 episodes per week; upgrade to Pro (£20) if you run daily workflows |
| Total | All combined | £158–£178 | Assumes one episode per week; scales linearly with volume |