Stock Trading Analysis Report from Market Data and News
Stock market investors and analysts face a persistent problem: market-moving information arrives constantly, but synthesising it into actionable analysis takes hours of manual work. You're pulling data from multiple sources, reading through earnings reports and news articles, cross-referencing technical indicators, and then writing it all up into a coherent report. By the time you finish, the market has moved on.
What if you could automate this entire process? Instead of manually gathering data, reading news, and writing analysis, you could set up a workflow that runs on a schedule, pulls fresh market data and relevant news, analyses both together using AI, and generates a professionally formatted report with visualisations, all without touching a single file yourself. This is where combining multiple AI tools into an automated workflow becomes genuinely valuable.
In this guide, we'll build an advanced workflow that pulls stock data from financial APIs, fetches relevant news, analyses everything through multiple AI lenses, and outputs a complete trading analysis report with charts and infographics. We'll use four specialist tools and one of three orchestration platforms to connect them without any manual handoff.
The Automated Workflow
This workflow requires some technical setup, which is why we've marked it as advanced. However, the payoff is significant: a production-ready system that generates fresh analysis reports daily or on-demand.
The Data Flow
The workflow moves through four distinct stages. First, a trigger (either a schedule or API call) initiates the process. Second, financial data collection pulls market prices, volume data, and technical indicators. Third, news aggregation and analysis gathers relevant market news and sentiment. Fourth, the combined data flows through multiple AI analysis steps, which culminate in infographic generation.
Which Orchestration Tool to Use
For this advanced workflow, we recommend n8n or Make (Integromat) over Zapier. Here's why: both Zapier and Make can handle this, but n8n gives you more granular control over data transformation between steps, which matters when you're combining financial data with news and running multiple parallel analyses. n8n also runs on your own infrastructure if you want, reducing long-term costs for frequent executions.
That said, if your team knows Make well, it will work. Zapier works too, but you'll hit more limitations around data manipulation and parallel processing.
For this guide, we'll show n8n examples, with notes on how to adapt for Make.
Step 1: Trigger and Data Collection
Set up an n8n workflow triggered by a schedule node (daily at 08:00 GMT) or a manual webhook.
Workflow Trigger (Schedule Node)
├─ Cron: 0 8 * * * (daily at 08:00 GMT)
└─ Output: timestamp and run_id
Next, use Finster AI or FinChat to pull current stock data. Both have REST APIs. Here's the Finster AI approach:
GET /api/v1/stock/quote
Headers:
Authorization: Bearer YOUR_FINSTER_API_KEY
Content-Type: application/json
Query Parameters:
symbol=AAPL
metrics=price,volume,pe_ratio,market_cap,52week_high,52week_low
In n8n, add an HTTP Request node configured like this:
Method: GET
URL: https://api.finster.ai/v1/stock/quote
Authentication: Bearer Token (YOUR_FINSTER_API_KEY)
Query Params:
symbol: {{ $env.STOCK_SYMBOL }}
metrics: price,volume,pe_ratio,market_cap,52week_high,52week_low
Parse Response: JSON
Store the response in a variable for later use. The response should include current price, historical data points, and key metrics.
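One detail worth handling early: the metric names in the API query (`52week_high`, `52week_low`) differ from the field names the Step 3 Code node reads (`high_52week`, `low_52week`). A small normalisation function bridges the gap; the field names on both sides are assumptions, so verify them against your actual API payload:

```javascript
// Normalise the raw quote response into the shape the later Code node
// expects. Field names here are assumptions; check them against the
// actual Finster AI response before relying on this.
function normaliseQuote(raw) {
  return {
    symbol: raw.symbol,
    price: Number(raw.price),
    high_52week: Number(raw['52week_high']),
    low_52week: Number(raw['52week_low']),
    pe_ratio: raw.pe_ratio,
    market_cap: raw.market_cap,
    volume: raw.volume,
    change_1d: raw.change_1d
  };
}
```

Run this in a Code node directly after the HTTP Request node so every downstream step sees one consistent shape.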
Step 2: Fetch and Analyse News
Parallel to the stock data pull, fetch relevant news using FinChat's news API or a general news aggregation API.
GET /api/v1/news/search
Headers:
Authorization: Bearer YOUR_FINCHAT_API_KEY
Query Parameters:
query=AAPL earnings OR AAPL acquisition
limit=10
time_range=7d
sort=relevance
In n8n, add another HTTP Request node:
Method: GET
URL: https://api.finchat.io/v1/news/search
Authentication: Bearer Token (YOUR_FINCHAT_API_KEY)
Query Params:
query: {{ $env.STOCK_SYMBOL }} earnings OR product
limit: 10
time_range: 7d
sort: relevance
Parse Response: JSON
This runs in parallel with the stock data fetch; join the two branches with an n8n Merge node so that both results are available before the next step.
Step 3: Aggregate and Prepare Data for Analysis
Use an n8n Set node or Code node to combine the stock data and news into a single structured prompt for ChatGPT Writer.
// Stock data from the Finster branch, news from the FinChat branch
// (joined upstream so both arrive as inputs to this Code node).
const stockData = $input.first().json.stock;
const newsArticles = $input.last().json.articles;
const analysisPrompt = `
You are a professional stock analyst. Analyse the following market data and news.
CURRENT STOCK DATA:
- Symbol: ${stockData.symbol}
- Current Price: $${stockData.price}
- 52-Week High: $${stockData.high_52week}
- 52-Week Low: $${stockData.low_52week}
- P/E Ratio: ${stockData.pe_ratio}
- Market Cap: $${stockData.market_cap}
- Volume (Today): ${stockData.volume}
- Change (1D): ${stockData.change_1d}%
RECENT NEWS:
${newsArticles.map(article => `
- ${article.title}
Date: ${article.date}
Sentiment: ${article.sentiment}
Summary: ${article.summary}
`).join('\n')}
Generate a concise but thorough trading analysis covering:
1. Current market sentiment based on news and price action
2. Key technical levels and support/resistance zones
3. Risk factors identified in recent news
4. Short-term trading thesis (next 5-10 trading days)
5. Recommended actions for traders
Format as structured, numbered sections suitable for a professional report.
`;
return {
json: {
analysisPrompt: analysisPrompt,
stockSymbol: stockData.symbol,
reportDate: new Date().toISOString().split('T')[0],
stockData: stockData,
newsCount: newsArticles.length
}
};
This Code node prepares your data in a format that ChatGPT Writer expects, making the next step cleaner.
Step 4: Generate Analysis via ChatGPT Writer
ChatGPT Writer (or any API-compatible GPT wrapper) takes the aggregated prompt and generates the written analysis.
POST /api/v1/completions
Headers:
Authorization: Bearer YOUR_CHATGPT_WRITER_API_KEY
Content-Type: application/json
Body:
{
"prompt": "{{ $node['Prepare Data'].json.analysisPrompt }}",
"model": "gpt-4o",
"temperature": 0.7,
"max_tokens": 1200,
"top_p": 0.9
}
In n8n:
Method: POST
URL: https://api.chatgptwriter.ai/v1/completions
Authentication: Bearer Token (YOUR_CHATGPT_WRITER_API_KEY)
Body (JSON):
{
  "prompt": {{ JSON.stringify($node['Prepare Data'].json.analysisPrompt) }},
  "model": "gpt-4o",
  "temperature": 0.7,
  "max_tokens": 1200
}
Wrapping the prompt in JSON.stringify keeps the multi-line analysis text valid inside the JSON body.
Parse Response: JSON
Store the generated analysis text as reportText.
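The exact response shape depends on the wrapper; the helper below assumes an OpenAI-style `choices` array with either a `text` or a chat-style `message.content` field, both of which are assumptions to verify against the real payload:

```javascript
// Pull the generated analysis out of the completion response.
// The `choices` shape is an assumption (OpenAI-style); adjust the
// field lookups to match ChatGPT Writer's actual payload.
function extractReportText(response) {
  const choice = (response.choices || [])[0] || {};
  if (typeof choice.text === 'string') return choice.text;
  if (choice.message && typeof choice.message.content === 'string') {
    return choice.message.content;
  }
  return '';
}
```

Returning an empty string rather than throwing lets a later validation step decide whether to halt or retry.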
Step 5: Create Infographic via Text2Infographic
This is the visual component. Take key metrics from the stock data and the main points from the analysis, and generate an infographic.
POST /api/v1/generate/infographic
Headers:
Authorization: Bearer YOUR_TEXT2INFOGRAPHIC_API_KEY
Content-Type: application/json
Body:
{
"title": "Trading Analysis: {{ stockSymbol }} - {{ reportDate }}",
"sections": [
{
"title": "Current Price Action",
"data": {
"current_price": "{{ stockData.price }}",
"change_1d": "{{ stockData.change_1d }}%",
"52w_high": "{{ stockData.high_52week }}",
"52w_low": "{{ stockData.low_52week }}"
},
"chart_type": "metric_cards"
},
{
"title": "Key Analysis Points",
"data": "{{ reportText }}",
"chart_type": "text_summary"
},
{
"title": "News Sentiment",
"data": {
"positive": "{{ positiveNewsCount }}",
"neutral": "{{ neutralNewsCount }}",
"negative": "{{ negativeNewsCount }}"
},
"chart_type": "pie_chart"
}
],
"style": "professional",
"format": "pdf"
}
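The pie-chart section above references `positiveNewsCount`, `neutralNewsCount`, and `negativeNewsCount`, which no earlier step produces. Derive them in a Code node before this request; a sketch, assuming each article carries the `sentiment` field used in the Step 3 prompt:

```javascript
// Tally sentiment labels across the fetched articles for the pie chart.
// Assumes each article has a `sentiment` field ('positive' | 'neutral'
// | 'negative'); unlabelled articles are counted as neutral.
function countSentiments(articles) {
  const counts = { positive: 0, neutral: 0, negative: 0 };
  for (const article of articles) {
    const label = String(article.sentiment || 'neutral').toLowerCase();
    counts[label in counts ? label : 'neutral'] += 1;
  }
  return counts;
}
```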
In n8n:
Method: POST
URL: https://api.text2infographic.com/v1/generate
Authentication: Bearer Token (YOUR_TEXT2INFOGRAPHIC_API_KEY)
Body (JSON):
{
"title": "Trading Analysis: {{ $node['Stock Data'].json.symbol }} - {{ $now.format('yyyy-MM-dd') }}",
"sections": [
{
"title": "Current Price Action",
"data": {
"current_price": {{ $node['Stock Data'].json.price }},
"change_1d": {{ $node['Stock Data'].json.change_1d }},
"52w_high": {{ $node['Stock Data'].json.high_52week }},
"52w_low": {{ $node['Stock Data'].json.low_52week }}
},
"chart_type": "metric_cards"
},
{
"title": "Analysis Summary",
"data": {{ JSON.stringify($node['ChatGPT Analysis'].json.text) }},
"chart_type": "text_summary"
}
],
"format": "pdf"
}
Parse Response: JSON
The API returns a PDF file URL and metadata.
Step 6: Store and Distribute
Finally, save the generated report and send notifications.
Add a Google Drive or AWS S3 node to store the PDF:
Save to: /Trading Reports/{{ stockSymbol }}/{{ reportDate }}.pdf
Then add an email or Slack notification node:
To: your-email@example.com
Subject: Trading Analysis Report: {{ stockSymbol }} - {{ reportDate }}
Body: Your automated analysis report is ready. PDF attached or available at [link].
Attachment: {{ infographicPdfUrl }}
Full Workflow Summary in n8n
Schedule Trigger
├─ Fetch Stock Data (Finster AI)
├─ Fetch News Data (FinChat) [parallel]
├─ Prepare Aggregated Prompt (Code Node)
├─ Generate Analysis (ChatGPT Writer)
├─ Create Infographic (Text2Infographic)
├─ Save to Google Drive
└─ Send Email Notification
Each step passes its data forward; the two fetch branches run in parallel, which shortens the total workflow time.
The Manual Alternative
If you prefer more control or your situation requires flexibility, you can run parts of this workflow manually whilst automating others.
For instance, automate the data collection and news gathering (steps 1-2), then manually review the data before writing your analysis. This hybrid approach works well if you want the workflow to prepare your raw materials but want human judgement on interpretation.
Alternatively, automate data collection and report generation (steps 1-5), but have a human review the generated analysis before distribution (step 6). This adds a quality gate without losing the efficiency gains.
The advantage of a fully automated approach is consistency and speed; the advantage of a manual review step is that you catch errors, bias, or market conditions that the AI might miss.
Pro Tips
Rate Limiting and Throttling
Financial APIs often rate-limit requests. Finster AI typically allows 100 requests per minute on their standard plan; FinChat allows 50 per minute. If you're running this workflow multiple times daily or across multiple stock symbols, queue them sequentially rather than in parallel.
In n8n, add a Wait node between each API call:
Wait 2 seconds between Finster calls
Wait 2 seconds between FinChat calls
This keeps you well below rate limits and avoids expensive retry logic.
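If you iterate over several symbols inside a single Code node instead of chaining Wait nodes, the same throttling can be done with a short sleep between calls. A sketch; `fetchQuote` is a placeholder for your actual HTTP call:

```javascript
// Fetch quotes one symbol at a time, pausing between calls to stay
// under per-minute rate limits. `fetchQuote` is a placeholder for
// your real HTTP request.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function fetchQuotesSequentially(symbols, fetchQuote, delayMs = 2000) {
  const results = [];
  for (const symbol of symbols) {
    results.push(await fetchQuote(symbol));
    await sleep(delayMs);
  }
  return results;
}
```

The sequential loop trades a few seconds of runtime for the certainty that you never burst past the limit.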
Error Handling
Add Try-Catch blocks (or n8n's Error Handling node) around each external API call. If the stock data fetch fails, you might want to halt the workflow or use cached data from the previous run. If news fetching fails, the workflow can proceed with just stock data.
HTTP Request Node
├─ Success Path → Continue to next step
└─ Error Path → Log error, check for cached data, or notify admin
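In code form, the fallback branch looks like this; `fetchFresh` and `readCache` are placeholders for your HTTP call and however you store the previous run's data:

```javascript
// Try the live API first; fall back to the previous run's cached data.
// `fetchFresh` and `readCache` are placeholders for your own steps.
async function fetchWithFallback(fetchFresh, readCache) {
  try {
    return { data: await fetchFresh(), fromCache: false };
  } catch (err) {
    const cached = readCache();
    if (cached) return { data: cached, fromCache: true };
    throw err; // no fallback available: halt the workflow and notify
  }
}
```

The `fromCache` flag lets the report itself disclose when it was built from stale data.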
Cost Optimisation
Each tool charges based on usage. To reduce costs:
- Batch workflows: instead of running once daily, run weekly and analyse 5 stocks in one execution.
- Cache static data: store sector indices and market-wide data once, reuse across multiple stock analyses.
- Use cheaper models where appropriate: GPT-4o mini is cheaper than GPT-4o and often sufficient for straightforward analysis.
- Set token limits on ChatGPT Writer to prevent runaway costs; 1200 tokens is usually enough.
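To sanity-check the token budget, a quick back-of-envelope calculation helps; the per-1k-token price below is an illustrative placeholder, not a quoted rate:

```javascript
// Rough monthly spend on the analysis step.
// pricePerKToken is a placeholder; plug in your provider's actual rate.
function monthlyAnalysisCost(runsPerDay, tokensPerRun, pricePerKToken) {
  return runsPerDay * 30 * (tokensPerRun / 1000) * pricePerKToken;
}
```

For example, one 1,200-token report per day at an assumed $0.025 per 1k tokens works out to roughly $0.90 a month, which is why the generation step is rarely the dominant cost in this stack.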
Handling Missing or Stale Data
Sometimes a financial API returns incomplete data (missing volume, delayed prices, or news gaps). Add validation:
if (!stockData.price) {  // catches a missing, null, or zero price
  throw new Error('Missing current price data');
}
if (newsArticles.length === 0) {
console.warn('No recent news found; continuing with price data only');
}
This ensures your workflow doesn't silently generate a report based on incomplete information.
Testing Before Full Automation
Before scheduling this workflow to run daily, test it end-to-end manually. Run it once, review the output, and verify that each tool produced what you expected. Check that the PDF is readable, that the email arrives, and that the analysis makes sense. Only then schedule it.
Cost Breakdown
| Tool | Plan Needed | Monthly Cost | Notes |
|---|---|---|---|
| n8n | Self-hosted or Cloud Professional | £0-29 | Self-hosted is free; Cloud Pro includes 10k workflow executions |
| Finster AI | API Starter | $49 | 100 requests/min, covers 5-10 daily stock analyses |
| FinChat | API Standard | $79 | 50 requests/min, includes news and fundamental data |
| ChatGPT Writer | API pay-as-you-go | $10-20 | Based on tokens; ~1,200 tokens per report, ~$0.03 per execution |
| Text2Infographic | API Standard | $39 | Up to 100 infographics/month |
| Storage (Google Drive/AWS) | Free tier or standard | £0-10 | Free for <100GB; PDFs are small (1-3MB each) |
| Email/Slack notification | Free tier | £0 | Built into n8n; no additional cost |
| Total | | £177-257/month | Suitable for active traders analysing 5-10 stocks regularly |
If you analyse only one stock weekly, costs drop by 50-70%. If you're running this for a team of traders, enterprise plans reduce per-report costs significantly.
This workflow represents advanced orchestration, but the payoff is tangible: instead of spending 2-3 hours manually assembling and writing analysis, you get a professional report in minutes. The initial setup takes a few hours, but it pays for itself within weeks through recovered time alone.
More Recipes
Competitive market intelligence dashboard from pricing and product data
E-commerce and SaaS companies lack real-time visibility into competitor pricing, features, and positioning to inform their own strategy.
Academic paper digesting pipeline for research synthesis
Researchers spend hours reading and synthesising papers when they need to extract key findings and citations quickly.
Build an AI Research Assistant Stack