Meridian runs a fleet of six AI agents that continuously monitor, audit, and optimize your site's SEO. Here's what they do, when they run, and how the system works under the hood.
Six specialist agents work together on your SEO. Each one handles a different part of the process.
**Tech Auditor:** Crawls your site via sitemap and audits every page for title tags, meta descriptions, H1s, schema markup, and CTR issues. Produces copy-paste fixes you can implement immediately. Runs daily batch audits (number of pages depends on your plan tier).

**Content Strategist:** Analyzes keyword opportunities, competitor gaps, and trending topics in your niche. Creates content briefs with target keywords, search intent analysis, and detailed outlines.

**Content Creator:** Writes SEO-optimized drafts in your brand voice. Follows anti-AI-slop rules to keep content natural and readable. All drafts go to your review queue for approval before publishing.

**Link Builder:** A three-phase pipeline: discovers prospects (Haiku), qualifies them with web research and real backlink metrics (Sonnet), then writes hyper-personalized outreach (Opus). Enriches every prospect with domain rank and referring domains from DataForSEO, and includes competitor backlink gap analysis.

**Trend Scout:** Monitors industry news, algorithm updates, and competitor moves daily. Surfaces opportunities and threats so you can act before the rest of your market catches on.

**Coordinator:** Synthesizes all agent outputs, prioritizes actions, and handles your messages. Acts as the single point of contact between you and the rest of the agent fleet.
Everything runs on autopilot. Here's what happens and when.
| Task | Schedule | Description |
|---|---|---|
| Memory consolidation | Sunday 1 AM | Merge related memories, remove duplicates, cap at 30 per site |
| Sitemap refresh | Monday 2 AM | Re-fetch sitemaps and discover new pages |
| Rankings pull | Monday 4 AM | DataForSEO SERP positions for all tracked keywords |
| Backlinks pull | Monday 5 AM | DataForSEO backlink profile snapshots (domain rank, referring domains) |
| GA4 analytics pull | Monday 5:30 AM | Google Analytics 4 sessions, engagement, conversions, page-level data |
| Tech Auditor | Monday 9 AM | AI analysis of recently crawled pages + GSC data |
| Content Pipeline | Tuesday 8 AM | Trends → Briefs → Drafts (three agents in sequence) |
| Prospecting cycle | Wednesday 10 AM | Link Builder → auto-send outreach → follow-up checks |
| Daily crawl batch | Daily 3 AM | Crawl N pages per tier (5–20 pages/day) |
| Trend Scout | Daily 7 AM | Industry monitoring — news, competitors, algorithm updates |
| Follow-up check | Daily 9 AM | Send follow-ups to prospects past their delay window |
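The daily follow-up check amounts to a simple filter over contacted prospects. A minimal sketch; the field names and the default five-day delay window are illustrative assumptions, not Meridian's actual schema:

```python
from datetime import datetime, timedelta

def prospects_due_followup(prospects, now):
    """Return prospects whose last outreach is older than their delay window.

    Hypothetical schema: each prospect is a dict with `status`,
    `last_contacted_at`, and an optional `followup_delay_days`.
    """
    due = []
    for p in prospects:
        delay = timedelta(days=p.get("followup_delay_days", 5))
        if p["status"] == "contacted" and now - p["last_contacted_at"] > delay:
            due.append(p)
    return due
```

Prospects that already replied (or were contacted inside the window) are skipped, so follow-ups never double-send.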
**Google Search Console:** Connected via OAuth. Provides clicks, impressions, CTR, and position data for your tracked properties. Pulled weekly with a 7-day current vs. previous period comparison. Feeds into the Tech Auditor, Coordinator, and Trend Scout agents.
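The 7-day current vs. previous comparison reduces to summing two adjacent windows of daily totals. A minimal sketch, illustrative only (the real pull goes through the Search Console API per property):

```python
def period_comparison(daily_clicks):
    """Compare the most recent 7 days of a metric against the prior 7.

    `daily_clicks` is a list of at least 14 daily totals, oldest first.
    """
    current = sum(daily_clicks[-7:])
    previous = sum(daily_clicks[-14:-7])
    change = (current - previous) / previous * 100 if previous else 0.0
    return {"current": current, "previous": previous, "change_pct": round(change, 1)}
```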
**Google Analytics 4:** Connected via OAuth. Provides sessions, users, engagement rate, bounce rate, conversions, and page-level behavior data. Pulled weekly. Feeds into the Coordinator and Content Strategist agents for conversion-weighted recommendations.
**DataForSEO:** Provides SERP position tracking, keyword research data, and backlink metrics. Weekly ranking pulls track your position vs. competitors across all tracked keywords. Backlink snapshots track domain rank, referring domains, and total backlinks over time.
**Backlink monitoring:** Domain rank, total backlinks, referring domains, and broken backlinks are tracked weekly via DataForSEO. Used for prospect scoring, competitor gap analysis, and site health monitoring. Each prospect is enriched with real backlink metrics during discovery.
Your sitemap.xml is parsed and pages are audited incrementally. The system discovers your sitemap automatically when you add a site. Crawl queue is site-scoped so multi-site tenants get independent queues.
| Feature | Starter ($49) | Growth ($99) | Pro ($199) | Agency ($499) |
|---|---|---|---|---|
| Websites | 1 | 3 | 10 | 50 |
| Pages audited/day | 5 | 10 | 15 | 20 |
| Max auditable pages | 100 | 500 | 2,000 | 5,000 |
| Content drafts/mo | 5 | 15 | Unlimited | Unlimited |
| Link prospects/mo | 20 | 75 | 200 | Unlimited |
| Agent frequency | Weekly | 3x/week | Daily | Daily |
| Team seats | 1 | 3 | 10 | 25 |
| Integrations | GSC | GSC, Slack, Telegram, WP | All | All + white-label |
The Tech Auditor combines automated crawling with AI analysis to find issues and generate exact fixes you can copy-paste into your site.
When you add a site, we automatically fetch your sitemap.xml (handling sitemap indexes and checking robots.txt for Sitemap: directives). URLs are stored in a crawl queue — up to 100–5,000 pages depending on your plan.
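Sitemap discovery can be sketched with two small stdlib helpers: one that pulls Sitemap: directives out of robots.txt, and one that reads either a sitemap or a sitemap index. Function names are our own, not Meridian's:

```python
import re
import xml.etree.ElementTree as ET

def sitemaps_from_robots(robots_txt):
    """Pull Sitemap: directives out of a robots.txt body (case-insensitive)."""
    return re.findall(r"(?im)^sitemap:\s*(\S+)", robots_txt)

def urls_from_sitemap(xml_text):
    """Return (page_urls, child_sitemaps) from a sitemap or sitemap index."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    locs = [e.text.strip() for e in root.findall(".//sm:loc", ns)]
    if root.tag.endswith("sitemapindex"):
        return [], locs  # caller recurses into each child sitemap
    return locs, []
```

A sitemap index yields child sitemaps to fetch; a plain urlset yields page URLs to enqueue.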
Each day, we crawl the next batch of pages from your queue. For each page we extract and store the title tag, meta description, H1/H2 tags, canonical URL, meta robots, JSON-LD schema, and OG tags.
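The extraction step can be approximated with Python's stdlib HTML parser. This sketch captures only the title, meta description, and H1s; the real auditor also stores canonical, robots, JSON-LD, and OG data:

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Minimal sketch of on-page extraction (title, meta description, H1s)."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1 = []
        self._in = None  # tag we are currently collecting text for

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.meta_description = a.get("content", "")
        elif tag in ("title", "h1"):
            self._in = tag

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in == "h1":
            self.h1.append(data.strip())
```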
After all pages are audited, the queue loops back to the oldest pages and re-audits them (round-robin).
We re-fetch your sitemap and add any new pages that weren't in the queue before. Existing pages are not duplicated.
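The queue behavior described above (daily batches, new pages appended without duplicates, oldest pages re-audited once the queue wraps) can be sketched as a rotating deque. Class and method names are our own:

```python
from collections import deque

class CrawlQueue:
    """Round-robin crawl queue sketch. Site-scoped in practice;
    this sketch holds a single site's queue."""

    def __init__(self, urls):
        self._q = deque(dict.fromkeys(urls))  # dedupe while keeping order

    def add_new(self, urls):
        """Append only URLs not already in the queue (e.g. after a sitemap refresh)."""
        known = set(self._q)
        self._q.extend(u for u in urls if u not in known)

    def next_batch(self, n):
        """Take the next n pages to audit and cycle them to the back."""
        batch = [self._q[i] for i in range(min(n, len(self._q)))]
        self._q.rotate(-len(batch))  # audited pages become the oldest again
        return batch
```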
The AI Tech Auditor receives the extracted page data from each crawl batch along with your Search Console metrics. It analyzes each page for issues and generates exact copy-paste fixes. Issues appear on the Technical Issues page, ready for review.
Every Tuesday at 8 AM UTC, three agents run in sequence to produce publish-ready content.
Scans Google News, Reddit, HackerNews, and industry publications for trending topics relevant to your business. Identifies competitor content moves and algorithm updates.
Receives the trend intelligence and analyzes your keyword gaps, then creates 2–3 detailed content briefs with target keywords, search intent analysis, and detailed outlines.
Writes full, publish-ready articles from the highest-priority brief, following strict anti-AI-slop rules to keep the writing natural and readable.
Drafts appear in your Content page ready for review.
Every conversation with the Coordinator extracts key learnings about you and your business.
Prompt injection attempts, system probing, and adversarial behavior are blocked from storage. Users can test the system without affecting future interactions.
Agents run via two distinct execution paths, chosen based on task complexity. Each agent receives a multi-layer context stack including persona, tenant memories, onboarding intake, brand voice, recent agent handoffs, knowledge base, and live data feeds (GSC, GA4, crawl data, trends).
Used by: Tech Auditor, Content Strategist, Content Creator, Link Builder (phases 1–2), Trend Scout
Used by: Coordinator, Link Builder (phase 3 outreach)
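Assembling the multi-layer context stack amounts to concatenating named layers and skipping any that are empty. A hypothetical sketch; the layer names follow the list above, but the ordering and formatting are assumptions:

```python
def build_context(persona, memories, intake, brand_voice, handoffs, knowledge, feeds):
    """Assemble a layered prompt context from the pieces described above."""
    layers = [
        ("persona", persona),
        ("memories", "\n".join(memories)),
        ("onboarding_intake", intake),
        ("brand_voice", brand_voice),
        ("recent_handoffs", "\n".join(handoffs)),
        ("knowledge_base", knowledge),
        ("live_data", feeds),  # GSC, GA4, crawl data, trends
    ]
    # Empty layers are dropped so agents never see blank sections.
    return "\n\n".join(f"## {name}\n{body}" for name, body in layers if body)
```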
Different models are used for different tasks to optimize cost and quality.
All data is scoped per-site. Switch sites in the sidebar dropdown — each site shows only its own data. Add sites via the + button (plan limits apply).
Invite team members and control what they can access.
The Link Builder uses three AI models in sequence for optimal cost and quality:
**Phase 1 — Discovery (Haiku):** A fast, cheap model identifies 15–20 candidate domains from competitor backlinks, resource pages, guest post opportunities, and industry directories.

**Phase 2 — Qualification (Sonnet):** Visits each prospect's site via web fetch, evaluates relevance, finds contact info, and reviews recent content. Each prospect is then enriched with real backlink metrics from DataForSEO (domain rank, backlinks, referring domains).

**Phase 3 — Outreach (Opus):** Writes hyper-personalized outreach emails referencing the prospect's specific recent articles. Under 150 words, conversational tone. Ready to send.
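The three-phase cascade reduces to feeding each phase's output into the next with a progressively stronger model. A hypothetical sketch, where `run_model` is a stand-in for whatever client actually calls the models:

```python
# Phase names and model tiers from the text; everything else is illustrative.
PHASES = [
    ("discover", "haiku"),   # cheap: list 15-20 candidate domains
    ("qualify", "sonnet"),   # mid: web research + backlink enrichment
    ("outreach", "opus"),    # top: hyper-personalized email drafts
]

def run_pipeline(seed, run_model):
    """Chain the phases: each phase's result is the next phase's input."""
    result = seed
    for phase, model in PHASES:
        result = run_model(model=model, task=phase, payload=result)
    return result
```

The design keeps cost in check: the expensive model only sees prospects that survived the cheaper filtering phases.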
On the Links page, click "Find Backlink Gaps" to discover domains linking to your competitors but not to you. Enter up to 5 competitor domains (suggested competitors auto-populate from your rank tracking data). Each gap prospect is enriched with real backlink metrics and scored automatically.
Every prospect can be enriched on-demand with real DataForSEO data: domain rank, total backlinks, and referring domains. Scores are recalculated using a weighted blend of AI assessment (40%), domain rank (35%), and referring domain count (25%).
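The weighted blend is a straight linear combination. A sketch; the text only specifies the weights, so normalizing referring-domain counts onto a 0–100 scale is our assumption:

```python
def prospect_score(ai_score, domain_rank, referring_domains):
    """Blend per the stated weights: AI 40%, domain rank 35%, referring domains 25%.

    Assumes ai_score and domain_rank are already on a 0-100 scale;
    referring-domain counts are capped and scaled (an assumption).
    """
    rd_norm = min(referring_domains, 1000) / 1000 * 100
    return round(0.40 * ai_score + 0.35 * domain_rank + 0.25 * rd_norm, 1)
```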