How AI-Powered SEO Systems Tripled One Client's Monthly Organic Revenue

Matt Diggity's Search Initiative grew a client from $166K to $491K monthly organic revenue using a three-layer AI automation system built on Make.com, ChatGPT, and Ahrefs.

Jamie Taylor · 7 min read

Tripling a client's monthly organic revenue sounds like a headline borrowed from a pitch deck. At The Search Initiative, Matt Diggity's SEO agency, it happened in measurable, reproducible steps, and the playbook is documented in enough detail that any agency can replicate it. Monthly revenue grew from $166K to $491K, a 196% increase. Organic traffic climbed 255% from 21,600 to 76,900 sessions per month. Keywords ranking in Google's top 10 reached a record high of 3,288. None of it happened through guesswork. It happened through a tightly engineered AI system built on three automation layers, each one targeting a different bottleneck in the SEO funnel.

The Client Problem: Stalled Performance and Lost Trust in SEO

The client came in with a clear and common diagnosis. Their products weren't ranking for key search terms, their content strategy wasn't built for scale, and they had previously invested heavily in SEO with little return. That last detail matters: restoring confidence in the channel was as important as fixing the technical gaps. The core mandate was to expand the site's visibility by creating new pages targeting keywords the client was missing, then securing the backlinks needed to make those pages rank. Failure to do either meant continued lost revenue and stalled growth.

Baseline First: Measuring Before Automating

Before any automation was switched on, The Search Initiative established clear baselines across three performance dimensions: monthly organic revenue ($166K), monthly organic sessions (21,600), and keyword rankings in the top 10 positions. These numbers served as the proof of concept anchor throughout the campaign. Agencies rolling out a similar system should document the same three metrics from day one, since they form the reporting narrative that justifies the AI investment to clients and tracks whether each automation layer is actually moving the needle.
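For teams replicating this, the baseline-to-result arithmetic is worth encoding from day one so that reporting is mechanical rather than ad hoc. The sketch below uses only the figures already cited in the case study; the function name is illustrative:

```python
def pct_growth(baseline: float, current: float) -> int:
    """Percent increase over a locked month-zero baseline, rounded to a whole percent."""
    return round((current - baseline) / baseline * 100)

# Month-zero baselines and campaign results from the case study
baselines = {"revenue": 166_000, "sessions": 21_600}
latest = {"revenue": 491_000, "sessions": 76_900}

for metric, base in baselines.items():
    print(f"{metric}: +{pct_growth(base, latest[metric])}%")
```

Running this against the revenue figures reproduces the 196% growth the campaign reported, which is the kind of check that keeps client-facing numbers auditable.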

Automation Layer One: Closing Content Gaps at Scale

The first pillar was content gap analysis, using Ahrefs to identify the keywords the client was missing relative to competitors. Bridging content gaps between a site and its competition is one of the most effective ways to grow keyword reach and build topical authority. Covering key topics more comprehensively signals to search engines that a site is a trusted source, which was critical for recovering the trust the client's prior campaigns had failed to build.

The gap analysis fed directly into a content production system built around three tools: Google Sheets, ChatGPT, and Make.com, a no-code automation platform. The workflow worked as follows:

1. Target keywords and supporting page briefs were loaded into a Google Sheets document.

2. Make.com connected Google Sheets to ChatGPT via an automated workflow, triggering AI-generated content drafts at scale.

3. The metadata automation ran as a parallel workflow: a Make.com module connected the keyword spreadsheet to an OpenAI module, generated optimized titles, descriptions, and meta tags, then wrote the results back to their corresponding rows in the sheet automatically.

The critical human checkpoint in this layer was prompt refinement. As the case study notes, small changes to the ChatGPT prompt can significantly improve the relevance and quality of output, and the team iterated prompts until results met quality thresholds before scaling volume. Human editors reviewed drafts before publication, a safeguard Matt Diggity's broader work consistently reinforces: AI can generate its own data and introduce errors, so human review is a non-negotiable gate in any production pipeline.

Automation Layer Two: Metadata at Scale Without Manual Entry

Metadata optimization, normally a page-by-page manual slog, became a fully automated background process. Once the Google Sheets content database was live, Make.com ran a module that pulled each row, sent the keyword and page context to ChatGPT, received the metadata output, and wrote the title and description back into the sheet. Agencies can clone this workflow directly: the logic is tool-agnostic, and the same approach works for existing pages being refreshed as well as new pages being published.

The efficiency gain here is significant. What might take an SEO team hours per week to maintain manually becomes a system that runs on a schedule. The reusability is the point: once built, the metadata workflow applies to both legacy pages and every new page added to the site.
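The metadata leg follows the same read-generate-write pattern. The function below is a stub, not the team's prompt; the 60- and 160-character budgets are common SEO conventions for titles and meta descriptions, not figures from the case study:

```python
def metadata_for(keyword: str, page_context: str) -> dict:
    """Stub for the OpenAI metadata module; real output comes from a tuned prompt."""
    title = f"{keyword.title()} Guide"[:60]  # ~60-char title budget (SEO convention)
    description = f"Learn about {keyword}. {page_context}"[:160]  # ~160-char description
    return {"title": title, "description": description}

row = {"keyword": "content gap analysis", "context": "Covers tools and workflow."}
row.update(metadata_for(row["keyword"], row["context"]))  # write back to the row
```

Enforcing the length budgets in code, rather than trusting the model to respect them, is a cheap guardrail worth keeping even once prompts are well tuned.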

Automation Layer Three: AI-Powered Backlink Outreach

The third pillar was the most labor-intensive to set up but produced outsized leverage: automated blogger outreach. Earning high-quality, niche-relevant backlinks is essential for SEO rankings, but sourcing relevant sites, personalizing pitches, and following up at scale is one of the most time-consuming tasks in any agency's workflow. Manual outreach works for a handful of targets. Scaling it requires automation.

The workflow again started in Ahrefs, where the team sourced niche-relevant websites capable of linking back to the client's site. Those targets were loaded into the automation system, and Make.com was configured to generate personalized outreach pitches using ChatGPT, with the AI output written into an outreach pitch field in the connected spreadsheet. Prompt tuning was again the lever for quality: the team iterated on the pitch prompt until outputs were genuinely personalized rather than templated, then scaled volume once the quality bar was met.

The result was a scalable link-building operation that could produce unique pitches at a pace no human team could match manually, while still reflecting the niche specificity that separates earned backlinks from ignored emails.
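A pitch prompt of this kind can be sketched as a template over per-site fields. The field names and wording below are hypothetical; the case study's actual prompts were iterated privately until outputs stopped reading as templated:

```python
def pitch_prompt(site: dict, client_page: str) -> str:
    """Build a personalized outreach prompt from Ahrefs-sourced site details (illustrative)."""
    return (
        f"Write a short outreach email to the editor of {site['domain']}, "
        f"referencing their recent post '{site['recent_post']}', and propose "
        f"linking to {client_page} where it adds value for their readers."
    )

prompt = pitch_prompt(
    {"domain": "example.com", "recent_post": "Example Post Title"},
    "https://client.example/new-page",
)
```

The structural point is that personalization signals (domain, a real recent post) are injected as data, so every generated pitch is anchored to something specific about the target site.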

The Rollout Blueprint: Steps Agencies Can Copy

For agencies looking to implement this system with a new client, the pilot structure follows a clear sequence:

1. Pilot selection: Choose a client with an established baseline, at least six months of Search Console and revenue data, and a product set with clear keyword gaps relative to identifiable competitors.

2. Baseline documentation: Lock in month-zero metrics for organic revenue, organic sessions, and top-10 keyword count before any automation is activated.

3. Content gap audit: Run Ahrefs competitor gap analysis to build a keyword target list, then load it into Google Sheets as the campaign's central data layer.

4. Automation build: Set up the Make.com workflows for content drafting and metadata generation, connecting Google Sheets to ChatGPT with prompt templates refined for the client's niche.

5. Human review gate: Establish editorial checkpoints before any AI-generated content or metadata goes live. Prompts should be refined iteratively until output quality consistently meets the agency's standard.

6. Outreach automation: Once content is live and indexed, activate the Ahrefs-sourced outreach workflow to build backlinks to new pages systematically.

7. Reporting cadence: Report against the three baseline metrics monthly, with keyword position counts as the leading indicator and revenue as the lagging confirmation.

Risks and Pitfalls to Disclose to Clients

No AI automation system is without risk, and transparency with clients before launch protects both the relationship and the results. The most important risks to surface:

  • AI hallucination in content and data: ChatGPT can generate plausible-sounding but inaccurate information. Every piece of AI-generated content requires human fact-checking, especially in YMYL-adjacent niches.
  • Prompt drift: Prompts that produce high-quality output in testing can degrade over time as model behavior changes. Build prompt review into the monthly workflow, not just the launch phase.
  • Outreach deliverability: High-volume automated outreach can trigger spam filters if sending domains and sequences aren't properly warmed up. Clients should understand that quality and domain hygiene matter as much as volume.
  • Lead time on results: Content gaps don't close overnight. The keyword and revenue gains in this case study reflect a sustained campaign, not a quick win. Set client expectations around a six-to-twelve-month horizon for compounding results.

What This Means for AI-Era SEO

The $166K-to-$491K result isn't an outlier. It's an illustration of what becomes possible when automation handles the volume work and human expertise handles the judgment calls. The architecture (Google Sheets as the data layer, ChatGPT as the content engine, Make.com as the orchestration layer, and Ahrefs as the intelligence source) is replicable without custom software or an engineering team. What it does require is disciplined baseline measurement, rigorous prompt engineering, and a human review process that treats AI as a capable but fallible collaborator. Agencies that build that infrastructure now are creating a durable competitive advantage in a search landscape where the volume and speed of content production have permanently changed the rules.
