
The strategy behind the video: how we combine AI, Google Search Console, and clean architecture to build a compounding SEO flywheel.
By Enzo Sison - Founder, Prism
Hey, I'm Enzo. We build and maintain websites for founders who actually care about growth: more qualified leads, better conversion rates, and higher lifetime value per customer.
In the video below, I walk through a live example using a Beverly Hills dental practice. This post slows it down and shows, step by step, how we turn AI + Google Search Console + clean site architecture into a compounding SEO flywheel you can use on any website.
You can copy a lot of this yourself. If you'd rather have a team run it end-to-end for you, that's what we do at Prism.
Everything we do is aimed at moving three numbers: qualified leads, conversion rate, and lifetime value per customer.
Most "SEO work" gets lost in vanity metrics: traffic, rankings, impressions. We use those, but only as inputs into a system designed to move the three numbers above. That system is the SEO flywheel.
A flywheel is a loop that gets easier to spin the more you spin it.
For SEO, our flywheel looks like this:
1. Pull real demand data from Google Search Console.
2. Have AI analyze and cluster the queries.
3. Map that demand against your sitemap and site architecture.
4. Sequence the work into a publishing roadmap.
5. Ship the pages.
6. Measure the results in Search Console and feed them back into step 1.
Most founders stop at step 1.5. They glance at GSC, feel overwhelmed, and go back to guessing. Let's walk through the whole loop.
Here's what we use under the hood at Prism: Google Search Console for demand data, a strong AI model for analysis and drafting, and our Codeex tooling for AI-assisted implementation.
You can absolutely approximate this with GSC, a solid AI model in any interface, and either your own dev team or a good website builder. The framework matters more than the exact tools.
If you don't have Search Console set up, do that first. No way around it.
Once it's running and you've got at least a few weeks of data, look at your impressions, clicks, CTR, and average position across your top queries and pages.
In the video example, the dental practice had roughly 14.5k impressions in 28 days, 66 clicks, and a 0.5% CTR. Not amazing, but it's real data, and real data is what we want. Even "bad" data is better than guessing.
This is where AI takes the pain away.
Export the Queries table from GSC (or copy it out of the UI if you have to). Then feed it to your model with a prompt along these lines:
You are an SEO strategist.
Here is a Google Search Console export for [BUSINESS NAME].
Each row has: query, impressions, clicks, CTR, average position.
1. Group the queries by intent (informational, commercial, navigational, local, etc.).
2. Identify:
- which queries are already converting (high CTR, decent position),
- which queries are sleepers (high impressions, low CTR),
- which queries show unmet intent (we rank but do not fully answer the question).
3. Propose 10-20 new page or content ideas that would best capture and convert this demand.
4. Prioritize those ideas for business impact, not just traffic volume.
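If the export is large, you can also pre-screen it for sleepers before pasting anything into the model. Here's a minimal Python sketch; the column names match GSC's standard CSV export but treat them as assumptions (they vary by account language), and the thresholds are arbitrary numbers to tune for your site:

```python
# Pre-screen a GSC "Queries" CSV export for sleepers before AI analysis.
import csv

SLEEPER_IMPRESSIONS = 500   # assumption: tune to your site's volume
SLEEPER_CTR = 0.01          # assumption: under 1% CTR counts as a sleeper

def load_queries(path):
    # GSC's standard export names these columns "Top queries", "Clicks",
    # "Impressions", "CTR", "Position" -- adjust if yours differ.
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            yield {
                "query": row["Top queries"],
                "impressions": int(row["Impressions"].replace(",", "")),
                "clicks": int(row["Clicks"].replace(",", "")),
                "ctr": float(row["CTR"].rstrip("%")) / 100,
            }

def sleepers(rows):
    # "Sleepers": lots of impressions, almost no clicks.
    hits = [r for r in rows
            if r["impressions"] >= SLEEPER_IMPRESSIONS and r["ctr"] < SLEEPER_CTR]
    return sorted(hits, key=lambda r: r["impressions"], reverse=True)

if __name__ == "__main__":
    for r in sleepers(load_queries("Queries.csv")):
        print(f"{r['query']}: {r['impressions']} impressions, {r['ctr']:.1%} CTR")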
For our Beverly Hills client, the model came back with clusters like "veneers Beverly Hills," "cosmetic dentist," and "full mouth reconstruction," plus gaps where people were searching for specific procedures or FAQs that the site barely covered. The AI-generated roadmap included service pages, FAQ pages, location-optimized pages, and educational blog posts.
The important thing: you're no longer staring blindly at a CSV. The AI is summarizing, clustering, and ranking it for you.
Queries tell you what people want. Your sitemap tells you how your site is currently structured to answer (or ignore) that demand.
For any site, you can usually hit:
https://yourdomain.com/sitemap.xml
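If you'd rather not copy raw XML by hand, a short script can flatten the sitemap (including sitemap index files that point at child sitemaps) into a plain list of URLs. A minimal sketch, assuming the standard sitemaps.org namespace; the domain is a placeholder:

```python
# Flatten a sitemap (or sitemap index) into a plain list of page URLs.
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return ET.fromstring(resp.read())

def sitemap_urls(url):
    root = fetch(url)
    if root.tag.endswith("sitemapindex"):
        # A sitemap index lists other sitemaps; recurse into each one.
        for loc in root.findall("sm:sitemap/sm:loc", NS):
            yield from sitemap_urls(loc.text.strip())
    else:
        for loc in root.findall("sm:url/sm:loc", NS):
            yield loc.text.strip()

if __name__ == "__main__":
    for page in sitemap_urls("https://yourdomain.com/sitemap.xml"):
        print(page)
```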
Copy or export that XML, then feed it to the model as a second piece of context:
Here is the current sitemap for the same website.
Each URL represents a page. Use this together with the GSC analysis you just did.
1. Map the existing URLs to the query clusters and page ideas you proposed.
2. Identify:
- which existing pages should be improved or expanded,
- which entirely new pages need to be created,
- which pages are redundant or confusing.
3. For each new page you recommend, suggest:
- URL slug,
- page type (service, blog, FAQ, location page, etc.),
- where it should live in the navigation / site hierarchy,
- the primary query and supporting queries it should target.
Here you're not just adding a blog post. You're re-architecting the site so Google and AI crawlers can instantly understand what you do, who you serve, where you operate, and why you're a good answer for specific search intents. AI does the heavy thinking; you decide what makes business sense.
One thing the model and I both strongly recommend: do not drop 50 new pages in one day.
Better: release one or two pages per week so Google can crawl, index, and evaluate each batch before the next one lands.
Ask the model:
Given this list of recommended pages and our current authority,
design a 12-16 week publishing schedule.
Constraints:
- 1-2 new or significantly revised pages per week.
- Prioritize pages that are closest to purchase intent and most aligned with our core services.
- For each week, list:
  - the page(s) to create/update,
  - the main goal of the page,
  - how we should measure success in Search Console.
Now you have a roadmap instead of a random list of ideas.
Ideas do not move metrics. Shipped pages do.
At Prism, this is where our Codeex tooling comes in. We generate initial drafts of each page (copy and structure) with the same model that designed the strategy, then use AI-assisted coding to build or update templates, wire up internal links, adjust URLs and redirects, keep the sitemap and robots directives clean, and handle technical SEO (titles, meta, schema, etc.).
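On the schema piece specifically, here's one hedged illustration of the kind of LocalBusiness-style JSON-LD you might generate for a service page. This is not Prism's exact tooling, and every value is a placeholder to swap for the client's real details:

```python
# Generate LocalBusiness-style JSON-LD for a service page (all placeholders).
import json

def local_business_schema(name, url, city, phone):
    return {
        "@context": "https://schema.org",
        "@type": "Dentist",  # schema.org subtype of LocalBusiness
        "name": name,
        "url": url,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "addressLocality": city,
            "addressRegion": "CA",
        },
    }

if __name__ == "__main__":
    schema = local_business_schema(
        "Example Dental Practice",      # placeholder
        "https://example.com/veneers",  # placeholder
        "Beverly Hills",
        "+1-310-555-0100",              # placeholder
    )
    print(f'<script type="application/ld+json">{json.dumps(schema, indent=2)}</script>')
```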
If you're doing this yourself: draft the copy and structure with the same model that built the strategy, build or update the pages in your site builder or with your dev team, wire up internal links, keep your sitemap and redirects clean, and handle the technical basics (titles, meta descriptions, schema).
Keep a human in the loop for voice, accuracy, regulations, design, and UX.
Once new pages are live, give Google some time to crawl and index them. Then head back into Search Console and filter performance by page to see how each new asset is doing. Look again at the queries: which new ones are showing up, whether the sleepers are finally earning clicks, and whether CTR and average position are moving in the right direction.
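If you'd rather diff exports than click through the UI, compare a "Pages" export from before launch with one from a few weeks after. A rough sketch, with hypothetical filenames and assumed column headers:

```python
# Compare clicks per page across two GSC "Pages" exports (before vs. after).
import csv

def clicks_by_page(path):
    # Assumed headers from GSC's standard export: "Top pages", "Clicks".
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Top pages"]: int(row["Clicks"].replace(",", ""))
                for row in csv.DictReader(f)}

before = clicks_by_page("pages_before.csv")  # hypothetical filename
after = clicks_by_page("pages_after.csv")    # hypothetical filename

# Brand-new pages get a full delta (before defaults to 0 clicks).
for page in sorted(after, key=lambda p: after[p] - before.get(p, 0), reverse=True):
    delta = after[page] - before.get(page, 0)
    print(f"{page}: {delta:+d} clicks")
```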
Then you repeat the process: fresh data in, fresh analysis, the next batch of pages out.
Every loop makes your site easier for humans to navigate, easier for Google and AI crawlers to understand, and more aligned with real demand. That's the flywheel.
Let's be blunt about where this usually goes wrong: founders glance at GSC once and go back to guessing, chase vanity metrics instead of revenue numbers, dump dozens of AI-generated pages in a single day, or build a great roadmap and never ship a single page.
Here's a minimal, realistic DIY plan you can execute without hiring anyone:
1. Set up Google Search Console and let it collect a few weeks of data.
2. Export the Queries table and run it through the analysis prompt above.
3. Feed in your sitemap and get a prioritized list of pages to improve or create.
4. Ask the model for a 12-16 week publishing schedule.
5. Ship one or two pages per week and review performance in Search Console monthly.
You will learn a ton just by doing that.
If you're a founder already juggling hiring, sales, product, and operations, you probably don't have the bandwidth to deep-dive GSC exports, architect sitemaps, hand-hold AI models, and ship technically clean pages every week.
That's literally what we do at Prism. We plug into your existing site (or rebuild it if needed), set up and clean up your analytics stack, build the AI-powered SEO flywheel on top, and run it continuously so your online presence compounds over time instead of stagnating.
If that sounds useful, start with our AI SEO services and reach out. If you're more in DIY mode, use this post and the video as your playbook.
Either way: this is the best time to build. Put your site out there, connect it to the right tools, and start letting AI help you design and execute a smarter content strategy. You're not too small to outrank large, slow incumbents; you're just one good flywheel away.