A programmatic SEO course in one post

@iannuttall // April 2, 2025 // 14.9K views

My programmatic SEO portfolio gets 80+ million impressions and 1+ million clicks a month. On auto-pilot.

So... here is everything I know about programmatic SEO (for the last time).

The basics

First off - programmatic SEO is still mostly SEO. 90% of SEO is just doing the basics really well.

Here's my SEO checklist:

  • Simple, focused title tag and description
  • User-friendly H1 (while still getting the keyword/topic in)
  • Simple H2/H3 structure
  • Good internal links (even this I often skip at launch and add later!)
  • No SEO slop (notice how I didn't start this post with "what is pSEO? programmatic SEO benefits/examples" etc)
  • XML sitemaps for all pages (automatic with WP plugins, easy for devs to build)
  • Build or buy backlinks (my preference is acquiring sites that already have links)

That's it. Were you expecting more?

Sometimes I do images. Sometimes I add charts and tables. But not often.

What really matters for long-term SEO success is taking the time and effort to create something that's genuinely useful.

My directory of MCP servers is getting almost 1k visitors per day at the time of writing this. I published it less than 2 weeks ago.

On the surface it's just a blog post explaining how to install and set up each server. But the real skill was in finding the data, automating its collection at scale, and feeding it to AI to create content that is actually helpful.

Also, a pro tip for pSEO: it's very powerful in new niches because it helps you get in early and establish authority. MCP keywords and search volume don't even register yet - but they will soon.

If you've never heard the term programmatic SEO before then read this post before you continue.

Programmatic SEO in a world of LLMs

"But Manus can search and build directory sites in minutes"

"ChatGPT can write me 100,000 posts on every single topic"

True. But Manus still needs a data source to do what it does. And AI slop with no context doesn't rank. And on the rare chance it does, it never lasts.

AI needs context. Somebody still needs to scrape and collect it for them to build directory sites in seconds.

I am more bullish on programmatic SEO in the AI era than I ever was before.

Once upon a time I would spend days, weeks even, building up my template based on the data.

Every page was the same, with the variables replaced. Like Mad Libs for content.

That still works. Most of my sites use that exact same format and it's the core of this course that I teach.

But AI made it easier and faster to build templates that are incredibly customised, well written, and unique from page to page.

This is not your typical AI spammer content where they give it a keyword and let it hallucinate happily.

With my MCP example, I scraped and collected data from a bunch of sources, including the Github API, to find as much data and information as I could.

I then fed that into Claude 3.7 Sonnet, along with examples of the format and writing style I wanted and instructions on how each guide should cover using the MCP server with the Cursor IDE. I also collected data on the number of downloads and stars for each server.

The result is all-killer, no-filler guides on how to set up each server. It's original, unique content.

Next, I can use AI to tag all of these servers and start showing related or similar servers as internal links.
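To make that pipeline concrete, here's a rough sketch of the generation step. The prompt wording, the model ID, and the data fields are illustrative placeholders, not the exact setup - the point is that the scraped context does the heavy lifting, not the model.

```python
import json
import urllib.request

def build_prompt(server: dict, example_guide: str) -> str:
    """Combine scraped server data with a style example into one prompt."""
    return (
        "Write an installation guide for this MCP server, including how "
        "to use it with the Cursor IDE. Match the style of the example.\n\n"
        f"Scraped data:\n{json.dumps(server, indent=2)}\n\n"
        f"Example guide:\n{example_guide}"
    )

def generate_guide(prompt: str, api_key: str) -> str:
    """Send the prompt to the Anthropic Messages API."""
    req = urllib.request.Request(
        "https://api.anthropic.com/v1/messages",
        data=json.dumps({
            "model": "claude-3-7-sonnet-20250219",
            "max_tokens": 2048,
            "messages": [{"role": "user", "content": prompt}],
        }).encode(),
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["content"][0]["text"]

# Placeholder record - in practice this comes from the scraper/GitHub API
prompt = build_prompt(
    {"name": "github-mcp", "stars": 1234, "install": "npx github-mcp"},
    "## Example guide\nInstall with npm...",
)
```

Everything unique about the page lives in the scraped data and the style example; the model just assembles it.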

The core pSEO principles

Most of my course content is from 2022 and the examples focus on blog post content (mostly).

The SEO landscape has changed. If I were ChatGPT, I might even say it's "an ever-changing landscape".

SEO blog content does not work any more UNLESS you already have a very well established site with a lot of authority.

Luckily, I have a few of those.

But the ones without authority? Crushed. Even though I would say the content on the smaller sites is generally better and more useful. Google, eh?

The principles that the course teaches don't change though.

Keyword research

I don't spend much time on this. Keywords are pretty easy.

  • seo agencies in {city}
  • {product1} vs {product2}
  • {competitor} alternatives
  • average salary for {profession}
  • cheap flights from {from} to {to}
  • ai {topic} photos
  • You get the idea...
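Those patterns boil down to a template plus a list of values. Here's a minimal sketch (the templates and values are placeholders) of expanding keyword patterns into every combination:

```python
from itertools import product

# Placeholder templates and values - real ones come from your data
templates = {
    "seo agencies in {city}": {"city": ["austin", "denver", "london"]},
    "{product1} vs {product2}": {
        "product1": ["notion", "obsidian"],
        "product2": ["evernote"],
    },
}

def expand(template: str, values: dict) -> list:
    """Expand one keyword template into every variable combination."""
    names = list(values)
    return [
        template.format(**dict(zip(names, combo)))
        for combo in product(*(values[n] for n in names))
    ]

keywords = [kw for tpl, vals in templates.items() for kw in expand(tpl, vals)]
```

Five templates and a decent dataset gets you thousands of page targets - which is why the data, not the keywords, is the hard part.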

Data collection

You either need to find a data source or build your own (either directly or as a by-product of your app).

Building a scraper with AI is so simple now. For my MCP directory I literally gave it URLs of a few sources and the Cursor agent fetched the page HTML, found the right elements to target, and built the scraper around it.

After some back and forth, I had something robust that could roll out and find all of the URLs needed to get the data.
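If you want to see the shape of what the agent built, here's a stripped-down sketch using only the Python standard library - the markup and the `/servers/` URL pattern are made up for illustration:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect hrefs that match a listing-page URL pattern."""

    def __init__(self, prefix: str):
        super().__init__()
        self.prefix = prefix
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        if href.startswith(self.prefix) and href not in self.links:
            self.links.append(href)

# Placeholder markup - a real run fetches this from the source site
html = """
<ul>
  <li><a href="/servers/github-mcp">GitHub MCP</a></li>
  <li><a href="/servers/slack-mcp">Slack MCP</a></li>
  <li><a href="/about">About</a></li>
</ul>
"""
collector = LinkCollector("/servers/")
collector.feed(html)
```

The agent's real version used richer selectors, but the pattern is the same: find the URL structure once, then crawl every listing that matches it.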

My personal preference is to scrape and collate your data from different sources. I rarely use ready-made datasets for new projects.

You can also have user-generated content and crowdsource your data. A great example of this is Cursor Directory and their Cursor rules that were all submitted by users.

What sites would I build?

I've mentioned already that traditional blog posts are out.

Nobody cares if your {cat|dog|rabbit|wombat} can eat blueberries!

That's not to say that blog-like content can't do well. Informational content has to be specific, detailed, and up to date.

Basically it needs to be something that AI would hallucinate if it tried to one-shot write the post.

I know from personal experience, for example, that it will just make up commands to install packages for MCP servers.

You can probably figure out from the keyword examples above that the types of sites that work very well with pSEO are:

  • Product landing pages (comparisons, alternatives, etc)
  • Tools (e.g. flight times and prices)
  • Directory listings (like MCP servers, heh)

Directories are one of the most interesting options and I think a lot of people don't leverage them well enough.

Often you have the chance to rank very well for product brand names. Some of these are up-and-coming names that don't get searched yet, but will once they scale.

So you can have a landing page like mydirectory.test/{brand} which has all of the listing details, but if you build your app and data structure well, you could easily have a couple more URLs that target specific keywords users might type:

  • mydirectory.test/{brand}/alternatives
  • mydirectory.test/{brand}/reviews

Product Hunt have over 16,000 pages in Google with the /reviews URL and according to Ahrefs they get 52,000 visits a month from it.

I bet that figure is much higher because a lot of these products are niche and may not have accurate search volume data.

Number of pages (and indexing them)

Despite what John Mueller might tell you, programmatic SEO != spam. If it does, you're doing it wrong.

Some of my early experiments had literally BILLIONS of pages. But the content was useful to a lot of people.

But like anything at massive scale, you get diminishing returns. The number of indexed pages often drops significantly at this scale.

I don't recommend building sites with 10s or 100s of thousands of pages any more.

It can work, but getting them indexed (and keeping them indexed) is a big challenge. I think the true power of programmatic SEO is helping you publish and index a smaller number of pages that will actually convert - whatever that means for you.

Programmatic SEO is at least 10x faster than traditional content marketing.

My general rules for scaling and indexing:

  • Less than 10,000 pages unless you really need them
  • Create XML sitemaps and submit them to Google Search Console
  • Add the sitemaps to your robots.txt file
  • For very large sites, limit to 10k URLs each (yes, even though the limit is 50k - limiting this does help Google to crawl them better)
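That last rule is easy to automate. Here's a sketch that chunks a URL list into 10k-URL sitemap files plus a sitemap index (the domain and filenames are placeholders):

```python
from xml.sax.saxutils import escape

CHUNK = 10_000  # deliberately under the 50k limit in the sitemap spec

def build_sitemaps(urls, base_url):
    """Return {filename: xml} for each 10k-URL chunk plus an index."""
    files = {}
    for i in range(0, len(urls), CHUNK):
        name = f"sitemap-{i // CHUNK + 1}.xml"
        entries = "".join(
            f"<url><loc>{escape(u)}</loc></url>" for u in urls[i:i + CHUNK]
        )
        files[name] = (
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{entries}</urlset>"
        )
    index = "".join(
        f"<sitemap><loc>{base_url}/{name}</loc></sitemap>" for name in files
    )
    files["sitemap-index.xml"] = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        f"{index}</sitemapindex>"
    )
    return files

# Placeholder domain and URLs
files = build_sitemaps(
    [f"https://example.test/page-{n}" for n in range(25_000)],
    "https://example.test",
)
```

Submit the index file to Search Console and the individual sitemaps come along for free.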

Common problems to solve

Most of these are general problems you might have with any SEO - programmatic SEO just makes you hit the scale issues much quicker.

Slow crawl speeds

Google indexing and crawling is a huge topic, but the gist is that the more pages you have, the slower Google will crawl and re-crawl them.

I get asked a lot about whether you should publish all at once or drip-publish your content. Here's my advice for launching pSEO sites or sections of your site:

  • Publish all of the content you have as soon as possible. This is anecdotal but in my experience I see a huge percentage of pages indexed very quickly when I publish them all at the same time.
  • Add lots of internal links. This can be as simple as breadcrumbs and a list of random or related posts in the sidebar.
  • Every page on your site should be findable within 3 clicks. This forces you to have a quite flat structure and to think about when and where you're linking into pages to help crawlers find them.
  • Submit your XML sitemap to Google Search Console. Is this really a tip? Seems like it's just common sense...
  • Use an indexing tool like URL Monitor. Fun fact: I built URL Monitor by coding back and forth with the Claude web app, scaled it to $100k ARR, and sold it for $250k after 4 months. It still works for indexing pages.
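The internal linking tip is simple to implement. Here's a sketch of picking related pages by tag overlap, with a seeded random fallback so the sidebar links stay stable between builds (the pages and tags are placeholders):

```python
import random

# Placeholder pages and tags - real ones come from your dataset
pages = {
    "github-mcp": {"git", "devtools"},
    "gitlab-mcp": {"git", "ci"},
    "slack-mcp": {"chat"},
}

def related(slug, n=2, seed=0):
    """Rank other pages by tag overlap, with a seeded random fallback."""
    rng = random.Random(seed)  # seeded so sidebars are stable per build
    others = [s for s in pages if s != slug]
    rng.shuffle(others)  # random order breaks ties and fills the fallback
    # stable sort keeps the shuffled order among equal-overlap pages
    others.sort(key=lambda s: len(pages[s] & pages[slug]), reverse=True)
    return others[:n]
```

Even this crude version guarantees every page gets linked from somewhere, which is most of the battle for crawlability.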

Duplicate content issues (and thin content)

Another very frequent question is whether you get penalised for duplicate content with programmatic SEO.

Short answer: no.

As long as your pages target different keywords in the title, H1, and throughout the page, it doesn't matter if your template is 80% the same across pages.

This really isn't an issue at all with AI because you can just create unique content for each page now. For my MCP directory of 3,000+ servers, I think it cost me $40 using Claude 3.7 Sonnet.

The same applies for thin content. There's no excuse for it now.

Garbage in results in garbage out, so it's a prerequisite of pSEO to put in as much effort as you can to scrape and organise the data and content examples you use as context for AI to write the pages.

Keyword cannibalisation

If you have a bunch of pages on your site that target the same keyword, Google can't figure out which one to rank, so neither ends up ranking.

Let's say you had a page targeting programmatic seo at yoursite.test/programmatic-seo and on your blog you also added a tag for all posts about that topic yoursite.test/blog/tag/programmatic-seo.

This could be an issue since both are focused on the same keyword.

In that situation it's better to have a non-public way of tagging your articles.

To avoid keyword cannibalisation:

  • Pay attention to your keywords and any public URLs created by tags, categories, etc
  • This includes synonyms - pick one only (for example tools for seo and seo tools should be the same page)
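If you do want to keep public tag pages, a common alternative to hiding them completely is to noindex them, so they can still pass internal links without competing in search. A sketch (the URL prefixes are placeholders):

```python
def robots_meta(path):
    """Pick a robots meta tag for a URL path (prefixes are placeholders)."""
    internal_prefixes = ("/blog/tag/", "/blog/category/")
    if path.startswith(internal_prefixes):
        # still crawlable for internal linking, but kept out of the index
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

That way the tag page can't cannibalise the main landing page for the same keyword, but crawlers can still follow its links.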

The end

This is a complete summary of what I teach in the course and how I would apply programmatic SEO now.

If you have any questions reply to me on X or buy the course so you can contact me directly ;)