MyPhotoAI · pSEO case study

Sitemap automation

A pSEO surface lives and dies by its sitemap. Without one, the search engine has to find the leaves by crawling internal links. With one, every new slug is announced explicitly the moment a deploy lands.

Composition at build time

The Next.js app's sitemap.ts reads the slug manifest and the spine routes, then composes one XML document at build time. The composition is mechanical:

  • Spine routes get higher priority (0.8 to 1.0) because they are the editorial anchors.
  • Leaf routes get a moderate priority (0.6) because the manifest is curated, not infinite.
  • The lastModified field uses build time, not commit time, so a rebuild without content changes still freshens the file.

The sitemap is regenerated on every deploy. There is no separate cron, no manual ping, no second source of truth.
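The composition rules above can be sketched as a small pure function. This is a minimal sketch, not the site's actual sitemap.ts: the manifest shape (`spineRoutes`, `leafSlugs`) and the host are assumptions, and the entry type mirrors the fields Next.js expects from an app/sitemap.ts export.

```typescript
// Hypothetical entry shape matching the fields Next.js sitemap files return.
type SitemapEntry = {
  url: string;
  lastModified: Date;
  priority: number;
};

const BASE_URL = "https://example.com"; // placeholder host

function buildSitemap(spineRoutes: string[], leafSlugs: string[]): SitemapEntry[] {
  // Build time, not commit time: a rebuild without content changes
  // still freshens every lastModified value.
  const lastModified = new Date();

  // Spine routes are the editorial anchors: root at 1.0, the rest at 0.8.
  const spine = spineRoutes.map((route) => ({
    url: `${BASE_URL}${route}`,
    lastModified,
    priority: route === "/" ? 1.0 : 0.8,
  }));

  // Leaf routes come from the curated manifest, so a flat moderate priority.
  const leaves = leafSlugs.map((slug) => ({
    url: `${BASE_URL}/${slug}`,
    lastModified,
    priority: 0.6,
  }));

  return [...spine, ...leaves];
}
```

Because the function only reads the manifest and the clock, rebuilding on every deploy is enough to keep the file current with no second source of truth.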

IndexNow

The IndexNow protocol lets sites notify search engines of new and updated URLs without waiting for a crawl. The implementation is a single POST to https://api.indexnow.org/indexnow carrying:

  • The site host.
  • An IndexNow key registered at the project root.
  • A list of URLs (capped at 10,000 per request).

This site sends the IndexNow ping as a post-deploy hook on the VPS. The list of URLs is computed from the current sitemap; only routes that actually changed since the last deploy are included, where "changed" is approximated by recent build artifacts. False positives (re-pinging an unchanged URL) are cheap; false negatives (forgetting to ping a new URL) are expensive.
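A minimal sketch of what the post-deploy hook sends, assuming the changed-URL list has already been computed. The host, key, and helper names are placeholders; only the endpoint, the payload fields, and the 10,000-URL cap come from the protocol itself.

```typescript
// Payload fields defined by the IndexNow protocol.
type IndexNowPayload = {
  host: string;
  key: string;
  keyLocation: string;
  urlList: string[];
};

function buildIndexNowPayload(host: string, key: string, urls: string[]): IndexNowPayload {
  return {
    host,
    key,
    // The key file must be reachable at the project root.
    keyLocation: `https://${host}/${key}.txt`,
    // The protocol caps a single request at 10,000 URLs.
    urlList: urls.slice(0, 10_000),
  };
}

async function pingIndexNow(payload: IndexNowPayload): Promise<number> {
  const res = await fetch("https://api.indexnow.org/indexnow", {
    method: "POST",
    headers: { "Content-Type": "application/json; charset=utf-8" },
    body: JSON.stringify(payload),
  });
  return res.status;
}
```

Keeping payload construction separate from the network call makes the cheap-false-positive policy easy to implement: when in doubt about whether a URL changed, just include it in `urlList`.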

Time-to-index targets

These targets reflect what is achievable, not what is guaranteed. Search engines decide indexing on their own schedule.

| Stage | Target |
|---|---|
| Sitemap discoverable | minutes (immediate; static file on the CDN) |
| Google Search Console acknowledged | hours (after sitemap submission) |
| First crawl of a new slug | hours to days |
| New slug indexed | typically within the same week |

The IndexNow ping pulls the lower bounds in by a meaningful amount on the search engines that participate (Bing and Yandex out of the box; Google does not). For Google specifically, the leverage comes from sitemap freshness plus internal-link density, both of which the manifest controls.

What this layer does not do

  • It does not handle removal signals. If a slug is deleted from the manifest, the sitemap stops listing it and the search engine eventually drops it. There is no <lastmod> trick that forces a faster removal; for that, the URL would need to return an HTTP 410 (Gone) status, which is outside the scope of this layer.
  • It does not implement structured data. That lives on the per-leaf renderer as JSON-LD; see the architecture for where it fits.
  • It does not rank pages. Ranking is the search engine's job; this layer just makes sure the search engine has the data it needs to do its job.

The results page describes what the combination of sitemap automation plus IndexNow did to time-to-first-result for new slugs, in ranges.