
The Developer's SEO Checklist: 20 Things to Ship Before Launch

Most developer-built sites ship with the same SEO gaps. Missing canonical tags, no sitemap, client-rendered content that search engines never see. This is a concrete, copy-paste-ready checklist of 20 items you should complete before any launch. Each one takes under 5 minutes.

Technical SEO

1. Unique title tag on every page

Why: The title tag is one of the strongest on-page ranking signals. Duplicate or missing titles squander it.

Fix: Set a unique title under 60 characters on every public page, with the primary keyword near the front.

// app/pricing/page.tsx
import type { Metadata } from "next"

export const metadata: Metadata = {
  title: "Pricing", // rendered as "Pricing | YourProduct" via the title template in app/layout.tsx
}

2. Meta description on every page

Why: Descriptions appear in search results and influence click-through rate. Missing ones get auto-generated (usually badly).

Fix: Write a unique description under 160 characters for every public page. Write it like ad copy.
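In the App Router, the description lives on the same metadata export as the title. A sketch (the copy below is placeholder ad text, not a template you must follow):

```typescript
// app/pricing/page.tsx
import type { Metadata } from "next"

export const metadata: Metadata = {
  description:
    "Simple, usage-based pricing. Start free, upgrade when you outgrow the limits.", // ~80 chars, well under 160
}
```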

3. Canonical URL on every page

Why: Without canonicals, search engines may treat URL variants (with/without trailing slash, query params) as duplicate content.

Fix: Add alternates.canonical to every page's metadata export.

import type { Metadata } from "next"

export const metadata: Metadata = {
  // relative path, resolved against metadataBase from the root layout
  alternates: { canonical: "/pricing" },
}

4. robots.txt is live

Why: Without a robots.txt, crawlers get no guidance, and some will crawl and index your dashboard or API routes.

Fix: Create app/robots.ts that allows / and disallows /dashboard/ and /api/.

// app/robots.ts
import type { MetadataRoute } from "next"

export default function robots(): MetadataRoute.Robots {
  return {
    rules: { userAgent: "*", allow: "/", disallow: ["/dashboard/", "/api/"] },
    sitemap: `${process.env.NEXT_PUBLIC_SITE_URL}/sitemap.xml`,
  }
}

5. sitemap.xml is live and submitted

Why: Sitemaps tell search engines which pages exist. Without one, discovery relies on crawling links, which is slower.

Fix: Create app/sitemap.ts. Submit the URL in Google Search Console.

// app/sitemap.ts
import type { MetadataRoute } from "next"

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    { url: "https://yoursite.com", priority: 1.0 },
    { url: "https://yoursite.com/pricing", priority: 0.8 },
    { url: "https://yoursite.com/blog", priority: 0.7 },
  ]
}

6. HTTPS with no mixed content

Why: HTTP pages get a ranking penalty and browsers show security warnings. Mixed content (HTTP resources on HTTPS pages) triggers warnings too.

Fix: Serve everything over HTTPS. Search your HTML for http:// src or href attributes and replace them.

7. No broken internal links

Why: Broken links waste crawl budget and frustrate users, and a site full of dead internal links signals poor maintenance to search engines.

Fix: Run a crawl or scan to find all internal links returning 404. Fix or remove them.
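If you don't have a crawler handy, the core of the check is simple to sketch in TypeScript: extract same-site hrefs from each page's HTML, then fetch each path and flag non-200 responses. The extractor below is the testable half; the fetch loop is left as an exercise.

```typescript
// Minimal internal-link extractor (a sketch, not a full crawler).
// Pulls href values out of raw HTML and keeps only same-site paths;
// a real check would then fetch each path and report non-200 responses.
function internalLinks(html: string): string[] {
  const out: string[] = []
  const re = /href="([^"]+)"/g
  let m: RegExpExecArray | null
  while ((m = re.exec(html)) !== null) {
    if (m[1].startsWith("/")) out.push(m[1]) // keep root-relative links only
  }
  return out
}
```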

Content & Structure

8. One H1 per page, containing the primary keyword

Why: The H1 is the strongest heading signal. Multiple H1s dilute it. No H1 wastes it entirely.

Fix: Ensure exactly one <h1> on every page with the target keyword inside it.

9. Logical heading hierarchy (H1 > H2 > H3)

Why: Skipping levels (H1 then H4) confuses both search engines and screen readers.

Fix: Audit your headings. Use H2 for major sections, H3 for subsections. Never skip a level.

10. Descriptive alt text on all images

Why: Search engines can't reliably interpret images; alt text is how visual content gets indexed. It also powers accessibility for screen-reader users.

Fix: Add alt text to every <img> and next/image. Describe what the image shows, not the filename.

11. Open Graph tags (title, description, image)

Why: OG tags control how your pages look when shared on social media and in messaging apps. Missing ones show blank previews.

Fix: Add og:title, og:description, and og:image (1200x630px is the recommended size) to every public page.

import type { Metadata } from "next"

export const metadata: Metadata = {
  openGraph: {
    title: "Your Page Title",
    description: "One clear sentence.",
    images: [{ url: "/og-image.png", width: 1200, height: 630 }],
  },
}

12. JSON-LD structured data

Why: Structured data helps search engines understand your content type and can trigger rich results (FAQ snippets, product cards, etc).

Fix: Add SoftwareApplication schema to your landing page, BlogPosting to articles, FAQPage to FAQ sections.
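The App Router has no dedicated metadata field for JSON-LD, so the common pattern is an inline script tag rendered from the page. A sketch (the component name and FAQ copy are made up):

```typescript
// components/faq-json-ld.tsx — a sketch; question and answer text are placeholders
export function FaqJsonLd() {
  const schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: [
      {
        "@type": "Question",
        name: "Is there a free plan?",
        acceptedAnswer: { "@type": "Answer", text: "Yes, free for one project." },
      },
    ],
  }
  // JSON-LD must be emitted as a raw script body, hence dangerouslySetInnerHTML
  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(schema) }}
    />
  )
}
```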

13. URL structure is clean and readable

Why: Clean URLs rank better and get more clicks. /blog/seo-checklist is better than /blog?id=42&ref=xyz.

Fix: Use lowercase, hyphenated slugs under 100 characters. Remove query params from canonical URLs.
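A sketch of a slug helper that enforces those rules — the normalization choices here are one reasonable approach, not a standard:

```typescript
// Turn an arbitrary title into a lowercase, hyphenated slug under 100 chars.
function slugify(title: string): string {
  return title
    .toLowerCase()
    .normalize("NFKD")                // split accented chars into base + combining mark
    .replace(/[\u0300-\u036f]/g, "")  // drop the combining marks ("é" -> "e")
    .replace(/[^a-z0-9\s-]/g, "")     // drop punctuation
    .trim()
    .replace(/[\s-]+/g, "-")          // collapse whitespace/hyphen runs into one hyphen
    .slice(0, 100)
}
```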

Performance

14. Key pages are server-rendered

Why: Client-rendered pages may never be indexed. Googlebot does execute JavaScript, but rendering is queued separately and doesn't always complete.

Fix: Use SSG or SSR for every page you want indexed. Dashboard pages behind auth can stay client-rendered.

15. LCP under 2.5 seconds

Why: Largest Contentful Paint is a Core Web Vital and a direct ranking factor. Slow LCP means lower rankings.

Fix: Add priority to above-the-fold images, and use next/font so late-loading web fonts don't delay or shift the largest element.

import Image from "next/image"

<Image
  src="/hero.png"
  alt="Product screenshot"
  width={1200}
  height={630}
  priority // preloads the image for faster LCP
/>

16. No render-blocking scripts in the head

Why: Scripts in <head> without async or defer block rendering and hurt all Core Web Vitals.

Fix: Move scripts to the bottom of the body, or add async/defer. In Next.js, use next/script with strategy="afterInteractive".
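A sketch of the Next.js version — the script URL is a placeholder:

```typescript
// app/layout.tsx (fragment) — loads a third-party script after hydration
// instead of blocking first paint; swap in your real script URL.
import Script from "next/script"

export function AnalyticsScript() {
  return <Script src="https://example.com/analytics.js" strategy="afterInteractive" />
}
```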

17. Images use modern formats (WebP/AVIF)

Why: PNG and JPEG are 2-5x larger than WebP/AVIF at the same quality. Smaller images mean faster pages.

Fix: Use next/image, which serves WebP/AVIF automatically. For static assets, convert manually with squoosh.app or sharp.

18. Viewport meta tag is present

Why: Without the viewport tag, mobile browsers render pages at desktop width. Google's mobile-first index will penalize this.

Fix: Next.js adds this automatically. If you are using a custom layout, add it to your <head>.

<meta name="viewport" content="width=device-width, initial-scale=1" />

AI Visibility

19. llms.txt file at your site root

Why: llms.txt is an emerging convention that lets AI search tools (ChatGPT, Perplexity) understand your site without crawling every page.

Fix: Create a public/llms.txt with your product name, description, pricing, and key links.
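A minimal public/llms.txt might look like this (every detail below is a placeholder):

```
# YourProduct
> One-sentence description of what the product does and who it is for.

## Pricing
- Free: 1 project
- Pro: $19/month

## Key links
- Docs: https://yoursite.com/docs
- Pricing: https://yoursite.com/pricing
```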

20. robots.txt allows AI retrieval bots

Why: Many default robots.txt configs block all bots including AI search. Your content becomes invisible to ChatGPT and Perplexity.

Fix: Allow ChatGPT-User and PerplexityBot in robots.txt. Block training bots (GPTBot, CCBot) separately if desired.
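Extending the app/robots.ts from item 4, a sketch that lets retrieval bots in while keeping training crawlers out — whether you block training bots is a policy choice, not an SEO requirement:

```typescript
// app/robots.ts
import type { MetadataRoute } from "next"

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      // AI retrieval bots: allow, so answers can cite your pages
      { userAgent: ["ChatGPT-User", "PerplexityBot"], allow: "/" },
      // Training crawlers: block if you don't want content used for training
      { userAgent: ["GPTBot", "CCBot"], disallow: "/" },
      // Everyone else: the normal rules
      { userAgent: "*", allow: "/", disallow: ["/dashboard/", "/api/"] },
    ],
  }
}
```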

FAQ

How many of these items should I complete before launching?

All 20. These are not aspirational goals. They are baseline requirements that take 1-2 hours total for a typical site. Skipping any of them means leaving traffic on the table from day one.

Do I need an SEO tool to check these items?

You can check most of them manually using browser DevTools and View Source. However, automated tools like SEOLint catch issues you would miss, especially across multiple pages. A single scan takes 30 seconds and checks all 20 items plus more.

Does this checklist apply to single-page apps (SPAs)?

Yes, but SPAs have additional challenges. If your app renders entirely on the client side, search engines may not see your content. Use server-side rendering or static generation for any page you want indexed. Item 14 in this checklist addresses this directly.

Check all 20 items in one scan

SEOLint checks every item on this list automatically. Paste a URL, get a full report with fix instructions in 30 seconds.