Technical SEO Checklist for Developers: What to Verify Before You Hit Launch
Picture this. You’ve just finished a three-month build. The site looks sharp on every device, the code is clean, and the client is happy. You push to production, announce the launch on LinkedIn, and sit back to watch the traffic roll in.
Nothing comes.
A week passes. Two weeks. The pages sit at the bottom of Google. Or worse, they don’t appear at all. You open Google Search Console and discover the problem buried in the coverage report: a single robots.txt directive, pushed from staging to production, has been telling Googlebot to disallow the entire site. Every page from day one.
This isn’t a hypothetical. It happens constantly, and it’s one of dozens of technical mistakes that can silently kill a site’s organic visibility before it ever gets a chance to rank. The frustrating part? Most of these issues take minutes to fix during development and weeks (sometimes months) to recover from after launch.
This checklist covers everything a developer needs to verify before shipping a site live, from crawlability fundamentals to Core Web Vitals to structured data. Think of it as a pre-flight check for search engines. Boring? Maybe. Necessary? Absolutely.
Quick Reference: Technical SEO Pre-Launch Checklist
Use the table below as your at-a-glance summary. Each area is expanded in detail in the sections that follow.
| Area | What to Check | Key Tool |
| --- | --- | --- |
| Crawlability | robots.txt, sitemap, site architecture | Screaming Frog, GSC |
| Indexability | Canonicals, noindex tags, HTTP/HTTPS | GSC Pages Report |
| Core Web Vitals | LCP, INP, CLS on key templates | PageSpeed Insights |
| Security | HTTPS, mixed content, 404 page | SSL checker, browser dev tools |
| Structured Data | Schema markup, validation | Google’s Rich Results Test |
| Mobile & Rendering | Content parity, JS rendering | GSC URL Inspection Tool |
1. Crawlability: Can Google Find Your Pages?
Before Google can rank anything, it has to find it. Crawlability is about making sure search engine bots can actually access the pages you want in the index.
robots.txt
Your robots.txt file lives at yoursite.com/robots.txt and tells crawlers which parts of your site to access or avoid. The most dangerous mistake developers make here is pushing staging configuration to production.
Many staging environments use a blanket Disallow: / directive to prevent indexing during development. If that file gets deployed to live, Googlebot is locked out entirely.
Before launch, open the file, read it line by line, and confirm no critical directories (especially your main content, blog, or product pages) are being blocked.
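A quick automated check can back up the manual read-through. The sketch below (a simplified parser, not a full implementation of robots.txt precedence rules) flags the blanket-block case described above, where a `Disallow: /` rule applies to all crawlers or to Googlebot specifically:

```python
def blocks_everything(robots_txt: str) -> bool:
    """Flag a blanket 'Disallow: /' that applies to all agents or Googlebot.

    A minimal sketch: it handles agent groups and comments, but not the
    full robots.txt matching rules (wildcards, Allow precedence, etc.).
    """
    agents, in_rules = [], False
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if in_rules:  # a rule line ended the previous group
                agents, in_rules = [], False
            agents.append(value.lower())
        else:
            in_rules = True
            if field == "disallow" and value == "/" and (
                "*" in agents or "googlebot" in agents
            ):
                return True
    return False
```

Run it against the file you are about to deploy; a `True` result means the staging block made it into the production config.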
XML Sitemap
Your sitemap acts as a guide, showing search engines which pages exist and when they’ve been updated. Every URL in your sitemap should:
- Return a 200 status code
- Be the canonical version of the page
- Be something you actually want indexed
Submit it via Google Search Console on launch day.
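Two of the three sitemap rules can be checked offline. The sketch below pulls every `<loc>` out of a standard urlset sitemap and flags any URL that isn’t HTTPS; verifying the 200 status codes still requires live requests against each URL, which is omitted here:

```python
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace (sitemaps.org)
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> URL from a urlset sitemap, in document order."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

def non_https(urls: list[str]) -> list[str]:
    """Return URLs that are not served over HTTPS (likely non-canonical)."""
    return [u for u in urls if not u.startswith("https://")]
```

Feed it the sitemap you plan to submit; any URL it flags should be fixed before launch day.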
Site Architecture
Keep your hierarchy flat. Every important page should be reachable within three clicks of the homepage. Pages buried deeper than that receive less crawl priority and are often missed entirely.
Run a crawl tool like Screaming Frog to identify orphan pages (pages with zero internal links) before they go live.
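If you already have crawl data, the orphan check itself is a small graph problem. A minimal sketch, assuming you can export each page’s internal links as a mapping (Screaming Frog’s crawl export can be massaged into this shape):

```python
def find_orphans(pages: dict[str, list[str]], home: str) -> set[str]:
    """Return pages no other page links to.

    `pages` maps each URL to the internal links it contains. The homepage
    is exempt because it is the crawl entry point. Self-links don't count.
    """
    linked = {
        target
        for source, targets in pages.items()
        for target in targets
        if target != source
    }
    return set(pages) - linked - {home}
```

Any URL in the result set exists on the server but is unreachable by following links, so crawlers will likely never find it.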
2. Indexability: Should Google Index Your Pages?
Crawlability gets Google to a page. Indexability determines whether that page makes it into search results. A page can be crawled but never indexed, and you won’t know unless you check.
Canonical Tags
Every page should have a self-referencing canonical tag unless you intentionally want to consolidate its authority elsewhere.
Common pitfalls to audit:
- HTTP vs. HTTPS mismatches
- www vs. non-www conflicts
- Canonicals pointing to noindexed pages (a logical contradiction that forces Google to guess your intent)
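The first two pitfalls are mechanical enough to script. A small sketch that compares a page URL against its declared canonical and reports scheme and www mismatches (it deliberately ignores path differences, which can be intentional consolidation):

```python
from urllib.parse import urlsplit

def canonical_issues(page_url: str, canonical: str) -> list[str]:
    """Flag scheme and www/non-www mismatches between a page and its canonical."""
    p, c = urlsplit(page_url), urlsplit(canonical)
    issues = []
    if p.scheme != c.scheme:
        issues.append("scheme mismatch (HTTP vs. HTTPS)")
    # Same host apart from the www. prefix? (removeprefix needs Python 3.9+)
    if (p.netloc.removeprefix("www.") == c.netloc.removeprefix("www.")
            and p.netloc != c.netloc):
        issues.append("www vs. non-www mismatch")
    return issues
```

Run it over every (URL, canonical) pair from your crawl; a clean site returns an empty list for each.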
Noindex Audit
Like robots.txt directives, noindex tags frequently travel from staging into production. Before launch, crawl every template and confirm noindex directives only exist on pages you intentionally want excluded, such as admin pages, thank-you pages, and internal search results. Not your homepage.
URL Parameters
If your framework generates dynamic URLs with parameters, you risk creating thousands of near-identical pages that compete with each other in Google’s index. Common offenders:
- Filtering (e.g., by product color or category), creating a unique URL per combination
- Sorting (e.g., by price or newest), duplicating the same listing in a different order
- Session IDs, appending a unique string per user to every URL
Use canonical tags or URL parameter handling in Google Search Console to bring these under control.
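One way to decide what the canonical URL should be is to strip the parameters that only reorder or track, keeping the ones that change content. The parameter names below are hypothetical examples; substitute the ones your framework actually emits:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical list of parameters that never change page content.
IGNORED_PARAMS = {"sort", "order", "sessionid", "utm_source", "utm_medium"}

def canonicalize(url: str) -> str:
    """Drop non-content parameters and sort the rest for a stable canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in IGNORED_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(sorted(kept))))
```

The output is what belongs in the page’s `rel="canonical"` tag, so every sort order and session variant consolidates to one indexable URL.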
Common launch day mistake
Leaving HTTP and HTTPS both accessible as separate versions of your site. In a Semrush study of 50,000 domains, 27% of websites had both versions accessible simultaneously, splitting crawl equity and confusing Google about which to index.
3. Core Web Vitals: Will Google Like the Experience?
Did you know? Only 48% of mobile websites pass all Core Web Vitals thresholds. – HTTP Archive
Google has three specific performance metrics it uses to measure page experience, and they are a confirmed ranking factor. Understanding all three before launch, not after, gives you the best chance of starting with a clean slate.
The 3 Metrics at a Glance
| Metric | What It Measures | Good Target | Needs Work | Poor |
| --- | --- | --- | --- | --- |
| LCP | Largest element render time (loading) | ≤ 2.5s | 2.5s-4.0s | > 4.0s |
| INP | Responsiveness to clicks/taps (interactivity) | ≤ 200 ms | 200-500 ms | > 500 ms |
| CLS | Unexpected layout shifts (visual stability) | ≤ 0.1 | 0.1-0.25 | > 0.25 |
LCP (Largest Contentful Paint)
Target: 2.5 seconds or faster
LCP measures the render time of the biggest content block a user can see on screen, typically a banner image or primary heading.
Watch out for these common developer-side culprits:
- Uncompressed images
- Render-blocking scripts
- Slow server response times
Use next-gen formats (WebP/AVIF), compress aggressively, and defer non-critical JavaScript.
INP (Interaction to Next Paint)
Target: 200 ms or faster
INP replaced First Input Delay (FID) as a Core Web Vital in March 2024. It measures how quickly your site responds to user interactions, including clicks, taps, or keypresses, across the entire session, not just the first one.
90% of a user’s time on a page happens after it loads, which makes responsiveness a critical part of performance. Heavy JavaScript frameworks and third-party scripts are common causes of slow interactions.
Audit what you’re loading and eliminate anything nonessential.
CLS (Cumulative Layout Shift)
Target: 0.1 or lower
CLS measures visual stability: how much elements unexpectedly shift as the page loads. The fixes are straightforward but easy to forget:
- Set explicit width and height attributes on all images and videos
- Reserve space for ads and embeds before they load
- Use font-display: swap to avoid invisible text, and ensure fallback fonts are similar to reduce layout shifts
Pro tip
Run PageSpeed Insights on your top three page templates before launch, not just the homepage. Hero sections, product pages, and blog templates often behave very differently. Catching a 6-second LCP on a product template during development is far easier than fixing it post-launch.
4. Security: The Non-Negotiables
Google treats HTTPS as a baseline trust signal. Insecure experiences can trigger browser warnings, and sites that skip HTTPS miss out on an established ranking signal.
These checks are fast and shouldn’t require much effort in a modern build, but they’re easy to overlook.
- HTTPS across every page, not just the homepage: Verify that your SSL certificate covers all subdomains and isn’t set to expire within the next three months.
- Mixed content: This is a common oversight when migrating HTTP assets to an HTTPS domain. Even a single insecure image or script can trigger browser warnings and remove the secure padlock.
- Custom 404 page: Beyond user experience, a proper 404 page helps prevent search engines from repeatedly crawling dead-end URLs without clear guidance about where to go next.
- 301 redirects: Redirect chains (where one URL redirects to another, which redirects to another) can dilute link equity. Consolidate them into a single hop wherever possible.
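Collapsing redirect chains is straightforward once you have your redirect map. A sketch, assuming the map is a dict of old URL to next hop (the shape most redirect configs reduce to):

```python
def collapse_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Map every source URL straight to its final destination.

    Follows each chain to the end so every redirect becomes a single
    301 hop. A `seen` set guards against accidental redirect loops.
    """
    def final(url: str) -> str:
        seen = set()
        while url in redirects and url not in seen:
            seen.add(url)
            url = redirects[url]
        return url

    return {src: final(dst) for src, dst in redirects.items()}
```

For example, a chain of `/a -> /b -> /c` flattens so both `/a` and `/b` point directly at `/c`, and link equity passes through one hop instead of two.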
5. Structured Data: Help Google Understand What You Built
Schema markup is machine-readable metadata that tells Google the specific meaning of your content. Without it, Google reads your page as plain text and makes its best guess.
With it, you unlock rich results, such as star ratings, FAQ dropdowns, product prices, and breadcrumbs, directly in search results. These expand your SERP real estate and improve click-through rates.
Start with the schema types most relevant to the site type:
| Site Type | Recommended Schema | Benefit |
| --- | --- | --- |
| Blog / News | Article, BlogPosting | Article rich results in Google |
| eCommerce | Product, AggregateRating | Price, stock, stars in SERPs |
| Local Business | LocalBusiness, Organization | Knowledge Panel, Maps visibility |
| Any site | FAQ, HowTo, Breadcrumb | Expanded SERP real estate |
Validate your structured data before launch using Google’s Rich Results Test. Schema errors won’t directly impact rankings, but they can prevent your pages from qualifying for rich results. In competitive SERPs, missing those enhancements can mean losing valuable visibility.
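If you generate pages server-side, the JSON-LD block is easy to template. A minimal sketch for the Article type using only a small subset of its fields (schema.org defines many more); the function name and field choices here are illustrative, not a standard API:

```python
import json

def article_jsonld(headline: str, author: str, date_published: str) -> str:
    """Render a minimal Article JSON-LD <script> block for the page <head>."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,  # ISO 8601 date
    }
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"
```

Whatever you generate, paste the rendered output into Google’s Rich Results Test before launch rather than trusting the template by eye.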
6. Mobile and JavaScript Rendering
Google has used mobile-first indexing as its default since July 2024. That means the mobile version of your site is what gets crawled, evaluated, and ranked, not desktop.
Two checks are critical here:
- Content parity: Every piece of content on your desktop version, such as body copy, headings, structured data, and internal links, must also be present and accessible on mobile. If key content is missing, it won’t be indexed or used for ranking.
- JavaScript rendering: If your site uses a JS-heavy framework such as React, Angular, or Vue, Googlebot may see an incomplete or empty page instead of your actual content on first fetch. Use the URL Inspection Tool in Google Search Console and click “View Tested Page” to confirm that the rendered output matches what users see in the browser.
JS-heavy sites: a known risk
JavaScript-dependent content is rendered after the initial crawl, which can delay indexing. When critical elements like headings or product details rely on client-side rendering, Google may not see them immediately, or at all if rendering is deferred or fails. Ensuring key content is present in the initial HTML response helps avoid these gaps.
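A crude but useful smoke test for this: fetch the raw HTML (before any JavaScript runs) and check that your key content strings are actually in it. This sketch only does naive substring matching, so it is a first-pass check, not a substitute for the URL Inspection Tool:

```python
def missing_from_initial_html(html: str, required: list[str]) -> list[str]:
    """Return the required content snippets absent from the raw HTML response.

    If a heading or product name only appears after client-side rendering,
    it won't be in this string, and search engines may not see it promptly.
    """
    return [snippet for snippet in required if snippet not in html]
```

An empty JS-framework shell (just a mount point like `<div id="root">`) will fail this check for every snippet, which is exactly the signal you want before launch.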
Bonus: Post-Launch Monitoring Setup
The checklist above is for before launch. But technical SEO isn’t a one-time event. Sites evolve, code gets updated, and new issues surface over time. Set these up on day one:
- Connect to Google Search Console before or at launch and verify ownership so you can capture early crawl and indexing data.
- Submit your XML sitemap on launch day under “Sitemaps” to help Google discover your URLs faster.
- Check the Pages report within the first few days as data begins to populate. Watch for unexpected exclusions like “Crawled, currently not indexed” on important pages.
- Use PageSpeed Insights to test key templates, and monitor Core Web Vitals in Search Console as field data from real users starts to populate over the following weeks.
If managing ongoing technical health feels outside your scope, especially for larger or more complex builds, working with a trusted SEO partner helps keep your technical foundation solid as your site scales and evolves. Technical debt accumulates quietly, and regular audits are what catch it early.
Technical SEO: Final Reminders
Technical SEO doesn’t win rankings on its own. Great content, strong links, and genuine authority all play a role. But none of that matters if search engines can’t crawl, index, or render your site correctly. The mistakes on this checklist are invisible until they become expensive.
Run through this list before every launch. It’s a small step that prevents costly fixes later.