Technical SEO Checklist for SaaS: 100+ Points to Audit in 2026
The definitive technical SEO checklist for SaaS companies — covering crawlability, indexability, Core Web Vitals, schema, security, AI optimisation, and mobile.
Technical SEO is the foundation that every other SEO effort rests on. Brilliant content that cannot be crawled, indexed, or loaded quickly does not rank — on Google or in AI-generated answers. For SaaS companies, where the website is simultaneously a marketing asset and a product entry point, technical SEO failures translate directly into lost trials, lost rankings, and lost revenue.
This checklist covers every technical SEO element that matters in 2026, organised by category, with an explanation of why each item matters and how to fix common failures. Use it as your complete technical audit framework — run through it quarterly. For the full AI SEO strategy, see The Complete Guide to AI-Powered SEO in 2026.
Why Technical SEO Is the Foundation of Everything
Technical SEO creates the conditions under which all other SEO investment works. A page that loads in 4 seconds will not hold position 1 regardless of its content quality. A page with a misconfigured canonical tag may not appear in Google's index at all. A site with AI crawlers blocked in robots.txt is completely absent from ChatGPT and Perplexity — regardless of domain authority.
Technical issues are particularly insidious because they are invisible to users and often invisible in analytics. A page can rank temporarily with technical problems, masking issues until they compound into a significant ranking drop. Regular technical audits prevent this.
OmniRank's automated technical audit checks all of the following signals across your entire site and delivers a prioritised fix list — run a free audit today.
Crawlability Checklist
Crawlability ensures search engines and AI crawlers can discover and access all your important pages.
- `robots.txt` is present at `/robots.txt` and returns a 200 status code
- `robots.txt` does not block Googlebot, Bingbot, or other major search crawlers
- AI crawlers explicitly allowed: GPTBot, Claude-Web, PerplexityBot, Google-Extended, anthropic-ai
- Sitemap URL declared at the bottom of `robots.txt`
- XML sitemap exists at `/sitemap.xml` and returns a 200 status code
- Sitemap submitted to Google Search Console
- Sitemap submitted to Bing Webmaster Tools
- Sitemap contains only indexable pages (no 404s, no noindex pages, no redirect sources)
- Sitemap `lastmod` dates are accurate and updated when content changes
- No redirect chains longer than 2 hops on internal links
- No broken internal links (returning 404 or 5xx)
- All important pages reachable within 3 clicks from the homepage
- Pagination handled correctly (each paginated page carries a self-referencing canonical, not a canonical pointing to page 1)
- No infinite scroll without a static pagination fallback
- No JavaScript-only navigation that blocks crawlers
- `/llms.txt` file present at domain root listing key pages
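As an illustration, a minimal `robots.txt` covering the crawler and sitemap items above might look like the following. The domain is a placeholder, and crawler user-agent tokens change over time, so verify each vendor's currently documented token before deploying:

```
# Major search crawlers (allowed by default, stated here for clarity)
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

# AI crawlers explicitly allowed
User-agent: GPTBot
Allow: /

User-agent: Claude-Web
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: anthropic-ai
Allow: /

# Sitemap declaration
Sitemap: https://www.example.com/sitemap.xml
```

Note that `robots.txt` is allow-by-default: a bot is only excluded if a `Disallow` rule matches it, so the explicit `Allow` lines mainly guard against a blanket `User-agent: *` block elsewhere in the file.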
Indexability Checklist
Indexability determines whether crawled pages are actually added to search indexes.
- No conflicting index signals (noindex in meta tag + page included in sitemap)
- Canonical tags present on all pages pointing to the correct URL
- No duplicate content between www and non-www versions (301 redirect in place)
- HTTP to HTTPS redirect in place (301, not 302)
- No accidental noindex on high-value pages (check after every major deployment)
- Thin pages (under 300 words) either expanded, consolidated, or noindexed
- Parameter-based duplicate URLs handled via canonical or robots.txt exclusion
- Faceted navigation URLs handled (canonicalised to section page or excluded)
- No meta refresh redirects on important pages
- Hreflang tags implemented correctly if serving multiple languages
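Several of the indexability items reduce to one rule: every URL in the sitemap should be a final, indexable destination. A minimal sketch of that check, assuming you have already extracted the sitemap URLs, the set of pages whose meta robots tag contains `noindex`, and a map of redirecting URLs (the function name and inputs are illustrative, not a real library API):

```python
def find_sitemap_conflicts(sitemap_urls, noindex_urls, redirects):
    """Flag sitemap entries that send conflicting index signals.

    sitemap_urls: URLs listed in the XML sitemap
    noindex_urls: URLs whose meta robots tag contains "noindex"
    redirects:    mapping of source URL -> redirect target
    """
    noindex = set(noindex_urls)
    conflicts = {}
    for url in sitemap_urls:
        if url in noindex:
            # Conflicting signals: "index me" (sitemap) vs "don't" (meta tag)
            conflicts[url] = "noindex page listed in sitemap"
        elif url in redirects:
            # Sitemaps should list final URLs, not redirect sources
            conflicts[url] = "redirect source listed in sitemap -> " + redirects[url]
    return conflicts
```

Running this after every deployment catches the most common silent failure: a `noindex` tag shipped to a high-value page that is still being submitted for indexing via the sitemap.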
Page Speed and Core Web Vitals Checklist
Core Web Vitals are Google ranking signals and affect AI crawler efficiency.
- LCP (Largest Contentful Paint) under 2.5 seconds on all key pages (field data)
- INP (Interaction to Next Paint) under 200ms on all interactive pages
- CLS (Cumulative Layout Shift) under 0.1 on all pages
- TTFB (Time to First Byte) under 800ms
- All images served in WebP or AVIF format
- Width and height attributes set on all images (prevents CLS)
- Above-the-fold images preloaded with `fetchpriority="high"`
- No render-blocking CSS or JavaScript in the critical path
- Web fonts loaded with `font-display: swap`
- Third-party scripts loaded asynchronously or deferred
- CDN in use for static assets (images, CSS, JavaScript)
- Browser caching configured for static assets (1 year for versioned assets)
- HTTP/2 or HTTP/3 enabled on the web server
- Total page weight under 2MB for all key landing pages
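Several of these items come together in the document head and hero markup. A sketch of the relevant patterns, with placeholder file paths and font names:

```html
<head>
  <!-- Preload the LCP image and mark it high priority -->
  <link rel="preload" as="image" href="/img/hero.avif" fetchpriority="high">
  <!-- Swap in web fonts without blocking first paint -->
  <style>
    @font-face {
      font-family: "Inter";
      src: url("/fonts/inter.woff2") format("woff2");
      font-display: swap;
    }
  </style>
  <!-- Defer non-critical third-party scripts -->
  <script src="https://analytics.example.com/script.js" defer></script>
</head>
<body>
  <!-- Explicit dimensions reserve layout space and prevent CLS -->
  <img src="/img/hero.avif" width="1200" height="630"
       alt="Product dashboard screenshot" fetchpriority="high">
</body>
```

The `width` and `height` attributes do not fix the rendered size — CSS still controls that — but they let the browser compute the aspect ratio and reserve space before the image downloads, which is what prevents layout shift.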
Structured Data Checklist
Schema markup is essential for both rich results and AI citation accuracy.
- Organization schema on homepage (name, url, logo, sameAs social profiles)
- WebSite schema with SearchAction on homepage
- SoftwareApplication schema on pricing and product pages (for SaaS sites)
- Article schema on all blog posts and guides
- FAQPage schema on all pages with FAQ sections (minimum 4 Q&A pairs)
- BreadcrumbList schema on all interior pages
- No schema validation errors (tested via Google's Rich Results Test)
- Schema markup uses JSON-LD format (not microdata or RDFa)
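For reference, an Organization schema block in JSON-LD placed in the homepage head might look like this (all values are placeholders for your own company details):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example SaaS Co",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "description": "Example SaaS Co provides workflow automation for B2B teams.",
  "sameAs": [
    "https://www.linkedin.com/company/example-saas",
    "https://x.com/examplesaas"
  ]
}
</script>
```

Validate the output with Google's Rich Results Test before deploying — a single trailing comma or unescaped quote invalidates the entire JSON-LD block silently.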
Security and HTTPS Checklist
- Valid SSL certificate installed (not expiring within 30 days)
- HTTPS enforced on all pages (no HTTP alternatives accessible)
- No mixed content warnings (all resources loaded over HTTPS)
- HSTS header present on all responses
- X-Content-Type-Options: nosniff header present
- X-Frame-Options or Content-Security-Policy frame-ancestors configured
- X-XSS-Protection header omitted or set to `0` (the header is deprecated, ignored by modern browsers, and its legacy filter behaviour could itself introduce vulnerabilities)
- `/.well-known/security.txt` file present
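A simple automated check for the header items is to fetch any page and diff its response headers against a required set. A minimal sketch, assuming you already have the response headers as a dict (the function and the exact required set are illustrative — tune the set to your own policy):

```python
# Headers every HTML response should carry; header names are
# case-insensitive, so comparison is done in lowercase
REQUIRED_SECURITY_HEADERS = {
    "strict-transport-security",
    "x-content-type-options",
    "content-security-policy",
}

def missing_security_headers(response_headers):
    """Return a sorted list of required security headers absent
    from a response. response_headers maps name -> value, any casing."""
    present = {name.lower() for name in response_headers}
    return sorted(REQUIRED_SECURITY_HEADERS - present)
```

Wire this into CI against your homepage and one page per template so a misconfigured reverse proxy or CDN rule cannot silently drop HSTS on part of the site.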
AI Optimisation Checklist
The additional technical requirements of AI-powered search in 2026.
- `robots.txt` explicitly allows all major AI crawlers (see Crawlability section)
- `/llms.txt` present and lists 10+ key pages with descriptions
- `/llms.txt` updated whenever major new content is published
- IndexNow key file present at domain root for fast Bing/Yandex submission
- FAQPage schema present on all content pages (critical for AI Overview eligibility)
- All content pages have named authors with professional bios
- Organization schema includes comprehensive company description
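`llms.txt` is still an emerging convention rather than a ratified standard, but the commonly proposed shape is a Markdown file at the domain root: an H1 title, a one-line summary in a blockquote, then sections of annotated links. A sketch with placeholder URLs and descriptions:

```
# Example SaaS Co

> Example SaaS Co provides workflow automation for B2B teams.

## Key pages

- [Pricing](https://www.example.com/pricing): Plans, tiers, and feature comparison
- [Product overview](https://www.example.com/product): Core features and integrations
- [Technical SEO guide](https://www.example.com/blog/technical-seo): Full audit checklist

## Docs

- [API reference](https://www.example.com/docs/api): REST API endpoints and authentication
```

Because the file is hand-curated, tie its update into your publishing workflow — a stale `llms.txt` that omits your newest cornerstone content defeats its purpose.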
Mobile Optimisation Checklist
Google uses mobile-first indexing — your mobile site is your primary site.
- Site renders and functions correctly on mobile (Google retired its standalone Mobile-Friendly Test; verify via Lighthouse or Chrome DevTools device emulation)
- No horizontal scrolling at 375px viewport width
- Tap targets (buttons, links) minimum 44×44 pixels
- Font size minimum 16px for body text
- Viewport meta tag present on all pages: `<meta name="viewport" content="width=device-width, initial-scale=1">`
- No Flash or other non-mobile-compatible technology in use
How to Use This Checklist
The most efficient approach is to work through sections in priority order: Crawlability and Indexability first (these are binary blockers — issues here prevent everything else from working), then Core Web Vitals (these affect ranking directly), then Structured Data (high ROI for both rich results and AI visibility), then the remaining sections.
Run this full checklist quarterly. Before major site migrations, redesigns, or technology changes, run it in full — technical issues introduced during major changes are the most common source of ranking drops.
Frequently Asked Questions
How often should I run a technical SEO audit?
Monthly for sites that actively publish content and deploy code changes. Quarterly at minimum for stable sites. Always run a full audit before and after major deployments, CMS upgrades, or domain migrations.
Which technical issues have the biggest ranking impact?
Crawlability and indexability issues have the highest impact because they can silently exclude large sections of your site from search indexes entirely. Core Web Vitals have the next-highest direct ranking impact. Schema markup has the highest AI citation impact.
Can I run this audit myself, or do I need a developer?
Most checklist items can be verified without developer involvement — you need access to Google Search Console, Google's Rich Results Test, and your website's robots.txt and source HTML. Developer involvement is needed to implement fixes.
Does a technical audit help with AI search as well as Google?
Yes — and increasingly the AI-specific items are the highest-priority new additions to technical audits. The llms.txt file, AI crawler allow rules, and FAQPage schema are now essential for AI search visibility and were not on technical SEO checklists even 18 months ago.
Audit Your Technical SEO Today
Technical issues accumulate silently. By the time they manifest as ranking drops or traffic declines, months of visibility have already been lost. The best time to run a technical audit is before problems appear.
Run a free OmniRank technical audit today — it checks all of these signals automatically across your entire site and delivers a prioritised fix list in minutes. No developer required.
OmniRank Editorial Team
SEO & AI Research Team
The OmniRank team combines expertise in AI, SEO, and SaaS growth to deliver actionable insights that help websites rank across Google, AI search engines, and LLM citation networks.