Technical SEO Checklist Template: Audit Your Site Health
Technical SEO is the foundation upon which all other search optimisation efforts rest. You can write the most compelling content and earn the most authoritative backlinks, but if search engines cannot properly crawl, render, and index your pages, none of it matters. A technical SEO checklist template provides a structured approach to evaluating your site’s infrastructure and identifying issues that could be silently undermining your search performance.
Singapore businesses face specific technical SEO considerations. With mobile traffic accounting for over 90 percent of searches in the region, mobile performance is not optional — it is the primary experience. Additionally, many Singapore SMEs operate on shared hosting with Southeast Asian data centres, where server response times and uptime can vary. These factors make regular technical audits essential rather than a one-off exercise.
This article presents a comprehensive technical SEO checklist template covering eight critical areas: crawlability, indexation, site speed, mobile-friendliness, HTTPS security, structured data, XML sitemaps, and robots.txt configuration. Each section includes specific checkpoints, what to look for, and how to resolve common issues. Whether you are auditing your own site or reviewing a client’s, this template will ensure nothing is missed.
Crawlability Checks
Crawlability refers to how easily search engine bots can discover and access your pages. If Googlebot cannot reach a page, it cannot rank it. These checks ensure your site architecture supports efficient crawling.
| # | Checkpoint | What to Review | Status |
|---|---|---|---|
| 1 | Crawl budget efficiency | Important pages are crawled frequently; low-value pages are not consuming crawl budget | |
| 2 | Internal link depth | No important page is more than three clicks from the homepage | |
| 3 | Orphan pages | All pages meant to rank have at least one internal link pointing to them | |
| 4 | Redirect chains | No redirect chains longer than two hops; all redirects resolve within one second | |
| 5 | Redirect loops | No circular redirects that trap crawlers in an infinite loop | |
| 6 | Server response codes | All important pages return 200 status codes; 4xx and 5xx errors are identified and resolved | |
| 7 | JavaScript rendering | Content rendered via JavaScript is accessible to Googlebot (test with URL Inspection tool) | |
| 8 | Pagination | Paginated pages are linked with plain, crawlable anchor links (Google no longer uses rel="next"/"prev" for indexing); load-more content remains accessible to crawlers | |
How to audit: Run a full crawl using Screaming Frog, Sitebulb, or Lumar (formerly DeepCrawl). These tools simulate how a search engine crawls your site and report on response codes, redirect chains, orphan pages, and crawl depth. Cross-reference the results with Google Search Console’s Page Indexing report (formerly Coverage) to see which pages Google has actually crawled and indexed.
For more context on crawlability within a broader audit framework, our SEO audit guide explains how to interpret crawl data and prioritise fixes.
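Checkpoints 4 and 5 (redirect chains and loops) can be spot-checked with a short script. This is a minimal sketch, not a crawler replacement: the chain-following logic is separated from the HTTP layer so it can be driven by any fetcher you choose; for live checks, a function wrapping `requests.get(url, allow_redirects=False)` would be one option.

```python
def redirect_chain(url, fetch, max_hops=10):
    """Follow a URL's redirect chain and return the list of (status, url) hops.

    fetch(url) must return (status_code, redirect_location_or_None).
    Capping at max_hops also guards against redirect loops (checkpoint 5).
    """
    hops = []
    current = url
    for _ in range(max_hops):
        status, location = fetch(current)
        hops.append((status, current))
        if status in (301, 302, 307, 308) and location:
            current = location
        else:
            break
    return hops
```

Any chain returning more than three hops, or one that never reaches a 200, is worth a fix ticket.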
Indexation and Canonicalisation
Getting crawled is step one; getting indexed is step two. Indexation checks ensure that the pages you want in Google’s index are there, and the pages you do not want indexed are properly excluded.
| # | Checkpoint | What to Review | Status |
|---|---|---|---|
| 9 | Index coverage | Number of indexed pages in Search Console matches the number of pages you intend to rank | |
| 10 | Noindex tags | Only pages you deliberately want excluded carry a noindex meta tag | |
| 11 | Canonical tags | Every page has a self-referencing canonical tag; duplicate pages point to the preferred version | |
| 12 | Canonical consistency | Canonical URL matches the URL in the sitemap and internal links (no trailing-slash mismatches) | |
| 13 | Duplicate content | No substantial duplicate content across pages; parameterised URLs handled with canonicals or robots directives | |
| 14 | Thin pages | Pages with minimal content are either expanded, consolidated, or noindexed | |
| 15 | Hreflang implementation | If serving multiple languages or regions, hreflang tags are correctly implemented and reciprocal | |
Common issues in Singapore: Many Singapore businesses operate bilingual or trilingual websites. Without proper hreflang tags, Google may serve the wrong language version to users, or competing language versions may cannibalise each other’s rankings. Additionally, e-commerce sites frequently generate duplicate content through product variations, filter URLs, and sorting parameters. Proper canonical and parameter handling is essential.
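To illustrate checkpoints 11 and 15, here is what a correct head section might look like on the English page of a hypothetical bilingual Singapore site (all URLs are placeholders; the Chinese-language page must carry the mirror-image set with its own self-referencing canonical, or the hreflang annotations are not reciprocal and will be ignored):

```html
<!-- Hypothetical English-language page -->
<link rel="canonical" href="https://www.example.com/en/services/" />
<link rel="alternate" hreflang="en-sg" href="https://www.example.com/en/services/" />
<link rel="alternate" hreflang="zh-sg" href="https://www.example.com/zh/services/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/services/" />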
Site Speed and Core Web Vitals
Page speed is both a ranking factor and a critical user experience metric. Google’s Core Web Vitals — Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative Layout Shift (CLS) — measure real-world performance as experienced by your visitors.
| # | Checkpoint | Target | Status |
|---|---|---|---|
| 16 | Largest Contentful Paint (LCP) | Under 2.5 seconds on mobile; under 2.0 seconds on desktop | |
| 17 | Interaction to Next Paint (INP) | Under 200 milliseconds | |
| 18 | Cumulative Layout Shift (CLS) | Under 0.1 | |
| 19 | Time to First Byte (TTFB) | Under 800 milliseconds; ideally under 200ms | |
| 20 | Total page weight | Under 2MB for content pages; under 3MB for media-heavy pages | |
| 21 | Render-blocking resources | Critical CSS inlined; JavaScript deferred or loaded asynchronously | |
| 22 | Image optimisation | All images compressed, served in modern formats (WebP/AVIF), and properly sized | |
| 23 | Browser caching | Static assets have cache-control headers with appropriate expiry periods | |
| 24 | CDN usage | Content delivered via a CDN with edge nodes in or near Singapore | |
Testing tools: Use Google PageSpeed Insights for lab and field data, Chrome DevTools for detailed performance profiling, and the Core Web Vitals report in Search Console for site-wide trends. For Singapore-specific testing, ensure you test from a Singapore-based connection or use tools that allow you to specify the test location.
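The targets in the table above correspond to Google’s published “good” thresholds, which sit alongside a second boundary separating “needs improvement” from “poor” (4.0 s for LCP, 500 ms for INP, 0.25 for CLS). A small helper, sketched here with the standard library only, classifies field values against those boundaries:

```python
# Core Web Vitals thresholds: (good_upper_bound, poor_lower_bound).
# LCP and INP are in milliseconds; CLS is a unitless score.
THRESHOLDS = {
    "lcp": (2500, 4000),
    "inp": (200, 500),
    "cls": (0.1, 0.25),
}

def classify(metric, value):
    """Return Google's rating band for a single field-data value."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```

A page must be “good” on all three metrics at the 75th percentile of real-user data to pass the Core Web Vitals assessment.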
If your site is built on WordPress, common speed culprits include unoptimised themes, excessive plugins, and lack of server-level caching. A purpose-built, performance-optimised site from experienced web design professionals can eliminate many of these issues from the start.
Mobile-Friendliness
Google uses mobile-first indexing, meaning it primarily evaluates the mobile version of your site for ranking purposes. A site that performs well on desktop but poorly on mobile will struggle to rank, regardless of how strong its content or backlinks are.
| # | Checkpoint | What to Review | Status |
|---|---|---|---|
| 25 | Responsive design | All pages adapt properly to screen widths from 320px to 1920px | |
| 26 | Viewport meta tag | Every page includes a correctly configured viewport meta tag | |
| 27 | Tap target size | All clickable elements are at least 48×48 pixels with adequate spacing | |
| 28 | Font readability | Base font size is at least 16px; text is readable without zooming | |
| 29 | No horizontal scrolling | Content fits within the viewport without requiring horizontal scroll on any device | |
| 30 | Mobile content parity | Mobile version contains the same content as desktop; nothing hidden or removed | |
| 31 | Intrusive interstitials | No full-screen pop-ups that block content on mobile (Google penalty risk) | |
Testing approach: Test on actual devices, not just browser emulators. Borrow or maintain a small set of popular devices in Singapore — typically an iPhone (latest and one generation back), a Samsung Galaxy mid-range, and a budget Android device. Real-device testing reveals issues that emulators miss, such as touch responsiveness, font rendering, and actual load times on mobile networks.
HTTPS and Security
HTTPS has been a confirmed ranking signal since 2014, and in 2026 it is a baseline expectation. Beyond SEO, HTTPS protects user data, builds trust, and is required for many modern web features including service workers, geolocation APIs, and payment processing.
| # | Checkpoint | What to Review | Status |
|---|---|---|---|
| 32 | SSL certificate | Valid SSL certificate installed; not expired or self-signed | |
| 33 | HTTPS enforcement | All HTTP URLs redirect to HTTPS via 301 redirects | |
| 34 | Mixed content | No HTTP resources (images, scripts, stylesheets) loaded on HTTPS pages | |
| 35 | HSTS header | HTTP Strict Transport Security header is set with an appropriate max-age | |
| 36 | Security headers | X-Content-Type-Options, X-Frame-Options, and Content-Security-Policy headers are configured | |
Mixed content is one of the most common issues we see on Singapore business websites. It occurs when a page loads over HTTPS but includes resources (typically images or embedded scripts) served over HTTP. This triggers browser warnings, breaks the padlock icon, and can prevent the page from loading correctly on strict browsers. Use a crawler or browser developer tools to identify all mixed content instances and update the resource URLs to HTTPS.
Structured Data and Schema Markup
Structured data helps search engines understand the meaning behind your content, enabling rich results such as FAQ dropdowns, star ratings, breadcrumbs, and event listings directly in the search results. While not a direct ranking factor, structured data improves click-through rates and visibility.
| # | Checkpoint | What to Review | Status |
|---|---|---|---|
| 37 | Organisation schema | Homepage includes Organisation or LocalBusiness schema with name, logo, contact, and address | |
| 38 | Breadcrumb schema | BreadcrumbList markup matches visible breadcrumbs on all pages | |
| 39 | Article schema | Blog posts use Article or BlogPosting schema with headline, author, datePublished, and image | |
| 40 | FAQ schema | FAQ sections use FAQPage schema; questions and answers are visible on the page | |
| 41 | Product schema | E-commerce product pages include Product schema with name, price, availability, and reviews | |
| 42 | Validation | All structured data passes Google’s Rich Results Test without errors or warnings | |
| 43 | JSON-LD format | Structured data is implemented in JSON-LD (Google’s preferred format) rather than Microdata | |
Singapore-specific considerations: LocalBusiness schema is particularly important for businesses targeting local search. Include your Singapore address, phone number with the +65 country code, operating hours, and the geo coordinates of your business location. This data reinforces your local relevance and supports your Google Business Profile listing.
For a broader view of how structured data fits into your overall search strategy, our technical SEO checklist explains the interplay between structured data and other technical elements.
XML Sitemap and Robots.txt
The XML sitemap and robots.txt file are your primary tools for communicating with search engine crawlers. The sitemap tells crawlers which pages exist and when they were last updated, while robots.txt tells crawlers which areas of your site they should or should not access.
| # | Checkpoint | What to Review | Status |
|---|---|---|---|
| 44 | Sitemap exists | An XML sitemap is accessible at /sitemap.xml or a declared location | |
| 45 | Sitemap submitted | Sitemap is submitted and accepted in Google Search Console (no errors) | |
| 46 | Sitemap accuracy | Only indexable, canonical URLs are included; no noindexed, redirected, or 404 pages | |
| 47 | Sitemap freshness | Last modified dates in the sitemap reflect actual content changes | |
| 48 | Sitemap size | Sitemaps contain fewer than 50,000 URLs each; larger sites use a sitemap index | |
| 49 | Robots.txt accessible | File exists at /robots.txt and returns a 200 status code | |
| 50 | Robots.txt directives | No important pages or directories are blocked; test with the robots.txt report in Search Console | |
| 51 | Sitemap reference | Robots.txt includes a Sitemap directive pointing to the XML sitemap | |
| 52 | Crawl-delay | No unnecessary crawl-delay directive that could slow indexation | |
Common mistakes:
- Including URLs in the sitemap that return 301 redirects, 404 errors, or carry noindex tags — this sends conflicting signals to crawlers
- Blocking CSS and JavaScript files in robots.txt, which prevents Google from rendering your pages correctly
- Forgetting to update the sitemap after a site migration, leaving old URLs in place and omitting new ones
- Using a wildcard disallow rule (Disallow: /) in robots.txt accidentally, which blocks the entire site from crawling
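A minimal robots.txt that avoids these mistakes might look like the following (the domain and paths are illustrative only; note that no CSS or JavaScript directories are blocked, and the sitemap is declared):

```text
User-agent: *
Disallow: /cart/
Disallow: /*?sort=

Sitemap: https://www.example.com/sitemap.xml
```

Keep the file short: robots.txt controls crawling, not indexing, so pages that must stay out of the index need a noindex tag rather than a Disallow rule.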
Putting It All Together: Audit Workflow
Having a checklist is essential, but knowing how to work through it efficiently is equally important. Follow this workflow to conduct a systematic technical SEO audit.
Step 1: Gather your tools. You will need Google Search Console, a site crawler (Screaming Frog or Sitebulb), PageSpeed Insights, and Google’s Rich Results Test. For larger sites, a log file analyser like Screaming Frog Log File Analyser provides additional crawl behaviour insights.
Step 2: Run the crawl. Configure your crawler to respect robots.txt and crawl at a reasonable rate (two to five URLs per second). Let it run until complete, then export the data. This single crawl provides data for the majority of your checklist items.
Step 3: Review Search Console. Check the Page Indexing report (formerly Coverage) for indexation issues, the Core Web Vitals report for speed problems, and the Enhancements section for structured data errors. The standalone Mobile Usability report has been retired, so verify mobile issues through Lighthouse and real-device testing instead. Cross-reference these findings with your crawl data.
Step 4: Work through the checklist. Go section by section, marking each checkpoint as pass, warning, or fail. Add notes with specific URLs and evidence for every issue you find. This documentation is critical for creating actionable fix tickets.
Step 5: Prioritise and assign. Group all fails and warnings by severity. Critical issues that block crawling or indexing should be fixed immediately. Performance issues and structured data improvements can be scheduled over the following weeks. If you need expert assistance resolving complex technical issues, consider engaging a team that offers comprehensive SEO services with technical depth.
Step 6: Re-audit. Schedule a follow-up audit in 30 days to verify that fixes have been implemented correctly and are producing the expected improvements. Then maintain a quarterly audit cycle going forward. Integrating this technical checklist with a broader SEO audit framework ensures you cover all dimensions of search performance, not just the technical layer.
Frequently Asked Questions
What is the difference between a technical SEO audit and a full SEO audit?
A technical SEO audit focuses specifically on the infrastructure of your website — crawlability, indexation, speed, security, and structured data. A full SEO audit includes technical checks but also covers on-page optimisation, content quality, backlink health, and competitive analysis. Think of the technical audit as one essential component of the larger whole.
Which crawling tool is best for technical SEO audits?
Screaming Frog is the industry standard for small to medium sites, offering a free version for up to 500 URLs and an affordable licence for unlimited crawling. Sitebulb provides a more visual interface and built-in prioritised recommendations, making it ideal for those who prefer guided audits. For enterprise-scale sites with millions of pages, cloud-based crawlers like Lumar (formerly DeepCrawl) or Oncrawl are better suited.
How long does a technical SEO audit take?
For a site with 100 to 500 pages, a thorough technical audit typically takes three to five hours, including crawling, analysis, and documentation. Sites with thousands of pages may require a full day or more. The actual crawl time depends on your server speed and crawler settings, but most small sites finish crawling within 15 to 30 minutes.
Do I need developer access to fix technical SEO issues?
Many technical SEO fixes require changes to server configuration, HTML templates, or JavaScript code, so developer involvement is usually necessary. However, some fixes can be implemented through CMS settings or plugins without coding. For example, WordPress plugins like Yoast or Rank Math can handle canonical tags, sitemaps, and basic schema markup. For more complex issues like redirect management, server headers, or JavaScript rendering problems, developer support is essential.
How do Core Web Vitals affect rankings in 2026?
Core Web Vitals are a confirmed ranking factor within Google’s page experience signals. While they are unlikely to override strong content relevance and backlink authority, they serve as a tiebreaker when competing pages are otherwise equal. In practice, pages that pass all Core Web Vitals thresholds tend to rank slightly higher than equivalent pages that fail, and they deliver better user experience, which indirectly supports engagement metrics.
Should I worry about crawl budget for a small website?
For most Singapore SME websites with fewer than 1,000 pages, crawl budget is not a significant concern. Google allocates sufficient crawl resources for small sites. However, if your site generates thousands of parameterised URLs, has extensive faceted navigation, or includes large sections of thin or duplicate content, these can waste crawl budget even on smaller sites. The key is to ensure that your most important pages are easily accessible and that low-value pages are managed with noindex tags or canonical directives.