Technical SEO Audit Checklist

Technical SEO should be the foundation of any SEO strategy to improve organic search visibility. I started SEO auditing around 2010 and have run more than 100 SEO audits over the past decade, both in-house and as one-off projects.

Understanding HTTP Status Codes

As part of a technical SEO site audit, the first thing you need to pay attention to is the HTTP status codes of your site's pages and resources. A status code is issued by a server in response to a browser's request. There are over 60 different status codes, each with its own meaning. The most common ones you'll come across during technical SEO audits are the 2xx success and 3xx redirection codes, plus the problematic 4xx client error and 5xx server error codes, as shown in the table below.

Status Code | What it means
200 | OK (success).
301 | Permanent redirect: the requested resource has moved permanently to another location.
302 | Temporary redirect: the requested resource has moved temporarily to another location.
400 | Bad Request: the server cannot process a malformed request.
403 | Forbidden: access to the requested resource is not allowed.
404 | Not Found: the requested resource does not exist at that location.
410 | Gone: the requested resource has been permanently removed from that location.
500 | Internal Server Error: a generic server-side error.
503 | Service Unavailable: the server is overloaded or undergoing maintenance.


To identify the status codes of your website's URLs, you can use a number of different methods, such as:

  • The Ayima Redirect Path browser extension, to spot-check individual pages.
  • A site crawler such as Screaming Frog, Botify, or Deepcrawl (among others), to crawl your entire site.
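
For quick one-off checks from the command line, curl works too. A minimal sketch (the URL is a placeholder): the first command prints just the status code, and the second follows redirects and prints each hop's status line and Location header.

curl -s -o /dev/null -w "%{http_code}\n" https://www.example.com/some-page/
curl -sIL https://www.example.com/some-page/ | grep -iE "^(HTTP|location)"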

Problematic status codes to look for when auditing a site for technical SEO issues are 301s, 302s, 404s, 410s, 5xx errors, and long redirect chains. Many 3xx and 4xx issues can be resolved by updating internal links to point at the correct live URL instead of relying on redirects. If you encounter a large number of server errors, speak to the IT team that maintains the servers.

The Technical SEO Site Audit Checklist

Before launching a technical site audit, it's important to gather basic site information. Determine the site's CMS and any known CMS restrictions. If the site was recently migrated, note the reason, timing, and any impact on rankings or performance. Verify whether there's a staging environment and whether it's indexed by Google. Check if there are any reported manual actions or security issues in Search Console. Find out if the site has been impacted by core updates. Finally, determine the server location from its IP address and check the domain's history. Now, let's delve into the details of the technical SEO audit checklist:

Fix Broken Links (404s)

Fixing broken links improves user experience; left in place, they hurt engagement with the website. A large number of 404s is typical on eCommerce sites due to out-of-stock products and discontinued categories.

To fix this, restore the URL if possible; if not, remove all internal links pointing to these URLs and implement 301 redirects to the most relevant live page on the site.
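
For example, on an Apache server a one-off 301 can be added in .htaccess; the paths below are hypothetical, and most CMSs and redirect plugins offer an equivalent.

Redirect 301 /discontinued-product/ https://www.example.com/womens-dresses/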

Fix internal Redirects (301/302)

Fix internal links that point to redirecting URLs, especially redirect chains, which add hops before a bot reaches the final page and can dilute the link equity passed to it. Update each internal link to point directly to the final destination URL.

Fix Duplicate Page Titles & Meta Descriptions

Duplicate titles can cause cannibalization issues. All indexable pages should have a unique title tag.

Ensure you write unique, enticing meta descriptions to help with CTR from the SERPs.

Review your Heading Tags (H1s)

H1s are used to tell Google what the page is about and can be very important when optimising pages for target search terms. Make sure every indexable page has a unique H1 tag and that no page has multiple or duplicate H1 tags.

In short, ensure every page has a heading, a page title, and a meta description: title tags and H1s can influence how Google ranks the page, and each page should carry a unique meta description.
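
As a reference, the three elements together look like this minimal sketch (the copy is illustrative):

<title>Maxi Dresses | Example Store</title>
<meta name="description" content="Shop our range of maxi dresses, from casual day styles to occasionwear, with free delivery.">
<h1>Maxi Dresses</h1>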

Review Noindexed Pages

Regularly audit your website to ensure pages that aren’t beneficial remain set to noindex. This status is defined by the meta directive in the <head> section of the page.

<meta name="robots" content="noindex">

Be mindful of any internal ‘follow’ links leading to pages set to ‘noindex’ and ‘nofollow’ links pointing to indexable pages. Also, check that your internal search result pages are non-indexable.
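
Note that noindex can also be served as an HTTP response header rather than a meta tag, which is handy for non-HTML resources such as PDFs:

X-Robots-Tag: noindex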

Site Architecture & Internal Links

Having a clear site structure with optimal internal linking is key for effective search engine indexing and user experience. The structure aids user navigation and helps search engine bots understand your site hierarchy. Additionally, internal links distribute link equity and PageRank throughout the site.

To optimise your internal links, ensure they’re relevant and reduce the amount of duplicate content to prevent wasting link equity. Remove low-quality pages and use the “nofollow” directive to avoid passing link equity to unnecessary pages. Avoid adding too many links on pages and identify unlinked pages using site crawl reports, log files, search console, and analytics data. Include less important pages or directory structures in the robots.txt file to prevent them from being crawled and distributing link equity.

[Image: Reduce the crawl depth of your most important pages]

Focusing on the crawl depth of important pages can help distribute link equity more effectively. Links can be found in various areas such as the navigation menu, body content, footer, sidebar, or related sections. It’s vital to have your homepage link to the most popular pages of your site, as it usually receives the most external links.

Review internal links by manually checking or using tools like ScreamingFrog, Botify, or Deepcrawl. Visualising your site’s internal linking structure and crawl depth can be done using Sitebulb or ScreamingFrog. Regularly review your site’s internal linking to the most important pages, focusing on pages with high search demand and revenue contribution. Ensure that category pages are linking to relevant sub-categories and product pages. Verify if product pages are linking back to sub-category and category pages. Additionally, confirm that category pages are interlinking with each other, and product pages are linking to other related product pages.

Some examples of sites getting internal linking right:

[Image: the 'Find similar items here' section on Sports Direct]

[Image: the 'Vacation Destinations Near' section on HomeToGo]

[Image: the 'Explore related categories & searches' section on Etsy]

Ensure print versions are non-indexable, non-www to www redirects (or vice versa) are set up, and important pages are within 4 clicks of the homepage. Use descriptive anchor text in internal links pointing to important pages. Check for overly long URLs and URLs containing uppercase characters, underscores, or non-ASCII characters. Verify that your URL structure reflects the site's hierarchy and that no pages are duplicated due to poor architecture.
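
For instance, descriptive anchor text beats generic labels (the URLs below are placeholders):

Prefer: <a href="/women/dresses/maxi-dresses/">Maxi dresses</a>
Avoid:  <a href="/page?id=12345">Click here</a>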

Check for correctly implemented breadcrumbs across the site, ensure primary navigation is user-friendly, and check if faceted navigation creates indexable URLs. Utilise an HTML Sitemap, and check for redirect loops, 5xx server errors, 4xx errors, or any internal links leading to 404 errors.

Remember, a well-optimised internal linking structure is key to a search engine’s understanding of your site’s hierarchy.

URL Structure

URLs act as a minor ranking factor, so pay attention to your URL structure. Make URLs human-readable so both search engines and users can understand the destination page and the structure of your website. They shouldn't be long or cluttered, and keyword-rich URLs are better for SEO.

Let's take an example URL from the House of Fraser website: https://www.houseoffraser.co.uk/women/dresses/maxi-dresses. The format makes it clear to both users and search engines what the page is about and what they're likely to find on it, so lay out your URLs in an ordered format wherever possible. You can also see the journey a user follows to reach that maxi dresses page.

On the relative importance of URL structure versus click depth, John Mueller has said that click depth determines page importance more than URL structure does, so ensure your key pages are linked as closely to the homepage as possible.

HTTPS Website Security

HTTPS, the secure data transfer protocol, boosts the trustworthiness of your website and acts as a lightweight ranking signal for Google. As your site's security is crucial, ensure that all components such as internal links, images, and ad networks load via HTTPS.

  • Set up automatic redirection from HTTP to HTTPS (a sample rewrite rule follows this list) and maintain a valid SSL certificate.
  • Ensure that all your pages correctly redirect from HTTP to HTTPS, with no URLs left on HTTP.
  • Watch out for any mixed content loading over both protocols.
  • Check the security headers used and look for any missing ones.

Review the Robots.txt File

The robots.txt file, placed in the root directory, instructs crawlers and user agents (Googlebot, Bingbot, Yandexbot, etc.) on which parts of your site to crawl or ignore. Its 'Allow' and 'Disallow' directives guide crawlers, and the 'Sitemap' directive points them to your XML sitemap. Blocking certain sections of a website in robots.txt prevents crawl budget being wasted on sections you don't want crawled.

Be sure not to block important site sections, and do disallow internal site search pages to avoid duplicate content being crawled.
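
A minimal robots.txt sketch; the disallowed paths and sitemap URL are placeholders for your own.

User-agent: *
Disallow: /search/
Disallow: /checkout/

Sitemap: https://www.example.com/sitemap_index.xml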

Use the robots.txt report in Search Console (which replaced the old robots.txt Tester) to check that your robots.txt can be fetched and parsed correctly.

Page Speed Checks & Improvements

Page speed is critical for both user experience and SEO, with faster pages typically leading to better rankings and conversions. Pages that load slower often suffer from higher bounce rates, negatively affecting your website’s performance and ranking.

Pay special attention to Page Speed and Core Web Vitals metrics. Check your website’s Time to First Byte (TTFB), total page weight, fully loaded time, and number of requests. Core Web Vitals such as First Contentful Paint, Largest Contentful Paint, Total Blocking Time, Time to Interactive, Interaction to Next Paint (INP) and Cumulative Layout Shift should also be regularly monitored. These metrics provide a comprehensive view of your site’s performance and indicate where optimisations can be made.
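
For a quick, rough spot-check of TTFB and total load time from your own location, curl's timing variables can help (the URL is a placeholder); the lab tools mentioned below give far fuller data.

curl -s -o /dev/null -w "TTFB: %{time_starttransfer}s  Total: %{time_total}s\n" https://www.example.com/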

Several strategies can help with your performance, such as:

  • Choose a reliable web hosting company.
  • Improve server response time.
  • Minify HTML, JavaScript, and CSS.
  • Check if GZIP compression is enabled (a quick check is sketched after this list).
  • Optimise your images. Having lots of images, or very large images, on a page not only hurts performance for users but can also affect the page's ability to rank on Google. Compress images with a lossless optimiser such as ImageOptim or FileOptimizer so they download faster without losing quality, and lazy-load images below the fold.
  • Eliminate render-blocking resources.
  • Enable caching: server and browser caching, ideally served through a CDN.
  • Check if the website is using the HTTP/2 protocol.
  • Audit Google Tag Manager to remove unnecessary tags.
  • and more…
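
One quick way to confirm text compression is to request a page with an Accept-Encoding header and inspect the response headers (the URL is a placeholder); a content-encoding value of gzip or br means compression is on.

curl -s -o /dev/null -D - -H "Accept-Encoding: gzip, br" https://www.example.com/ | grep -i "content-encoding"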

Various tools, such as Google PageSpeed Insights / Lighthouse, WebPageTest, and Google Search Console's Core Web Vitals report, provide valuable in-depth analysis, recommendations, and quick fixes. Addressing these issues can lead to significant savings in load time.

A comprehensive performance improvement strategy using a mix of speed-check tools is encouraged, as each has the potential to reveal unique insights.

Check your XML Sitemap for Issues

An XML sitemap is a way of telling search engines about the site URLs (pages, videos, images, etc.) you wish to have indexed in search results. An HTML sitemap, on the other hand, helps users navigate the site. Each XML sitemap entry must contain the URL; the last modified date and other optional fields, such as alternate language versions for an international site, can also be included. A clean sitemap quickly shows search engine crawlers the key pages you want them to discover sooner, which is especially beneficial for large sites.
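
A minimal sitemap sketch, with placeholder URLs and dates:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/women/dresses/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/women/dresses/maxi-dresses/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>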

Ensure your site has a valid XML sitemap or sitemap index containing only indexable, 200 status code, and self-canonicalised site URLs and submit it to the search console.

Using a site crawler or your search console, identify any XML sitemap issues such as the inclusion of non-200 status code URLs, non-self canonicalised URLs, or non-indexable URLs. As part of your sitemap audit, you might also discover orphaned URLs in the XML sitemap, which can then be linked within your site’s architecture.

Make sure the XML sitemap has been submitted and successfully processed by Google Search Console. Resolve the issues related to indexable URLs that are not included in the XML sitemaps, a situation that frequently occurs in eCommerce websites.

To check if a site has an XML sitemap or sitemap index, check the site’s robots.txt and look for the sitemap declaration.

Check Mobile-First Indexing Best Practices

In this mobile-first era, where users are increasingly active on mobile devices, effective mobile-first SEO strategies and a mobile-friendly site are crucial. Google, having introduced mobile-first indexing in March 2018, primarily crawls, indexes, and ranks websites based on their mobile versions. See Google's developer documentation if you want to read in-depth details on mobile-first indexing best practices.

So, what are the best practices to ensure mobile readiness?

  • Maintain parity between your mobile and desktop sites, covering all valuable content like Menu Links, Main body content, Footer and Sidebar links, and Schema markup. If you have a good responsive design (Google recommends this), you should be OK.
  • Responsiveness: Your website must be responsive to provide an optimal viewing experience across different devices and browsers. If your site uses separate URLs for desktop and mobile versions, it's essential to use rel="alternate" tags to define the mobile version of a page. For instance, place <link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page.html"/> on your desktop version. Simultaneously, your mobile page should have a canonical tag pointing back to the desktop version.
  • Ensure your site is mobile-friendly, free from any usability issues on mobile devices. Use the viewport meta tag for an optimal user experience: <meta name="viewport" content="width=device-width, initial-scale=1">.

It’s important to carry out regular checks for responsive design, appropriate image use and resizing, AMP utilization, video implementation on mobile, popup/interstitial management, mobile navigation, and ease of mobile checkout. Also, ensure links and buttons are easily clickable and the Favicon is displayed in mobile SERPs. These checks will assure a seamless and user-friendly mobile browsing experience.

Check issues with Canonical Tags

Canonical tags are valuable in SEO, used to indicate to search engines the preferred version of a page URL for indexing. A missing self-referential canonical leaves search engines to pick the canonical themselves, while a canonical pointing at another URL signals that the page itself shouldn't be the indexed version. Canonical tags can be placed within HTTP headers or the HTML head, but remember, they're treated as hints by Google, not directives. Their primary role is to mitigate issues related to duplicate pages by signifying which version should be indexed in search results.

Best practices for canonical tags include:

  • Implement self-referential canonical tags on unique pages intended for indexing, signalling to search engines the preferred version (a minimal tag example follows this list). Verify that all canonicalised URLs return a 200 status code.
  • Utilise canonical tags on eCommerce sites' faceted/filter pages to minimise duplication and prevent filter pages from targeting similar terms to your category page. Check out my SEO guide for eCommerce sites or WooCommerce-based eCommerce sites here.
  • Avoid canonicalising to a redirected, non-indexable, or non-200 status code page as it may confuse search engines and lead to canonical chains.
  • Ensure pages within a paginated series have self-referential canonical tags. Incorrect implementation, common on eCommerce sites, often results in all paginated pages within the series canonicalising to the first page.
  • Check the canonicalisation of parametrised page URLs.
  • Avoid using multiple canonical tags on a page, which can send conflicting signals about the preferred version for indexing.
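
For reference, a self-referential canonical is a single line in the <head>, and the same hint can be sent as an HTTP header for non-HTML files; the URLs below are placeholders.

<link rel="canonical" href="https://www.example.com/women/dresses/maxi-dresses/" />
Link: <https://www.example.com/downloads/catalogue.pdf>; rel="canonical"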

To identify canonical tags:

  • Inspect the DOM or the unrendered HTML in Page Source, searching for “canonical”.
  • Utilise a crawler like Screaming Frog to review canonical tags across your site. This can help identify missing self-referential tags, tags pointing to another page, or incorrect tag implementation.
  • Use Google Search Console to inspect single URLs or review excluded URLs in the coverage report.

Check Issues with Pagination Implementation

Pagination, typically implemented via traditional numbered pages or a user-friendly ‘Load More’ button, helps structure your website’s content. While rel=prev/next markup was once crucial for indicating paginated series to Google, it’s no longer essential since Google’s 2019 announcement. Yet, if properly executed, it’s harmless to keep it. Incorrect implementation of pagination can cause spider/bot traps.

  • To spot incorrect pagination implementation, a crawler like Screaming Frog can be utilised. It can identify issues such as non-indexable paginated URLs or non-self-referential paginated URLs. Your paginated series should be indexable and self-canonicalised. It’s incorrect to canonicalise paginated URLs back to the first page, a mistake often made by website owners.
  • Sites employing the ‘Load More’ pagination often face the issue of using JavaScript without a crawlable <a href> link for bots, limiting their crawl beyond the first page. By inspecting the DOM of the ‘Load More’ button, you can check whether it contains real anchor links to the next page of the paginated series.
  • Infinite scroll is another pagination method used by some sites. These sites must ensure that their implementation supports paginated loading with unique links to each page for proper crawling. Check for these unique links by inspecting the DOM and searching for the next logical page link. Remember, links to the next & previous pages should be accessible to all users, ensuring easy crawling of all paginated pages for both bots and users with JS disabled.

Assess the website's pagination setup. Ensure pagination is implemented correctly, with the elements bots need to reach every page in the series, and check for indexable paginated pages and appropriate canonicalisation.
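
For the 'Load More' pattern mentioned above, a bot-friendly approach keeps a real, crawlable link in the button and lets JavaScript enhance it; a rough sketch with a placeholder URL and class name:

<a href="/women/dresses/?page=2" class="load-more">Load more</a>

JavaScript can intercept the click to fetch results in place, while bots and users without JS still reach page 2 through the href.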

Image Optimisation

While we've discussed image optimisation under page speed, it's also crucial to evaluate site-wide image usage. Consider factors like the total number of images used on a page, the presence of alt text, appropriate image file naming, and size optimisation. Be mindful of images exceeding 100 KB and the excessive use of stock images, as these can affect both speed and user experience.
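
As a reference point, a well-optimised image tag might look like the sketch below; the file name, alt text, and dimensions are illustrative.

<img src="/images/red-floral-maxi-dress.jpg" alt="Red floral maxi dress" width="600" height="800" loading="lazy">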

Validate Schema Markup

Implementing schema markup / structured data can enhance your site’s appearance in the SERPs. By defining what you would like to see for some elements in the structured data you can standardise the display of your brand in SERPs. Some benefits of implementing schema include displaying rich search results, rich cards (on mobile), knowledge graphs, breadcrumbs, carousels and more in SERPs. Common schema use cases include news articles or blog posts, product schema for your eCommerce product pages, breadcrumbs, recipes, reviews, events etc.

[Image: example of Review & FAQPage schema from TripAdvisor]

The popular schema markup formats are JSON-LD (Google's preference) and Microdata. Microdata uses HTML attributes spread throughout the page's markup, whereas JSON-LD is a structured JSON object injected into the page in one piece. Where possible, use JSON-LD as it's generally easier to implement and maintain. Google's structured data documentation includes examples of Product structured data using JSON-LD, RDFa, and Microdata.
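
A minimal JSON-LD Product sketch, with placeholder product details and URLs:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Red Floral Maxi Dress",
  "image": "https://www.example.com/images/red-floral-maxi-dress.jpg",
  "description": "Lightweight red floral maxi dress.",
  "sku": "DRESS-123",
  "offers": {
    "@type": "Offer",
    "url": "https://www.example.com/women/dresses/red-floral-maxi-dress/",
    "price": "59.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>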

Use the Rich Results Test (or the Schema Markup Validator, which replaced the old Structured Data Testing Tool) to validate that the markup is implemented correctly. This can help find and resolve technical issues, including missing required fields.

JavaScript

JavaScript, a lightweight programming language, is widely used for scripting UX elements and events. Since Googlebot’s 2019 switch to Chromium, rendering JavaScript has become less problematic. However, heavy reliance on JavaScript for content loading may still cause indexing delays due to the intensive rendering process.

How to Identify if JavaScript is causing an issue on your website?

  • Do your internal links or page content rely on JavaScript? To check this, disable JavaScript using a Chrome extension and observe any differences upon page reload.
  • Compare your page’s source code with the DOM, or the rendered page, to identify content dependent on JavaScript.
  • Execute two site crawls using a JavaScript-enabled crawler and a text-only rendering crawler (like the default in Screaming Frog), and note any discrepancies.
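
A crude but quick check of whether a given piece of content is present in the raw HTML (rather than injected by JavaScript) is to fetch the page without rendering and search for it; the URL and phrase below are placeholders, and a count of 0 suggests the content depends on rendering.

curl -s https://www.example.com/women/dresses/ | grep -ic "maxi dresses"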

How do we handle JavaScript-heavy sites to optimise for Googlebot?

To optimise JavaScript-heavy sites for Googlebot, pre-rendering or server-side rendering can be used: pages are rendered and cached on the server, then served to search engines as fully formed HTML.

The following checks are essential to ensure effective JavaScript use:

  • Does your website heavily rely on JavaScript?
  • Does the site function correctly without JavaScript, and does the primary navigation menu load when it’s disabled?
  • Can Google crawl, render, and index the executed JavaScript without issues?
  • Are there differences between the original source and rendered HTML, and what changes does JavaScript make on the webpage?

Log File Analysis

An integral part of a comprehensive technical SEO audit is regular server log analysis. This involves evaluating your server logs to gain deeper insights into search engine bot behaviour. Key areas to monitor include crawl volume, response code errors, crawl budget waste, last crawl date and frequency, and redirects. To ensure a comprehensive analysis, consider using the last 60 to 90 days' worth of data.
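
As a rough sketch, assuming the common Apache/nginx combined log format (where the status code is the ninth field) and a log file named access.log, you can tally Googlebot requests by status code as below; in practice, verify Googlebot hits via reverse DNS rather than trusting the user-agent string alone.

grep "Googlebot" access.log | awk '{print $9}' | sort | uniq -c | sort -rn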

Monitor Google Search Console

Google Search Console is a free, invaluable tool for site owners and SEOs. GSC includes 16 months of search traffic data along with key reports such as index coverage, server errors, sitemaps, speed (Core Web Vitals), links, and mobile usability, plus much more. These reports can help you monitor, troubleshoot, and fix site issues. In November 2020, Google released a new and improved version of the crawl stats report that you can use within Search Console to find issues.

Personally, on a day-to-day basis, I use Search Console for the following:

  • Analyse website search query impressions, clicks and position on Google search. Are clicks and impressions trending up, down, or staying steady over the last 16 months?
  • Resolve the problems flagged in the Page indexing section. Determine the number of pages that are not indexed and explore possibilities to resolve or index those pages. Observe if there's an uptick in "Soft 404s". Examine any issues with video indexing. Confirm that Google is using the same canonical on key pages as the one chosen by the website owner.
  • Monitor Sitemap issues.
  • Review index coverage reports.
  • Proactively fix site issues upon receiving alerts over email.
  • Use the URL inspection tool to analyse the indexation and crawling issues of your pages.

Fix what Google is telling you. Google has put together search console training videos on their YouTube channel to teach you how to monitor your site and make informed decisions to optimise your site’s search appearance on Google SERP. Don’t forget to connect your site to Bing Webmaster Tools as they have been revamping a lot of their offerings recently.

A few other things to watch out for

  • Review the faceted navigation of your eCommerce site for common issues. If not handled correctly, faceted navigation can cause duplication, eat up your crawl budget, and dilute your main pages' link equity by spreading it to low-value pages.
  • Perform Google searches using the 'site:' operator for your domain (site:domain.com), review the SERP listings, and look for issues.
  • Track down issues such as hacked pages, cloaking, blocked resources, doorway pages, hidden on-page content or links, improperly canonicalised pages, and unexpected robots.txt changes.
  • Ensure consistent Name, Address, and Phone number (NAP) information across your website and external sites, including Google My Business (GMB) listings, and make sure the GMB listings include the location for improved local search results.
  • Conduct a Google cache analysis and a crawl stats breakdown. Are there any host status problems reported in Search Console crawl stats?
  • Be wary of JavaScript redirections and the usage of Flash and iFrames.
  • Lastly, ensure to keep a check on your site’s HTML usage:
    • Look out for deprecated HTML tags and validate your HTML.
    • Assess your site’s accessibility and see how JavaScript is being used.
    • Confirm if your CSS is minified and limit the use of inline CSS.
    • Evaluate your site’s ad placements and ensure there isn’t an overuse of ads.
    • Check if your site bombards visitors with pop-ups and interrupts the user-friendly experience.

My Favourite Free & Paid Technical SEO Tools

No time to deal with Technical SEO? I provide technical SEO audit services. As a Freelance Technical SEO consultant, my job is to crawl and gather lots of data, interpret the data and provide actionable recommendations. Contact me today to discuss how I could help.
