How to conduct a technical SEO audit

April 25, 2023

Performing a technical SEO audit can be overwhelming, but it’s one of those things that just has to be done. If you don’t keep on top of the technical side of your website, you’ll struggle to compete in the SERPs, and miss out on valuable website traffic. 

In this post, we explain what a technical SEO audit is and break the process down into ten manageable steps. 

What is a technical SEO audit?

A technical SEO audit is an in-depth analysis of a website’s technical aspects. The goal is to identify and prioritise areas that require optimisation to improve search engine visibility, user experience, and overall website performance.

Technical SEO: the basics

A technical SEO audit can be broken down into five key areas:

  1. Site architecture and structure: analysing how the website is organised, the hierarchy of pages, and the URLs used. The goal is to ensure that the site is easy to navigate, and the site structure supports crawling and indexing by search engines.
  2. Indexability and crawlability: examining the site’s ability to be crawled and indexed by search engines, identifying and removing any technical barriers that prevent search engine crawlers from accessing and indexing content.
  3. Page speed and performance: evaluating how quickly the website loads, identifying any technical issues that might be slowing it down, and making recommendations to improve site speed and user experience.
  4. Mobile-friendliness: examining the website’s responsiveness and ensuring that it is optimised for mobile devices.
  5. Technical SEO best practices: analysing how well the website adheres to technical SEO best practices, such as schema markup, canonicalisation, and redirects.

Technical SEO tools

To get started, you’ll need access to the right tools. We’d recommend signing up for a specialist SEO tool, like SEMrush or Ahrefs. These tools will scan your website to check how many URLs there are, how many of them are indexable, how many are not and how many have issues. From here you can create an in-depth report which can be used to help you identify and fix any issues that are negatively affecting your site’s performance.

In addition, Google Search Console will come in handy for certain parts of your technical SEO audit, especially more advanced issues that require further investigation. 

In order to run a site audit, you’ll need to make sure your website is connected to your SEO tool of choice – usually a website will be referred to as a ‘project’ in your dashboard. The simplest way to go about this is via Google Search Console, but you’ll likely need to verify your ownership by adding a DNS record or HTML file.
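
If you go down the DNS route, the verification record is typically a single TXT entry on your domain. The sketch below is purely illustrative – the token is a placeholder that Search Console generates for you:

    ; Hypothetical DNS zone-file entry for Search Console verification.
    ; The token value is a placeholder supplied by Search Console.
    domain.com.  3600  IN  TXT  "google-site-verification=EXAMPLE-TOKEN"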

1. Crawl your website

The first step is to crawl the website to identify technical issues.

Once you’ve got your report, you can easily see top-level issues, the overall health of your website and start identifying the issues that are hindering performance. The report will be split into sections according to specific technical elements, along with the number of pages affected. This includes title tags, meta descriptions, header tags, keyword usage and low word count. 

Issues you identify may fall into two categories: issues you can fix yourself, and issues you will need a developer to fix for you.

2. Identify crawlability and indexation issues

Search engines must be able to crawl and index your web pages in order to rank them. It’s therefore imperative that you identify and fix any issue relating to crawlability and indexation.

Many technical SEO issues are linked to crawlability and indexability in one way or another. You’ll need to take a look at two very important website files – robots.txt and sitemap.xml. These files have a huge impact on how search engines discover your site.

Checking for robots.txt issues: 

The robots.txt file is a plain text file that tells search engines which pages they should or should not crawl. You can usually find it in the root folder of the site: https://domain.com/robots.txt.

The robots.txt file can:

  • Point search engine bots away from private folders
  • Prevent bots from overwhelming server resources
  • Specify the location of your sitemap 

It only takes one line of code in the robots.txt file to prevent search engines from crawling your entire site. You need to make sure that your robots.txt file doesn’t disallow any folder or page that you want to appear in search results.  
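
To make this concrete, here is an illustrative robots.txt showing both the one line that would block an entire site and a safer setup (the folder paths are hypothetical):

    # DANGEROUS: this pair of lines would block crawling of the whole site:
    #   User-agent: *
    #   Disallow: /

    # A safer, hypothetical configuration: block only private areas
    # and point crawlers at the sitemap
    User-agent: *
    Disallow: /admin/
    Disallow: /cart/
    Sitemap: https://domain.com/sitemap.xml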

Checking for sitemap.xml issues:

The sitemap is a file that lists all of the pages that you want search engines to index (basically the pages you want to rank). 

You therefore need to make sure that important pages aren’t missing from the sitemap, and on the other hand, that it does not include pages you do not want to appear in the SERPs (such as login pages, gated content, etc) or that may hinder your ability to rank highly due to issues like duplication. A technical SEO audit will show any specific issues relating to the sitemap, such as formatting errors, incorrect pages and orphaned pages (more on these later).
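
For reference, a minimal sitemap.xml follows the standard sitemaps.org protocol. This sketch uses a hypothetical URL:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- list only the canonical, indexable pages you want to rank -->
      <url>
        <loc>https://domain.com/makeup/lips/lipstick</loc>
        <lastmod>2023-04-25</lastmod>
      </url>
    </urlset>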

3. Audit your website architecture

Site architecture helps users navigate your website. Put simply, it’s the structure of your site. It refers to the hierarchy of your web pages and how they are connected. When organising your website, you should consider the most logical and simple user journey, and it should be easy to maintain as your site grows.

In order to effectively crawl your website, the search engine needs to understand the relationships between your pages. You can split website architecture into three key areas:

Hierarchy:

A technical SEO audit will give an overview of your website’s subdomains and subfolders. This makes it easier to review them to ensure the hierarchy is logical and well-organised. Ideally, a user should be able to reach their desired page in no more than three clicks. This is for two reasons. The first is that it simply offers a better user experience. The second is that search engines consider pages deeper in the website’s hierarchy to be less important or relevant, so these pages are far less likely to rank highly in the SERPs. 

Navigation:

Another pillar of good website architecture is navigation. Navigation refers to things like menus, footer links and breadcrumbs (a trail of text links). Site navigation should reflect the hierarchy of your pages and avoid non-standard names for menu items. We’d also recommend avoiding mega menus to keep things simple and streamlined. Unfortunately, an SEO tool can’t review a site menu, so you’ll need to take some time to do this manually, following best practices for UX navigation. 

URL structure:

URL structure should be consistent and easy to follow. For example, if a website visitor wanted to find lipstick, they might follow the menu navigation below:

Homepage > Makeup > Lips > Lipstick

A technical SEO audit will commonly highlight URLs that (see the examples after this list):

  • Contain underscores
  • Have too many parameters 
  • Are above the recommended length 
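
To illustrate, compare a hypothetical problematic URL with a cleaner equivalent:

    Problematic (underscores, excess parameters, overly long):
    https://domain.com/make_up/lip_stick?cat=3&sort=price&sess=9f2a&ref=home

    Cleaner equivalent, mirroring the site hierarchy:
    https://domain.com/makeup/lips/lipstick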

4. Fix internal linking issues

An internal link is a link from one page on your website to another. An essential part of good website architecture, internal links distribute equity (also known as “link juice” or “authority”) and direct search engines to important pages. Internal linking issues are likely to arise if you’ve made changes to your site architecture and navigation, but the majority are easy to fix.

Two of the most common internal linking issues are:

Broken links – an existing link that points to a page that no longer exists. A technical SEO audit will show you how many broken links exist on your site and which pages they can be found on.

Orphaned pages – pages that don’t have any links pointing to them. This means they cannot be accessed via any other part of the website. An audit will identify any pages with zero internal links, so you can go and add at least one to ensure these pages are crawled.  
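
Fixing an orphaned page can be as simple as adding one descriptive link from a relevant page. A minimal HTML sketch, using a hypothetical URL:

    <!-- a descriptive internal link from a related page gives an
         otherwise orphaned page a crawl path -->
    <a href="/makeup/lips/lipstick">Browse our lipstick range</a>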

5. Identify duplicate content

When we talk about duplicate content issues, we’re referring to multiple web pages that contain identical or nearly identical content. Websites sometimes end up with duplicate pages, or the same copy is reused on more than one page because it seems convenient at the time.

But duplicate content can lead to a number of problems. An incorrect version of a page may display in the SERPs, or pages may perform poorly due to indexing issues. Duplicate content is commonly caused by one of two factors: multiple versions of the same URL, or pages with different URL parameters.

Multiple versions of a URL

For example, a site may have an HTTP version, an HTTPS version, a www version and a non-www version. Google views these as different versions of the same site, so if your page is accessible on more than one of these URLs, it will be considered a duplicate. This issue can be fixed by setting up a sitewide 301 redirect, which will ensure only one version of your pages is accessible. 
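
As an illustration, here is a hypothetical nginx configuration that consolidates all four versions onto the HTTPS www domain; the same result can be achieved with an Apache .htaccess file or via your CMS:

    # Hypothetical nginx config: send every HTTP and non-www request
    # to a single HTTPS www version with a permanent (301) redirect
    server {
        listen 80;
        server_name domain.com www.domain.com;
        return 301 https://www.domain.com$request_uri;
    }

    server {
        listen 443 ssl;
        server_name domain.com;
        # ssl_certificate / ssl_certificate_key directives omitted
        return 301 https://www.domain.com$request_uri;
    }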

URL parameters 

URL parameters are the elements of a URL used to filter or sort website content. They’re commonly found on ecommerce sites, as they are used to differentiate product pages with slight differences, for example, colour variations of the same product. You can easily spot them as they contain a question mark and an equals sign. These URLs are often identified as duplicates because they have very similar content to their counterparts without parameters. You can reduce potential problems by removing unnecessary parameters, or by using canonical tags pointing to the URLs with no parameters. 
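
For example (hypothetical URLs):

    Parameterised variant (note the question mark and equals sign):
    https://domain.com/makeup/lips/lipstick?colour=red

    Parameter-free counterpart the canonical tag should point to:
    https://domain.com/makeup/lips/lipstick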

6. Audit site performance 

Site speed has long been a Google Ranking Factor, so it’s something you’ll need to pay close attention to. This is because it’s an important part of the overall page experience – Google isn’t going to send users to a page that takes a long time to load. 

You should pay attention to:

Page speed: how long it takes one webpage to load

Site speed: the average page speed for a sample set of page views on a site 

The two are intertwined, as improving page speed will also improve your site speed. Google’s PageSpeed Insights tool can help you identify the causes of slow page speed so you can work to fix them. However, this tool only analyses one URL at a time, so we’d recommend getting a sitewide view using either Google Search Console or one of the auditing tools we mentioned earlier, which will show you which pages have a slow load speed and why.
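
If you want to script these checks, PageSpeed Insights also exposes a public API. Below is a minimal Python sketch, assuming the third-party requests package; the URL is a placeholder, and an API key is optional for occasional, low-volume use:

    # Minimal sketch using the public PageSpeed Insights v5 API
    import requests

    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

    def performance_score(url, strategy="mobile"):
        """Return the Lighthouse performance score (0-1) for one URL."""
        resp = requests.get(API, params={"url": url, "strategy": strategy})
        resp.raise_for_status()
        data = resp.json()
        return data["lighthouseResult"]["categories"]["performance"]["score"]

    print(performance_score("https://domain.com/"))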

7. Evaluate mobile responsiveness

As of February 2023, 59.4% of web traffic happens on a mobile device. It’s essential to evaluate your website’s mobile responsiveness to ensure it’s mobile-friendly and meets Google’s criteria. 

Google primarily indexes the mobile version of all websites, rather than the desktop version (known as mobile-first indexing). You can use Google Search Console’s ‘Mobile Usability’ report to see your pages divided into two simple categories – ‘Not Usable’ and ‘Usable.’ This handy tool will also list all of the detected issues, for example, if the content is wider than the screen, or the text is too small to be considered readable. Here you can access detailed guides on how to fix these issues.

8. Identify and fix code issues

A search engine sees a webpage as a piece of code. It’s important to use the correct syntax, with the relevant tags and attributes that help search engines understand your website.

This is where technical SEO and web development go hand-in-hand. You must pay close attention to several parts of your website code and markup: HTML, JavaScript and structured data. 

Meta tags

Meta tags are snippets of text in your HTML code, placed in the page’s header, that describe a page’s content and provide search engine bots with additional data. 

We’ve already touched on robots directives via the robots.txt file, but there are a few other meta tags to be aware of:

Viewport meta tag:

This ties in with mobile responsiveness. The viewport meta tag helps you scale your page to different screen sizes, automatically adapting the page size to the user’s device (as long as you have a responsive design). 
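
It looks like this, placed in the page’s <head>:

    <!-- standard viewport meta tag; scales the layout to the device -->
    <meta name="viewport" content="width=device-width, initial-scale=1">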

Title tag: 

This one’s fairly self-explanatory. Title tags indicate the title of a page, and are used by search engines to form the blue link that you would click on in the SERPs. 

Meta description: 

Search engines use these brief descriptions of a page to form the snippet of text you see below each link in the search results. While meta descriptions are not directly tied to Google’s ranking algorithm, there are still other SEO benefits to optimising them. A technical SEO audit will highlight any issues relating to meta tags, from duplicate title tags and meta descriptions to pages that are missing them altogether. 
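
For illustration, the title tag and meta description sit together in the page’s <head>; the copy below is hypothetical:

    <head>
      <title>Lipstick | Lips | Brand Name</title>
      <meta name="description" content="Shop our range of long-lasting lipsticks, available in over 30 shades.">
    </head>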

Canonical tags:

Canonical tags point to the main copy of a page. They tell search engines which page needs to be indexed when there are multiple pages with duplicate or similar content. They are placed in the <head> section of a page’s code, pointing to the “canonical” version. Issues you’re likely to come across are pages with no canonical tags, multiple canonical tags or broken canonical tags. 
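
A correctly implemented canonical tag looks like this (hypothetical URL):

    <!-- in the <head> of every duplicate or parameterised variant -->
    <link rel="canonical" href="https://domain.com/makeup/lips/lipstick">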

Hreflang attributes: 

The hreflang attribute denotes the target region and language of a page, helping search engines serve the right variation of a page, based on the user’s location and language preferences. Naturally, this is important for businesses that operate in multiple countries. You’ll need to use the hreflang attribute in <link> tags. 
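
A typical set of hreflang tags, using hypothetical URLs, looks like this:

    <!-- each language/region variant should reference itself and its siblings -->
    <link rel="alternate" hreflang="en-gb" href="https://domain.com/uk/">
    <link rel="alternate" hreflang="en-us" href="https://domain.com/us/">
    <link rel="alternate" hreflang="x-default" href="https://domain.com/">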

Hreflang attributes are one of the more complicated aspects of technical SEO, so you may need a Technical SEO specialist to help fix any issues relating to them.

JavaScript: 

JavaScript is a programming language used to create the interactive elements of a webpage. Search engines need to render a page’s JavaScript files to see it fully; if the search engine can’t get the files to render, the page will not be indexed properly. An SEO auditing tool will detect issues relating to your site’s JavaScript files, such as broken files, uncompressed files and slow loading speeds caused by JavaScript. 

You can use Google Search Console’s URL Inspection Tool to check how Google renders a page that uses JavaScript.

Structured Data: 

This refers to a standardised formatting for providing information about a page and classifying the page content. It’s organised in a specific code format, often referred to as markup, that shows the search engine what type of content is on the page so it can be indexed and categorised correctly. 

One of the most widely used shared collections of markup among web developers is schema.org. Using schema can make it easier for search engines to index and categorise your pages correctly, and to capture SERP features, also referred to as rich results. SERP features are special types of search results that stand out due to their differing format, and the fact that they tend to take up more space on the results page. Examples of SERP features include featured snippets, reviews and FAQs.
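
As a brief illustration, structured data is commonly added as a JSON-LD script in the page’s <head>. This hypothetical FAQ markup uses schema.org’s FAQPage type:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "FAQPage",
      "mainEntity": [{
        "@type": "Question",
        "name": "What is a technical SEO audit?",
        "acceptedAnswer": {
          "@type": "Answer",
          "text": "An in-depth analysis of a website's technical aspects."
        }
      }]
    }
    </script>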

You can use Google’s Rich Results Test tool to see whether a page supports rich results. It will also list any issues relating to specific structured data items, along with a link to Google support on how to fix the issues. SEO auditing tools may also offer insight into structured data issues, for example SEMrush’s Markup report provides an overview of all of the structured data types your site uses, plus a list of all invalid items. 

9. Identify and fix HTTPS issues

All websites should be using the HTTPS protocol. HTTP is not encrypted, meaning the site does not run on a secure server that uses a security certificate (called an SSL certificate) from a third party. This certificate proves that the website is trustworthy – your website users will know whether HTTPS is in use because there will be a padlock icon next to the URL in the web browser.

HTTPS is a confirmed ranking signal for search engines. While implementing HTTPS is a fairly straightforward process, it can cause issues that will come to light when an SEO audit is performed, so you’ll need to check for this. A technical SEO audit will bring up a list of issues connected to HTTPS and the pages affected. Common issues include an expired certificate, an old security protocol version, no server name indication and mixed content. 

10. Find and fix problematic status codes

An HTTP status code is a message sent by a website’s server to the browser, indicating whether or not a request can be fulfilled and whether a page can be loaded. The status code is returned in the HTTP response to tell the browser the result of its request. 

There are five classes of HTTP status code, but you don’t need to worry about 1XX and 2XX statuses. When you perform a site audit, look out for 3XX, 4XX and 5XX statuses. 

3XX – indicates redirects. Pages with this status code are not always problematic, but it’s a good idea to check they are being used correctly in order to avoid any problems. A site audit will detect any issues with your website’s redirects and flag any related issues.

4XX – You’ve probably come across the most common 4XX error, which is the 404 error. 4XX errors indicate that a requested page cannot be accessed. If this type of status code is detected, you’ll need to remove all of the internal links that point to the affected pages. 

5XX – this type of status code indicates that the server could not fulfil a request. This can occur for a number of reasons, from the server being temporarily down to a server overload. If this issue is highlighted during an audit, you’ll need to investigate why these errors occurred and fix them where possible. 
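
If you want to spot-check a handful of URLs yourself, a short script will do it. A minimal Python sketch, assuming the third-party requests package and placeholder URLs:

    # Print the status code of each URL so 3XX, 4XX and 5XX responses
    # can be investigated. The URLs below are placeholders.
    import requests

    urls = [
        "https://domain.com/",
        "https://domain.com/makeup/lips/lipstick",
    ]

    for url in urls:
        try:
            # allow_redirects=False surfaces 3XX codes instead of following them
            resp = requests.head(url, allow_redirects=False, timeout=10)
            print(resp.status_code, url)
        except requests.RequestException as exc:
            print("ERROR", url, exc)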

And that’s a wrap

Performing a technical SEO audit can seem like a daunting task, so breaking the process down into manageable steps makes it a lot easier to navigate. Of course, this isn’t an exhaustive list of the technical SEO issues that may arise during an audit (more advanced areas include log file analysis and complex website architecture), but if you’re a beginner, these ten steps are a great starting point. 

For a full breakdown of how to conduct a technical SEO audit using SEMrush, click here, and for Ahrefs click here.

If you feel like your business would benefit from the expertise of a technical SEO specialist, get in touch and see how we can help.