Is your website fast, secure, and easy to read? That's what a technical SEO audit boils down to. While technical SEO can quickly get complex, once you look past the jargon there are some simple steps to make sure you're nailing the basics. We're here to help with our seven-step technical SEO audit checklist, designed for business owners with no prior SEO knowledge.
Take Me Straight to the Audit!
What is Technical SEO?
Before we break into the technical SEO audit, let’s lay down some definitions. What is technical SEO, and what’s the point of an audit?
Well, technical SEO focuses on the technical foundations of your site rather than its content. Many elements affect a website's technical health, and together they can make or break your rankings: technical SEO largely determines how easily Google's robots can understand your content. Search engines "understand" your website by crawling, indexing and rendering it.
The goal of a technical SEO audit is to make your website work more efficiently for both users and search engines. Technical SEO falls into a category of SEO strategies known as on-page SEO: practices applied on your own site to improve its rankings in search results.
The key pillars of technical SEO (site speed, site security and content clarity) form a great bedrock on which to build your understanding, and your audit.
Why Do a Technical SEO Audit?
A technical SEO audit doesn’t guarantee a number one ranking. However, ignoring this crucial aspect of SEO will certainly affect your performance in search engine rankings for a variety of reasons.
Technical SEO audits can be as big or small as you like, and come with a number of additional benefits. A few small changes could be the difference between your ranking and that of your competitors.
Technical SEO indicates value
Google aims to deliver relevant content to the right user, at the right time. If Google's search results fail to meet the searcher's expectations, that user may take their search to a different search engine, which is bad news for Google. For this reason, Google rewards webmasters who provide its users with a good experience and play by its rules, and adhering to technical SEO requirements is one key indicator of value.
Technical SEO improves your UX
User experience (UX) plays a critical role in your performance, both on search engines and in the minds of individual users. If you don't pay attention to technical SEO, your site will likely be slow, buggy and generally frustrating to use, leading to a poor user experience. Not only will Google penalise you for this, for the reasons mentioned above; you'll also see fewer conversions, a higher bounce rate and fewer visitors.
Tools You’ll Need for Your Technical SEO Audit
With these free tools, you can easily complete your technical SEO audit.
- Screaming Frog – free up to 500 URLs
- Google Search Console
- Rank Math WordPress Plugin
- PageSpeed Insights
Technical SEO Audit Checklist
This on-page SEO audit checklist covers all the basic technical elements you'll need to complete your first website audit.
1. Make your site indexable and crawlable
With a clear internal linking structure and sitemap, search engines can understand the most important content on your website with greater ease. However, one small error could prevent Google from crawling your best content. That's why it's important to ensure that your site is indexable and crawlable.
Search engine robots start with a file known as robots.txt. This file is the first reference point for search engines crawling your site, as it tells them which pages to visit and which to ignore. Robots.txt controls crawling only; to tell Google what to display in search results or which links to follow, you add robots meta tags to the pages themselves. Together, these directives are a powerful aspect of technical SEO and should be treated with care.
You can use these robots directives to:
- Block certain site pages from being crawled (robots.txt).
- Have a page crawled but kept out of search results (a "noindex" meta tag).
- Have a page crawled without Google following the links on the page (a "nofollow" meta tag).
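As a concrete sketch, here's what a minimal robots.txt for a hypothetical WordPress site might look like (the domain and paths are placeholders, not recommendations for your site):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```

The Disallow line blocks crawlers from the admin area, while the Allow line carves out an exception for a file WordPress themes commonly need.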
How to optimise your robots.txt file
The robots.txt file is available to the public; you can access it just by adding "/robots.txt" to the end of any root domain (for example, example.com/robots.txt).
You can test any URL on your site and edit your robots.txt file in Google Search Console. We’d recommend adding your sitemap to your robots.txt file, as it helps search engines discover and crawl the pages on your site.
Search Console can also tell you if your site has a high crawl rate (in other words, whether your site is regularly visited by bots and is crawled quickly and easily), and whether your robots.txt file has any warnings or errors.
Address 404 pages
We've all ended up on a 404 page (a page that doesn't exist). It's a frustrating experience for users and search engine bots alike, and bots generally find many more dead links than users do, because they automatically follow every link they encounter, including hidden ones.
Most sites have a few hidden links and dead links; it's bound to happen when you're consistently updating a website. To fix this, always redirect the URLs of deleted or moved pages (ideally with a 301 redirect) to the most relevant live page.
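To find dead links without clicking through every page, you can script a simple status check. Below is a minimal sketch using only Python's standard library; in practice you'd feed in a URL list exported from a crawler such as Screaming Frog, and the example URL is a placeholder:

```python
from urllib import request, error

def status_of(url: str) -> int:
    """Return the HTTP status code for a URL using a HEAD request."""
    req = request.Request(url, method="HEAD")
    try:
        with request.urlopen(req, timeout=10) as resp:
            return resp.status
    except error.HTTPError as exc:
        # urllib raises on 4xx/5xx responses; the code is what we want.
        return exc.code

def needs_attention(status: int) -> bool:
    """404 (not found) and 410 (gone) pages should be redirected or removed."""
    return status in (404, 410)

# Example (requires a network connection):
#   if needs_attention(status_of("https://example.com/old-page")):
#       print("add a 301 redirect")
```

A HEAD request fetches only the response headers, so checking a long URL list stays fast.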
Inspect your URLs
With a URL inspection tool, you can check whether new pages are indexed and troubleshoot any lost traffic to important pages on your site. For example, the coverage report in Google Search Console allows you to view the status of each page on your site. Some statuses to look out for include:
- Errors—404s and redirects.
- Valid with warnings—indexed pages with warnings attached.
- Valid—successfully indexed pages.
- Excluded—pages that were not indexed, and why (e.g. redirected or blocked by robots.txt).
Run a crawl
With Screaming Frog, you’ll be able to see two columns after a crawl which can help you check your site’s indexing:
- The indexability column tells you whether the URL is indexable or non-indexable.
- Indexability status shows why the URL is non-indexable.
Screaming Frog is a great method for bulk auditing your site, because it helps you understand which pages are indexable (will appear in search results).
A simple Google search can tell you how much of your site is indexed: just search "site:yourdomain". The results page will show you every page of your website that has been indexed, as well as how many pages Google currently has stored. If there's a big difference between this number and the number of pages your site actually has, your indexing may need fixing.
2. Secure your website
An SSL (Secure Sockets Layer) certificate is a must for any website. Users need to know that their data and credentials are secure in order to trust you and your site. By installing an SSL certificate, you can improve both the security of your website and its performance in search rankings.
When you purchase an SSL certificate, you can also implement HTTPS. HTTPS ensures that the information exchanged between the browser and your site is secure. HTTPS is a Google ranking signal, meaning secure websites rank higher than sites without HTTPS.
To check if your website has HTTPS, look at the left side of your browser's address bar. If your site is secure, you'll see a padlock icon.
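If you want to check this programmatically, say across several domains, a short script can confirm a URL uses HTTPS and report how long a site's certificate remains valid. This is a sketch using Python's standard library; the example domain is a placeholder:

```python
import socket
import ssl
from datetime import datetime
from urllib.parse import urlparse

def uses_https(url: str) -> bool:
    """True if the URL's scheme is https."""
    return urlparse(url).scheme == "https"

def days_until_cert_expiry(hostname: str, port: int = 443) -> int:
    """Fetch the site's certificate and count the days until it expires
    (requires a network connection)."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return (expires - datetime.utcnow()).days

# Example (requires a network connection):
#   days_until_cert_expiry("example.com")
```

An expired certificate triggers a browser security warning, so checking expiry ahead of time is worth the few lines of code.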
3. Create a dynamically generated sitemap
A sitemap is a comprehensive list of pages which serve as a roadmap for bots finding and indexing pages on your site. XML sitemaps ensure that bots won’t miss your most important pages, and usually include categorised posts, pages, tags or custom post types, as well as the number of images and the most recent modification date.
Sitemaps are easy to create and can pay off, especially if your site’s linking structure needs work. If your site architecture is in good shape, robots won’t necessarily need a sitemap; regardless, it’s always best to submit your sitemap to Google Search Console and Bing Webmaster Tools.
In creating your sitemap, it’s a good idea to consider the following:
- Structure your sitemap in an XML document, and follow XML sitemap protocol.
- Only include canonical versions of URLs, and include new or updated pages.
- Don't include "noindex" pages (pages you don't want Google to index).
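Following the XML sitemap protocol from sitemaps.org, a minimal sitemap looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/services/</loc>
    <lastmod>2021-05-20</lastmod>
  </url>
</urlset>
```

Each `<url>` entry holds one canonical page location, with an optional `<lastmod>` date that tells crawlers when the page last changed.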
A sitemap should refresh every time a change is made to your site. To make things easy, WordPress plugins such as Rank Math can generate a dynamic sitemap for you. Similarly, Screaming Frog can analyse your sitemap in detail, comparing the URLs it lists against your crawl to surface missing and orphan pages.
4. Make it mobile-friendly
Most users browse on their smartphones and as a result, Google prioritises sites optimised for mobile devices. In fact, instead of using desktop versions for ranking and indexing, Google first looks at your mobile page. Much of your competition is likely already optimising for mobile—so taking care of this will ensure you don’t miss out on any traffic!
Google offers free tools to check your site's mobile compatibility. With the Mobile-Friendly Test, just input your domain and it shows you what your page looks like on mobile and whether it requires further mobile optimisation. In addition, Google's Test My Site tool can tell you:
- Details on your site speed over 3G and 4G connections.
- Customised recommendations for individual page fixes.
- How to benchmark your site speed compared to your competitors.
- The impact your site speed has on your revenue.
It’s always a good idea to check your site with your own phone. As you’re navigating the website, look out for any errors which could be blocking conversions. This includes contact details and key service pages.
5. Check your site’s load speed
Site speed is a big part of technical SEO, even more so now that Google has implemented Core Web Vitals as a ranking factor. More than ever, a website needs to load quickly in order to rank and minimise bounce rates. Page speed also affects how quickly search engines can crawl your site, so it's worth making your code as efficient as possible with the help of online tools. Here are some small but key steps you can take.
First, test your site speed using Google’s PageSpeed Insights. It scores your website either “fast”, “average” or “slow” for both mobile and desktop versions, and makes suggestions on how you can improve. If nothing else, test your homepage and key content pages.
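PageSpeed Insights also has a public API (v5), which is handy if you want to test many pages in one go. This sketch builds the request URL; the fetch itself is shown as a comment because it needs a network connection, and the example page is a placeholder:

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page: str, strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights v5 API request URL for a page."""
    return f"{PSI_ENDPOINT}?{urlencode({'url': page, 'strategy': strategy})}"

# To run the audit (requires a network connection):
#   import json, urllib.request
#   report = json.load(urllib.request.urlopen(psi_request_url("https://example.com/")))
#   score = report["lighthouseResult"]["categories"]["performance"]["score"]  # 0.0-1.0
```

Call it once with `strategy="mobile"` and once with `strategy="desktop"` to mirror the two scores shown on the PageSpeed Insights website.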
Google Analytics runs detailed reports on site speed, and can be accessed in Behaviour > Site Speed. This section has a range of useful insights: namely, it shows you how certain pages perform in terms of site speed based on browsers and countries. To make sure you’re prioritising optimisations on the right pages, compare this data against your page views.
6. Remove duplicate content
If you have the same content on multiple pages of your website, this is known as duplicate content. This can result in confusion for search engines, as they don’t know which page to rank higher. In turn, this can cause Google to rank all website pages with duplicate content lower.
Duplicate content can happen more easily than you think—sometimes for technical reasons. The user may not notice, but Google bots will; in crawling your site, they’ll be quick to identify the same content with the URL being the only difference.
You can quickly check for duplicate content with a simple Google search: search "site:yourdomain" and click through to the final page of results. If Google notes that it has omitted some entries very similar to those already shown, duplicate content may be the culprit; run a crawl with Screaming Frog and sort the results by Page Title to spot pages sharing the same title.
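If you'd rather check in bulk, a small script can fingerprint each page's text and group exact duplicates. A sketch, assuming you've already extracted the body text of each page (the example pages are made up):

```python
import hashlib
from collections import defaultdict

def normalise(text: str) -> str:
    """Lowercase and collapse whitespace, so trivial differences don't hide duplicates."""
    return " ".join(text.lower().split())

def find_duplicates(pages: dict) -> list:
    """Group URLs whose body text is identical after normalisation.

    `pages` maps URL -> extracted body text; returns lists of duplicate URLs.
    """
    groups = defaultdict(list)
    for url, body in pages.items():
        digest = hashlib.sha256(normalise(body).encode()).hexdigest()
        groups[digest].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]
```

Hashing only catches exact (post-normalisation) duplicates; near-duplicates still need a crawler or manual review to spot.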
7. Optimise page titles and H1s
Every page on your website should have a unique page title, meta description and H1 heading. These should be optimised for the keywords you aim to rank for, so check that none of them are missing, duplicated or in need of optimisation.
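Checking this by hand gets tedious on a large site. As a simplified sketch, here's how you might flag a page with a missing title or the wrong number of H1s using Python's built-in HTML parser (fetching the pages is assumed to happen elsewhere):

```python
from html.parser import HTMLParser

class TitleH1Checker(HTMLParser):
    """Collect <title> and <h1> text from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.title, self.h1s, self._current = "", [], None

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1s.append(data.strip())

def audit(html: str) -> list:
    """Return a list of title/H1 issues found in the HTML (empty = OK)."""
    checker = TitleH1Checker()
    checker.feed(html)
    issues = []
    if not checker.title.strip():
        issues.append("missing <title>")
    if len(checker.h1s) != 1:
        issues.append(f"expected one <h1>, found {len(checker.h1s)}")
    return issues
```

Run `audit()` over each page from your crawl export, and any page returning a non-empty list needs attention.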
These are just a few technical SEO audit steps you can take to secure greater visibility and higher rankings. Technical SEO is made up of many elements, and sometimes the assistance of an expert can help you solve issues much faster. And you'll learn a thing or two yourself! For more information on how to perform a technical SEO audit, contact the SEO team at Vine Digital today.
Glossary
Crawling
The method used by search engines to discover and understand the pages on your website.
Indexing
The process of storing and organising content found on pages which have been crawled.
Rendering
Search engines use this process to turn your site's code into viewable web pages.
User Experience (UX)
The user’s experience of interacting with your website.
Sitemap
A list (or map) of URLs for crawlers to follow when they arrive on your website. A sitemap helps search engines understand and index your site content.
Robots.txt
A file outlining which pages on your website search engines should crawl, and which they should avoid.
SSL (Secure Sockets Layer) certificate
An SSL certificate signals to search engines that your website is safe. A Secure Sockets Layer encrypts the data exchanged between the server and the user's web browser.
Site architecture
Your website's structure. Ideally, similar content should be grouped together by theme and presented in your site's navigation menu.
Canonical tag
A tag which allows you to tell search engines which page is the original, and which URLs are duplicates.
Page title
An HTML element which shows the title of your webpage in search results, and in the browser tab.
Meta description
Meta descriptions describe the content on a particular page of your website. These are HTML elements, and often appear in search result pages.
Optimisation
The process of ensuring pages are aligned with technical SEO best practices.