10-Step Technical SEO Checklist for Finding Fixes That Can Quickly Move the Needle

How many times have you come across guides like these?

There’s nothing wrong with them, but often they just contain too much. Every little detail has been covered. 

Even so-called “essential” technical SEO checklists can have 30+ points.

If you’re constrained by a small budget, or simply struggling for time, how do you decide which technical SEO checks are most important?

Well, here comes the classic SEO phrase…

“It depends.” 

Covering everything in an audit often results in very little being done – if anything at all – especially if you’re relying on someone else to implement your suggestions. 

In my experience, prioritising technical SEO comes down to a combination of two things. 

  1. What can I actually change?
  2. What’s going to have the biggest impact? 

Here are 10 technical SEO basics that are well worth prioritising when you’re strapped for time.

1. Check title tags

All crawling tools – Screaming Frog, Sitebulb, DeepCrawl – will flag duplicate title tags, as well as title tags that are too short, too long, or non-existent. 

But that doesn’t always mean it’s necessary to take action.

Firstly, check whether the affected pages are indexable. You can ignore those that are intentionally kept out of the index.

Next, look at where they live within the site.

Ecommerce sites with near-identical products often end up with duplicate title tags, and no one would want to (or should) go through the process of remedying this by hand. 

Instead, write a rule for populating product title tags automatically. 
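As a rough illustration, here’s a minimal Python sketch of that kind of rule. The field names (brand, name, colour), site name, and length limit are all hypothetical – in practice this logic would live in your platform’s templating layer:

```python
# Hypothetical rule for generating product title tags from structured fields.
# Field names (brand, name, colour) and the site name are illustrative only.

def build_title(product: dict, site_name: str = "Example Store", max_length: int = 60) -> str:
    parts = [product.get("brand"), product.get("name"), product.get("colour")]
    title = " ".join(p for p in parts if p)
    full = f"{title} | {site_name}"
    # If the branded version is too long, fall back to the product details alone
    # rather than letting it truncate mid-word in the SERP.
    return full if len(full) <= max_length else title[:max_length].rstrip()

print(build_title({"brand": "Acme", "name": "Trail Running Shoes", "colour": "Blue"}))
# -> Acme Trail Running Shoes Blue | Example Store
```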

Alternatively, you could canonicalise near-identical products to one version of the product page. You can then use your time to focus on category and other key pages.

If you suspect something else is to blame, you’re going to need to dig a bit deeper. Duplicate metadata often occurs when a site lacks a process for managing URL parameters, or serves indexable 200 responses when an incorrect URL is entered.

2. And meta descriptions

“Problem” meta descriptions can be grouped into the same categories as “problem” title tags – they’re too long, too short, duplicated, or missing.

Unfortunately, crawling tools can’t understand the nuances in these supposed issues (and some humans don’t think about them either). 

Your priority should be whether your meta descriptions could be optimised for click-through rates, and which ones are worth optimising. 

Since meta descriptions aren’t used as a ranking factor, rewriting them won’t move your rankings – so start by working on descriptions for your highest-ranking pages, where a better snippet has the most impressions to convert. This can be one of the fastest and most effective ways of increasing click-through rates and traffic.
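If you want a quick way to build that priority list, a rough Python sketch along these lines can cross-reference a crawler export with Search Console data. The file paths and column names below (Address, Meta Description 1, Indexability, Page, Impressions) are assumptions – match them to whatever your tools actually export:

```python
# Rough sketch: cross-reference a crawl export with Search Console data to find
# indexable, high-impression pages whose meta descriptions are missing or too long.
# File paths and column names are assumptions - adjust to your own exports.
import csv

def load_csv(path):
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

crawl = {row["Address"]: row for row in load_csv("crawl_export.csv")}
performance = load_csv("gsc_performance.csv")

candidates = []
for row in sorted(performance, key=lambda r: int(r["Impressions"]), reverse=True):
    page = crawl.get(row["Page"])
    if not page or page.get("Indexability") != "Indexable":
        continue
    description = (page.get("Meta Description 1") or "").strip()
    if not description or len(description) > 160:
        candidates.append((row["Page"], row["Impressions"], len(description)))

for url, impressions, length in candidates[:20]:
    print(f"{impressions:>8} impressions  {length:>3} chars  {url}")
```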

3. Do a site: search

Go to Google and put site: before your domain, with no space between the colon and the domain (e.g. site:example.com).

Next, look at the top of the SERP to see how many pages (approximately) Google has in its index.

Is this number similar to what you’d want and expect to see?

It’s not a perfect science, so don’t worry if the number displayed doesn’t exactly match the number of indexable pages on your site. However, a significant discrepancy between how many pages Google is indexing, and how many pages you want indexed, should not be ignored.
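One quick way to get a rough “how many pages do I want indexed” figure to compare against is to count the URLs in your XML sitemap. A minimal Python sketch, assuming a single standard sitemap at /sitemap.xml (a sitemap index file would need an extra loop), with a hypothetical domain:

```python
# Count the URLs in a standard XML sitemap as a rough proxy for the number of
# pages you expect Google to index. The sitemap URL is hypothetical, and a
# sitemap index file (a sitemap of sitemaps) would need an extra loop.
import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

locations = tree.findall(".//sm:url/sm:loc", NAMESPACE)
print(f"{len(locations)} URLs listed in {SITEMAP_URL}")
```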

Too many indexed pages = index bloat. This usually occurs when there’s nothing in place to prevent dynamic URLs (those containing parameters) from being crawled and indexed.

Too few pages means Google either can’t – or doesn’t want to – index pages that you want it to.

There are many potential causes of this.

It could be that user error has blocked search engines from crawling or indexing pages (usually through incorrect use of canonical tags, the robots.txt file, or the noindex meta tag).

It could also be because Google just doesn’t like the pages it’s keeping out of its index. 

Maybe they’re duplicated (either with internal or external content), they’re competing with other pages on your site (keyword cannibalisation), the content’s too thin, or the page load time is too slow.

There are other reasons Google might choose not to index every page you want it to. Finding out why, and fixing the problem, has to be up there with the most important “must-dos” in technical SEO.

4. Check the robots.txt file

As above, checking the robots.txt will tell you whether it’s been used in error to block crawlers from accessing pages, subfolders, or even the entire site (that’s what’s happened if you see User-agent: * followed by Disallow: / on the next line).

It’s also worth checking the robots.txt to ensure it’s been used correctly – that each directive has been placed on a new line, for example.
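If you’d rather not just eyeball the file, Python’s built-in robots.txt parser can confirm whether your key URLs are actually crawlable. A minimal sketch – the URLs are hypothetical:

```python
# Minimal sanity check with Python's built-in robots.txt parser: make sure the
# pages you care about aren't accidentally disallowed. URLs are hypothetical.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

key_urls = [
    "https://www.example.com/",
    "https://www.example.com/category/shoes/",
    "https://www.example.com/blog/",
]

for url in key_urls:
    status = "ok" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{status:>7}  {url}")
```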


5. Check for broken internal links and pages

Broken links and pages are poor for users, for search engines, and for your SEO.

When users find themselves on a non-existent page – whether that’s through a misspelt link or a link to a page that’s been removed – that might end their interaction with your site.

When the same happens to search engines (or more specifically, their crawlers), they hit a roadblock. They have to take a stab at what’s happened and how they should treat the page. This eats into your crawl budget.

Lastly, if external sites link to a broken URL or non-existent page, you’re not going to benefit from the extra link equity.
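Crawling tools will surface these for you, but as a rough idea of what’s going on under the hood, here’s a minimal standard-library Python sketch that checks the internal links on a single page and reports anything that doesn’t return a 200 (the URL is hypothetical, and a real crawler would recurse, handle redirects properly, and respect robots.txt):

```python
# Standard-library sketch: grab the internal links from one page and report any
# that don't return a 200. Illustrative only - a real crawl would recurse,
# handle redirects, and respect robots.txt. The start URL is hypothetical.
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

START_URL = "https://www.example.com/"
HEADERS = {"User-Agent": "broken-link-sketch"}

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(urljoin(START_URL, href))

with urlopen(Request(START_URL, headers=HEADERS)) as response:
    extractor = LinkExtractor()
    extractor.feed(response.read().decode("utf-8", errors="ignore"))

site_host = urlparse(START_URL).netloc
for link in sorted(extractor.links):
    if urlparse(link).netloc != site_host:
        continue  # only check internal links
    try:
        status = urlopen(Request(link, headers=HEADERS)).status
    except HTTPError as error:
        status = error.code
    except URLError:
        status = "unreachable"
    if status != 200:
        print(f"{status}  {link}")
```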


6. Visit Search Console

Google Search Console has come on in leaps and bounds over the last few years and is now an essential tool for digital marketers. It should always play a part in audits: it can flag issues crawling tools can’t, and the information comes straight from the horse’s mouth.

If possible, set up Bing Webmaster Tools as well – no two search engines are the same.

7. Use the site

This might sound obvious, but it’s amazing how many audits are completed with little to no hands-on interaction with the site. Simply using a site can highlight critical issues that could be affecting conversions.

UX also has an important role to play in SEO. Google’s not only looking for the best possible answer to a given query; it also wants to send users to the best possible site to answer it. 

Since we’re far from understanding the intricacies of Google’s algorithms, it’s best to assume that if users don’t like your site, Google won’t either. 

8. Run the site through a speed tool

Google’s gone on record to say that, all other things being equal, site speed is a ranking factor. It also – needless to say – makes a big difference to your UX and, in turn, your conversion rate.

So what’s the best way to check your site’s speed?

While tools like GTmetrix are undeniably valuable, the best way to find out how Google sees your site is through its own site speed tool – PageSpeed Insights.

Make sure to check multiple pages of your site, from different subfolders. Just because your homepage passes the test doesn’t mean a category page, product page, or blog post will.
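PageSpeed Insights also has a public API (v5), which makes it easy to check several templates in one go. A rough Python sketch – the page URLs are hypothetical, and you’ll want an API key for anything beyond occasional use:

```python
# Rough sketch using the public PageSpeed Insights API (v5) to check several
# page templates in one go. Page URLs are hypothetical; add an API key for
# anything beyond occasional use.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/category/shoes/",
    "https://www.example.com/blog/some-post/",
]

for page in PAGES:
    query = urlencode({"url": page, "strategy": "mobile"})
    with urlopen(f"{ENDPOINT}?{query}") as response:
        data = json.load(response)
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"{page}  mobile performance: {score * 100:.0f}/100")
```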

9. Look at the search results

Most automated tools will tell you that your site needs to be faster. That you need more content. That you need more words on your category pages. 

In reality, Google looks at each niche differently. 

Benchmark your site (or the site you’re auditing) against competitors – not against the search results generally. This can help determine where best to focus your efforts, and what your competitors are doing that you need to do better.

10. Don’t overlook common (but simple) mistakes

First-hand experience here: I once worked on an audit and noticed that the robots.txt was blocking both Yahoo and Bing from accessing the site. Nobody had noticed for the entire lifespan of the website (and it had been live for a good few years at the time).

The removal of a few lines of code generated an extra quarter of a million pounds a year for the client. 

If I’d relied on conventional wisdom and done a standard “box-tick” audit, I might not have spotted this. And – as was the case here – something very simple can mean big wins.
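That lesson is easy to turn into a routine check: test your robots.txt against more than one crawler’s user agent, not just Googlebot. A minimal sketch using Python’s built-in parser (the domain is hypothetical):

```python
# Check the same robots.txt against several crawler user agents, not just
# Googlebot - exactly the kind of mistake described above. Domain is hypothetical.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

homepage = "https://www.example.com/"
for bot in ("Googlebot", "Bingbot", "Slurp", "DuckDuckBot"):
    allowed = parser.can_fetch(bot, homepage)
    print(f"{bot:<12} {'allowed' if allowed else 'BLOCKED'}")
```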

So in summary…

Even when carrying out a simple SEO audit, don’t just rely on what the experts suggest. Do the basics, but always keep your eyes and ears open for mistakes that might be staring right at you. 

Every site is different. No crawling tool can highlight every error. And no SEO checklist can include every single thing you need to watch out for.

Need more help with technical SEO? We’re a remote-first, Birmingham-based team of tech SEO specialists. Talk to us today.
