
Most common issues from SEO audits

Over the past eight years as a technical SEO, I’ve seen my fair share of websites and the resulting technical issues. Unfortunately, a surprising number of things get overlooked by website owners and developers, so I’d like to share some of the most common problems I’ve encountered and how to check for them.


Before we start, let's cover what technical SEO is. Put simply, it’s making sure that Google can find a website, that Googlebot can understand its content, and that the site adheres to Google’s webmaster guidelines.


Here are the top seven things websites get wrong from an SEO perspective:


1 - Multiple versions of the website are getting found by Google


From an overall website point of view, there are several easy ways to end up with duplicate versions of your site. The first is the difference between HTTP and HTTPS. All websites should be served over a secure connection (HTTPS) nowadays, but many website owners forget to redirect the HTTP version so that it is no longer accessible. The second is the difference between the www. and non-www. versions of the site. When setting up your domain, you choose which version of the address you prefer to serve, but again, many sites forget to redirect the alternative version.


The easiest way to check for duplication is to search Google for the different versions and see what’s indexed. Simply type site:”yoursite” inurl:”version you wish to check” into Google to see which pages are returned and whether Google has managed to find them.


Checking for the HTTPS version

Checking for the non-www version

Checking for the HTTP version
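
For example, the three checks might look like the queries below (substitute your own domain; the exclusion operator -inurl: helps for the last two, since “http” is a substring of “https” and a plain inurl:http would match both):

site:yoursite.com inurl:https
site:yoursite.com -inurl:www
site:yoursite.com -inurl:https

Any results returned for a version you don’t intend to serve point to duplication that Google has indexed. Google’s search operators change behaviour over time, so treat these queries as a starting point rather than a definitive test.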


2 - Google finds more pages on your website than in your sitemap


It’s a problem when Google finds automatically generated pages or duplicates of your ideal landing pages. You don’t want Googlebot wasting time crawling irrelevant pages, as that can reduce how often it revisits your website, or in extreme cases cause it to give up crawling further pages altogether. You also don’t want Google finding duplicate versions of your landing pages, as Googlebot can struggle to work out which one to rank, weakening your ideal landing pages.


The first thing to check is your website’s sitemap.xml, a file that lists all your landing pages to help Googlebot find your content. It can usually be found at yoursite.com/sitemap.xml or yoursite.com/sitemap_index.xml. If you don’t have one, ask your website developer to generate a sitemap and upload it to that location.


Review which pages are in the sitemap: does it contain every page you want Google to find, and nothing you don’t? Then compare how many pages are in the sitemap against how many Google has found. This will show you which parts of the site Googlebot is finding unintentionally, so you can restrict its access to them.
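
If you want to see at a glance what your sitemap contains, here is a minimal sketch in Python (standard library only) that lists and counts its URLs. The sitemap address is a placeholder - swap in your own, and note that a sitemap_index.xml lists child sitemaps rather than pages:

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://yoursite.com/sitemap.xml"  # placeholder: use your own domain
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# Each <loc> element holds a URL; in a sitemap_index.xml these are child sitemaps
urls = [loc.text for loc in tree.findall(".//sm:loc", NAMESPACE)]
print(f"{len(urls)} URLs listed in the sitemap")
for url in urls:
    print(url)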


3 - Websites haven’t been optimised for page speed


This issue comes up in almost every audit, as page speed can be challenging (or very time-consuming) to optimise properly, especially when using website builders within WordPress or other CMSs. However, it’s a way to see an almost instant improvement in your website experience, conversion rates and, ultimately, SEO performance. Google has a whole report in Google Search Console dedicated to page speed, so it’s well worth understanding the key metrics and how to improve them.


In Google Search Console, you can find information about speed and performance in the Core Web Vitals section. This report flags any page speed issues Googlebot runs into whilst crawling the site. If you don’t have enough data or pages in this report, you can test pages manually with Google’s PageSpeed Insights (or query it programmatically, as sketched below the definitions). You are specifically looking for any failures or warnings on LCP, CLS and FCP.


LCP = Largest Contentful Paint - This is how long it takes for the main content on the page to become visible.

CLS = Cumulative Layout Shift - This measures how much the page moves around as it is loading, making it difficult to click on things quickly.

FCP = First Contentful Paint - This measures how quickly the first piece of content appears on screen. (Don’t confuse it with FID, First Input Delay, which Search Console also reports; that measures how quickly the page responds to the user’s first interaction.)
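
If you want to pull these numbers programmatically, here is a rough sketch querying Google’s PageSpeed Insights API (v5). The page URL is a placeholder, and field data only appears for pages with enough real-user traffic - the same caveat as the Search Console report above:

import json
import urllib.parse
import urllib.request

PAGE = "https://yoursite.com/"  # placeholder: the page to test
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

query = urllib.parse.urlencode({"url": PAGE, "strategy": "mobile"})
with urllib.request.urlopen(f"{API}?{query}") as response:
    data = json.load(response)

# Field data ("loadingExperience") is only present when Google has
# gathered enough real-user measurements for the page
metrics = data.get("loadingExperience", {}).get("metrics", {})
for key in ("LARGEST_CONTENTFUL_PAINT_MS",
            "CUMULATIVE_LAYOUT_SHIFT_SCORE",
            "FIRST_CONTENTFUL_PAINT_MS"):
    if key in metrics:
        print(key, metrics[key]["category"])  # FAST, AVERAGE or SLOW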


The great thing about the PageSpeed Insights report is that it gives a list of items for developers to work through to improve site speed. Some quick wins are minifying and optimising your CSS, JS and HTML files and ensuring that your images are optimised.


4 - Webpages don’t use a correct/nested heading structure 


This is another issue that comes up frequently in website audits, as it can be a handy development shortcut to use heading styles when building templates. Googlebot uses headings to understand the context of content on a page and its relationship to the page’s subject. Every page should have one Heading 1 (H1), with content nested under subsequent heading levels (H2, H3 and so on). Headings shouldn’t be used at random outside the document outline, and especially not to style text in the website navigation or footer.


Checking this requires looking at the website’s source code to see how the text has been marked up. The easiest way is your browser’s inspect element tool, which brings up the page’s code alongside the page itself: right-click the heading you would like to review and click “Inspect”.



You should then be able to see how elements are marked up. You are looking for <h1>, <h2>, <h3> tags and so on. 


Top tip - You can install browser extensions that pull all headings into a document outline. We use the ‘Web Developer Tools’ extension in Google Chrome. 
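
If you’d rather script the same outline check, here is a small sketch using Python’s built-in HTML parser. The URL is a placeholder, and real-world markup can be messier than this handles, so treat it as a quick audit aid rather than a robust crawler:

from html.parser import HTMLParser
import urllib.request

HEADINGS = ("h1", "h2", "h3", "h4", "h5", "h6")

class HeadingOutline(HTMLParser):
    def __init__(self):
        super().__init__()
        self.inside = False  # are we currently inside a heading tag?
        self.level = 0

    def handle_starttag(self, tag, attrs):
        if tag in HEADINGS:
            self.inside = True
            self.level = int(tag[1])

    def handle_endtag(self, tag):
        if tag in HEADINGS:
            self.inside = False

    def handle_data(self, data):
        if self.inside and data.strip():
            # indent by heading level so the outline's nesting is visible
            print("  " * (self.level - 1) + f"h{self.level}: {data.strip()}")

with urllib.request.urlopen("https://yoursite.com/") as response:  # placeholder URL
    HeadingOutline().feed(response.read().decode("utf-8", "replace"))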


You should also use relevant keywords in your page headings, where appropriate, to help Google understand the page’s subject.


5 - Not all pages get checked for mobile usability


It’s more important than ever to ensure your website is optimised for mobile devices, and much of the development community has moved to a mobile-first approach to web design and builds. As a result, it’s become rarer and rarer to see websites that aren’t responsive and mobile-optimised, but a surprising number still fail to check every page template to ensure no elements break.


The easiest way to check this is to head to Google Search Console and review the Mobile Usability report, which lists any pages on the website with mobile usability issues. Some of the most common issues are -


Content wider than screen - Especially if you use iframes or embeds on your website, such as embedded virtual tours or videos.


Text too small to read - Especially if editors can set text sizes manually in the CMS.


Clickable elements too close together - Especially if you’ve recently added more pages to your navigation or any new call-out elements. 


Some rarer issues include viewports not being set correctly or using incompatible plugins. 
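
As a quick spot-check for the viewport issue, here is a rough sketch that fetches a page and looks for the responsive viewport meta tag. The URL is a placeholder, and a regex scan is crude but adequate for an audit-level check:

import re
import urllib.request

with urllib.request.urlopen("https://yoursite.com/") as response:  # placeholder URL
    html = response.read().decode("utf-8", "replace")

# A responsive page normally declares something like
# <meta name="viewport" content="width=device-width, initial-scale=1">
match = re.search(r'<meta[^>]*name=["\']viewport["\'][^>]*>', html, re.I)
print(match.group(0) if match else "No viewport meta tag found")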


Any of these issues should be pulled out into a list to be fixed by the website developer.


If you don’t have access to Search Console, you can use Google’s Mobile-Friendly Test instead.


6 - Broken Links, 404 pages and broken images


This check is standard in most SEO audits and monitoring tools, for good reason. Websites evolve and update over time, and it doesn’t take long for pages to get deleted, images to be removed, or links to external sources to fail. It’s very frustrating for users to hit dead ends when looking for information or products, and these bad experiences can drive them off the website.


Some of the most common broken links we come across are on old blog and news pages that haven’t been updated to match new services and products. Many people forget that old pages can still be a good source of traffic, so it’s vital that users can easily find their way to your services and products without a hiccup. Search Console does a great job of flagging broken pages in the “Coverage” report.



This section of the report will help you monitor the number of broken pages and identify where the offending internal links appear. Most SEO agencies use website crawlers to monitor site health regularly and fix broken pages as they appear.
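
At its simplest, the check a crawler performs looks like the sketch below: request each internal URL and flag anything that doesn’t come back healthy. The URL list is a placeholder - in practice it would come from a crawl of the site or from your sitemap:

import urllib.error
import urllib.request

urls_to_check = [
    # placeholder list: in practice, gather these from a crawl or sitemap
    "https://yoursite.com/",
    "https://yoursite.com/old-blog-post/",
]

for url in urls_to_check:
    try:
        with urllib.request.urlopen(url) as response:
            print(response.status, url)  # 2xx means the page resolved
    except urllib.error.HTTPError as error:
        print(error.code, url, "<- broken")  # e.g. 404 dead ends
    except urllib.error.URLError as error:
        print("ERR", url, error.reason)  # DNS or connection failures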


It’s also worth building a custom 404 page so that when users do hit a dead end, there’s an optimised page to help them find what they’re looking for. Consider adding a search box, links to your services, and contact information to make it as helpful as possible.


7 - Poor page titles and descriptions


Page titles and meta descriptions are a valuable opportunity to encourage people to click through to your website from search results. The issues we find in website audits vary, but the common thread is that most sites are missing opportunities.


The page title and meta description are the two fields that appear in your page’s listing on Google’s search results.



The most common issue we come across is missing meta descriptions. They’re often overlooked because they aren’t a direct ranking factor; however, they still provide an opportunity to increase click-through rate. This issue is especially prevalent on blog and news pages, as the sheer number of them turns the optimisation into a tremendous job. The best way to stay on top of it is to write a description each time a new blog post is added to the site. It’s also worth going through your key pages, such as services or essential landing pages, to ensure they have good meta descriptions.


A less common but more impactful issue is page titles that don’t use good keywords. A page title should concisely describe the content on the page and encourage users to click through. Think about the keywords users might search on Google and lead with the main one in your page title.
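
Here is a rough sketch for pulling a page’s title and meta description so missing or weak ones stand out. The URL is a placeholder, the description regex assumes the name attribute appears before content, and the length thresholds are common rules of thumb rather than official Google limits:

import re
import urllib.request

PAGE = "https://yoursite.com/"  # placeholder: the page to audit

with urllib.request.urlopen(PAGE) as response:
    html = response.read().decode("utf-8", "replace")

title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
description = re.search(
    r'<meta[^>]*name=["\']description["\'][^>]*content=["\'](.*?)["\']',
    html, re.I | re.S)  # assumes name= comes before content=

if title:
    text = " ".join(title.group(1).split())
    flag = " <- consider shortening" if len(text) > 60 else ""
    print(f"Title ({len(text)} chars): {text}{flag}")
else:
    print("Missing <title> tag")

if description:
    text = " ".join(description.group(1).split())
    flag = " <- consider shortening" if len(text) > 155 else ""
    print(f"Description ({len(text)} chars): {text}{flag}")
else:
    print("Missing meta description - worth writing one")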


We’ve also seen SEOs go too far with these optimisations, which leads to the final common issue in audits: over-optimised, spammy keywords in page titles. Although page titles are an excellent opportunity to get your keywords into the search results, cramming in too many can look spammy and even put some users off clicking your listing. Recent algorithm updates have also led to Google rewriting page titles and descriptions it judges poor, replacing them with text that better matches user intent.