Regular SEO audits are crucial for a healthy website that ranks high on the SERPs. That said, an SEO audit does require resources. Without a well-structured audit process, you risk wasting too much time on things that don't matter.
That’s why we’ve come up with a detailed guide on performing an SEO audit quickly and easily. Save it, share it with colleagues, and enjoy your SEO workflow.
Let’s get down to business.
1. Cover indexing issues
Your website may be full of great content and perfectly interlinked. None of it matters if search engines cannot see and index it. An unindexed website does not appear in SERPs, so users cannot find it in organic results. This means no organic traffic, no user interactions, and no conversions, which is hardly the outcome your business is after.
To check if Google (and other search engines) can access and index your site, go to Google Search Console. This free set of webmaster tools lets you check and fix site issues, measure traffic and performance, and easily deal with most of the problems a website may face.
Note: If you haven’t connected your website to GSC, it’s high time you do. Here are detailed instructions right from Google.
In Search Console, go to Index > Coverage report. Pay attention to the Error and Valid with warning sections.
Scroll down the Coverage report to the Details section to see the exact errors and warnings detected on your website:
To see the pages affected by an error, click on the issue line.
Now look at the errors and warnings and see how to fix them.
Errors in the indexation process mean that Google tried to index the page but failed for some reason. The first thing to check is whether you actually want the URL indexed at all. If not, then just tell Google to stop trying — exclude the page from your sitemap and block it with a noindex tag.
If you’re sure that the page needs to be indexed, then fix the issue:
- Submitted URL marked ‘noindex’. Remove the noindex tag from the page’s HTML code or remove the noindex X-Robots-Tag header from the HTTP response.
- Submitted URL blocked by robots.txt. Update the robots.txt file on your website to remove or change the rule that blocks Googlebot from crawling the page.
- Submitted URL not found (404). The submitted URL does not exist. Check if the content was moved, and set up a 301 redirect to a new location.
- Submitted URL seems to be a Soft 404. The submitted page either does not exist or has too little content (so Google treats it as an empty one). Check if the page has enough good content and add some if it’s thin. Or set up a 301 redirect if the content was moved.
- Submitted URL returns unauthorized request (401). Google cannot access your page without verification. You can either remove authorization requirements or let Googlebot access the page by verifying identity.
- Submitted URL returned 403. Google has no credentials to perform authorized access. Allow anonymous access to get this page indexed.
- Server error (5xx). Googlebot failed to reach the server, but that doesn’t necessarily mean the server is down all the time — it might have been down only when Googlebot came around. Check the URL with the URL Inspection tool. If everything is fine, request reindexing. If the server is still down, fix the issue and then request reindexing.
- Redirect error. This means that the redirect chain is too long, a redirect URL exceeds the maximum URL length (2 MB for Google Chrome), there’s a bad or empty URL in the chain, or there’s a redirect loop. Check the URL with a debugging tool such as Lighthouse to find out what’s wrong.
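If you audit many URLs at once, the checks above can be scripted. Below is a minimal sketch in Python — the function name and sample data are hypothetical, and it assumes you have already fetched each URL's status code, headers, and HTML — that flags the most common indexability blockers: a bad status code, a noindex X-Robots-Tag header, or a noindex meta tag.

```python
import re

def check_indexability(status_code, headers, html):
    """Return a list of reasons why Googlebot could not (or should not) index this response."""
    issues = []
    if 500 <= status_code < 600:
        issues.append(f"Server error ({status_code})")
    elif status_code == 404:
        issues.append("Not found (404)")
    elif status_code in (401, 403):
        issues.append(f"Access blocked ({status_code})")
    # noindex can arrive as an X-Robots-Tag HTTP header ...
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        issues.append("noindex in X-Robots-Tag header")
    # ... or as a <meta name="robots"> tag in the page's HTML
    for match in re.finditer(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I):
        if "noindex" in match.group(0).lower():
            issues.append("noindex in meta robots tag")
    return issues

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(check_indexability(200, {}, page))  # → ['noindex in meta robots tag']
```

Run this over your list of submitted URLs and you get a quick triage of which Coverage errors apply to which pages before you dive into Search Console.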
Note: Google is planning to update this report soon and merge it with the Error report under the name Invalid. For now, though, the Valid with warning report remains as it is.
If Search Console reports that some of your pages have the Valid with warning status, these pages are indexed, but Google doubts whether that was intended. From an SEO perspective, warnings can cause much more headache than errors, as Google may index pages you don’t want it to.
There are two types of warnings:
- Indexed, though blocked by robots.txt. The page is indexed by Google despite being blocked by the robots.txt file. Here you have to decide whether you want this page indexed. If not, add a noindex tag and remove the blocking rule from robots.txt (Google has to be able to crawl the page to see the noindex tag).
- Indexed without content. The page is on Google, but the search engine cannot read the content for some reason (cloaking, thin content, unknown format of the content). Here you’ll have to carefully examine the page to figure out what’s wrong and fix it.
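To test whether a robots.txt rule actually blocks a given URL, you can use Python's standard urllib.robotparser module. The robots.txt content and URLs below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for example.com
robots_txt = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the wildcard group here, so /private/ pages are blocked
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post.html"))     # True
```

This is handy when a Coverage warning surprises you: paste your live robots.txt into the script and confirm which rule is catching the affected URL.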
OK, we’re done with indexation. Let’s move on to the next step of our SEO audit.
2. Check user experience
Poor user experience often means a slow website, and a slow website means low engagement and high bounce rates. What’s more, since Google made Core Web Vitals a ranking factor, poor UX will also result in lower positions.
To help you, Google Search Console has a nice set of Experience reports that will let you check and fix any issues related to UX.
First off, let’s check the Experience > Core Web Vitals report. Here you will find two reports representing the CWV stats on mobile and desktop devices.
Click Open Report to explore the CWV issues in detail.
Focus on the URLs that are marked Poor or Need improvement. Scroll further to see the list of issues.
To explore the issue in more detail and see the list of affected pages, click on the line, then click on the affected URL, and go to PageSpeed Insights. This tool will show you a detailed report on your page performance based on the data coming from your real users.
Look through the report, and scroll down to Opportunity and Diagnostics sections to see the exact suggestions on what to improve and how your site will benefit from those edits.
Note: Google’s CWV benchmarks are rather general and hard to reach for many websites. Still, remember that you don’t need to be the best and the fastest; you only need to be better than your competitors. Since PageSpeed Insights lets you test any page on the web, go and test your competitors’ ranking pages to see if you’re moving in the right direction.
3. Optimize content
Content is one of the most important ranking factors. So, it’s crucial to fill your website with meaningful and keyword-rich content to win high SERP positions and attract visitors.
The easiest way to start content optimization is to optimize existing pages that slightly underperform. As Google Search Console lets you track the current positions of keywords you rank for, you can easily check what keywords need your attention first.
In GSC, go to Performance > Search results, enable Total impressions and Average position, and scroll down to the table of keywords.
Now it’s time to decide which keywords are underperforming. Some SEOs believe that positions 5–6 are not enough and strive for positions 1–2. Others think that moving a page from the second SERP to the first will do perfectly. Choose the approach that fits your business niche and its level of competition. Also, keep in mind that Google’s SERPs are now packed with SERP features more than ever before, so positions 8–10 may not even appear on the first screen the way they once did.
Depending on your choice, filter your queries to see the underperforming keywords. In the table of queries, click filter > position > greater than, and enter the relevant number. Then filter your keywords by impressions to weed out unpopular queries: click filter > impressions > greater than, and enter the minimum number of impressions you consider worth your effort.
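If you export the query table from GSC, the same filtering takes a few lines of Python. The query rows and thresholds below are hypothetical; tune them to your niche:

```python
# Hypothetical rows exported from the GSC Performance report
queries = [
    {"query": "seo audit guide",     "impressions": 1200, "position": 8.4},
    {"query": "what is a sitemap",   "impressions": 90,   "position": 14.0},
    {"query": "gsc coverage report", "impressions": 640,  "position": 12.1},
    {"query": "seo checklist",       "impressions": 2500, "position": 2.3},
]

MIN_IMPRESSIONS = 100  # ignore unpopular queries
MAX_GOOD_POSITION = 5  # anything ranked worse than this counts as underperforming

underperforming = [
    q for q in queries
    if q["impressions"] >= MIN_IMPRESSIONS and q["position"] > MAX_GOOD_POSITION
]

# Most-seen underperformers first: the biggest optimization opportunities
for q in sorted(underperforming, key=lambda q: q["impressions"], reverse=True):
    print(q["query"], q["position"])
```

The same two thresholds mirror the position and impressions filters you would set in the Search Console UI.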
To see the pages that rank for the query, click on the query and then go to Pages.
Search Console is a nice tool, but it cannot tell you how exactly you should improve your page and what to add or maybe delete from it. A dedicated website audit tool can. For example, here’s how to do it in WebSite Auditor. To see your content optimization opportunities, go to Page Audit > Content Editor, then enter the URL you’re working on and the keyword you’re optimizing for. The tool will carefully analyze your page and welcome you with a bunch of optimization suggestions based on your top SERP competitors: tips on word count, keywords to add, how to optimize the title and meta description, and so on.
You’re free to edit the content right in the tool (Document Mode of the page) and see the optimization rate score changing as you make your edits. You can then download a PDF of your optimized page and pass it to your content writers as a technical task.
4. Improve CTR of your snippets
Your SERP snippets may have a low CTR for some reason. Maybe they don’t look attractive enough, or they lack information, or the SERP is full of rich snippets, or anything else. Anyway, your goal is to make your snippets get more clicks and bring you the desired traffic.
Check the average CTR for the position of your ranking URL and the type of SERP you appear in. You can go to advancedwebranking.com and select the SERP types you’re interested in. The tool will show you the average CTR for different positions.
Let’s get back to the Search Console. Go to Performance > Search results, and enable Average CTR, Average position, and Total impressions. Filter impressions to exclude unpopular keywords, filter CTR to eliminate keywords that already perform as well as or better than the highest benchmark, and sort the table by position in ascending order.
Pay attention to the CTRs that fall well below the benchmark.
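Once you have benchmarks, flagging weak snippets is easy to script. A sketch with made-up benchmark figures and query rows (use real averages for your SERP type from a CTR study):

```python
# Rough organic CTR benchmarks by position — illustrative numbers only
BENCHMARK_CTR = {1: 0.32, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

# Hypothetical rows from the GSC Performance report
rows = [
    {"query": "seo audit guide", "position": 2, "clicks": 40,  "impressions": 1000},
    {"query": "seo checklist",   "position": 3, "clicks": 110, "impressions": 1000},
]

below_benchmark = []
for row in rows:
    ctr = row["clicks"] / row["impressions"]
    benchmark = BENCHMARK_CTR.get(row["position"])
    # Flag queries whose snippet earns fewer clicks than its position should
    if benchmark and ctr < benchmark:
        below_benchmark.append((row["query"], ctr, benchmark))

for query, ctr, benchmark in below_benchmark:
    print(f"{query}: CTR {ctr:.1%} vs. benchmark {benchmark:.0%}")
```

Here "seo audit guide" gets flagged (4% CTR at position 2 against a 15% benchmark), while "seo checklist" beats its benchmark and is left alone.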
What should you do next?
First, go to Google, type in the target query, and examine the SERP. Does it contain a lot of ads, rich results, or featured snippets? Do competitors’ titles look catchy? Optimize your page depending on what you see.
A great optimization hack is to enrich your pages with structured data markup to earn a rich snippet. Also, think of releasing some new types of content (say, a video if the query triggers a lot of videos).
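For example, an FAQ rich snippet uses schema.org’s FAQPage type embedded as JSON-LD. Here is a minimal sketch generated with Python — the schema.org types and property names are real, but the question and answer are placeholders for your own content:

```python
import json

# Minimal FAQPage markup; replace the placeholder Q&A with your page's content
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "How often should I run an SEO audit?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Most sites benefit from a full audit every few months.",
            },
        }
    ],
}

# JSON-LD is embedded in the page inside a script tag of type application/ld+json
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(faq, indent=2)
print(snippet)
```

Paste the generated block into the page’s HTML, then validate it with Google’s Rich Results Test before counting on the rich snippet.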
5. Update pages that lost visibility
It’s quite logical that pages may lose popularity and traffic as time goes by. So it’s necessary to find these pages and update them to keep your rankings afloat.
To find these pages, go to GSC’s Performance > Search results report and click Date > Compare > Compare last 6 months to previous period.
Enable Clicks, and switch to Pages to see what pages experienced the greatest traffic loss over the last six months.
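If you export both periods from the Compare view, the loss report is easy to automate. A sketch with hypothetical per-page click counts:

```python
# Clicks per page for the two compared periods (hypothetical export data)
last_6_months = {"/blog/seo-audit": 320, "/blog/blockchain": 40,  "/pricing": 510}
previous_period = {"/blog/seo-audit": 300, "/blog/blockchain": 260, "/pricing": 495}

losses = []
for page, before in previous_period.items():
    after = last_6_months.get(page, 0)
    if after < before:
        losses.append((page, before - after))

# Biggest traffic losers first — these are your update candidates
losses.sort(key=lambda item: item[1], reverse=True)
print(losses)  # [('/blog/blockchain', 220)]
```

Pages missing from the recent period count as having lost all their clicks, which is exactly the kind of page you want surfaced.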
Note: Always check if the page is still relevant before you start updating it. Go to Google Trends, for example. The page’s topic may have become irrelevant, rendering it hardly worth your time.
Blockchain was big in 2017 but has been losing popularity ever since. A couple of spikes did not manage to bring the popularity back, so the topic does not now look particularly inviting. Not yet, at least.
6. Find internal linking opportunities
Good internal linking helps search crawlers discover your pages quicker. What’s more, internal linking is a great way to redistribute PageRank across your website and strengthen pages that are weak for some reason.
To check which pages lack internal links, go to the Links report in GSC, open Top linked pages under Internal links, and sort the list in ascending order.
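The same ascending sort works on an export of that report. A sketch with hypothetical link counts:

```python
# Internal link counts per page (hypothetical export of the Internal links report)
internal_links = {
    "/blog/new-post": 1,
    "/blog/seo-audit": 14,
    "/services": 47,
    "/blog/case-study": 2,
}

# Ascending sort surfaces the least-linked pages: candidates for new internal links
least_linked = sorted(internal_links.items(), key=lambda item: item[1])
print(least_linked[:2])  # [('/blog/new-post', 1), ('/blog/case-study', 2)]
```

Pages with only one or two internal links are effectively orphaned; linking to them from your stronger, well-linked pages passes PageRank their way.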
7. Audit your backlinks
Backlinks are one of the most important ranking factors, so keep an eye on your backlink profile. To check what websites link to you most often and to what pages, open the Links report and look at the External links section.
The Top linked pages table will show the most linked pages on your website:
The Top linking sites table, as it’s clear from the name, will show what websites link to you most often.
Check whether your most-linked pages are valuable and well-linked internally, so that incoming link juice doesn’t sit idle on your site.
What’s more, you need to get new backlinks from time to time. How? I suggest analyzing your competitors.
- First, open the Top linking sites in GSC and export the data in any format you prefer.
- Now check what websites link to your competitors. Here you’ll need a backlink checker tool. You can choose any you like, even a free one will do (say, Ahrefs or SEO PowerSuite free backlink checker). Check your competitors’ domains to see their backlinks.
- Pick out the most powerful backlink domains of your competitors and search for those domains in the datasheet you exported from GSC.
- If the domain does not yet link to your website, then you can contact the owners and ask for a backlink.
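The domain comparison in the steps above boils down to a set difference. A sketch with hypothetical domain lists:

```python
# Domains linking to you (from your GSC Top linking sites export) — hypothetical data
my_linking_domains = {"blog-a.com", "news-b.com"}

# Domains linking to a competitor, from any backlink checker export
competitor_domains = {"blog-a.com", "magazine-c.com", "directory-d.com"}

# The gap: sites that link to the competitor but not to you — your outreach candidates
gap = sorted(competitor_domains - my_linking_domains)
print(gap)  # ['directory-d.com', 'magazine-c.com']
```

With real exports, you would load each file's domain column into a set first; the set difference then gives you the outreach list in one line.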
Think wisely when doing your backlink campaign — ask to link to relevant pages and do not ask for backlinks from loyal competitors’ affiliates (unless you can offer them some more attractive affiliate terms). Do not add backlinks to already powerful pages, or to irrelevant and outdated ones.
Also, look through your business partners and clients and ask them to link to your website. Too simple? Yes, it is. SEO is complicated by default, so people often forget to pay attention to something easy and obvious.
To sum it up
An SEO audit can require a lot of resources. Still, it becomes much easier when you have a structured process and know what you’re doing and why. Audit your website regularly, prevent or fix SEO issues with the help of different SEO tools, and enjoy your high rankings.
Jamil is an Organic Search Manager at Cloudways, an SEO-friendly hosting platform. He has 14 years of experience in SEO and is passionate about Digital Marketing and Growth Optimization. Jamil is a Minimalist, Observer, Loves Nature, Animals, Food, Cricket & Mimicking :)