How to Do an SEO Audit in 2023. Part 2 (Beginner Friendly)
Want to spot and fix any SEO mistakes early
on before they ruin your rankings? Make sure you're running regular SEO audits. Hi, and welcome to the second part of the
Full SEO Audit guide. I'm Mary from SEO PowerSuite, the all-in-one
toolkit that covers every aspect of SEO. Before we start, make sure to subscribe to
our channel so you don't miss the next videos. If you haven't seen the first part of the
video, I've included the link to it in the description below.
Additionally, in the description, you will
find a downloadable PDF file with a 15-step SEO audit checklist for quick reference whenever
you need it. Now, let's begin. If your business operates in several markets
and has an international website, you might have already implemented site localization
to target different audiences. If so, you need to audit your localization
implementation. A rel="alternate" hreflang setup is required
for localization to show relations between pages. If it’s implemented incorrectly, that may
cause a number of issues like duplicate content, de-ranking, and others. Here are the things to audit:
Return and self-referencing links. Each language version of a page should have
hreflang attributes that point to the page itself and also to other language/region versions
of the page.
X-default values for unmatched languages. X-default tells search engines which page version to use for languages and regions that haven't been defined through your hreflang attributes. It's not required, but it's advisable to use. Language or region codes. They should be set correctly.
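To illustrate, a correct hreflang block in the head of an English page might look something like this (a hypothetical sketch with placeholder URLs – the same full set, including the self-referencing link, should appear on every language version):
<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/page/" />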
To check your site localization issues, proceed to the Localization report in WebSite Auditor > Site Audit. Go through each point to make sure your international SEO works like a charm. Correct localization isn't limited to technical aspects only. Cultural peculiarities should also be considered. Content created for individualistic cultures will most likely not be understood or correctly perceived in some collectivist cultures. Besides, in different countries, people have
different search habits. They may use different search phrases and
different search engines. Yes, in some countries they don’t use Google.
For example, in China, it's mostly Baidu,
and in Korea – Naver. So, make sure you revise your content optimization
strategy as well. Do that with Rank Tracker as it shows your
rankings as if searched from a specific location. You can specify the preferred search engines
and then add a preferred location to track rankings more precisely. Redirects may affect both indexation of your
pages and user experience. That's why they deserve due attention. The most common mistake even professionals make is misusing the 301 and 302 redirect types. This often happens because many servers and CMSs set a 302 by default unless you explicitly specify a 301. A 301 is a permanent redirect, and a 302 is a temporary one.
And here lies the problem: if you use a 301
redirect, search engines stop indexing the old URL and some of its link juice is passed
to the new destination. But if you use a 302, search engines may continue
to index the old URL, and consider the new one as a duplicate, dividing the link juice
between the two versions. That may hurt your pages' rankings.
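For example, in an Apache .htaccess file the only difference is the status code you specify (a quick sketch assuming Apache's mod_alias and placeholder URLs – your server or CMS may use different syntax):
# Permanent redirect: the old URL drops out of the index and passes link juice on
Redirect 301 /old-page/ https://example.com/new-page/
# Temporary redirect: the old URL may stay indexed
Redirect 302 /old-page/ https://example.com/new-page/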
Additionally, it's worth auditing your client-side redirects. Ideally, there shouldn't be any. But if there are some meta refresh or JavaScript redirects, make sure you implement them correctly.
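For reference, client-side redirects usually look something like these hypothetical snippets with a placeholder URL:
<!-- Meta refresh redirect placed in the page head -->
<meta http-equiv="refresh" content="0; url=https://example.com/new-page/">
<!-- JavaScript redirect -->
<script>window.location.replace("https://example.com/new-page/");</script>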
To detect any redirect issues, go to the corresponding section in WebSite Auditor > Site Structure > Site Audit and check out the following: everything that is marked in red should be fixed as soon as possible. Any redirect puts a certain burden on your site. Too many of them may hurt your site
speed and if your pages are well-interlinked, that may significantly complicate crawling
and indexing. To check how many pages on your site are redirected,
go to WebSite Auditor, find Site Structure > Pages and apply filters for HTTP Status
Code = 301 or 302.
To quickly check if the number of redirects
can affect your site speed, proceed to Site Structure > Site Audit > Page Speed and find
Avoid multiple page redirects. Alternatively, you can find the same information
in Google Search Console (see Experience > Core Web Vitals). If page 1 redirects to page 2 and this one,
in turn, redirects to page 3, and so on, you have a redirect chain. And if a redirect chain ends up back at the initial URL, it's a redirect loop. As a rule, redirect chains and loops are created
accidentally and are just a waste of resources (link juice, crawl budget, page speed).
Fortunately, they can be easily detected by
WebSite Auditor in the same Redirects report. If the issue occurs, redirect the initial page straight to the final destination, skipping all the intermediate hops. And if there is a loop, remove the redirects that form it.
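For instance, if /page-a redirects to /page-b and /page-b redirects to /page-c, point both old URLs straight at the final destination (a hypothetical .htaccess sketch):
# Before: /page-a -> /page-b -> /page-c (a chain)
Redirect 301 /page-a/ https://example.com/page-b/
Redirect 301 /page-b/ https://example.com/page-c/
# After: every old URL goes straight to the final destination
Redirect 301 /page-a/ https://example.com/page-c/
Redirect 301 /page-b/ https://example.com/page-c/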
Security is a ranking factor and should be maintained no matter what. Here are the basic things that should be audited first-hand. First things first, buy an SSL certificate. You may purchase it once, but make sure it's renewed on time. Otherwise, your users will get a notification
like this one: You can use any SSL checker to see if your
certificate is fine.
Also, don’t forget to set up notifications
about the upcoming renewals. To check if there are any HTTPS issues on
your website, go to Google Search Console > Experience report > HTTPS. Besides the fact that your site may not be
served over HTTPS, there can be another issue of mixed content – when HTTP and HTTPS resources are mixed on one page. It weakens the security of the whole page.
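Mixed content typically looks like this: a page served over HTTPS requests a resource over plain HTTP (a hypothetical snippet):
<!-- The page itself loads over HTTPS, but this image is requested over HTTP, which triggers a mixed content warning -->
<img src="http://example.com/images/banner.jpg" alt="Banner">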
To check your website for this type of issue, use WebSite Auditor.
Go to Site Structure > Site Audit > Encoding and technical factors > HTTPS pages with mixed content issues. Core Web Vitals are not only about speed, as many think; they're about overall user experience – how fast pages load and how responsive and stable they are. In plain words, LCP (Largest Contentful Paint) shows how long it takes for the largest content element to load. Ideally, your LCP should be less than 2.5 seconds. The time greatly depends on your:
Images
Image tags
Video thumbnails
Background images with CSS
Text elements
You can check the LCP metric for each page in
WebSite Auditor: from Site Structure, move to the Pages report > Page Speed: You will see the list of all your pages and
their LCP scores.
Those requiring improvement will be marked
in red. The responsiveness of your pages is measured
with First Input Delay. Basically, this metric reflects the time it takes for the browser to respond to a user's first interaction (a click on a button or link) with your site while it is loading. Ideally, it should be 100 ms or less. What things may worsen FID:
JavaScript
Unnecessary scripts
Images
External fonts and excessive CSS
Again, check FID for each page in WebSite
Auditor. In the same workspace (Site Structure > Pages
> Page Speed) find the First Input Delay column: Cumulative Layout Shift measures every unexpected
layout shift that occurs during the entire lifespan of a page. Such a layout shift happens when a visible
element changes its start position. What may worsen CLS:
Images and embeds without specified dimensions
Ads, embeds, and iFrames without specified dimensions
Dynamic content
Web fonts causing flash of invisible text or unstyled text.
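The most common fix is simply giving images and embeds explicit dimensions so the browser can reserve the space before they load (a hypothetical snippet):
<!-- Width and height let the browser reserve space, so nothing jumps when the image loads -->
<img src="https://example.com/images/hero.jpg" width="1200" height="600" alt="Hero image">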
To check CLS for your pages, check the column of the same name: There is a way to check your CWV for your
whole site in Google Search Console. For that, go to Experience report > Core Web
Vitals. You will immediately see how many pages need
better optimization: Alternatively, you can use WebSite Auditor
to check your Core Web Vitals in bulk. From the Site Structure module, go to Site
Audit > Page Speed. You will get not only the list of pages that
do not pass CWV assessment but also a list of recommendations that will help you improve
these metrics.
Mobile friendliness is the gold standard for
a quality website. However, I suggest you first check your mobile
traffic and other metrics (like Bounce Rate and Conversions) to understand how much mobile
traffic you get and whether you meet the needs of mobile users. Note that even if you don't have many
mobile users, working on the mobile-friendliness of your website is crucial anyway. Since Google sticks to mobile-first indexing,
it’s the mobile version of your site that Google sees and on the basis of which it makes
ranking decisions. You can do that in Google Analytics > Audience
> Mobile > Overview: Then check the technical aspects of mobile-friendliness. Whatever option you’ve chosen, make sure
you audit the corresponding aspects: If you use responsive design, check out your
viewport meta tag in the page header. It should be set like this: <meta name="viewport"
content="width=device-width, initial-scale=1.0">. This way, your page will display correctly
on any device. If you use dynamic serving, check your vary
HTTP header (Vary: User-Agent). Thus, you tell search engines that different
content will be served on desktops and mobile devices.
If you have a separate mobile version, check
the use of the link rel="alternate" tag. It's used to indicate the relation between the desktop and mobile versions of your website to search engines.
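On the desktop page this annotation usually looks something like the sketch below (with a placeholder mobile URL), while the mobile page points back with a canonical tag:
<!-- On the desktop page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page/">
<!-- On the mobile page -->
<link rel="canonical" href="https://example.com/page/">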
From an SEO perspective, responsive design is the preferred option. So, choose it over the others. This is something that is easy to check with
your eyes using Device Mode in your browser. Pay attention to how the following things
look on different devices:
Sizes of elements (text, images, icons) are large enough to be readable
Touch targets are not too close to each other
You can check out your mobile usability for
any issues in Google Search Console > Experience > the Mobile Usability report. If something is wrong, there will be details: Alternatively, you can check that out separately
for each page with Mobile-Friendly Test. Now, let's audit your code and scripts, as these are technical factors that may severely impact your SEO. You need to keep your code clean and neat. If there are unnecessary elements (for example,
multiple head elements, etc.), it may slow down page speed.
What you need is to minify your source code and get rid of unused JavaScript and CSS. For that, you need to detect this unused code first. Once detected, remove it to
speed up your page load. You can’t do any SEO or marketing without
analytics tools. If something is wrong with them, tracking becomes impossible or, even worse, your data can get into the hands of third parties. So you need to study your source code for
the analytics snippets and make sure they are set up correctly.
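For reference, a standard Google Analytics 4 snippet looks roughly like this (with a placeholder measurement ID) – check that yours appears once, unmodified, on every page:
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>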
If you have similar content on several pages of your site, search engines won't ever understand which page to rank until you tell
them.
That's why you need the rel="canonical" element in your page markup. It tells search engines which version should be given preference.
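A canonical tag sits in the page's head and points to the preferred URL (a hypothetical example):
<link rel="canonical" href="https://example.com/preferred-page/">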
There may be broken canonical links or multiple canonical URLs. To detect if there are any, go to Site Structure
> Pages and add the needed columns in WebSite Auditor’s workspace – Rel Canonical and
Multiple Rel Canonical URLs: Titles and descriptions should not only be present on your pages, but also be descriptive and contain a keyword. There should also be no duplicates among them. Besides, both titles and descriptions should
be of optimal length.
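For reference, both live in the page's head and might look something like this hypothetical example:
<title>SEO Audit Checklist | Example Site</title>
<meta name="description" content="A step-by-step SEO audit checklist covering redirects, localization, Core Web Vitals, and more.">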
You can spot issues with your meta titles
and descriptions in WebSite Auditor > Site Audit > On-Page:
H1-H6 tags’ aim is to inform search engines of what your page is about, and what its structure
is. Plus, they help users navigate the page. Examine your H tags and make sure:
They are consistent and hierarchical. As a rule, H2 are main points, H3 are sub-points,
and so on. It's not advisable to go deeper than H4 in your content structure; otherwise, it will be too difficult for users to comprehend. There is a single H1 on the page. The H1 heading is the most important one and should concisely summarize the content's key message in one phrase. Basically, it's your headline.
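For instance, a consistent heading outline might look something like this (hypothetical headings):
<h1>How to Do an SEO Audit</h1>
<h2>Audit your redirects</h2>
<h3>Redirect chains and loops</h3>
<h2>Audit your localization</h2>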
You can conveniently audit your H1-H6 tags in WebSite Auditor > Site Structure > Pages > the On-page tab. You just need to add the appropriate columns
in your workspace. Robots meta tags help control crawling and
indexing. These are the most common values added to
the robots tag:
Index
Noindex
Follow
Nofollow
None
Nocache
Nosnippet
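For reference, a robots meta tag sits in the page's head and combines these values (hypothetical examples):
<!-- Allow indexing and following links (the default behavior) -->
<meta name="robots" content="index, follow">
<!-- Keep the page out of the index and don't follow its links -->
<meta name="robots" content="noindex, nofollow">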
Very often, these tags are implemented incorrectly. For example, some important pages might be
tagged as noindex or a page may be specified in the robots.txt file and tagged as noindex
simultaneously (which makes the noindex tag ineffective, since a crawler can't see a tag on a page it isn't allowed to crawl). To avoid possible issues, check if any of your pages with a noindex tag got into the robots.txt file. For that, go to WebSite Auditor > Site Structure
> Pages. Add the Robots Instructions column to your
workspace to see those pages. Make sure you rebuild the project, enable
expert options, and untick the Follow robots.txt instruction option so that the tool can see
the instructions but not follow them. Structured data is needed to embrace all your
search appearance opportunities. It helps search engines to understand your
page faster. Besides, using structured data may become
your chance to get yourself rich snippets instead of plain blue links (which may result
in higher click-through rates).
Different types of pages have their own markup, so we'll focus on auditing for faults and finding opportunities. Go to Google Search Console > Enhancements
to see if all markups on your website work as intended. If you stumble across any invalid items, make
sure to check the reasons behind that. If you need to audit a specific page, use
Rich Results Test. It may be that you miss out on something you
can implement but haven’t done yet. You need to detect the features that you can
optimize for.
First, check which pages have structured data
markup and which do not in WebSite Auditor: Site Structure > Pages > the Open Graph & Structured
Data Markup tab: For those pages that aren’t marked up, think
of any possibilities. For example:
Product pricing pages and Product markup
Product videos and VideoObject markup
Product reviews and Review markup
Once opportunities are detected and you are
ready to implement the markup, use Schema Markup Validator to make sure you did everything
correctly.
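For instance, a basic Product markup implemented in JSON-LD might look roughly like this hypothetical sketch with placeholder values:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "description": "A short product description.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>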
Once the rest of the possible issues have been worked out, you need to look at the things that may prevent your page from appearing
in search results. And one of the basic things is an XML sitemap,
of course. There can be several issues here: If there is no sitemap, it’s not an issue
as such. However, without it, the crawler doesn't know
what pages to prioritize. You may want to instruct Google what pages
not to crawl and how often your pages are updated so that they are crawled accordingly. And if you have a huge website with a complicated
structure and high click depth or an international one, you can’t do without an XML sitemap.
If you run a large site, segmenting your sitemaps
by sections is a good SEO practice. For example, if you have an e-commerce website,
you can create a single sitemap for your static pages (Privacy Policy, Copyright Policy, Terms
of Use, etc.) and then different sitemaps for your category pages. Or if you have a business site, you have more
static product pages and a blog section that is updated more frequently. In this case,
you create two different sitemaps. By creating sub-sitemaps, you manage the crawl
budget more effectively. So make sure you have several sitemaps based on how static your pages are.
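A common way to organize sub-sitemaps is a sitemap index file that simply lists them (a hypothetical sketch with placeholder URLs):
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-static.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-categories.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>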
There are a dozen possible faults that may cause issues with your sitemap. It may be due to the wrong format, wrong XML tags, or an incorrect sitemap URL.
You can observe your sitemaps in Google Search
Console > Index > Sitemaps. If there is something wrong with your sitemap, its status will reflect that. A sitemap may become outdated, or you simply might put the wrong URLs into your sitemap. So first of all, you need to check your sitemaps
for pages that shouldn't be included:
Pages blocked in the robots.txt file
Pages with a noindex tag or X-Robots-Tag
Pages redirected with a 301 status code (permanently)
Deleted pages with a 404 status code
Canonicalized pages
And remember that you can always generate
your XML sitemaps right in WebSite Auditor’s Sitemap Generator to avoid any mistakes. Finally, the last thing to check is your robots.txt
file. It enables you to block the crawling of certain pages.
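For reference, a simple robots.txt might look like this hypothetical sketch – double-check every Disallow rule, since one wrong line can block important pages:
User-agent: *
Disallow: /admin/
Disallow: /cart/
Sitemap: https://example.com/sitemap.xml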
Here are the common issues that might appear: you may accidentally block the wrong page, or it may be that you deleted a page or set a permanent redirect but the page remained in a sitemap file. To see what pages are blocked from crawling,
go to Site Structure > Site Audit > Indexing and Crawlability.
Find the Resources Restricted from Indexing
factor and check out the list of pages and their robots instructions. If you see that some important pages were
blocked, fix the issue by removing the corresponding robots.txt rule. There also may be another issue: you block
a page, but it still gets indexed because the page is well-interlinked. You can check which pages are blocked in the robots.txt file but are still linked to with WebSite Auditor. For that, go to Site Structure > Pages and
look over the pages, their robots instructions, and whether they have any internal links. Note: You can use Google’s robots.txt Tester
to spot any issues that occurred. As you can see, there are plenty of things
that can be managed incorrectly in terms of SEO. So, don’t wait until something bad happens
to your site, run an SEO audit regularly.
And don’t forget to download the PDF to
keep the cheat sheet at hand when needed. Thanks for watching this video till the end! I hope you guys found it helpful, and if you
did, give it a thumbs-up and subscribe to our channel! See you in the next video. Bye!