Debugging JavaScript SEO issues

[MUSIC PLAYING] MARTIN SPLITT: Hi, everyone. Thanks for watching this session
on debugging JavaScript SEO issues. In the next 15 minutes,
I will take you on a short journey
in which we will talk a bit about the worries
that a few SEOs still have about JavaScript
and Google Search, then look at the tools available
to SEOs and developers, and then get our hands
dirty on a few case studies from the real world. Now, let's get started
with looking at the basics. Can SEO and
JavaScript be friends? There is a bunch of
history behind this that contributed to various
opinions and answers to this question. Today, the answer
is generally yes.

Sure, as with every
technology, there are things that can
go wrong, but there's nothing inherently
or categorically wrong with JavaScript
sites and Google Search. Let's look at a
few things people tend to get wrong about
JavaScript and Search. The number one
concern brought up is that Googlebot does not
support modern JavaScript or has otherwise very
limited capabilities in terms of JavaScript features. At Google I/O 2019, we announced
the Evergreen Googlebot. This means that Googlebot
uses a current stable Chrome to render websites
and execute JavaScript and that Googlebot follows the
release of new Chrome versions quite closely.

Another worry concerns the two waves of indexing and the delay between
crawling and rendering. Googlebot renders all
pages, and the two waves were a simplification of the
process that isn't accurate anymore. The time pages
spend in the queue between crawling and rendering
is very, very short– five seconds at the
median, a few minutes for the 90th percentile. Rendering itself,
well, takes as long as it takes a website
to load in the browser. Last but not least, be wary
of blanket statements that paint JavaScript as
a general SEO issue. While some search
engines might still have limited capabilities
for processing JavaScript, they ultimately want to
understand modern websites, and that includes JavaScript.

If JavaScript is used
responsibly, tested properly, and implemented
correctly, then there are no issues for Google
Search in particular, and solutions exist
for SEO in general. For example, you may consider
server-side rendering or use dynamic rendering as a
workaround for other crawlers.
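
To make the dynamic rendering idea concrete, here is a minimal sketch in Node.js with Express, assuming Node 18+ (for the global fetch) and a separate headless-browser rendering service; the bot list, port, and renderer URL are illustrative assumptions, not something stated in this talk:

```javascript
// Dynamic rendering sketch: serve a pre-rendered HTML snapshot to
// crawlers with limited JavaScript support, and the normal
// client-side app to everyone else. Bot list, port, and renderer
// URL are assumptions for illustration.
const express = require('express');

const app = express();
const BOT_UA = /bingbot|yandexbot|baiduspider|duckduckbot|twitterbot/i;
const RENDERER_URL = 'http://localhost:3001/render'; // assumed service

app.use(async (req, res, next) => {
  // Regular visitors get the normal client-side rendered app.
  if (!BOT_UA.test(req.get('user-agent') || '')) return next();

  // Crawlers get a pre-rendered HTML snapshot of the same URL.
  const target = encodeURIComponent(`https://example.com${req.originalUrl}`);
  const rendered = await fetch(`${RENDERER_URL}?url=${target}`);
  res.status(rendered.status).send(await rendered.text());
});

app.use(express.static('dist')); // the client-side app bundle
app.listen(3000);
```

Regular users still get the client-side app; only the crawlers matched by the user-agent check receive the pre-rendered snapshot.
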
When saying, test your site properly, the follow-up question
is usually, well, how do I test my site properly? And luckily, we have a
whole toolkit for you to test your site
for Google Search.

Let's take a look
at what's available. The first tool in your tool
belt is Google Search Console. It's a super powerful tool for
your Google Search performance. Besides a ton of reports, it
contains the URL inspection tool that lets you
check if the URL is in Google Search, if
there are any issues, and how Googlebot sees the page. The second tool that
is really helpful is the Rich Results test. It takes any URL or lets you
copy and paste code to check. Its main purpose is to show
if structured data is correctly implemented, but it has much
more to offer than just that. Last but not least, the
Mobile-friendly test is similar to the
Rich Results test. On top of the rendered HTML,
the status of all embedded resources and network
requests, it also shows an above-the-fold
screenshot of the page as well as possible mobile
user experience issues. Now, let's take these
tools for a spin. I have built three websites
based on real cases that I debugged in
the Webmaster Forums. The first case is a single-page
application that does not show up in Google at all.

As I am not the
owner of the domain, I don't have access to Google
Search Console for this site. But I can still take a look. I will start with a
Mobile-friendly test to get a first look at
the page in question. As we can see, the page loads
but shows an error message. When I load the
page in the browser, it displays the data correctly. We can take a look at
the resources Googlebot tried to load for this page.

Here we see that
one wasn't loaded. The api.example.org/products
URL wasn't loaded because it's blocked
by robots.txt. When Googlebot renders,
it respects the robots.txt for each network request
it needs to make– the HTML, CSS, JavaScript,
images, or API calls. In this case, someone prevented Googlebot from making the API call by disallowing it in robots.txt.
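
In robots.txt terms, the blocking rule looked something like this, using the path from the test above:

```
# robots.txt on api.example.org
User-agent: *
Disallow: /products
```

Removing the Disallow rule, or explicitly allowing the path with an Allow line, lets Googlebot make that API call while rendering.
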
As a result, the web app handles the failed API request as a Not Found error and shows the corresponding message to the user. We caught this as a soft 404,
and as it is an error page, we didn't index it. Take note that there are safer ways to show a 404 page in a single-page app, such as redirecting to a URL that responds with a 404 status code or setting the page to noindex. Right. We solved that one. That's pretty good.
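
Here is a rough sketch of those two safer options, assuming a fetch-based single-page app; the API URL, route, and element ID are made up for illustration:

```javascript
// Handle a failed API request in a single-page app in a way search
// engines understand. The API URL and DOM id are assumptions.
async function showProduct(id) {
  const response = await fetch(`https://api.example.org/products/${id}`);

  if (!response.ok) {
    // Option 1: navigate to a route the server answers with a real
    // 404 status code:
    //   window.location.href = '/not-found';

    // Option 2: stay on this URL, but mark the error state as noindex
    // so it never shows up as a soft 404 in search:
    const robots = document.createElement('meta');
    robots.name = 'robots';
    robots.content = 'noindex';
    document.head.appendChild(robots);
    document.getElementById('app').textContent = 'Product not found.';
    return;
  }

  const product = await response.json();
  document.getElementById('app').textContent = product.name;
}
```
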

All right. Onto the next one. This one is described as
a progressive web app, or PWA, that didn't
show up in search, except for their home page. Let's go find out why. Looking at the home
page, it looks all right. The other views in this
progressive web app also load just fine. Hmm. Let's test one of these pages. We will use the
Mobile-friendly test again to get a first look
at what's going on. Oh. The test says it
can't access the page, but it worked in the browser. So let's check
with our DevTools. In the Network tab, I see
that I get a 200 status from the service worker, though. What happens when I open the
page in an Incognito window? Oops. So the server isn't
actually properly set up to display the page. Instead, the service
worker does all the work to handle the navigation. That isn't good. Googlebot has to behave
like a first-time visitor, so it loads the page without the service worker, cookies, and so on.

This needs to be fixed on the server, so that the first request to any URL already returns the content.
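
What "fixed on the server" might look like, sketched with Express; the file paths are assumptions, and in a real app you would ideally return fully server-rendered HTML for each route rather than just the app shell:

```javascript
// Sketch: answer every app URL on the server, so first-time visitors
// (and Googlebot) get content without depending on a service worker.
// File paths are illustrative assumptions.
const express = require('express');
const path = require('path');

const app = express();

// Serve the built assets (JS bundles, CSS, images) directly.
app.use(express.static(path.join(__dirname, 'dist')));

// Fall back to the app shell for deep links like /products/42, so a
// first request to any route returns a working page. Even better:
// return server-rendered HTML for the route here.
app.use((req, res) => {
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});

app.listen(3000);
```
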
Great. Two websites fixed, but I have one more to go. This one is a news
website that is worried because not all content can
be found via Google Search. To mix things up
a little bit, I'll use the Rich Results
test for this one. The website doesn't seem
to have any obvious issues. Let's look at the rendered HTML. Hmm. Even that looks fine to me. So let's take a look at
the website in the browser. So it loads ten news stories
and links to each news story, and then loads more
stories as I scroll down.

Do we find that in
the rendered HTML too? Interesting. This story isn't in
the rendered HTML. It looks like the initial
ten stories are there, but none of the content that
is being loaded on the scroll. Wait. Does it work when I
resize the window? Oops. It only works when
the user scrolls. Well, Googlebot doesn't scroll. That's why these
stories aren't loaded. That's not a hard problem to fix, though. This can be solved by using an intersection observer, for instance.
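
A minimal sketch of that, with an assumed /api/stories endpoint and element IDs: an intersection observer fires when a sentinel element becomes visible in the viewport, regardless of whether a scroll event ever happens, so it also works during Googlebot's rendering, which uses a tall viewport instead of scrolling:

```javascript
// Infinite scroll driven by an IntersectionObserver instead of scroll
// events. The endpoint and element ids are assumptions.
const list = document.getElementById('stories');
const sentinel = document.getElementById('sentinel'); // empty div after the list
let page = 1;

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting) return;

  // Fires whenever the sentinel becomes visible, whether the user
  // scrolled there or the viewport is simply tall enough.
  page += 1;
  const response = await fetch(`/api/stories?page=${page}`);
  const stories = await response.json();

  for (const story of stories) {
    const link = document.createElement('a');
    link.href = story.url; // plain links keep the stories crawlable
    link.textContent = story.title;
    list.appendChild(link);
  }
});

observer.observe(sentinel);
```
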
Generally, I recommend checking out the documentation at developers.google.com/search
for much more information on this and other topics. I hope this was
interesting and helped you with testing your websites
for Google Search. Keep building cool stuff
on the web and take care. [MUSIC PLAYING].
