English Google SEO office-hours from October 2, 2020
JOHN MUELLER: All right,
welcome, everyone, to today's Google SEO
Office Hours Hangout. It's not Hangout anymore. I guess the Hangout part we'll
have to drop at some point, but Google SEO Office
Hours, I guess. I'm John Mueller. I am a search advocate here
at Google in Switzerland. And part of what we do are
these Hangouts, office hours, I guess, so awkward
with the names. They keep changing. But, anyway, a
bunch of questions were submitted already on
YouTube, which is fantastic, and it's good to see a
bunch of you here as well. If any of you would
like to get started with the first question,
you're welcome to jump in.
AUDIENCE: Hi, John. JOHN MUELLER: Hi. AUDIENCE: Can I start? JOHN MUELLER: Sure. AUDIENCE: OK. First of all, thanks because
the [INAUDIBLE] problems, I don't know if you remember
from a couple of months ago, have been fixed, finally. So thanks. [LAUGHS] JOHN MUELLER: Woo. AUDIENCE: Now it appears. JOHN MUELLER: OK. AUDIENCE: Thank you very much. This is the question. If I look at the
statistics of the Google Search Console in
the past six months, I see that the number of impressions has nearly doubled. So in six months, Google has gone to showing the site twice as often to people on the internet. So I can say, OK, this is very
good because maybe Google is liking the site now. But at the same time,
the average position went down, significantly down. And so the overall number
of clicks is the same. So how can I interpret this? I mean, what needs to be changed
or to improve the situation because these are two
different signals? One is positive, and
the other is negative.
JOHN MUELLER: Yeah,
so with any kind of changes that you see
in the performance report, my recommendation
is always to try to drill down and find some
examples of this change in a way that is very visible. So, for example, you
could look to see if there are specific
pages that have changed in like the ranking or in
the number of impressions or if there are specific queries
where you see this change. So one thing that
might be happening is that maybe just more
people are searching. It might also be
happening that you're showing for slightly
different queries than before, where
there may be more impressions for those
queries, but you're not quite in that competitive range yet. So all of these things can help
you to figure out a little bit better where your site is
positioned at the moment and what you might be able
to do to improve there. So, for example, if
you're being shown for more competitive
queries, more things that more people
are searching for, then that could be a sign that,
like, Google has recognized that you're relevant for
those kind of queries, and now it's up to
you to really improve the quality of your website
overall so that Google says, well, this is actually a
good result for this query.
And it's something
that, ultimately, you have to figure out with
your business goals in mind and think about, like,
is this the query that I think my site
should be ranking for, or maybe is Google showing
it for the wrong query? And if so, how can I
help Google understand my content a little bit better? So it's not just
like blindly focusing on the number of impressions
and the number of clicks and the number of the
rankings that you have there, but rather trying to
figure out how can you position your
business in a way that makes sense for your business,
for your business goals, and kind of brings it into
the right place for users. AUDIENCE: OK, thank
you very much. JOHN MUELLER: And it's sometimes
really tricky with these kinds of things because– also because of things
like internationalization. So that's something
that we sometimes see with our own content. For example, I don't know,
the Webmaster Central blog in Canada sometimes
ranks for the word Google.
And you can imagine then
the number of impressions from Canada is suddenly really
high, but the number of clicks is really low because
normal people, when they search for
Google, they don't want to go to the webmaster blog. So it's something where
sometimes you can do something to control it, and sometimes
it's just, I don't know, the algorithms being
a little bit weird. AUDIENCE: OK, thank you. JOHN MUELLER: Sure. AUDIENCE: Thank you, John. Thank you very much. JOHN MUELLER: All right, any
other questions before we jump into the submitted ones? Sergio, I think you're
trying to say something, but I can't hear anything. AUDIENCE: Well, he did post the
question in the chat, though? JOHN MUELLER: Oh, in the chat. OK. All right, let me see. Oh, OK, long question. "One of my client sites has been
around for more than five years and holds good rankings for
keywords related to hypnosis, self-help, binaural waves,
and subliminal messages, as a form of therapy.
We've noticed over
time, especially after the medical
and BERT updates, that these rankings dropped
to the second and third page of results." Oh, man, I see you're
struggling with the microphone, but I'll just read
for the moment. And it's like, if you
get online, no problem. "The content has always
been great in comparison to the pages that
now rank on page 1. I can see that the first five
positions for many keywords have been awarded to
medical and psychologically authoritative sites, which is
great, but position 5 to 10, there are many low quality
sites in comparison." And then there's
some examples there. I think you also
submitted the question. Let me see if I can
find your question. "I do see that the
site is thriving in other areas and other
pages that we consider more supplemental in content,
while the main content seems to be plummeting.
My suspicion is that
Google thinks the site and its author's
areas of expertise are not relevant
enough to be trusted to award better rankings. Could that be the case? If so, how could we
recover from it?" So it's definitely
possible that this is the kind of thing that is
happening here, especially when it comes to, I think,
medical kind of queries, it is something
where our algorithms try to be a little bit more– I don't know, more
critical, with regards to the site and the
information that we find there. So that's something where
I would try to find a way– oh, I can hear
something from you. AUDIENCE: Yeah, I
think I've fixed it. JOHN MUELLER: OK. AUDIENCE: OK, cool. Cool. Yeah, I just would like to add. I didn't want to
really interrupt you, but I wanted to add– I mean, I've been
looking at old queries, and it seems like the pattern
indicates that the specialized sites like the psychology and
all this traditional, I would say, medical institutions
would dominate those rankings, top five positions. But then let's say that the
alternative sorts of ways of treating those
mental illnesses are falling a bit after that
between 5 and 10 sometimes.
So my question really
is, is it worth it to keep on trying
going after those queries, or is it something that maybe
it's never going to happen? Maybe we should just
do something else. JOHN MUELLER: I would
never say that it's never going to change because
the web is very dynamic, and these things do change. But I think just
leaving it as it is and hoping that it magically
changes on its own, I think that's generally
a bad strategy.
So on the one hand, I
think kind of shifting the focus a little bit
and trying to find queries where it makes more sense
to show these sites, that might be an option. The other idea might
be really to find a way to really highlight
why your site is really the one that should be shown
for some of these queries. And I realize that's kind
of tricky in the sense that my usual advice
is to say, well, you should make sure
that your site should be ranked number one by far.
And it sounds like, in your
position, you're saying, well, the number one
rankings are kind of OK, and we agree with those. We would just like to be kind
of on the first page as well. And I think that's kind
of a tricky balance. So one thing I would
recommend, though, is to look at what some
of the other people in the SEO area around E-A-T
have been writing about. There are lots of
really good, I think, case studies or
examples of sites that worked hard to improve
their expertise, authority, and trust in terms of how
they present their content and in terms of how they create
the content, how they have things like author
profiles, all of this.
That might be something
where you could look into it. So I don't know all
of the names offhand, and I feel if I
mentioned some of them, then the other ones that
I forget would be upset. But there is some
really good SEOs that have been
working in this area, have been working with
medical sites on this topic, and I would try to
search out their content and look at some
of those examples. I can't guarantee
that if you improve your site in that
regard, that, suddenly, our algorithms
will say, oh, this is really a fantastic site. But it sounds like you're
touching on the medical area there, and it definitely makes
sense, at least for users, to make sure that you
have all of those signals as well, that you're
really saying, well, this is not something
that, I don't know, some kid dreamt up
in his basement, but, rather, we spent a
long time researching this, and here is the research, and here are the sources behind it, and kind of presenting it
in a way that is trustworthy for users as well.
And then sometimes we can pick
that up for searches as well. AUDIENCE: Thank you. Thank you, John. Perfect. JOHN MUELLER: OK. AUDIENCE: Hey, John. JOHN MUELLER: Hi. AUDIENCE: Question. Hi. JOHN MUELLER: Go for it. Sure. AUDIENCE: OK, my question
is regarding– well, one first on indexing and ranking. So, for example, there
is a website that is on mobile-first indexing. So from this point of
view, I think Google will– what you say, will use
mobile-first content indexing signals for ranking sites
on desktop and [INAUDIBLE].. Is that right? JOHN MUELLER: Yes. Yes. AUDIENCE: OK. Now, the second question,
is if the website is on mobile-first indexing,
but if the majority of traffic is coming from desktop
because their product is a desktop-only product,
it's a software application that runs on desktop only. So the majority of traffic
is coming from desktops. In that case, will you
consider that desktop content for rankings, or it
will be still considered the mobile version for rankings? JOHN MUELLER: It would
only use the mobile version for rankings.
So when we shift to
mobile-first indexing, or if we've already shifted that
site to mobile-first indexing, then we will only use
the mobile version for indexing and for ranking. So in a case like that,
one thing you can do is if it's really
primarily a desktop site, like, for example, if it's
a software company that only sells desktop software, then
you can just make a desktop site and don't have kind
of a simplified mobile version for that. But it doesn't matter where
the traffic is coming from or what type of users are
searching for the site, if it's in
mobile-first indexing, we only use the mobile
version for indexing. AUDIENCE: Sure. Thank you. This was confusing, basically,
because most of the time, on desktops we were confused
if Google is considering desktop signals or not.
JOHN MUELLER: Yeah. Sure. AUDIENCE: John, quick
follow up on that. I know we talked last time
about a classified site I'm working with. In fact, they're still on desktop, and one of the reasons
it wasn't migrated yet was because of differences
in content for pages with very high impressions. So I checked a bit
into that, and I noticed pages with the
highest impressions are usually kind of
like category pages for classified ads. And those change very fast. I mean, there are a lot
of– it's one of the biggest classified ads in Romania. So the pages
themselves change a lot in a very short amount of time. So I was wondering
when Google checks whether the desktop version is
similar to the mobile versions in terms of content,
is there any delay? Does it check it at
this exact same time, or is there some time passing
between the individual tests? JOHN MUELLER: I mean, there
is always a delay because, like, you can't
really do everything at exactly the same time.
But, in general, we should
be able to deal with that. So, like, on news
sites, we kind of have the same problem that there
are always new news articles. And my understanding
is we should be able to deal with that. What I have seen on– not on classified sites in
particular, but sometimes on e-commerce sites, is when
you look at category pages, sometimes the
mobile category page has a lower number
of articles on it. So on desktop, you have kind
of like this matrix of content where you have, I don't
know, maybe 50 items on the desktop category
page, and on mobile, you just have 10 because you
have this one row or one column of content.
So that's something where we
could kind of pick that up on. And I think we have
that in the last blog post on mobile-first
indexing as well. We kind of touched
on that subject. I think, in general,
in a case like this, I wouldn't necessarily
worry about it if you're sure that
kind of like this is how we want to present
it, and if we index the mobile version, it's like
we have all of the content. It's just a different
number of items, and it's like it's all the same. Then in a case like that,
I would just leave it be. It's not that you're going
to rank better or higher with mobile-first indexing. At some point, we'll
just shift the site over and say, oh, we've
waited long enough.
And if you're happy
with the mobile content and the desktop
content, then it's like, well, nothing will
change, right? So that should be fine. AUDIENCE: Right. It makes sense. In this particular case, the
number of individual listings is the same, but, again, due to
the freshness of the new ads, the actual content
changes in minutes. So if there's, like, if we're
talking about minutes, in terms of delay, it's already
different content, especially for those very high
impression categories. JOHN MUELLER: Yeah. I could imagine that maybe
that throws us off, yeah. AUDIENCE: But you mentioned it
doesn't really matter anyway because whether it's on desktop
or you shift on mobile, it's just going to be
the same content, I mean, new content anyway.
JOHN MUELLER: Yeah. Yeah. Yeah, that's an
interesting one, though. I mean, I think with
mobile-first indexing, it's always kind of a case of if
your site is not shifted over, do you have a problem or not? And as long as you can
determine that, actually, like, our site is OK, and
it's just Google that is a bit confused,
then, like, I would just leave it be and let it
kind of take its time. AUDIENCE: Right. I mean, if it was an
internal linking problem, that would actually
be more of a concern because then ranking signals
might not be transferred correctly and so forth.
But in this case,
it's not the case. JOHN MUELLER: Yeah, cool. AUDIENCE: Thanks. AUDIENCE: Hey, John. JOHN MUELLER: Hi. AUDIENCE: Quick question. So I've been working with
a news media publisher, and they did recently– in early August,
they did a redesign. And after two to three
weeks, their traffic dropped by almost 40%. So they are quite big
in terms of traffic and in terms of keyword
rankings and authority. And the only thing that
changed during this, let's call it, light redesign,
was some fonts and not a lot in terms of layout changes. And we've tried different
things to kind of investigate why this drop happened,
worked on page fit issues that were there before, but maybe
could have been the reason.
We know that some internal
linking might have changed, but that doesn't totally
explain why the drop happened site-wide, not just on
some specific articles and other things. But one thing that
we are suspecting that might be the case is
the page layout penalty. So here's what happened. When the article starts,
there is a video there, and sometimes, in that video,
an ad plays, not all the times, but in some cases.
When we are auditing the
pages through Lighthouse, in some cases, it shows
that there is no ads. In some cases, there is. So we're suspecting that that
is causing Google to see that there are a lot of– like there is this
ad on mobile that's above the fold that is
considerably taking space, and that was there before the redesign, in the same place.
But our suspicion is that
because of the light redesign, Google totally re-evaluated the
whole site and now is thinking that this shouldn't be there,
and it's a bad user experience for the users. And for that reason,
we're suspecting that this might be the
reason for the drop because we are
struggling to find one reason to explain
the site-wide traffic drop of about 40%. JOHN MUELLER: Yeah. I don't think a redesign
would trigger a re-evaluation in terms of understanding
the page layout.
So that seems like something
that would be unlikely. It's hard to say– with what you're saying
about the redesign, it sounds like that wouldn't
be playing a role with kind of how we rank things there. But sometimes, there
are subtle effects that aren't visible at first. So, for example, if
you were to shift to a JavaScript framework
rather than a static HTML site, then it could look very similar,
but from a technical point of view, it would
be very different. AUDIENCE: Which we didn't. JOHN MUELLER: OK. AUDIENCE: In our case. JOHN MUELLER: Yeah, so one
item less to worry about. I mean, not that JavaScript
is bad or problematic. AUDIENCE: Of course. Of course.
JOHN MUELLER: It's just
like these kind of changes are pretty big. I don't know, offhand, I
think if you have videos on the pages, and they're towards
the top of the page, then that should be
something that we would be able to recognize as kind
of like a video landing page rather than to assume
that the video is an ad. AUDIENCE: So just to
clarify because maybe I didn't give enough insights,
so, basically, it's a news site, and the pages are
a news template. So, basically, it's
just the standard stuff that you see in all
like news media sites, and nothing, let's say,
very different from that. In terms of the
videos, like, you've seen those videos when
you go to some news site that they are in on
top, not necessarily super relevant to that article,
but they are on the same topic, like, let's say, there is an
article about Google launching the new Pixel phone,
and then we are just talking about the Pixel line,
not the specific new phone that was launched.
And sometimes some, on
that video, an ad plays. Some other things that we've
been working on is the CLS. So one thing that we've noticed
compared to other publishers is that we had this shift
happening in our site because for some images and some
ads, the space wasn't reserved, and it would kind of
push the pages down a bit when the ad would load or,
like, sometimes the ad would be a medium rectangle, and
sometimes another format would show, and this wasn't
always predictable. We were addressing that, but,
again, we are not very sure if that could have been
the reason for a 40% drop, and that issue has been
before the site redesign. So, again, it's very tricky to
pinpoint what the exact reason could have been. JOHN MUELLER: I don't
think the CLS metric itself would be the cause
because we don't use Core Web Vitals as a ranking factor yet. So just kind of the CLS side
itself wouldn't be an issue.
It might be that there
is some indirect issue that, like, users are
confused, and they start hating your website
because of this shift. And then, like, over
time, you see an effect, but that wouldn't be
an immediate effect. AUDIENCE: Yes, and it's a
very light redesign, meaning that some users–
like, it's just small improvements on making
some things clearer and cleaner in general. So we double checked
a lot of things. It's not a tracking
issue because you're seeing the same things
on Search Console, and [INAUDIBLE] is reporting,
and one interesting thing is that for the top-three rankings, where we had the large drop, the pages are staying on the first page, but we are just not ranking in the top three anymore. And this has had quite
some effect on the traffic that you get. JOHN MUELLER: So I
don't know your website. So it's really hard to say,
but my general feeling here is that this is less
of a technical issue from the redesign and more
a matter of our algorithms just re-evaluating the
overall site quality.
So that's something where it's– especially when you're looking
at the top queries for a page, or for a site, then you might
see that those are still ranking really well because
we think your pages are really relevant for those queries. But if the general shift
is a little bit downwards, then it's almost
like, well, overall, we think your site is a
little bit less, I don't know, less important from a
quality point of view. So it's like, overall,
shifted a little bit down, but the top ones are
still very relevant. So we still show those. So just from hearing
these things, it sounds like it's
something in that regard. And it's really
hard for me to say.
Like, even if I were able to figure it out more specifically, if I had the site and were able to use my tools, my feeling is that this
is just an overall shift in how we evaluate the
quality of the site. AUDIENCE: Yeah. May I share this site with
you in private on Twitter, just for you to take a look? JOHN MUELLER: Sure. Yeah. AUDIENCE: Perfect. JOHN MUELLER: Or, I mean,
you can also leave it here in the chat. I pick up the chat afterwards
and look through the URLs that are submitted there. So if you have, like, a comment
and just drop it in the chat, then I can pick it up here too. AUDIENCE: [INAUDIBLE] JOHN MUELLER: I'm
happy to take a look, but if it's just like a
general shift in a way that we understand the
quality of your site, then I don't think I'd even
have anything specific to say because I wouldn't be able to
say, oh, on this page you need to improve this.
It's just, overall,
our algorithms are a little bit more
critical with regards to news and maybe, like, overall
things could be improved there. AUDIENCE: Yeah, the
only thing that kind of had made this very tricky for
us is that it was immediately after the light redesign. So it could be a coincidence,
but we couldn't base it on that– JOHN MUELLER: Yeah. AUDIENCE: Yep. OK. I'll share that with you. Thanks a lot. JOHN MUELLER: Yeah,
I think it's always tricky with these things
that happen at the same time, but we make so
many changes, it's really hard to kind of,
like, avoid some coincidences sometimes. AUDIENCE: Yep.
Yep. Thank you. JOHN MUELLER: OK. Let me run through some
of the submitted questions so that we don't
lose track of those, and we'll definitely have
more time towards the end to discuss more other topics. Follow up to the last Hangouts
question on naked URL links without anchor text. So naked URL is essentially
just when someone uses the URL as the anchor text
instead of a text or word for the anchor text. "Does it mean that
without the anchor text there is no value in that link? Like, this great website
is linking to another site with just the URL. Would you pass some
of the greatness to the linked site, even though
there is no anchor text?" Yes, there is absolutely value
in a link without anchor text.
But, of course, we just
don't have as much context. So you could imagine
a situation where, even internally
within your website, you just use the link itself
without an anchor text, and, of course, we would be
able to crawl your website. Of course, we would be able to
figure out which of these pages are important. But we would lose a
little bit of value or a little bit of
context from that link.
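To make that concrete, a rough sketch of the two variants (the URL and anchor text are placeholders):

<a href="https://example.com/blue-widgets">https://example.com/blue-widgets</a>

<a href="https://example.com/blue-widgets">handmade blue widgets</a>

Both links pass value, but only the second one tells us, through its anchor text, something extra about the page it points to.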
So it's a bit, like, I don't
know, making a page with a text file instead of
an HTML file where you can format things and
specify headings and titles and things like that. So you could do that, but we'd
lose a bit of information. "I updated my content
at the end of last week, and Google only picked up the
updated content yesterday.
I've never seen such a delay. Usually, it takes at most
a few hours to update content. Why does it currently
take so long? Will it be back to normal soon?" So we don't guarantee any
specific time with regard to crawling and indexing,
and depending on the site and where things
change within the site, sometimes we can
pick up those changes within a couple of minutes,
sometimes hours or days, and, sometimes, it takes months
to pick up those changes. So that's something where,
from our point of view, it's not that there is a
clear, like, minimum response time for indexing, but
rather depending on the pages that you change,
sometimes it takes longer.
Sometimes it's very quick. For example, if
we recognize this as a page that is very
critical to your site where things are changing
very frequently, then probably we will pick those
changes up fairly quickly. On the other hand,
if we realize this is actually a page that has been
the same for the last 10 years, and you make a
change there, then probably it will take a
couple of months for us to realize that actually
you changed this page. So from that point of view,
there's a wide range there. What you can do to make it so
that we pick up these changes a little bit faster is to let
us know about the changes. So you can do that
with a sitemap file. That's very common.
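For reference, a minimal sitemap file is just an XML list of URLs, optionally with last-modification dates, along these lines (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/updated-page</loc>
    <lastmod>2020-10-01</lastmod>
  </url>
</urlset>

The lastmod date is the part that signals a page has changed and might be worth recrawling sooner.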
Most CMS systems, if you're
using something like WordPress or Blogger or whatever,
they will automatically generate sitemap files
or feeds for you, and then it's just a matter
of you kind of submitting that feed to Search
Console, and then once that feed is submitted,
then all of the updates automatically go to Google. So that's kind of
the fastest way to get things automatically
get picked up. If there's something really
critical on your site that is changing, and you
really want to make sure Google picks that up
as quickly as possible, then you can also
use the Inspect URL feature in Search Console
and submit the change there.
I would really only use this
for exceptional purposes. So if you're just
changing text naturally, if you're adding a
few new articles, then there's no need to use
any of the kind of Submit URL features. But if there is something
really important and urgent that you need to change,
then that might be an option. "If we submitted an
incomplete sitemap, does it affect the
overall search performance of the website? All web development work
is done by another agency, and we already sent
recommendations to replace the old
sitemap with a new one.
So until this is done, will
this affect our search campaign at the moment?" Probably not. So the reason I'm
saying probably not is if your sitemap file is
missing the pages that you care about, kind of the new and
updated pages, then, of course, we need a little bit
longer to actually find all of those new and updated pages. So that's kind of the
potential downside. On the other hand,
a sitemap file only helps us to crawl
a little bit better.
So it's not the case that
our crawling would only focus on the sitemap file, and
we would crawl kind of the– or we would kind of suspend the
normal crawling of the website and only focus on the sitemap. That's not the case. It's really the case that we
crawl your website normally anyway. And then the sitemap
file helps us to crawl a little bit better. So if you make changes
on your website, and they're picked up
through the normal crawl, then that's perfectly fine. The sitemap file doesn't
change anything with regards to ranking. So if you have an
older sitemap file, then that doesn't mean that
your pages will rank in any way differently.
It's really only about
this crawling part where we might want
to pick up changes a little bit faster if you make
specific changes on your site. And if you have an old sitemap
file, then that doesn't get picked up. One thing I'm kind
of worried about here with this question is that
you're telling the agency to use a newer sitemap file,
where, really, what you should be doing with a website
like this or, in general with a website, is
to have your sitemap file generated automatically. So instead of kind of manually
replacing the sitemap file, you should make sure that you
have a system in place that automatically generates
it all the time, just so that any time you make
changes within your website, then those changes get
picked up automatically, and there is no kind
of manual step involved in getting that updated.
So that's kind of
my recommendation. One thing you can do,
depending on your setup, especially if you're
at a company where you have different
departments working on different parts
of your website, and you have kind of
a marketing department that does the website and
then maybe a tech part that does something different, you
can host your sitemap files somewhere else. So in your robots.txt
file, you can specify the location
of your sitemap file, and that can be somewhere else. It could be even on
a different domain. So if you have access to
the back end of your server and know when things
are changed, then you could put your sitemap
file on a separate domain, just for sitemaps,
for example, and use that as a way of
submitting always live sitemap files,
even if the content itself is something that
takes longer to be updated. So that might be an option there.
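As a sketch of that setup (both hostnames are placeholders), the robots.txt file on the main site would just reference the externally hosted sitemap:

User-agent: *
Allow: /

Sitemap: https://sitemaps.example.net/example.com/sitemap.xml

The Sitemap directive accepts a full URL, which is what allows it to point at a different domain.

"How to write a canonical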
tag and use a sitemap for multilingual websites?" Wow. So many sitemap questions.
So cool. So the canonical tag,
it's not really a tag. It's a link element
that you place into the head of the
page, the web page itself. So it's not something that
you would put into the sitemap file, but rather it needs to
be in the HTML page itself. And that's something
that needs to be in a specific format in a
specific part of the file so that we can process
that and trust it. Like, depending on how
you create your pages, you might need to look
into that in particular. For a multilingual
website, you can use the hreflang annotations
between different language versions, and these
different language versions, you can put either into the head
of your page, like with the rel canonical, or into
a sitemap file, and that's, I guess, a little
bit different from the rel canonical in that the hreflang
annotations can be either one. With regards to the mix
of canonical and hreflang, the important part is that
all the individual language versions of your pages should
be canonical to themselves.
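Putting that together for, say, a French page, the head might look something like this (the URLs are placeholders):

<link rel="canonical" href="https://example.com/fr/page" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page" />
<link rel="alternate" hreflang="en" href="https://example.com/en/page" />

The page is canonical to itself, and the hreflang annotations list all of the language versions, including the page's own.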
So the canonical tells
us which of your pages you prefer to have
indexed, and if you say, for example, the
English version is my canonical for the French
version, then we may say, well, then we don't need to
process the French version. We will only index
the English version. And, usually, that's
not what you want. Usually, you'd like to have
all of the different language versions indexed individually. So lots of different
answers there. I don't know which of
these might help you there with your question. It's a little bit
vague, but, hopefully, that helps refine
things a little bit. "I've been trying to feature
my website on Google News. I successfully submitted
my website in Google News three or four months back. Is it mandatory to add
a new sitemap in order to feature it in that section? Any other recommendations
would also be helpful." So I don't know too much
about Google News with regards to how to get things
submitted there, and into the Google
News side of things.
So I can't really
help you there. I have heard from other
people that things are a little bit backed
up with regards to getting new websites into Google News. So maybe that's
something where you'll need to be a bit more patient. With regards to the news
sitemap, I don't know for sure. So I do know it's
something that we strongly recommend to use a news sitemap
because, especially on news sites, it's really critical that
we pick up the news content as quickly as possible.
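For what it's worth, a news sitemap is a regular sitemap with an extra news namespace per URL, roughly like this (the URL, publication name, title, and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://example.com/news/big-story</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2020-10-02T08:00:00+00:00</news:publication_date>
      <news:title>The Big Story</news:title>
    </news:news>
  </url>
</urlset>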
So that's something
kind of to keep in mind. But we do have a lot of
this documented in the News Publisher Help Center. So I would strongly
recommend going there. And I believe there's
also a News Publisher Help Forum where you can ask
more specific questions on these kind of things. But, also, like I mentioned
in the beginning there, I have heard from
people externally that kind of getting new
sites into Google News is a lot harder now or it takes
a lot longer time, so kind of– yeah, I don't know, just
to set expectations. "How breadcrumbs help SEO– does the breadcrumb schema
show any rich result in Search? Should we include the home
page as a first position in the breadcrumbs schema? Should we add the current
page as the last element of the breadcrumbs schema?" So I think you almost
answered your first question, how breadcrumbs help SEO.
In general, breadcrumbs,
especially when you're talking about the
breadcrumbs structured data, they don't change
anything for SEO. It's not that your pages
rank any differently, but rather that we would
show them differently in the search results
when we understand the breadcrumb markup. So we could show kind of that
breadcrumb trail in the search results as a rich result.
It wouldn't change anything from ranking. It just makes it a little
bit easier for people to kind of jump in at the place
where they would like to be. With regards to the home
page and the current page in the breadcrumb schema,
I am not 100% sure, but my understanding is
that we made it a little bit clearer in the documentation
that this is optional. So you can include your home
page and the final page. You don't necessarily have to. We definitely understand
where your home page is. We definitely understand
where your final page is. So we can interpolate a
little bit from there. The one place where breadcrumbs
could have an effect on SEO is less around structured data,
but if you use them to actually create links on the page.
And in a case like
that, the effect is essentially that you're cross-linking these different pages. So if you have a website that
has multiple category levels, for example, then you're linking
from one product to maybe the subcategory and to
the higher-level category in the breadcrumb– in the HTML on the page. And, oftentimes, that's
a good thing for users. They can navigate and find
the category that they want. For crawling, it definitely
helps us as well. So that's something that
kind of makes sense, but if you already have this
HTML, and you're wondering, should I add the
structured data or not, then that's really
just a display change. And with regards to
the display, we also try to figure out
which breadcrumbs to show automatically. So it might be, depending on
your website that we're already figuring out which
breadcrumbs to show, you can see this when you
search for your own content.
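For completeness, the breadcrumb structured data itself is typically a JSON-LD block along these lines (the names and URLs are placeholders; as noted above, entries for the home page and the current page are optional):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Books",
      "item": "https://example.com/books"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Science Fiction",
      "item": "https://example.com/books/science-fiction"
    }
  ]
}
</script>

"I'm running a local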
business website. There are few competitors. My question is, for
a long time, my website and some of my competitor websites have been working very well against some competitors who made a website on WordPress.
But, now, suddenly, for the
past two to three months, all of the websites made
on WordPress are on top. And the most frustrating thing
is that their entire content is copy and paste. Their page speed, PA, and
DA scores are worse." Just a quick side
note, PA and DA are not metrics
that Google uses. These are from
third-party tools, but they can be useful
sometimes to compare things. "There is nothing
which I am able to find that they have better than my
website except their backlinks. They have 80% more
backlinks than mine. And all those three to four
websites are made on WordPress and ranking well, and they're
only one to two years old. So the question, are backlinks
still this much important for web ranking?" So I think a few things
were worth mentioning here. On the one hand, the age factor
is not necessarily something where we'd say, well, older
sites deserve to rank higher, or newer sites deserve
to rank higher. Sometimes new content
is very relevant. We will show that visibly.
Sometimes older content is. Sometimes we show
a newer domain. Sometimes we show
an older domain. So it's definitely not the case
that you need an old domain name to rank well. The other question that I
kind of read between the lines is, is WordPress better,
or is a custom CMS better? Is another CMS better? And from our point of view, we
don't care which CMS you use. So it doesn't matter to us
if a site is using WordPress. It doesn't matter to us if a
site is using Wix or Blogger or any of these other systems.
Essentially, we look at the
HTML pages that are generated. And all of these
systems have worked really hard to make
reasonable HTML pages. I think that's one of the really
cool things about the web in, I don't know, the last 10 or so
years, in that if you're using any of the common setups
to create web pages, then chances are,
kind of by default, things will work reasonably
well with regards to search, and you don't need
to do anything custom to make them work even better. So from that point of view,
if WordPress works for you, and even if you're not creating
a blog, but rather maybe a company website,
or shop even, then feel free to keep using
that, or if some other system works for you, then
that's also totally fine. So from that point of view,
I think, like, whether or not a site is on WordPress or
not, does not play a role. With regards to
links, we do use links as a factor in some
of our algorithms, but we use a lot
of other things.
And links are
probably not the one that I would say is the
most critical item here. It's really hard to say much
about this specific case because there's not
a lot of detail here. So this is something where
I would be tempted to say, it would be useful to go maybe
to the Webmaster Help Forum so that others can take a look
to see if there is something specific here, but kind
of the shift from position 3 to position 1, or
the other way around, is something that can happen
for a lot of different reasons. And just because a website
has some things that are worse doesn't mean that it
will automatically rank lower than other ones.
So a really common case
that comes up in the forums and when talking with people is,
for example, maybe a site has hidden content on it somewhere,
and people will come to us and, say, oh, this website
is ranking above mine, but it has hidden content, and
your webmaster guidelines say hidden content is bad. Therefore, you should remove
that website from search. And from our point of view, we
might recognize other things that are good here,
and we might even recognize that there is
hidden content there. But if we can recognize
there's hidden content there, then we can also ignore it. So just because there are
some aspects of a site that are worse than yours
doesn't mean that it will always be ranking lower.
Maybe there are
other things that are actually pretty reasonable. Or what might also
be happening is that these sites in
the search results are actually kind of very
similar or very kind of similar with regards to how they
fulfill the user's need, and then it's something
where if we were to take this to our search quality team and
say, well, like these sites are all very similar, but
the one at number 3 is the one that really
wants to be number 1, then they'll tell us, well,
if they're so similar, then there's no reason for us
to change anything with regard to ranking here. And the way that you
kind of work around this is by making sure that your
website is by far the most relevant one for these queries.
So make it something
so that if we were to take this case to
the search ranking team, then we could go and say, well,
we have a bug in our systems because what we're showing in
the top results here is clearly a bad result, and what
we're showing at number 3 or number 4 is clearly the one
that we should be showing here. And in a case like that,
then the ranking team will be able to look
at that and say, yeah, like, we need to make some
changes to better understand the unique things that
are happening here. But if all of these
are kind of equivalent, and they're kind of
doing the same thing, then the ranking team
will say, well, we could spend a couple
of months working on tweaking the
ranking for this site, and maybe 50 people will see
this improvement, which is OK.
Or we could spend a couple
of months improving something for a lot of other sites
or a big mass of users, and, probably, they'll
focus on the bigger issues. So I don't know. I guess, in short,
my recommendation would be to maybe go to
the Webmaster Help Forums and get some input
from other people, kind of maybe more objective input,
and really to think about what you could be doing on your site
rather than to focus on what your competitors are doing. "Can fragment
identifiers be used to optimize for rich
features snippets?" I don't think so. So fragment identifiers are
these URLs with the hash or the number sign in them. And they're generally
generated as links on a page, and they jump to a
specific part on that page. So that's kind of the
traditional thing with regards to fragment identifiers.
And, essentially,
you're on the same page or you're just going to
different parts of that page. And when it comes to
indexing, we essentially drop all of those fragments. We ignore them completely. We see them as a link
to the same page. And then, if it's a
link to the same page, there's nothing we
need to do there. And that wouldn't
change anything with regards to rich
features or kind of the featured
snippet aspect there. Sometimes we do pick
up the fragments with regards to making a cleaner
snippet in the search results, but that's really something
where we understand this is a specific
part of the page, and we can link to that
part of the page directly, and we will just
link to it like that.
So you often see this
for Wikipedia pages which use these fragment
identifiers quite regularly. The other place where these
fragments are used sometimes is on JavaScript sites, and for
us this is really problematic because, like I mentioned,
we drop the fragment for indexing, and if the
content on the JavaScript site is only loaded when the
fragment is included, then it's very likely
that we will not be able to index that content. There is a very,
very small number of sites where we
do use the fragment identifier for indexing,
and that's essentially a very small number
of cases where, in the early days of
JavaScript indexing, we thought this would be the
only way to pick that up. And in the meantime,
we realized that we shouldn't be doing this because
it just causes so much trouble. So for the most part, we drop
those for indexing completely.
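To illustrate the difference (example.com is a placeholder):

<!-- Traditional fragment: indexed as https://example.com/guide -->
<a href="https://example.com/guide#section-2">Jump to section 2</a>

<!-- JavaScript hash route: indexed as just https://example.com/, so content
     that only loads for the fragment will likely be missed -->
<a href="https://example.com/#/products/shoes">Shoes</a>

"While Google rejects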
AdSense applications, just say specifically
what went wrong with that application,
that's the best update you can do for the
sake of your customers." I don't know how
AdSense is handled here, and I imagine they get a lot
of applications for things that are not relevant to be shown.
But I have no idea. You'd need to maybe add that
to the AdSense Help Forum. "I have a Hindi word in my
domain, which is a language. Will it affect my articles
written in English?" So we do use some signals
from words in a URL, but it's very, very small. And, especially if we can pick
up the content on the page, then we can essentially
ignore the words in the URL.
So if you have, I don't know, a
Hindi word in your domain name, and your content
is in English, then that feels perfectly fine
from my point of view. It is very common to also have
international websites where you have maybe a domain
name in French or in German or in English, and the content
itself is in other languages as well, and that's
totally fine. So that's not something
I would worry about. "Is it recommended
to use keywords as-is in the content to get
the best results on Google? Does the crawler use
fundamentals of AI to make combinations
of keywords that are closely related to the
content and then rank them? We want to know how a
crawler picks up keywords to be ranked on Google." So I don't know. There's lots of ways
that we pick up keywords, but I think, in general,
we essentially look at the content of the page. Way, way in the early days, the
keywords metatag was a thing.
But, essentially, the
content on the page is really what users see. So that's where we try to
understand which queries we could show this page for. And we do try to
figure out combinations that are more like
synonyms or that are equivalent with regards
to keywords on a page. Sometimes we figure out which
are acronyms or singular and plural, and we
try to understand, is this page relevant for
both of these versions? Or maybe it's just relevant
for one of those versions.
But this is something
that is quite complex. There was a video
that we put out, I think, beginning of this year
from the webmaster conference we did in Mountain View from
Paul Haahr, which was, I think, a really interesting
session, and he goes into a lot of these
aspects with regards to keywords and when we understand things
are similar or equivalent, when we understand that
things are different. So I think some of the
examples that he had where, like, you have a page
that is about New York, and a user searching for York,
should that page about New York also be ranking,
and, of course, we should be understanding
that New York are two words, but they belong together. And York is a different word,
a different location, that should be ranking individually. And all of these things are
really kind of unique problems and interesting to look at. So I would definitely
take a look at that video.
I think the only short answer,
if you really want something short here, is you
don't need to put all of the variations of all
of your keywords on your pages. If we understand your page
is about a specific topic and has some of those
keywords on there, then we can understand
the rest itself. So you don't need to put
all synonyms on your pages. You don't need to kind of
do this SEO thing where you include all of the typo
versions of your keywords on the same page. We can figure it out. Oh, OK, wow. We're already at time. Maybe I'll just open things up
for more questions from some of you, and then I'll
pause the recording so that it doesn't get too long.
And then maybe we can
continue a little bit kind of off the
recording afterwards too. Woo. AUDIENCE: Hey, John. JOHN MUELLER: Hi. AUDIENCE: I did post a question. So I understand Google mobile
indexing is going through AMP sites first for ranking, but
we don't– it's an e-commerce website. It's not the entire
website is AMP. I want to say about 50% of it. So Google is first
trying to index AMP. At some point, it's hitting like
a regular desktop website link, but the desktop is
responsive, hence mobile friendly, so it's now
also grabbing all the mobile, or the rest of the sites.
From our regular page,
I'm linking to the AMP pages with link rel=amphtml, and from the AMP pages back to the HTML with rel=canonical. So I'm kind of tagging them to each other. But this ideally
would be duplicated for mobile versus
AMP, right, or no? JOHN MUELLER: So when it
comes to the paired AMP setup, I think that's what
you have there, like, the normal HTML and
then the AMP page and then the linking
between the two. AUDIENCE: Right. JOHN MUELLER: We would use the
normal HTML page for indexing. So from that point of view,
it's like the AMP page is more supplemental
for us, and we can show that when
people are on mobile, on appropriate devices,
but for indexing, we would use the
normal HTML version.
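That paired setup would look roughly like this (the URLs are placeholders). On the normal HTML page:

<link rel="amphtml" href="https://example.com/article.amp.html" />

And on the AMP page, pointing back:

<link rel="canonical" href="https://example.com/article.html" />

AUDIENCE: OK, and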
is there a reason why AMP won't get traffic? Like, I see standard devices
like iPhone or, I don't know, MindWrite, or
something like that.
JOHN MUELLER: I don't know
why that might happen. So one thing that
has to be the case is that this cross
linking has to be correct, and the AMP page has
to be a valid AMP page. You can use kind of the
AMP tester for that. The other thing that sometimes
happens is because of the way the AMP pages work, you have
to do kind of the analytics there separately,
especially when we show it as a page on the AMP
cache, then you can't just use the same Google
Analytics setup there. You kind of have to
mix those two together.
And that's something that's
sometimes confusing in that you look at the analytic
side, and it seems, oh, nobody is going
to my AMP pages, but then you kind of need to
add that separate AMP part to it as well. AUDIENCE: OK, for the most part,
since AMP is a separate setup from a development perspective, do you suggest maintaining two versions, or potentially three: desktop, mobile, and also AMP? Or should we just kind of step away from AMP? I see a value for speed
and all, but, I mean, what are your general thoughts? JOHN MUELLER: I don't know. So I think, in general,
from Google's point of view, AMP is a great way to
make really fast pages, and there are some
features in search that rely on AMP to work
well, especially things where we need to embed
kind of a page in an AMP viewer type of situation, then we need
to have an AMP page for that.
If your content is not relevant
for those search features, then it's more a matter
of, like, the speed side. Can you generate
same kind of speed with your normal HTML pages
as you can with the AMP pages? And, if so, maybe it
makes sense to focus more on the regular HTML pages,
but if AMP is the way that you can make your
mobile site really fast, then I would definitely
continue using that. AUDIENCE: OK. All right, thanks. JOHN MUELLER: Sure. AUDIENCE: Hey, John,
got a follow-up question on the issue. I've shared the domain in the chat, so you know what site we're talking about. Just to clarify, I think you mentioned earlier on that you don't think the site was re-evaluated because of this, let's call it, design refresh. Could something else have been the cause– something that has been in use
even before the redesign. We've been using something like
tracking parameters on the URL, to all the articles that
we link from the home page.
So we know what traffic is being
generated from the home page to which articles. So we know which kind of
sections of the home page are driving the most
clicks internally. This is just to measure
some things that we needed. But those pages
themselves, they have the canonical implemented
the right way. Could that have had some impact? Because home page is the most
authoritative page of the site, and maybe those pages weren't
getting as much value. But this has been implemented
pre-redesign as well. [INAUDIBLE]? JOHN MUELLER: That could– I don't know, depending on
how you have that implemented, that could have an effect in
that if, from the home page, you're linking to the articles
with a unique URL, and then from the articles,
you have the canonical back to a clean URL. Then is that about the setup. AUDIENCE: And so
it is the same URL. It just has URL parameters
at the end of the whole URL. It's like UTM, but we've
modified them a bit.
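A sketch of the setup being described (the URLs and parameter are placeholders): the home page links to the article with a tracking parameter appended, and the article itself declares the clean URL as canonical:

<!-- Link on the home page -->
<a href="https://example.com/article?src=home-top">Article headline</a>

<!-- In the head of the article page -->
<link rel="canonical" href="https://example.com/article" />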
JOHN MUELLER: Yeah. AUDIENCE: We removed that I
think about two weeks ago. But it's not like we're seeing
something happening yet. JOHN MUELLER: So one thing
you can do to kind of check that hypothesis is whether those
particular kind of the tagged URLs are ranking in search,
so looking at the performance report to see is Google
focusing on the clean URLs, or is Google focusing on
the parameterized URLs? And if Google is focusing
on the clean URLs, then we've figured
the canonical part out, and that's all fine. If Google has been focusing
on the parameter URLs, then it seems like, well, we
got confused with your site structure, and then over
time, we will kind of dilute the value because we're
not sure which of these pages we should actually be seeing
associated with your home page, for example. AUDIENCE: Yeah. We checked that, and it's the
clean URLs that are ranking.
So the canonical is
working properly. So, like, there
are so many things that could have happened. So we're testing and trying
and thinking about everything. JOHN MUELLER: Yeah. Yeah. I can sympathize with that. It's, like, if there is a
big change had happened, then trying to figure
out what exactly is responsible for
that is something that I think anyone would do. I do think, like,
if you're seeing an overall drop like this
from one day to the next, then it seems a lot more like
a quality issue rather than a technical issue. If it were a technical
issue, then you would usually see kind
of a subtle decline over time, where, as we
reprocess things for indexing, then, like, some things
go a little bit faster. But it is really something that
would take about, I don't know, a couple of weeks time
to be fully processed.
And if you're seeing it
from one day to the next, then it seems a lot
like our algorithms are kind of classifying your
site slightly differently. AUDIENCE: I see. So, yeah, this could be the case
because we saw a small decrease on August 30. Then it accelerated
dramatically on September 3. So by September 3, it
started really dropping. It hasn't been like a
couple of weeks in the– JOHN MUELLER: Yeah. Yeah. I mean, like technical issue
would be a little bit easier to figure out and clean up. So I understand kind of
focusing to make sure that all of the technical
things are lined up first. But it really feels, like,
from a quality point of view, it might be worth getting
some more input from people and seeing what could you be
doing slightly differently. But I really don't
know your website. So it always feels
awkward to say, oh, your website's
quality is bad. And it sounds like it's not bad. It's just, well, we thought it
was a lot better in the past.
AUDIENCE: Yep. So it's tricky. I hope that you will have
a chance to just take a look at the site. And any input that
you could give us would be very helpful to kind
of work out what happened. JOHN MUELLER: Sure. I'll take a look. OK. Let me pause the recording here. For those of you
watching the recording, thanks for sticking around. And thank you for everyone
who submitted questions along the way. I'll still be here a little
bit afterwards if any of you want to stay and chat
a little bit longer. And, otherwise, I wish
you all a great weekend. Bye, everyone.