English Google SEO office-hours from April 1, 2021
JOHN MUELLER: All right. Welcome, everyone, to today's
Google Search Central SEO Office Hours Hangout. My name is John Mueller. I am a search advocate here
at Google in Switzerland. And part of what we do are
these Office Hour Hangouts, where people can join in
and ask their questions around their website
and web search, and we can try to
find some answers. A handful of things were
submitted on YouTube already, so we can have some of
those to go through.
But as always, if
any of you want to get started with
a first question, feel free to jump on in. Robin, go for it. Oh, we can't hear you. Oh no, you're not muted. But the microphone
doesn't seem to work. ROBIN LESPAGNARD: Hear me now? JOHN MUELLER: Yes, perfect. ROBIN LESPAGNARD: You can? Sorry, sorry, I'm fixing this. I'm sorry. Here we go. OK, it's fixed. So thank you for
the opportunity. I don't want to
take too much time, so if I take too much
time, please interrupt me. I'm wondering, how does it work
when lots of links to websites are deleted, for example,
after a manual penalty? Is it considered as some
kind of alarm signal by the Google algorithm,
which would, for example, take lots of quality
pages that were on page 1 and push them far back,
to page 4 or 5? JOHN MUELLER: So it wouldn't
be considered an alarm signal in the sense that it's like,
links come and links go, and that's kind of normal.
I think the effect
that you might see is if the website
previously was artificially showing too visibly in search
because of those links, and then you remove those
links and obviously, that artificial support
is no longer there. And then you would
see changes in search. But that's not because
we see, oh, lots of links disappeared or lots of
links became nofollow or got disavowed. It's more that, oh,
there are fewer links, and we do use links to some
extent in our ranking, so– like, we don't have
that support anymore. ROBIN LESPAGNARD: OK. So if– yeah, for
example, on a website, I saw it drop far, far,
far back in the result pages.
It's considered as where it
should be by the algorithm? JOHN MUELLER: I mean,
where it should be, that's always open
for discussion. But that's– so I mean,
it's kind of tricky, because if these were, say,
more kind of like spammy or artificial
links, then usually, our algorithms try to recognize
that and ignore that already. So if you're removing things
that we're already ignoring, then you wouldn't
see any change. ROBIN LESPAGNARD: Yes. JOHN MUELLER: So yeah. ROBIN LESPAGNARD: Maybe
those were some good links. Could it be– because when
you get a manual penalty– or in general, right,
I'm speaking in general, not specifically– but
it feels very urgent and you have to act
super, super quickly.
And it's very fearful
because your website has disappeared, maybe. And could it be that maybe
some website owners take away too many links, including some
that should be OK? JOHN MUELLER: Yeah. I think that that happens
every now and then. But– ROBIN LESPAGNARD: OK. JOHN MUELLER: I mean, the
important part is also– it's like, we use a lot more
than just links for ranking. So it's not like if you
removed one link too many, then suddenly your website
will disappear kind of thing. We use a lot of
different factors. And I do understand
kind of the side that oh, you get a manual
action, and it's like, oh, I will just delete
everything and make sure everything is clean. ROBIN LESPAGNARD:
It's terrifying. It's terrifying
when you get that. JOHN MUELLER: Yeah. I think going too extreme
is not very useful, but it's sometimes tricky. If it's your website
and you care about it, and you just want to get
everything cleaned up again, then I could see that
some people might say, well, I'll just remove
everything I can find.
ROBIN LESPAGNARD: OK. So I have a next question. Please feel free to interrupt
me if I take too much time. So I understand that
backlinks are not the only signal you
use, but still, I believe they're a
pretty strong signal. And so when you read
the Google guidelines, it just says, well,
for link-building, you don't have to do anything. Just make quality,
authoritative, original content, and the links
will come by themselves. And I'm pretty sure that in the
long term, that's how it works, sure.
But it might take years,
not just one or two years, especially in
the younger markets, like in Europe and France
and stuff like that. And so is that an inconvenience
that Google assumes– I mean, understands and
recognizes and says, well, that's just how it is? Or are there ways that you can,
well, make it a little quicker, like encourage
people to hey, look, I've done some really
quality stuff here. Can I– JOHN MUELLER: Yeah. I mean, there are
two aspects there.
On the one hand, if
you're in an area where it's hard to kind of get support
from the ecosystem in terms of links, then it's like,
keep in mind everyone else who is in that area
has the same problem. So it's kind of like if
you have a knitting store in a small city,
it's kind of normal that you wouldn't get a lot
of links to a site like that. But at the same time,
the other knitting stores in the same city, they
have the same problem, right? It's not that one site
gets 50 million links and the other site gets zero.
It's like– it's kind of
an even playing field. And the other thing
with regards to links is our guidelines are pretty
specific around the type of links that we would
consider unnatural and, like you mentioned, the
approach of creating something really fantastic and
people will recognize it over time and kind of
link to it over time, I think that's one aspect. But there's also
a lot that you can do kind of to encourage that. It's kind of similar to how
when you start a small business, you do advertising. You encourage people
to come to your website to look at your content. And if you have some great
content, then obviously, those people have a chance
of linking to your content. Other things you
could do are kind of the more PR-oriented
outreach things, where you do something really
fantastic on your website and you reach out to maybe
local press or something as like, say, hey, look at this
cool stuff that I did here. I'm happy to talk to you
about it, like would you be interested in featuring that? And those are the kind of
things where you're not going out and placing links
on other people's websites, but rather, you're kind
of reaching out to them and saying, look, I have
this really cool stuff.
Maybe you would
like to take a look? Maybe you would like to write
about it on your site as well? And that's, I think,
the kind of thing– especially with
newer businesses, where you can kind of take a
step up and be a little bit more proactive rather
than just like, I put a website on the internet
and maybe hopefully someone will find it someday. Because if you do that,
then it's really hard for us to recognize that there's
actually something really useful and important here. So kind of taking that step and
dragging people to your website and saying, look
at this cool stuff. And if they like it, then
they can do something with it. Or if they don't
like it, then it's like obviously a
different problem. ROBIN LESPAGNARD:
Further on that subject. So in those kinds of PR
moves that you can make, how do collaborations work
in the eyes of Google? Like, what's allowed
and what's not allowed? JOHN MUELLER: From
our side, anything where there is no exchange
of value for a link is essentially OK.
I don't know. Probably not anything,
but for the most part, like if you're not doing
something where you're saying, I will link to you
if you link to me, or I'll pay you this much
money if you link to me, then essentially, that's OK. The important part is really
that the other website is kind of deciding on their
own to link to your website. ROBIN LESPAGNARD: OK, OK. It gets maybe a
little tricky there because sometimes you invite
someone and they collaborate and they ask for
something in return, especially in younger
markets like here. It's really– like,
sometimes people are very, I give, but you give. Like, they really
want that exchange. And you say, mm, but
we can't, really. It's tricky. JOHN MUELLER: Yeah, yeah. I don't know what exactly you're
working on, so it's hard for me to say, but– ROBIN LESPAGNARD: OK, OK, OK. JOHN MUELLER: I
mean, it's something where it is sometimes tricky. But I think the important part
is also, like I mentioned, we do use a lot more than
just links with regards to our ranking.
So links are especially
important in the beginning when you're just starting out,
like if we don't know about your website at
all, then obviously, we have to find a
link to it somehow. But once we know
about your website and you can kind
of understand how it fits in with the
context of the web, then that kind of can grow a
little bit more organically. You don't need to constantly
add links to a website. ROBIN LESPAGNARD: OK, OK, OK. For the initial boost, maybe. And then when it flows,
when it's up to speed. JOHN MUELLER: I guess. I guess you could frame
it like that, yeah. ROBIN LESPAGNARD: OK, OK, OK. So I might be just– if I take too much time, please
don't hesitate to interrupt me. So on that collaboration side,
so I have a good example. So let's say you're a
comparison website, right, and you do something
on the best yoga mats. And you can contact really
original and authoritative and expert yoga practitioners
who have blogs, and you invite them to– basically, you invite
them in a collaboration.
So you basically say, hey, how
about you check out our article and you make a proposition
to make it better? So for example, you
add, oh, OK, so you said that this is important
for choosing yoga mats. And I, as an expert, I'd also
put that and this and that. So they help you make
your content better, and you give them– so you link to them
in return and you say, we collaborated
with these experts, and you give them a link to
their websites in recognition as a source. And so– and maybe
after that, you say, oh, well, if
you want to mention that you were in this article
as an expert on your website, well, if you want
to, you can do it. Just mention it in
one of your articles or maybe create a new article
where you say, oh, I was just interviewed on that website. Is that OK? JOHN MUELLER: I– in
general, that's OK.
I think it's something
businesses recognize over time. And it's like,
oh, they just want me to do this back
and forth thing. But it's– like, as long as
you're not forcing them to do that and saying, in order to
be listed in our comparison website, you will need to link
to my website kind of thing– as long as it's open like that,
then generally, that's OK. ROBIN LESPAGNARD: OK. JOHN MUELLER: Yeah. I think– I mean, it's
like all of these things are a little bit
tricky in that it's easy to take one example
like this, where I think like in your situation, that
case sounds pretty reasonable, and to expand it
and say, oh, well, I will contact 50 million people
and do this game with them. Then that's something where
our web spam team might say, well, it looks like this
is just a giant scheme. It's not actually something
that's happening organically. ROBIN LESPAGNARD: OK. Well, thank you very
much for the information. That clears things up a lot. And so if maybe someone
had a problem before, it might be because they've
just made way too many of these and so it was excessive.
JOHN MUELLER: Yeah, yeah. ROBIN LESPAGNARD: OK. Well then, that's all
the questions I had. Thank you very, very
much for your answers. JOHN MUELLER: Cool. All right. Any other questions
before we get started with the ones
submitted on YouTube? No more questions? Cool. Well, I mean, we'll still
have room for more questions along the way, so no pressure. OK, let me look at
the ones on YouTube. So the first one, I
think, is kind of tricky. It's like a few
months ago, we started working for a new company. We found that one of
their local websites had some strange SEO issues. It was indexed in
Google, but not in Bing, and later the Bing
technical support told me there was a sort
of block that they removed. The website was
launched in 2018, but there was no organic
traffic until April 2020, and the traffic is far
below the expectations. We recently found
out that this domain hosted another website 10 years
ago that looked a bit spammy. We haven't found
any manual actions.
And it kind of goes on. Essentially, it's like, is
this domain OK or is it not OK? So from looking at
things from our side, I assume you have it
verified in Search Console. If there are no manual
actions that you see there in Search Console, then
there are no manual actions associated with this website. And taking a look a little
bit further on our side, I also didn't see anything
special happening there. And it definitely
wouldn't be the case that if there was
another website hosted on the same
domain 10 years ago that was spammy
that you would still be having any kind of
side effects from that.
So these kind of, I think,
spammy after effects are things that in general,
you would not see for– like, with a pause of 10 years
in between with a website like that. And in any case, if there
were spammy after effects, you would see that as a manual
action in Search Console and you'd be able to
resolve that, clean that up. The one area where you
might see some kind of after effects for a while
is with regards to links. If you had– or if
the old website had a lot of really, really bad
artificial links associated with it, then that might be
something that at some point would count against your
website in that sense, even if in the meantime,
you have a different website up there.
But my assumption
is that if this was the case 10 years
ago, and for 10 years now, nothing kind of problematic
has been happening with regards to links with that website,
then that would kind of not be the case anymore. So my assumption here is that
this website is essentially just ranking as it
normally would with regards to its
content at the moment, and that there is nothing
artificially holding it back. And if you were to move
to a different domain name to try to escape whatever might
be associated with this domain name, I don't think you would
see any change in visibility in search. And moving domain names is
always a bit of a hassle, and it takes a lot of time. So my recommendation
here would be more to focus on
the website itself rather than to try to
move off to other domains.
Do the Core Web Vitals
affect AMP pages? We essentially treat AMP
pages like normal HTML pages when it comes to search,
for the most part, so we would rank
them appropriately. At the moment, we don't use the
Core Web Vitals with regards to ranking, but the plan
is to introduce that, I think, in a month or two. And when that happens, that
will affect AMP pages as well. I think the useful part,
with regards to AMP pages, is by default,
they're really fast. So for the most part, if you
create a normal AMP page, it would be probably
normal to see that it meets the thresholds
of the Core Web Vitals, and then you'd be kind of on
the good side automatically. So with that regard, I think
AMP pages are a great way to make really fast web pages. You can make other kinds
of really fast web pages.
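As a rough illustration, the commonly published "good" thresholds are a Largest Contentful Paint of at most 2.5 seconds, a First Input Delay of at most 100 milliseconds, and a Cumulative Layout Shift of at most 0.1; a minimal sketch of checking measured field values against them, with made-up numbers:

```python
# Rough sketch: compare measured field values against the commonly
# published "good" Core Web Vitals thresholds. The sample values below
# are made up for illustration.
GOOD_THRESHOLDS = {
    "lcp_seconds": 2.5,   # Largest Contentful Paint
    "fid_ms": 100,        # First Input Delay
    "cls": 0.1,           # Cumulative Layout Shift
}

def passes_core_web_vitals(measured: dict) -> bool:
    """Return True if every measured metric is within its 'good' threshold."""
    return all(measured[name] <= limit for name, limit in GOOD_THRESHOLDS.items())

page_metrics = {"lcp_seconds": 1.8, "fid_ms": 40, "cls": 0.05}  # hypothetical page
print(passes_core_web_vitals(page_metrics))  # True
```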
And if a website is
appropriately fast and has a good user
experience, then it'll pass the Core Web
Vitals thresholds and, especially from the page
experience ranking factor, that would be OK. But it doesn't need
to be an AMP page. It can be an AMP page
if you want to do that. We have a WordPress site. Updating the movie
gallery, we get pictures daily for the same movies.
We only create one
gallery for one movie. The pictures we
get every day are updated in the same gallery. Can we change the date and
time when it's updated daily? Would that be a problem
for search ranking? So you can definitely update
the date and time on a page whenever you make
changes on a web page. If you're just shuffling
pictures around in a gallery, that feels kind of
misleading with regards to updating the date,
just because you're shuffling pictures around. So from a user point
of view, I would find that a little bit awkward.
I don't think it would
change anything with regards to search, and we definitely
wouldn't rank those pages differently in search just
because you're changing the date and time on a page. So my recommendation
there would be if your CMS does this by default
and you can't control it, then fine. If you're doing this manually,
kind of like tweaking the date every time you make a small
change, like shuffling images, then I would recommend
just skipping that and only update the date
when you actually make significant changes on a page.
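A hypothetical sketch of that idea, with placeholder dates and a made-up headline: a CMS could keep datePublished stable and only bump dateModified for meaningful edits.

```python
import json
from datetime import date

# Hypothetical gallery page: keep datePublished stable and only update
# dateModified when the page content changes in a meaningful way.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Movie gallery: Example Film",
    "datePublished": "2021-03-01",
    "dateModified": "2021-03-01",
}

def record_update(data: dict, significant: bool) -> dict:
    """Bump dateModified only for significant changes, not for image shuffles."""
    if significant:
        data["dateModified"] = date.today().isoformat()
    return data

print(json.dumps(record_update(article, significant=False), indent=2))
```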
I have a few questions, but
the overarching question is, what are the things
that a website manager should do to ensure that
URLs are published– URLs published are
indexed in Google as soon as they are published? Yeah. I think everyone wants that. Like, you make something and
you want it visible in Google right away. But let's see. So are URLs in News sitemaps
indexed more frequently than regular sitemap
indexes that are submitted in Search Console? So News sitemaps are specific
sitemaps for news websites, specifically for Google News. So if you don't have
a news website that's listed in Google News, then you
don't really need to do that. News sitemaps are limited to
1,000 URLs per sitemap file, and that limit is
there so that we can kind of get the fastest– the content as quickly
as possible, especially on news websites where we want
to get the content quickly.
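For illustration, a minimal sketch of a Google News sitemap with a single entry, using placeholder URLs and a placeholder publication name:

```python
# Minimal sketch of a Google News sitemap with one entry.
# The domain, publication name, and dates are placeholders.
# Keep in mind the 1,000 URL limit per News sitemap file mentioned above.
NEWS_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example.com/articles/new-story.html</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2021-04-01T09:00:00+00:00</news:publication_date>
      <news:title>New Story Headline</news:title>
    </news:news>
  </url>
</urlset>"""

with open("sitemap-news.xml", "w", encoding="utf-8") as f:
    f.write(NEWS_SITEMAP)
```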
So that's something
where if we are already trying to get everything
indexed as quickly as possible from your website, then
using a News sitemap is a good way to
help us with that. KAMLESH SHUKLA: [INAUDIBLE] JOHN MUELLER: Just a moment. But if we're currently not
already indexing your content as quickly as possible, then
putting it in a News sitemap won't change anything. So that's kind of–
from that angle, it's like, if we're trying
to get everything quickly, then obviously, using the right
type of sitemap file helps.
But just artificially
putting it in a sitemap file like this
doesn't change it. Go ahead. KAMLESH SHUKLA:
[INAUDIBLE] index sitemaps in news websites. So the crawling frequency should
be higher with a News sitemap. JOHN MUELLER: For
news websites– so the index sitemaps
are essentially just a collection of the
sitemap files that you have. So it's not something
where you're submitting more and more content
and making it harder to crawl. It's more like, here's
everything from my website.
And usually, we do recommend the
[INAUDIBLE] that makes of one– like, for news sites,
one News sitemap with all of the fresh content
and then everything else in another sitemap, or a
set of sitemap index files. And that makes it
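A minimal sketch of that setup, with placeholder filenames: a sitemap index pointing at one News sitemap for the fresh content plus a regular sitemap for the rest of the site.

```python
# Sketch of a sitemap index referencing a News sitemap for fresh content
# and a regular sitemap for everything else. Filenames are placeholders.
SITEMAP_INDEX = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-news.xml</loc>
    <lastmod>2021-04-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-archive.xml</loc>
    <lastmod>2021-03-15</lastmod>
  </sitemap>
</sitemapindex>"""

with open("sitemap-index.xml", "w", encoding="utf-8") as f:
    f.write(SITEMAP_INDEX)
```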
possible for us to try to kind of get the
new things fast, but also make sure we're not missing
any of the old things. KAMLESH SHUKLA: Thanks. ASHWANI KUMAR: Hello. JOHN MUELLER: Hi. ASHWANI KUMAR:
Yeah, hi John, sir. Yes, sir, it's a real pleasure
to be connecting with you, sir. So my question is, I am facing an
issue in Google [INAUDIBLE] like my website [INAUDIBLE] is
continuously fluctuating from page one to page
four and five, and this is the major issue
I'm facing since last week– last month. So what can I do right now? Could you please
suggest anything? JOHN MUELLER: So if
you're seeing fluctuations like that, then
usually, that's a sign that our algorithms
are not sure how to show your website
appropriately in search.
And usually, the
good part there is that it doesn't take
a lot more to make our algorithms a
little bit happier, and then you see
fewer fluctuations. But it's hard to figure out
what that little bit more is. So my recommendation
there is always to try to find a way to
significantly improve the quality of the
website overall so that the algorithms are for
sure always on the good side. And that– I think it's hard
to turn that into something practical, and that's
something where you almost have to know your website
and you know your users, and figure out
ways to really try to improve it with a big step.
ASHWANI KUMAR: Sir,
could you please let me know how
much time it takes? JOHN MUELLER: How
long that takes. ASHWANI KUMAR: Yes. JOHN MUELLER: It's hard to say. It's really hard to say. Yeah. On the one hand, we have
to recrawl the content. Like, if you make significant
changes on your website, we have to recrawl that. And to recrawl that
across a larger site, that can take a bit of time,
especially if you make bigger changes across
everything– if you change the structure of your website. I would assume
something like that, just purely from a technical
point of view, would take, I don't know, maybe a month. And for understanding the
quality changes overall, I would see that as
something where it probably takes a few months on our
side to actually understand that this website has
significantly changed. So not something that
you can fix in a week.
It's probably more
like, I don't know, three, four months,
something like that, if you make significant
quality changes. ASHWANI KUMAR: OK, sir. Thank you, thank you. KAMLESH SHUKLA: One– I have a question. JOHN MUELLER: OK. KAMLESH SHUKLA: So
fact-check articles. If my website is doing
fact-check articles, is it a ranking signal
for that website– is that website getting
any benefit from Google [INAUDIBLE] anything like that? If I fact-check. JOHN MUELLER: I don't think so. My understanding is that it's
more like normal rich results, which are essentially
just ways of showing that information in the search
results slightly differently.
It's not the case that we would
be ranking that page higher because of that. KAMLESH SHUKLA: And we
have some certification from the International
Fact-Checking Network. So is any– is it helping
any [INAUDIBLE] parameters? Does any of that kind of
certification help [INAUDIBLE] parameter? JOHN MUELLER: Possibly. I don't know. I don't know what
certifications you have and what we would watch out for. But if these are things that we
could recognize while crawling and indexing the
web, it's possible we would be able to spot that
and take that into account. But I don't know. It probably also depends a
little bit on what exactly it is. Like, if you reach out
to a random blogger and they give you a
certificate, then that's slightly different than
if it's a really, really highly official certificate. Cool. Let me run through– ASHWANI KUMAR: [INAUDIBLE] JOHN MUELLER: Let me run through
some of the other submitted questions, and then we can
get back to the live questions as well. ASHWANI KUMAR: Sir, I
have one question, sir. JOHN MUELLER: I'll
get back to you. That OK? Let me just go through some
of these submitted ones, because they submitted
them ahead of time.
And then we can get some
more of the live questions from you all, too. OK. In May 2020, it was
announced that Google will incorporate
the page experience metrics into the ranking
criteria for Top Stories in search on mobile and
remove the AMP requirement from Top Stories eligibility. Does this also apply
to Discover as well? In other words, will
the AMP requirements be removed from Discover? So even at the moment, we
don't require AMP for Discover. It's something where we do show
AMP pages because we understand that they're normal
pages, but we also show normal pages
in Discover as well. The difference primarily
with regards to AMP and non-AMP pages in
Discover is with regards to the size of the thumbnail
image that we show. By default, for AMP pages,
we can show a slightly bigger thumbnail image. By default for
web pages, we have to show a slightly
smaller thumbnail image. However, if you use the, I
think, max image preview robots meta tag, you can let
us know that we can also show a larger thumbnail
image for your web pages.
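The directive in question is max-image-preview:large; a small sketch of adding it to a page template, with a placeholder page:

```python
# Sketch: emit the robots meta tag that allows large image previews,
# e.g. for larger Discover thumbnails. The page HTML is a placeholder.
ROBOTS_META = '<meta name="robots" content="max-image-preview:large">'

def add_robots_meta(html: str) -> str:
    """Insert the robots meta tag right after the opening <head> tag."""
    return html.replace("<head>", "<head>\n  " + ROBOTS_META, 1)

page = "<html><head><title>Example article</title></head><body>...</body></html>"
print(add_robots_meta(page))
```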
And if you do that,
then essentially, the way that the pages
are shown in Discover are essentially the same in that
you have the larger thumbnail image, if you choose to do
that, plus your normal snippet. And you can be shown essentially
like any other page there. When comparing Core Web Vitals
between AMP and mobile pages, does a mobile site page
need to outperform the AMP page, or meeting the Web
Vitals target set by Google is sufficient for
the mobile page to replace the AMP in the
Top Stories and Discover? So essentially, what we watch– so I don't actually know how we
would handle that specifically in the Top Stories carousel,
so with regards to the ranking especially there. So I don't think it's as easy
as your page has to be faster, then we will show it
in Top Stories instead of someone else's pages.
It's probably the
case that we just use lots of different factors
with regards to ranking in the Top Stories
carousel, and we also plan on using speed there. So that's something where
I don't think blindly focusing on speed makes sense. A good comparison
that I've used, or that I think is
a good comparison, is you can make an
empty page really fast, but it's not very
useful for users. So similarly, like if
you made a non-AMP page super fast by removing
all of the content, then for us, that
wouldn't be very useful to show in
the Top Stories, even if it's super fast.
So there are lots of things
that come into play with regards to ranking in
situations like this, and they also apply
in Top Stories. I have no idea how we
do ranking in Discover. That's kind of a mystery to me. But within Top Stories,
it seems like something where we could take a lot
of the existing web search ranking factors, too. I have a website that mainly
deals with providing services for approvals and
certifications that are mandatory for importers and
manufacturers of electronics and IT products. On the Services page,
I've put the JSON-LD for contacts, logo, addresses. If I put FAQ structured data
markup on every Services page to make it appear
in the rich results, will it affect any way
my website is appearing for the search results
intended for service providers kind of query? So it's perfectly fine
to have multiple kinds of structured data on a page.
The structured data
should be appropriate for the primary content
of the page, though. So that's something
where, like, you wouldn't put the same FAQ
on all of your product pages because those questions
wouldn't apply to every product individually. So with that in
mind, I think if you focus on what is the
primary content of this page and provide structured
data for that, then you're probably
on the safe side. It's not the case
that we would say, if you have this kind
of structured data, we will ignore all of the
other kinds of structured data. We will always take
that into account. However, we might not always
show that in the same way. So for example, I don't
know what the current mix is for things like this, like if
you have recipe structured data on a page and maybe you also
have FAQ structured data on a page, then probably we
would choose one of those two types when it comes to picking
which way to show that page.
We wouldn't show all of the
types of structured data at the same time. But it's not that
there's a defined order or priority with regards to the
different rich results types. So my recommendation
there would be focus on the primary
element of the page. And if you could provide
structured data around that, that's a great use. If you just have
generic structured data that you put on all
of your pages, then probably that's not a
good use of your time. One of my websites
has fallen down– oh, I think we talked
about this briefly. For which cases do I need a
self-referential canonical? So self-referential
canonical is when you have a link rel
canonical and you essentially point to the same page
that you're already on. And from our point of
view, this can make sense because a lot of
pages on the web can be crawled with
different variations of URLs. So for example, you might have
the same page under www and non www on your website. And if we crawl one version
and we don't have any link rel="canonical" on a
page, then we don't know, is this the version that you
want to have indexed or do you want the other
version to be indexed? And then we might pick one.
Whereas if you do have the
rel="canonical" on that page, then we would know, oh, this
is the dub-dub-dub version, but you want the non
dub-dub-dub version indexed. So we will focus on that one. Similarly, if you
have URL parameters– for example, if
you're tracking events with Analytics or
some other reason that maybe you're tracking
referrers or something else with the parameters where
you have a question mark, and then parameter
equals some text– if you don't want to have
those parameters indexed, then having your
self-referential canonical to the primary
version of that page is a good way to catch that. So in particular, if you have an
HTML page that is static HTML, then you can't
really check to see if all of these factors
in the URL are perfect. You can, however, put the
self-referential canonical on the page, and
then we would be able to see, oh, this HTML
page with these parameters, actually, the canonical
should just be the HTML page, and then we can focus on that.
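A minimal sketch of that idea, with a made-up tracking parameter: strip the query string and emit a self-referential canonical for the clean URL.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_link(requested_url: str) -> str:
    """Build a self-referential canonical that drops query parameters.

    Example: a static page requested as /page.html?utm_source=newsletter
    still declares /page.html as its canonical.
    """
    parts = urlsplit(requested_url)
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{clean}">'

print(canonical_link("https://www.example.com/page.html?utm_source=newsletter"))
# <link rel="canonical" href="https://www.example.com/page.html">
```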
So that's kind of the
angle that I take there, is that using
self-referential canonicals is a good way to make it
so that it's easier for us to pick the right version
when it comes to search. Are link rel="next" and
rel="previous" ignored? Yes, they're ignored. At some point, we use that to
understand pagination better, but our systems have
gotten a little bit further in that we're able to recognize
common types of pagination setups on our own,
and we can process those as normal internal links
and understand the context from there a little bit better.
So we no longer need the
special kind of link attributes. You can keep them on your pages
if you have them there already. They can be useful for
things like accessibility, where if someone is
using maybe a screen reader or some other kind
of assistive technology and they want to see the
next version of that page, then this kind of
markup can help them do that. So it's not something that
you should kind of remove by default, but rather,
it's just, well, Google doesn't use it, but
maybe other systems do use it. How to optimize a knowledge
base website like dictionary.com with very short content and
less than 400 words per page? So first of all, I would
not focus on the number of words per page. I think that's a terrible metric
when it comes to understanding if content is good or not. Sometimes it's useful
as a metric which you can use internally as
a guideline on your side, like I want to provide
a lot of information. Or where there is a
strong information need, I will provide more
information, or where there's a lower
information need, I'll just provide
less information.
All of these things are useful
maybe on your side internally, but it's not something that
Google is going to use. So Google is not going to
count the words on your page and say, oh, 300 words. We will never show this
page in the search results. With regards to kind of
a knowledge base website like that, I think
you really need to work hard to make sure that
you're providing unique value and something really
interesting for your users, with regards to a
website like that. It's very tempting as
a programmer to go in and say, well, I have this
database of information.
I will just make a
website out of it, and the database
of information is something that I've collected
from the internet or whatever. It's very easy to go off
and create a website that has millions of pages
based on a database, but it's hard to do
that in a way that is actually useful for people. And that's essentially what our
systems are looking out for, because this database
of information will probably already be on
the web a number of times. And then going off and– it's like, you can
create all of this again, but making sure that we
show your version instead of the other versions
that we have, for that, you really need to provide
something more than just a dump of the data there. And similarly, we sometimes
see that for businesses, where they take– as the database, they
take a list of cities or a list of countries or a list
of different types of services, and you take this database
of 1,000 services and 50,000 cities, and you
cross them together, and you create this website
with a giant mass of pages which are just database driven.
And you can do that, of
course, but our systems, when they look at that,
they'll say, well, what is the value here? Why should we be
showing this to users? And for that you need
to do more than just, well, it's correct data
and it's short content, but it's good content. You really need to provide
something that users pick up on and that they'd like
to see in search. And I think that's always
the hard part there.
Like, going off and creating
these things is easy, but doing that in a way
that is actually useful, that's kind of tricky. Cool. Let me see. Maybe we even ran
out of questions. Yeah, I think we got through
pretty much everything that was submitted. So cool, we can go more– more
live questions from you all. Dido. DIDO GRIGOROV: Yes, John. I have– before
asking my question, I wanted to support your answer
about the content [INAUDIBLE] that me and Nicola
have tested this with several different
articles for websites, and I can confirm that it's
more about how you write and what you write
and how you provide the answer to the user
compared to how lengthy is the answer, because
it could be 2,000 words, could be absolutely
pointless and so on.
OK, so my question
regarding the content is, we were wondering about
the featured snippets here in Bulgaria. Is it possible to
expect them soon, because they are a really
good element of Google? They're really attractive,
and we do our best to usually provide the best
content when we prepare articles for the websites. Thank you. JOHN MUELLER: Yeah. I don't know. I thought we had featured
snippets everywhere, so they're not in Bulgaria? DIDO GRIGOROV: No, no, no. Not yet. JOHN MUELLER: OK. That's weird. OK. I don't know what
the plans are there. But I can check to see
what the team there says. I mean, I can't,
like, give you a date and say, we will do
this then, because– DIDO GRIGOROV: Of course
not, of course not. JOHN MUELLER: I don't know,
and even if the team tells me, like, oh, we're going
to launch next week, I can't really tell
you ahead of time. But if– it can happen
that something like this gets stuck in
there, like, oh, we have five countries left
where we need to launch this and it's like, oh,
let's go off and do it.
But yeah. I mean, sometimes with some
of these search features, there are also tricky
policy questions associated with
that, where they say, we would like to do
it here, but actually, because of some local laws
or some local policies, we can't easily do that. So that's sometimes
a bit tricky. But I'll ping the team
and see what they say. DIDO GRIGOROV: Yeah,
thank you, thank you. JOHN MUELLER: Sure. SUN-KYUN SHIN: Hello. JOHN MUELLER: Hi. SUN-KYUN SHIN: Hey
John, I have a question. Actually, I wrote down some. We have an AMP version, a mobile
version website, m-en.abc.com,
and we also have a desktop
version, en.abc.com, and the problem is that our
[INAUDIBLE] m-en.abc.com is not a mobile domain. So Googlebot cannot– we
have some kind of advantage– disadvantage for the website
indexing and domain ranking.
version is not mobile friendly, or how do you mean
it's not mobile friendly? SUN-KYUN SHIN: It's
mobile friendly and we have crawling already,
but there's kind of a possibility that we could have a great
disadvantage, so we don't know. So we have to maybe
change the domain or make it the other site. JOHN MUELLER: Yeah. I don't think you would
need to change that. So we support the separate
mobile URLs configuration, where if you have one URL for
desktop and one URL for mobile, I think that's what
you have, right? SUN-KYUN SHIN: Right. JOHN MUELLER: Yeah. If you have that setup and
you link between the versions appropriately– there's like
the link rel="canonical" to desktop and link alternate
to the mobile version, then we will be able to
understand that configuration and we would be able to treat
that the same as any other website.
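For reference, a minimal sketch of the annotations for a separate mobile URLs setup, with placeholder domains standing in for the desktop and mobile hosts:

```python
# Sketch of the link annotations for a separate mobile URLs setup.
# The desktop and mobile URLs below are placeholders.
DESKTOP_URL = "https://en.example.com/page"
MOBILE_URL = "https://m-en.example.com/page"

# On the desktop page: point to the mobile version.
desktop_head = (
    f'<link rel="alternate" media="only screen and (max-width: 640px)" '
    f'href="{MOBILE_URL}">'
)

# On the mobile page: the canonical points back to the desktop version.
mobile_head = f'<link rel="canonical" href="{DESKTOP_URL}">'

print(desktop_head)
print(mobile_head)
```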
We don't recommend that anymore. It's not that we say
you shouldn't do it. We just say if you
make a new website, we recommend just
making one version. The main reason for
just having one version is to make it so that it's
easier for you to maintain and that you understand
what is happening better. Because we will try to index
the mobile version of the site, but if you have the canonical
set to the desktop version and then in Search Console,
you see both versions, it gets a bit confusing. So we recommend
just one version, but we support the other setups. And in the past, we have
recommended that as well. So it's like a lot of
websites out there use that separate setup. SUN-KYUN SHIN: OK. Quick question, Mr. John. We have news website,
dah-dah-dah abc.com, and that's our website. But we use abc.kr
as a 302 redirection.
So does it affect
SEO in bad way? And also, we tried to register
abc.kr domain in Search Console because we need to
delist some news when there is problems in the news. JOHN MUELLER: Yeah. Is it such that everyone is
redirected to the .kr website? SUN-KYUN SHIN: Yes, sir. JOHN MUELLER: Yes. OK. SUN-KYUN SHIN: From
abc.kr to abc.com. JOHN MUELLER: Yeah. SUN-KYUN SHIN: Not everything– some of them are on
abc.com and some of them are on abc.kr. But all the
redirect destinations are abc.com. JOHN MUELLER: OK. Yeah, I think that
that should be fine. So the 302 redirect
is theoretically a temporary redirect, and
we might choose to index the kind of source URL instead
of the destination URL. But if you have that redirect
in place for the long run, like more than a– SUN-KYUN SHIN: Five years. JOHN MUELLER:
–short period of– OK, years, then we will treat
that as a permanent redirect, and we will try to index
the destination page. And it's not that there
is any value that is lost because it's a 302 redirect.
It's really, we
have these two URLs and one is redirecting
to the other one. Which one should we index? And we'll try to index the
one that is clear for us as the primary version. SUN-KYUN SHIN: [INAUDIBLE] JOHN MUELLER: Yeah, yeah. I think that should be fine. Yeah. SUN-KYUN SHIN: Thank you, sir. JOHN MUELLER: Sure. Let me just see. There's one more question I
think I missed from YouTube. How long does Google
give a persistent error if the site is
temporarily using 503? So 503 is a temporary
server error code, and from our point of
view, it's something where we don't have a
fixed timeline with regards to when we drop individual
URLs from our index when they return 503.
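A minimal sketch of serving a temporary 503 during maintenance with only the Python standard library; the Retry-After value is an arbitrary example:

```python
from wsgiref.simple_server import make_server

def maintenance_app(environ, start_response):
    """Answer every request with a temporary 503 and a Retry-After hint."""
    start_response(
        "503 Service Unavailable",
        [("Content-Type", "text/plain; charset=utf-8"),
         ("Retry-After", "86400")],  # suggest retrying in one day
    )
    return [b"Down for maintenance, please check back soon."]

if __name__ == "__main__":
    with make_server("", 8000, maintenance_app) as server:
        server.serve_forever()
```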
But we recommend doing that,
at most, for a couple of days. So somewhere between
two to seven days, something like that, at maximum. And at that point, we will start
dropping URLs from the website. So it's not that there
is an exact date or time limit there. It's really– if it's
a temporary error, then it should be temporary. It shouldn't be something that
is in place for a longer time. All right. I think someone
else had a question. ASHWANI KUMAR: Sir, hello. JOHN MUELLER: Hi. ASHWANI KUMAR: Yeah. Sir, again, I had one question. I'm working on a
[INAUDIBLE] site, like pharmaceutical industry. OK. And what I'm realizing there– this industry is
totally different from any other industry, if
I'm working on any IT company or any other industry. So pharmaceutical basically
is totally different. We need to follow each and
everything, like E-A-T, like– so I'm just asking
you, what kind of links I can create for a
[INAUDIBLE] site? Like, content is
good, but we need to create some quality links.
So what do you suggest? What kind of links are good
for a [INAUDIBLE] site? JOHN MUELLER: I mean, the
links should be natural links, so it shouldn't be the case that
you're going off and creating links to your website, right? So that's– ASHWANI KUMAR: Yeah. [INTERPOSING VOICES] JOHN MUELLER: I mean, that's
kind of the starting point there. And based on that,
I don't think you would need to do anything
different in that regard.
Essentially, creating
useful and helpful content that people want to
link to makes sense, and you can't really control
who is linking to your site. So from that point
of view, it's not like you should avoid
having these kind of people link to your website
and instead you should make sure
these kind of people are linking to your website. Like, that's rarely something
that you can really control. So from that point
of view, I think focusing on what
kind of link should I get for my website,
that's probably the wrong approach there. ASHWANI KUMAR: OK, thank you. JOHN MUELLER: But I
do kind of appreciate that especially when it comes
to pharmaceutical and medical websites, you do need
to be a little bit more critical with regards to
what you put on your website, how you work with your
website, all of these things. And that's something like– I think Nicola dropped a lot
of links in the chat about EAT and things like that.
I think that's all
worth looking into. And it does make sense to
spend a little bit more time to think about
what you want to do here and how to do it in a way
that is long term sustainable, and where users, when they go to
your website, they understand, this is a trustworthy website. ASHWANI KUMAR: Sir, one
more question is there. In Google Search Console, I
have submitted my sitemap, OK. But after a couple
of days, what I found is sitemap is indexed
properly, all the web pages are indexed properly, but
what I observe is there is– like, the number of
links found is zero.
I am not getting why it is showing
zero. Would you please let me
know why it is showing zero for the number of links? JOHN MUELLER: Where
are you seeing that? ASHWANI KUMAR: Inside
the Search Console. JOHN MUELLER: Inside– ASHWANI KUMAR: Yeah. The number of links
Search has found is zero. It is showing zero. But [INAUDIBLE] it was showing
like 200 links were there, but right now, it
is showing zero. But each and every page
is indexed properly, but the figure is zero. [INAUDIBLE] it was 200. JOHN MUELLER: So that's
the external links report that you're looking at, or– ASHWANI KUMAR: No. It's the complete
website, internal link. There are 200 pages
on the website. All of them are
indexed properly. But right now, the
figure is not 200. It's zero. It's showing zero. NaN. It's showing NaN. JOHN MUELLER: OK. I don't know. Could you post maybe
a screenshot or link to a screenshot in the chat? I can take a look at that.
I don't know. ASHWANI KUMAR: Yeah,
I will definitely send this screenshot. JOHN MUELLER: OK, cool. All right. I think– let's see,
who do we have here? Dido, I think you're next. DIDO GRIGOROV:
Yes, I just wanted to add a little bit about
the links shared by Nicola and the certification of the
team members of a website, that this is another thing we
have tested that did really, let's say, count. I don't mean that you will
have a blow up of visibility or something like that. But it supports the
trustworthiness of the people to your website,
and it really works. It really works. It's way better. People kind of trust in
the website and the brand better that way
because these people look more like experts
coming from the company.
Yeah. JOHN MUELLER: OK. So kind of like your
co-workers and people who are working on
the website to include more information on those. DIDO GRIGOROV: And the
credibility of the other team members, yeah. This is really important
because it's like– let's imagine we have
a doctor's website. And yes, we know that the
doctor has his degrees and so on and so on. But once they share some
certificates about conferences, seminars, additional
practices they have passed within
their background and so on, it looks
a little bit more credible, and it's easier to trust
this specialist with your health. Yeah. JOHN MUELLER: Yeah. Sounds like a good idea. OK, let me pause
the recording here. I still have a bit
more time, so if you want to stick around
a little bit longer, you're welcome to do so. But thanks for all
of the questions. Thanks for all of the
comments along the way. And thanks for submitting
questions on YouTube, as well.
If you're watching
this recording, I hope you found this
useful, and hopefully I'll see you all again in one of
the future Hangouts as well. All right. Let me see. ASHWANI KUMAR: Thank you. Thank you, sir. JOHN MUELLER: Thanks..