Google’s John Mueller answered a question from someone whose site was deindexed and lost its rankings. Mueller offered a list of technical issues that can cause Google to remove a website from the search results.
What’s good about this question and answer is that Mueller discusses two kinds of deindexing, a slow deindexing and a faster deindexing.
SEO Office-hours hangouts are not the place to ask for a diagnosis for a specific website. So it’s reasonable that Mueller did not give the person asking the question a direct answer specific to their website.
Upgraded Yoast SEO from Free to Premium and Lost Rankings
This is the question that was asked:
“I own a site and it was ranking good before 23rd of March. I upgraded from Yoast SEO… free to premium. After that the site go deindexed from Google and we lost all our keywords.”
The person asking the question noted that for the past few days the keywords returned to the search results for a few hours and then would disappear.
They said they checked Robots.txt, and checked the sitemaps and verified there were no manual penalties.
One thing he didn’t mention checking was whether the web pages contained a Robots Noindex meta tag.
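As a quick sanity check, a short script can confirm whether a page is serving a noindex directive, either in a robots meta tag or in an X-Robots-Tag HTTP header. The sketch below is not from the video; it is a minimal illustration using only the Python standard library, and the URL is a placeholder.

```python
# Minimal sketch: report whether a page serves a robots "noindex" directive
# via a meta tag or the X-Robots-Tag HTTP header. The URL is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> or <meta name="googlebot"> tags."""

    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() in ("robots", "googlebot"):
            self.robots_directives.append(attrs.get("content", "").lower())


def check_noindex(url):
    with urlopen(url) as response:
        header = response.headers.get("X-Robots-Tag", "")
        html = response.read().decode("utf-8", errors="replace")

    parser = RobotsMetaParser()
    parser.feed(html)

    meta_noindex = any("noindex" in d for d in parser.robots_directives)
    header_noindex = "noindex" in header.lower()
    return meta_noindex, header_noindex


if __name__ == "__main__":
    meta, header = check_noindex("https://example.com/")  # placeholder URL
    print(f"noindex in robots meta tag: {meta}")
    print(f"noindex in X-Robots-Tag header: {header}")
```

Running this against a handful of key pages takes seconds and rules out one of the most common accidental causes of deindexing.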
Mueller Asserts Yoast Plugin Not the Reason Site Was Deindexed
Google’s Mueller begins his answer by speculating that the deindexing isn’t connected to updating the Yoast plugin from the free version to the premium version.
I think it is reasonable to start with the Yoast plugin and look at the settings. I have had it happen to me where I installed the Yoast SEO Plugin and subsequently discovered that pages had somehow acquired a “noindex, follow” robots meta tag.
I have no idea what caused that to happen, I just noticed that it happened.
In my experience it’s a good practice to not dismiss anything as a reason without first checking it.
So I have to disagree with dismissing the Yoast SEO plugin upgrade as a cause before checking it and ruling it out.
Mueller answered:
“I don’t know… it sounds kind of tricky… I would say offhand it probably doesn’t have to do with the updating of your plugin.”
Screenshot of John Mueller discussing the different ways that Google removes websites from its index.
Why Google Deindexes Websites
Mueller next offers insights into the deindexing process including a long deindexing scenario where parts of a site are slowly deindexed because Google doesn’t consider them relevant.
Mueller Discusses Slow Partial Deindexing
Mueller next discusses a slow deindexing of parts of a site rather than the entire site. What he describes is a partial deindexing.
Mueller:
“But it could very well be a technical issue somewhere.
Because usually… when we reduce the indexing of a site, when we say we don’t need to have as many URLs indexed from a website, we tend to keep the… URLs that are more relevant for that site and that tends to be something that happens over… I don’t know… this longer period of time where it like slowly changes the indexing.”
What the person asking the question described was not a slow or partial deindexing. Their problem was a total site deindexing.
John Mueller Discusses Full Site Deindexing
Next Mueller described the possible reason why a site might experience a complete deindexing.
Mueller:
“So if you’re seeing something where like the whole site disappears from indexing, it almost sounds like something that might be related to a technical issue… something along those lines.”
Mueller next goes on to recommend going to the Webmaster Help Forums to ask for help in diagnosing the specific issue, something that is inappropriate for the Google SEO Office-hours hangout but appropriate to ask in the Google forums.
Mueller suggests it could be a technical issue, a site quality issue, a spam issue, or possibly a hacking event.
Many Types of and Reasons for Deindexing Events
If a site is being deindexed, it’s good to check not only the Robots.txt file but also the source code of the individual pages themselves to make sure there isn’t a rogue noindex robots meta tag blocking Google from indexing the web page.
There are many reasons why a site could be deindexed beyond an accidental robots.txt block or robots meta tag, as Mueller noted. Reasons such as a hacking event or another technical issue that could be blocking Google should be investigated, and nothing should be ruled out until it has been checked.
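Verifying the robots.txt side of that checklist can also be scripted. The sketch below is not from the article; it is a minimal illustration using Python’s standard-library robots.txt parser, and the URLs are placeholders.

```python
# Minimal sketch: confirm that Googlebot is allowed to crawl key URLs
# according to the site's robots.txt. The URLs are placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://example.com/robots.txt")  # placeholder
robots.read()

for url in ["https://example.com/", "https://example.com/some-page/"]:
    allowed = robots.can_fetch("Googlebot", url)
    print(f"{url} -> {'crawlable' if allowed else 'blocked by robots.txt'}")
```

A check like this won’t explain a quality, spam, or hacking issue, but it quickly confirms or eliminates the simplest technical causes.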
Aside from that, the information about the slow partial deindexing and the total site deindexing was useful insight into how Google deindexes websites.
Citation
Watch John Mueller answer why websites get deindexed.
He answers the question at about the seven-minute mark.