Google’s John Mueller was asked whether blocking Google from an anti-ad blocker detection function would cause Google to see the page as cloaking. Mueller explains what cloaking is and why excluding Googlebot from an ad blocker detection script isn’t cloaking.
Cloaking Violates Google’s Guidelines
Cloaking is an old trick in which a web page shows different content depending on whether the visitor is a search engine bot or a regular human user.
In the very old days adding keywords multiple times on a page could help that page rank better. That was called “keyword spamming.”
However, such a page looked bad and untrustworthy, and visitors were more apt to back out of it than to click an affiliate link and earn the site owner a referral fee.
So what spammers did was show search engines a page full of keywords in order to help it rank.
Human users, however, would see a nice, normal page that converted better because it didn’t look spammy.
Google’s Search Central guidelines provide these examples to help understand what cloaking is:
“Serving a page of HTML text to search engines, while showing a page of images to users
Inserting text or keywords into a page only when the user agent that’s requesting the page is a search engine, not a human visitor”
Could Ad Blocker Cause Cloaking?
The person asking the question said that they were thinking about adding an anti-ad blocker to their site. An anti-ad blocker prevents visitors who have an ad blocker enabled from seeing the content.
The goal is to train visitors to whitelist the website so that they can see the content and the advertisements.
This is the question:
“We have a site that is considering adding ad blocker detection to prevent users from accessing the site whenever the ad blocker is on.
The question here is, if we decide to exclude Googlebot from seeing the ad block detection, will we be flagged for cloaking in that situation?”
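For context, a common anti-ad blocker pattern looks something like the hypothetical TypeScript sketch below (an illustration, not the asker’s actual implementation): it inserts a “bait” element that ad block filter lists typically hide, checks whether that element was hidden or removed, and skips the check entirely when the user agent looks like Googlebot.

```typescript
// Hypothetical sketch of a common anti-ad blocker pattern; not the
// setup described in the question.

function looksLikeGooglebot(): boolean {
  // Simplistic user-agent check; real crawler verification would also
  // use a reverse DNS lookup, which is out of scope for this sketch.
  return /Googlebot/i.test(navigator.userAgent);
}

function detectAdBlocker(onDetected: () => void): void {
  // Skip detection for Googlebot, which is the scenario the question asks about.
  if (looksLikeGooglebot()) {
    return;
  }

  // Create a bait element with class names that ad block filter lists commonly target.
  const bait = document.createElement("div");
  bait.className = "ad adsbox ad-banner";
  bait.style.cssText = "position:absolute;left:-9999px;height:10px;";
  document.body.appendChild(bait);

  // Give the ad blocker a moment to act, then check whether the bait
  // element was hidden or removed.
  window.setTimeout(() => {
    const blocked = bait.offsetHeight === 0 || !bait.parentElement;
    bait.remove();
    if (blocked) {
      onDetected();
    }
  }, 100);
}

// Example usage: block the content and ask the visitor to whitelist the site.
detectAdBlocker(() => {
  console.log("Ad blocker detected: prompt the visitor to whitelist the site.");
});
```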
The situation described isn’t really about showing different content to users and Google.
It’s really about creating two classes of site visitors (in addition to the administrator status of whoever runs the website).
Visitors without ad blockers have higher privileges, which entitle them to read the content.
Visitors with ad blockers enabled have fewer privileges, which deprives them of the opportunity to read the content.
This is how John Mueller answered whether the situation amounted to cloaking:
“Probably not. I think in general that would be fine.
I would kind of see that as a way of recognizing that Googlebot doesn’t actually have an ad blocker installed.
So it’s kind of a unique setup that Googlebot has with regards to rendering pages and I think that would kind of be okay.”
Mueller didn’t see this as showing different content to humans and Googlebot. He saw it as recognizing that Googlebot doesn’t have an ad blocker installed, which entitles it to see the content.
John followed up by explaining more about cloaking:
“In regards to cloaking, the cloaking team mostly tries to watch out for situations where you’re really showing something different to users as to Googlebot.
And with regards to.. ad blocking or …other kind of things where it’s like you have to be logged in to actually see the content and that’s kind of different.”
Mueller moved on to note that he’s not a fan of “anti-ad blocking setups” but acknowledged that if a site needs to do it then that’s an “appropriate approach.”
The person asking the question next asked whether serving Google the anti-ad blocker overlay on top of the content would create indexing problems.
Mueller:
“If it’s an HTML overlay on top of the existing page then I don’t see that as being problematic because we would still see the actual content in the HTML, kind of, behind that.
That’s similar to like if you have …a cookie banner or a cookie interstitial that you’re essentially showing just an HTML div on top of the page.
From our point of view if we can still index the actual content from the page then that’s fine.”
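To make that concrete, here is a hypothetical TypeScript sketch (an assumption about how such an overlay might be built, not the asker’s implementation) in which the page content stays in the HTML and the notice is just a div layered on top of it, so a crawler rendering the page can still index the underlying text:

```typescript
// Hypothetical sketch of an HTML overlay of the kind Mueller describes:
// the page content remains in the DOM and the notice is simply a div
// layered on top of it.

function showWhitelistOverlay(message: string): void {
  const overlay = document.createElement("div");
  overlay.id = "adblock-overlay";
  // Full-viewport layer sitting above the page; the content underneath
  // is untouched and still present in the HTML.
  overlay.style.cssText = [
    "position:fixed",
    "inset:0",
    "z-index:9999",
    "display:flex",
    "align-items:center",
    "justify-content:center",
    "background:rgba(0,0,0,0.8)",
    "color:#fff",
  ].join(";");
  overlay.textContent = message;
  document.body.appendChild(overlay);
}

// Example usage, for instance from the detection callback in the earlier sketch:
showWhitelistOverlay(
  "Please whitelist this site in your ad blocker to continue reading."
);
```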
Cloaking and Google
Cloaking is a very specific thing with a definite intent to deceive Google and site visitors for the purpose of achieving better search engine rankings.
Showing different content based on what a site visitor is entitled to see based on their status or user privilege is something else entirely.
News organizations routinely distinguish between a paid visitor and a non-subscribed visitor, creating two classes of site visitors.
Forum software does something similar, such as not allowing search engines (and unregistered site visitors) to view user profiles.
In both of those cases it’s about creating different site visitor classes and showing different things based on how they’re categorized.
Cloaking is showing unique content to search engines for ranking purposes, something completely different.
Citation
Watch the Google SEO office-hours hangout; the segment is located at the 19:46 minute mark.