8 On-Page Optimization Techniques that Google Hates

In the past, a wide range of black hat SEO techniques proved to be successful in the short-term.

However, they simply weren’t sustainable.

Many websites implementing such tactics suffered severe penalties that proved extremely difficult, if not impossible, to recover from.

The bottom line is that Google has become much smarter and continues to advance its algorithm every day.

For this reason, SEO is constantly evolving, and our strategies for on-page content optimization must evolve along with it.

The downside?

Doing white hat SEO takes a lot of time and expertise.

It requires:

  • A well-constructed strategy.
  • A variety of on-site and off-site initiatives.
  • Quality content.
  • The ability to adapt to a constantly changing landscape.

But, the payoff is real.

In this article, I’ll be looking at some of the many old-school SEO techniques that you should avoid – or else risk getting penalized.

1. Keyword Stuffing

Offering value to readers should be at the forefront of all SEO and content marketing tactics.

In the past, marketers could get away with producing a ton of thin content that provided little value to users but would still get them to the top of search results.

Keyword stuffing was one of the most common content tactics because well… it was so easy!

Marketers would simply identify the keyword(s) they wanted to rank for.

Then they'd produce content that skimmed the topic at a high level – no real depth – and stuff it with the exact-match keyword(s).

Finally, they'd make sure the title, page tagging, and headers were stuffed with the keyword as well.

Picture a page that repeats "best birthday gifts" in the title, every header, and nearly every sentence – whether or not it reads naturally.

Google is now much better at understanding what content provides value and answers the questions that users are asking.

For this reason, this on-page optimization tactic was made obsolete by the Panda update in 2011.

How to Avoid It

To help ensure you’re offering real value and not participating in the act of keyword stuffing, here are a few questions that you should be asking yourself when crafting content:

  • Does the content on this page truly align with the title and page tagging?
  • What types of content are showing up in top search results around this keyword? Is my content better or is it just providing more noise?
  • How many times did I use the keyword on the page? Are there any uses that don’t read naturally?
  • What actionable next steps could readers take from this content?
  • Could this content have been added to an existing page instead?
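To make the "how many times did I use the keyword" question concrete, a quick script can count exact-match occurrences and their share of the page's words. This is only a sanity check – there is no official density threshold, and the example phrase is invented:

```python
import re

def keyword_density(text: str, keyword: str) -> tuple[int, float]:
    """Count exact-match keyword occurrences and their share of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_tokens = keyword.lower().split()
    n = len(kw_tokens)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_tokens)
    # Density = words belonging to keyword matches as a fraction of all words.
    density = (hits * n) / len(words) if words else 0.0
    return hits, density

hits, density = keyword_density(
    "Buy cheap widgets. Our cheap widgets are the best cheap widgets.",
    "cheap widgets",
)
print(hits, round(density, 2))  # 3 matches, 0.55 density - clearly stuffed
```

If a phrase accounts for a large share of the copy, some of those uses almost certainly don't read naturally.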

2. Only Optimizing for Desktop

Mobile optimization was not always a big focus for SEO pros in the past.

However, there is no denying that it will be fundamental for the future.

Google first announced that it was experimenting with mobile-first indexing in 2016.

And ever since then there’s been a ton of conversation around the topic.

Most recently, John Mueller confirmed that Google’s mobile-first index will exclude all desktop content as of March 2021.

While he provided quite a bit of information in his recent Pubcon presentation, the most important takeaways are that desktop-only sites will be dropped from Google’s index, and any images or other assets that exist only in the desktop version of a site will be dropped as well.

How to Avoid It

Mobilegeddon is in play, and marketers who aren’t optimizing for mobile will surely be left behind.

Make sure that the mobile version of your website correctly reflects the content that you want Google to index and rank in search results.

With an m-dot site, desktop users may be sent to the mobile version of your site from the SERPs.

For those with an M-dot site, it’s recommended that you identify desktop users and redirect them to the correct desktop version of your website.

3. Targeting Keywords for Traffic, Not Intent

In the old days, marketers would include popular (and often inappropriate) keywords in their content for the sole purpose of gaining traffic.

However, once the users arrived on the page, they would not convert due to the simple fact that what was provided on the page was not aligned with what they were searching for.

While high volume keywords present the opportunity to drive traffic to your website, targeting keywords that aren’t relevant to your brand can cause serious harm.

Not only will you have the wrong eyes on your content, but you will also confuse search engines and risk getting penalized.

How to Avoid It

Stick to your niche.

Understand that as Google indexes pages on your website, it takes into consideration all of the content and topics you address across the domain (beyond just that page), then ranks each page for the queries it deems relevant.

My suggestion is to write about what you want to be known for – by users and by search engines.

If you are a software organization, don’t write a series of controversial articles about the presidential election.

Content that is irrelevant to your brand will only create more noise and confusion for search engines.
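When choosing keywords, a rough first pass is to triage them by the intent their modifier words signal. The modifier lists below are illustrative assumptions, not an official taxonomy:

```python
TRANSACTIONAL = {"buy", "price", "pricing", "discount", "coupon", "order"}
INFORMATIONAL = {"how", "what", "why", "guide", "tutorial", "examples"}

def classify_intent(keyword: str) -> str:
    """Very rough intent triage based on modifier words (illustrative lists)."""
    tokens = set(keyword.lower().split())
    if tokens & TRANSACTIONAL:
        return "transactional"
    if tokens & INFORMATIONAL:
        return "informational"
    return "navigational/ambiguous"

for kw in ("buy crm software", "how to clean data in excel", "acme login"):
    print(kw, "->", classify_intent(kw))
```

The point isn't the labels themselves – it's forcing yourself to ask whether the searcher's intent matches what your page (and your brand) actually offers.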

4. Unnatural Internal Linking

Internal linking can be an extremely powerful strategy for SEO if done properly.

Done well, it connects your webpages and creates a clear structure and path for search engines and users to follow.

Both users and search engines reference links to find content on your website.

Google uses links to understand the relationship between content and attribute link value while readers use links to find valuable, related information.

Unfortunately, marketers often walk a fine line when it comes to internal linking.

If you are repeatedly linking to your site’s top pages with keyword-rich anchor text and in ways that don’t make sense or feel natural, this may be flagged by Google.

Getting hit with this type of over-optimization penalty is far from ideal.

At one point in time, marketers could dramatically and almost instantly improve keyword rankings by implementing internal links from keyword-rich anchor text.

All you had to do was link off to the same URL from several keyword variations.

However, similarly to the spammy tactics we’ve covered already, Google caught on!

How to Avoid It

Internal links should provide users with relevant information around the anchor text and topic at hand.

Ask yourself:

  • Does this link provide real value to users?
  • Does this link make sense with the anchor text that I am linking out from?
  • What other content might users want to explore after reading this page?
  • Would users expect to land on a page with this type of information, or is the anchor text misleading?
  • How often am I linking off to this page (on this page and other content)?
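The last question – how often you're linking to a page – is easy to audit. Here's a sketch using only the standard library's HTML parser, with an illustrative threshold of three identical keyword-rich anchors (the threshold and sample markup are assumptions):

```python
from collections import Counter
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect (href, anchor text) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

page = """
<p><a href="/shoes">best running shoes</a> and <a href="/shoes">best running shoes</a>
and <a href="/shoes">best running shoes</a> and <a href="/about">about us</a></p>
"""
collector = LinkCollector()
collector.feed(page)
counts = Counter(collector.links)
for (href, text), n in counts.items():
    if n >= 3:  # arbitrary example threshold, not a Google rule
        print(f"flag: {n}x identical anchor '{text}' -> {href}")
```

Running something like this across your templates makes it obvious when one page is collecting the same exact-match anchor over and over.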

5. Unique Pages for Every Keyword Variation

While keywords are by no means dead, they don’t play the same role in SEO that they used to.

For a long time, keywords were the end-all, be-all of getting your pages to rank at the top of the SERPs.

Though making the proper keyword selections is important, you can no longer stuff your site with multiple keyword variations and expect it to boost rankings.

At one point, obtaining a top position in search results required creating unique pages for every keyword variation you wanted to rank for.

For example, if you wanted to rank for the following keywords, you would need to create a dedicated page for each:

  • “Personalized birthday gifts.”
  • “Custom personalized birthday gifts.”
  • “Custom birthday presents.”

This SEO method allowed marketers to target keywords individually and rank for every variation of a keyword.

While it was 100% acceptable years ago, today, this tactic would lead to some serious keyword cannibalization and certainly cause more harm than good.

How to Avoid It

Instead of creating unique pages for every keyword variation you want to rank for, focus on identifying keywords with the right intent.

Create one amazing piece of content and see if it ranks for multiple keyword variations.

If it doesn’t, you can always revisit it for further optimizations but make sure to give it enough time to take off.

Again, be cautious that you aren’t over-optimizing your website.

Panda, Hummingbird, RankBrain, and Fred are constantly looking to take down sites that abuse keywords.
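One way to spot variations that belong on a single page is to normalize keywords and group the ones that collapse together. The tiny stoplist and crude singular/plural folding below are illustrative heuristics, not real stemming:

```python
from collections import defaultdict

def normalize(keyword: str) -> frozenset[str]:
    """Collapse a keyword into a set of content words.
    Stoplist and plural folding are crude, illustrative heuristics."""
    stop = {"for", "the", "a", "and"}
    tokens = set()
    for word in keyword.lower().split():
        if word in stop:
            continue
        # Fold trailing "s" so "gifts" and "gift" match.
        tokens.add(word[:-1] if word.endswith("s") else word)
    return frozenset(tokens)

groups = defaultdict(list)
for kw in ("personalized birthday gifts",
           "custom personalized birthday gifts",
           "personalized birthday gift"):
    groups[normalize(kw)].append(kw)

for variants in groups.values():
    if len(variants) > 1:
        print("one page can target:", variants)
```

Variations that land in the same group are candidates for one consolidated page rather than separate, cannibalizing ones.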

6. Spammy Footers

Footers are so important to help users:

  • Learn more about your brand.
  • Navigate key sections of your website.
  • Find the information they are looking for.

The footer also offers a chance to provide essential content like location, contact information, social media profiles, privacy policies, copyright notices, and much more.

For an example of an optimized footer, take IBM’s.

It pairs IBM’s logo with links to key locations on the website like Products, Services, Industries, Demos, etc.

The footer also features a whole section dedicated to learning more about the brand – careers, events, news, partnerships, and much more.

It also provides other ways to connect with the brand including support, social media channels, and a way to contact IBM.

It’s a pretty intuitive experience.

Now, instead, imagine if that same footer were stuffed with hundreds of links and tags.

Yuck!

It’s no wonder why websites that were using this spammy tactic were penalized by two algorithm updates – Panda (targeting poor site structure) and Penguin (aimed at sites participating in link and tag manipulation).

How to Avoid It

Keep the user experience top of mind when optimizing both the footer and header of your site.

Make sure your footer provides vital information like:

  • Key navigational pages.
  • About the company.
  • Contact information.
  • Copyrights/policies.
  • Social media channels.
  • Any subscription fields and more.

But, please don’t use it to spam search engines!

You will not succeed.
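A quick audit can tell you whether your footer is drifting in the spammy direction. The 100-link threshold and naive regex scan below are illustrative – a real audit would use a proper HTML parser:

```python
import re

def footer_link_count(html: str) -> int:
    """Count anchor tags inside the first <footer>...</footer> block
    (naive regex scan - fine for a quick audit, not a real parser)."""
    match = re.search(r"<footer\b.*?</footer>", html, re.S | re.I)
    if not match:
        return 0
    return len(re.findall(r"<a\b", match.group(0), re.I))

page = "<footer>" + "".join(
    f'<a href="/p{i}">page {i}</a>' for i in range(120)
) + "</footer>"
links = footer_link_count(page)
# 100 is an arbitrary illustrative threshold, not a Google rule.
print(f"{links} footer links" + (" - looks spammy" if links > 100 else ""))
```

If the count is in the hundreds, odds are those links exist for crawlers rather than for users.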

7. Cloaking

Now, this one is a throwback in the world of SEO.

This old-school tactic is strictly forbidden by search engines for very good reasons.

The art (or lack thereof) of cloaking refers to the method of delivering a certain page to search engine crawlers, while delivering a completely different page to the human eye.

This was an advanced black hat SEO method.

Marketers would identify search engine crawlers by IP in order to serve them different web page content.

It also involved abusing server-side scripts and backend code.

Why?

To manipulate search engines.

Marketers could get their pages to rank, while still providing an ideal experience for users.

Additionally, some marketers would combine this tactic with targeting irrelevant keywords for traffic (discussed above).

For example, you would search “cute otter pictures” and end up on a real estate site.
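In miniature, the mechanism looked something like this – shown purely to illustrate what gets penalized, not something to deploy. The crawler hints and page copy are invented:

```python
CRAWLER_HINTS = ("googlebot", "bingbot")

def cloaked_response(user_agent: str) -> str:
    """Cloaking in miniature: one page for crawlers, another for humans.
    Shown only to illustrate the (heavily penalized) mechanism."""
    if any(bot in user_agent.lower() for bot in CRAWLER_HINTS):
        # Keyword-stuffed page served only to search engine crawlers.
        return "<h1>Cute otter pictures</h1> keyword-rich copy for crawlers"
    # Unrelated page that human visitors actually saw.
    return "<h1>Luxury real estate listings</h1> what users actually saw"

print(cloaked_response("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(cloaked_response("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))
```

Google compares what its crawlers fetch against what users see (and crawls from undeclared IPs), which is exactly why this divergence is so easy to catch today.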

How to Avoid It

This one is pretty straightforward – don’t cloak.

Avoid learning to abuse server-side scripts and “behind the scenes” code all for a quick boost in SERPs.

8. Content Swapping

Last but not least, let’s talk about another advanced black hat SEO method – content swapping.

This was another way of manipulating Google’s algorithm to get content to rank.

It went a little something like this:

  • Publish content to your website.
  • Wait for Google to crawl and index that content.
  • Verify that the page is being displayed in search results.
  • Block the page (or even entire site) from being indexed.
  • Swap content to what you actually want to appear.

A classic example is a page that originally covered tobacco pipes but was swapped out for content featuring banned substances.

Google wasn’t always as quick as it is today.

This was always a forbidden SEO tactic, but black hatters got away with it because Google was once slow to reindex websites.

Now, Google almost always cuts blocked pages from the index immediately.
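The white-hat corollary: since a blocked page gets dropped quickly, it's worth scanning your own pages for stray noindex directives left behind by a staging setup or template change. A regex-based sketch (a real audit would use a proper HTML parser and also check the X-Robots-Tag header):

```python
import re

def is_noindexed(html: str) -> bool:
    """Detect a robots meta tag carrying a noindex directive (regex sketch)."""
    for tag in re.findall(r"<meta\b[^>]*>", html, re.I):
        if re.search(r'name\s*=\s*["\']robots["\']', tag, re.I) and "noindex" in tag.lower():
            return True
    return False

print(is_noindexed('<head><meta name="robots" content="noindex, nofollow"></head>'))  # True
print(is_noindexed('<head><meta name="robots" content="index, follow"></head>'))      # False
```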

How to Avoid It

Obviously, content swapping is no longer a workable strategy and can result in serious penalties.

Remember that quality is king.

If you want Google to rank your page, make sure that you are providing top-notch content that offers significant value to your readers.

Conclusion

Avoid these on-page optimization techniques that Google hates.

Instead, implement the best practices provided.

When in doubt, think about what’s best for the user experience and you’ll be on your way to boosted rankings and a brighter future.

Image Credits

All screenshots taken by author, November 2020
