How to Perform an In-Depth Technical SEO Audit

I’m not going to lie: Conducting an in-depth SEO audit is a big deal.

And, as an SEO consultant, there are few sweeter words than, “Your audit looks great! When can we bring you onboard?”

Even if you haven’t been actively looking for a new gig, knowing you nailed your SEO audit is a huge ego boost.

But, are you terrified to start? Is this your first SEO audit? Or, maybe you just don’t know where to begin?

Sending a fantastic SEO audit to a potential client puts you in the best possible place.

So take your time. Remember: Your primary goal is to add value for your customer with your site recommendations, both short-term and long-term.

In this column, I’ve put together the need-to-know steps for conducting an SEO audit, along with a little insight into the first phase of my process when I take on a new client. It’s broken down into sections below. If you feel like you have a good grasp on a particular section, feel free to jump to the next.

When Should I Perform an SEO Audit?

After a potential client sends me an email expressing interest in working together and answers my survey, we set up an intro call (Skype or Google Hangouts is preferred).

Before the call, I do my own quick mini SEO audit (I invest at least an hour in manual research) based on their survey answers to become familiar with their market landscape. It’s like dating someone you’ve never met.

You’re obviously going to stalk them on Facebook, Twitter, Instagram, and all other channels that are public #soIcreep.

Here’s an example of what my survey looks like:

Here are some key questions you’ll want to ask the client during the first meeting:

  1. What are your overall business goals? What are your channel goals (PR, social, etc.)?
  2. Who is your target audience?
  3. Do you have any business partnerships?
  4. How often is the website updated? Do you have a web developer or an IT department?
  5. Have you ever worked with an SEO consultant before? Or, had any SEO work done previously?

Sujan Patel also has some great recommendations on questions to ask a new SEO client.

After the call, if I feel we’re a good match, I’ll send over my formal proposal and contract (thank you HelloSign for making this an easy process for me!).

To begin, I always like to offer my clients the first month as a trial period to make sure we vibe.

This gives both the client and me a chance to become friends first before dating. During this month, I’ll take my time to conduct an in-depth SEO audit.

These SEO audits can take me anywhere from 40 to 60 hours, depending on the size of the website. They are bucketed into three separate parts and presented with Google Slides.

  • Technical: Crawl errors, indexing, hosting, etc.
  • Content: Keyword research, competitor analysis, content maps, metadata, etc.
  • Links: Backlink profile analysis, growth tactics, etc.

After that first month, if the client likes my work, we’ll begin implementing the recommendations from the SEO audit. And going forward, I’ll perform a mini-audit monthly and an in-depth audit quarterly.

To recap, I perform an SEO audit for my clients:

  • First month.
  • Monthly (mini-audit).
  • Quarterly (in-depth audit).

What You Need from a Client Before an SEO Audit

When a client and I start working together, I’ll share a Google Doc with them requesting a list of passwords and vendors.

This includes:

  • Google Analytics access and any third-party analytics tools.
  • Google and Bing ads.
  • Webmaster tools.
  • Website backend access.
  • Social media accounts.
  • List of vendors.
  • List of internal team members (including any work they outsource).

Before you begin your SEO audit, here’s a recap of the tools I use:

Conducting a Technical SEO Audit

Tools needed for technical SEO audit:

  • Screaming Frog.
  • DeepCrawl.
  • Copyscape.
  • Integrity for Mac (or Xenu Sleuth for PC users).
  • Google Analytics (if given access).
  • Google Search Console (if given access).
  • Bing Webmaster Tools (if given access).

Step 1: Add Site to DeepCrawl and Screaming Frog

Tools:

  • DeepCrawl.
  • Copyscape.
  • Screaming Frog.
  • Google Analytics.
  • Integrity.
  • Google Tag Manager.
  • Google Analytics code.

What to Look for When Using DeepCrawl

The first thing I do is add my client’s site to DeepCrawl. Depending on the size of the site, the crawl may take a day or two to finish.

Once you get your DeepCrawl results back, here are the things I look for:

Duplicate Content

Check out the “Duplicate Pages” report to locate duplicate content.

If duplicate content is identified, I’ll make rewriting these pages a top priority in my recommendations to the client, and in the meantime, I’ll add a canonical tag to the duplicate pages.

Common duplicate content errors you’ll discover:

  • Duplicate meta titles and meta descriptions.
  • Duplicate body content from tag pages (I’ll use Copyscape to help determine if something is being plagiarized).
  • Two domains (ex: yourwebsite.co, yourwebsite.com).
  • Subdomains (ex: jobs.yourwebsite.com).
  • Similar content on a different domain.
  • Improperly implemented pagination pages (see below).

How to fix:

  • Add the canonical tag on your pages to let Google know what you want your preferred URL to be.
  • Disallow incorrect URLs in the robots.txt.
  • Rewrite content (including body copy and metadata).
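The canonical tag itself is a single line in the page’s head. Here’s a minimal sketch, with yourwebsite.com standing in as a placeholder domain:

```html
<!-- In the <head> of each duplicate page, pointing at the preferred URL -->
<link rel="canonical" href="https://www.yourwebsite.com/preferred-page/" />
```

Google treats this as a strong hint (not a directive) about which URL to index, so pair it with the robots.txt and rewriting fixes above.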

Here’s an example of a duplicate content issue I had with a client of mine. As you can see below, they had URL parameters without the canonical tag.

These are the steps I took to fix the issue:

  • Fixed any 301 redirect issues.
  • Added a canonical tag to the page I want Google to crawl.
  • Updated the Google Search Console parameter settings to exclude any parameters that don’t generate unique content.
  • Added disallow rules to the robots.txt for the incorrect URLs to improve crawl budget.
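The robots.txt disallow rules for URL parameters can be as simple as the following sketch (the parameter names here are placeholders; use whichever parameters actually appear on the site and don’t generate unique content):

```
# Block crawling of parameterized duplicates
User-agent: *
Disallow: /*?sort=
Disallow: /*?sessionid=
```

Keep in mind that disallowing a URL stops it from being crawled, but a canonical tag on that URL can’t be seen by Google if the page is blocked, so choose one approach per URL pattern.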

Pagination

There are two reports to check out:

  • First Pages: To find out what pages are using pagination, review the “First Pages” report. Then, you can manually review the pages using this on the site to discover if pagination is implemented correctly.
  • Unlinked Pagination Pages: To find out if pagination is working correctly, the “Unlinked Pagination Pages” report will tell you if the rel=”next” and rel=”prev” are linking to the previous and next pages.

In this example below, I was able to find that a client had reciprocal pagination tags using DeepCrawl:

How to fix:

  • If you have a “view all” or a “load more” page, add a rel=”canonical” tag. Here’s an example from Crutchfield:
  • If you have all your pages on separate pages, then add the standard rel=”next” and rel=”prev” markup. Here’s an example from Macy’s:
  • If you’re using infinite scrolling, add the equivalent paginated page URL in your JavaScript. Here’s an example from American Eagle.
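For the standard paginated series, the rel=”next”/rel=”prev” markup sits in the head of each page in the sequence. Here’s a sketch for page 2 of a series (the category URL is a placeholder):

```html
<!-- In the <head> of https://www.yourwebsite.com/category?page=2 -->
<link rel="prev" href="https://www.yourwebsite.com/category?page=1" />
<link rel="next" href="https://www.yourwebsite.com/category?page=3" />
```

The first page of the series carries only rel=”next”, and the last page only rel=”prev”; a reciprocal tag error like the one above usually means two pages each point at the other as “next.”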

Max Redirections

Review the “Max Redirections” report to see all the pages that redirect more than 4 times. John Mueller mentioned in 2015 that Google can stop following redirects if there are more than five.

While some people refer to these crawl errors as eating up the “crawl budget,” Gary Illyes refers to this as “host load.” It’s important to make sure your pages render properly because you want your host load to be used efficiently.

Here’s a brief overview of the response codes you might see:

  • 301 – These are the majority of the codes you’ll see throughout your research. 301 redirects are okay as long as there is only one redirect and no redirect loop.
  • 302 – These codes are okay, but if left longer than 3 months or so, I would manually change them to 301s so that they are permanent. This is an error code I’ll see often with ecommerce sites when a product is out of stock.
  • 400 – Users can’t get to the page.
  • 403 – Users are unauthorized to access the page.
  • 404 – The page is not found (usually meaning the client deleted a page without a 301 redirect).
  • 500 – Internal server error. You’ll need to connect with the web development team to determine the cause.

How to fix:

  • Remove any internal links pointing to old 404 pages and update them with the redirected page internal link.
  • Undo the redirect chains by removing the middle redirects. For example, if redirect A goes to B, then C, then D, you’ll want to undo redirects B and C. The final result will be redirect A going straight to D.
  • There is also a way to do this in Screaming Frog and Google Search Console, if you’re using those tools (see below).
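To illustrate collapsing a chain, here’s a small Python sketch (not part of any tool mentioned above) that takes a map of source-to-target redirects and rewrites every source to point at its final destination:

```python
def collapse_redirects(redirects):
    """Rewrite each source URL to point at the final target of its chain.

    redirects: dict mapping source URL -> immediate redirect target.
    Returns a new dict where every source points directly at the end of
    its chain, so A -> B -> C -> D becomes A -> D, B -> D, and C -> D.
    """
    collapsed = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        # Follow the chain until we reach a URL that doesn't redirect.
        while target in redirects:
            if target in seen:
                # Redirect loop detected; leave it for manual review.
                break
            seen.add(target)
            target = redirects[target]
        collapsed[source] = target
    return collapsed

chain = {"/a": "/b", "/b": "/c", "/c": "/d"}
print(collapse_redirects(chain))  # {'/a': '/d', '/b': '/d', '/c': '/d'}
```

In practice you’d export the redirect pairs from your crawler and then update each internal link and server rule to use the collapsed target directly.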

What to Look for When Using Screaming Frog

The second thing I do when I get a new client site is to add their URL to Screaming Frog.

Depending on the size of the site, I may configure the settings to crawl specific areas of the site at a time.

Here is what my Screaming Frog spider configurations look like:

You can do this in your spider settings or by excluding areas of the site.

Once you get your Screaming Frog results back, here are the things I look for:

Google Analytics Code

Screaming Frog can help you identify what pages are missing the Google Analytics code (UA-1234568-9). To find the missing Google Analytics code, follow these steps:

  • Go to Configuration in the navigation bar, then Custom.
  • Add analytics.js to Filter 1, then change the drop-down to Does not contain.

How to fix:

  • Contact your client’s developers and ask them to add the code to the specific pages that are missing it.
  • For more Google Analytics information, skip ahead to that Google Analytics section below.
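If you’d rather spot-check a handful of pages by hand, the check Screaming Frog performs is essentially a substring search on the page source. Here’s a minimal Python sketch of the same idea (the UA ID pattern is a placeholder format, and gtag.js is included as an assumption for sites on the newer snippet):

```python
import re

def has_analytics(html):
    """Return True if the page source appears to contain Google Analytics.

    Looks for the analytics.js or gtag.js loader, or a UA-style tracking
    ID, mirroring the "Does not contain analytics.js" filter described
    above.
    """
    patterns = [
        r"analytics\.js",         # classic analytics.js loader
        r"gtag/js",               # newer gtag.js loader
        r"UA-\d{4,10}-\d{1,4}",   # UA-style tracking ID, e.g. UA-1234568-9
    ]
    return any(re.search(p, html) for p in patterns)

page = '<script src="https://www.google-analytics.com/analytics.js"></script>'
print(has_analytics(page))             # True
print(has_analytics("<html></html>"))  # False
```

For a full site, the crawler-based approach above is still the way to go; a script like this is only useful for quickly confirming a suspect page or two.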

Google Tag Manager

Screaming Frog can also help you find out what pages are missing the Google Tag Manager snippet with similar steps:

  • Go to the Configuration tab in the navigation bar, then Custom.
  • Add