E-A-T (Expertise, Authoritativeness, and Trustworthiness) is a concept Google first published in its 2014 edition of the Search Quality Guidelines.
These guidelines are used during Google’s search quality evaluations, in which Google hires thousands of quality raters to manually review sets of webpages and submit feedback about the quality of those pages.
The raters’ feedback is then benchmarked and used by Google to improve its algorithms. E-A-T serves as the criteria these raters use to measure the extent to which a website offers expert content that can be trusted.
According to the guidelines:
“For all other pages that have a beneficial purpose, the amount of expertise, authoritativeness, and trustworthiness (E-A-T) is very important.”
Google instructs its evaluators to consider the E-A-T of:
- The main content of the webpage they are analyzing.
- The website itself.
- The creators of that website’s content.
In the current version of the Quality Guidelines, E-A-T is mentioned 137 times in 175 pages.
Within the past year, E-A-T has become a major topic of discussion within the SEO industry, particularly as it relates to organic traffic performance changes due to Google’s core algorithm updates beginning on August 1, 2018.
SEO professionals began speculating (and Google later confirmed in a Webmaster Central blog post) that E-A-T played a major role in the updates, which seemed to overwhelmingly affect YMYL (Your Money or Your Life) websites with significant E-A-T issues.
As is often the case with the exchange of ideas within the SEO community, the discussion around E-A-T quickly led to confusion, misunderstanding, and misconstruing of facts.
Many of these misconceptions stem from a disconnect between what is theory and what is currently live in Google’s algorithm.
Surfacing results with good E-A-T is a goal of Google’s, and what the algorithms are supposed to do, but E-A-T itself is not an explanation of how the algorithms currently work.
This post aims to debunk 10 myths and misconceptions surrounding the topic and clarify how E-A-T actually works and how Google is using it.
1. E-A-T Is Not an Algorithm
E-A-T is not an algorithm on its own.
According to Gary Illyes during a Q&A at Pubcon, “Google has a collection of millions of tiny algorithms that work in unison to spit out a ranking score. Many of those baby algorithms look for signals in pages or content” that can be conceptualized as E-A-T.
So while E-A-T is not a specific algorithm, Google’s algorithms look for signals both on and off-site that correlate with good or bad E-A-T, such as PageRank, “which uses links on the web to understand authoritativeness.”
2. There Is No E-A-T Score
In the same Q&A, Illyes confirmed there is “no internal E-A-T score or YMYL score.”
Not only do Google’s algorithms not assign an E-A-T score, but the quality raters who analyze E-A-T in their evaluations don’t assign one either, nor does their feedback directly affect the rankings of any individual website.
3. E-A-T Is Not a Direct Ranking Factor – Expertise, Authoritativeness & Trustworthiness Are Also Not Individual Ranking Factors
This is more a matter of semantics than a claim that E-A-T isn’t an important consideration for rankings.
Google has at least 200 ranking factors, such as page speed, HTTPS, or the use of keywords in title tags, which can directly impact the rankings of a given page.
E-A-T doesn’t work this way; its role in rankings is more indirect:
Is E-A-T a ranking factor? Not if you mean there’s some technical thing like with speed that we can measure directly.
We do use a variety of signals as a proxy to tell if content seems to match E-A-T as humans would assess it.
In that regard, yeah, it’s a ranking factor.
— Danny Sullivan (@dannysullivan) October 11, 2019
According to AJ Kohn, when asked about how E-A-T factors into the current algorithm:
“I feel too many SEOs are thinking expertise, authoritativeness, and trustworthiness are ranking factors, but they are not how the algorithm works; they just approximate what it should do. A far better conversation would be around, for example, what would Google do algorithmically to impact those things? When it comes to, say, health – would Google employ BioSentVec embeddings to determine which sites are more relevant to highly valuable medical texts? I’m not sure they are (I tend to think they’re experimenting here) but either way, this is a far better conversation than say, should I change my byline to include ‘Dr.’ in hopes that it conveys more expertise?”
4. E-A-T Is Not Something That Every Site Owner Needs to Heavily Focus On
Google is explicit in its Quality Guidelines that the level of E-A-T expected of a given website depends on the topics presented on that website, and the extent to which its content is YMYL in nature.
For example, “high E-A-T medical advice should be written or produced by people or organizations with appropriate medical expertise or accreditation.”
However, a website about a hobby, such as photography or learning to play guitar, requires less formal expertise and will be held to a lower standard in terms of E-A-T analysis.
For companies that discuss YMYL topics – which can have a direct impact on readers’ happiness, health, financial success, or wellbeing – E-A-T is of the utmost importance.
It is also important to note that ecommerce sites are considered YMYL by definition because they accept credit card information.
This E-A-T meter helps illustrate the extent to which E-A-T matters for websites in different categories.
5. Focusing on E-A-T Is Not a Replacement for Technical SEO Auditing or Any Other SEO Objective
Addressing E-A-T does not improve SEO performance in a vacuum.
All the traditional initiatives that go into a successful SEO strategy, such as on-page optimization, earning high-quality links, and technical SEO, must also be executed for E-A-T efforts to be successful.
For sites that have been negatively impacted by algorithm updates, E-A-T is just one area to consider.
Recovering from core updates requires improvements across many different areas of the site, such as improving overall site quality, addressing user experience issues, reducing technical SEO problems, and improving website architecture.
Furthermore, if a site contains severe technical issues such as poor page load times, or issues with crawling or rendering content, Google may not even be able to properly index the site.
Prioritize E-A-T alongside your other SEO efforts according to the severity of the other issues that may be affecting your website’s performance.
6. E-A-T Is Not New – Neither Is Google’s Fight Against Misinformation
With all the new content about E-A-T, some SEO professionals have claimed that E-A-T is a recent initiative by Google that started around the time of the August 1, 2018 core algorithm update.
However, E-A-T was first introduced in the 2014 version of the Google Quality Guidelines.
In addition, I conducted research focused on E-A-T and discovered that 51% of analyzed websites that saw performance declines during the 2018-2019 core updates were also negatively affected by the “Fred” update in March 2017.
Google’s efforts to reduce misinformation and surface high-quality, trustworthy content predate the August 1 update.
Google has also engaged in and invested in several initiatives aimed at improving the trustworthiness and transparency of its search results and reducing fake news.
7. The August 1, 2018 Update Was Not Officially Named ‘Medic’ or ‘the E-A-T Update’
Although the August 1 update was informally dubbed the “Medic” update by Barry Schwartz, Google generally no longer gives core algorithm updates official names.
Some digital marketers refer to the August 1 update as “The E-A-T Update,” which is not only incorrect but also misleading, given that E-A-T was not the only issue causing performance declines during that update.
8. Adding Author Biographies Is Not in & of Itself a Ranking Factor (Google Is Not Able to Recognize or Retrieve Information About Every Author)
One of the most common recommendations to improve E-A-T is to ensure all content contains a byline for the author who wrote it, and ideally, each author has a biography or a dedicated page explaining who they are and why they can be trusted to provide high-quality content.
In the Quality Guidelines, Google repeatedly recommends that quality raters should look at individual author biographies as a way to determine the extent to which the authors are experts on the topics they write about.
However, in a Webmaster Hangout, John Mueller suggested that author biographies are not a technical requirement, nor do they require a specific type of Schema markup to be effective. He did, however, recommend the following:
“With regards to author pages and expertise, authority and trustworthiness, that’s something where I’d recommend checking that out with your users and doing maybe a short user study, specifically for your set up, for the different setups that you have, trying to figure out how you can best show that the people who are creating content for your website, they’re really great people, they’re people who know what they’re talking about, they have credentials or whatever is relevant within your field.”
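For context, when SEOs talk about marking up author information, they usually mean structured data along these lines. The snippet below is an illustrative sketch only – the headline, names, and URLs are placeholder values – and, per Mueller’s comments above, no specific markup is required for an author biography to be effective:

```html
<!-- Illustrative schema.org Article markup with author details. All values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://www.example.com/authors/jane-doe",
    "jobTitle": "Registered Dietitian",
    "sameAs": [
      "https://twitter.com/janedoe",
      "https://www.linkedin.com/in/janedoe"
    ]
  }
}
</script>
```

Even if you add markup like this, the biography page it points to still matters more: it should demonstrate the author’s credentials in a way users can actually verify.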
In his Pubcon Q&A, Illyes also stated that:
“In web search, we have entities for very popular authors, like if you were an executive for the Washington Post, then you probably have an entity. It’s not about the author, it’s about the entity.”
So while Google is able to recognize established authors in its Knowledge Graph, it may not have the same capabilities for recognizing all authors.
However, Google has run a variety of initiatives related to authorship in the past several years, so it may be working on this capability.
9. YMYL Sites Are Not the Only Websites Being Affected by Core Algorithm Updates & E-A-T Is Not the Only Issue Causing Performance Declines After Algorithm Updates
While recent core updates have overwhelmingly affected YMYL sites – particularly sites in the health or medical space – there are other categories that have felt the impact.
For example, recipe sites have seen enormous fluctuations with each core update since August 1, 2018. However, most of these sites have similar levels of E-A-T: they are usually run by cooking enthusiasts who are all equally qualified to post recipes online.
Four competing recipe sites saw major performance impacts during recent core algorithm updates, despite having similar levels of E-A-T.
However, many recipe sites face a unique set of SEO challenges that extend beyond E-A-T, such as site architecture issues, overwhelming ads, and poor page load times.
These other issues can certainly be responsible for performance declines during algorithm updates.
10. E-A-T Is Not Something You Can ‘Plaster on Your Site’ & Expect Immediate Results – Addressing E-A-T Takes Time
With certain SEO tactics, such as optimizing metadata or fixing technical issues, it’s possible to see immediate performance increases once Google re-crawls and indexes the updated content.
E-A-T doesn’t exactly work this way, given that it is not a direct ranking factor.
Improving the perceived trustworthiness of your site is a resource-intensive task that requires a significant investment of time and effort.
It takes time to build trust with your users, and it can take even longer for search engines to process those changes. This is especially true for sites that have been hit by algorithm updates due to E-A-T issues.
Google often doesn’t do major reassessments of the overall site quality until the next core update rolls out – so any work that was done to improve E-A-T might take at least several months to be reassessed.
However, the benefits of improving E-A-T extend beyond just SEO: E-A-T updates can enhance user experience as users feel more confident that they can trust your website, your authors, and your brand.
Image Credits
Featured image: Paulo Bobita
All screenshots taken by author