In search marketing there are many competing ideas about how Google ranks sites. Who do you believe? Nobody can claim to know Google's algorithm with 100% certainty. But we can be certain that some explanations of how Google ranks sites are more trustworthy than others.
What makes one idea more trustworthy than another is the evidence behind it.
However, not all evidence is trustworthy. There are two poor and untrustworthy sources of evidence: anecdotal evidence and correlation studies.
Anecdotal evidence refers to ideas that are based on the personal experience of one or more people, but without actual research and testing to confirm the idea.
The early years of SEO were dominated by hypotheses created by anecdotal evidence. One of the earliest examples is when affiliate marketers noticed that Google was consistently banning affiliate sites. That led to the hypothesis that Google hated affiliate sites and was actively going after them.
That’s an example of anecdotal evidence (a group of affiliate marketers noticed they all lost rankings) which then led to the idea that Google was “targeting” affiliate sites.
Of course, they were wrong. Not only have Googlers stated that Google does not treat affiliate sites as lower quality, but there are no research papers or patents showing that Google researched algorithms that "targeted" affiliate sites or any other specific kind of marketer besides spammers.
There was no factual evidence to support the anecdotal evidence. Factual evidence, in my opinion, is what separates a flimsy opinion from an evidence-based insight.
It doesn't matter what your favorite SEO guru tells you about Google. It doesn't matter if that guru ranks at the top of Google; that doesn't prove anything. What matters is the factual evidence supporting the idea.
The second untrustworthy source of evidence is the correlation study, which the SEO industry is thankfully being exposed to less and less. At one time, a correlation study based on millions of search results attracted lots of links and attention. But the information was bad.
Just because all the top ranked sites have active social media presence does not mean that social media presence is a ranking factor.
That kind of correlation is especially suspect when no search engine or university anywhere has published research on that supposed ranking factor.
But that's the kind of correlation nonsense the SEO industry fell for during the mid-2000s. For a time, many businesses wasted money on things like trying to attract likes to their Facebook pages.
The days when the SEO industry believed the outcomes of correlation studies were a dark period for the industry.
While there are STILL some SEOs who are publishing correlation studies, many SEOs are increasingly skeptical and ignoring them, as well they should.
In my estimation, there are three levels of SEO knowledge. At the top is canonical information, followed by citation-based knowledge and experience-based knowledge.
1. Canonical SEO Information: Confirmed by Google to be true.
2. Citation-Based Knowledge: Information that is supported by reliable evidence such as patents and research papers.
3. Experience-Based SEO: Professionals who are actively creating websites and ranking them can be considered authoritative. You can't argue with success, particularly with a person who is actually doing the work and succeeding at it.
The word canonical refers to what is official or generally accepted as true.
So when I say canonical-level knowledge, I mean information that has been officially vetted as true. The only kind of SEO knowledge that fits this definition is information that comes directly from Google.
The next level of SEO knowledge is the kind that is documented to be probable, in the form of research papers and patents. The best research and patents are the ones created by Google itself.
However, research from university computer science departments, such as those at the University of Massachusetts, Cornell, MIT, and Stanford, is useful as well. PageRank itself was developed at a university. Many search engines license technologies from universities.
So a research paper published by a university can be useful to show that something is possible.
But Google does not often confirm the use of a specific technology, and when it does, it sometimes gives it a different name than what is described in a patent or research paper. For example, there are no research papers specifically about something called RankBrain.
Something like RankBrain is likely one or more algorithms working together, algorithms that are described in research and patents but not under the name of RankBrain itself.
The point is, if during the course of your SEO research you read something like "LSI keywords are important" or that the Penguin algorithm is a "trust" algorithm, check the citations (and links). Are they basing this information on research or patents? Or are they linking to something another SEO said?
Knowledge based on experience is trustworthy, especially if there is also citation based evidence to explain the success.
But experience based SEO information can also be mistaken. For example, there are many who say that fixing technical SEO issues like Page Speed can help a site recover from a Google core algorithm update.
That experience has to be considered against the higher level evidence consisting of statements from Googlers that Page Speed is a relatively minor ranking factor and that in a core algorithm update there is nothing to fix.
That “there’s nothing to fix” statement from Google makes sense only if you consider it from the point of view that Google’s algorithm is about ranking web pages that are relevant and useful to users. So if a site lost rankings, that’s likely going to be because of a change in how Google identifies what is relevant and useful.
Indeed, almost all of the named algorithm improvements like BERT, Neural Matching and RankBrain were all about relevance and not about technical issues like page speed.
In the opinion of many, including myself, the best SEO theories are supported by statements from Google or are supported by documentation such as patents and research.
A reliable source of SEO knowledge is someone who is successfully monetizing websites themselves.
I asked Bill Slawski of GoFish Digital for his opinion:
That part about logical fallacies is very important. It’s a reference to flawed reasoning that can result in false conclusions (read more about logical fallacies here).
Bill Slawski then shared how he judges the authority of a patent:
“When I look at a patent, rather than just rely upon what the patent says, I will also look at the other patents invented by the same people, and if any of those are related to the one that I am looking at. I will look at their LinkedIn pages to see where else they have worked, and what they have worked upon. I will look at publications they may have written, and if any of those are on related topics, where they have been published, and if there are any citations for the papers that they have written or the patents. Has an author of a patent been awarded anything for work they have done, or held a position such as Jeff Dean has as the head of the Google brain team or Tristam Upstill, as the head of Google’s Core Ranking Team? Those positions tell me that they have expertise in those fields.”
I next turned to Debra Masteler of Alliance Link who is known for her link building expertise.
Debra is one of the most respected names in search marketing so I was delighted when she agreed to share her thoughts.
Debra is correct. Although my experience encompasses a wide range of search marketing, from coding to links to content, many people know me in the context of links.
My almost two decades of experience with link building have taught me that tactics that stray from marketing-based strategies tend to rely on sketchy ideas and rarely produce the kind of quality traffic that results in more sales.
Lily brings up a good point about long-term SEO success. It’s understood that certain practices may yield positive results in the short term but the long term effect can be catastrophic.
Being able to discern between good and poor advice is important. Some poor advice sounds plausible, which makes it hard to judge.
But knowing what to look for, like valid citations or logical fallacies, can help you make better online marketing decisions and avoid spending time and money on activities that may not help.