Give someone a fish and they’ll EAT for one day. Teach someone to fish and they’ll EAT for a lifetime. Yes, that’s an SEO pun. It’s also the goal of this article.

If you pop into either of the fantastic SEO communities on Twitter or LinkedIn, you’ll inevitably encounter some common SEO myths:

  • “Longer dwell time means a good user experience, so it must be a ranking factor”
  • “A high bounce rate indicates a bad user experience, so it must be bad for SEO”

Social media posts like these get tons of engagement. As a result, repetition, false evidence, and faulty logic end up amplifying the very myths we're trying to squash. The problem isn’t limited to social media, either. There are plenty of high-profile websites that package hypotheses as facts because readers eat them up.

These myths are a huge problem because they’re red herrings. They cause marketers to prioritize projects that won’t improve the content, user experience, or Google search performance.

So how can the SEO community rally around the truth? We can start by doing two things:

  1. SEOs must admit our personalities and professions hardwire us to believe myths. We have a deep desire for answers, control, and predictability, as well as a fierce distrust of Google.
  2. We need to recognize the psychological and environmental factors that influence our ability to sort fact from fiction.

So rather than busting individual myths, let’s ask ourselves “why?” instead. In other words, let’s learn to fish.

Internal reasons we believe SEO myths

Let’s dig into some internal factors, such as our thoughts and feelings, that influence our beliefs.

1. SEOs need structure and control

SEO is a fascinating branch of marketing because our performance is driven by a constantly evolving algorithm that we don’t control. In fact, there were more than 5,000 Google algorithm updates in 2021 alone.

In other words, SEOs live in a world of crippling dependency. Even the top ranking signals that we know about can fluctuate based on the industry, query, or available content within Google’s index. For example, if you manage websites in the finance or health space, E-A-T is critical. If you publish news content, then recency is very important.

To gain a sense of structure and control, we look for more ways to influence outcomes. But there are two problems with that approach:

  • We overestimate the impact of individual ranking factors
  • We falsely believe something is a Google ranking factor that is not

Our need to amplify our own sense of control is backed by psychology. A 2016 study found that an individual’s need for structure made them more likely to believe in conspiracy theories.

“The human tendency to recognize patterns even when none exist is shown to have applications in consumer behavior. The current research demonstrates that as one’s personal need for structure (PNS) increases (that is, requiring predictability and disfavoring uncertainty), false consumer pattern perceptions emerge.”

If you find yourself waffling between fact and fiction, don’t let your desire for control dictate your final decision.

2. The primal need to recognize patterns

The human brain is excellent at recognizing patterns. Throughout history, we’ve relied on that ability to make better decisions and ensure the survival of our species. Unfortunately, we’re so good at spotting patterns that we also fabricate them.

False pattern recognition has several drawbacks:

  • It might influence SEO decisions that could have a sitewide impact
  • If you overstate the connection publicly, others might misinterpret it as fact

An excellent example surfaced on Twitter recently. Google’s John Mueller was asked if adding too many links to your site’s main navigation could impact Google Discover traffic. The individual who asked the question ran several tests and saw positive results, but Mueller said it was merely an interesting correlation.

“I’d still go with ‘unrelated’. As mentioned in our docs: Given the serendipitous nature of Discover, traffic from Discover is less predictable or dependable when compared to Search, and is considered supplemental to your Search traffic.”

Fortunately, this individual went straight to the source for an answer instead of publishing a case study that could have had serious implications for website navigation decisions.

3. Confirmation bias

It’s well-documented that people accept information that supports their beliefs and reject information that doesn’t. It’s a primordial trait that evolved when we began to form social groups. Early humans surrounded themselves with others who thought and acted the same way to ensure their survival.

One of the most famous confirmation bias studies comes from Stanford. For the study, researchers segmented students into two opposing groups based on their beliefs about capital punishment.

One group supported capital punishment and believed it reduced crime. The other opposed it and believed it had no impact on crime.

Each group was asked to react to two studies, one which supported their views, and one which contradicted them. Both groups found the study that aligned with their beliefs much more credible, and each became more entrenched in their original beliefs.

SEO practitioners are particularly prone to confirmation bias because we’re terrified of being wrong. We hypothesize, test, build, optimize, and iterate. If we’re wrong too often, we’ll waste time and money, and we could risk our reputation and our jobs.

We need to be right so badly that we may accept myths that confirm our beliefs rather than admit failure.

4. Lack of trust in Google

It’s safe to say most SEOs don’t trust Google. That has led to some of the longest-running SEO myths I could find. For example, even after seven years of repeated rejections from Google, many SEO experts still believe engagement is a ranking signal.

Here’s John Mueller shooting down the engagement myth in 2015:

“I don’t think we even see what people are doing on your website. If they are filling out forms or not, if they are converting and actually buying something… So if we can’t see that, then that is something we cannot take into account. So from my point of view, that is not something I’d really treat as a ranking factor.”

Nearly seven years later, in March 2022, John was asked the same question again, and his response was pretty much the same:

“So I don’t think we would use engagement as a factor.”

And yet, the SEOs piled on in the comments. I encourage you to read them if you want a sense of the intense level of mistrust. Essentially, SEOs overanalyzed Mueller’s words, questioned his honesty, and claimed he was misinformed because they had contradictory insider information.

5. Impostor syndrome

Even the most seasoned SEO professionals admit they’ve felt the pain of impostor syndrome. You can easily find discussions on Reddit, Twitter, and LinkedIn about how we question our own level of knowledge. That’s especially true in public settings when we’re surrounded by our peers.

Not long ago, Azeem Ahmad and Izzie Smith chatted about impostor syndrome. Here’s what Izzie said:

“It’s really hard to put yourself out there and share your learnings. We’re all really afraid. I think most of us have this impostor syndrome that’s telling us we’re not good enough.”

This contributes to SEO myths in several ways. First, it erodes self-confidence, which makes individuals more prone to believe myths. Second, it prevents folks who might want to challenge inaccurate information from speaking out publicly because they’re afraid they’ll be attacked.

Needless to say, that enables myths to spread throughout the broader community.

The best way to combat impostor syndrome is to ensure SEO communities are safe and supportive of new members and new ideas. Be respectful, open-minded, and accepting. If more folks speak out when something doesn’t feel accurate, then we can keep some troublesome myths in check.

External reasons we believe SEO myths

Now let’s explore the external forces, like peers and publishers, that cause us to believe SEO myths.

1. Peer pressure

Peer pressure is closely related to impostor syndrome, except it comes from the outside. It’s a feeling of coercion from peers, whether from a large group of SEOs, a widely known expert, or a close mentor or colleague.

Because humans are social creatures, our urge to fit in often overpowers our desire to be right. When something doesn’t feel right, we go with the flow anyway for fear of being ostracized. In fact, social proof can be more persuasive than purely evidence-based proof.

I asked the Twitter SEO community if anyone ever felt compelled to accept an SEO ranking factor as fact based on popular opinion. Several folks replied, and there was an interesting theme around website code.

“Back in 2014, a web developer told me he truly believed text-to-code ratio was a ranking factor. For a while, I believed him because he made convincing arguments and he was the first developer I met who had an opinion about SEO.”

—  Alice Roussel

“Years and years ago I wanted code quality to be a ranking factor. Many thought it was because it made sense to reward well-written code. But it never was. Browsers had to be very forgiving because most sites were so badly built.”

—  Simon Cox

As with impostor syndrome, if we build a more tolerant SEO community that’s willing to debate issues respectfully, we’ll all benefit from more reliable information.

2. Outdated information

If you publish content about SEO, then you’ll be guilty of spreading SEO myths at some point. Google updates its algorithms thousands of times each year, which means assumptions are disproven and once-good advice becomes outdated.

Trusted publishers have a duty to refresh or remove inaccurate content to prevent SEO misconceptions from spreading.

For example, in 2019 Google changed how it handles outbound links. It introduced two new link attributes alongside nofollow, rel="ugc" and rel="sponsored", and began treating all three as hints instead of ignoring nofollowed links outright.

So if you wrote about link attributes prior to September 2019, your advice is probably out of date.
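If you want to check your own archive for outdated link-attribute advice, a quick audit of how a page’s outbound links are actually marked up is a good place to start. Here’s a minimal sketch using only Python’s standard library; the URL is a hypothetical placeholder and the script is my own illustration, not anything from Google’s documentation.

```python
from collections import Counter
from html.parser import HTMLParser
from urllib.request import urlopen

class RelAuditParser(HTMLParser):
    """Counts the rel values on every <a> tag found on a page."""
    def __init__(self):
        super().__init__()
        self.rel_counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        rel = dict(attrs).get("rel")
        if rel:
            # rel can hold multiple space-separated values, e.g. "nofollow ugc"
            self.rel_counts.update(rel.lower().split())
        else:
            self.rel_counts["(no rel)"] += 1

# Hypothetical page to audit
url = "https://example.com/blog/old-post"
html = urlopen(url).read().decode("utf-8", errors="ignore")

parser = RelAuditParser()
parser.feed(html)

# nofollow, ugc, and sponsored are all treated by Google as hints
for rel, count in parser.rel_counts.most_common():
    print(f"{rel}: {count} links")
```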

Unfortunately, most SEOs update content because it’s underperforming, not because it’s wrong. So perhaps publishers should put integrity above performance to strengthen our community.

3. Jumping on trends

Sometimes SEO myths explode because the facts can’t keep up with the virality of the myth. One of my favorite examples is the LSI keyword trend. This one pops up on Twitter from time to time, and thankfully Bill Slawski is quick to quash it.

Trend-based myths go viral because they tap into the fear of missing out (FOMO), and SEOs hate to miss out on the opportunity to gain a competitive advantage. They also resonate with SEOs because they appear to offer a secret glimpse into Google’s black box.

Although trends eventually fade, they will remain a thorn in our side as long as the original sources remain unchanged.

4. Correlation vs causation

The most difficult myths to bust are those backed by data. No matter how many times Google debunks them, they won’t die if folks come armed with case studies.

Take exact match domains (EMD) for example. This article lists several reasons why EMDs are good for SEO, using Hotels.com as a case study. But it’s a classic chicken and egg argument. Does the site rank number one for “hotels” because it’s an EMD? Or is it because the owner clearly understood SEO strategy and prioritized keyword research, link building, internal links, page speed, and high-quality content marketing for the last 27 years?

We also can’t discount the fact that the domain has 42 million backlinks.

But if you want to hear it directly from the horse’s mouth, Google’s John Mueller says EMDs provide no SEO bonus. Here’s what he said on Reddit:

“There’s no secret SEO bonus for having your keywords in the domain name. And for those coming with “but there are keyword domains ranking well” — of course, you can also rank well with a domain that has keywords in it. But you can rank well with other domain names too, and a domain won’t rank well just because it has keywords in it.”

This is obviously correlation, not causation.
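Confounded data like this is easy to produce, even in a toy model. The sketch below is purely hypothetical, my own illustration rather than real ranking data: an unobserved “SEO savvy” factor drives both how many backlinks a site earns and how likely its owner was to buy a keyword domain, while the simulated ranking score depends only on links and content.

```python
import random

random.seed(42)

def pearson(xs, ys):
    """Plain Pearson correlation, no third-party libraries needed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

emd_flags, scores = [], []
for _ in range(10_000):
    # Hidden confounder: how seriously the owner takes SEO overall.
    seo_savvy = random.random()

    # Savvy owners earn more links and are also more likely to have
    # picked up an exact match domain along the way.
    backlinks = seo_savvy * 100 + random.gauss(0, 10)
    has_emd = random.random() < seo_savvy

    # The ranking score depends only on links and content, never on the domain.
    content_quality = random.random()
    ranking_score = 0.7 * backlinks + 30 * content_quality

    emd_flags.append(1.0 if has_emd else 0.0)
    scores.append(ranking_score)

# The EMD flag correlates with rankings despite having zero causal weight.
print(f"correlation(EMD, ranking score) = {pearson(emd_flags, scores):.2f}")
```

Run it and you’ll see a clearly positive correlation, which is exactly the trap: a case study built on numbers like these would “prove” a ranking factor that the model, by construction, doesn’t contain.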

To be clear, I fully support running SEO tests to learn more about Google’s algorithm. But it’s incredibly difficult to create a signal vacuum that prevents outside influences from skewing your results. And even if you manage to isolate one ranking factor, you have no way of knowing how strong the signal is in relation to other signals. In a total vacuum, one signal may win. But in the wilderness of Google, it may be so weak that it’s virtually nonexistent.

Furthermore, the signal may only apply to certain types of content. We’ve seen signal fluctuations before regarding product reviews and E-A-T in YMYL spaces. So even if data suggests something might improve organic rankings, how reliable is the information, and how important is the signal?

All this is to say that we should be very careful when proclaiming new ranking factors, especially if they contradict Google’s statements or stray too far from universally measuring user experience.

5. It’s plausible, but not measurable

This group of myths is rooted in logic, which makes them particularly dangerous and sticky. Usually, they follow a simple formula: if A = B, and B = C, then A = C.

Here’s an example:

  • Google wants to rank content that provides a good user experience
  • If a webpage has a high bounce rate, it must provide a bad user experience
  • Therefore, a high bounce rate is bad for SEO

This seems to make sense, right? Yet, Google has said many times they can’t see what users do on your website, and they don’t look at bounce rate.

I’ve seen the same argument applied to dwell time, time on page, SERP click-through rates (CTR), and so on. To be clear, Google says CTR does not drive organic search engine rankings because that would cause results to be overrun with spammy, low-quality content.

Most often these myths stem from competing views about what a good user experience looks like and how to measure it. What constitutes a good experience for one type of search query might be a terrible experience for another. This lack of consistency makes it virtually impossible to identify metrics that can be deployed universally across all websites.

In other words, if potential user experience signals depend on too many factors, Google can’t use them. That’s why they launched the page experience update in 2021, which quantifies user experience with specific, universal metrics.

Here’s your fishing pole

In many cases, SEO myths fall into more than one of the above categories, which makes them even more difficult to dispel. That’s why we keep seeing social media posts falsely identifying ranking factors like keyword density, domain authority, conversions, and meta keywords.

If you understand a few basic concepts about ranking factors, you’ll be better equipped to sort fact from fiction and prioritize SEO initiatives that drive more organic traffic.

Ask yourself these five questions when you smell the stench of a myth:

  • Is it quantifiable and measurable?
  • Is it scalable?
  • Is it broadly or universally true, or does it depend on the user?
  • Does it support Google’s goals of delivering a better user experience?
  • Has Google confirmed or denied it publicly?

If you can check each of those boxes, then you may have a valid ranking factor on your hands. But don’t take my word for it. Run some tests, ask some friends, use logic, and confirm your theory. And if all else fails, just ask John Mueller.


Jonas Sickler is a published author and digital marketer. He writes about SEO, brand reputation, customer attention, and marketing. His advice has appeared in hundreds of publications, including Forbes, CNBC, CMI, and Search Engine Watch. He can be found on Twitter @JonasSickler.



