You don’t need to search far to find bad SEO advice. In the current search landscape, businesses need to be constantly on their guard against incomplete, irrelevant and downright incorrect ranking strategies.
Knowing what not to do is just as important as having an effective positive approach for driving traffic from organic search.
In this post, we’re going to look at eleven of the most pervasive SEO myths. If you’ve been optimizing your site for any length of time, you’ve likely encountered them. And recognizing them for what they are will give you a distinct competitive advantage when driving organic traffic to your site.
How to Identify and Avoid SEO Myths
Taking a handful of straightforward steps will protect your team from falling victim to common SEO myths.
Keep the following suggestions in mind when formulating an ongoing SEO strategy:
- Pick your information sources well – Outlets like Search Engine Land and Search Engine Journal publish articles written by seasoned professionals and industry insiders. More technical blogs like SEO by the Sea are also worth keeping up to date with. And let’s not forget your good ol’ twice-monthly roundup from BrightEdge to help keep you abreast of the latest developments.
- Follow official channels – It’s a myth that search engines don’t publish information about how their algorithms work. Google regularly provides updates for web admins on Google Search Central, its search-focused podcast Off the Record, and its principal company blog, The Keyword.
- Keep testing – There is no substitute for in-house testing. Tools like Data Cube and Site Report from BrightEdge allow you to track your rankings over time and correlate changes with new optimization strategies.
Now, Onto the SEO Myths…
1. SEO Is Dead
The phrase “SEO is dead” is one of the most overused clickbait headlines in the industry. It’s a tried-and-true hook for generating interest, and it could not be further from the truth, with 74% of companies investing in SEO in 2021.
Another variation of this myth runs along the lines of “just focus on creating great content” or “SEO doesn’t matter anymore.”
It’s difficult to say exactly where this myth comes from, but it likely stems from a combination of two mistaken ideas: that search algorithms are essentially black boxes, so nothing can be known about them, and that it’s impossible to compete with high-authority sites. Both of these premises are false.
Search engines like Bing and Google provide extensive documentation for web admins. And Google Search Advocate John Mueller has publicly said that while it may be hard for small sites to compete with bigger ones, it’s by no means impossible.
2. The Google Algorithm Is a Complete Mystery
Google has been vocal about many aspects of how its algorithm works, with regularly updated documentation, YouTube videos, blog posts, tweets, and more. Google’s Search guidelines are easily accessed on the web.
In-house testing from companies like BrightEdge that have access to large sets of data and regularly publish their findings also adds further clarity to the inner workings of search engine algorithms like Google’s.
3. Short Content Doesn’t Rank
Have you ever typed a simple question into a search engine—something like “How long does it take to bake a potato?”—only to be confronted with a wall of 12,000 words of largely irrelevant information?
Sometimes, publishing long-form content is the best way of catering to searcher intent. In other cases, it’s not. Google’s emphasis on E-A-T (Expertise, Authoritativeness, and Trustworthiness) shows a clear move toward user experience and satisfaction as a primary ranking signal. The length of content should be determined by its usefulness to the reader, not an arbitrary word count.
4. Long-tail Keywords Are Phrases of More Than Three Words
Long-tail keywords are often defined as phrases of three or more words. In actual fact, length has nothing to do with what makes a keyword long-tail.
A long-tail keyword is simply a keyword that sits on the “long tail” of a graph that maps individual keyword volume against keyword variations.
It’s also something of a myth that long-tail keywords are always easy to rank for. Often, short-tail keywords provide more cost-effective ranking opportunities, depending on how competitive they are. A robust keyword strategy should incorporate both long-tail and short-tail keywords.
5. Google Hates Pop-Ups
Several years ago, Google announced that it would penalize sites for using mobile interstitials. Panic quickly ensued among web admins, with many worrying that they would have to discard a valuable lead-generation tool.
Google advises against intrusive interstitial pop-ups, especially on mobile devices. But this doesn’t mean that all pop-ups need to be avoided; they simply need to adhere to search guidelines. Google uses the term “unintrusive dialogues” to describe acceptable pop-ups.
6. Meta Tags Are Irrelevant for SEO
Look for information about how to use meta description and title tags effectively and you’ll find a wide spectrum of opinions. Some argue that meta tags are redundant while others prescribe an exact keyword density.
John Mueller has said that Google does use title tags to understand what a page is about. How much weight they carry today is unclear, but there’s no basis for calling them irrelevant. On the flip side, Google stated back in 2009 that it no longer uses the meta keywords tag as a ranking signal, although other search engines around the world may still consider it a factor.
We recommend optimizing title tags for the algorithm, but also for click-through rates. This is because title tags are one of the primary elements at your disposal for controlling how your pages appear in SERPs.
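To make this concrete, here is a hypothetical example of the two tags in a page’s head: a title written for both relevance and click-through, and a meta description that previews the page in SERPs (all wording and the page topic are invented for illustration):

```html
<head>
  <!-- Title tag: shown as the clickable headline in search results,
       so it should be both descriptive and compelling. -->
  <title>11 SEO Myths Debunked (and What to Do Instead)</title>
  <!-- Meta description: not a ranking signal, but often used as the
       SERP snippet, making it a click-through-rate lever. -->
  <meta name="description" content="We break down eleven persistent SEO myths, from 'SEO is dead' to domain age, so you can focus on tactics that actually move rankings.">
</head>
```

Note that search engines may rewrite either element if they judge it a poor match for the query, so treat these as strong suggestions rather than guarantees.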
7. Generic Top-Level Domains Improve Rankings
Have you ever been told that you need a .com, .org or .net domain to achieve top rankings in search engines?
Well, it’s been said numerous times by Google officials that the type of domain a site uses doesn’t affect its ranking potential.
There may be a slight advantage to a .com name from a usability angle, as such domains are easier for some to remember, but they don’t carry any value from an SEO perspective. However, it is worth keeping in mind that links from .gov or .edu sites are valuable because they carry a high level of authority.
8. Google Dislikes Duplicate Content
Google won’t penalize a site for duplicate content in and of itself. The search algorithm has become much better at understanding which types of websites consistently produce similar content. Canonical tags should be used wherever possible to tell search engines about redundancy and to help crawlers understand your site.
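For illustration, a canonical tag is a single line placed in the head of a duplicate or variant page, pointing search engines at the preferred version (the URL below is a hypothetical example):

```html
<!-- Placed in the <head> of each duplicate or near-duplicate page.
     It signals which URL should be treated as the authoritative
     version and receive consolidated ranking signals. -->
<link rel="canonical" href="https://www.example.com/products/blue-widget/" />
```

Search engines treat the canonical as a hint rather than a directive, but it remains the standard way to consolidate signals across redundant URLs.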
9. Domain Age Is a Ranking Factor
This myth has circulated in the SEO community for years, often propagated by link builders selling services that target specific TLDs like .gov and .edu. John Mueller has said in no uncertain terms that domain age doesn’t matter to Google. Open and shut.
10. Link Building Should Always Be Avoided
Speaking of link builders, link building has been a topic of contention for decades. Search algorithms were originally built on the premise of tracking links to determine a site’s authority. Every link was essentially a vote, with links from high-authority sites carrying greater weight. This process is still a major component of how sites and individual pages are ranked.
Through the years, people have learned ways to abuse this system. As a result, Google is constantly on the lookout for link farms, directory websites, and otherwise spammy signals that indicate questionable link-building practices.
But this doesn’t mean that all link building needs to be avoided. Links are still important. And while the majority of a site’s links should be organic, Google has also said that small numbers of guest posts and link exchanges are OK. Ultimately, it is indisputable that high-quality links positively impact rankings.
11. Google Personalizes Results Based on Extensive Search History
It’s simply not the case that Google serves highly personalized results for search queries. The debate resurfaced recently in a Twitter exchange between Google Search Liaison Danny Sullivan and UCLA Professor Ramesh Srinivasan. The short of it? Google personalization is very limited. When users do see different results for the same term, the differences usually stem from non-personalized factors such as location or language.
While many factors can influence rankings, the overarching fundamentals and principles behind SEO do not change. The number of updates from the likes of Google, Bing, Yandex, and others over the last two decades is almost uncountable, but search engines have remained committed to the task of serving the most useful, relevant web pages for search queries. As a result, ranking factors have remained fairly consistent. That said, SEO is not a set-and-forget exercise; without someone keeping a hand on the wheel, performance is bound to suffer.
Questioning third-party advice is essential when attempting to rank sites. By choosing your sources of information well, consulting documentation from search engines wherever possible, and leveraging in-house testing, you will lay the foundation for an effective long-term SEO strategy.
More Resources from BrightEdge