The Biggest SEO Mistakes: 6 Search Engine Disasters To Avoid (and Tips for Fixing Them Fast)

Andy Crestodina

Little things make a big difference. In digital marketing, this is especially true. Digital is about doing a hundred little things right.

Some of those things relate to search engine optimization and rankings. And some mistakes in this area cause huge, deal-breaking, rank-destroying disasters.

This post is a quick roundup of some of the worst SEO mistakes we’ve seen. These are the little things that can sink the ship, pulling you down in search engines, down to the abyss of SEO irrelevance.

Mistake #1: Removing Your Own Site From Google

What’s the opposite of search friendly? Search rude. Here’s how to be rude to Google and tell them to ignore your website.

This is a surefire way to sink your own site.

There’s a venue here in Chicago that books all kinds of shows. People love it. But ask Google for showtimes and you get this…

[Screenshot: a Google search result with no description, blocked by robots.txt]

That’s right. This is what that little message says…

A description for this result is not available because of this site’s robots.txt

The robots.txt file is just a place to talk to search engines. Every site has one (or should). To see yours, just go to www.YourWebsite.com/robots.txt. While you’re there, make sure it doesn’t look like this.

[Screenshot: a robots.txt file blocking all crawlers]

This file is telling every search engine (User-agent: *) to ignore everything (Disallow: /).
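Spelled out, that worst-case robots.txt is just two lines:

```
User-agent: *
Disallow: /
```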

That’s bad.

This is an especially big SEO mistake because Google is now so good at helping people find showtimes. This is what it could look like.

[Screenshot: showtimes displayed directly in Google search results]

Nice, right? A beautiful, simple way to help your audience find out what’s showing.

Actually, the most destructive directive of all is “noindex.” This will completely remove a page from Google: it won’t show up at all. Note that noindex belongs in a page’s meta robots tag or an X-Robots-Tag header; Google no longer honors it inside robots.txt.

This can be useful for an extranet or a login area. But generally, you don’t want to torpedo a marketing website.
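If there is a page you genuinely want out of Google (a login screen, say), the page-level way to do it is a meta robots tag in that page's head section. This is a generic example, not markup from any site mentioned here:

```html
<meta name="robots" content="noindex">
```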

The Fix:

Make sure your robots.txt file allows Google to properly crawl and index your website. Here’s a simple guide to robots.txt best practices that might help.
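For reference, a healthy baseline looks like this, assuming you want everything crawled. An empty Disallow means “block nothing,” which is the search-friendly default; the Sitemap line is optional, and the URL here is a placeholder:

```
User-agent: *
Disallow:

Sitemap: https://www.yourwebsite.com/sitemap.xml
```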

Let’s move on to the next potential disaster.
Mistake #2: Targeting Phrases No One Is Searching For

Ranking for a phrase that doesn’t bring in traffic might feel good, but really it’s just vanity.

Enter the possible keyphrase into the Google Keyword Planner. If you see a dashed line, rather than a number, then the phrase was searched for fewer than 10 times per month on average over the last 12 months.

[Screenshot: Keyword Planner showing a dash instead of a search volume number]

This doesn’t actually mean zero; it just means very low. If there’s other evidence of demand for the topic (for example, Google suggests the phrase when you type related phrases into the search box), it still might be worth targeting.

The Fix:

Target phrases only when you have some indication that people are searching for them: either the Keyword Planner shows a number rather than a dash, or the phrase appears as a suggested search in Google.

Ready to dig into keyword research? Here’s our 10-step guide to increasing targeted traffic using keywords. It explains everything.

Mistake #3: Targeting Phrases That Are Too Competitive

This is a more common SEO mistake: targeting a super-popular phrase that you don’t have a chance of ranking for. When does it make sense to target a low-volume phrase? When the more popular phrases are too competitive.

Here’s what most people don’t understand about search engine optimization: If the other high-ranking web pages for the target keyphrase are much more authoritative than yours, you don’t have a chance of ranking.

Target a keyphrase only if your authority is in the same range as the authority of the high-ranking websites.

  • How can you check your own authority? Use Link Explorer.
  • How can you check the authority of the high-ranking sites for the phrase? Install MozBar (a Chrome extension). Turn it on and search for the phrase.

[Screenshot: MozBar showing the authority of the pages ranking for a phrase]

The Fix:

Pick your battles. Every phrase is a different competitive set. Some topics are extremely competitive (and worth millions of dollars) and others are relatively simple to rank for, taking little effort.

If you’re not in their league, target a different phrase!

Want to learn more about how domain authority works? We made a video for you here.

Once you’ve picked your phrases, make sure to avoid this next SEO mistake.

Mistake #4: Keyword Stuffing

The problem with keyword stuffing is that keyword stuffing is very obvious to search engines, which notice keyword stuffing very easily and can penalize web pages stuffed with keywords.

Keyword stuffing and keyphrase stuffing is also bad for visitors, who may read a page stuffed with keywords and think it’s strange that the webpage has so many of the same keywords.

So stuffing and filling pages with keywords and keyphrases is bad for search engine optimization (SEO) and for visitors.

Mistake #5: Generic Home Page Title Tag

The title tag of your homepage is the single most important piece of SEO real estate on your website. If your website were a book, this would be the text on the cover.

It’s important not just for ranking high, but for getting clicks if you do rank. It affects click-through rates because it often appears as the link in Google search results.

[Screenshot: a home page title tag appearing as the link in a search result]

The all time worst home page title tag? You guessed it… “home”

Why? It says nothing about what you do. It doesn’t help you rank or communicate with potential visitors.

The Fix:

Write a title that tells Google and people what you do. Here are some quick guidelines.

  • Include a keyphrase for the main category of your type of business
  • Include the company name at the end, after the keyphrase
  • Keep it to 55 characters or fewer (longer titles get truncated)
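As a rough sanity check of those guidelines, here's a small script. The title and company name are made-up examples, and the 55-character cutoff is approximate (Google actually truncates by pixel width, so treat this as a guideline, not a guarantee):

```python
def title_warnings(title: str, company: str) -> list[str]:
    """Return guideline violations for a proposed home page title tag."""
    warnings = []
    if len(title) > 55:
        warnings.append("over 55 characters; may be truncated in results")
    if not title.endswith(company):
        warnings.append("company name should come last, after the keyphrase")
    if title.strip().lower() == "home":
        warnings.append("says nothing about what you do")
    return warnings

# Hypothetical example: keyphrase first, company name last.
print(title_warnings("Custom Cabinetry Chicago | Oak & Iron Co.", "Oak & Iron Co."))
# -> []
```

An empty list means the title passes all three checks; the all-time worst title, “Home,” fails two of them.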

Mistake #6: Not Excluding Search Robots from Your Analytics

Some of your traffic is actually search engine robots, not human visitors. For most sites, I don’t believe the (out-of-date) research that suggests that 61% of traffic is robots, but it is possible that your Analytics are a bit skewed.

If you have a very low-traffic site, it’s possible that your Analytics are completely overrun. Look at the Analytics for this site:

[Screenshot: Analytics channel breakdown showing 60% referral traffic]

60% of the visitors are referrals. They’re coming from other websites. Now let’s look at which sites they’re coming from:

[Screenshot: list of referral sources, dominated by spammy domains]

Looks fishy. Are those real visitors coming from 100dollars-seo.com and best-seo-offer.com? Doubtful. These are robots. So this is a case where 60% of the traffic really is from robots!
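If you export the referral report, a quick script can tally how many sessions come from known spam referrers. The two domains here are the ones named above, and the row format is a made-up example; extend the set with whatever shows up in your own reports:

```python
# Tally sessions attributed to known referrer-spam domains
# in exported Analytics data (row format is hypothetical).
SPAM_DOMAINS = {"100dollars-seo.com", "best-seo-offer.com"}

def spam_sessions(rows):
    """Return the number of sessions that came from known spam referrers."""
    return sum(int(row["sessions"]) for row in rows if row["source"] in SPAM_DOMAINS)

report = [
    {"source": "100dollars-seo.com", "sessions": "312"},
    {"source": "google", "sessions": "1204"},
    {"source": "best-seo-offer.com", "sessions": "187"},
]
print(spam_sessions(report))  # -> 499
```

Comparing that number to your total sessions tells you how badly your reports are skewed.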

The Fix:

Getting the bots out of your Analytics is easy. Just go to the View Settings in the Admin section and check the box under “Bot Filtering.”

[Screenshot: the Bot Filtering checkbox in Google Analytics View Settings]

This might not remove “ghost bots” and “zombie bots” (scary, right? more on that here) but it will remove all the robots known to the IAB Spiders & Robots Policy Board (yes, that’s a thing).

Sink or Swim?

Search rankings are like everything else in digital marketing: a hundred little things make the difference. Hopefully, checking these six on your site will make the difference between floating up and sinking down.

How many of these had you already checked? Anything we missed? Add your insights below and your fellow readers might thank you…


What are your thoughts?


Comments (25)
  • These are really great and informative tips, I really care about these now. Thank You for these tips Andy.

    • Thanks for the comment, Muhammad. I’m glad this was useful!

  • Thanks Orbit Media Studios – appreciate the SEO commentary!

    • Thanks, Mike! We thought about making this more of an Op-Ed piece with some feedback on a lot of the myths, but we dropped those items in favor of more practical tips. But we might publish the commentary pieces later…

  • Andy, this is great stuff — some good reminders and new tips. Thanks.

    • Thank you, Carol. I hope there were more new tips than old reminders. Anything that you didn’t know about before that I should expand on? Just let me know!

  • This is one of my favorite posts of yours, Mr. C. Such little mistakes can have massive consequences and they’re so easy to avoid. Great info!

    But really I liked that you managed to include Nebraska in your post and that we may start ranking for “keyword stuffing” 😉

    • The Nebraska reference was specifically for you. Did you see the comment from Kyle Olson on Twitter?

      “Looks like my barnacle scraping business idea in Nebraska is officially dead.”

      🙂

      • You nailed it Andy! #8 really compromises data quality for uninformed B2B sites. I find using the GA bot filtering only removes ~15% of those referrals, thus requiring blocking known ghosts in .htaccess plus use of filters.

        I’m surprised how often I see two GA codes on a site, which can also overinflate sessions and artificially lower bounce rate too.

        And, I see a lot of people confused by the move to https:// where they have a mix of http:// http://www and https:// used in their site maps and navigation creating a big mess for their analytics and SEO efforts.

        Thanks for the helpful article!

  • Wow! Great insights, great stuff. I never noticed any of these mistakes until I read your post. Thank you so much! You’re a real lifesaver.

  • Fantastic post, Andy. A few technical items – especially the site links – that we will integrate into our approach. Thanks!

  • Does demoting a URL mean it will no longer appear in search, or just no longer appear as a site link?

    • It just removes it from the Sitelinks. The only way to remove a page from search is a “noindex” tag on the page itself. See mistake #1!

  • thank you

  • Hi Andy! These are really helpful, thanks for the tips. My feedback is on Mistake #8; we have found that simply clicking this checkbox does not completely exclude all known bots and spiders. We were having a lot of issues with skewed data for months and months, so we created specific filters for the most pesky bots (you know, the ones that keep changing their subdomains to look like new ones), which has proven to be more accurate.

    • That’s great that you’ve found a better fix for the bots problem. Have you heard of “ghost bots?” This is a huge problem and even Google is struggling with it! Not sure how they’ll solve that problem…

  • Hi, thanks for sharing this great post. I have been studying SEO for five months, and this article gives me a lot of inspiration, especially the “Sitelinks” part.

  • That’s strange. I search for Nebraska barnacle scraping all the time!

  • Great article! Was putting dmoz on the back burner… not anymore! Thanks for sharing these insights

  • This is such a fantastic post, and was thinking much the same myself.
    Another excellent update.

  • Hi Andy

    Indeed a great post about SEO mistakes.

    In my view, search engine optimization is something that can make our future if we do it the right way.

    You have listed the most common SEO mistakes, which almost every blogger is making.

    In order to rank instantly in SERPs, people try unethical SEO strategies like keyword stuffing and building low-quality or spam links.

    Keyword stuffing is the most common and worst SEO mistake I have ever seen. People think that filling an article with keywords will help them rank fast.

    But it’s wrong. Google is now very strict with such webmasters and rolls out regular updates to undo their work.

    Copying articles from other sites is also a common SEO mistake. I made it too when I started my blogging career, but I have completely stopped.

    I am completely against keyword stuffing and copying content.

    Thanks for listing all the mistakes here so that people can learn about these mistakes and can try to avoid them. 😀

  • Thanks for the great tips! I’m new to online marketing, and this is really helpful! Since getting started, I’ve been bombarded by “spin writers” and such to create a TON of content quickly, but you seem to say that these search engines have become sophisticated enough to determine when your content is crap. Am I understanding that right?

  • Andy, excellent article! I totally agree that the robots.txt file is just a place to talk to search engines. Not everyone knows main syntax, so I did a comprehensive explanation there – sitechecker.pro/robots-tester/ with the examples. Hope you’ll find it useful!

 
