
SEO Problems That Still Plague The Industry After All These Years

Buying SEO leads and web design leads helps SEO companies boost their business and continue the fight against spammers and the black-hat online marketing that continues to plague the industry and give it a bad reputation.

As much as the SEO industry wants to grow up and shed the shady reputation it had in the past, it seems many don’t want to change and still cast a shadow over the industry. For every company doing things right, it seems there are many more who have not updated their practices or still want to take shortcuts.

I know it’s not October and not time for Halloween, but I’m already putting my costume together and was inspired to write this. All of these are true SEO horror stories I have seen over the last couple of years. There are outdated practices, mistakes and some seriously shady stuff that still goes on in our industry.

I’m okay with mistakes, but a lot of these stories involve companies just never updating their practices or intentionally doing things that give all SEOs a bad name. There are many great companies out there, but it seems like for every good one, there are still a few bad apples.

Outdated SEO practices
Believe it or not, I still see cases of keyword stuffing and even hidden text.

Stuffed titles look terrible; I’ve seen the same term repeated several times, or every city known to man, crammed into the title tag. I recently ran into a home page title that was over 800 characters long, with almost every city in the area!
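
If you want to catch these before they go live, title tags are easy to audit with a script. Here’s a minimal sketch in Python, assuming the third-party requests and beautifulsoup4 packages; the audit_title helper, the 60-character limit and the repeat threshold are illustrative assumptions, not hard rules:

    # Flag pages whose <title> is suspiciously long or repeats a term
    # more often than a human-written title normally would.
    import requests
    from bs4 import BeautifulSoup
    from collections import Counter

    def audit_title(url, max_chars=60, max_repeats=2):
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else ""
        counts = Counter(word.lower() for word in title.split())
        repeats = {w: n for w, n in counts.items() if n > max_repeats}
        if len(title) > max_chars or repeats:
            print(f"{url}: {len(title)} chars, repeated terms: {repeats or 'none'}")

    audit_title("https://example.com/")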

Hidden text is also a surprisingly common problem, where website content (text, internal links and so on) is barely readable or sometimes intentionally hidden. These aren’t sites launched years ago, either — some of them are less than a year old.
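
The crudest cases are detectable with a simple scan of inline styles. Here’s a minimal sketch, again assuming requests and beautifulsoup4; note that it only catches inline-style hiding, so anything done in external CSS or JavaScript needs a rendered-DOM audit instead:

    # Flag elements that contain text but are hidden via common
    # inline-style tricks. External CSS and JS hiding are not covered.
    import requests
    from bs4 import BeautifulSoup

    SUSPECT = ("display:none", "visibility:hidden", "font-size:0", "text-indent:-")

    def find_hidden_text(url):
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        for el in soup.find_all(style=True):
            style = el["style"].replace(" ", "").lower()
            if any(s in style for s in SUSPECT) and el.get_text(strip=True):
                print(f"Possible hidden text in <{el.name} style={el['style']!r}>")

    find_hidden_text("https://example.com/")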

I’m also seeing more websites that use the exact same page content on multiple pages with only the city name swapped. These pages have become so prevalent that I have a hard time telling clients not to do this, but of course I still recommend against it. (If they choose to reuse page content, I ask that they add something additional that’s relevant and useful.) I even see pages that use obviously spun text still ranking well.

Link spam is the worst. I’m seeing a lot of sites using press release services that go out to local news websites. I see a lot of recently added links from general directories and article websites in backlink profiles. I still see a lot of web 2.0 and video spam. Sadly, I see a lot of obvious footprints from programs like ScrapeBox, XRumer, SEnuke and GSA SER. Sometimes websites are getting away with the spam, but other times, companies have come to me with a penalty, and it’s obvious from the backlink profile what the cause is.

Local is a joke these days, too. The local listings are so full of spam and fake reviews that it’s sickening, and I’ve reached the point that I really don’t trust the reviews anymore. I see people keyword stuffing Google My Business listing names, adding in alternate business names that are keyword-rich, using a high-ranking website or an authoritative profile instead of their website, using UPS store locations, Regus offices or co-working spaces for the address, having multiple listings (or even multiple websites) for the same business, and so much more. It’s like the Wild West all over again.

For those who might have missed it, I highly recommend you check out Joy Hawkins’ “The Ultimate Guide to Fighting Spam on Google Maps,” and have fun reporting the things you’ve seen.

Mistakes
I’m seeing websites blocking crawlers a lot these days. It feels like almost every search has at least one result that says, “A description for this result is not available because of this site’s robots.txt.”

That particular message comes from robots.txt blocking, but a stray noindex tag causes the same kind of damage by keeping pages out of the index entirely. Whether people are bringing websites out of development, accidentally checking the wrong boxes, migrating websites or something else, this gets overlooked way more than it should and is a fairly common mistake.
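
Both problems are easy to test for programmatically. Here’s a minimal sketch in Python, assuming the third-party requests and beautifulsoup4 packages; check_indexability and the Googlebot user agent string are illustrative choices, not a standard API:

    # Check whether a URL is blocked by robots.txt or carries a noindex
    # directive in either an HTTP header or a meta tag.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin
    from urllib.robotparser import RobotFileParser

    def check_indexability(url, user_agent="Googlebot"):
        rp = RobotFileParser(urljoin(url, "/robots.txt"))
        rp.read()
        if not rp.can_fetch(user_agent, url):
            print(f"{url}: crawling blocked by robots.txt")
        resp = requests.get(url, timeout=10)
        if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
            print(f"{url}: noindex sent via X-Robots-Tag header")
        meta = BeautifulSoup(resp.text, "html.parser").find(
            "meta", attrs={"name": "robots"})
        if meta and "noindex" in meta.get("content", "").lower():
            print(f"{url}: noindex set in meta robots tag")

    check_indexability("https://example.com/")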

Less common, but growing in popularity, are various JavaScript frameworks like Angular and React where no content is rendered or pages indexed. For anyone whose company is starting to use these frameworks, I highly recommend reading through Adam Audette’s “We Tested How Googlebot Crawls Javascript And Here’s What We Learned” and Jody O’Donnell’s “What To Do When Google Can’t Understand Your JavaScript,” as well as Builtvisible’s AngularJS and React guides.
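
If you want a quick smoke test before bringing in heavier rendering tools, compare what a non-rendering crawler receives against content you know should be on the page. Here’s a minimal sketch, again assuming requests and beautifulsoup4; content_in_raw_html and the sample phrases are made up for illustration:

    # Check whether key content appears in the raw HTML payload (what a
    # non-rendering crawler sees) rather than only after JavaScript runs.
    import requests
    from bs4 import BeautifulSoup

    def content_in_raw_html(url, expected_phrases):
        raw = requests.get(url, timeout=10).text
        text = BeautifulSoup(raw, "html.parser").get_text(" ", strip=True)
        for phrase in expected_phrases:
            found = "present" if phrase in text else "MISSING from raw HTML"
            print(f"{phrase!r}: {found}")

    content_in_raw_html("https://example.com/", ["Our Services", "Contact Us"])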

A pet peeve of mine is when a company redesigns a website without doing redirects. I’ve seen catastrophic drops in traffic as a result. I’m giving the benefit of the doubt that this is a mistake, but it’s likely either not part of a company’s process or something they cut because it’s time-consuming and they were on a deadline. I also see a lot of redirects done incorrectly when switching from HTTP to HTTPS, including redirect chains and 302s instead of 301s.
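
Auditing redirects is another thing worth scripting into a launch checklist. Here’s a minimal sketch with the requests package; audit_redirects is a hypothetical helper and the thresholds are judgment calls:

    # Follow a URL's redirect chain, printing each hop, and flag chains
    # and 302s where a single 301 is usually what's intended.
    import requests

    def audit_redirects(url):
        resp = requests.get(url, timeout=10, allow_redirects=True)
        for hop in resp.history:
            print(f"{hop.status_code}  {hop.url}")
        print(f"{resp.status_code}  {resp.url}")
        if len(resp.history) > 1:
            print(f"Chain of {len(resp.history)} hops; collapse to one.")
        if any(hop.status_code == 302 for hop in resp.history):
            print("302 found; use 301 for permanent moves like HTTP to HTTPS.")

    audit_redirects("http://example.com/")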

Though it’s not always the fault of an SEO company, I’ve seen domain names expire, or older domains get dropped, with substantial impacts on various businesses. This isn’t all that common, luckily, but it can be painful when it happens.

To view the rest of the article, click here.
