Is Negative SEO a Fairly Easy Practice?

“The Missing Link” is a Search Engine Watch exclusive reader-driven Q&A column with veteran content publicist Eric Ward. You can ask questions about all aspects of links and link building and Eric will provide his expert answers. Submit your questions here, and you may be featured in a future installment!

I’m growing more and more concerned about negative SEO, particularly as we’re seeing so many new clients who have all been algorithmically filtered and lost traffic. These aren’t sites with despicable backlink profiles; some are predominantly good with just a couple of bad links mixed in. So if I can identify a footprint myself (even across the relatively small cross-section I see through my own clients), surely negative SEO is a fairly easy practice?

Do you have any experience / case studies / insights?

Perhaps bad links added now will add no value whereas legacy bad links are detrimental?

– Cause for Concern

I agree that spotting a bad link footprint is not as difficult as some would have you think it is, especially when the links make absolutely no sense whatsoever.

For example, an ecommerce site that sells products only in the USA has links pointing to it from Russia. And sure, we could argue niche cases where this might actually happen legitimately, but when you also notice links to that same site from China, South Africa, and Australia, multiple links from .biz and .info domains, and a few keyword-laden links on pages that all use that familiar-looking blog template theme, well, something is rotten in Denmark, and it’s more than just a .dk inbound link.
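The signals above (suspect TLDs, repeated exact-match anchor text) can be roughed out as a simple scoring heuristic. This is only an illustrative sketch; the TLD list, weights, and record format are hypothetical, not any real tool's scoring model.

```python
# Hypothetical footprint scorer: higher score = more suspicious profile.
# TLD weights and the backlink record shape are illustrative assumptions.

SUSPECT_TLDS = {".biz": 2, ".info": 2, ".ru": 3, ".cn": 3}

def footprint_score(backlinks):
    """Score a list of backlink records, e.g.
    {"domain": "example.biz", "anchor": "cheap widgets"}."""
    score = 0
    anchor_counts = {}
    for link in backlinks:
        domain = link["domain"].lower()
        # Links from low-trust TLDs add to the suspicion score.
        for tld, weight in SUSPECT_TLDS.items():
            if domain.endswith(tld):
                score += weight
        # Track exact-match anchor text across domains.
        anchor = link["anchor"]
        anchor_counts[anchor] = anchor_counts.get(anchor, 0) + 1
    # The same keyword anchor repeated across many domains is a classic footprint.
    score += sum((count - 1) * 2 for count in anchor_counts.values() if count > 1)
    return score
```

In practice you would feed this the export from a backlink index and eyeball anything above a threshold rather than trust the number blindly.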

Unfortunately there aren’t many case studies, but you can read Marcela De Vivo’s Negative SEO Case Study.

I think there aren’t more case studies because nobody would admit to successfully sabotaging someone else’s website. They would keep quiet about it and enjoy it.

As to your question/comment, “Perhaps bad links added now will add no value whereas legacy bad links are detrimental?” This is certainly possible. We must all consider that Google doesn’t just have backlink data for today. It has backlink data going as far back as it chose to save it.

Google knows what other sites were linking to my site in 1997, and it knows what sites are linking to my site today. More importantly, Google can study everything about my links that has taken place over time and detect any changes to my link seeking habits.

Think of the potential value historical link data gives Google. They know exactly when anchor text started increasing, and when it reached such a critical mass that they needed to devalue it. They know when you started putting out a press release every week. They know when you decided to offer discounts to college students (because you never had a single link from a .edu domain in five years, and then in six months you had forty).
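That “five years of nothing, then forty in six months” pattern is essentially a link-velocity spike, which is easy to express as code. The function below is a toy sketch under assumed parameters (the spike factor and minimum count are made-up thresholds, not anything Google has published).

```python
# Toy link-velocity check: flag periods where the count of new links
# dwarfs the historical baseline. `factor` and `min_count` are
# hypothetical thresholds chosen for illustration only.

def velocity_spikes(new_links_per_period, factor=10, min_count=20):
    """Return indices of periods whose new-link count exceeds `factor`
    times the average of all prior periods (and at least `min_count`)."""
    spikes = []
    for i in range(1, len(new_links_per_period)):
        baseline = sum(new_links_per_period[:i]) / i
        count = new_links_per_period[i]
        # Use max(baseline, 1) so a zero history still sets a floor.
        if count >= min_count and count > factor * max(baseline, 1):
            spikes.append(i)
    return spikes
```

For example, a site with per-half-year .edu link counts of `[0, 0, 0, 0, 1, 40]` would have its final period flagged; a search engine with full historical data can run exactly this kind of comparison at scale.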

To boil this down, I think the statement we would all agree to is this:

Any decision about the credibility of linking signals that is left up to a machine can’t be 100 percent correct 100 percent of the time.

This then means we could restate the above as follows:

There will be mistakes made by a machine when it is evaluating links.

This then means we could restate the above as follows:

Humans could create links that machines would evaluate incorrectly.

So, given all of the above is true, what we can’t know is the extent to which these incorrect link evaluations occur, and we can’t know the extent to which negative SEO works.

We can only agree that yes, it is possible. I wish I could give you more, but there it is.

