Bruce Clay on Penguin: When to expect it, who will be affected, and how to prevent penalties


Kristi Kellogg interviews Bruce Clay on the much-anticipated Penguin 2.0 update: Bruce Clay on Expecting and Escaping Google Penguin's Wrath.

Among his predictions, Bruce expects the update to roll out before SMX Advanced (i.e., early June) as that is Google's m.o. with timing their updates, and is quoted as saying "...only the top ten sites of ten million will not feel the wrath of Google."

Clay also says "I would bet money that Google rolls out multiple releases together intentionally" so as to confound spammers.

He says a number of interesting things...good interview!

Comments

Personally, I am looking forward to Penguin 2.0

The original Penguin marked my biggest intake of new clients since starting in the SEO biz about 14 years ago. Shortly before Penguin, potential clients were emailing me questions like "what is your cheapest linkbuilding package?" and my personal favourite, "Can you linkbuild at less than 5c a link?". Imagine the potential clientele after 2.0 — I might have to double my rates again!

As for Bruce, quite frankly I think he is full of it. My favourite quote is:

Yes. The percentage of tolerable bad links will continue to lower. I think over the next year it will fall to as low as ten percent. Ten percent of bad links seems like a reasonable threshold for sites. Ultimately, I think they could drop it to zero percent.

Does he seriously believe those numbers? Does anyone else? So what he is telling me is that if my competition is beating me in the SERPs, I should just go buy $200 worth of Fiverr gigs, blast my competition with those bad links, and hey presto, I am a winner?

These self-styled gurus are frustrating because so many people out there actually believe that they are gurus.

Reeks of FUD

I understand Bruce's stance, but to me this interview has a significant whiff of FUD / internal Google PR. If Google reduced their tolerance for links they have deemed manipulative to the ratios suggested, then they would likely penalise every site publishing content with a commercial focus. If everyone is penalised, then ultimately there's no change. My view is that smaller brands and sites with patchier "quality signals" (think smaller branded traffic volumes / less clear extrapolated topical authority) will get nailed, whereas larger brands will navigate the penguapocalypse unscathed. Ultimately, Google has to return some sites in the organic rankings.

I still don't understand why,

I still don't understand why, if they can identify "bad links" well enough to penalize a site, they don't simply ignore those links to begin with and move on?

Because they are Google, Steve.

My guess is that Googlebot is consuming a lot of resources these days, and to keep costs down they are crowdsourcing some of the work Googlebot used to do — i.e., finding bad sites and links. Google is moving towards a leaner and meaner model, so every saving helps.

@SEOEnquirer I don't believe

@SEOEnquirer I don't believe your view, since I think if Google did a cost/benefit analysis around all this, they would just quietly ignore links they don't like without all this song and dance to try to get a very small proportion of website owners to behave better. It's highly unlikely that, even if they're listening, they will do anything. You could view all this as PR for Google, but I believe its value is at best zero and possibly negative.

@bwelford

I would really like to respond but I haven't got a clue as to what you just said. 
