Google Slaps Mozilla.org Over UGC?


So the good folks at Mozilla got a manual penalty from Google, but the message was as cryptic as ever:

Google has detected user-generated spam on your site. Typically, this kind of spam is found on forum pages, guestbook pages, or in user profiles.

As a result, Google has applied a manual spam action to your site.

Well, that is certainly helpful and transparent. Even the US government gives you a pass on what your users create, as long as you correct it when it's called out. But how can you correct something that is as elusive as a unicorn?

Google has ruined the original functionality of the WWW and of links. The rule is no longer "build for your users"; it's "build for your users, and for us, or we will slap your site."

https://productforums.google.com/forum/#!topic/webmasters/pg_4FmjEc_8/discussion

Comments

As I've already said, I agree

As I've already said, I agree that Google's so-called transparency is a joke. You can't spank your kid and tell him it's for something he did, without telling him what, and expect any improvement in his behavior. But on the other hand, Mozilla was allowing some really egregious shit to stay on their site, making no effort whatsoever to control it. That's on them, as far as I'm concerned. They deserved to get hammered.
But yeah, they deserved to be told what they were hammered FOR, too.

more info

I see Matt has responded there as well, saying that, like the BBC case, this was a granular action against one URL:

"I saw that the URL has been considerably cleaned up now--thanks for that. I wouldn't take the message as an criticism of mozilla.org; it was just a heads-up that we'd taken granular action (in this case, on a single URL) because we saw spammy user content on the site."
https://productforums.google.com/forum/#!topic/webmasters/pg_4FmjEc_8/discussion

Would be nice if a "heads up"

Would be nice if a "heads up" included a link to the offending content though, doncha think?

If Google ignores it, why isn't that the end of it? ...

I came up with this wild idea: perhaps if Google knows to ignore it, then they should just ignore it and take no further action. Nor should the website owner be required to do anything. Doesn't everyone win with this simple approach? :)

You are presuming, of course,

You are presuming, of course, that Google's algo can actually tell the difference and ignore it. The more they go after webmasters to take things down, the more I think their algo has become human and relies on webmasters to fix issues it can't handle.

Google's algorithm can't be human

I beg to differ, Steve. A human algorithm isn't scalable, so Google has to do as they say, which is to rely on mechanical (non-human) methods. If that is so, then I don't see why the KISS approach wouldn't save us and Google a lot of time and enormous human effort.

I somewhat agree Barry, but

I somewhat agree, Barry, but their current algo isn't doing a very good job as it is, and we are seeing more and more 'manual' penalties being levied. At least from the outside, that is the way it appears.

I just don't understand how hard it is to simply ignore link directories all using the same category topics, WordPress blogs still running the default sample/About Us pages, etc. etc. etc. Why would a site that still has a page that says

This is an example of a WordPress page, you could edit this to put information about yourself 

be given any value at all? And more importantly, why would it generate a cryptic warning? How hard would it be to include

"We have found content on your website, such as 'link', and feel that it should be addressed to stay within the Quality Guidelines seen here: 'link'." They obviously know which pages they don't like, since they're putting in a penalty (whatever you want to call it). Why can't they lead the webmaster a bit and point out the issue more directly so that it 'can' be addressed?

Google should let the offenders stew. ...

If Google gives no indication, then the webmaster will never know, and will continue to assume his bad behaviour, which is negated by Google anyway, is worth pursuing. Only he or she is the loser. So much less work for Google. They've put up the rules of good behaviour. If we find our website's performance is less than we would like, then we've got to consider where we might be going wrong and how we might improve. Google avoids the burden of giving warnings. What's wrong with that?

Aaron Wall's take on Google & spammy UGC

Hey all! Lively conversation here and I see some validity to all of the points made.

I especially like Doc's analogy of spanking a kid without explaining what for, then expecting improvement in behavior. And I agree: if Mozilla let truly "egregious shit" (love it!) stay on their site, then absolutely, it is on them & they should suffer the consequences.

In short, I see both parties as culpable in this instance.

In a related thread, just before seeing this one, I posted Aaron Wall's piece on Google and spammy UGC (under Content). As he is wont to do (and I appreciate him for this), Aaron takes Google to task for "encouraging user generated spam on social media sites" - or at least, turning a blind eye.

Here's the link: http://www.seobook.com/getting-granular

Google turned a blind eye to

Google turned a blind eye to crap on the net for a very long time. Now they are in the position of being forced to try to clean it up so that their algo actually works again. There are some pretty awful SERPs out there these days because of the behavior that Google encouraged over the years, or at the very least did nothing to discourage. (MFA, anybody?)

I've not been a fan of Google for a long time, but lately I've started taking a rather serious dislike to their methods.
