NoFollow Nastiness - a Big Boo for Google!

I recently turned on the highlight-nofollow-links function in the SearchStatus extension. It highlights any link that uses nofollow and surrounds it with a dotted red outline.
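For what it's worth, the visual effect is roughly what a rule like this dropped into a page would give you (just a sketch of the effect, not SearchStatus's actual code):

    <style>
      /* outline any link whose rel attribute contains the "nofollow" token */
      a[rel~="nofollow"] { outline: 1px dotted red; }
    </style>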

It seems asinine to me that Mozilla Developer articles use nofollow when referencing resources.

Wikipedia is shafting Creative Commons.

And Technorati is using NoFollow on their own internal links? WTF?

Webmasters are afraid to vote for other legitimate resources when they cite them, and some are even afraid to vote for their own sites. These are big sites here too...not just Joe mom and pop sites.

Do we really want to kill off legitimate citations? Take away the editorial ones and what does Google have left to work from? And yet Matt Cutts has to scare people in his blog.

How fragmented does the web have to be before Google realizes advertising is a natural part of the business world? I mean, it is only 99% of their revenue, and they even have special pages up on how to use link manipulation to make Google more money.

When will the NoFollow nastiness end?

Comments

Webmasters are afraid to

Webmasters are afraid to vote for other legitimate resources when they cite them

Nah, it's just people thinking they can be greedy with "Googlejuice".

It's an old habit in SEO to "preserve" PageRank by killing link benefits of outbounds, but with the introduction of nofollow, we simply have a tool more accessible for ordinary webmasters to do the same.

And because of that, it

And because of that, it won't be long before search engines will have to follow nofollow links (whether or not they do now) because it'll be a stupid spammer's tool for hoarding PR (pffft) so they'll follow the links in order to counter the tactic.

As far as I'm concerned, if you don't trust a link on your site, take it off your site. I'm afraid suggesting the use of nofollow was a pretty good thought that just didn't work in the real world, and trying to scare people into using it is a dumb mistake.

slashdot too

Slashdot, too, is using nofollow on user links in stories and in comments.

Really, this nofollow and "all links are bad (yes, all of them)" stuff is getting ridiculous.

Why don't we stop using links altogether, and blame Google for the mess?

Greed and Wariness

Given that most people who have sites will have no idea that nofollow even exists, it stands to reason that only those who, theoretically at least, know what they are doing use it.

Whilst I do not use it myself, were I to do so I couldn't care less about hogging the juice; my concern would be the danger of linking to a bad site.

To that end I rarely link out on any of my sites anymore, sad, and silly, but true!

Mozilla sold out

to G$ a long time ago; a stock FF installation is pretty much the de facto Gbrowser.

it is only 99% of their revenue

Exactly. What a compelling cause to discourage the sale of links, and to confound the other value of setting them, by the greatest competitor that GOOG has - the bazillion other websites on the Internet.

And by remaining somewhat vague about its application of policy regarding links and necessarily secretive about its algorithmic evaluation of links, Google can chuckle all the way to the bank while those bazillions of website owners accept that discouragement and comply via action or inaction or recalcitrance.

I laugh too. But, I am not going to the bank.

Misinterpretation of the Specifications

I distinctly remember the day that Google, Yahoo! and MSN came out with their announcements for support and implementation for the nofollow attribute. Personally, I've never used it as I don't have a blog or similar outlet.

Quote:
Q: What types of links should get this attribute?
A: We encourage you to use the rel="nofollow" attribute anywhere that users can add links by themselves, including within comments, trackbacks, and referrer lists. Comment areas receive the most attention, but securing every location where someone can add a link is the way to keep spammers at bay.

http://googleblog.blogspot.com/2005/01/preventing-comment-spam.html
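In practice the markup they're describing is nothing fancier than this (illustrative example, not lifted from the Google post):

    <!-- a reader-submitted link in a comment, flagged so engines won't count it as a vote -->
    <a href="http://example.com/" rel="nofollow">my site</a>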

Since then, many have found other creative ways to use it. And, most of the "many" are SEO types. For me, it would be an indicator and/or flag if I saw those nofollow attributes on anything other than what it was intended to be used for. Think Footprints.

Misinterpretations...

well, one Matt Cutts can be thanked for adding another interpretation to the fold when he stated that "advertising" text links should be tagged with nofollow (a la the Zawodny debacle).

it's such bullshit when that attribute is used for anything but user generated content. if you don't even trust the links you're adding to your site... wtf?? And, tbh, if you have to have them on your user-genned content then is it really even quality content? 'Cause that just says insufficient moderation resources have been dedicated and instead we've hired the cheap nofollow babysitter.

He's asking for identification...

Quote:
Yes, if you sell links, you should mark them with the nofollow tag. Not doing so can affect your reputation in Google.

The above from Matt's blog. He's basically asking the text link brokers to flag their networks. ;)

Nuevojefe

Quote:
it's such bullshit when that attribute is used for anything but user generated content. if you don't even trust the links you're adding to your site... wtf??

Absolutely. And some people would even include adverts in that. If you don't accept ads from sites you wouldn't recommend, then why would you indicate that you don't trust your advertisers?

Quote:
And, tbh, if you have to have them on your user-genned content then is it really even quality content? 'Cause that just says insufficient moderation resources have been dedicated and instead we've hired the cheap nofollow babysitter.

I don't completely agree with you there. I only work on one blog right now (a client's), and I personally delete over 100 junk comments and trackbacks a day. I think they take care of some more of them when I don't happen to be looking. But even the comments and trackbacks we approve for publishing get a nofollow, and I think that makes sense. We only want people commenting for the sake of keeping the conversation going. If they get traffic from the link they post, that's fine, since it's also good for the conversation. But we don't want people posting in order to get a link.

Besides, if they post some really good comments and it turns out they write a great blog that would be of interest to our readers, they're going to get listed among the featured blogs, which will appear on every page, and we don't stick nofollow on those. I think that's fair. We're not using nofollow to say we don't trust the link; we're just indicating that you should only post a link if it's good for everyone involved and let us judge it from there.

advertise w me, I will call you a piece of...

why would you indicate that you don't trust your advertisers?

Exactly. Given browser extensions like SearchStatus, and how search engines may treat NoFollow links, you may actually be doing your own site AND your advertisers a disservice by marking them with a NoFollow.

it won't be long before

it won't be long before search engines will have to follow nofollow links

Yahoo is following them anyway

or before

a nofollow becomes noted as a "link to a more worthwhile site than mine"

I believe any site that

I believe any site that basically has links going everywhere and anywhere is a good case for the nofollow. It is impossible to continually chase down and follow links to ensure a site is doing the right thing all the time. I know they don't, 'cause I see the crap every day through directory submissions. Sites get rejected, remove the content, resubmit, get included, then put the crap back up. Others I have watched go out on link-building campaigns for a good year, then fill their site up with spam once their pagerank is high... go figure.

Trusting the person behind a website is becoming a big call nowadays, and in many instances - blogs, forums, directories and any site that is quite interactive, with content continually added - the nofollow has a good place to protect the owner from being punished for every other stupid dick's greed through spam and misconception.

I have used it for some time now within directory list, on all listed sites and internally, as the listings are not about voting for the site; they are there to provide the users of the directory list an easy reference to submit to and find directory resources. Numerous directories have changed their content, or closed their directory and redirected the site to another with no relevance, purely to 301 the existing link and pagerank benefits.

I use it internally in things like "add link" and so forth, as the search engines ignore robots.txt, so the only way to stop your own site getting thrown out for filling their index with thousands of worthless add-url pages is to add the nofollow to the links to those pages. I use it within my forum, as there is no way in hell anyone could manage all the existing links within such an environment.
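For anyone wondering, that internal use is nothing more exotic than this (hypothetical paths, obviously):

    <!-- keep engines from counting, and following into, thousands of near-identical submission pages -->
    <a href="/add-url?category=widgets" rel="nofollow">Suggest a site</a>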

Threadwatch is a good example... could you tell me that you go back and check every single posted link to ensure the resource is either still current, the site is still around and not closed, someone hasn't purchased the domain when expired and filled it with porn, gambling or pharmo crud, etc etc? No... I doubt very much you could. Link check programs don't tell you all this, just whether a site exists or not. That's one of the many problems... what about the rest? The task would be so large that running these type sites wouldn't even be viable in the long term. Using the nofollow most certainly has some golden areas of use for the owner of websites IMHO.

I'm not one to give a rat's cracker about pagerank or any other SEO BS... but I do care about what damage others have the potential to do without my knowledge!

I don't agree that it should be used for something like site sponsors, for the same reasons as mentioned above... they are generally very few and easily controlled, and you should be voting for them if you're taking their money. But saying it doesn't have its place is nothing more than looking through blinkers and not seeing the larger picture of abuse that exists today.

If you run a site that serves no interactive or public-interaction purpose, then you shouldn't need to use something like the nofollow, agreed... as these type sites are doing nothing more than attempting to hoard pagerank and other silly webmaster bullshit tactics.

robots.txt

Quote:
I use it internally in things like "add link" and so forth, as the search engines ignore robots.txt, so the only way to stop your own site getting thrown out for filling their index with thousands of worthless add-url pages is to add the nofollow to the links to those pages.

I've never seen that happen. They may grab the link while they're crawling the page and then return the page for a site: search, but I've never seen one of the major SEs actually crawl and index a page that was blocked with the robots exclusion protocol.

Sorry qwerty, I should have

Sorry qwerty, I should have added "to an extent". From what I got from Matt Cutts' post about this, http://www.mattcutts.com/blog/googlebot-keep-out/, he gives cases where they have shown a blocked URL with a DMOZ description, and I have seen blocked URLs show up as supplemental results in Google from my own sites, which shouldn't be there. Matt recommends using the actual meta tag to do the job, but then I have seen the instance of what happened to WOW a couple of years ago, where Bruce was using the meta tag of noindex or nocache, one of those two, and then got smashed because he used it too many times, thinking he was helping Google and the SEs... but instead got punished for it for a short while until he determined the problem. I'm pretty sure that was it, from memory.
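For reference, the meta tag Matt is talking about is the standard robots meta markup, sitting in the head of the page you want kept out of the index (illustrative snippet):

    <!-- tells compliant engines not to index this page -->
    <meta name="robots" content="noindex">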

Whilst they acknowledge the robots.txt and don't crawl the content, they have still been using the URL and descriptions from other sources, or showing them as supplemental results, which is bullshit IMHO, considering the page shouldn't exist anywhere throughout the SE results. So I simply add the nofollow to those internal links now, just in case they decide to get anal one day and wipe out one of my directories for their stupidity indexing page URLs that I excluded in the first place. Now I just don't bother, and use the nofollow instead.

I should have been more careful in wording that one, sorry. That is the gist of it from my understanding... which could be way out, but I have seen it happen, thus I took the appropriate actions to counterbalance and double-check.

a great legacy ...

so, this *great* idea had a lifespan of, what, six months? before the whole thing blew up. i thought it was total bs when it was announced, and sure enough... here we are today.

why should webmasters be doing a job that is rightly the job of the search engines? they cannot depend on external, voluntary clues to do their job. by definition, it is open to manipulation. and that is what we are seeing now. as sure as day follows night.

ROFL

nofollow was a kneejerk reaction to comment spam.

'Sorry, but our algorithms can't deal with comment spam, will you stupid fuckers help us out? Puhleeze? All you stooopud fuckers that can't trust yourself to link to what you want to link to, and instead, find yourself linking to woody pills or grow-hair-on-my-sack pills, puhleeze, let us know that yer so goddamn dumb you can't even trust outbound links on your own site...

And since yer so kind can you just emblazon SEO on yer pumpkin....

Sorry you fucksticks, but if you've ever used nofollow as anything other than a joke or to fuck someone else, yer an idiot. Just bend over and wrap yer lips around yer own asshole and suck until yer head explodes. At the very least, you'll reduce the number of stupid people that can breed. Follow?

While I normally agree with

While I normally agree with most of the stuff you say, Anthony, I really can't bring myself to agreeing with this part:

Threadwatch is a good example... could you tell me that you go back and check every single posted link to ensure the resource is either still current, the site is still around and not closed, someone hasn't purchased the domain when expired and filled it with porn, gambling or pharmo crud, etc etc? No... I doubt very much you could. Link check programs don't tell you all this, just whether a site exists or not. That's one of the many problems... what about the rest? The task would be so large that running these type sites wouldn't even be viable in the long term. Using the nofollow most certainly has some golden areas of use for the owner of websites IMHO.

My problems with that are manifold, but can be summarized like this:

  • If I am concerned about sending people to a site that I link to, then nofollow does nothing to stop that.
  • Why try to keep the search index cleaner than your own site? If you are not comfortable linking, then don't.
  • If we are so worried about the actions of a few outliers that it stops us from doing what we like, then we need to restructure what we are doing and why we are doing it.

Nice post DG :)

Nice post DG :)

You raise a good comment

You raise a good comment, Aaron... and I agree with some aspects here, and disagree with others, as we all do. At the end of the day, the net spam is getting worse and worse, and I guess the SEs had to take a stand at some point to give webmasters a way to protect themselves against what the search algorithms have in fact implemented, regardless of what we want or think. To me, that simply means I have to move with them, as going against them will do very little in the scheme of things to Google, Yahoo or MSN. None that I know reside here have enough clout to stop what is implemented, thus we can fight against it, or use it to our best advantage so we don't harm ourselves thanks to all these squeezy fuckers screwing up the WWW. I'm not one of these white pointy hat people, and am always up to see some of the best blackhat techniques used to achieve top space... but the average webmaster, who has no inclination nor idea of what they are doing, does stupid things they follow from half-arsed sources... just like they do with the Google toolbar and Pagerank.

It doesn't matter what any of us here know; it only matters what the search engines have done IMHO, and that I must work with them if I wish to continue evolving sites within their space.

I take a stand about lots of things; this just isn't one of them for me. I do totally agree with some of the comments about this though, as they do fit quite well, and people are using it for self-indulgent PageRank hoarding and stupid crap like that. Everything to date is abuse when it comes to rankings, so this is just another obvious one to add into the equation for some.

I don't have the answer... but if the search engines didn't punish for linking to crap, then that would also be nice, and we wouldn't have to worry about silly little things like the introduction of a new tag.

Does the problem lie with the webmaster or the search engines? Definitely the search engines... but the source comes from the webmaster... so I guess it's a bit of punishing each other. We spam their rankings, they dick us around making it harder to rank...

It doesn't matter what any

It doesn't matter what any of us here know; it only matters what the search engines have done IMHO

I feel that if they are manipulating public perception for personal gain and if that is hurting the web as a whole it is worth discussing.

It's time the tail stopped wagging this dog

It's time webmasters - the creators of web content - and web users took back this web from the search engines, who are creating a society dominated by paranoia and distrust.

They feed us small tidbits of information occasionally, which are pored over as though they were swallows' intestines in a Roman temple, seeking the auguries.

Whenever they ask us to do something or warn us about activity, we all lose the plot.

WTF is happening to us? Please will somebody invent an open-source, open-standards search engine so we can take back the web!

Now that would be nice... no

Quote:
WTF is happening to us? Please will somebody invent an open-source, open-standards search engine so we can take back the web!

Now that would be nice... no more jumping through hoops!

I think

DG's post should be highlighted on the homepage of every SEO site and forum.

Open Source

Quote:
Please will somebody invent an open-source, open-standards search engine so we can take back the web!

Nutch

add

boxes
bandwidth
branding

no problem.

open source engine

Sure, boxes, etc. But there is more and more bandwidth. Surely we just need a peer-to-peer engine that uses all that spare capacity in our network.

I mean, search engines are far too important to allow a small group of Americans to run them.

Also strange that Google

Also strange that Google Finance nofollows everything on its front page. Google News has no problem with the same source material.

Taking back the web

Quote:
WTF is happening to us? Please will somebody invent an open-source, open-standards search engine so we can take back the web!

Actually what's happening, slowly, is that Web 2.0 is screwing Google. The first clue was the 'nofollow tag'. Basically it should have alerted you to the fact that this was the first time Google panicked because they couldn't create a filter to do the job. You should have picked right up on that and asked yourself WTF? Sometime after that you should have realized that dynamic-content websites (by definition Web 2.0) are the Achilles heel of Google. You can only tweak the algo so much before someone stumbles upon a core feature that can't be filtered out without breaking the whole thing (impossible; massive human prioritizing would be required).

Here's what lies in the future:
Web 2.0 is creating communities like MySpace, Digg, craigslist etc. that operate completely independently from search. Peer recommendations determine what's a good link and what isn't. Slowly but surely the top listings in all the search engines will fill with these kinds of communities/products that have their own internal worlds. Each one will be niche-specific and run its own search and advertising... spam- and click-fraud-free (CPM models). This will gut the Google AdWords listings.

Everyone should have figured out a long time ago that there was no way a mathematical algo could have stood up to the hordes of blackhats. Too much brain power out here. Web 2.0 is right around the corner and moving in like a freight train. Google is trying to compete with crap like Google Base, but they've gotta know in their bunker right now... it ain't working, and it's only a matter of time. So 'chin up' mates. Some things in life are inevitable and this one's easy to predict.

open the windows and air out

open the windows and air out the paint fumes flyboy

Web 2.0 communities fall apart as they scale up. The suggestions become crap as the masses have the run of the place. Plus I can attest to the fact that you can spam the shit out of them successfully with simple programs, 'cause I do it.

great post, flyboy, although

great post, flyboy, although i think G is positioning itself well for the coming change. i think much of the battle remains to be played.

Web 2.0 communities fall

Web 2.0 communities fall apart as they scale up.

but when they stay niche, good ones will be able to very effectively compete with G in that niche. take 10,000 of these web 2.0 community niche things and stack 'em up against G and you have a very interesting battle.

I agree but please point out

I agree, but please point out the good community sites that stay niche and don't get overrun by all the fans. The community site is not a new idea; forums, BBSs, newsgroups, etc. have been around for ages, and they all fail as they grow, which they are destined to do if they are any good to begin with.

I agree but please point

I agree, but please point out the good community sites that stay niche and don't get overrun by all the fans.

there is this new one called threadwatch... maybe you heard of it? :)

touché my friend... While I

touché my friend...

While I have to agree with you there, I will argue that Threadwatch is an exception to the rule.

Spamming Dayz

Wait 'til you see what's coming down the road, WP. It's a little bit different than what you're used to. Your spamming dayz are OVAH! heh heh.

Outside the niche

I realise that once you are a member of a niche you rapidly stop using a generic search engine. I mean, imagine using Google to find out what we learn and enjoy at threadwatch - impossible. The team at threadwatch do the finding for me.

Great. I was thinking of outside the niche. All those times we don't know where to start. It could be that the top communities end up at the top of the rankings, but that being the case, it will not be necessary to create the massive infrastructures of Google et al. Maybe these communities can share some search engine platform that pools the resources together and that uses not the Google brain trust but the web brain trust, in the way, say, that Linux does. A sort of creative commons for the search world.

So what I am hoping is that all these disparate Web 2.0 communities will get together enough to create a search operating system that we can all understand, whose decisions are not arbitrary and not masked, and which is not centric to one part of the world.

Craziness

Some people have gone to crazy extremes putting rel='nofollow' on all of their links...
