Visualizing TrustRank & the Concept of AntiTrust

Source Title:
More on TrustRank
Story Text:

Aaron has posted a nice, simple explanation of TrustRank, with pics for the hard of understanding (like me), on More on TrustRank. If you're not up to speed, it's well worth a look...

Is TR still theoretical or is it in play?

thanks for the tip lea!

Comments

 

Was talking about this (grid eigen trust) about a year or so ago with relation to the sandbox. I think Brendon was the first to suggest it to me.

Think it makes a lot of sense.

Thank you, thank you....

Thanks for the drop Adam, much appreciated.

So, to explain my ideas on the TrustRank thing (whether you like it or not), it is my contention that TR has been in play to a greater or lesser extent since Florida or something, way back when.

Basically, TR acts as an attenuating factor on the value of any link, based on the linking history of the source site. If a site has a history of linking to "quality" sites, then its links are valuable. If not, then.... not. The main stumbling block in most people's understanding of TR seems to be the definition of quality.

As far as I can tell, TR has NOTHING to do with whether the site in question has good content. Forget about the text for a moment; this is all about web linking patterns (a measure that G invented to a large degree). Content-based relevancy, theming, LSI etc. are handled elsewhere in the ranking process. The definition of quality here is based on who you link to, and for how long.

Links that remain stable are quality. Constantly rotating links are bad. Links from hub/authority sites are good. Links from my-spam-directory.com are bad. Links TO hub/authority sites are good. Links TO my-spam-directory.com are bad.*

If you review the white paper, you will see that TR is quite similar to PR, but takes chronology into account. A link you created yesterday has accumulated little trust of its own, because you might choose to change it after a day, so it carries only a little value; but a link you've had since '97 (and not changed) is at the very least not annoying your users, or you would have had requests to change it.
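To make the age idea concrete, here's a minimal sketch of how a link's value might be attenuated by its age. This is purely illustrative of the poster's theory, not anything from the white paper: the function name, the exponential curve, and the 180-day half-life-style constant are all invented for the example.

```python
import math

def link_weight(age_days, base_value=1.0, ramp_days=180):
    """Hypothetical age attenuation: a link earns trust as it survives.

    A day-old link is worth almost nothing; a link held for years
    approaches its full base value. All numbers are made up.
    """
    return base_value * (1 - math.exp(-age_days / ramp_days))

print(round(link_weight(1), 3))     # brand new link: close to zero
print(round(link_weight(3000), 3))  # link held since '97: near full value
```

The exact shape doesn't matter; the point is just that value is a function of how long the link has remained stable, so constantly rotated links never accumulate much.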

So, new sites that only have incoming links from a known link network (say, an SEO firm's standard "new site treatment" network, or known link pimps), or from/to other new sites, CANNOT rank well against established sites for competitive terms straight off the bat. If their existing linking patterns remain stable or expand (new links only, don't drop 'em too often), they can quickly acquire some trust and start ranking.

The white paper shows that TR is an iterative process, quite like PR again, using "seed sites" of known quality (think DMOZ, Yahoo, other large, stable directories or other resource type sites), and propagating out from there. Don't forget that Google ranks pages, not sites, so sites that don't have trust WON'T be able to get new pages ranked without links from external (and trusted) sites.
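The iterative, seed-based propagation described above can be sketched over a toy link graph. A hedged sketch only: the site names, the damping value, and the single seed are my own assumptions, and the actual white-paper algorithm also uses inverse PageRank to choose the seed set and normalises the trust vector.

```python
DAMPING = 0.85  # fraction of trust that flows along links each round
ROUNDS = 50

links = {                      # page -> pages it links out to
    "dmoz":     ["site_a", "site_b"],
    "site_a":   ["site_b", "spam_dir"],
    "site_b":   ["site_a"],
    "spam_dir": ["spam_dir2"],
}
pages = set(links) | {p for outs in links.values() for p in outs}

seed = {"dmoz": 1.0}           # hand-reviewed trusted seed set
trust = {p: seed.get(p, 0.0) for p in pages}

for _ in range(ROUNDS):
    # each round, seeds re-inject trust and every page splits
    # a damped share of its trust over its outgoing links
    new = {p: (1 - DAMPING) * seed.get(p, 0.0) for p in pages}
    for src, outs in links.items():
        share = DAMPING * trust[src] / len(outs)
        for dst in outs:
            new[dst] += share
    trust = new

for page, score in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{page:10s} {score:.3f}")
```

Running this, pages close to the seed end up with healthy trust, while the spam pages, reachable only through one weak link, end up near the bottom; a brand-new page with no inbound links from the trusted region would score zero, which matches the sandbox observation above.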

As this little lot matches the "Sandbox" effect quite closely, I think you can see where I'm coming from.

* Note that some of this merely reinforces the "bad neighbourhood" spiel that G have been spouting for years (and I wonder whether some of the bad neighbourhood filters were an early version of TR)

 

Is there any difference between TrustRank and BadRank? BadRank has been around for years.

http://pr.efactory.de/e-pr0.shtml I remember reading this in 2002

thanks

A great read - however I believe that inbound links from bad neighbourhoods can only have 'no positive impact on rankings'.
To allow inbound links to have a negative impact would let hardened SEOs take out the competition by link bombing from bad networks.

 

is that a link to help this thread or is a little SEO/forum pride on the line here? :)

either way, that thread is not about googlebowling anymore, it is about new terms and how search engines handle them.

I've not seen any evidence that GoogleBowling works.

I've tried this myself - 40,000 inbound links from Google-indexed pages across four sites, all with the same anchor text.
The site only had 3-5 inbounds (two had the same anchor text) prior to this, as it was 2 months old. The four sites I used are off-topic and in a different language.

The site now ranks #90 for that term when it was previously +400.

do I need 400,000?
BTW I used my sites

sideways -

Has anyone seen googlebowling in play yet?

Lea
~ ooh, unintentional bad pun!

what kind of bot?

can you please explain exactly what the bot did in seo terms to googlebowl a site out of the SERPs?

thanks

form filling = blog spam?

so it was something like blog spamming then?

but if (as you suggest) complaints were at the root of the ban then this is different to googlebowling.

What we want to know is: 'if we blog spam to an extreme level, will the target site be removed from the SERPs?'
