Evolution of the SEO Tool Set

Source Title: Patent Paper Set

With the discovery of, and the consequent analysis efforts around, the controversial Google patent that has fueled small amounts of paranoia and large privacy concerns recently, some SEOs are rethinking their toolsets.

The traditional toolset of SEOs is quickly becoming obsolete with the large data sets that are essential for important analysis. Methinks clever SEOs are going to start hoarding large amounts of info, similar to the way G does, to be effective in the future. It's going to be less and less useful to rely on tools that only give "one-time" data without any kind of trending information over time (since these seem to be among the most important "new" variables). Better treat those programmers nice.
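
To make the "trending over time" point concrete, here's a minimal sketch of the sort of thing such a tool might store (the table layout and metric names are placeholders, not from any real tool): dated snapshots per URL, so you can chart a trend later instead of grabbing one-off numbers.

    # Minimal sketch: store dated snapshots of a metric per URL so trends
    # can be computed later, instead of one-off numbers. Table and metric
    # names are placeholders, not from any particular tool.
    import sqlite3
    from datetime import date

    conn = sqlite3.connect("seo_history.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS snapshots (
        url TEXT, metric TEXT, value REAL, taken_on TEXT)""")

    def record(url, metric, value):
        conn.execute("INSERT INTO snapshots VALUES (?, ?, ?, ?)",
                     (url, metric, value, date.today().isoformat()))
        conn.commit()

    def trend(url, metric):
        # Returns (date, value) pairs ordered by date, ready for charting.
        cur = conn.execute("""SELECT taken_on, value FROM snapshots
                              WHERE url = ? AND metric = ? ORDER BY taken_on""",
                           (url, metric))
        return cur.fetchall()

    record("http://example.com/", "inbound_links", 120)
    print(trend("http://example.com/", "inbound_links"))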

I know we have more than a few great tool builders, and tool users, in the TW membership, and I wanted to talk a little about what tools may be desirable, and undesirable, with regard to the patent.

Of course, this does rather assume that you don't think the patent is a complete load of old bollocks devised specifically to put the wind up the SEO community. For my part, I find it hard to imagine such complexity when Google don't appear to be able to detect even enormous amounts of duplicate, low-grade content unless it's pointed out to them by bloggers.

However, it's worth thinking about what kind of data it would be useful to track, and how that might be done.

So, if you do do this stuff, on any large scale, and can share without giving away all your secrets, let's hear what you think about the evolution of the SEO toolset....

Comments

The page freshness bookmarklet

The one mentioned in the linked post, that is. Okay, I couldn't post it because of "suspicious input data". You can find it here.

I use that one

But it's useless on dynamic sites. It'll just give you the date and time you opened the page.
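
For much the same reason, a server-side freshness check hits the same wall. A rough sketch (Python standard library, purely illustrative and not how the bookmarklet itself works): ask for the Last-Modified response header, which dynamically generated pages usually omit or stamp with the current time.

    # Rough sketch: check page "freshness" via the Last-Modified response
    # header. Dynamically generated pages usually omit it, or stamp it with
    # the current time, which is the same limitation the bookmarklet hits.
    import urllib.request

    def last_modified(url):
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            return resp.headers.get("Last-Modified")  # None if not sent

    print(last_modified("http://example.com/"))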

Tools tools tools

Even the basic SEO tools tracking inbound links, on-page factors, basic whois info, etc. take months to build and are very hefty in terms of bandwidth and the like. I can't imagine building a tool that is able to gather and store this much data. You would need multiple servers and very heavy-duty support to be able to do this.
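
For a sense of what the collection step looks like, here's a toy sketch of pulling a couple of on-page factors from a single URL (standard library only; the factors chosen are just examples). The per-page code is trivial; the heft comes from running it across millions of URLs on a schedule and storing the history.

    # Toy sketch: pull a few on-page factors from one URL using only the
    # standard library. The hard part isn't this code, it's running it
    # across millions of URLs on a schedule and storing the history.
    import urllib.request
    from html.parser import HTMLParser

    class FactorParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.links = 0

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True
            elif tag == "a" and any(k == "href" for k, _ in attrs):
                self.links += 1

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data

    def on_page_factors(url):
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        parser = FactorParser()
        parser.feed(html)
        return {"title": parser.title.strip(), "outbound_links": parser.links}

    print(on_page_factors("http://example.com/"))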

Yep

And I know at least one TW member who does exactly that. He's gone all quiet though, despite the bait :)

It's possible and coming

I am a designer and coder, currently in the process of building an advanced set of SEO tools which I believe will be able to deliver and store a lot of data without consuming too much bandwidth, thanks to advancements in internet technologies that let you use the client machine's processing power.

Heftiness

All depends on how much ground you want to cover and what you believe makes the big differences. You only need that much oomph if you want to analyse to the nth degree; for most people that is a pointless and distracting exercise, albeit an interesting one. We are not all people who make this their business, like the person Nick hints at ;O) Even if there are 100 elements in the algo, for example, perhaps only 10% will make the difference for most sites. Do you need to analyse past page 2 of a SERP? How many SERPs are you *really* interested in? If bandwidth costs and processing power are still a problem after that, you are either dealing in very high-competition areas (good luck!) or are using the wrong providers or programmers.

Chop Shop Toolbox or Engineer's Toolbox?

There are two main methods of attack, reverse engineering and prediction modeling. Reverse engineering means you're trying to keep up with what the engines have already implemented, which makes anticipation difficult, while prediction modeling forces you to make assumptions. Naturally using either method requires a different set of tools and poses different risks.

However, prediction modeling works best if you can come close to what is already occurring, so that method borrows heavily from the reverse engineer's toolbox.

For prediction modeling, I like lots of language analysis tools combined with maths tools and analytics. Pattern matching, linguistics, semantic analysis, thesauri, and large databases. Trends and forecasts. The best tool in the box, though, is simple variable isolation in conjunction with testing that adheres to a rigid protocol. The bigger the database, the better.

Data collection has to be kept separate from analysis, which is where processing power and bandwidth issues tend to crop up. It's impossible to analyze all the data on the fly. Collect, sort and then analyze.
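
A sketch of that separation, with illustrative file layout and names only: one pass does nothing but dump raw responses to disk with a timestamp, and a completely separate step reads them back for analysis whenever you like.

    # Sketch of "collect now, analyze later": the fetch pass only writes raw
    # HTML to disk with a timestamp; analysis is a separate step run offline.
    # Paths and naming are illustrative only.
    import hashlib
    import time
    import urllib.request
    from pathlib import Path

    RAW = Path("raw_pages")
    RAW.mkdir(exist_ok=True)

    def collect(url):
        html = urllib.request.urlopen(url).read()
        name = hashlib.md5(url.encode()).hexdigest()
        (RAW / f"{name}-{int(time.time())}.html").write_bytes(html)

    def analyze():
        # Runs later, at leisure, against everything collected so far.
        for path in sorted(RAW.glob("*.html")):
            print(path.name, len(path.read_bytes()), "bytes")

    collect("http://example.com/")
    analyze()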

The best tool to work with is your own search engine. Start small, when you get decent results, add to the document collection, watch the results turn to shit and then tweak until the results are good again. Repeat ad infinitum. That requires careful analysis of all the previous data you've collected in order to build a predictive engine. Then turn pages into patterns and simply create matching patterns. Simple rules can create complex patterns.
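
A toy version of that "own search engine" idea (just the skeleton, nothing like a real one): an inverted index over a small document collection, scored by crude term-frequency overlap, so you can add documents and watch the rankings shift.

    # Toy "own search engine": an inverted index over a small document
    # collection, scored by simple term-frequency overlap. Add documents,
    # watch the rankings shift, tweak, repeat.
    from collections import defaultdict

    index = defaultdict(dict)   # term -> {doc_id: term frequency}
    docs = {}

    def add_document(doc_id, text):
        docs[doc_id] = text
        for term in text.lower().split():
            index[term][doc_id] = index[term].get(doc_id, 0) + 1

    def search(query):
        scores = defaultdict(int)
        for term in query.lower().split():
            for doc_id, tf in index.get(term, {}).items():
                scores[doc_id] += tf
        return sorted(scores.items(), key=lambda x: x[1], reverse=True)

    add_document("a", "fresh content about seo tools and link analysis")
    add_document("b", "link link link spam page")
    print(search("link analysis"))   # "b" outranks "a" on raw tf -- tweak time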

The goal is not to game the engines, but to find out what the engines prefer, then feed the engines a steady diet of those pages. Symbiosis. Anticipating what is best for the engines is the most important part of the entire process. If LSI makes sense, then at some point, the engines will use it, so go ahead and start getting ready for it. If it makes sense for the engines to evaluate link growth patterns, assume that they will use that tool, if not now, in the future.
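
If you want to start getting ready for something like LSI, the core of it is small enough to play with: a term-document matrix reduced with a truncated SVD. The scikit-learn calls and toy corpus below are just one convenient illustration, not a claim about what any engine actually runs.

    # Minimal LSI-style sketch: TF-IDF term-document matrix reduced with a
    # truncated SVD, then cosine similarity in the reduced space. The library
    # choice (scikit-learn) and the toy corpus are assumptions for
    # illustration, not what any engine is known to do.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    corpus = [
        "seo tools for tracking link growth over time",
        "a toolset for link analysis and link tracking",
        "recipe for chocolate cake with fresh cream",
    ]

    tfidf = TfidfVectorizer().fit_transform(corpus)
    lsi = TruncatedSVD(n_components=2).fit_transform(tfidf)

    # Prints the 3x3 similarity matrix; the two SEO-ish documents (0 and 1)
    # should come out closer to each other than either is to document 2.
    print(cosine_similarity(lsi))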

With all that said, the most used tool I've found is a bottle of Advil. Comes in handy when all your careful research and your pages are tossed out the window because the search engine didn't handle 301s properly. ;)

Not sure who you were referring to, Nick, but I haven't been silent. Between the latest round of research and last weekend's project, I've just been busy.

Hmmm..

This is the sort of thread that makes me start believing the doomsday people: "SEO is dead" ;O)

My brain hurts..

Are we getting lost in the smoke and mirrors?

Having read the analysis of the patent, other than getting a headache, I can't say that it's going to have any effect on the way a site is SEO'd, other than that we have to get away from 'big bang' changes towards a more organic approach.

If you look at the way a site would normally progress, then what you would expect is a great deal of activity when it first gets established and then progressive update activity after that. IMHO, all Google is trying to do is detect patterns that don't follow that, to the point where they can say it's definite or potential spam, and it gets ranked accordingly.
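
At its crudest, "detecting patterns that don't follow that" might look something like the sketch below. The threshold and sample numbers are invented for illustration, and the patent's actual signals are far more involved: compare each period's new-link count against the site's own history and flag sudden spikes.

    # Crude sketch of spotting "unnatural" growth: flag any period where the
    # count of new inbound links jumps far above the site's own running
    # average. Threshold and sample data are invented for illustration.
    def flag_spikes(weekly_new_links, factor=5.0):
        flagged = []
        for i, count in enumerate(weekly_new_links[1:], start=1):
            history = weekly_new_links[:i]
            average = sum(history) / len(history)
            if average > 0 and count > factor * average:
                flagged.append((i, count))
        return flagged

    # Steady organic growth, then a week-8 blast of 600 new links.
    weeks = [10, 12, 9, 14, 11, 13, 12, 10, 600]
    print(flag_spikes(weeks))   # -> [(8, 600)]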

Dead?

I don't think it is dead, just bildungoptimization, which is fine since we're dealing with bildungsearch and bildungweb. Quite a bit of dung huh?

New tools will be created to confront new challenges. All the changes are a good thing. Stasis is death. Not change.
