Search engines and a new revamped algo


Some simple facts that make me believe we will soon see the death of the current link- and content-based algo:

1). Hand-passable content can be created relatively easily.

2). Links can be created easily.

3). The average search is now nearly 4 words.

4). Domain names are cheap to buy.

5). Hosting is cheap.

Hence if you are trying to rank for one of the millions and millions of 3/4/5/etc. keyword terms, you can do so easily and cheaply. You can rank for lots and lots of them and just pull a bit of traffic from each.

With the current algo this is easily doable, and we will see lots of it in the next 24 months.

So as an SE it is not possible to hand-check these; you need to use an algo. As these sites will have some relevant content, users will sit on them for a while. So what will the new algo do to stop this, or will the SEs not care?



Hence live bookmarks

Hence live bookmarks becoming a part of the algos.

Bulldog goes to the drawing board on how to create as many live bookmark accounts as possible.

The thing is, so what if people are creating on-topic content for those 3/4-word searches? The SEs claim it is all about relevancy; most of the auto stuff I see is still on topic, and the AdWords are relevant ;-)


We are within 2 years of real, true, authentic, themed, auto-gen'd content using AI bots. (Actually, I think some are on the brink now, but didn't want to scare Matt & Tim.) While that is a mind-boggling prospect, imagine what 2k, 4k, or 256k smart, funny, prolific, and auto-propagated blog-bitches could, no, will do to the SERPs.

Scrapers and auto generated

Scrapers and auto-generated sites are already so good that ordinary users make repeat visits and even ask to be listed, etc. If it's only the competition and the scraped websites' owners who complain, and your average user is none the wiser, does Google need to do anything about it?

isnt clickstream data the

Isn't clickstream data the search engines' response to this? I'm sure many of the auto-generated sites will be okay even if that is the search engines' response, but I think a lot won't.
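As a rough sketch of the kind of clickstream filter being suggested here (every name and threshold below is hypothetical, not anything an engine has confirmed using), an engine could demote URLs that searchers keep bouncing straight back from:

```python
# Hypothetical clickstream-based demotion filter.
# A "short click" means the user left the page within a few seconds,
# which is a plausible (assumed) signal of a low-value result.
from collections import defaultdict

SHORT_CLICK_SECONDS = 5   # hypothetical dwell-time cutoff
DEMOTION_RATE = 0.6       # demote if more than 60% of visits are short clicks

def short_click_rate(visits):
    """visits: list of (url, dwell_seconds) pairs from the clickstream."""
    totals = defaultdict(int)
    shorts = defaultdict(int)
    for url, dwell in visits:
        totals[url] += 1
        if dwell < SHORT_CLICK_SECONDS:
            shorts[url] += 1
    return {url: shorts[url] / totals[url] for url in totals}

def demoted(visits):
    """Return the set of URLs whose short-click rate exceeds the cutoff."""
    return {url for url, rate in short_click_rate(visits).items()
            if rate > DEMOTION_RATE}
```

On this model an auto-generated site that genuinely holds visitors would sail through, which matches the point above: some of these sites would be okay, and a lot wouldn't.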

I suppose those who value manual content production will need to focus on personalization, immediacy, and the creation of an identity -- stuff that is more about relationship-building as opposed to content-building. IMO this in turn sets the stage for where auto-generated content will go as well.


I get 10-20 emails a day from one site, and believe there is no human behind it!


Back to Yahoo Directory

The only way I can see the SEs ever being able to enforce quality is by using the old "money talks and BS walks" philosophy, so the solution is simple: paid, hand-checked inclusion will probably prevail. The current scrapers with a thousand domains on a single server would probably go out of business if they couldn't make the top 100 results without paying the one-time inclusion fee of $299, or whatever it is, for each domain.

They couldn't afford it and they'd all soon be dust in the wind.
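Some back-of-the-envelope arithmetic on why, using the $299 figure from the post above:

```python
# Rough cost for a thousand-domain scraper operation if every domain
# had to pay the one-time $299 hand-check inclusion fee.
domains = 1000
fee_per_domain = 299  # dollars
total = domains * fee_per_domain
print(total)  # 299000
```

Nearly $300,000 up front, against domains that each earn only "a bit of traffic" apiece.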

The SEs will probably initially leverage Yahoo's paid-inclusion directory content as a huge part of determining who deserves top billing in all the SERPs, and those not serious about being in business, i.e. those who never paid to be hand-checked for inclusion EVER, will probably sink to the bottom of the listings.

That will make a nice secondary market for niche directories and lesser search engines, just to give those that can't afford to pay a shot at visibility. That is how my directory started in the first place, as 20,000 people can't all be in the top 10.

See a problem, find a solution, make a buck.

don't bet the farm

as 80% of the above sounds like bollocks to me :-)


...far more than 80% of the people pay you no mind, andy, hhh.


I'm not talking about scraping in the usual sense. Let's see, before it can go legit it'll need an acronym --we'll call it "cognitive robotic reading" or CRR. And, some of the AI apps are already out there, some commercially available, in fact.


got me!

$2000 if they can get it to use a keyboard

otherwise, "artificial intelligence engine using more than 50 algorithms to simulate hormones and sophisticated emotions" is just $200
Pleo, by the Furby company (for the record, Furbies could get to be a PITA)


Isn't it just a question of a lot of peanuts and corresponding typewriters?

quality site

One day soon, or maybe now, it will be impossible not only for a spider but for anyone to see the difference between a hand-built site and an auto-generated one.

Then what is the definition of a quality site?


users decide

>> Then what is the definition of a quality site?

number of people bookmarking
toolbar info like length of stay
toolbar user input like ranking pages for merit
GA info like sources of traffic to and from the page
social networking/tagging (till tagging becomes too spammy)

Isn't it the case that they're working on letting users decide which are quality sites?
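The signals listed above could be blended into a single score. A minimal sketch, assuming every weight, field name, and normalization below is made up for illustration, not anything the engines have published:

```python
# Hypothetical weighted blend of the user signals listed above.
# Each signal is assumed pre-normalized to the 0..1 range.
WEIGHTS = {
    "bookmarks": 0.30,          # number of people bookmarking
    "dwell_time": 0.25,         # toolbar info like length of stay
    "user_rating": 0.20,        # toolbar users ranking pages for merit
    "traffic_diversity": 0.15,  # variety of traffic sources to/from the page
    "tag_count": 0.10,          # social tagging (till it becomes too spammy)
}

def quality_score(signals):
    """signals: dict of per-site user metrics, each in 0..1. Missing = 0."""
    return sum(w * signals.get(name, 0.0) for name, w in WEIGHTS.items())
```

The weakness is the same one the thread keeps circling: every one of these user signals can itself be botted, so the weights would have to keep moving.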

are you guys forgetting the dynamic nature of web publishing?

y'all sound like a bunch o' newbies.

Chat rooms were hot. They built AOL/Compuserve. Bots came in all natural-like and ruined it. Did the sky fall? Did the online world end? Did people abandon the Internet even though chat rooms are all but unusable with hundreds of Erotobots hawking webcams or DrPhilBots seeking out fatties to advise about hoodia and SlimTrim?

Do you even know anyone today who participates in "chat rooms"? How about blog comment spam? Did it ruin blogging? Did email spam ruin email?

I am more afraid of the telcos metering Internet traffic than search engines changing abruptly. We can all flex around most anything on the user side but without bandwidth.... ouch.
