Google, Yahoo, Live, Ask support Sitemaps auto-submission

Now, just by adding the URL of your XML Sitemap to your robots.txt file, *all* search engines can automatically discover it! Much more convenient than manually pinging each engine.
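For example, one extra line in robots.txt is all it takes (the sitemap URL below is a placeholder, obviously):

    User-agent: *
    Disallow: /private/

    Sitemap: http://www.example.com/sitemap.xml

The Sitemap: line is independent of any User-agent section, so it can sit anywhere in the file.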

http://googlewebmastercentral.blogspot.com/2007/04/whats-new-with-sitemapsorg.html

Comments

Whoo Hoo!!

This is the best news I have heard all day!

I did, however, bitch about it, because in essence they are switching the robots.txt file from an exclusionary document to an inclusionary document... actually, it's a mixture of both. It should have been a separate file.

Nice

It's good to see more movement on standardization, like the reverse->forward DNS verification Google and MSN rolled out (and others are adopting now) to help webmasters stop crawler spoofing as well.
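For anyone who missed that: the check is a two-step lookup, reverse DNS on the crawler's IP, then forward DNS on the resulting hostname to make sure it maps back. A rough Python sketch (googlebot.com used as the example domain; a real check would also cover the other engines' crawler domains):

    import socket

    def is_real_googlebot(ip):
        """Reverse-then-forward DNS check -- a sketch, not production code."""
        try:
            # Step 1: reverse lookup; a genuine crawler IP resolves to
            # a hostname under googlebot.com
            host, _, _ = socket.gethostbyaddr(ip)
            if not host.endswith('.googlebot.com'):
                return False
            # Step 2: forward lookup; the hostname must resolve back to
            # the same IP, otherwise the PTR record could be spoofed
            return socket.gethostbyname(host) == ip
        except (socket.herror, socket.gaierror):
            return False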

mixing the purpose of Robots.txt

Yes, The Founder, I agree that they're slightly abusing the intention of Robots.txt -- it's called the Robots Exclusion Standard for a reason.

I admit, though, that as a matter of convenience I like it. Perhaps Robots.txt should evolve for the sake of helping webmasters, rather than being so narrow in its scope.

What's wrong w/ evolution? It's just not going to happen in a timely fashion

Who has never used "Allow:" directives in robots.txt? And if you really haven't, have you never wished for a simple way to refine sledgehammer "Disallow:" directives at the directory level with smart "Allow:" pinpricks applied to URL fragments or even parts of query strings?
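Something like this, say (paths invented for illustration; "Allow:" and the "*" wildcard are nonstandard extensions, though the major engines understand them):

    User-agent: Googlebot
    # shut the whole directory out...
    Disallow: /shop/
    # ...except for one pinpricked path
    Allow: /shop/specials/
    # and keep session IDs in query strings out as well
    Disallow: /*sessionid=

Google resolves Allow/Disallow conflicts by the most specific (longest) matching path, so the Allow: line wins for /shop/specials/.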

Ancient standards like the robots exclusion protocol from 1994 must evolve to fit current needs. There's nothing wrong with implicit allowance hints like pointers to an XML sitemap. The robots exclusion standard has already been extended, for example with new values for the robots meta tag (noodp, noydir), and I certainly like self-written snippets on the SERPs.
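For reference, those extensions are just extra values in the robots meta tag (noodp suppresses the ODP/DMOZ titles and descriptions on the SERPs, noydir the Yahoo Directory ones):

    <meta name="robots" content="noodp,noydir">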

The whole mess is that there are way too many places to put crawler directives, and way too many formats involved. I hope the engines got the message from today's robots.txt summit: get together and standardize this confusing conglomerate of crawler directives.

I'd like to see a "Web-Robot Directives Standard" soon ;)

So Google is making Robots.txt a Very Important File

If a tree falls

in the woods and nobody is there... err... if Ask reads sitemaps and nobody uses their search engine... does it make a sound?

It appears the only way to change standards

...is for large players like this to jump in and make changes happen.
