Thursday, September 01, 2005

Help Google Crawl Your Site Better

Google has introduced a new service, Google Sitemaps, for website publishers and webmasters to inform Google about their sites. The service, which is still in BETA, will help the Google search engine crawl pages that are hard to reach through links. In case you did not know, the pages of your site are normally discovered, crawled, and cached by a search engine "spider" following the links within your site.

To use this service, you will have to create a sitemap in XML format. You can create it with Google's sitemap generator (a Python script), or simply write a plain text file of URLs yourself. After you have created the sitemap (an XML file), you will need to upload the file to your site before you can submit its link to Google through this page. They also accept syndication feeds such as RSS and Atom. A more comprehensive guide is made available by Google at this location.
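To give you an idea of what the XML looks like, here is a minimal sketch in Python (this is not Google's official generator, and the URLs and output filename below are just placeholders) that writes a sitemap file:

import xml.etree.ElementTree as ET

# Namespace of the Sitemaps protocol in the current BETA (version 0.84);
# check Google's documentation, as this may change once it leaves BETA.
SITEMAP_NS = "http://www.google.com/schemas/sitemap/0.84"

# Placeholder URLs -- replace these with the pages of your own site.
pages = [
    "http://www.example.com/",
    "http://www.example.com/archives.html",
]

urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page  # <loc> holds the page URL

# Write the file; upload this to your web server before submitting it.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)

Once sitemap.xml is sitting on your server, you submit its full URL to Google and they will fetch it from there.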

In addition to that, there is a sitemap service for mobile devices. Read more about it here.

4 Comments:

Anonymous said...

I guess that is quite useful for static pages, but for blogs it will probably be pretty useless. I might give it a go with one of my sites to see whether Google crawls it better...

3:47 PM  
EngLee said...

It's always worth a try. :)

Alternatively, I think you can check whether Google has already cached each of your pages by searching "site:www.yourdomain.com" before you submit your RSS to them.

4:04 PM  
Anonymous said...

I might want to use this in a project. We have prepared an RSS feed with all the content of the website, which will be fed to Google. I've read somewhere that you can submit your RSS feed and Google will turn it into usable content. Smart, efficient, fast and always up to date.

4:48 AM  
EngLee said...

Yes, you should make use of it! :)

7:03 AM  

