
Google Adds Site Map Optimizers



Google today has added a new "feature". From their new pages...

 

 

What is Google Sitemaps?

 

Google Sitemaps is an experiment in web crawling. Using Sitemaps to inform and direct our crawlers, we hope to expand our coverage of the web and improve the time to inclusion in our index. By placing a Sitemap-formatted file on your webserver, you enable our crawlers to find out what pages are present and which have recently changed, and to crawl your site accordingly.

 

Basically, the two steps to participating in Google Sitemaps are:

 

1. Generate a Sitemap in the correct format using Sitemap Generator.

2. Update your Sitemap when you make changes to your site.

 

Some of it is beyond my understanding but I am sure some of you will understand it.

 

Here is the link: https://www.google.com/webmasters/sitemaps/...s/en/about.html

Good Luck, youneverknow


Ya gotta love it. I think what it breaks down to is pretty simple - Google is not charging you to put your own web site info into their search engine.

And they are not even charging you to create a file that will make their spiders more efficient, their jobs easier and reduce their computing overhead. What sweethearts!!!! :wallbash:


I read something about their little Perl script (was it Python? I don't remember)... but someone tried out one of the first releases and it crashed his server due to heavy resource usage. It has more than likely been resolved by now, but I just thought I would mention that.


Are they serious? Really?

 

Because I thought there was a meta tag you could add to a page that would do the same thing. Oh, right: "Revisit-after." If I put "Revisit-after" on all my pages, Google should be able to make up their own lists of sites and pages. Which... is sort of what they do for a living. But... if I can't be bothered to put "Revisit-after" on my pages, what in the WORLD makes the Google folks think I am going to stick an entire program on my site and remember to run it every time I make an update? Yeah, yeah, a cron job. Are they paying me for my computer usage?

 

Bah. This doesn't add a thing to my site that Google shouldn't be doing anyway. Wake me up when they stop shooting off fireworks to bedazzle everyone and decide to provide something that's actually useful for me.


Because I thought there was a meta tag you could add to a page that would do the same thing. Oh, right: "Revisit-after." If I put "Revisit-after" on all my pages, Google should be able to make up their own lists of sites and pages.

 

Revisit-after is only a suggestion from you to the search engines about how often they should bother to check your site for changes. It is mostly ignored and with or without it the spiders will behave nearly or exactly the same.

 

If a spider can find your site, it will go through the whole thing with no special meta tags or sitemap help. The thing is, it sounds like they are doing something special for us, but really, when we use their Site Map indexing stuff it means they can hit our site once and not have to travel through it to see what's there - thus saving THEM time, money, resources, etc. The only thing it does for us is reduce some of the bandwidth they suck up looking around in our sites... which may explain why some sites have reported Googlebot going nuts in their sites :)


I agree with you, Jim. I think it just didn't come out right because my brain is currently bogged down in dialogic theory (thesis fun... not).

 

The only case in which I see this site map being actually useful for me (as opposed to them, the leeches) would be if the other major search engines jumped on board and said they would also use it (like robots.txt). Then my work would be more worthwhile, because it would affect more than one search engine. But proprietary code? Didn't we learn from the early Netscape/Internet Explorer battles?

 

I also think the site map could provide a huge (unintended?) benefit from a usability point of view. The file is in XML format, from what I saw. If a few more attributes were added to the file, it would be possible to contain all the data you'd need to create an actual HTML sitemap for your visitors to use. That would be awesome -- if I could improve communication with several major search engines AND improve communication with my visitors, I'd be at the front of the line to add the code to my site.
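
Just to sketch what I mean, the existing file already carries almost enough to build a bare-bones visitor page. Rough Python below; the file names and the namespace URI are guesses on my part, and the extra attributes (titles, descriptions) would slot in the same way:

# Rough sketch: read the Google-format sitemap.xml and emit a plain HTML
# site map page for visitors. File names and the namespace URI are
# assumptions -- match them to whatever your generator actually writes.
import xml.etree.ElementTree as ET

NS = "{http://www.google.com/schemas/sitemap/0.84}"

root = ET.parse("sitemap.xml").getroot()
items = []
for url in root.findall(NS + "url"):
    loc = (url.findtext(NS + "loc") or "").strip()
    lastmod = (url.findtext(NS + "lastmod") or "").strip()
    if loc:
        items.append('  <li><a href="%s">%s</a> (updated %s)</li>'
                     % (loc, loc, lastmod or "unknown"))

with open("sitemap.html", "w") as out:
    out.write("<html><body>\n<h1>Site map</h1>\n<ul>\n")
    out.write("\n".join(items))
    out.write("\n</ul>\n</body></html>\n")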

 

But like you said -- right now it's just making Google's job easier. Forget that.


Yes, visitor/human-readable site maps are a hugely good thing, so it would be useful if we could adapt it.

 

As for other engines adopting Google's technology as a new web standard? It is very unlikely they'd do that. As soon as Google's stuff becomes a web standard then the others have lost the PR game. It's like Ford using Toyota engines because they work better -- the engines may indeed be better but they'll never admit it. :)


I don't see how either of them could be used to describe a site map.

Why not? The file format as proposed by Google is nothing more than an XML file with the permalink and the modification date of each page of your site:

 

<url>
  <loc>http://braintags.com/linkdump/2005/06/less_cursing_better_pictures/</loc>
  <lastmod>2005-06-13T16:06:04Z</lastmod>
</url>

This information is already present in the Atom specification, but Google decided not to use existing formats and created their own instead.
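
To make the point concrete, here is a rough Python sketch that maps an Atom feed onto their format. The feed filename, the "first <link> wins" shortcut, and the sitemap namespace URI are all assumptions for brevity:

# Sketch only: Atom's <link>/<updated> pair maps almost one-to-one onto the
# sitemap's <loc>/<lastmod>. Adjust the input file and namespaces as needed.
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

feed = ET.parse("atom.xml").getroot()
urls = []
for entry in feed.findall(ATOM + "entry"):
    link = entry.find(ATOM + "link")
    loc = link.get("href") if link is not None else None
    lastmod = entry.findtext(ATOM + "updated") or ""
    if loc:
        urls.append("  <url>\n    <loc>%s</loc>\n    <lastmod>%s</lastmod>\n  </url>"
                    % (loc, lastmod))

print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">')
print("\n".join(urls))
print("</urlset>")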


  • 10 months later...

I just got an 'invite' from Google to use their 'Sitemaps' program. Does anyone have any updated user experiences so I can determine if this is something worthwhile to commit time and resources to?

 

Thanks!


Do it. You won't have to pray to the Google gods to find your pages; Google will already know what's there. If you're using a CMS or a blogging system, there are probably utilities available so that you can automagically create a well-formatted XML sitemap for Google. You can also just list the URLs of your pages in a plain text file if you're not able to do the XML. I like Xenu for that.
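
If your site is plain static HTML, something along these lines would do the "automagic" part. Just a rough sketch: the document root, hostname, output location, and namespace URI are placeholders to adjust, and file modification times stand in for real change dates:

# Sketch: walk an assumed document root, use each file's mtime as <lastmod>,
# and write sitemap.xml alongside the pages. The namespace URI is the one the
# 0.84 spec used; check Google's current docs before relying on it.
import os
import time
from xml.sax.saxutils import escape

DOCROOT = "/var/www/html"           # assumed document root
BASE = "http://www.example.com"     # assumed site address

entries = []
for dirpath, dirnames, filenames in os.walk(DOCROOT):
    for name in sorted(filenames):
        if not name.endswith((".html", ".htm")):
            continue
        path = os.path.join(dirpath, name)
        rel = os.path.relpath(path, DOCROOT).replace(os.sep, "/")
        lastmod = time.strftime("%Y-%m-%dT%H:%M:%SZ",
                                time.gmtime(os.path.getmtime(path)))
        entries.append("  <url>\n    <loc>%s/%s</loc>\n"
                       "    <lastmod>%s</lastmod>\n  </url>"
                       % (BASE, escape(rel), lastmod))

with open(os.path.join(DOCROOT, "sitemap.xml"), "w") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n')
    out.write("\n".join(entries) + "\n")
    out.write("</urlset>\n")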


I've been using sitemap.xml since December. It took about a month before I noticed a real difference in how Google crawled my site. Now Google pulls the updated pages from my site within 12 hours of an updated sitemap.

 

Plus the statistics and robots.txt file check are nice to use from time to time.

 

They have just this week changed the interface a bit so I'm getting used to the new layout now.


I'm trying to add a Google sitemap for this URL -- http://www.biom.net/index.php/teahouse/sitemap/ -- but I'm getting the error message:

 

Invalid URL. This is not a valid URL

 

I believe the sitemap is supposed to be on the same 'level' as the index page, and that .htaccess needs to be used to serve up the RSS feed in such a way that it meets this requirement. Can anyone offer some insights on how to get this done?

 

 

Thanks!


You must be using IE as your browser. Try View Source in IE; that should show you the XML data. Your sitemap file displays directly in Firefox.

 

Looks like you have your index.php file set to priority 1.0 and always updating.

 

Remember, the sitemap file is an XML file and is not intended for display in a browser.

 

Visit the Google Sitemaps location for the statistics and results after you have submitted it. (It may take 12 hours for Google to read it after you submit)

 

<loc>http://www.biom.net/index.php</loc>
<lastmod>2006-05-04T11:49:01-08:00</lastmod>
<changefreq>always</changefreq>
<priority>1.0</priority>

