Google Revolutionizes Search with “Sitemaps”


Google, the internet's most popular search engine, has recently unveiled Sitemaps, which it has dubbed "an experiment in web crawling." This experiment may very well revolutionize the way that search engines find and display Web sites. In the near future, it is also likely that this new development will begin to affect website rankings on the major search engines.

Sitemaps is a way for Google's web crawlers to access pages that might otherwise never be indexed or found on Google.com. Web crawlers, or spiders, are used by search engines to find web pages located online and compile them into an index. The search engine index is then used to direct searchers to the most relevant pages. Frequently, pages are never indexed simply because the search engine's spider never visits them, leaving website owners, designers, and programmers unable to get their Web pages into the major search engines' indexes.

According to Google, Sitemaps will allow Web site owners to "inform search engine crawlers about URLs on your Web sites that are available for crawling." Sitemaps uses XML, short for Extensible Markup Language, to communicate with Google's spiders. Web site owners will now have the ability to generate an XML-coded page on their server that lists all of the URLs they feel should be included in search engine indexes.
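As an illustration of the format (the URLs here are placeholders, and the namespace shown is the one Google published for the beta protocol), a minimal Sitemaps file is simply a list of page locations wrapped in a `urlset` element:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

Each `<url>` entry names one page the site owner wants crawled; the optional per-page details are described below.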

The Sitemaps XML format also has added features that allow website owners to let Google know certain specifics about each individual page. These include parameters such as the date when the Web page was most recently updated, how often the content changes on a Web page, and how important a page is in relation to other pages on that Web site. Information like this will allow Google to prioritize its spidering and indexing activity on Web sites that implement Sitemaps.
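As a hedged sketch of how a site owner might generate such a file (the function name and page list are illustrative, not part of Google's documentation; the element names and namespace follow the published beta format), a short script using Python's standard xml.etree module would suffice:

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Build a Sitemaps XML document from (loc, lastmod, changefreq, priority) tuples."""
    # The xmlns value is the namespace Google documented for the original beta protocol.
    urlset = ET.Element("urlset", {"xmlns": "http://www.google.com/schemas/sitemap/0.84"})
    for loc, lastmod, changefreq, priority in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc                # page URL (required)
        ET.SubElement(url, "lastmod").text = lastmod        # last update date (optional)
        ET.SubElement(url, "changefreq").text = changefreq  # e.g. "daily", "weekly" (optional)
        ET.SubElement(url, "priority").text = priority      # relative importance, 0.0-1.0 (optional)
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example: one page, updated daily, given top priority.
sitemap_xml = build_sitemap([
    ("http://www.example.com/", "2005-06-03", "daily", "1.0"),
])
```

The resulting string can then be saved as a file (commonly sitemap.xml) in the site's root directory and submitted to Google.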

A large majority of Web sites currently have a page called a site map that links to the important pages on that website. This traditional kind of "site map" should not be confused with Google's new "Sitemaps" protocol. Traditional site maps are not part of Google's new program and will not work in conjunction with Google Sitemaps. They should not be removed from a Web site, however; they remain a useful tool and an important aspect of Search Engine Marketing and user navigation.

An effective implementation of Google Sitemaps will be of great benefit to Web sites that do not have all of their important Web pages and content indexed. For example, if a Web site had 200 web pages, and only 150 were indexed, the site would be missing out on a considerable amount of natural search traffic. Using Google’s Sitemaps protocol will show Google important pages that are not currently being indexed or ranked. This will help a Web site’s overall search engine indexability as well as its rankings.

Google makes it clear that the use of this new protocol will not guarantee anything in the way of rankings or indexing. Google Sitemaps has just been introduced and, as of this writing, is still in beta. Google also notes that Sitemaps will not replace normal spidering of a Web site, but will instead complement it. It is possible, however, that Web sites which use Sitemaps will have an advantage over those that do not. Overall, Google Sitemaps is a notable innovation, and its implementation has the potential to yield great benefits to a successful search engine marketing campaign.

Danny Shepherd represents Titan SEO, Inc., an Escondido, California-based Search Engine Marketing firm. To subscribe to Titan SEO's email newsletter, visit http://www.Titan-SEO.com/index-4.html.

# # #

Contact Author

Danny Shepherd
TITAN SEO
1-800-658-7511