The use of robots.txt and sitemaps in the Spanish public administration
DOI: https://doi.org/10.3145/epi.2009.nov.05
Keywords: Robots, Crawlers, Sitemaps, Search engines, Information retrieval, Visibility, Web sites
Abstract
Robots.txt and sitemap files are the main methods by which websites regulate search engine crawler access to their content. This article explains the importance of these files and analyses the robots.txt and sitemap files of more than 4,000 websites belonging to the Spanish public administration, in order to determine the extent to which they are used to optimize access by crawlers.
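As a minimal illustration of how a crawler consults these files, the sketch below uses Python's standard urllib.robotparser to read a robots.txt file, check whether a URL may be fetched, and list any Sitemap: directives it declares; the domain and paths shown are placeholders, not sites analysed in the article.

# Sketch of how a crawler consults robots.txt before fetching pages.
# The domain is a hypothetical placeholder.
from urllib.robotparser import RobotFileParser

robots_url = "https://www.example.gob.es/robots.txt"

rp = RobotFileParser()
rp.set_url(robots_url)
rp.read()  # downloads and parses the robots.txt file

# May a generic crawler ("*") fetch this URL?
print(rp.can_fetch("*", "https://www.example.gob.es/informes/2009.pdf"))

# Sitemap: directives declared in robots.txt (Python 3.8+), if any.
print(rp.site_maps())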
License
Conditions for disseminating articles once they are published
Authors may freely disseminate their articles on websites, social networks and repositories. However, the following conditions must be respected:
- Only the final editorial version should be made public. Please do not publish preprints, postprints or proofs.
- Along with that copy, a specific mention of the publication in which the text appeared must be included, also adding a clickable link to the URL: http://revista.profesionaldelainformacion.com
The journal Profesional de la información offers its articles in open access under a Creative Commons BY license.