The use of robots.txt and sitemaps in the Spanish public administration

Authors

  • Bonifacio Martín-Galán
  • Tony Hernández-Pérez
  • David Rodríguez-Mateos
  • Daniel Peña-Gil

DOI:

https://doi.org/10.3145/epi.2009.nov.05

Keywords:

Robots, Crawlers, Sitemaps, Search engines, Information retrieval, Visibility, Web sites

Abstract

The robots.txt and sitemaps files are the main methods for regulating search engine crawler access to a website's content. This article explains the importance of these files and analyzes the robots.txt and sitemaps files of more than 4,000 websites belonging to the Spanish public administration, in order to determine how these files are used to optimize crawler access.
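As an illustration of the mechanism the article studies (this sketch is not taken from the article itself; the example.gob.es domain and the /intranet/ path are assumptions), a minimal robots.txt file placed at a site's root tells crawlers which paths to avoid and where to find the sitemap:

    # robots.txt served at https://www.example.gob.es/robots.txt
    User-agent: *            # rules below apply to all crawlers
    Disallow: /intranet/     # keep crawlers out of this (hypothetical) directory
    Allow: /                 # everything else may be crawled
    Sitemap: https://www.example.gob.es/sitemap.xml

The sitemap referenced above is an XML file, following the sitemaps.org 0.9 schema, that lists the URLs a site wants crawlers to index, optionally with last-modification date, expected change frequency, and relative priority:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.gob.es/</loc>
        <lastmod>2009-08-08</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
    </urlset>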


Published

2009-08-08

How to Cite

Martín-Galán, B., Hernández-Pérez, T., Rodríguez-Mateos, D., & Peña-Gil, D. (2009). The use of robots.txt and sitemaps in the Spanish public administration. El profesional de la información, 18(6), 625–632. https://doi.org/10.3145/epi.2009.nov.05